ANIMATION DATA GENERATING METHOD, APPARATUS, AND
ELECTRONIC DEVICE
FIELD OF THE TECHNOLOGY
The present disclosure relates to the field of
animation data processing technologies, and in particular, to an animation data
generating method, apparatus, and electronic device.
BACKGROUND OF THE DISCLOSURE
At present, animation production is classified into 2D
animation production and 3D animation production. Mainstream 3D animation
production software on the market includes 3DS Max, Maya, and the like.
Although the foregoing animation production software can be used to produce
desirable 3D animations (which is implemented by creating a 3D model), it
requires professional animation production staff, and the production process is
complex and time-consuming.
Current 2D animation production technical solutions
generally include two implementation schemes: one scheme is hand-drawn
animation, in which sequence actions are drawn by hand frame by frame to
produce animation; in another scheme, sequence frames exported from an existing
3D model are used as materials for producing 2D animation. For example, a 3D
model is used to export sequence frames, and the exported sequence frames are
subject to manual art modification. Specifically, the 3D model is used to
produce maps and actions, the maps and actions are exported as sequence images
frame by frame, the sequence images are subject to art modification frame by
frame, and finally, a 2D animation is produced.
Although the foregoing 2D animation production process
is relatively simple, the two existing technical solutions described above
still consume a lot of manpower and time. Hand-drawn animation consumes a large
amount of art manpower and places a high requirement on hand drawing
capability; the manner of exporting sequence frames by using a 3D model and
performing manual art modification on the exported sequence frames reduces
these requirements somewhat, but still demands considerable hand drawing
capability and consumes a lot of time.
SUMMARY
An animation data generating method, including the
following steps:
scanning sequence frame pictures exported from a 3D
model, to parse out valid small pictures;
synthesizing each parsed-out valid small picture into
a large picture according to a preset rule;
generating sequence frame data according to related
attribute information of each valid small picture in the large picture; and
generating 2D animation data out of the sequence
frame data.
An animation data generating apparatus,
including:
a scanning module, configured to scan sequence frame
pictures exported from a 3D model, to parse out valid small pictures;
a large picture synthesizing module, configured to
synthesize each parsed-out valid small picture into a large picture according
to a preset rule;
a sequence frame data generating module, configured
to generate sequence frame data according to related attribute information of
each valid small picture in the large picture; and
an animation data generating module, configured to
generate 2D animation data out of the sequence frame data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic structural diagram of a working
environment of an electronic device in which an animation data generating
apparatus according to an embodiment of the present invention is located;
FIG. 2 is a schematic flowchart of an animation data
generating method according to an embodiment of the present invention;
FIG. 3 and FIG. 4 are schematic diagrams of valid
pictures obtained by scanning sequence frame pictures exported from a 3D model
according to an embodiment of the present invention; and
FIG. 5 is a schematic structural diagram of an
animation data generating apparatus according to an embodiment of the present
invention.
DESCRIPTION OF EMBODIMENTS
To make the objectives, technical solutions and
beneficial effects of the present disclosure clearer, the present disclosure is
described in further detail below with reference to the accompanying drawings
and embodiments. It should be understood that the specific embodiments
described herein are merely for illustrating the present disclosure but not
intended to limit the present disclosure.
In the following description, unless otherwise
stated, the embodiments of the present disclosure will be described by
referring to steps and symbol indications of operations executed by one or more
computers. Therefore, it can be understood that such steps and operations,
sometimes referred to as being computer-executed, include the manipulation, by
a computer's processing unit, of electrical signals representing data in a
structured form. This manipulation converts the data or maintains it at a location
in the computer's memory system, and this reconfigures or changes computer
operations in a manner understood by a person skilled in the art. The data
structure maintaining the data is the physical location of a memory having
specific attributes defined by the data format. However, although the present
disclosure is described in the above context, it does not at all signify
limitation. As a person skilled in the art will understand, the steps and
operations described below may also be realized using hardware.
As used in the present application, the terms
"component," "module," "system" and the like are likewise intended to refer to
a computer-related entity, either hardware, a combination of hardware and
software, software, or software in execution. For example, a component may be,
but is not limited to being, a process running on a processor, a processor, an
object, an executable, a thread of execution, a program, and/or a computer. By
way of illustration, both an application running on a controller and the
controller can be a component. One or more components may reside within a
process and/or thread of execution and a component may be localized on one
computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be
implemented as a method, an apparatus, or an article of manufacture using
standard programming and/or engineering techniques to produce software,
firmware, hardware, or any combination thereof to control a computer to
implement the method, apparatus, or article of manufacture of the disclosed
subject matter. The term "article of manufacture" as used herein is intended to
encompass a computer program accessible from any computer-readable device,
carrier, or media. Of course, a person skilled in the art will recognize
that many modifications may be made to this configuration without departing
from the scope or spirit of the claimed subject matter.
FIG. 1 and the following discussion are intended to
provide a brief, general description of a working environment of an electronic
device in which the animation data generating apparatus according to the
present disclosure is located. The working environment of FIG. 1 is merely an
instance of a suitable working environment but is not intended to suggest any
limitation on the usage or function scope of the working environment. An
instance electronic device 112 includes, but is not limited to, a personal
computer, a server computer, a hand-held or laptop computer, a mobile device
(such as a mobile phone, a personal digital assistant (PDA), or a media
player), a multi-processor system, a consumer electronic device, a
minicomputer, a mainframe computer, a distributed computing environment
including any of the foregoing systems or devices, and the like.
Though not required, the embodiment is described in a
general background in which a "computer readable instruction" is executed by
one or more electronic devices. The computer readable instruction is
distributed by using a computer readable medium (which is discussed in the
following). The computer readable instruction may be implemented as a program
module, for example, a function, an object, an application programming
interface (API), or a data structure for executing a specific task or
implementing a specific abstract data type. Typically, the functions of the
computer readable instructions may be combined or distributed as required in
various environments.
FIG. 1 shows an instance of an electronic device 112
including one or more embodiments of the animation data generating apparatus of
the present disclosure. In one configuration, the electronic device 112
includes at least one processor unit 116 and a memory 118. According to the
precise configuration and type of the electronic device, the memory 118 may be
a volatile memory (for example, a random access memory (RAM)), a non-volatile
memory (for example, a read-only memory (ROM) or a flash memory), or a
combination thereof. The configuration is shown by a dashed line 114 in FIG.
1.
In other embodiments, the electronic device 112 may
include an additional feature and/or function. For example, the device 112 may
further include an additional storage apparatus (for example, removable and/or
non-removable), which includes, but is not limited to, a magnetic storage
apparatus, an optical storage apparatus, and the like. Such additional storage
apparatus is represented by a storage apparatus 120 in FIG. 1. In one
embodiment, the computer readable instruction for implementing one or more
embodiments provided herein may be stored in the storage apparatus 120. The
storage apparatus 120 may further store other computer readable instructions
for implementing an operating system, an application program, and the like. The
computer readable instruction may be loaded into the memory 118 and executed
by, for example, the processing unit 116.
The technical term "computer readable medium" used
herein includes a computer storage medium. The computer storage medium includes
volatile and non-volatile, removable and non-removable media implemented by any
method or technology used for storing a computer readable instruction or other
information such as data. The memory 118 and the storage apparatus 120 are
instances of the computer storage medium. The computer storage medium includes,
but is not limited to, a RAM, a ROM, an electrically erasable programmable
read-only memory (EEPROM), a flash memory or other memory technologies; a
compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or other
optical storage apparatuses; a cassette tape, a magnetic tape, a magnetic disk
storage apparatus or other magnetic storage devices; or any other medium that
can be used to store expected information and can be accessed by the electronic
device 112. Any such computer storage medium may be a part of the electronic
device 112.
The electronic device 112 may further include a
communication connection 126 that allows the electronic device 112 to
communicate with another device. The communication connection 126 may include,
but is not limited to, a modem, a network interface card (NIC), an integrated
network interface, a radio frequency transmitter/receiver, an infrared port, a
universal serial bus (USB) connection, or another interface for connecting the
electronic device 112 to another electronic device. The communication
connection 126 may include a wired connection or a wireless connection. The
communication connection 126 may transmit and/or receive a communication
medium.
The term "computer readable medium" may include the
communication medium. The communication medium typically includes a computer
readable instruction or other data in a "modulated data signal" such as a
carrier wave or other transmission mechanisms, and includes any information
transport medium. The term "modulated data signal" may include a signal that
has one or more of its characteristics set or changed in such a manner as to
encode information in the signal.
The electronic device 112 may include an input device
124, such as a keyboard, a mouse, a pen, a voice input device, a touch input
device, an infrared camera, a video input device, and/or any other input
device. The device 112 may also include an output device 122, for example, one
or more displays, loudspeakers, and printers, and/or any other output device.
The input device 124 and the output device 122 may be connected to the
electronic device 112 by a wired connection, a wireless connection, or any
combination thereof. In an embodiment, the input device or output device from
another electronic device may be used as the input device 124 or the output
device 122 of the electronic device 112.
Components of the electronic device 112 may be
connected by various interconnects (such as a bus). Such interconnects may
include a peripheral component interconnect (PCI) bus (such as PCI Express), a
USB, FireWire (IEEE 1394), an optical bus structure, and the like. In other
embodiments, the components of the electronic device 112 may be interconnected
through a network. For example, the memory 118 may consist of multiple physical
memory units located at different physical positions and interconnected through
a network.
A person skilled in the art may realize that, a
storage device for storing the computer readable instruction may be distributed
across networks. For example, an electronic device 130 that can be accessed
through a network 128 may store the computer readable instruction for
implementing one or more embodiments provided by the present disclosure. The
electronic device 112 may access the electronic device 130 and download all or
a part of the computer readable instruction for execution. Alternatively, the
electronic device 112 may download multiple computer readable instructions as
required, or some instructions may be executed at the electronic device 112 and
some instructions may be executed at the electronic device 130.
Various operations of embodiments are provided
herein. In one embodiment, one or more of the operations described may
constitute computer readable instructions stored on one or more computer
readable media, which if executed by a computing device, will cause the
computing device to perform the operations described. The order in which some
or all of the operations are described is not to be construed as implying that
these operations are necessarily order dependent. Alternative ordering will be
appreciated by a person skilled in the art having the benefit of this
description. Further, it will be understood that not all operations are
necessarily present in each embodiment provided herein.
Moreover, the word "exemplary" is used herein to mean
serving as an example, instance, or illustration. Any aspect or design
described herein as "exemplary" is not necessarily to be construed as
advantageous over other aspects or designs. Rather, use of the word "exemplary"
is intended to present concepts in a concrete fashion. As used in this
application, the term "or" is intended to mean an inclusive "or" rather than an
exclusive "or". That is, unless specified otherwise, or clear from context, "X
employs A or B" is intended to mean any of the natural inclusive permutations.
That is, if X employs A; X employs B; or X employs both A and B, then "X
employs A or B" is satisfied under any of the foregoing instances.
Also, although the disclosure has been shown and
described with respect to one or more implementations, equivalent alterations
and modifications will occur to a person skilled in the art based upon a
reading and understanding of this specification and the annexed drawings. The
disclosure includes all such modifications and alterations and is limited only
by the scope of the following claims. In particular regard to the various
functions performed by the above described components (e.g., elements,
resources, etc.), the terms used to describe such components are intended to
correspond, unless otherwise indicated, to any component which performs the
specified function of the described component (e.g., that is functionally
equivalent), even though not structurally equivalent to the disclosed structure
which performs the function in the herein illustrated exemplary implementations
of the disclosure. In addition, while a particular feature of the disclosure
may have been disclosed with respect to only one of several implementations,
such feature may be combined with one or more other features of the other
implementations as may be desired and advantageous for any given or particular
application. Furthermore, to the extent that the terms "include", "have",
"contain", or variants thereof are used in either the detailed description or
the claims, such terms are intended to be inclusive in a manner similar to the
term "comprise".
In the embodiments of the present invention, a
picture pixel scanning technology is used to scan pixels of sequence frame
pictures exported from a 3D model and invalid pixels are removed, thereby
parsing out valid small pictures, and these parsed-out valid small pictures are
combined into a large picture; sequence frame data is generated according to
related attribute information of each valid small picture in the large picture,
and finally, 2D animation data is automatically generated out of the sequence
frame data. With the foregoing technical solution, the present disclosure
achieves an automatic process of rendering 2D animation data in a 3D manner,
which saves a lot of time spent on art modification of a 2D animation, and
significantly lowers the art requirement.
Referring to FIG. 2, FIG. 2 is an implementation
process of an animation data generating method according to an embodiment of
the present invention, and the method includes the following steps:
In step S101, sequence frame pictures exported from a
3D model are scanned, to parse out valid small pictures.
In this embodiment of the present invention, an
existing 3D model may be used to export sequence frame pictures; as shown in
FIG. 3, FIG. 3 shows frame-by-frame pictures exported from the 3D model, and
each frame of picture is subject to processing shown in FIG. 4 (where FIG. 4
only shows a processing process on the leftmost picture in FIG. 3). The
sequence frame picture exported from the 3D model is generally a square
picture, and a valid picture of each frame is located in the middle of the
frame. Picture data actually used in an animation is the picture in the middle,
and therefore, useless peripheral pixels of the square picture can be removed
by using a picture pixel scanning technology, thereby parsing out a valid small
picture. Preferably, in this step, normalization processing may be performed on
all the parsed-out valid small pictures, that is, size parameters of all the
valid small pictures are set to a same value, to facilitate subsequent
synthesizing of the valid small pictures into a large picture.
In this embodiment of the present invention, the step
of scanning sequence frame pictures exported from a 3D model to parse out valid
small pictures includes:
removing all invalid pixels from the scanned sequence
frame pictures exported from the 3D model, to obtain small pictures of valid
pixels, where pixels whose data color values are 0x00000000 (RGBA) are defined
as invalid pixels, and pixels whose data color values are not 0x00000000 (RGBA)
are defined as the valid pixels.
Further, pixels of the sequence frame pictures
exported from the 3D model are scanned one by one along a preset direction (for
example, along four directions, namely, from top to bottom, from bottom to top,
from left to right, and from right to left), until valid pixel data is scanned,
and invalid pixels are removed, thereby acquiring small pictures of valid
pixels.
As shown in FIG. 4, four dashed lines in FIG. 4 are
scanning lines, and the scan is started from top to bottom, from bottom to top,
from left to right, and from right to left separately, until the scanning line
in each direction meets at least one valid pixel, and then, the scan in this
direction is stopped. The rectangle with intersections of the four dashed lines
as four corners is the valid picture obtained by means of scan.
In this embodiment of the present invention, a
condition for determining whether a pixel is valid is determining whether the
Alpha channel of its color value is 0, and the formula is:
(pixel value & 0x000000FF) != 0
When it is determined that the Alpha channel of the
color value is 0, it is considered that the pixel is an invalid pixel; and when
it is determined that the Alpha channel of the color value is not 0, it is
considered that the pixel is a valid pixel.
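The four-direction scan and the Alpha-channel validity test described above can be sketched as follows (a minimal Python sketch for illustration only; the function names, the row-list representation of a frame, and the assumption that pixels are stored as 32-bit RGBA integers with the Alpha channel in the low byte are illustrative, not part of the disclosure):

```python
def is_valid(pixel):
    # A pixel is valid when the Alpha channel (the low byte of the
    # 32-bit RGBA value) is non-zero: (pixel & 0x000000FF) != 0.
    return (pixel & 0x000000FF) != 0

def trim_frame(frame):
    """Scan a frame (a list of rows of 32-bit RGBA ints) for valid
    pixels and return their bounding box as (left, top, right, bottom),
    or None if every pixel is invalid. Collecting the rows and columns
    that contain at least one valid pixel is equivalent to scanning from
    the four directions until the first valid pixel is met in each."""
    height = len(frame)
    width = len(frame[0]) if height else 0
    rows = [y for y in range(height) if any(is_valid(p) for p in frame[y])]
    cols = [x for x in range(width)
            if any(is_valid(frame[y][x]) for y in range(height))]
    if not rows or not cols:
        return None  # the frame contains only invalid pixels
    return (cols[0], rows[0], cols[-1] + 1, rows[-1] + 1)
```

The rectangle returned corresponds to the valid small picture; the pixels outside it are the useless peripheral pixels that are removed.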
In step S102, each parsed-out valid small picture is
synthesized into a large picture according to a preset rule.
As an embodiment of the present invention, the step
of synthesizing each parsed-out valid small picture into a large picture
according to a preset rule includes:
setting a serial number for each valid small picture
in advance; and
arranging each parsed-out valid small picture
according to a sequence of the serial numbers to form a large picture, the
small pictures not overlapping each other.
As another embodiment of the present invention, the
step of synthesizing each parsed-out valid small picture into a large picture
according to a preset rule includes:
arranging each parsed-out valid small picture
according to a scanning sequence to form a large picture, the small pictures
not overlapping each other.
As still another embodiment of the present invention,
the step of synthesizing each parsed-out valid small picture into a large
picture according to a preset rule includes:
arranging the parsed-out valid small pictures
according to sequence frame numbers to form a large picture, the small pictures
not overlapping each other.
For example, a large rectangle is formed according to
a maximum length and a maximum width of each valid small picture, an empty
picture having an area equivalent to that of the large rectangle is created, a
serial number is set for each valid small picture, and then these small
pictures are placed at corresponding positions of the empty picture according
to the serial numbers. In this way, a large picture including all the valid
small pictures is formed.
However, it can be understood that, no matter which
manner is used, any solution shall fall within the protection scope of the
present invention as long as the valid small pictures can be arranged to form a
large picture while the valid small pictures do not overlap each other.
Synthesizing the small pictures into a large picture reduces the number of
files of the 2D animation data.
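The example above, in which normalized small pictures are placed at serial-number positions on an empty large picture, can be sketched as follows (an illustrative Python sketch; the single-row cell layout and the dictionary field names are assumptions, not part of the disclosure):

```python
def pack_pictures(small_pictures):
    """Arrange normalized valid small pictures (each a dict with
    'serial', 'width', 'height') on an empty large picture in
    serial-number order, without overlap, and record the position
    of each. A single row of equal-sized cells, sized by the
    maximum width and height, is used here for simplicity."""
    cell_w = max(p['width'] for p in small_pictures)
    cell_h = max(p['height'] for p in small_pictures)
    placements = []
    for i, pic in enumerate(sorted(small_pictures, key=lambda p: p['serial'])):
        placements.append({'serial': pic['serial'],
                           'x': i * cell_w, 'y': 0,
                           'width': pic['width'], 'height': pic['height']})
    large_size = (cell_w * len(small_pictures), cell_h)
    return large_size, placements
```

Any other non-overlapping layout (for example, a grid) serves equally well, since the positions are recorded per picture.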
In step S103, sequence frame data is generated
according to related attribute information of each valid small picture in the
large picture.
In this embodiment of the present invention, the
related attribute information of the small picture in the large picture mainly
includes, but is not limited to, the following content: a serial number of the
small picture, coordinate information of the small picture in the large
picture, and the width and height of the small picture.
In this embodiment of the present invention, the
sequence frame data is modular data used in animation data.
In this embodiment of the present invention, the step
of generating sequence frame data according to related attribute information of
each valid small picture in the large picture includes:
acquiring, from attributes of the large picture,
coordinate information of the small picture in the large picture, width
information and height information of the small picture, and a serial number of
the small picture; and
generating the sequence frame data out of the
coordinate information, the width information and height information of the
small picture, and the serial number of the small picture.
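The sequence frame data assembled in this step can be sketched as follows (an illustrative Python sketch; the tuple layout and the record field names are assumptions, not part of the disclosure):

```python
def generate_sequence_frame_data(entries):
    """Turn (serial, x, y, width, height) tuples read from the
    attributes of the large picture into sequence frame data
    records, ordered by serial number."""
    return [
        {'serial': s, 'x': x, 'y': y, 'width': w, 'height': h}
        for s, x, y, w, h in sorted(entries)
    ]
```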
In step S104, 2D animation data is generated out of
the sequence frame data.
In this embodiment of the present invention, the step
of generating 2D animation data out of the sequence frame data includes:
generating 2D animation data out of the sequence
frame data according to a sequence of the serial numbers.
In this embodiment of the present invention, a tool
may be used to organize the sequence frame data as animation data corresponding
to game development. In an actual development process, the sequence frame data
needs to be organized according to a data structure of a corresponding game
animation. In the game, an animation effect is achieved by playing the sequence
frames of the animation frame by frame.
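The frame-by-frame playback described above can be sketched as follows (an illustrative Python sketch; the animation-data structure and the fixed frame rate are assumptions, not part of the disclosure, and an actual game would organize the data according to its own animation data structure):

```python
def build_animation(sequence_frame_data, fps=24):
    """Wrap sequence frame data, ordered by serial number, as
    minimal 2D animation data: a frame list plus a frame rate."""
    frames = sorted(sequence_frame_data, key=lambda f: f['serial'])
    return {'fps': fps, 'frames': frames}

def frame_at(animation, t):
    """Return the frame to display at time t (in seconds), playing
    the sequence frames one by one in a loop."""
    frames = animation['frames']
    index = int(t * animation['fps']) % len(frames)
    return frames[index]
```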
The foregoing animation data generating method can
achieve an automatic process of rendering 2D animation data in a 3D manner,
saves a lot of time spent on art modification of a 2D animation, and
significantly lowers the art requirement.
Referring to FIG. 5, FIG. 5 is a schematic structural
diagram of an animation data generating apparatus according to an embodiment of
the present invention. For ease of description, FIG. 5 only shows parts related
to this embodiment of the present invention. The animation data generating
apparatus includes: a scanning module 101, a large picture synthesizing module
102, a sequence frame data generating module 103, and an animation data
generating module 104. The animation data generating apparatus may be a
software unit embedded in the electronic device, a hardware unit, or a unit
combining hardware and software.
The scanning module 101 is configured to scan
sequence frame pictures exported from a 3D model, to parse out valid small
pictures.
The large picture synthesizing module 102 is
configured to synthesize each parsed-out valid small picture into a large
picture according to a preset rule.
The sequence frame data generating module 103 is
configured to generate sequence frame data according to related attribute
information of each valid small picture in the large picture.
In this embodiment of the present invention, the
sequence frame data is modular data used in animation data.
The animation data generating module 104 is
configured to generate 2D animation data out of the sequence frame data.
In this embodiment of the present invention, as shown
in FIG. 4, the sequence frame picture exported from the 3D model is generally a
rectangular picture, and a valid picture of each frame is located in the middle
of the frame. Picture data actually used in an animation is the picture in the
middle, and therefore, useless peripheral pixels of the rectangular picture can
be removed by using a picture pixel scanning technology, thereby parsing out a
valid small picture.
In this embodiment of the present invention,
the scanning module 101 is specifically configured to
remove all invalid pixels from the scanned sequence frame pictures exported
from the 3D model, to obtain small pictures of valid pixels, where pixels whose
data color values are 0x00000000 (RGBA) are defined as invalid pixels, and
pixels whose data color values are not 0x00000000 (RGBA) are defined as the
valid pixels.
Further, the scanning module 101 is specifically
configured to scan, one by one along a preset direction (for example, from top
to bottom, from bottom to top, from left to right, or from right to left),
pixels of the sequence frame pictures exported from the 3D model, until valid
pixel data is scanned, and remove invalid pixels, to acquire small pictures of
valid pixels.
As shown in FIG. 4, four dashed lines in FIG. 4 are
scanning lines, and the scan is started from top to bottom, from bottom to top,
from left to right, and from right to left separately, until the scanning line
in each direction meets at least one valid pixel, and then, the scan in this
direction is stopped. The rectangle with intersections of the four dashed lines
as four corners is the valid picture obtained by means of scan.
In this embodiment of the present invention, a
condition for determining whether a pixel is valid is determining whether the
Alpha channel of its color value is 0, and the formula is:
(pixel value & 0x000000FF) != 0
When it is determined that the Alpha channel of the
color value is 0, it is considered that the pixel is an invalid pixel; and when
it is determined that the Alpha channel of the color value is not 0, it is
considered that the pixel is a valid pixel.
As an embodiment of the present invention, the
animation data generating apparatus further includes:
a serial number setting module, configured to set a
serial number for each valid small picture;
where the large picture synthesizing module 102 is
further configured to arrange each parsed-out valid small picture according to
a sequence of the serial numbers to form a large picture, the small pictures
not overlapping each other.
As another embodiment of the present invention,
the large picture synthesizing module 102 is
specifically configured to arrange each parsed-out valid small picture
according to a scanning sequence to form a large picture, the small pictures
not overlapping each other.
As still another embodiment of the present
invention,
the large picture synthesizing module 102 is
specifically configured to arrange the parsed-out valid small pictures
according to sequence frame numbers to form a large picture, the small pictures
not overlapping each other.
For example, a large rectangle is formed according to
a maximum length and a maximum width of each valid small picture, an empty
picture having an area equivalent to that of the large rectangle is created, a
serial number is set for each valid small picture, and then these small
pictures are placed at corresponding positions of the empty picture according
to the serial numbers. In this way, a large picture including all the valid
small pictures is formed.
However, it can be understood that, no matter which
manner is used, any solution shall fall within the protection scope of the
present invention as long as the valid small pictures can be arranged to form a
large picture while the valid small pictures do not overlap each other.
Synthesizing the small pictures into a large picture reduces the number of
files of the 2D animation data.
As an embodiment of the present invention, the
animation data generating apparatus further includes: an acquiring module.
The acquiring module is configured to acquire, from
attributes of the large picture, coordinate information of the small picture in
the large picture, width information and height information of the small
picture, and a serial number of the small picture.
The sequence frame data generating module 103 is
further configured to generate the sequence frame data out of the coordinate
information, the width information and height information of the small picture,
and the serial number of the small picture.
The foregoing animation data generating apparatus can
achieve an automatic process of rendering 2D animation data in a 3D manner,
saves a lot of time spent on art modification of a 2D animation, and
significantly lowers the art requirement.
An implementation process of the animation data
generating method provided by the embodiment of the present invention is
described in detail below.
First, pixels of sequence frame pictures exported
from a 3D model are scanned in four directions, namely, from top to bottom,
from bottom to top, from left to right, and from right to left, until the
scanning line in each direction meets at least one valid pixel, and then, the
scan in this direction is stopped. Invalid pixels are removed from the scanned
sequence frame pictures exported from the 3D model, to obtain small pictures of
valid pixels; a rectangle with intersections of the four dashed lines in FIG. 4
as four corners is a valid picture obtained by scan.
Then, a large rectangle is formed according to a
maximum length and a maximum width of each valid small picture, an empty
picture having an area equivalent to that of the large rectangle is created, a
serial number is set for each valid small picture, and then these small
pictures are placed at corresponding positions of the empty picture according
to the serial numbers. In this way, a large picture including all the valid
small pictures is formed.
Then, coordinate information of the small picture in
the large picture, width information and height information of the small
picture, and a serial number of the small picture are acquired from attributes
of the large picture; and sequence frame data is generated out of the
coordinate information, the width information and height information of the
small picture, and the serial number of the small picture.
Finally, a tool is used to organize the sequence
frame data as animation data corresponding to game development.
In conclusion, in the present disclosure, a picture
pixel scanning technology is used to scan pixels of sequence frame pictures
exported from a 3D model and invalid pixels are removed, thereby parsing out
valid small pictures, and these parsed-out valid small pictures are combined
into a large picture; sequence frame data is generated according to related
attribute information of each valid small picture in the large picture, and
finally, 2D animation data is automatically generated out of the sequence frame
data. With the foregoing technical solution, the present disclosure achieves an
automatic process of rendering 2D animation data in a 3D manner, which saves a
lot of time spent on art modification of a 2D animation, and significantly
lowers the art requirement.
A person of ordinary skill in the art should
understand that, all of or a part of processes in the method according to the
embodiment may be implemented by a program instructing relevant hardware. The
program may be stored in a computer readable storage medium. The storage medium
may be a ROM/RAM, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely preferred
embodiments of the present invention, but are not intended to limit the present
invention. Any modification, equivalent replacement, or improvement made within
the spirit and principle of the present invention shall fall within the
protection scope of the present invention.