WO2015169209A1 - Animation data generating method, apparatus, and electronic device - Google Patents

Animation data generating method, apparatus, and electronic device

Info

Publication number
WO2015169209A1
WO2015169209A1 (PCT/CN2015/078279)
Authority
WO
WIPO (PCT)
Prior art keywords
picture
small
valid
pictures
sequence frame
Prior art date
Application number
PCT/CN2015/078279
Other languages
French (fr)
Inventor
Haibo Liu
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Priority to KR1020167027583A priority Critical patent/KR101810019B1/en
Priority to MYPI2016703808A priority patent/MY185734A/en
Publication of WO2015169209A1 publication Critical patent/WO2015169209A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4007 Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2213/00 Indexing scheme for animation
    • G06T2213/08 Animation software package

Definitions

  • the present disclosure relates to the field of animation data processing technologies, and in particular, to an animation data generating method, apparatus, and electronic device.
  • animation production is classified into 2D animation production and 3D animation production.
  • Mainstream 3D animation production software on the market includes 3DS Max, Maya, and the like.
  • although the foregoing animation production software can be used to produce desirable 3D animations (implemented by creating a 3D model), it requires professional animation production staff, and the production process is complex and time-consuming.
  • Current 2D animation production technical solutions generally include two implementation schemes: one scheme is hand-drawn animation, in which sequence actions are drawn by hand frame by frame to produce animation; in another scheme, sequence frames exported from an existing 3D model are used as materials for producing 2D animation.
  • a 3D model is used to export sequence frames, and the exported sequence frames are subject to manual art modification.
  • the 3D model is used to produce maps and actions, the maps and actions are exported as sequence images frame by frame, the sequence images are subject to art modification frame by frame, and finally, a 2D animation is produced.
  • An animation data generating method including the following steps:
  • An animation data generating apparatus including:
  • a scanning module configured to scan sequence frame pictures exported from a 3D model, to parse out valid small pictures
  • a large picture synthesizing module configured to synthesize each parsed-out valid small picture into a large picture according to a preset rule
  • a sequence frame data generating module configured to generate sequence frame data according to related attribute information of each valid small picture in the large picture
  • an animation data generating module configured to generate 2D animation data out of the sequence frame data.
  • FIG. 1 is a schematic structural diagram of a working environment of an electronic device in which an animation data generating apparatus according to an embodiment of the present invention is located;
  • FIG. 2 is a schematic flowchart of an animation data generating method according to an embodiment of the present invention
  • FIG. 3 and FIG. 4 are schematic diagrams of valid pictures obtained by scanning sequence frame pictures exported from a 3D model according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of an animation data generating apparatus according to an embodiment of the present invention.
  • steps and operations sometimes referred to as being executed by computer include manipulation by a computer's processing unit of electrical signals representing data in structured form. This manipulation converts data or maintains it in a location in the computer's memory system, and this reconfigures or changes computer operations in a manner understood by a person skilled in the art.
  • the data structure maintaining the data is the physical location of a memory having specific attributes defined by the data format.
  • a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a controller and the controller can be a component.
  • One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
  • the claimed subject matter may be implemented as a method, an apparatus, or an article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the method, apparatus, or article of manufacture of the disclosed subject matter.
  • article of manufacture as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or access media.
  • FIG. 1 and the following discussion are intended to provide a brief, general description of a working environment of an electronic device in which the animation data generating apparatus according to the present disclosure is located.
  • the working environment of FIG. 1 is merely an instance of a suitable working environment but is not intended to suggest any limitation on the usage or function scope of the working environment.
  • An instance electronic device 112 includes, but is not limited to, a personal computer, a server computer, a hand-held or laptop computer, a mobile device (such as a mobile phone, a personal digital assistant (PDA), or a media player), a multi-processor system, a consumer electronic device, a minicomputer, a mainframe computer, a distributed computing environment including any of the foregoing system or device, and the like.
  • the embodiment is described in a general background in which a "computer readable instruction" is executed by one or more electronic devices.
  • the computer readable instruction is distributed by using a computer readable medium (which is discussed in the following).
  • the computer readable instruction may be implemented as a program module, for example, a function, an object, an application programming interface (API), or a data structure for executing a specific task or implementing a specific abstract data type.
  • the functions of the computer readable instructions may be combined or distributed as required in various environments.
  • FIG. 1 shows an instance of an electronic device 112 including one or more embodiments of the animation data generating apparatus of the present disclosure.
  • the electronic device 112 includes at least one processor unit 116 and a memory 118.
  • the memory 118 may be a volatile memory (for example, a random access memory (RAM)), a non-volatile memory (for example, a read-only memory (ROM) or a flash memory), or a combination thereof.
  • the electronic device 112 may include an additional feature and/or function.
  • the device 112 may further include an additional storage apparatus (for example, removable and/or non-removable), which includes, but is not limited to, a magnetic storage apparatus, an optical storage apparatus, and the like.
  • additional storage apparatus is represented by a storage apparatus 120 in FIG. 1.
  • the computer readable instruction for implementing one or more embodiments provided herein may be stored in the storage apparatus 120.
  • the storage apparatus 120 may further store other computer readable instructions for implementing an operating system, an application program, and the like.
  • the computer readable instruction may be loaded into the memory 118 and executed by, for example, the processing unit 116.
  • the technical term "computer readable medium" used herein includes a computer storage medium.
  • the computer storage medium includes volatile and non-volatile, removable and non-removable media implemented by any method or technology used for storing a computer readable instruction or other information such as data.
  • the memory 118 and the storage apparatus 120 are instances of the computer storage medium.
  • the computer storage medium includes, but is not limited to, a RAM, a ROM, an electrically erasable programmable read-only memory (EEPROM), a flash memory or other memory technologies; a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or other optical storage apparatuses; a cassette tape, a magnetic tape, a magnetic disk storage apparatus or other magnetic storage devices; or any other medium that can be used to store expected information and can be accessed by the electronic device 112. Any such computer storage medium may be a part of the electronic device 112.
  • the electronic device 112 may further include a communication connection 126 that allows the electronic device 112 to communicate with another device.
  • the communication connection 126 may include, but is not limited to, a modem, a network interface card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a universal serial bus (USB) connection, or another interface for connecting the electronic device 112 to another electronic device.
  • the communication connection 126 may include a wired connection or a wireless connection.
  • the communication connection 126 may transmit and/or receive a communication medium.
  • the term “computer readable medium” may include the communication medium.
  • the communication medium typically includes a computer readable instruction or other data in a "modulated data signal" such as a carrier wave or other transmission mechanisms, and includes any information transport medium.
  • modulated data signal may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • the electronic device 112 may include an input device 124, such as a keyboard, a mouse, a pen, a voice input device, a touch input device, an infrared camera, a video input device, and/or any other input device.
  • the device 112 may also include an output device 122, for example, one or more displays, loudspeakers, and printers, and/or any other output device.
  • the input device 124 and the output device 122 may be connected to the electronic device 112 by a wired connection, a wireless connection, or any combination thereof. In an embodiment, the input device or output device from another electronic device may be used as the input device 124 or the output device 122 of the electronic device 112.
  • Components of the electronic device 112 may be connected by various interconnects (such as a bus). Such interconnects may include a peripheral component interconnect (PCI) (such as a fast PCI), a USB, FireWire (IEEE 1394), an optical bus structure, and the like. In other embodiments, the components of the electronic device 112 may be interconnected through a network.
  • the memory 118 may consist of multiple physical memory units located at different physical positions and interconnected through a network.
  • a storage device for storing the computer readable instruction may be distributed across networks.
  • an electronic device 130 that can be accessed through a network 128 may store the computer readable instruction for implementing one or more embodiments provided by the present disclosure.
  • the electronic device 112 may access the electronic device 130 and download all or a part of the computer readable instruction for execution.
  • the electronic device 112 may download multiple computer readable instructions as required, or some instructions may be executed at the electronic device 112 and some instructions may be executed at the electronic device 130.
  • one or more of the operations described may constitute computer readable instructions stored on one or more computer readable media, which if executed by a computing device, will cause the computing device to perform the operations described.
  • the order in which some or all of the operations are described should not be construed to imply that these operations are necessarily order dependent. A person skilled in the art having the benefit of this description will appreciate alternative orderings. Further, it will be understood that not all operations are necessarily present in each embodiment provided herein.
  • the word "exemplary" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "exemplary" is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word "exemplary" is intended to present concepts in a concrete fashion.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, "X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances.
  • a picture pixel scanning technology is used to scan pixels of sequence frame pictures exported from a 3D model and invalid pixels are removed, thereby parsing out valid small pictures, and these parsed-out valid small pictures are combined into a large picture; sequence frame data is generated according to related attribute information of each valid small picture in the large picture, and finally, 2D animation data is automatically generated out of the sequence frame data.
  • FIG. 2 is an implementation process of an animation data generating method according to an embodiment of the present invention, and the method includes the following steps:
  • step S101 sequence frame pictures exported from a 3D model are scanned, to parse out valid small pictures.
  • an existing 3D model may be used to export sequence frame pictures. FIG. 3 shows the frame-by-frame pictures exported from the 3D model, and each frame picture is subjected to the processing shown in FIG. 4 (FIG. 4 shows the process only for the leftmost picture in FIG. 3).
  • the sequence frame picture exported from the 3D model is generally a square picture, and a valid picture of each frame is located in the middle of the frame. Picture data actually used in an animation is the picture in the middle, and therefore, useless peripheral pixels of the square picture can be removed by using a picture pixel scanning technology, thereby parsing out a valid small picture.
  • normalization processing may be performed on all the parsed-out valid small pictures, that is, size parameters of all the valid small pictures are set to a same value, to facilitate subsequent synthesizing of the valid small pictures into a large picture.
  • the step of scanning sequence frame pictures exported from a 3D model to parse out valid small pictures includes:
  • pixels whose data color values are 0x00000000 (RGBA) are defined as invalid pixels, and pixels whose data color values are not 0x00000000 (RGBA) are defined as valid pixels.
  • pixels of the sequence frame pictures exported from the 3D model are scanned one by one along a preset direction (for example, along four directions, namely, from top to bottom, from bottom to top, from left to right, and from right to left), until valid pixel data is scanned, and invalid pixels are removed, thereby acquiring small pictures of valid pixels.
  • a preset direction for example, along four directions, namely, from top to bottom, from bottom to top, from left to right, and from right to left
  • four dashed lines in FIG. 4 are scanning lines; the scan proceeds from top to bottom, from bottom to top, from left to right, and from right to left separately, until the scanning line in each direction meets at least one valid pixel, and then the scan in that direction is stopped.
  • the rectangle with intersections of the four dashed lines as four corners is the valid picture obtained by means of scan.
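The four-direction scan described above can be sketched as follows. This is a minimal Python equivalent under the assumption that each frame is given as a 2D grid of alpha values (0 meaning transparent/invalid); real inputs would be decoded image files.

```python
def crop_to_valid(alpha):
    """Find the bounding rectangle of the valid (non-zero alpha) pixels.

    Equivalent to scanning from top, bottom, left and right until each
    scanning line meets at least one valid pixel.
    """
    h, w = len(alpha), len(alpha[0])
    # Rows/columns that contain at least one valid pixel
    rows = [y for y in range(h) if any(alpha[y][x] != 0 for x in range(w))]
    cols = [x for x in range(w) if any(alpha[y][x] != 0 for y in range(h))]
    top, bottom = rows[0], rows[-1]
    left, right = cols[0], cols[-1]
    # x, y, width, height of the valid small picture
    return left, top, right - left + 1, bottom - top + 1
```

The rectangle returned corresponds to the one formed by the intersections of the four dashed scanning lines in FIG. 4.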
  • a condition for determining whether a pixel is valid is whether the Alpha channel of its color value is 0.
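That condition can be written as a minimal check, assuming 32-bit RGBA color values with the alpha channel in the low byte (0xRRGGBBAA); a different channel order would shift the mask accordingly.

```python
def is_valid_pixel(rgba):
    # A pixel whose color value is 0x00000000 (RGBA) is invalid; the
    # determining factor is its Alpha channel (the low byte here) being 0.
    return (rgba & 0xFF) != 0
```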
  • step S102 each parsed-out valid small picture is synthesized into a large picture according to a preset rule.
  • the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule includes:
  • a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
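One hypothetical packing satisfying this rule places the small pictures, in serial-number order, into a single row of cells sized by the maximum width and height, so they cannot overlap. Any other non-overlapping arrangement would serve equally well.

```python
def pack_small_pictures(sizes):
    """Pack small pictures, given as (width, height) pairs in serial-number
    order, into a large picture and record each picture's position.
    """
    max_w = max(w for w, _ in sizes)
    max_h = max(h for _, h in sizes)
    placements = []
    for serial, (w, h) in enumerate(sizes):
        # Each small picture occupies its own cell of the large picture
        placements.append({"serial": serial, "x": serial * max_w, "y": 0,
                           "width": w, "height": h})
    large_size = (max_w * len(sizes), max_h)
    return large_size, placements
```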
  • any solution shall fall within the protection scope of the present invention as long as the valid small pictures can be arranged to form a large picture while the valid small pictures do not overlap each other.
  • synthesizing small pictures into a large picture can reduce the number of files of 2D animation data.
  • step S103 sequence frame data is generated according to related attribute information of each valid small picture in the large picture.
  • the related attribute information of the small picture in the large picture mainly includes, but is not limited to, the following content: a serial number of the small picture, coordinate information of the small picture in the large picture, and the width and height of the small picture.
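For illustration, sequence frame data built from those attributes might be serialized as below; the JSON layout and the `atlas` field name are hypothetical, not prescribed by the method.

```python
import json

# Hypothetical sequence frame data: one entry per valid small picture,
# carrying its serial number, coordinates in the large picture, and size.
sequence_frame_data = json.dumps({
    "atlas": "walk_cycle.png",  # assumed name of the synthesized large picture
    "frames": [
        {"serial": 0, "x": 0,  "y": 0, "width": 10, "height": 20},
        {"serial": 1, "x": 10, "y": 0, "width": 8,  "height": 16},
    ],
})
```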
  • sequence frame data is modular data used in animation data.
  • the step of generating sequence frame data according to related attribute information of each valid small picture in the large picture includes: generating the sequence frame data out of the coordinate information of the small picture in the large picture, the width and height of the small picture, and the serial number of the small picture.
  • step S104 2D animation data is generated out of the sequence frame data.
  • the step of generating 2D animation data out of the sequence frame data includes:
  • a tool may be used to organize the sequence frame data as animation data corresponding to game development.
  • the sequence frame data needs to be organized according to a data structure of a corresponding game animation.
  • an animation effect is achieved by playing the sequence frames of the animation frame by frame.
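The frame-by-frame playback can be sketched as follows, where `draw` is a hypothetical callback that blits one small picture's region of the large picture to the screen.

```python
import time

def play(frames, draw, fps=24):
    """Achieve the animation effect by drawing each sequence frame in turn
    at a fixed frame rate."""
    for frame in frames:
        draw(frame)            # e.g. copy the frame's (x, y, w, h) region
        time.sleep(1.0 / fps)  # hold the frame until the next tick
```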
  • the foregoing animation data generating method can automatically generate 2D animation data from 3D-rendered sequence frames, which eliminates much of the time spent on art modification of a 2D animation and significantly lowers the art-skill requirement.
  • FIG. 5 is a schematic structural diagram of an animation data generating apparatus according to an embodiment of the present invention.
  • the animation data generating apparatus includes: a scanning module 101, a large picture synthesizing module 102, a sequence frame data generating module 103, and an animation data generating module 104.
  • the animation data generating apparatus may be a software unit embedded in the electronic device, a hardware unit, or a unit combining hardware and software.
  • the scanning module 101 is configured to scan sequence frame pictures exported from a 3D model, to parse out valid small pictures.
  • the large picture synthesizing module 102 is configured to synthesize each parsed-out valid small picture into a large picture according to a preset rule.
  • the sequence frame data generating module 103 is configured to generate sequence frame data according to related attribute information of each valid small picture in the large picture.
  • sequence frame data is modular data used in animation data.
  • the animation data generating module 104 is configured to generate 2D animation data out of the sequence frame data.
  • the sequence frame picture exported from the 3D model is generally a rectangular picture, and a valid picture of each frame is located in the middle of the frame.
  • Picture data actually used in an animation is the picture in the middle, and therefore, useless peripheral pixels of the rectangular picture can be removed by using a picture pixel scanning technology, thereby parsing out a valid small picture.
  • the scanning module 101 is specifically configured to remove all invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels, where pixels whose data color values are 0x00000000 (RGBA) are defined as invalid pixels, and pixels whose data color values are not 0x00000000 (RGBA) are defined as valid pixels.
  • the scanning module 101 is specifically configured to scan, one by one along a preset direction (for example, from top to bottom, from bottom to top, from left to right, or from right to left), pixels of the sequence frame pictures exported from the 3D model, until valid pixel data is scanned, and remove invalid pixels, to acquire small pictures of valid pixels.
  • a preset direction for example, from top to bottom, from bottom to top, from left to right, or from right to left
  • four dashed lines in FIG. 4 are scanning lines; the scan proceeds from top to bottom, from bottom to top, from left to right, and from right to left separately, until the scanning line in each direction meets at least one valid pixel, and then the scan in that direction is stopped.
  • the rectangle with intersections of the four dashed lines as four corners is the valid picture obtained by means of scan.
  • a condition for determining whether a pixel is valid is whether the Alpha channel of its color value is 0.
  • the animation data generating apparatus further includes:
  • a serial number setting module configured to set a serial number for each valid small picture
  • the large picture synthesizing module 102 is further configured to arrange each parsed-out valid small picture according to a sequence of the serial numbers to form a large picture, the small pictures not overlapping each other.
  • the large picture synthesizing module 102 is specifically configured to arrange each parsed-out valid small picture according to a scanning sequence to form a large picture, the small pictures not overlapping each other.
  • the large picture synthesizing module 102 is specifically configured to arrange the parsed-out valid small pictures according to sequence frame numbers to form a large picture, the small pictures not overlapping each other.
  • a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
  • any solution shall fall within the protection scope of the present invention as long as the valid small pictures can be arranged to form a large picture while the valid small pictures do not overlap each other.
  • synthesizing small pictures into a large picture can reduce the number of files of 2D animation data.
  • the animation data generating apparatus further includes: an acquiring module.
  • the acquiring module is configured to acquire, from attributes of the large picture, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture.
  • the sequence frame data generating module 103 is further configured to generate the sequence frame data out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
  • the foregoing animation data generating apparatus can automatically generate 2D animation data from 3D-rendered sequence frames, which eliminates much of the time spent on art modification of a 2D animation and significantly lowers the art-skill requirement.
  • pixels of sequence frame pictures exported from a 3D model are scanned in four directions, namely, from top to bottom, from bottom to top, from left to right, and from right to left, until the scanning line in each direction meets at least one valid pixel, and then the scan in that direction is stopped.
  • Invalid pixels are removed from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels; a rectangle with intersections of the four dashed lines in FIG. 4 as four corners is a valid picture obtained by scan.
  • a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
  • a picture pixel scanning technology is used to scan pixels of sequence frame pictures exported from a 3D model and invalid pixels are removed, thereby parsing out valid small pictures, and these parsed-out valid small pictures are combined into a large picture; sequence frame data is generated according to related attribute information of each valid small picture in the large picture, and finally, 2D animation data is automatically generated out of the sequence frame data.
  • the present disclosure thus automates the generation of 2D animation data from 3D-rendered sequence frames, which saves a large amount of time otherwise spent on art modification of a 2D animation and significantly lowers the art-skill requirement.
  • the program may be stored in a computer readable storage medium.
  • the storage medium may be a ROM/RAM, a magnetic disk, an optical disc, or the like.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An animation data generating method, apparatus, and electronic device are provided. The method includes the following steps: scanning sequence frame pictures exported from a 3D model, to parse out valid small pictures (S101); synthesizing each parsed-out valid small picture into a large picture according to a preset rule (S102); generating sequence frame data according to related attribute information of each valid small picture in the large picture (S103); and generating 2D animation data out of the sequence frame data (S104).

Description

ANIMATION DATA GENERATING METHOD, APPARATUS, AND ELECTRONIC DEVICE
FIELD OF THE TECHNOLOGY
The present disclosure relates to the field of animation data processing technologies, and in particular, to an animation data generating method, apparatus, and electronic device.
BACKGROUND OF THE DISCLOSURE
At present, animation production is classified into 2D animation production and 3D animation production. Mainstream 3D animation production software on the market includes 3DS Max, Maya, and the like. Although the foregoing animation production software can be used to produce desirable 3D animations (which is implemented by creating a 3D model), it requires professional animation production staff, and the production process is complex and time-consuming.
Current 2D animation production technical solutions generally include two implementation schemes: one scheme is hand-drawn animation, in which sequence actions are drawn by hand frame by frame to produce animation; in another scheme, sequence frames exported from an existing 3D model are used as materials for producing 2D animation. For example, a 3D model is used to export sequence frames, and the exported sequence frames are subject to manual art modification. Specifically, the 3D model is used to produce maps and actions, the maps and actions are exported as sequence images frame by frame, the sequence images are subject to art modification frame by frame, and finally, a 2D animation is produced.
Although the foregoing 2D animation production process is relatively simple, the two existing technical solutions described above still consume a lot of manpower and time. Hand-drawn animation consumes a large amount of art manpower, and has a high requirement on a hand drawing capability; and the manner of exporting sequence frames by using a 3D model and performing manual art modification on the exported sequence frames has lower requirements on the hand drawing capability and time, but it still requires a high hand drawing capability and consumes a lot of time.
SUMMARY
An animation data generating method, including the following steps:
scanning sequence frame pictures exported from a 3D model, to parse out valid small pictures;
synthesizing each parsed-out valid small picture into a large picture according to a preset rule;
generating sequence frame data according to related attribute information of each valid small picture in the large picture; and
generating 2D animation data out of the sequence frame data.
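Under the hypothetical assumption that each exported frame is available as a 2D grid of alpha values (real inputs would be image files), the four claimed steps can be sketched end-to-end in Python:

```python
def generate_animation_data(frames):
    """Steps S101-S104 in miniature: crop each frame to its valid small
    picture, pack the small pictures into one large picture, and emit the
    sequence frame data that drives the 2D animation."""
    smalls = []
    for alpha in frames:  # S101: parse out valid small pictures
        h, w = len(alpha), len(alpha[0])
        rows = [y for y in range(h) if any(alpha[y])]
        cols = [x for x in range(w) if any(alpha[y][x] for y in range(h))]
        smalls.append((cols[-1] - cols[0] + 1, rows[-1] - rows[0] + 1))
    max_w = max(w for w, _ in smalls)  # S102: synthesize the large picture
    data = [{"serial": i, "x": i * max_w, "y": 0, "width": w, "height": h}
            for i, (w, h) in enumerate(smalls)]  # S103: sequence frame data
    return {"frames": data}  # S104: the resulting 2D animation data
```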
An animation data generating apparatus, including:
a scanning module, configured to scan sequence frame pictures exported from a 3D model, to parse out valid small pictures;
a large picture synthesizing module, configured to synthesize each parsed-out valid small picture into a large picture according to a preset rule;
a sequence frame data generating module, configured to generate sequence frame data according to related attribute information of each valid small picture in the large picture; and
an animation data generating module, configured to generate 2D animation data out of the sequence frame data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic structural diagram of a working environment of an electronic device in which an animation data generating apparatus according to an embodiment of the present invention is located;
FIG. 2 is a schematic flowchart of an animation data generating method according to an embodiment of the present invention;
FIG. 3 and FIG. 4 are schematic diagrams of valid pictures obtained by scanning sequence frame pictures exported from a 3D model according to an embodiment of the present invention; and
FIG. 5 is a schematic structural diagram of an animation data generating apparatus according to an embodiment of the present invention.
DESCRIPTION OF EMBODIMENTS
To make the objectives, technical solutions and beneficial effects of the present disclosure clearer, the present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely for illustrating the present disclosure but not intended to limit the present disclosure.
In the following description, unless otherwise stated, the embodiments of the present disclosure are described with reference to steps and symbolic representations of operations executed by one or more computers. Therefore, it can be understood that such steps and operations, which are at times referred to as being computer-executed, include the manipulation by the computer's processing unit of electrical signals representing data in a structured form. This manipulation converts the data or maintains it at a location in the computer's memory system, which reconfigures or otherwise changes the operation of the computer in a manner well understood by a person skilled in the art. The data structures in which the data is maintained are physical locations of the memory that have particular properties defined by the format of the data. However, although the present disclosure is described in the foregoing context, this is not meant to be limiting. As a person skilled in the art will understand, the steps and operations described below may also be implemented in hardware.
As used in the present application, the terms "component," "module," "system" and the like are likewise intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, an apparatus, or an article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the method, apparatus, or article of manufacture of the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or access media. Of course, a person skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
FIG. 1 and the following discussion are intended to provide a brief, general description of a working environment of an electronic device in which the animation data generating apparatus according to the present disclosure is located. The working environment of FIG. 1 is merely an instance of a suitable working environment but is not intended to suggest any limitation on the scope of use or functionality of the working environment. An instance electronic device 112 includes, but is not limited to, a personal computer, a server computer, a hand-held or laptop computer, a mobile device (such as a mobile phone, a personal digital assistant (PDA), or a media player), a multi-processor system, a consumer electronic device, a minicomputer, a mainframe computer, a distributed computing environment including any of the foregoing systems or devices, and the like.
Though not required, the embodiments are described in the general context of "computer readable instructions" being executed by one or more electronic devices. The computer readable instructions are distributed by using a computer readable medium (discussed below). The computer readable instructions may be implemented as program modules, for example, functions, objects, application programming interfaces (APIs), or data structures for executing particular tasks or implementing particular abstract data types. Typically, the functions of the computer readable instructions may be combined or distributed as desired in various environments.
FIG. 1 shows an instance of an electronic device 112 including one or more embodiments of the animation data generating apparatus of the present disclosure. In one configuration, the electronic device 112 includes at least one processor unit 116 and a memory 118. According to the precise configuration and type of the electronic device, the memory 118 may be a volatile memory (for example, a random access memory (RAM)), a non-volatile memory (for example, a read-only memory (ROM) or a flash memory), or a combination thereof. The configuration is shown by a dashed line 114 in FIG. 1.
In other embodiments, the electronic device 112 may include an additional feature and/or function. For example, the device 112 may further include an additional storage apparatus (for example, removable and/or non-removable), which includes, but is not limited to, a magnetic storage apparatus, an optical storage apparatus, and the like. Such additional storage apparatus is represented by a storage apparatus 120 in FIG. 1. In one embodiment, the computer readable instruction for implementing one or more embodiments provided herein may be stored in the storage apparatus 120. The storage apparatus 120 may further store other computer readable instructions for implementing an operating system, an application program, and the like. The computer readable instruction may be loaded into the memory 118 and executed by, for example, the processing unit 116.
The technical term "computer readable medium" used herein includes a computer storage medium. The computer storage medium includes volatile and non-volatile, removable and non-removable media implemented by any method or technology used for storing a computer readable instruction or other information such as data. The memory 118 and the storage apparatus 120 are instances of the computer storage medium. The computer storage medium includes, but is not limited to, a RAM, a ROM, an electrically erasable programmable read-only memory (EEPROM), a flash memory or other memory technologies; a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or other optical storage apparatuses; a cassette tape, a magnetic tape, a magnetic disk storage apparatus or other magnetic storage devices; or any other medium that can be used to store the expected information and can be accessed by the electronic device 112. Any such computer storage medium may be a part of the electronic device 112.
The electronic device 112 may further include a communication connection 126 that allows the electronic device 112 to communicate with another device. The communication connection 126 may include, but is not limited to, a modem, a network interface card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a universal serial bus (USB) connection, or another interface for connecting the electronic device 112 to another electronic device. The communication connection 126 may include a wired connection or a wireless connection. The communication connection 126 may transmit and/or receive a communication medium.
The term "computer readable medium" may include the communication medium. The communication medium typically includes a computer readable instruction or other data in a "modulated data signal" such as a carrier wave or other transmission mechanisms, and includes any information transport medium. The term "modulated data signal" may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
The electronic device 112 may include an input device 124, such as a keyboard, a mouse, a pen, a voice input device, a touch input device, an infrared camera, a video input device, and/or any other input device. The device 112 may also include an output device 122, for example, one or more displays, loudspeakers, and printers, and/or any other output device. The input device 124 and the output device 122 may be connected to the electronic device 112 by a wired connection, a wireless connection, or any combination thereof. In an embodiment, the input device or output device from another electronic device may be used as the input device 124 or the output device 122 of the electronic device 112.
Components of the electronic device 112 may be connected by various interconnects (such as a bus). Such interconnects may include a peripheral component interconnect (PCI) (such as a fast PCI), a USB, a fire wire (IEEE 1394), an optical bus structure, and the like. In other embodiments, the components of the electronic device 112 may be interconnected through a network. For example, the memory 118 may consist of multiple physical memory units located at different physical positions and interconnected through a network.
A person skilled in the art may realize that, a storage device for storing the computer readable instruction may be distributed across networks. For example, an electronic device 130 that can be accessed through a network 128 may store the computer readable instruction for implementing one or more embodiments provided by the present disclosure. The electronic device 112 may access the electronic device 130 and download all or a part of the computer readable instruction for execution. Alternatively, the electronic device 112 may download multiple computer readable instructions as required, or some instructions may be executed at the electronic device 112 and some instructions may be executed at the electronic device 130.
Various operations of embodiments are provided herein. In one embodiment, one or more of the operations described may constitute computer readable instructions stored on one or more computer readable media, which, if executed by a computing device, will cause the computing device to perform the operations described. The order in which some or all of the operations are described is not to be construed to imply that these operations are necessarily order dependent. Alternative ordering will be appreciated by a person skilled in the art having the benefit of this description. Further, it will be understood that not all operations are necessarily present in each embodiment provided herein.
Moreover, the word "exemplary" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "exemplary" is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word "exemplary" is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances.
Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to a person skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure. In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms "include", "have", "contain", or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprise".
In the embodiments of the present invention, a picture pixel scanning technology is used to scan pixels of sequence frame pictures exported from a 3D model and invalid pixels are removed, thereby parsing out valid small pictures, and these parsed-out valid small pictures are combined into a large picture; sequence frame data is generated according to related attribute information of each valid small picture in the large picture, and finally, 2D animation data is automatically generated out of the sequence frame data. With the foregoing technical solution, the present disclosure achieves an automatic process of rendering 2D animation data in a 3D manner, which saves a lot of time spent on art modification of a 2D animation, and significantly lowers the art requirement.
Referring to FIG. 2, FIG. 2 is an implementation process of an animation data generating method according to an embodiment of the present invention, and the method includes the following steps:
In step S101, sequence frame pictures exported from a 3D model are scanned, to parse out valid small pictures.
In this embodiment of the present invention, an existing 3D model may be used to export sequence frame pictures. FIG. 3 shows frame-by-frame pictures exported from the 3D model, and each frame is subject to the processing shown in FIG. 4 (FIG. 4 only shows the processing of the leftmost picture in FIG. 3). A sequence frame picture exported from the 3D model is generally a square picture, and the valid picture of each frame is located in the middle of the frame. The picture data actually used in an animation is the picture in the middle; therefore, the useless peripheral pixels of the square picture can be removed by using a picture pixel scanning technology, thereby parsing out a valid small picture. Preferably, in this step, normalization processing may be performed on all the parsed-out valid small pictures, that is, the size parameters of all the valid small pictures are set to a same value, to facilitate subsequently synthesizing the valid small pictures into a large picture.
In this embodiment of the present invention, the step of scanning sequence frame pictures exported from a 3D model to parse out valid small pictures includes:
removing all invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels, where pixels whose data color values are 0x00000000 (RGBA) are defined as invalid pixels, and pixels whose data color values are not 0x00000000 (RGBA) are defined as valid pixels.
Further, pixels of the sequence frame pictures exported from the 3D model are scanned one by one along a preset direction (for example, along four directions, namely, from top to bottom, from bottom to top, from left to right, and from right to left), until valid pixel data is scanned, and invalid pixels are removed, thereby acquiring small pictures of valid pixels.
As shown in FIG. 4, the four dashed lines in FIG. 4 are scanning lines. The scan is started from top to bottom, from bottom to top, from left to right, and from right to left separately; once the scanning line in a direction meets at least one valid pixel, the scan in that direction is stopped. The rectangle whose four corners are the intersections of the four dashed lines is the valid picture obtained by the scan.
In this embodiment of the present invention, the condition for determining whether a pixel is valid is whether the Alpha channel of its color value is 0, and the formula is:
(pixelValue & 0x000000FF) != 0
When it is determined that the Alpha channel of the color value is 0, it is considered that the pixel is an invalid pixel; and when it is determined that the Alpha channel of the color value is not 0, it is considered that the pixel is a valid pixel.
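The four-direction scan and the Alpha-channel validity test described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; it assumes each pixel is stored as a 32-bit RGBA integer whose low byte is the Alpha channel, matching the formula (pixelValue & 0x000000FF) != 0, and that a frame is a 2D list of such integers.

```python
def is_valid_pixel(pixel_value):
    """A pixel is valid when its Alpha channel (the low byte) is non-zero."""
    return (pixel_value & 0x000000FF) != 0

def crop_valid_picture(frame):
    """Scan from top, bottom, left, and right until the scanning line in each
    direction meets at least one valid pixel; return the enclosed rectangle."""
    height, width = len(frame), len(frame[0])
    # Rows/columns that contain at least one valid pixel.
    rows = [y for y in range(height) if any(is_valid_pixel(p) for p in frame[y])]
    cols = [x for x in range(width)
            if any(is_valid_pixel(frame[y][x]) for y in range(height))]
    if not rows or not cols:
        return None  # the frame contains no valid pixels
    top, bottom, left, right = rows[0], rows[-1], cols[0], cols[-1]
    # The valid small picture is the rectangle between the four scanning lines.
    return [row[left:right + 1] for row in frame[top:bottom + 1]]
```

For example, a 4x4 frame whose only valid pixels sit in the middle 2x2 area yields a 2x2 small picture, mirroring how the useless peripheral pixels are removed in FIG. 4.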
In step S102, each parsed-out valid small picture is synthesized into a large picture according to a preset rule.
As an embodiment of the present invention, the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule includes:
setting a serial number for each valid small picture in advance; and
arranging each parsed-out valid small picture according to a sequence of the serial numbers to form a large picture, the small pictures not overlapping each other.
As another embodiment of the present invention, the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule includes:
arranging each parsed-out valid small picture according to a scanning sequence to form a large picture, the small pictures not overlapping each other.
As still another embodiment of the present invention, the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule includes:
arranging the parsed-out valid small pictures according to sequence frame numbers to form a large picture, the small pictures not overlapping each other.
For example, a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
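The packing step in the example above can be sketched as follows. This is a hypothetical illustration: the uniform-grid layout, the two-argument signature, and the record field names are assumptions for clarity, not the patent's exact placement rule.

```python
def synthesize_large_picture(small_pictures, columns):
    """Place each small picture (a 2D pixel grid, indexed by serial number)
    into a grid cell of an initially empty large picture, without overlap."""
    # Cell size comes from the maximum width and height of the small pictures.
    cell_h = max(len(p) for p in small_pictures)
    cell_w = max(len(p[0]) for p in small_pictures)
    rows = (len(small_pictures) + columns - 1) // columns
    # Empty large picture filled with invalid (0x00000000) pixels.
    large = [[0x00000000] * (cell_w * columns) for _ in range(cell_h * rows)]
    placements = []
    for serial, pic in enumerate(small_pictures):
        x0 = (serial % columns) * cell_w
        y0 = (serial // columns) * cell_h
        for dy, row in enumerate(pic):
            for dx, pixel in enumerate(row):
                large[y0 + dy][x0 + dx] = pixel
        # Related attribute information of this small picture in the large picture.
        placements.append({"serial": serial, "x": x0, "y": y0,
                           "width": len(pic[0]), "height": len(pic)})
    return large, placements
```

Because each small picture occupies its own grid cell, the non-overlap requirement is satisfied by construction, and the returned placement records already carry the attribute information used in the next step.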
However, it can be understood that, no matter which manner is used, any solution shall fall within the protection scope of the present invention as long as the valid small pictures can be arranged to form a large picture without overlapping each other. Synthesizing the small pictures into a large picture reduces the number of files of the 2D animation data.
In step S103, sequence frame data is generated according to related attribute information of each valid small picture in the large picture.
In this embodiment of the present invention, the related attribute information of the small picture in the large picture mainly includes, but is not limited to, the following content: a serial number of the small picture, coordinate information of the small picture in the large picture, and the width and height of the small picture.
In this embodiment of the present invention, the sequence frame data is modular data used in animation data.
In this embodiment of the present invention, the step of generating sequence frame data according to related attribute information of each valid small picture in the large picture includes:
acquiring, from attributes of the large picture, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture; and
generating the sequence frame data out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
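As a hedged illustration of the two steps above, the per-picture attribute records acquired from the large picture might be assembled into sequence frame data as a simple JSON document. The field names and the JSON container are assumptions for illustration only; an actual implementation would follow the data structure of the target game engine.

```python
import json

def generate_sequence_frame_data(records):
    """records: per-picture attributes acquired from the large picture
    (serial number, x/y coordinates, width, height). Frames are ordered
    by serial number so they play back in the original sequence."""
    frames = sorted(records, key=lambda r: r["serial"])
    return json.dumps({"frames": frames})
```

A consumer of this data can then locate each frame inside the large picture by its coordinates and size alone, which is what makes a single synthesized picture sufficient for the whole animation.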
In step S104, 2D animation data is generated out of the sequence frame data.
In this embodiment of the present invention, the step of generating 2D animation data out of the sequence frame data includes:
generating 2D animation data out of the sequence frame data according to a sequence of the serial numbers.
In this embodiment of the present invention, a tool may be used to organize the sequence frame data as animation data corresponding to game development. In an actual development process, the sequence frame data needs to be organized according to a data structure of a corresponding game animation. In the game, an animation effect is achieved by playing the sequence frames of the animation frame by frame.
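A minimal sketch of the frame-by-frame playback mentioned above, assuming a fixed frame duration and a rendering callback; the timing model and names are illustrative assumptions, not part of the patent.

```python
def play_animation(frames, draw, elapsed_time, frame_duration=1.0 / 12):
    """Select which sequence frame to draw for the given elapsed time,
    looping back to the first frame when the sequence ends."""
    index = int(elapsed_time / frame_duration) % len(frames)
    draw(frames[index])  # the game engine's render call would go here
    return index
```

Calling this once per game tick with the accumulated elapsed time steps through the sequence frames in order, which is how the animation effect is achieved in the game.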
The foregoing animation data generating method achieves an automatic process of rendering 2D animation data in a 3D manner, saves a lot of time otherwise spent on art modification of a 2D animation, and significantly lowers the art requirement.
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of an animation data generating apparatus according to an embodiment of the present invention. For ease of description, FIG. 5 only shows parts related to this embodiment of the present invention. The animation data generating apparatus includes: a scanning module 101, a large picture synthesizing module 102, a sequence frame data generating module 103, and an animation data generating module 104. The animation data generating apparatus may be a software unit embedded in the electronic device, a hardware unit, or a unit combining hardware and software.
The scanning module 101 is configured to scan sequence frame pictures exported from a 3D model, to parse out valid small pictures.
The large picture synthesizing module 102 is configured to synthesize each parsed-out valid small picture into a large picture according to a preset rule.
The sequence frame data generating module 103 is configured to generate sequence frame data according to related attribute information of each valid small picture in the large picture.
In this embodiment of the present invention, the sequence frame data is modular data used in animation data.
The animation data generating module 104 is configured to generate 2D animation data out of the sequence frame data.
In this embodiment of the present invention, as shown in FIG. 4, the sequence frame picture exported from the 3D model is generally a rectangular picture, and a valid picture of each frame is located in the middle of the frame. Picture data actually used in an animation is the picture in the middle, and therefore, useless peripheral pixels of the rectangular picture can be removed by using a picture pixel scanning technology, thereby parsing out a valid small picture.
In this embodiment of the present invention,
the scanning module 101 is specifically configured to remove all invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels, where pixels whose data color values are 0x00000000 (RGBA) are defined as invalid pixels, and pixels whose data color values are not 0x00000000 (RGBA) are defined as valid pixels.
Further, the scanning module 101 is specifically configured to scan, one by one along a preset direction (for example, from top to bottom, from bottom to top, from left to right, or from right to left), pixels of the sequence frame pictures exported from the 3D model, until valid pixel data is scanned, and remove invalid pixels, to acquire small pictures of valid pixels.
As shown in FIG. 4, the four dashed lines in FIG. 4 are scanning lines. The scan is started from top to bottom, from bottom to top, from left to right, and from right to left separately; once the scanning line in a direction meets at least one valid pixel, the scan in that direction is stopped. The rectangle whose four corners are the intersections of the four dashed lines is the valid picture obtained by the scan.
In this embodiment of the present invention, the condition for determining whether a pixel is valid is whether the Alpha channel of its color value is 0, and the formula is:
(pixelValue & 0x000000FF) != 0
When it is determined that the Alpha channel of the color value is 0, it is considered that the pixel is an invalid pixel; and when it is determined that the Alpha channel of the color value is not 0, it is considered that the pixel is a valid pixel.
As an embodiment of the present invention, the animation data generating apparatus further includes:
a serial number setting module, configured to set a serial number for each valid small picture;
where the large picture synthesizing module 102 is further configured to arrange each parsed-out valid small picture according to a sequence of the serial numbers to form a large picture, the small pictures not overlapping each other.
As another embodiment of the present invention,
the large picture synthesizing module 102 is specifically configured to arrange each parsed-out valid small picture according to a scanning sequence to form a large picture, the small pictures not overlapping each other.
As still another embodiment of the present invention,
the large picture synthesizing module 102 is specifically configured to arrange the parsed-out valid small pictures according to sequence frame numbers to form a large picture, the small pictures not overlapping each other.
For example, a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
However, it can be understood that, no matter which manner is used, any solution shall fall within the protection scope of the present invention as long as the valid small pictures can be arranged to form a large picture without overlapping each other. Synthesizing the small pictures into a large picture reduces the number of files of the 2D animation data.
As an embodiment of the present invention, the animation data generating apparatus further includes: an acquiring module.
The acquiring module is configured to acquire, from attributes of the large picture, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture.
The sequence frame data generating module 103 is further configured to generate the sequence frame data out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
The foregoing animation data generating apparatus achieves an automatic process of rendering 2D animation data in a 3D manner, saves a lot of time otherwise spent on art modification of a 2D animation, and significantly lowers the art requirement.
An implementation process of the animation data generating method provided by the embodiment of the present invention is described in detail below.
First, the pixels of the sequence frame pictures exported from a 3D model are scanned in four directions, namely, from top to bottom, from bottom to top, from left to right, and from right to left; once the scanning line in a direction meets at least one valid pixel, the scan in that direction is stopped. Invalid pixels are removed from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels; the rectangle whose four corners are the intersections of the four dashed lines in FIG. 4 is the valid picture obtained by the scan.
Then, a large rectangle is formed according to a maximum length and a maximum width of each valid small picture, an empty picture having an area equivalent to that of the large rectangle is created, a serial number is set for each valid small picture, and then these small pictures are placed at corresponding positions of the empty picture according to the serial numbers. In this way, a large picture including all the valid small pictures is formed.
Then, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture are acquired from attributes of the large picture; and sequence frame data is generated out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
Finally, a tool is used to organize the sequence frame data as animation data corresponding to game development.
In conclusion, in the present disclosure, a picture pixel scanning technology is used to scan pixels of sequence frame pictures exported from a 3D model and invalid pixels are removed, thereby parsing out valid small pictures, and these parsed-out valid small pictures are combined into a large picture; sequence frame data is generated according to related attribute information of each valid small picture in the large picture, and finally, 2D animation data is automatically generated out of the sequence frame data. With the foregoing technical solution, the present disclosure achieves an automatic process of rendering 2D animation data in a 3D manner, which saves a lot of time spent on art modification of a 2D animation, and significantly lowers the art requirement.
A person of ordinary skill in the art should understand that, all of or a part of processes in the method according to the embodiment may be implemented by a program instructing relevant hardware. The program may be stored in a computer readable storage medium. The storage medium may be a ROM/RAM, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely preferred embodiments of the present invention, but are not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (14)

  1. An animation data generating method, comprising the following steps:
    scanning sequence frame pictures exported from a 3D model, to parse out valid small pictures;
    synthesizing each parsed-out valid small picture into a large picture according to a preset rule;
    generating sequence frame data according to related attribute information of each valid small picture in the large picture; and
    generating 2D animation data out of the sequence frame data.
  2. The animation data generating method according to claim 1, wherein the step of scanning sequence frame pictures exported from a 3D model to parse out valid small pictures comprises:
    removing invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels.
  3. The animation data generating method according to claim 2, wherein the step of removing invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels comprises:
    scanning, one by one in a preset direction, pixels of the sequence frame pictures exported from the 3D model, until valid pixel data is scanned, and removing invalid pixels, to acquire the small pictures of valid pixels.
  4. The animation data generating method according to claim 1, wherein the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule comprises:
    setting a serial number for each valid small picture in advance; and
    arranging each parsed-out valid small picture according to a sequence of the serial numbers to form a large picture, the small pictures not overlapping each other.
  5. The animation data generating method according to claim 1, wherein the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule comprises:
    arranging each parsed-out valid small picture according to a scanning sequence to form a large picture, the small pictures not overlapping each other.
  6. The animation data generating method according to claim 1, wherein the step of synthesizing each parsed-out valid small picture into a large picture according to a preset rule comprises:
    arranging the parsed-out valid small pictures according to sequence frame numbers to form a large picture, the small pictures not overlapping each other.
  7. The animation data generating method according to claim 1, wherein the step of generating sequence frame data according to related attribute information of each valid small picture in the large picture comprises:
    acquiring, from attributes of the large picture, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture; and
    generating the sequence frame data out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
  8. An animation data generating apparatus, comprising:
    a scanning module, configured to scan sequence frame pictures exported from a 3D model, to parse out valid small pictures;
    a large picture synthesizing module, configured to synthesize each parsed-out valid small picture into a large picture according to a preset rule;
    a sequence frame data generating module, configured to generate sequence frame data according to related attribute information of each valid small picture in the large picture; and
    an animation data generating module, configured to generate 2D animation data out of the sequence frame data.
  9. The animation data generating apparatus according to claim 8, wherein,
    the scanning module is specifically configured to remove invalid pixels from the scanned sequence frame pictures exported from the 3D model, to obtain small pictures of valid pixels.
  10. The animation data generating apparatus according to claim 9, wherein,
    the scanning module is specifically configured to scan, one by one in a preset direction, pixels of the sequence frame pictures exported from the 3D model, until valid pixel data is scanned, and remove invalid pixels, to acquire the small pictures of valid pixels.
  11. The animation data generating apparatus according to claim 8, further comprising:
    a serial number setting module, configured to set a serial number for each valid small picture;
    wherein the large picture synthesizing module is further configured to arrange each parsed-out valid small picture according to a sequence of the serial numbers to form a large picture, the small pictures not overlapping each other.
  12. The animation data generating apparatus according to claim 8, wherein,
    the large picture synthesizing module is specifically configured to arrange each parsed-out valid small picture according to a scanning sequence to form a large picture, the small pictures not overlapping each other.
  13. The animation data generating apparatus according to claim 8, wherein,
    the large picture synthesizing module is specifically configured to arrange the parsed-out valid small pictures according to sequence frame numbers to form a large picture, the small pictures not overlapping each other.
  14. The animation data generating apparatus according to claim 8, further comprising: an acquiring module;
    wherein the acquiring module is configured to acquire, from attributes of the large picture, coordinate information of the small picture in the large picture, width information and height information of the small picture, and a serial number of the small picture; and
    the sequence frame data generating module is further configured to generate the sequence frame data out of the coordinate information, the width information and height information of the small picture, and the serial number of the small picture.
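As a minimal sketch (again not the patented implementation) of the synthesis and attribute steps in claims 4 to 7, the trimmed small pictures can be laid out in a large picture without overlap, and the per-picture records that the claims call "sequence frame data" emitted at the same time. The shelf (row-by-row) layout, the `max_width` parameter, and the dictionary field names are assumptions introduced for the example.

```python
# Hypothetical sketch: arrange small pictures of sizes (serial, width,
# height) left to right in non-overlapping rows ("shelves") of a large
# picture, and record each picture's coordinates, size, and serial number.

def pack_frames(small_pictures, max_width=1024):
    x = y = shelf_height = 0
    sequence_frame_data = []
    for serial, w, h in small_pictures:
        if x + w > max_width:           # current row is full: start a new shelf
            x, y = 0, y + shelf_height
            shelf_height = 0
        sequence_frame_data.append(
            {"serial": serial, "x": x, "y": y, "width": w, "height": h})
        x += w
        shelf_height = max(shelf_height, h)
    return sequence_frame_data
```

A 2D engine can then cut each frame back out of the large picture using these records and play them by serial number to reproduce the animation.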
PCT/CN2015/078279 2014-05-07 2015-05-05 Animation data generating method, apparatus, and electronic device WO2015169209A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020167027583A KR101810019B1 (en) 2014-05-07 2015-05-05 Animation data generating method, apparatus, and electronic device
MYPI2016703808A MY185734A (en) 2014-05-07 2015-05-05 Animation data generating method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410190376.3A CN105096364B (en) 2014-05-07 2014-05-07 Animation data generation method and device and electronic equipment
CN201410190376.3 2014-05-07

Publications (1)

Publication Number Publication Date
WO2015169209A1 true WO2015169209A1 (en) 2015-11-12

Family

ID=54392150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/078279 WO2015169209A1 (en) 2014-05-07 2015-05-05 Animation data generating method, apparatus, and electronic device

Country Status (4)

Country Link
KR (1) KR101810019B1 (en)
CN (1) CN105096364B (en)
MY (1) MY185734A (en)
WO (1) WO2015169209A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108022276A (en) * 2016-11-01 2018-05-11 北京星辰美豆文化传播有限公司 A kind of 3-D cartoon rendering method, device and electronic equipment
CN106652000A (en) * 2016-12-22 2017-05-10 新乡学院 Animation data generation device, system and method
CN107403460B (en) * 2017-07-11 2021-07-06 北京龙之心科技有限公司 Animation generation method and device
CN109934898A (en) * 2019-03-12 2019-06-25 湖南城市学院 Artistic design application platform and data information processing method based on new media platform
CN112132932B (en) * 2020-09-02 2021-04-27 苏州好玩友网络科技有限公司 Automatic sequence diagram generation method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090079743A1 (en) * 2007-09-20 2009-03-26 Flowplay, Inc. Displaying animation of graphic object in environments lacking 3d rendering capability
CN101520889A (en) * 2008-07-09 2009-09-02 殷宁淳 Method for panoramically displaying articles at multiple angles with multiple static images and device for collecting static images
CN101814190A (en) * 2010-04-28 2010-08-25 邹玉杰 Animation production system and method
CN101908223A (en) * 2009-06-04 2010-12-08 曹立宏 Technology for revealing actions and expressions of 2.5D (2.5 Dimensional) virtual characters
CN102622391A (en) * 2011-10-18 2012-08-01 北京小米科技有限责任公司 Method for processing small pictures

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2216781C2 (en) * 2001-06-29 2003-11-20 Самсунг Электроникс Ко., Лтд Image-based method for presenting and visualizing three-dimensional object and method for presenting and visualizing animated object
KR101701519B1 (en) * 2011-01-20 2017-02-03 중앙대학교 산학협력단 Apparatus and method for generating line animation from video for consistency between image frames

Also Published As

Publication number Publication date
CN105096364B (en) 2021-06-11
KR20160130455A (en) 2016-11-11
KR101810019B1 (en) 2017-12-18
CN105096364A (en) 2015-11-25
MY185734A (en) 2021-06-02

Similar Documents

Publication Publication Date Title
WO2015169209A1 (en) Animation data generating method, apparatus, and electronic device
WO2018094814A1 (en) Video synthesizing method and device
WO2017028601A1 (en) Voice control method and device for intelligent terminal, and television system
WO2017041538A1 (en) Terminal user interface controlled display method and device
WO2013117125A1 (en) Method and apparatus for adding friend, and storage medium
WO2018205545A1 (en) Data generation method, apparatus, terminal, and computer-readable storage medium
WO2016165556A1 (en) Data processing method, device and system for video stream
WO2017152603A1 (en) Display method and apparatus
WO2018120430A1 (en) Page construction method, terminal, computer-readable storage medium and page construction device
WO2016169521A1 (en) Electronic chart display method and system
WO2017059686A1 (en) Desktop displaying method and device
WO2017088318A1 (en) User interface display processing method and device
WO2017084301A1 (en) Audio data playing method and apparatus, and smart television
WO2015144052A1 (en) Method and apparatus for collecting statistics on network information
WO2014048239A1 (en) Smart television playing method and smart television
WO2017107384A1 (en) Image display method of liquid crystal display, and liquid crystal display
WO2019080401A1 (en) Method and apparatus for converting script statement, and computer-readable storage medium
WO2019156408A1 (en) Electronic device and operation method thereof
WO2017063366A1 (en) Method and system for starting application
WO2016029502A1 (en) Signal source switching method and device
WO2014089971A1 (en) Realization method and device of mobile terminal for supporting switching of left-hand mode and right-hand mode
WO2019024472A1 (en) Data operation method and device and computer readable storage medium
WO2019061042A1 (en) Exposure compensation method, device and computer readable storage medium
WO2018171659A1 (en) Image loading method and apparatus, device and computer-readable storage medium
WO2022186443A1 (en) Method and device for correcting image on basis of compression quality of image in electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15788627

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20167027583

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29.03.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15788627

Country of ref document: EP

Kind code of ref document: A1