CN106815880B - Animation multiplexing method and system - Google Patents


Info

Publication number
CN106815880B
Authority
CN
China
Prior art keywords
file
animation
attribute
node
frame data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510864263.1A
Other languages
Chinese (zh)
Other versions
CN106815880A (en)
Inventor
陈昊芝
刘冠群
张晓龙
谢鑫
范力
张�成
刘北辰
刘关强
朱亮
郭建强
肖峰
张东猛
韩东涛
郭伦昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Yaji Software Co Ltd
Original Assignee
Xiamen Yaji Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen Yaji Software Co Ltd filed Critical Xiamen Yaji Software Co Ltd
Priority to CN201510864263.1A priority Critical patent/CN106815880B/en
Publication of CN106815880A publication Critical patent/CN106815880A/en
Application granted granted Critical
Publication of CN106815880B publication Critical patent/CN106815880B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an animation multiplexing method and system. The animation multiplexing method comprises the following steps: acquiring an animation file to be processed that is selected by a user, and recording it as a first file; acquiring a previously processed animation file having the same attribute as the first file, and recording it as a second file; and applying the time axis of the second file to the first file. The invention synchronizes different animation files on the basis of their attributes; the animation files of different animated characters can be shared as long as the attributes are the same, which eliminates a large amount of repeated labor and realizes animation multiplexing simply and conveniently.

Description

Animation multiplexing method and system
Technical Field
The invention relates to the field of computer application, in particular to a method and a system for multiplexing animation.
Background
Animation exploits the persistence of human vision: a series of frames with slightly different content is created and displayed in sequence at very short intervals, each new frame appearing before the previous one has faded from perception, so that a smooth visual change is produced. An animated character in a game likewise has various actions, and a corresponding time axis must be written for each of them.
CN101079154A, published on 2007-11-28, discloses a character animation implementation method comprising the following steps: (A) establishing one or more virtual nodes, each bound to character skeleton data; (B) generating a time axis for the virtual node according to the time axis of the bound character skeleton; (C) binding the time axis of an attached object (hook) to the time axis of the virtual node and playing the attached animation. That scheme also provides a corresponding character animation implementation system. By using virtual nodes that consume essentially no resources, the skeleton, mesh and attached-object data of the character animation are kept independent of one another, and the time axis of the attached object is hooked onto the virtual node. Attached objects hooked onto a virtual node can therefore be replaced flexibly, which reduces the occupation of system resources and improves the reusability of character animation.
CN102254335A, published on 2011-11-23, discloses a game character editing system comprising an animation editing module and a skill editing module. The animation editing module integrates a plurality of animation resources into candidate resources and, according to the game logic, selects animation resources of the same category from the candidates for fusion; the skill editing module provides visual trigger-event editing and trajectory editing for the animation resources. That scheme also discloses a game character editing method. It improves the efficiency of creating and editing game characters, saves computer resources, reduces the load on the computer, accelerates processing, and lowers energy consumption.
At present, a time axis written for one animation (for example, a person shrinking) is difficult to reuse for another animation (for example, a monster shrinking) in game development. Developers therefore have to write similar code repeatedly for different animations, which wastes time and labor; once the code contains an error, the nested animation cannot run normally, and development efficiency suffers.
Disclosure of Invention
The invention provides an animation multiplexing method and system to improve animation development efficiency.
The object of the invention is achieved by the following technical scheme:
a method of animation multiplexing, comprising:
acquiring an animation file to be processed that is selected by a user, and recording the animation file as a first file;
acquiring a previously processed animation file having the same attribute as the first file, and recording the animation file as a second file;
applying the time axis of the second file to the first file.
Further, the first file and the second file are csd files written in XML.
Further, the csd file includes object data and the time axis; the object data corresponds to at least one of the attributes.
Further, the time axis comprises frame data, time data and an animation list; the animation list includes the actions of an object, and the frame data corresponds to operations on the animation.
Further, the time axis comprises a plurality of sub-axes, and each sub-axis comprises the frame data and the time data; the animation list comprises a plurality of actions; each attribute corresponds to one sub-axis. The method of applying the time axis of the second file to the first file includes: according to the mapping relationship between the attribute and the sub-axis, the first file acquires and applies the time axis of the second file corresponding to the attribute.
Further, the animation file comprises a structure tree, and the structure tree and the time axis share the same structure tree model; the animation file is mapped to the structure tree.
Further, the structure tree includes a header for playback operations on the animation file and nodes for setting the attribute.
Further, the method for applying the time axis of the second file to the first file comprises the following steps:
selecting, in the second file, the node corresponding to the attribute that needs to be applied to the first file;
synchronizing the selected node to the rendering area;
the second file obtaining the nodes that can be synchronized from the rendering area;
and establishing a mapping relationship with the corresponding time axis of the first file according to the synchronized nodes.
Further, the frame data includes attribute frame data and node frame data controlling the attribute frame data.
A system for animation multiplexing, comprising:
a first acquisition means, configured to acquire the animation file to be processed that is selected by a user and record it as a first file;
a second acquisition means, configured to acquire a previously processed animation file having the same attribute as the first file and record it as a second file;
an application means, configured to apply the time axis of the second file to the first file.
The invention synchronizes different animation files on the basis of their attributes; the animation files of different animated characters can be shared as long as the attributes are the same, which eliminates a large amount of repeated labor and realizes animation multiplexing simply and conveniently.
Drawings
FIG. 1 is a flow chart of a method for multiplexing animations according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a system for animation multiplexing according to an embodiment of the invention.
Detailed Description
As shown in FIG. 1, the present invention discloses a method for multiplexing animation, comprising:
S11, acquiring the animation file to be processed that is selected by the user and recording it as a first file;
S12, acquiring a previously processed animation file having the same attribute as the first file and recording it as a second file;
S13, applying the time axis of the second file to the first file.
As shown in FIG. 2, the present invention also discloses a system for multiplexing animation, comprising:
a first acquisition means 1, configured to acquire the animation file to be processed that is selected by a user and record it as a first file;
a second acquisition means 2, configured to acquire a previously processed animation file having the same attribute as the first file and record it as a second file;
an application means 3, configured to apply the time axis of the second file to the first file.
The invention synchronizes different animation files on the basis of their attributes; the animation files of different animated characters can be shared as long as the attributes are the same, which eliminates a large amount of repeated labor and realizes animation multiplexing simply and conveniently.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The term "computer device" or "computer" in this context refers to an intelligent electronic device that can execute predetermined processes such as numerical calculation and/or logic calculation by running predetermined programs or instructions, and may include a processor and a memory, wherein the processor executes a pre-stored instruction stored in the memory to execute the predetermined processes, or the predetermined processes are executed by hardware such as ASIC, FPGA, DSP, or a combination thereof. Computer devices include, but are not limited to, servers, personal computers, laptops, tablets, smart phones, and the like.
The computer equipment comprises user equipment and network equipment. Wherein the user equipment includes but is not limited to computers, smart phones, PDAs, etc.; the network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of computers or network servers, wherein Cloud Computing is one of distributed Computing, a super virtual computer consisting of a collection of loosely coupled computers. Wherein the computer device can be operated alone to implement the invention, or can be accessed to a network and implement the invention through interoperation with other computer devices in the network. The network in which the computer device is located includes, but is not limited to, the internet, a wide area network, a metropolitan area network, a local area network, a VPN network, and the like.
It should be noted that the user equipment, the network device, the network and so on are merely examples; other existing or future computer devices or networks, where applicable to the present invention, also fall within its scope and are incorporated herein by reference.
The methods discussed below, some of which are illustrated by flow diagrams, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. The processor(s) may perform the necessary tasks.
Specific structural and functional details disclosed herein are merely representative and are provided for purposes of describing example embodiments of the present invention. The present invention may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or the figures may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
Key frame Animation (KeyFrame Animation)
Key-frame animation (KeyFrame Animation) prepares a set of time-related values for each attribute that needs an animation effect; these values are taken from the comparatively key frames of the animation sequence, and the values for the remaining frames are calculated from the key values using a specific interpolation method, thereby producing a smooth animation effect.
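As an illustration of this interpolation idea (a minimal sketch, not the patent's implementation; the frame times and values below are hypothetical), the following Python snippet linearly interpolates an attribute value between key frames:

```python
from bisect import bisect_right

def sample_keyframes(keys, t):
    """Linearly interpolate an attribute value at time t from (time, value) key frames."""
    times = [k[0] for k in keys]
    if t <= times[0]:
        return keys[0][1]
    if t >= times[-1]:
        return keys[-1][1]
    i = bisect_right(times, t)       # index of the first key frame after t
    t0, v0 = keys[i - 1]
    t1, v1 = keys[i]
    alpha = (t - t0) / (t1 - t0)     # fractional position of t between the two keys
    return v0 + alpha * (v1 - v0)

# Hypothetical key frames for a "scale" attribute.
scale_keys = [(0.0, 1.0), (0.5, 0.8), (1.0, 0.5)]
print(sample_keyframes(scale_keys, 0.75))   # 0.65
```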
Animation frame by frame
Frame-by-frame animation is a common animation form. Its principle is to decompose an animated action into "continuous key frames", that is, to draw different content frame by frame on each frame of the time axis and play the frames continuously to form the animation. Because the content of every frame in the sequence is different, frame-by-frame animation is not only laborious to produce but also yields a large output file; its advantages, however, are obvious: it is extremely flexible and can represent almost anything, in a manner similar to film playback, which makes it well suited to fine-grained animation, for example a character's or animal's sharp turns, the waving of hair and clothing, walking, speaking, and delicate 3D effects.
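For contrast with key-frame animation, here is a toy frame-by-frame playback loop (the drawing function and frame names are placeholders, not part of the patent): every frame carries its own fully drawn content and nothing is interpolated.

```python
import itertools
import time

def draw(frame):
    print("drawing", frame)          # placeholder for the real rendering call

def play_frame_by_frame(frames, fps=24, loops=1):
    """Frame-by-frame playback: each frame is drawn in full, none are interpolated."""
    interval = 1.0 / fps
    for frame in itertools.chain.from_iterable(itertools.repeat(frames, loops)):
        draw(frame)
        time.sleep(interval)

# Hypothetical hand-drawn frames of a character making a sharp turn.
play_frame_by_frame(["turn_01.png", "turn_02.png", "turn_03.png"], fps=12)
```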
The invention is further described with reference to the drawings and the preferred embodiments.
A method of animation multiplexing, comprising:
acquiring an animation file to be processed that is selected by a user, and recording the animation file as a first file;
acquiring a previously processed animation file having the same attribute as the first file, and recording the animation file as a second file;
applying the time axis of the second file to the first file.
The file for each animation is a csd file written in XML. The csd file includes object data and a time axis (which may also be referred to as animation data). The object corresponds to at least one attribute (for example, zoom). The time axis includes frame data corresponding to operations on the animation such as playing or stopping, time data, and an animation list representing the actions of the object (running, jumping, and so on). The object data corresponds to at least one of the attributes.
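The patent does not reproduce the csd schema, so the element and attribute names in the sketch below are hypothetical; it only illustrates the idea of an XML animation file that carries object data plus a time axis (animation data) with one timeline per attribute, parsed here with Python's standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical csd-like content: an object with a "Scale" attribute and a time axis
# holding one sub-axis (Timeline) per attribute.
CSD_TEXT = """
<GameFile>
  <ObjectData Name="Hero">
    <Scale ScaleX="1.0" ScaleY="1.0"/>
  </ObjectData>
  <Animation Duration="60" Speed="1.0">
    <Timeline Property="Scale">
      <Frame FrameIndex="0"  X="1.0" Y="1.0"/>
      <Frame FrameIndex="60" X="0.5" Y="0.5"/>
    </Timeline>
  </Animation>
</GameFile>
"""

root = ET.fromstring(CSD_TEXT)
timelines = {tl.get("Property"): tl for tl in root.find("Animation").findall("Timeline")}
print(sorted(timelines))   # ['Scale'] -- one sub-axis per attribute
```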
The time axis comprises a plurality of sub-axes, and each sub-axis comprises the frame data and the time data; the animation list comprises a plurality of actions; each attribute corresponds to one sub-axis. The method of applying the time axis of the second file to the first file is as follows: according to the mapping relationship between the attribute and the sub-axis, the first file acquires and applies the time axis of the second file corresponding to the attribute. The frame data includes attribute frame data and node frame data that controls the attribute frame data.
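A minimal sketch of that mapping step, under the assumption that each animation file is modelled as a dictionary from attribute name to its sub-axis (frame data plus time data); these data structures are illustrative, not the patent's own:

```python
# Each animation file is modelled as {attribute name: sub-axis}, where a sub-axis
# bundles the frame data and time data for that attribute.
second_file = {
    "scale":    {"frames": [(0, 1.0), (60, 0.5)], "duration": 60},
    "rotation": {"frames": [(0, 0.0), (60, 90.0)], "duration": 60},
}
first_file = {
    "scale": {"frames": [], "duration": 0},   # same attribute, no timeline written yet
}

def apply_timeline(first, second):
    """Copy the second file's sub-axis onto every attribute that the first file shares."""
    for attribute, sub_axis in second.items():
        if attribute in first:                 # attributes must match for reuse
            first[attribute] = dict(sub_axis)  # reuse the shared sub-axis
    return first

apply_timeline(first_file, second_file)
print(first_file["scale"]["frames"])   # [(0, 1.0), (60, 0.5)]
```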
The animation file comprises a structure tree, and the structure tree and the time axis share the same structure tree model; the animation file is mapped to the structure tree.
The data in the structure tree is divided into two types: a node type and an attribute type.
The StructTreeModel provides a function that determines whether deleted data should still be displayed on the interface; when a node's attributes are expanded, the attribute values are displayed, and when they are collapsed, the attributes are hidden.
The StructTreeModel caches the mapping between nodes and TreeIter objects, which is used to look up quickly the structure-tree path corresponding to a node.
The StructTreeModel represents the structure-tree model and is responsible for operating on the model and caching the nodes.
The structure tree shows the structure of the node tree and provides methods for operating on the node tree.
The structure tree includes a header for playback operations on the animation file and nodes for setting the attribute.
The structure tree is divided into an upper part and a lower part: the upper part is the header of the structure tree, and the lower part displays the node-tree structure. The header is used to operate on the playback of the animation, such as starting/stopping playback and skipping to the first/last frame, and is controlled by the header (TreeHeader) class. The node-tree part provides operations on the nodes, such as hiding/showing a node, locking/unlocking a node, displaying node attributes, and dragging nodes to change the parent-child structure.
The main classes of the structure tree and their functions are summarized in a table in the original filing; the table is embedded there as an image and is not reproduced here.
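The patent names the StructTreeModel and TreeHeader classes but does not specify their interfaces, so the following Python sketch is an assumption; it only illustrates a tree model that caches the node-to-path mapping for fast lookup and a header that drives playback.

```python
class StructTreeModel:
    """Simplified structure-tree model: caches node -> path (a stand-in for TreeIter)."""

    def __init__(self, root):
        self.root = root
        self._path_cache = {}               # id(node) -> tuple of child indexes
        self._index(root, ())

    def _index(self, node, path):
        self._path_cache[id(node)] = path
        for i, child in enumerate(node.get("children", [])):
            self._index(child, path + (i,))

    def path_of(self, node):
        return self._path_cache[id(node)]   # cached, no tree walk needed


class TreeHeader:
    """Header part of the structure tree: playback controls for the animation."""

    def __init__(self, frame_count):
        self.frame_count, self.current, self.playing = frame_count, 0, False

    def play(self):
        self.playing = True

    def stop(self):
        self.playing = False

    def goto_first(self):
        self.current = 0

    def goto_last(self):
        self.current = self.frame_count - 1


arm = {"name": "arm", "children": []}
hero = {"name": "Hero", "children": [arm]}
model = StructTreeModel(hero)
print(model.path_of(arm))   # (0,)
```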
The invention establishes various mapping relationships through the structure-tree model and the structure tree, so that the time axis of the same attribute can be shared between different animation files without programming. Specifically, the method comprises the following steps:
the method of applying the time axis of the second file to the first file includes:
selecting a node corresponding to the attribute needing to be applied to the first file in the second file;
synchronizing the selected node to the rendering zone;
the second file obtains nodes which can be synchronized from the rendering area;
and establishing a mapping relation with a time shaft corresponding to the first file according to the synchronized node.
The structure tree needs to synchronize its selection with the rendering area, i.e. to synchronize the currently selected nodes. When a node is selected in the structure tree, a message must be sent to notify the rendering area to refresh its selection; when the selection in the rendering area changes, a message must be sent to notify the structure tree to refresh its selection. To avoid sending messages in a loop, whether a message was initiated by the user must be checked in order to decide whether the message needs to be processed.
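A minimal sketch of this two-way synchronization under an assumed in-process interface (the patent does not give one); the guard flag plays the role of the "was this message sent by the user?" check that keeps the two sides from notifying each other in an endless loop.

```python
class SelectionSync:
    """Keeps the structure-tree and rendering-area selections in step without message loops."""

    def __init__(self):
        self.tree_selection = set()
        self.render_selection = set()
        self._from_user = True              # False while we are forwarding our own message

    def select_in_tree(self, nodes):
        self.tree_selection = set(nodes)
        if self._from_user:                 # only user-initiated changes are forwarded
            self._from_user = False
            self.select_in_render(nodes)    # notify the rendering area to refresh
            self._from_user = True

    def select_in_render(self, nodes):
        self.render_selection = set(nodes)
        if self._from_user:
            self._from_user = False
            self.select_in_tree(nodes)      # notify the structure tree to refresh
            self._from_user = True


sync = SelectionSync()
sync.select_in_tree({"arm"})
print(sync.render_selection)   # {'arm'} -- mirrored once, without infinite recursion
```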
This embodiment also provides an editing panel for the time axis.
The timeline panel provides frame-data editing functions, including deleting, adding, moving, and copying/pasting frame data. There are two types of frame data: node frame data and attribute frame data.
Node frame data is a virtual type whose purpose is to make it convenient to control attribute frame data; one item of node frame data can contain multiple items of attribute frame data. Operating on node frame data is equivalent to operating on all of its attribute frame data.
Node frame data is created dynamically: when attribute frame data is created, the corresponding node frame data is created automatically; when node frame data is created manually, a frame is created for every displayed attribute frame of the node. Because node frame data is virtual, it is not exported into the data.
Attribute frame data is a real type that corresponds to the value of a node attribute in that frame. Attribute frame data can be exported into the data. A node has multiple attributes, and an animation is displayed only when attribute frame data exists on the time axis.
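A sketch of these two frame-data types under assumed class names (the patent does not define an API): the node frame is virtual, groups the attribute frames of a node, and forwards every operation to them; only attribute frames would be exported.

```python
class AttributeFrame:
    """Real frame data: the value of one node attribute at one frame index; exported."""

    def __init__(self, attribute, frame_index, value):
        self.attribute, self.frame_index, self.value = attribute, frame_index, value

    def move_to(self, frame_index):
        self.frame_index = frame_index


class NodeFrame:
    """Virtual frame data: groups a node's attribute frames at one frame index; not exported."""

    def __init__(self, frame_index):
        self.frame_index = frame_index
        self.attribute_frames = []

    def add(self, attr_frame):
        self.attribute_frames.append(attr_frame)

    def move_to(self, frame_index):
        # Operating on the node frame is equivalent to operating on all of its attribute frames.
        self.frame_index = frame_index
        for frame in self.attribute_frames:
            frame.move_to(frame_index)


node_frame = NodeFrame(0)                           # created automatically with the first attribute frame
node_frame.add(AttributeFrame("scale", 0, 1.0))
node_frame.add(AttributeFrame("rotation", 0, 0.0))
node_frame.move_to(10)
print([f.frame_index for f in node_frame.attribute_frames])   # [10, 10]
```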
In another embodiment of the present invention, a system for multiplexing animation is provided, which can implement the above method. The system comprises:
a first acquisition means, configured to acquire the animation file to be processed that is selected by a user and record it as a first file;
a second acquisition means, configured to acquire a previously processed animation file having the same attribute as the first file and record it as a second file;
an application means, configured to apply the time axis of the second file to the first file.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments, and the invention is not to be considered limited to these specific details. Those skilled in the art can make several simple deductions or substitutions without departing from the spirit of the invention, all of which shall be considered to fall within the protection scope of the invention.

Claims (9)

1. A method for multiplexing animations, comprising:
acquiring an animation file to be processed that is selected by a user, and recording the animation file as a first file;
acquiring a previously processed animation file having the same attribute as the first file, and recording the animation file as a second file;
applying the time axis of the second file to the first file, wherein the time axis comprises frame data, time data and an animation list, the time axis comprises a plurality of sub-axes, each attribute corresponds to one sub-axis, and, according to the mapping relationship between the attributes and the sub-axes, the first file acquires and applies the time axis of the second file corresponding to the attributes, which specifically comprises the following steps:
selecting, in the second file, the node corresponding to the attribute that needs to be applied to the first file;
synchronizing the selected node to the rendering area;
the second file obtaining the nodes that can be synchronized from the rendering area;
and establishing a mapping relationship with the corresponding time axis of the first file according to the synchronized nodes.
2. The method of animation multiplexing according to claim 1, wherein the first file and the second file are csd files written in XML.
3. The method of animation multiplexing according to claim 2, wherein the csd file includes object data and the timeline; the object data corresponds to at least one of the attributes.
4. The method of claim 3, wherein the animation list includes an action of an object, and the frame data corresponds to an operation of an animation.
5. The method of claim 4, wherein the time axis comprises a plurality of sub-axes, each sub-axis comprising the frame data and the time data; the animation list includes a plurality of actions.
6. The animation multiplexing method according to claim 5, wherein the animation file comprises a structure tree, and the structure tree and a time axis share the same structure tree model; the animation file is mapped with the structure tree.
7. The animation multiplexing method according to claim 6, wherein the structure tree includes a header for a play operation of an animation file and a node for setting the attribute.
8. The method of multiplexing animation according to claim 4, wherein the frame data includes attribute frame data and node frame data controlling the attribute frame data.
9. A system for animation multiplexing, comprising:
a first acquisition means, configured to acquire an animation file to be processed that is selected by a user and record it as a first file;
a second acquisition means, configured to acquire a previously processed animation file having the same attribute as the first file and record it as a second file;
an application means, configured to apply the time axis of the second file corresponding to the attribute to the first file according to the mapping relationship between the attribute and the sub-axis, specifically by:
selecting, in the second file, the node corresponding to the attribute that needs to be applied to the first file;
synchronizing the selected node to the rendering area;
the second file obtaining the nodes that can be synchronized from the rendering area;
and establishing a mapping relationship with the corresponding time axis of the first file according to the synchronized nodes.
CN201510864263.1A 2015-12-01 2015-12-01 Animation multiplexing method and system Active CN106815880B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510864263.1A CN106815880B (en) 2015-12-01 2015-12-01 Animation multiplexing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510864263.1A CN106815880B (en) 2015-12-01 2015-12-01 Animation multiplexing method and system

Publications (2)

Publication Number Publication Date
CN106815880A CN106815880A (en) 2017-06-09
CN106815880B true CN106815880B (en) 2021-07-06

Family

ID=59107162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510864263.1A Active CN106815880B (en) 2015-12-01 2015-12-01 Animation multiplexing method and system

Country Status (1)

Country Link
CN (1) CN106815880B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110300047B (en) * 2018-03-23 2021-10-08 腾讯科技(深圳)有限公司 Animation playing method and device and storage medium
CN109658484A (en) * 2018-12-21 2019-04-19 上海哔哩哔哩科技有限公司 A kind of Automatic Generation of Computer Animation method and Automatic Generation of Computer Animation system
CN112150587A (en) * 2019-06-11 2020-12-29 腾讯科技(深圳)有限公司 Animation data encoding method, animation data decoding method, animation data encoding apparatus, animation data decoding apparatus, storage medium, and computer device
CN112686981B (en) 2019-10-17 2024-04-12 华为终端有限公司 Picture rendering method and device, electronic equipment and storage medium
CN111897615A (en) * 2020-08-06 2020-11-06 福建天晴在线互动科技有限公司 Method and system for realizing animation effect editing in interface
CN113546415B (en) * 2021-08-11 2024-03-29 北京字跳网络技术有限公司 Scenario animation playing method, scenario animation generating method, terminal, device and equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102521860A (en) * 2011-11-16 2012-06-27 戚军 Skeletal animation implementation method
CN102708583A (en) * 2012-05-02 2012-10-03 厦门大学 Automatic match method of two-dimensional animation characters
CN103116903A (en) * 2013-03-21 2013-05-22 厦门大学 Redirection method of two-dimensional animation role actions

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cocos入门-使用Cocos编辑器编辑游戏资源 (Getting Started with Cocos: Editing Game Resources with the Cocos Editor); audience; https://gameinstitute.qq.com/community/detail/101456; 2015-08-24; pp. 1-13 *

Also Published As

Publication number Publication date
CN106815880A (en) 2017-06-09

Similar Documents

Publication Publication Date Title
CN106815880B (en) Animation multiplexing method and system
Du et al. Zero latency: Real-time synchronization of BIM data in virtual reality for collaborative decision-making
US20190132642A1 (en) Video processing method, video processing device, and storage medium
CN109901894B (en) Progress bar image generation method and device and storage medium
RU2420806C2 (en) Smooth transitions between animations
US8918758B2 (en) Systems and methods for storing object and action data during media content development
US8113959B2 (en) Method and system for rendering the scenes of a role playing game in a metaverse
CN110096277A (en) A kind of dynamic page methods of exhibiting, device, electronic equipment and storage medium
US11069109B2 (en) Seamless representation of video and geometry
EP4130978A1 (en) System and method for streamlining user interface development
KR20080107444A (en) Two dimensional trees to edit graph-like diagrams
US9588651B1 (en) Multiple virtual environments
CN107315580A (en) Component processing method, device and the equipment of user interface, computer-readable recording medium
Ohlenburg et al. The MORGAN framework: enabling dynamic multi-user AR and VR projects
Zhu et al. An object-oriented framework for medical image registration, fusion, and visualization
Ueno et al. 2-scene comic creating system based on the distribution of picture state transition
WO2019006937A1 (en) Virtual gift presentation method and apparatus, server, and storage medium
CN112446948B (en) Virtual reality courseware processing method and device, electronic equipment and storage medium
WO2018049682A1 (en) Virtual 3d scene production method and related device
US11095956B2 (en) Method and system for delivering an interactive video
CN116843802A (en) Virtual image processing method and related product
CN110990104B (en) Texture rendering method and device based on Unity3D
KR20180047200A (en) Apparatus for producting sprite graphic and method for using the same
CN107589978B (en) Method and device for refreshing page in Flash
CN106331834B (en) Multimedia data processing method and equipment thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20171213

Address after: 200436 room 287K, No. 668, Baoshan District Road, Shanghai

Applicant after: Shanghai touch technology development Co., Ltd.

Address before: 100102 No. 1, Wangjing SOHO tower, No. 1, Wangjing East Street, Chaoyang District, Beijing

Applicant before: BEIJING CHUKONG TECHNOLOGY CO., LTD.

TA01 Transfer of patent application right
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200903

Address after: Unit 3, unit 607, 6 / F, Chuangye building, 1302 Jimei Avenue, phase III, Xiamen Software Park, Fujian Province

Applicant after: XIAMEN YAJI SOFTWARE Co.,Ltd.

Address before: 200436 room 287K, No. 668, Baoshan District Road, Shanghai

Applicant before: Shanghai touch technology development Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant