CN115167940A - 3D file loading method and device - Google Patents

3D file loading method and device

Info

Publication number
CN115167940A
Authority
CN
China
Prior art keywords
file
loading
loaded
type
attribute
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210813992.4A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd filed Critical Shanghai Bilibili Technology Co Ltd
Priority to CN202210813992.4A priority Critical patent/CN115167940A/en
Publication of CN115167940A publication Critical patent/CN115167940A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/103 Formatting, i.e. changing of presentation of documents
    • G06F40/106 Display of layout of documents; Previewing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005 Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5011 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals
    • G06F9/5016 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals the resource being the memory
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a 3D file loading method, comprising: determining the target file type of a file to be loaded in a 3D file; in response to the target file type being a first file type, loading the file to be loaded in a multithreaded parallel loading mode; and in response to the target file type being a second file type, loading the file to be loaded in a recursive loading mode. The file loading technical scheme provided by the application has the following advantage: to optimize file loading speed and reduce memory occupation, different loading modes are adopted for different file types, adapting to different usage scenarios and achieving a better loading effect.

Description

3D file loading method and device
Technical Field
The embodiment of the application relates to the field of computers, in particular to a 3D file loading method and device, computer equipment and a computer readable storage medium.
Background
With the development of computer technology, three-dimensional graphics are increasingly popular with users. Three-dimensional model formats have therefore been proposed and widely applied in scenarios such as live streaming and games, enabling a variety of three-dimensional visual designs. Taking a current game program as an example, to create a better game experience for the user, the game program is often developed based on a 3D scene; when the user runs the program to play, the 3D scene is rendered and presented through the hardware device, presenting rich and detailed game content.
Currently, when loading the 3D file of an application program, all resources of the whole 3D file are usually loaded. This process takes a long time, so the user's waiting time is too long and the user experience suffers.
Disclosure of Invention
An object of the embodiments of the present application is to provide a 3D file loading method, apparatus, computer device and computer readable storage medium, which are used to solve the above problems.
One aspect of the present embodiment provides an application framework-based 3D file loading method, including:
determining the target file type of a file to be loaded in the 3D file;
in response to the target file type being a first file type, loading the file to be loaded by adopting a multithreading parallel loading mode;
and in response to the target file type being the second file type, loading the file to be loaded by adopting a recursive loading mode.
Optionally, the loading the file to be loaded in response to the target file type being the first file type by using a multithreading parallel loading mode includes:
and executing a loading task in parallel through multiple threads to load a plurality of resources of the file to be loaded into the memory at one time.
Optionally, the loading the file to be loaded by adopting a recursive loading mode in response to the target file type being the second file type includes:
performing frame-by-frame segmentation on the loading task through asynchronous task processing to obtain subtasks of each frame;
and asynchronously and gradually loading part of resources in the file to be loaded according to the sequence of each frame and the subtasks of each frame.
Optionally, the determining the target file type of a file to be loaded in the 3D file includes:
determining the resource category of the file to be loaded, wherein the resource category corresponds to the format of the file to be loaded; and
determining the target file type according to the resource category of the file to be loaded, so as to select a loading mode.
Optionally, the resource categories include character models, scenes, and GLTF; the determining the target file type according to the resource category of the file to be loaded includes:
in response to the resource category being a character model, determining that the target file type is the first file type; and
in response to the resource category being a scene or a GLTF, determining that the target file type is the second file type.
Optionally, the method further includes:
determining default scenes of a plurality of scenes under the condition that the file to be loaded corresponds to the plurality of scenes;
loading resources required by the default scene by adopting a recursive loading mode;
and under the condition that other scenes of the plurality of scenes are needed, loading resources needed by the corresponding other scenes according to the use requirements.
Optionally, the determining a target file type of a file to be loaded in the 3D file includes:
determining the file size of the file to be loaded; and
determining the target file type according to the file size of the file to be loaded, so as to select a loading mode.
An aspect of an embodiment of the present application further provides a 3D file loading apparatus, which includes:
The determining module is used for determining the target file type of a file to be loaded in the 3D file;
the first response module is used for responding to the fact that the target file type is the first file type, and loading the file to be loaded in a multithreading parallel loading mode;
and the second response module is used for loading, in response to the target file type being the second file type, the file to be loaded by adopting a recursive loading mode.
An aspect of the embodiments of the present application further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor is configured to implement the steps of the 3D file loading method as described above when executing the computer program.
An aspect of embodiments of the present application further provides a computer-readable storage medium, in which a computer program is stored, the computer program being executable by at least one processor to cause the at least one processor to perform the steps of the 3D file loading method as described above.
The file loading method, the file loading device, the file loading equipment and the computer readable storage medium provided by the embodiment of the application have the following advantages: in order to optimize the file loading speed and reduce the occupation of the memory, different loading modes are adopted for different file types, so that different use scenes are adapted, a more optimized loading effect is realized, and the user experience is improved.
Drawings
FIG. 1 is a diagram schematically illustrating an application environment of a 3D file loading method according to an embodiment of the present application;
fig. 2 schematically shows a flowchart of a 3D file loading method according to a first embodiment of the present application;
FIG. 3 is a sub-flowchart illustrating step S200 of FIG. 2;
FIG. 4 is a sub-flowchart illustrating step S302 of FIG. 3;
FIG. 5 is another sub-flowchart of step S200 in FIG. 2;
FIG. 6 is a sub-flowchart illustrating step S202 of FIG. 2;
FIG. 7 is a sub-flowchart illustrating step S204 of FIG. 2;
FIG. 8 is another sub-flowchart of step S204 of FIG. 2;
FIG. 9 is a block diagram schematically illustrating a 3D file loading apparatus according to a second embodiment of the present application;
fig. 10 schematically illustrates a hardware architecture diagram of a computer device suitable for implementing a 3D file loading method according to a third embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be noted that the descriptions relating to "first", "second", etc. in the embodiments of the present application are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, technical solutions between various embodiments may be combined with each other, but must be realized by a person skilled in the art, and when the technical solutions are contradictory or cannot be realized, such a combination should not be considered to exist, and is not within the protection scope of the present application.
In the description of the present application, it should be understood that the numerical references before the steps do not identify the sequence of executing the steps, but merely serve to facilitate the description of the present application and to distinguish each step, and thus should not be construed as limiting the present application.
In order to facilitate those skilled in the art to understand the technical solutions provided in the embodiments of the present application, the following description is provided for the related technologies:
Several 3D file formats are currently known: FBX, DAZ, USD, AssetBundle, Pak, MMD, VRM, and the like.
FBX, DAZ, USD, etc. formats: these cannot be loaded at runtime. When used with a game engine, intermediate data must be generated in advance for runtime rendering, and the file cannot be sent directly to a user terminal as a distribution carrier. These formats are therefore better suited as production tools than as consumption carriers; they serve only as productivity media in professional fields and are unsuitable as consumption carriers.
AssetBundle, Pak, etc. formats: these are strongly bound to the engine version; an engine upgrade forces all resources to be repackaged, making them unsuitable for products centered on player creation. They are also strongly tied to the operating system: resource packages for different platforms are not interchangeable and must be generated separately. They cannot be distributed or traded as independent resources, and so cannot carry the value of a virtual asset; they cannot be exported at runtime, modified for re-creation, or reused.
MMD (MikuMikuDance) format: intended for 3D animated film scenes; projects can only be exported as video from a dedicated tool, with commercial licensing restrictions, and no ecosystem supports its use in games or by VTubers (virtual YouTubers, virtual UP hosts).
The VRM format: used for virtual live streaming and social VR games, but it contains only character data and cannot be extended to larger usage scenarios. Its rendering effect is limited and regionally constrained: for example, mouth-shape (lip-sync) adaptation covers only Japanese, and shaders support only MToon (a cartoon shader with global illumination), Unlit (a shader for unlit materials), and PBR (Physically Based Rendering). Its extension flexibility is also poor: for example, animations and scene loading are not supported, and third parties cannot extend its functions, which hinders the development of VTubing.
As mentioned above, each of the 3D file formats listed has certain limitations. To let players create 3D scenes with a high degree of freedom and share or trade them, with use unaffected by technical factors such as operating system, tool type, and tool version, the application provides a new file format. The format is independent of operating system, tool type, and version; it is easy to use, create, and modify; and it is convenient to load and export at runtime.
The new file format (target format) includes the original specification of the GLTF format and develops its functions in the Extensions and Extras fields. It is compatible with existing GLTF files and preserves the JSON Schema of standard GLTF, so files can still be opened and modified by other tools. The preview capability of conventional GLTF tools is retained, so non-dedicated tools keep a degree of preview and editing capability; the file keeps a minimal data structure, with default data supporting its fields. Heavily multiplexed data need not be stored in the Extras field, while highly general and reusable data is stored in Extensions. To optimize file loading speed and reduce memory occupation, two different loading mechanisms are provided to adapt to different usage scenarios.
The application aims to provide different loading mechanisms for optimizing the file loading speed and reducing the occupation of a memory based on the configured new 3D format, so as to adapt to different use scenes and realize a more optimized loading effect.
The following are explanations of terms of the present application:
A 3D (three-dimensional) image is one of the image file types used to store information about a three-dimensional model. 3D images include three-dimensional models, 3D animations, and 3D project files. A 3D image may include model information consisting of polygons and vertices in three-dimensional space as interpreted by three-dimensional software, possibly including color, texture, geometry, light sources, shadows, and other information. 3D image file formats may be used in VR, 3D printing, games, film special effects, architecture, medicine, and other related scenarios.
GLTF (Graphics Language Transmission Format): a three-dimensional computer graphics format and standard that supports the storage of three-dimensional models, appearances, scenes, and animations. It is a streamlined, interoperable format for 3D assets (Assets) that minimizes file size and the processing difficulty for applications. A GLTF asset is a JSON file supported by external data. Specifically, a GLTF asset contains a JSON-format file (.gltf) for the complete scene description: descriptor information for the node hierarchy, materials, cameras, meshes, animations, and other constructs; a binary file (.bin) containing geometry and animation data and other buffer-based data; and textures (.jpg, .png). The 3D objects in a scene are defined using meshes (Mesh) attached to nodes. Materials define the appearance of objects. Animations describe how 3D objects change over time. Skins define how an object's geometry deforms based on skeletal poses. Cameras describe the renderer's view configuration.
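The asset layout described above can be illustrated with a minimal scene description. The structure follows the public glTF 2.0 specification; the concrete values below are illustrative only (sketched in Python, since the .gltf file itself is plain JSON):

```python
import json

# A minimal glTF 2.0 scene description (illustrative values).
# "scene" selects the default scene; each scene references nodes,
# and a node may reference a mesh that defines a 3D object.
minimal_gltf = {
    "asset": {"version": "2.0"},   # required by the glTF spec
    "scene": 0,                    # index of the default scene
    "scenes": [{"nodes": [0]}],
    "nodes": [{"mesh": 0}],
    "meshes": [{"primitives": [{"attributes": {"POSITION": 0}}]}],
}

# The .gltf file is this structure serialized as JSON.
gltf_text = json.dumps(minimal_gltf, indent=2)
```

Written to a file with a `.gltf` extension (plus the referenced `.bin` buffers and textures), such a document is what conventional GLTF tools preview and edit.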
Resource: may include pictures, shaders, textures, models, animations, etc.
Material: a data set for the renderer to read, representing how an object interacts with light; it includes maps, illumination algorithms, and the like.
Texture (Texture), a regular, repeatable bitmap, is the basic unit of data input.
Map: a map includes a texture plus much other information, such as texture coordinate sets and map input/output controls. Maps take various forms, such as light maps, environment maps, and reflection maps. A light map is used to simulate the lighting effect on an object's surface. An environment map includes six textures and the corresponding texture coordinate sets.
Texture mapping (Texture mapping), which maps a Texture to the surface of a three-dimensional object by a set of coordinates, such as UV coordinates.
AssetBundle: a file storage format supported by Unity and the resource storage and update approach officially recommended by Unity; it can compress, package, and dynamically load resources (Assets), and supports hot updates.
FBX: the format used by the FilmBox software, which was later renamed MotionBuilder. FBX can be used to interchange models, materials, motion, and camera information between software such as 3ds Max, Maya, and Softimage.
DAZ: is a file format of a 3D scene created by the modeling program DAZ Studio.
USD (Universal Scene Description): a file format provided by Pixar based on its full animated-film production pipeline.
VRM (Virtual Reality Modeling): a virtual 3D humanoid model format.
Avatar: a humanoid 3D character model.
Metaverse: the meta universe, or called the afterspace, the universe in shape, the hyper-space, and the virtual space, is a network of 3D virtual worlds focused on social links. The metastic universe may relate to a persisted and decentralized online three-dimensional virtual environment.
The game engine: the core components of an editable computer game system or interactive real-time graphics application. These systems provide game designers with the tools needed to build games, with the goal of letting designers make games easily and quickly without starting from scratch. Most engines support multiple operating platforms, such as Linux, macOS, and Microsoft Windows. A game engine typically comprises the following systems: a rendering engine (the "renderer", including 2D and 3D graphics engines), a physics engine, a collision detection system, sound effects, a scripting engine, computer animation, artificial intelligence, a network engine, and scene management.
The technical solutions provided by the embodiments of the present application are described below by way of exemplary application environments.
Referring to fig. 1, an application environment diagram of a 3D file loading method according to an embodiment of the present application is shown. The computer device 2 may be configured to run and process 3D files. The computer device 2 may comprise any type of computing device, such as: smart phones, tablet devices, laptop computers, virtual machines, and the like. The computer device 2 may run an operating system such as Windows, Android, or iOS. In the following, the computer device 2 is used as the hardware subject to provide a number of 3D file loading schemes.
Example one
Fig. 2 schematically shows a flowchart of a 3D file loading method according to a first embodiment of the present application.
As shown in fig. 2, the 3D file loading method may include steps S200 to S204, in which:
step S200: and determining the target file type of the file to be loaded in the 3D file.
Step S202: and in response to the target file type being the first file type, loading the file to be loaded by adopting a multithreading parallel loading mode.
Step S204: and in response to the target file type being the second file type, loading the file to be loaded by adopting a recursive loading mode.
Taking a new file format compatible with GLTF as an example, a 3D file comprises various resources. The new file format includes the original specification of the GLTF format and develops its functions in the Extensions and Extras fields; it is compatible with existing GLTF files and preserves the JSON Schema of standard GLTF, so files can still be opened and modified by other tools. The preview capability of conventional GLTF tools is retained, so non-dedicated tools keep a degree of preview and editing capability; the file keeps a minimal data structure, with default data supporting its fields. Heavily multiplexed data need not be stored in the Extras field, while highly general and reusable data is stored in Extensions. This also means, however, that the target format file is associated with more resources and loads more slowly.
In the embodiment of the application, in order to optimize the file loading speed and reduce the occupation of the memory, different loading modes are adopted for different file types, so that different use scenes are adapted, and a more optimized loading effect is realized.
Each of steps S200 to S204 will be described in detail below.
Step S200: and determining the target file type of the file to be loaded in the 3D file.
A 3D file involves various resource files of different sizes and types, which leads to different display response logic (for example, whether all resources in the 3D file must be used at initial loading), different response-speed requirements, and different consumption of the computer device's computing resources.
Thus, different loading modes may be configured for different response logic, response speed requirements, required resources, etc.
Therefore, the target file type of the file to be loaded needs to be determined, and different file types correspond to different loading modes.
Several embodiments for determining the target document class are provided below.
In an alternative embodiment, as shown in fig. 3, the step 200 may include:
step S300, determining the resource type of the file to be loaded, wherein the resource type corresponds to the format of the file to be loaded; and
step S302, determining the target file type according to the resource type of the file to be loaded, so as to select a loading mode.
Taking scene loading as an example, many assets are involved, such as cameras, lighting, skyboxes, and textures; they heavily occupy the computing resources of the computer device, and loading all asset types simultaneously can cause the picture to stutter or even crash.
Taking character-model loading as an example, only a small number of assets are involved, such as some meshes, materials, and textures, but the response-speed requirement is high.
Given the above descriptions of some files to be loaded, different loading modes can be adopted to suit different usage scenarios, ensuring the loading effect while optimizing loading speed as much as possible and reducing memory occupation.
In an alternative embodiment, the resource categories include character models, scenes, GLTF.
As shown in fig. 4, step S302 may include:
step S400, in response to the resource category being a character model, determining that the target file type is the first file type; and
step S402, in response to the resource category being a scene or a GLTF, determining that the target file type is the second file type.
By distinguishing file types in this way, the smoothness and timeliness of 3D file loading can be effectively guaranteed, reducing stuttering and crashes.
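The branching of steps S400 and S402 can be sketched as follows; the category strings and the `FileType` enum are hypothetical identifiers, since the embodiment does not fix concrete names:

```python
from enum import Enum

class FileType(Enum):
    FIRST = 1   # loaded with the multithreaded parallel mode
    SECOND = 2  # loaded with the recursive (progressive) mode

def determine_target_file_type(resource_category: str) -> FileType:
    """Steps S400/S402: map the resource category to a target file type."""
    if resource_category == "character_model":
        return FileType.FIRST          # small, latency-sensitive assets
    if resource_category in ("scene", "gltf"):
        return FileType.SECOND         # large, many-asset files
    raise ValueError(f"unknown resource category: {resource_category}")
```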
In another alternative embodiment, as shown in fig. 5, the step 200 may include:
step S500, determining the file size of the file to be loaded;
step S502, determining the target file type according to the file size of the file to be loaded, so as to select a loading mode.
The loading mode can also be judged according to file size: for example, if a file-size threshold of 50 MB is defined, a file to be loaded that does not exceed 50 MB is determined to be the first file type, and a file exceeding 50 MB is determined to be the second file type.
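The size-based decision can be sketched as follows; the exact threshold value and the function name are assumptions based on the example above:

```python
# Assumed cutoff from the example in the description (treated here as 50 MB).
SIZE_THRESHOLD_BYTES = 50 * 1024 * 1024

def file_type_by_size(size_bytes: int) -> int:
    # At or below the threshold: first file type (parallel loading);
    # above the threshold: second file type (recursive/progressive loading).
    return 1 if size_bytes <= SIZE_THRESHOLD_BYTES else 2
```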
It should be noted that the above determines the target file type of the file to be loaded according to the resource category and the file size, respectively. According to actual needs, however, the target file type may also be determined according to other conditions, for example, dynamically according to the operating conditions of the computer device. In addition, the file type may be preliminarily determined according to the resource category and then finally determined according to the file size.
Step S202: and in response to the target file type being the first file type, loading the file to be loaded by adopting a multithreading parallel loading mode.
When the target file type is the first file type, this indicates that the response logic requires it, the response-speed requirement is high, or the required computing resources are small; therefore, the file to be loaded can be processed in a parallel-processing response mode.
In an alternative embodiment, as shown in FIG. 6, the implementation of multithreaded parallel loads is as follows: step S600, a loading task is executed in parallel through multiple threads, so as to load multiple resources of the file to be loaded into a memory at one time.
The plurality of threads may include a main thread and one or more sub-threads: the main thread loads and renders part of the resources in the file to be loaded, while the sub-threads load the other resources. In an exemplary application, taking an avatar model (Avatar) file as the file to be loaded, the Avatar file contains no unneeded resources, so all resources in the file can be loaded simultaneously in a multithreaded manner, with tasks executed in parallel across threads. For example, all maps can be loaded at once while sub-threads are started to load the meshes at the same time. In this mode, all resources are displayed once synchronous loading finishes. It is suitable for smaller file sizes and loads quickly.
Executing the loading task in parallel through multiple threads is not suitable for all 3D files: the main thread is easily blocked during loading, and for a large scene there is a clear risk of a blocking crash (the process may be terminated automatically after being unresponsive for a long time).
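The multithreaded parallel mode of step S600 can be sketched as follows; the pool size, function names, and string results are illustrative stand-ins for real resource I/O:

```python
from concurrent.futures import ThreadPoolExecutor

def load_resource(name: str) -> str:
    # Stand-in for real I/O such as reading a mesh, map, or texture from disk.
    return f"loaded:{name}"

def load_parallel(resources: list[str]) -> dict[str, str]:
    """Load all resources of the file at once on a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(load_resource, resources))
    return dict(zip(resources, results))
```

Because every resource is submitted at once and the pool is joined before returning, nothing is displayed until all workers finish, matching the "display after synchronous loading" behavior described above.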
Step S204: and in response to the target file type being the second file type, loading the file to be loaded by adopting a recursive loading mode.
When the target file type is the second file type, this indicates that the response logic permits it, the response-speed requirement is low, or the required computing resources are large; therefore, the file to be loaded can be processed in an asynchronous response mode.
Taking the loading of a target format file as an example, it involves many types of assets in large quantities, but most resources are not needed during initial loading. Loading all resources of the target format file would heavily occupy the computing resources of the computer device, and loading all asset types simultaneously could also cause the picture to stutter or even crash.
In an alternative embodiment, as shown in FIG. 7, recursive loading is implemented as follows: step S700, performing frame-by-frame segmentation of the loading task through asynchronous task processing, to obtain the subtasks of each frame; and step S702, asynchronously and progressively loading part of the resources in the file to be loaded according to the order of the frames and the subtasks of each frame.
Loading in a recursive mode: to support progressive loading of large scenes while reducing memory occupation, a loading mechanism is provided in which a resource is loaded only when needed, using recursive resource loading. Reading into memory begins only when the data of a given resource is used, and asynchronous task processing splits the loading task frame by frame, so the main thread is never blocked; objects can be loaded silently in the background without affecting main-thread rendering or user operations. This is suitable for loading large scenes, and because responsiveness is never lost, the risk of a crash is low. For example, when a large scene file is loaded, only the resource files actually needed are loaded, and each part is displayed as soon as it finishes loading; the picture does not stutter, and not all resources need to be loaded (in the multi-scene case, a file's scenes contain many resources that are not needed immediately), with parts displayed one by one during loading. Progressive loading does not process the various loading tasks in parallel, so the loading time of maps, meshes, and the like is linear.
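The frame-by-frame segmentation described above can be sketched as follows; the per-frame budget and the names are assumptions, and `asyncio.sleep(0)` stands in for yielding control back to the renderer between frames:

```python
import asyncio

async def load_progressively(resources: list[str], budget_per_frame: int = 2) -> list[str]:
    """Split the loading task frame by frame: each 'frame' loads a small
    batch of subtasks, then yields control so the main loop never blocks."""
    loaded = []
    for i in range(0, len(resources), budget_per_frame):
        for name in resources[i:i + budget_per_frame]:
            loaded.append(f"loaded:{name}")   # stand-in for real async I/O
        await asyncio.sleep(0)  # hand control back between frames
    return loaded
```

Each batch becomes one frame's subtask, so resources appear one by one in load order rather than all at once, which is the linear, non-parallel behavior the description attributes to progressive loading.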
In an alternative embodiment, as shown in FIG. 8, step S204 may implement recursive loading through the following steps:
step S800, determining default scenes of a plurality of scenes under the condition that the file to be loaded corresponds to the plurality of scenes;
step S802, loading the resources required by the default scene by adopting a recursive loading mode;
step S804, in the case that other scenes of the multiple scenes are needed, loading resources needed by the corresponding other scenes according to the usage requirement.
The file to be loaded may itself contain multiple scenes, and a scene ID designates one of them. For a single scene, resource loading and unloading are immediate; if the scene is needed again later, the whole file must be reloaded. When there are multiple scenes, one of them is designated as the default scene. To improve the user experience, the default scene can be loaded automatically so that it is rendered and displayed quickly.
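The scene-selection rule above can be sketched as a small helper. The field names (`scenes`, `defaultScene`) and the `pickScene` function are illustrative assumptions, not the actual schema of the target format:

```typescript
interface SceneDesc { id: number; name?: string; }

interface MultiSceneFile {
  scenes: SceneDesc[];
  defaultScene: number; // index of the scene loaded by default
}

/** Return the scene to load: the requested one if present, else the default. */
function pickScene(file: MultiSceneFile, requestedId?: number): SceneDesc {
  const found = requestedId !== undefined
    ? file.scenes.find((s) => s.id === requestedId)
    : undefined;
  return found ?? file.scenes[file.defaultScene];
}
```

When no scene ID is given, or the requested ID is unknown, the default scene is used, matching the fallback behavior the text describes.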
The target format file in the present embodiment is described below.
Newly added attributes are defined in the attribute extension fields of the target format file. The target format file conforms to a target format that is compatible with the GLTF format; the target format is obtained by defining extension field information on top of the GLTF format.
In an exemplary application, the newly added attributes include: attributes defined in the extension field that are pointed to by nodes; attributes defined in the extension field that no node points to; and/or attributes defined directly in nodes.
The GLTF format defines a plurality of elements that make up a 3D image, such as: Scene, Node, Mesh, Camera, Material, Texture, and Skin.
Scene: the scene structure describes the entry point; a scene graph is defined by referencing one or more nodes.
Node: mounted in a scene. A node may reference child nodes, a mesh, a camera, a skin describing the mesh transformation, and so on.
Mesh: describes the mesh data of a 3D object appearing in the scene.
Camera: the view-frustum configuration used for rendering the scene.
Each of these elements has one or more attributes. Attributes are used to define properties, characteristics, descriptions, etc. of the corresponding elements.
Taking a node as an example, its attribute table may include: camera, child nodes, skin, matrix, mesh, rotation quaternion, scale, position, mesh weight array, name, attribute extension field, and attribute additional field.
The target format inherits all functions and effects supported by the GLTF format and, without breaking the GLTF structure, defines its newly added attributes using the attribute extension field and the attribute additional field. In addition, fields of the target format file support default values. The format is independent of operating system, tool type, and version; it is easy to use, create, and modify, and convenient to load and export at runtime. It should be noted that, to optimize the loading speed of the target format file and reduce memory usage, two different storage mechanisms are provided to suit different usage scenarios: attribute information that does not need to be reused widely is stored in the attribute additional field, while attribute information that is general-purpose and highly reusable is stored in the attribute extension field.
In an exemplary application, the newly added attributes may include an audio file attribute, an audio behavior attribute, an expression transform attribute, a collider attribute, a humanoid skeleton attribute, a dress-up attribute, a lighting map attribute, a metadata attribute, a skeleton dynamics attribute, a post-processing attribute, a dynamic script attribute, a scene rendering attribute, a sky-box attribute, a cube map attribute, a storyline timeline attribute, a sprite attribute, a streaming media attribute, a resource variable attribute, an export attribute, and the like. Of course, other attributes supported by engines or the web may also be included to support more functionality.
The application framework in the embodiment of the present application is adapted to the target format file, which provides the following advantages:
(1) It is compatible with files in the GLTF format.
(2) It provides a number of functional extensions, so that all attributes of the target format file, including the newly added attributes, can be supported.
Each newly added attribute of the target format file is introduced below.
The attribute extension field of the target format file defines an audio file attribute;
wherein the audio file attribute provides the file information of an audio clip so that the clip can be played back.
The audio file attribute can be pointed to by a node and thus used by that node.
As shown in Table 1, the audio file attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 1
The target format file can be exported with either of two suffixes: .gltf and .glb. When exported as a standalone .gltf file, a uri is used; when exported as a .glb file, the information is stored through the bufferView field. It should be noted that more suffixes can be defined later for different export types; for example, a file containing a pure character model and a file containing a scene could be given different suffixes, with the difference serving as a functional distinction.
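The uri-versus-bufferView rule above can be sketched as a loader-side resolution step. The types and the `resolveClipSource` helper are assumptions for illustration; only the two storage locations (uri for .gltf, bufferView for .glb) come from the text:

```typescript
interface AudioClipRef {
  uri?: string;        // set when exported as a standalone .gltf
  bufferView?: number; // set when packed into a .glb
}

type ClipSource =
  | { kind: "external"; uri: string }
  | { kind: "embedded"; bufferView: number };

/** Decide where the clip's bytes live, preferring the external uri if present. */
function resolveClipSource(clip: AudioClipRef): ClipSource {
  if (clip.uri !== undefined) return { kind: "external", uri: clip.uri };
  if (clip.bufferView !== undefined) {
    return { kind: "embedded", bufferView: clip.bufferView };
  }
  throw new Error("audio clip has neither uri nor bufferView");
}
```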
The attribute extension field of the target format file defines an audio behavior attribute;
wherein the audio behavior attribute comprises one or more playback parameters for controlling playback of the audio clip.
A node can reference the audio behavior attribute in addition to referencing the audio file attribute.
As shown in Table 2, the audio behavior attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 2
The attribute extension field of the target format file defines an expression transform attribute;
wherein the expression transform attribute comprises the material information used to set mesh blend shapes and standard expression file information.
The expression transform attribute can be pointed to by a node and thus used by that node.
As shown in Table 3, the expression transform attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 3
Here, blendShapeValues defines a mapping table that records the weights of a plurality of mesh blend shapes contributing to an expression transform. materialVector4Values defines a list recording sets of four-component vector material parameters (e.g., mesh tangents or shader vectors). materialColorValues defines another list recording sets of color material parameters. materialFloatValues defines a further list containing sets of float-type material parameters.
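The role of blendShapeValues can be sketched as follows: an expression is a weighted set of mesh blend shapes, and applying the expression at a given strength scales each recorded weight. The structure below is inferred from the description of Table 3 (the table itself is an image), so all names are assumptions:

```typescript
interface BlendShapeBinding {
  meshIndex: number;  // which mesh the blend shape belongs to
  shapeIndex: number; // which blend shape on that mesh
  weight: number;     // recorded weight of this shape in the expression
}

/**
 * Evaluate an expression at the given strength (0..1), producing the final
 * per-blend-shape weights keyed by "meshIndex/shapeIndex".
 */
function evaluateExpression(
  bindings: BlendShapeBinding[],
  strength: number,
): Map<string, number> {
  const out = new Map<string, number>();
  for (const b of bindings) {
    out.set(`${b.meshIndex}/${b.shapeIndex}`, b.weight * strength);
  }
  return out;
}
```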
The attribute extension field of the target format file defines a collider attribute;
wherein the collider attribute comprises one or more parameters of a collider used to support collision interactions.
The collider attribute can be pointed to by a node and thus used by that node.
As shown in Table 4, the collider attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 4
The attribute extension field of the target format file defines a humanoid skeleton attribute;
wherein the humanoid skeleton attribute comprises the parameters of a plurality of humanoid bones together with the relationships and motion constraints among them.
The humanoid skeleton attribute can be pointed to, and thus used, by nodes, where the nodes correspond to actual humanoid skeletal points.
The humanoid skeleton attribute defines the Avatar used by a humanoid model.
Any model imported as a humanoid animation type can generate an Avatar resource, in which the information driving the character is stored.
The Avatar system tells the game engine how to recognize that a particular animated model has a humanoid layout, and which parts of the model correspond to the legs, arms, head, and body; after this step the animation data can be "reused". It should be noted that, owing to the similarity of skeletal structure between different humanoid characters, an animation can be mapped from one humanoid character to another, enabling retargeting and inverse kinematics.
As shown in Table 5, the humanoid skeleton attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 5
Here, humanBones records a plurality of joints, as well as the connection and spatial transformation relationships between individual joints (e.g., neck, head).
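The retargeting idea above can be sketched with humanBones as the bridge: because both characters name the same human bones, a per-node mapping between two different skeletons can be built. The `HumanBone` shape and `buildRetargetMap` are illustrative assumptions, not the actual humanBones schema:

```typescript
interface HumanBone {
  boneName: string;  // shared human bone name, e.g. "neck", "head"
  nodeIndex: number; // node index in this particular model's skeleton
}

/** Map source node indices to target node indices, joined on the bone name. */
function buildRetargetMap(
  source: HumanBone[],
  target: HumanBone[],
): Map<number, number> {
  const byName = new Map(target.map((b) => [b.boneName, b.nodeIndex]));
  const map = new Map<number, number>();
  for (const b of source) {
    const t = byName.get(b.boneName);
    if (t !== undefined) map.set(b.nodeIndex, t);
  }
  return map;
}
```

An animation sampled on the source skeleton can then be replayed on the target by looking up each animated node in this map.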
A node can further reference the bone change attribute in addition to referencing the humanoid skeleton attribute.
The bone change attribute (based on which the service layer provides the corresponding capability) also includes the contents shown in Table 6.
TABLE 6
The attribute extension field of the target format file defines a dress-up (reloading) attribute;
wherein the dress-up attribute comprises a list of different dress-up schemes and a material parameter list for each scheme.
The dress-up attribute can be pointed to by a node and thus used by that node.
Given an Avatar, nodes can reference/point to the dress-up attribute, thereby supporting outfit changes for characters.
The dress-up system is implemented by changing mesh visibility or the materials on a mesh.
As shown in Tables 7-9, the dress-up attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
Field | Type | Description | Required
dressUpConfigs | GLTFDress | set of dress-up schemes | yes
TABLE 7
TABLE 8
TABLE 9
Here, Table 7 is the set of dress-up schemes, Table 8 gives the information of each scheme, and Table 9 lists the changes contained in a single dress-up.
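The mechanism stated above, "changing mesh visibility or the material on the mesh", can be sketched directly. The field names are illustrative assumptions (Tables 8 and 9 are images in the original), but the operation is exactly the one described:

```typescript
interface MeshState { visible: boolean; materialId: number; }

interface DressChange {
  meshIndex: number;
  visible?: boolean;    // toggle mesh visibility
  materialId?: number;  // swap the material on the mesh
}

/** Apply one dress-up scheme by mutating the affected mesh states in place. */
function applyDressScheme(meshes: MeshState[], changes: DressChange[]): void {
  for (const c of changes) {
    const mesh = meshes[c.meshIndex];
    if (!mesh) continue; // ignore changes referencing missing meshes
    if (c.visible !== undefined) mesh.visible = c.visible;
    if (c.materialId !== undefined) mesh.materialId = c.materialId;
  }
}
```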
The attribute extension field of the target format file defines a lighting map attribute;
wherein the lighting map attribute instructs the engine to precompute the changes in surface brightness in the scene. The lighting map attribute is defined in the attribute extension field and does not need to point to other objects.
As shown in Table 10, the lighting map attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 10
Each map stores different information about the lighting of the user's scene.
For example, lightmapTextureInfo[] includes: the color of the incident light (required), the dominant direction of the incident light (required), the shadow mask of each light (required), and so on.
The attribute extension field of the target format file defines a metadata attribute;
wherein the metadata attribute includes resource description information, resource management information, legal information, and/or content reference information. The metadata attribute is defined in the attribute extension field and does not need to be pointed to by other objects.
Resource description information: used for discovery and identification; its elements may include title, abstract, author, and keywords. It describes the type, version, relationships, and other characteristics of the digital material.
Resource management information: information for managing resources, such as resource type, permissions.
Legal information: providing information about the creator, copyright owner and public license.
Content reference information: information about the content.
As shown in Table 11, the metadata attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 11
The attribute extension field of the target format file defines a skeleton dynamics attribute;
wherein the skeleton dynamics attribute supports simulating the dynamic motion of objects bound to bones.
In an exemplary application, a skirt, hair, a pendant, and the like can be simulated to follow the movement of the skeleton and body.
The attribute extension field of the target format file defines a post-processing attribute;
wherein the post-processing attribute comprises the attributes of a volume component and the attributes of the supported post-processing effects.
The post-processing attribute can be pointed to by a node and thus used by that node.
The volume component includes attributes that control how it affects the camera and how it interacts with other volumes. Post-processing is a full-screen effect for 3D rendering; it can improve the rendered result and takes little time to set up.
The attributes of a volume component are described below:
As shown in Table 12, the attributes of a volume component (based on which the service layer provides the corresponding capability) include the following information:
TABLE 12
The profile ID specifies which effect profile is used.
Whether the effect is applied globally or locally, the attribute needs to be pointed to by a node in order to serve the node that specifies the post-processing attribute.
The supported post-processing effects may include: ambient occlusion, bloom, channel mixer, chromatic aberration, color adjustments, color curves, depth of field, film grain, lens distortion, lift/gamma/gain, motion blur, Panini projection, shadows/midtones/highlights, split toning, tone mapping, vignette, and white balance.
Each post-processing effect may define a corresponding attribute in an attribute extension field.
Take vignetting as an example: vignetting refers to the darkening and/or desaturation of the image edges compared to the center. The vignette effect comprises the attributes in Table 13.
TABLE 13
The attribute extension field of the target format file defines a dynamic script attribute (based on which the service layer provides the corresponding capability);
wherein the dynamic script attribute comprises a character string for the engine to execute, so as to support interpreting and running external scripts. The dynamic script attribute is defined in the attribute extension field and does not need to be pointed to by other objects.
In an exemplary application, the character string can point to external scripts, such as Lua scripts.
Rendering events and events from input devices are received, and the script engine executes the script when the corresponding event arrives.
The events may include: the object's first frame being rendered, an object component being enabled, an object component being disabled, an object component being destroyed, the per-frame update, and a periodic time-based call made after all objects have been updated.
Furthermore, the events may also include manually triggered events, such as events triggered by: keyboard, mouse, joystick, controller, touch screen, motion-sensing hardware (such as an accelerometer or gyroscope), and VR (Virtual Reality) and AR (Augmented Reality) controllers.
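The dispatch described above can be sketched as a small event bus: the host raises lifecycle and input events, and the script engine runs whatever handlers are registered for them. The event names are paraphrased from the text, not a defined enumeration of the format:

```typescript
type ScriptEvent =
  | "firstFrame"         // object's first frame rendered
  | "componentEnabled"
  | "componentDisabled"
  | "componentDestroyed"
  | "update"             // per-frame update
  | "lateUpdate";        // periodic call after all objects have updated

class ScriptEventBus {
  private handlers = new Map<ScriptEvent, Array<() => void>>();

  /** Register a script handler for an event. */
  on(ev: ScriptEvent, fn: () => void): void {
    const list = this.handlers.get(ev) ?? [];
    list.push(fn);
    this.handlers.set(ev, list);
  }

  /** Raise an event; every registered handler runs, unknown events are no-ops. */
  emit(ev: ScriptEvent): void {
    for (const fn of this.handlers.get(ev) ?? []) fn();
  }
}
```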
The target format file defines a global scene rendering attribute in the attribute extension field, wherein the scene rendering attribute comprises one or more rendering effect parameters that affect the scene. The scene rendering attribute is defined in the attribute extension field and does not need to be pointed to by other objects. As shown in Table 14, the scene rendering attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
TABLE 14
The target format file defines a sky-box attribute in the attribute extension field, wherein the sky-box attribute instructs the engine to create an unbounded background display colored with the material it points to. The sky-box attribute is defined in the attribute extension field and does not need to point to other objects. As shown in Table 15, the sky-box attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) includes the following information:
Field | Type | Description | Required
material | id | material using the sky-box shader | yes
TABLE 15
Taking a video game level as an example: when a sky box is used, the level is enclosed in a cuboid. The sky, distant mountains, distant buildings, and other unreachable objects are projected onto the faces of the cuboid, creating the illusion of a distant 3D environment. A sky dome is the equivalent technique using a sphere or hemisphere instead of a cube.
The target format file defines a cube map attribute in the attribute extension field, wherein the cube map attribute comprises the layout of the cube map, the texture mapping, and the texture of each face. The cube map attribute is not pointed to by a node; instead, it is used within a material as a special map type. As shown in Table 16, the cube map attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) may include the following information:
TABLE 16
A cube map is a collection of six square textures representing the reflections in an environment. The six squares form the faces of an imaginary cube surrounding an object; each face represents a view along a world axis (up, down, left, right, front, back). The image type (imageType) includes layouts such as the six squares stitched into a single texture in one row or one column (aspect ratio 6:1 or 1:6).
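The row/column layouts can be sketched by computing each face's sub-rectangle within the stitched texture. The layout names and the `faceRects` helper are assumptions (the original table of layout values is an image):

```typescript
type CubeLayout = "row6x1" | "column1x6";

interface FaceRect { x: number; y: number; size: number; }

/** Compute the six square face rectangles for a stitched cube-map texture. */
function faceRects(layout: CubeLayout, width: number, height: number): FaceRect[] {
  // In a 6x1 row the face size is width/6; in a 1x6 column it is height/6.
  const size = layout === "row6x1" ? width / 6 : height / 6;
  return Array.from({ length: 6 }, (_, i) =>
    layout === "row6x1"
      ? { x: i * size, y: 0, size }
      : { x: 0, y: i * size, size },
  );
}
```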
The attribute extension field of the target format file defines a storyline timeline attribute (based on which the service layer provides the corresponding capability);
wherein the storyline timeline attribute is used to arrange tracks of objects and to create cut scenes and game sequences.
The storyline timeline attribute can be pointed to by a node and thus used by that node.
The storyline timeline attribute may include the following information:
the name of the track resource;
an animation track group, describing animation tracks;
an audio track group, describing audio tracks;
an expression transform track group (typically used for facial-expression animation), describing expression transforms;
a material float-parameter curve track group, in which the curve's output value (a float parameter) changes over time to describe material changes;
a material color-parameter curve track group, in which the curve (a Color parameter) changes over time to describe color changes;
a material parameter track group (int parameters), describing the material;
a material parameter track group (Color parameters), describing color;
a material parameter track group (Vector4 parameters), describing Vector4 values;
a texture parameter track group (Texture2D map parameters), describing textures;
whether the object is active (Boolean), describing object activation;
whether the component is active (Boolean), describing component activation;
and the length of the entire track (float), describing the track length.
All tracks include the following parameters: resource name, start time, end time, and resource ID. The resource ID specifies the index position of the data source, which may be animation, texture, audio, or other data.
The track parameters may include: track name (string, optional), start time (float, required), and end time (float, required).
The sub-track data contained in each category's track group can be represented by a generic type, describing, for example, the set of all sub-tracks under a category.
Different concrete track data classes, such as the two track groups representing animation and audio, are obtained by inheriting the generic type with a specified type argument.
The material curve parameter classes can likewise all inherit from the generic type, with fields such as: which of the renderer's multiple materials to use, whether to play in reverse after playback finishes, and the curve data.
The expression transform curve is used to smoothly blend the character's face-captured expressions.
A material float-parameter curve continuously updates a float material parameter over time and includes: the name of the material parameter to set.
A material color-parameter curve continuously updates a color material parameter over time; it inherits from the classes above and may include: the color values at the start and the end. Interpolation is performed based on time, and the color is updated every frame.
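The time-based color interpolation just described can be sketched as a clamped linear blend between the start and end colors. The RGBA representation (components in 0..1) and the names are illustrative assumptions:

```typescript
interface Rgba { r: number; g: number; b: number; a: number; }

/** Interpolate between the start and end colors by normalized time t (0..1). */
function lerpColor(start: Rgba, end: Rgba, t: number): Rgba {
  const k = Math.min(1, Math.max(0, t)); // clamp to the curve's time range
  const mix = (x: number, y: number) => x + (y - x) * k;
  return {
    r: mix(start.r, end.r),
    g: mix(start.g, end.g),
    b: mix(start.b, end.b),
    a: mix(start.a, end.a),
  };
}
```

Evaluating this every frame with t = (now - startTime) / (endTime - startTime) yields the continuously updated color the curve track describes.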
When the animation component on a designated node is obtained, only the node ID is exported; the other variables are created at load time.
When a node uses the parameters in the storyline timeline attribute, the playback behavior of the timeline may be specified; the playback parameters controlling this behavior may include: ID (describing the track name, required), whether to play automatically on loading (Boolean, optional), and whether to loop playback (Boolean, optional).
The attribute extension field of the target format file defines a sprite attribute;
wherein the sprite attribute includes a layout, a texture reference, a texture location, a bounding box, a physical shape, and/or a spatial location.
The sprite attribute can be pointed to by a node and thus used by that node.
As shown in Table 17, the sprite attribute defined in the attribute extension field (based on which the service layer provides the corresponding capability) may include the following information:
TABLE 17
A sprite (Sprite) is a two-dimensional graphics object. In a three-dimensional scene, a sprite is typically a standard texture. Textures can be combined and managed through the sprite attribute above to improve efficiency and convenience during development.
The target format file defines a streaming media attribute in the node;
wherein the streaming media attribute includes the URL (uniform resource locator) name, the URL address, and the streaming media format of the stream.
As shown in Table 18, the streaming media attribute (based on which the service layer provides the corresponding capability) may include the following information:
Field | Type | Description | Required
name | string | URL name | no
url | string | URL address | yes
mimeType | string | video format | no
alternate | List&lt;string&gt; | backup addresses | no
TABLE 18
The target format file defines a resource variable attribute in the nodes;
wherein the resource variable attribute comprises a variable type and a set of indexes pointing to reference fields, so as to support the use of resources.
As shown in Table 19, the resource variable attribute (based on which the service layer provides the corresponding capability) may include the following information:
Field | Type | Description | Required
type | enum | variable type | no
collections | List&lt;id&gt; | set of indexes pointing to reference fields | yes
TABLE 19
The resource variable attribute supports resources that are not used at the moment but may be used in the future. These resources may be, for example, textures, cube maps, materials, audio clips, animation clips, or lighting maps.
The target format file defines some non-general parameters in the attribute additional field, which is mounted under a node or object.
Non-general parameters are defined relative to general parameters: they have no global character and are updated frequently.
The target format contains, besides the normal fields, attribute extension fields (Extensions) and attribute additional fields (Extras). The normal fields of the target format are the same as those of the GLTF format, which makes the target format compatible with GLTF. The attribute additional field is used to add information that is not general-purpose: the attribute extension field is global, while the attribute additional field is local. The attribute additional field is typically mounted under a specific node or object and provides a customized functional supplement. The attributes recorded in Extras are those of components supported by only a few engines, or of frequently updated components (after such a component is updated, its attribute names may change or new fields may be added). A code generator is also provided to quickly generate code, so that users can implement customized functional supplements with the SDK (software development kit). The attribute extension field, by contrast, records highly general information. That is, the attributes recorded in the attribute extension field are more general and more reusable than those recorded in the attribute additional field.
For example, the following attribute information may be recorded in Extras:
(1) The attributes (names) of human bones.
(2) The remaining necessary camera information, to better support restoring the actual scene.
(3) Customized material information, so that it can be used by other tools.
(4) UI information.
Currently, the supported information includes components of the animation, sound, camera, light, material, physics, rendering, and other types whose information is exported; customized variables exposed for script access also support export using the code generation tool.
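The Extensions/Extras split above suggests a simple lookup rule for a loader: check the general-purpose extension field first and fall back to the local additional field. This is a sketch of one plausible precedence, not a rule stated by the patent; the key names are illustrative:

```typescript
interface NodeLike {
  extensions?: Record<string, unknown>; // global, reusable attributes
  extras?: Record<string, unknown>;     // local, frequently updated attributes
}

/** Read an attribute, preferring extensions over extras. */
function readAttribute(node: NodeLike, key: string): unknown {
  if (node.extensions && key in node.extensions) return node.extensions[key];
  if (node.extras && key in node.extras) return node.extras[key];
  return undefined;
}
```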
As an alternative embodiment, the target format file may implement customized import and export.
The target format file comprises nodes, and an export attribute is mounted under a node to extend the export functionality.
The target format file also defines import and export modes;
wherein the export mode defines the export of the provided material parameters and/or the export of the provided component parameters.
For example: specifying the type (e.g., a shader type) and defining the export items of the material parameter information.
Or: exporting as additional field information under the node, for example: specifying the component type (e.g., an animation) and the export items of the common parameter information.
As can be seen from the above, compared with the GLTF format, the target format defines a large number of new attributes to support a large number of functions and effects, as follows:
(1) It is compatible with the GLTF format, i.e., it supports the information records of Scene, Node, Mesh, Material, Texture, and so on.
(2) It supports extensions to the standard GLTF format, such as the official material extensions KHR_materials_pbrSpecularGlossiness, KHR_materials_unlit, KHR_materials_clearcoat, and the like.
(3) It supports the official function extensions of the standard GLTF format, such as light import and export.
(4) Camera import/export adds additional engine-specific data while still retaining GLTF's camera support.
(5) It supports colliders, such as: sphere, box, cylinder, and capsule colliders.
(6) It supports import/export extensions for custom material types.
(7) It supports exporting bone skinning data.
(8) It supports the mesh deformation of expression transforms, which can be used for Avatar facial-capture expressions.
(9) It supports animation, including transforms of an object's spatial state (position, rotation, scale) and expression transforms.
(10) It supports recording humanoid skeleton data, for use in general humanoid animation and motion capture.
(11) It supports dress-up (outfit changes).
(12) It supports audio.
(13) It adds URL data export.
(14) It supports streaming video playback; the URL can reference various external resources (including network files, streaming media, and local files).
(15) It supports metadata management and the like, used to decide the purposes for which a model may be used, such as whether use in mildly pornographic or violent content is allowed.
(16) It supports blended expression output.
(17) It supports a storyline timeline; based on the timeline, various animations can be mixed, including animation, sound and expression control, object visibility, material parameters, and so on.
(18) It supports sky boxes.
(19) It supports post-processing.
(20) It supports skeleton dynamics (a physical system for hair and clothing).
(21) It supports spray-paint and decal creation.
(22) It supports mesh-based text display.
(23) It supports Draco, an open-source mesh compression standard.
(24) It supports cube maps.
(25) It supports sprites, for 2D rendering or UI.
(26) It supports lighting maps.
(27) It supports an event system.
To make the advantages of the present application clearer, a comparison between the VRM format and the target format is provided below.
VRM is likewise a 3D file format developed on the basis of GLTF. A VRM file allows all supporting applications to run the same avatar data (3D model).
As a new format developed on the basis of the GLTF format, the target format has the following advantages over the VRM format:
the GLTF format is compatible, can be used on various game engines, webGL, and can be opened and edited by professional design software (such as Maya, blender, C4D and the like).
The method supports scene export, animation, multimedia, sky box, grid compression, custom material parameters, script parameters and the like, and the functionality can be continuously expanded.
The system is crossed, tools are adopted, version compatibility support is realized, one file is compatible with all devices, only Runtime needs to be owned, the influence of an engine version and target operation devices is avoided, and the method is very suitable for being used as an exchange medium to be put on shelves in stores to build ecology.
The material can be selected by oneself, has established to belong to the standard specification of oneself, and has contained the code generation instrument, can deal with quick transform demand.
The components or user-defined logic can be flexibly customized for the services, and the data can also be exported to files, for example, the application of a VR girlfriend can be put into the files and loaded by a program frame instead of independently generating the application, so that long-term service development and ecological construction are facilitated.
Details are given in Table 20 below.
[Table 20 is reproduced as images in the original publication (Figure BDA0003740357920000241, Figure BDA0003740357920000251) and is not rendered here.]
Table 20
Example two
Fig. 9 schematically shows a block diagram of a 3D file loading apparatus according to a second embodiment of the present application. The 3D file loading apparatus may be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors to implement the embodiments of the present application. A program module in the embodiments of the present application refers to a series of computer program instruction segments capable of performing specific functions; the functions of each module are described below. As shown in fig. 9, the 3D file loading apparatus 900 may include a determination module 910, a first response module 920, and a second response module 930, wherein:
a determining module 910, configured to determine a target file category of a file to be loaded in the 3D file;
a first response module 920, configured to load the file to be loaded in a multithreaded parallel loading mode in response to the target file category being a first file category;
a second response module 930, configured to load the file to be loaded in a recursive loading mode in response to the target file category being a second file category.
In an optional embodiment, the first response module is further configured to:
executing loading tasks in parallel through multiple threads, so as to load the multiple resources of the file to be loaded into memory at one time.
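The parallel loading mode described above can be sketched as follows. This is a minimal illustration, not the application's implementation: the resource names and the stand-in `load_resource` function are hypothetical, and a real loader would read mesh, texture, and animation data from disk or network.

```python
from concurrent.futures import ThreadPoolExecutor

def load_resource(name: str) -> bytes:
    # Stand-in for reading one resource (mesh, texture, animation, ...) of the file.
    return f"data:{name}".encode()

def load_parallel(resources: list[str]) -> dict[str, bytes]:
    # Execute the loading tasks in parallel across worker threads and gather
    # every resource of the file into memory in one pass.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return dict(zip(resources, pool.map(load_resource, resources)))

loaded = load_parallel(["mesh.bin", "albedo.png", "skeleton.json"])
```

Because `pool.map` preserves input order, the result can be zipped back to the resource names even though the loads run concurrently.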
In an optional embodiment, the second response module is further configured to:
splitting the loading task frame by frame through asynchronous task processing, to obtain a subtask for each frame;
asynchronously and progressively loading portions of the resources in the file to be loaded according to the frame order and each frame's subtask.
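The frame-by-frame slicing above can be sketched with a generator: each yielded batch is the subtask executed during one rendering frame, so no single frame stalls on the whole file. The resource names and the batch size of 2 are illustrative assumptions, not values from the application.

```python
def frame_subtasks(resources, per_frame=2):
    # Split the loading task frame by frame: each yielded batch is the
    # subtask to be executed during one rendering frame.
    for i in range(0, len(resources), per_frame):
        yield resources[i:i + per_frame]

def run_frames(resources):
    loaded = []
    for frame, batch in enumerate(frame_subtasks(resources)):
        # In an engine, this body would run inside that frame's update tick,
        # so only part of the file is loaded per frame.
        loaded.extend((frame, name) for name in batch)
    return loaded

order = run_frames(["scene.json", "sky.hdr", "floor.mesh", "props.mesh", "music.ogg"])
```

With five resources and two per frame, the load spreads over three frames instead of blocking one frame for the entire file.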
In an optional embodiment, the determining module is further configured to:
determining the resource type of the file to be loaded, wherein the resource type corresponds to the format of the file to be loaded; and
determining the target file type according to the resource type of the file to be loaded, so as to select a loading mode.
In an optional embodiment, the resource categories include character model, scene, and GLTF; the determining module is further configured to:
in response to the resource type being a character model, determining that the target file type is the first file type; and
determining that the target file type is the second file type in response to the resource type being a scene or GLTF.
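The category mapping just described can be sketched as a small dispatch function. The string constants are illustrative stand-ins for the application's two file categories (character models take the one-shot parallel path; scenes and GLTF take the frame-sliced recursive path).

```python
FIRST_CATEGORY = "parallel"    # one-shot multithreaded loading
SECOND_CATEGORY = "recursive"  # frame-sliced asynchronous loading

def target_category(resource_type: str) -> str:
    # Map the resource category of the file to the loading mode it selects.
    if resource_type == "character_model":
        return FIRST_CATEGORY
    if resource_type in ("scene", "gltf"):
        return SECOND_CATEGORY
    raise ValueError(f"unknown resource category: {resource_type}")
```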
In an optional embodiment, the determining module is further configured to:
determining a default scene among a plurality of scenes when the file to be loaded corresponds to the plurality of scenes;
loading resources required by the default scene in the recursive loading mode;
when another of the plurality of scenes is needed, loading the resources required by that scene on demand.
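The multi-scene behavior above, where only the default scene loads up front and the others load on first use, can be sketched as follows. The class, scene names, and resource lists are hypothetical; `_load_scene` stands in for the recursive, frame-sliced load of one scene.

```python
class MultiSceneFile:
    # scenes: mapping of scene name -> list of resource names (illustrative).
    def __init__(self, scenes: dict, default: str):
        self.scenes = scenes
        self.default = default
        self.loaded: dict = {}

    def _load_scene(self, name: str) -> None:
        # Stand-in for the recursive, frame-sliced load of one scene.
        self.loaded[name] = list(self.scenes[name])

    def open(self) -> None:
        # Only the default scene is loaded when the file is opened.
        self._load_scene(self.default)

    def require(self, name: str) -> list:
        # Other scenes are loaded on demand, the first time they are needed.
        if name not in self.loaded:
            self._load_scene(name)
        return self.loaded[name]

f = MultiSceneFile({"lobby": ["lobby.mesh"], "stage": ["stage.mesh"]}, default="lobby")
f.open()
```

After `open()`, only the default scene's resources occupy memory; `require("stage")` would trigger the deferred load.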
In an optional embodiment, the determining module is further configured to:
determining the file size of the file to be loaded;
determining the target file category according to the file size of the file to be loaded, so as to select a loading mode.
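The size-based variant can be sketched with a single threshold check: small files take the one-shot parallel path, large files the frame-sliced recursive path so that no frame blocks for too long. The 8 MiB cut-off is an assumed example, not a value stated in the application.

```python
SIZE_THRESHOLD = 8 * 1024 * 1024  # assumed 8 MiB cut-off (illustrative)

def category_by_size(file_size: int) -> str:
    # Small files can safely be loaded in one shot; large files are
    # frame-sliced so a single frame is never stalled on the whole file.
    return "parallel" if file_size <= SIZE_THRESHOLD else "recursive"
```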
Example three
Fig. 10 schematically shows a hardware architecture diagram of a computer device 2 suitable for implementing the 3D file loading method according to a third embodiment of the present application. In this embodiment, the computer device 2 is a device capable of automatically performing numerical calculation and/or information processing according to instructions set or stored in advance. For example, it may be a smartphone, tablet, laptop, virtual machine, etc. As shown in fig. 10, the computer device 2 includes at least, but is not limited to: a memory 10010, a processor 10020, and a network interface 10030, which may be communicatively linked to each other through a system bus. Wherein:
the memory 10010 includes at least one type of computer-readable storage medium, including flash memory, hard disks, multimedia cards, card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disks, optical disks, and the like. In some embodiments, the memory 10010 may be an internal storage module of the computer device 2, such as a hard disk or a memory of the computer device 2. In other embodiments, the memory 10010 may also be an external storage device of the computer device 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device 2. Of course, the memory 10010 may also include both internal and external storage modules of the computer device 2. In this embodiment, the memory 10010 is generally configured to store the operating system installed on the computer device 2 and various application software, such as the program code of the 3D file loading method. In addition, the memory 10010 can also be used to temporarily store various types of data that have been or are to be output.
Processor 10020, in some embodiments, can be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip. The processor 10020 is generally configured to control overall operations of the computer device 2, such as performing control and processing related to data interaction or communication with the computer device 2. In this embodiment, the processor 10020 is configured to execute program codes stored in the memory 10010 or process data.
The network interface 10030 may include a wireless network interface or a wired network interface, and is generally used to establish a communication link between the computer device 2 and other computer devices. For example, the network interface 10030 is used to connect the computer device 2 to an external terminal through a network, and to establish a data transmission channel and a communication link between the computer device 2 and the external terminal. The network may be a wireless or wired network such as an Intranet, the Internet, the Global System for Mobile Communications (GSM), Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth, or Wi-Fi.
It should be noted that fig. 10 only shows a computer device having components 10010-10030, but it should be understood that not all of the shown components are required to be implemented, and that more or fewer components may be implemented instead.
In this embodiment, the 3D file loading method stored in the memory 10010 can be further divided into one or more program modules, and executed by one or more processors (in this embodiment, the processor 10020) to complete the embodiment of the present application.
Example four
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the 3D file loading method in the embodiments.
In this embodiment, the computer-readable storage medium includes flash memory, hard disks, multimedia cards, card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disks, optical disks, and the like. In some embodiments, the computer-readable storage medium may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. In other embodiments, it may be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device. Of course, the computer-readable storage medium may also include both internal and external storage devices of the computer device. In this embodiment, it is generally used to store the operating system and various application software installed on the computer device, for example, the program code of the 3D file loading method in the embodiments. Further, it may also be used to temporarily store various types of data that have been or are to be output.
It will be apparent to those skilled in the art that the modules or steps of the embodiments described above may be implemented by a general-purpose computing device. They may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented as program code executable by a computing device, stored in a storage device, and executed by that device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
It should be noted that the above are only preferred embodiments of the present application, and not intended to limit the scope of the present application, and all equivalent structures or equivalent processes performed by the contents of the specification and the drawings, or applied directly or indirectly to other related technical fields, are all included in the scope of the present application.

Claims (10)

1. A 3D file loading method, characterized by comprising the following steps:
determining the target file type of a file to be loaded in the 3D file;
in response to the target file type being a first file type, loading the file to be loaded in a multithreaded parallel loading mode;
in response to the target file type being a second file type, loading the file to be loaded in a recursive loading mode.
2. The 3D file loading method according to claim 1, wherein the loading the file to be loaded in a multithreaded parallel loading mode in response to the target file type being the first file type comprises:
executing loading tasks in parallel through multiple threads, so as to load the multiple resources of the file to be loaded into memory at one time.
3. The 3D file loading method according to claim 1, wherein the loading the file to be loaded in the recursive loading mode in response to the target file type being the second file type comprises:
splitting the loading task frame by frame through asynchronous task processing, to obtain a subtask for each frame;
asynchronously and progressively loading portions of the resources in the file to be loaded according to the frame order and each frame's subtask.
4. The 3D file loading method according to any one of claims 1 to 3, wherein the determining the target file type of the file to be loaded in the 3D file comprises:
determining the resource type of the file to be loaded, wherein the resource type corresponds to the format of the file to be loaded; and
determining the target file type according to the resource type of the file to be loaded, so as to select a loading mode.
5. The 3D file loading method according to claim 4, wherein the resource categories include character model, scene, GLTF; the determining the target file type according to the resource type of the file to be loaded comprises:
in response to the resource type being a character model, determining that the target file type is the first file type; and
determining that the target file type is the second file type in response to the resource type being a scene or GLTF.
6. The 3D file loading method according to claim 5, further comprising:
determining a default scene among a plurality of scenes when the file to be loaded corresponds to the plurality of scenes;
loading resources required by the default scene in the recursive loading mode;
when another of the plurality of scenes is needed, loading the resources required by that scene on demand.
7. The 3D file loading method according to any one of claims 1 to 3, wherein the determining the target file type of the file to be loaded in the 3D file comprises:
determining the file size of the file to be loaded;
determining the target file type according to the file size of the file to be loaded, so as to select a loading mode.
8. A 3D file loading device, characterized in that the device comprises:
a determining module, used for determining the target file type of a file to be loaded in the 3D file;
a first response module, used for loading the file to be loaded in a multithreaded parallel loading mode in response to the target file type being the first file type; and
a second response module, used for loading the file to be loaded in a recursive loading mode in response to the target file type being the second file type.
9. A computer device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the 3D file loading method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored, the computer program being executable by at least one processor to cause the at least one processor to perform the steps of the 3D file loading method according to any one of claims 1 to 7.
CN202210813992.4A 2022-07-11 2022-07-11 3D file loading method and device Pending CN115167940A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210813992.4A CN115167940A (en) 2022-07-11 2022-07-11 3D file loading method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210813992.4A CN115167940A (en) 2022-07-11 2022-07-11 3D file loading method and device

Publications (1)

Publication Number Publication Date
CN115167940A true CN115167940A (en) 2022-10-11

Family

ID=83494107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210813992.4A Pending CN115167940A (en) 2022-07-11 2022-07-11 3D file loading method and device

Country Status (1)

Country Link
CN (1) CN115167940A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116501595A (en) * 2023-06-29 2023-07-28 北京大学 Performance analysis method, device, equipment and medium for Web augmented reality application
CN116501595B (en) * 2023-06-29 2023-09-12 北京大学 Performance analysis method, device, equipment and medium for Web augmented reality application

Similar Documents

Publication Publication Date Title
CA2795739C (en) File format for representing a scene
AU2004319589B2 (en) Integration of three dimensional scene hierarchy into two dimensional compositing system
US8566736B1 (en) Visualization of value resolution for multidimensional parameterized data
US20100060652A1 (en) Graphics rendering system
CN108959392B (en) Method, device and equipment for displaying rich text on 3D model
CN110675466A (en) Rendering system, rendering method, rendering device, electronic equipment and storage medium
CN107393013A (en) Virtual roaming file generated, display methods, device, medium, equipment and system
WO2022183519A1 (en) Three-dimensional graphics image player capable of real-time interaction
US20130127849A1 (en) Common Rendering Framework and Common Event Model for Video, 2D, and 3D Content
CN111179391A (en) Three-dimensional animation production method, system and storage medium
CN115167940A (en) 3D file loading method and device
US11625900B2 (en) Broker for instancing
CN116958344A (en) Animation generation method and device for virtual image, computer equipment and storage medium
CN115170707B (en) 3D image implementation system and method based on application program framework
CN115170708B (en) 3D image realization method and system
CN115205430A (en) 3D file importing and exporting method and device
US20240009560A1 (en) 3D Image Implementation
Rhalibi et al. Charisma: High-performance Web-based MPEG-compliant animation framework
US20240111496A1 (en) Method for running instance, computer device, and storage medium
Luis-Tello et al. Interaction According to Immersion in Virtual Environments: Graphic Development and PBRS in Environments with Real-Time Rendering and Virtual Reality
US9569875B1 (en) Ordered list management
Mendoza Guevarra et al. The Finale
Hogue et al. Volumetric kombat: a case study on developing a VR game with Volumetric Video
CN116774902A (en) Virtual camera configuration method, device, equipment and storage medium
CN116521625A (en) File processing method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination