CN112596713B - Unreal Engine-based processing method and apparatus, electronic device, and storage medium - Google Patents

Unreal Engine-based processing method and apparatus, electronic device, and storage medium

Info

Publication number
CN112596713B
CN112596713B (application CN202011607969.7A)
Authority
CN
China
Prior art keywords
component
model
dimensional scene
component model
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011607969.7A
Other languages
Chinese (zh)
Other versions
CN112596713A (en)
Inventor
田野
徐子安
李纯清
赵磊
翟铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xumi Yuntu Space Technology Co Ltd
Original Assignee
Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xumi Yuntu Space Technology Co Ltd filed Critical Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority to CN202011607969.7A
Publication of CN112596713A
Application granted
Publication of CN112596713B
Legal status: Active (current)
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/20 Software design
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

In the Unreal Engine-based processing method, apparatus, electronic device, and storage medium provided by the invention, starting from the import of an intermediate-format file of a design file, a three-dimensional scene is generated in the Unreal Engine; material upgrading, illumination addition, lighting building, and packaging are then performed automatically, and an executable file is finally obtained. Because the whole process runs automatically, a real-time engine can display content based on the automatically generated executable file. With this method and apparatus, a design file can be converted into an executable file for a real-time engine in a short time, the whole process requires no human intervention, and the efficiency of effect production is improved.

Description

Unreal Engine-based processing method and apparatus, electronic device, and storage medium
Technical Field
The present invention relates to the field of software technology, and in particular to an Unreal Engine-based processing method and apparatus, an electronic device, and a storage medium.
Background
After a design is completed in design software, developers need to preview it as a whole in order to find omissions and problems in the design. Because previewing the design in the design software suffers from problems such as inconvenient walkthrough (roaming) viewing and poor display quality, developers often produce the visual effect with the Unreal Engine and view it in a real-time engine once it has been produced. However, this approach requires manual work by the developers, has a long production cycle, cannot be standardized, and yields different final results from different producers.
Disclosure of Invention
In view of the above, the present invention provides an Unreal Engine-based processing method, apparatus, electronic device, and storage medium, with the following technical scheme:
An Unreal Engine-based processing method, applied to an Unreal Engine conversion server, the method comprising:
obtaining an intermediate format file of a target file, wherein the intermediate format file contains model information of a plurality of components;
generating a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, wherein the three-dimensional scene consists of component models corresponding to the plurality of components, and one component corresponds to one component model;
upgrading the material of each component model in the three-dimensional scene, and adding illumination for the three-dimensional scene;
and calling the Unreal Engine to perform lighting building and packaging operations to obtain a corresponding executable file, wherein the executable file is the basis for real-time engine display.
Preferably, obtaining the intermediate format file of the target file includes:
converting the target file through a conversion server to obtain the intermediate format file;
and receiving the intermediate format file output by the conversion server.
Preferably, the upgrading of the material of each component model in the three-dimensional scene includes:
invoking a pre-established material library, wherein the material library comprises physically rendered materials corresponding to different material types;
analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target material type of a material used by the component model;
and replacing the materials used by the component model by using the materials corresponding to the target material type in the material library.
Preferably, the intermediate format file further includes attribute information of the plurality of components, and adding illumination to the three-dimensional scene includes:
adding natural illumination and indoor illumination to the three-dimensional scene;
the indoor illumination adding process comprises the following steps:
analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model;
if the component type of the component model belongs to an indoor light source, acquiring element parameters corresponding to the component type;
and generating indoor illumination corresponding to the component model according to the position and the direction of the component model in the three-dimensional scene and the component parameters.
Preferably, the intermediate format file further includes attribute information of the plurality of components, and the method further includes:
analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model;
determining a first component model belonging to a door and a second component model belonging to a ceiling according to the component type of each component model in the three-dimensional scene;
determining two candidate indoor location points based on the position and direction of the first component model in the three-dimensional scene;
and casting a ray upward from each of the two candidate indoor position points, taking the candidate indoor position point whose ray intersects the second component model as the actual indoor position point, and using the actual indoor position point as the basis for determining the default viewing-angle position.
Preferably, the method further comprises:
and upgrading each component model in the three-dimensional scene.
Preferably, the upgrading of each component model in the three-dimensional scene includes:
invoking a pre-established model library, wherein the model library comprises real models corresponding to different model types;
analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target component type of the component model;
and replacing the component model by using a real model corresponding to the target component type in the model library.
An Unreal Engine-based processing apparatus, the apparatus comprising:
an acquisition module, used to obtain an intermediate format file of a target file, wherein the intermediate format file contains model information of a plurality of components;
a creation module, used to generate a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, wherein the three-dimensional scene is composed of component models corresponding to the plurality of components, and one component corresponds to one component model;
a processing module, used to upgrade the material of each component model in the three-dimensional scene and to add illumination to the three-dimensional scene;
and a calling module, used to call the Unreal Engine to perform lighting building and packaging operations to obtain a corresponding executable file, wherein the executable file is the basis for real-time engine display.
An electronic device, comprising: at least one memory and at least one processor; the memory stores a program, and the processor calls the program stored in the memory, where the program is used to implement any one of the above Unreal Engine-based processing methods.
A storage medium having computer-executable instructions stored therein, the computer-executable instructions being used to perform any one of the above Unreal Engine-based processing methods.
Compared with the prior art, the invention has the following beneficial effects:
according to the processing method, the processing device, the electronic equipment and the storage medium based on the illusion engine, which are provided by the invention, from the introduction of the intermediate format file of the file, the three-dimensional scene can be generated in the illusion engine, and then the material upgrading, the illumination adding, the illumination constructing and the packaging are automatically carried out, and finally the executable file is obtained, and the whole process is automatically executed, so that the real-time engine can display based on the executable file generated automatically. Based on the invention, the design can be converted into the executable file of the real-time engine in a short time, the whole process does not need human intervention, and the effect manufacturing efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of an Unreal Engine-based processing method according to an embodiment of the present invention;
FIG. 2 is a flowchart of another Unreal Engine-based processing method according to an embodiment of the present invention;
FIG. 3 is a flowchart of yet another Unreal Engine-based processing method according to an embodiment of the present invention;
FIG. 4 is a flowchart of still another Unreal Engine-based processing method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a scene provided by an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an Unreal Engine-based processing apparatus according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
While some embodiments of the present invention use the Unreal Engine to present a Revit model, embodiments of the present disclosure are applicable to other design software as well. The Unreal Engine is a complete authoring tool developed by Epic Games that can be used to build games, simulations, and visual content; it can intuitively display a three-dimensional model of a building, and the final display effect is excellent. The present invention is described in detail below.
An embodiment of the present invention provides an Unreal Engine-based processing method, applied to an Unreal Engine conversion server. A flowchart of the method is shown in FIG. 1, and the method comprises the following steps:
s10, obtaining an intermediate format file of the target Revit file, wherein the intermediate format file contains model information of a plurality of components.
In the embodiment of the present invention, the intermediate format file is a Datasmith file, which at least contains model information of the component, and the model information may contain information about both geometry (for describing the shape, size, etc. of the component) and texture. In addition, the intermediate format file also contains attribute information of the component, and the attribute information may contain non-descriptive information such as price, supplier, category, and the like.
Specifically, the user may directly upload the specified Revit file, that is, the target Revit file, to the Revit conversion server, and the Revit conversion server converts the target Revit file into the corresponding intermediate format file. In a specific implementation process, step S10 "obtain the intermediate format file of the target Revit file" may include the following steps:
converting the target Revit file by a Revit conversion server to obtain an intermediate format file; and receiving the intermediate format file output by the Revit conversion server.
In the embodiment of the invention, a user uploads a target Revit file to a Revit conversion server through a front-end page, and notifies the Revit conversion server to process through an HTTP request.
After reading and parsing the target Revit file (.rvt format), the Revit conversion server extracts the model and attributes of each component and generates the corresponding intermediate format file. The conversion program of the Revit conversion server can be provided by Epic Games. Of course, the conversion program may also be modified in advance, for example to supplement component attributes or to provide the component data as a service, which is not limited by the embodiment of the present invention.
In addition, the Revit conversion server also provides a file storage service: after converting the target Revit file into the intermediate format file, it saves the target Revit file and outputs the intermediate format file to the Unreal Engine conversion server within a specified time.
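As an illustration of this upload-and-convert hand-off, the following minimal Python sketch shows how a front end might submit the target Revit file and trigger the conversion over HTTP. The server address, endpoint paths, and response fields are hypothetical placeholders, not part of the patent; the patent only specifies that the file is uploaded and that processing is triggered by an HTTP request.

```python
import requests

REVIT_CONVERSION_SERVER = "http://revit-converter.example.com"  # hypothetical address

def submit_revit_file(rvt_path):
    """Upload the target Revit file, then ask the server to convert it to the
    Datasmith intermediate format (endpoints and fields are hypothetical)."""
    with open(rvt_path, "rb") as f:
        upload = requests.post(f"{REVIT_CONVERSION_SERVER}/upload", files={"file": f})
    upload.raise_for_status()
    file_id = upload.json()["file_id"]  # hypothetical response field

    # Notify the conversion server to start processing the uploaded file.
    convert = requests.post(
        f"{REVIT_CONVERSION_SERVER}/convert",  # hypothetical endpoint
        json={"file_id": file_id, "target": "datasmith"},
    )
    convert.raise_for_status()
    return file_id
```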
S20, generating a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, wherein the three-dimensional scene is composed of component models corresponding to the plurality of components, and one component corresponds to one component model.
In the embodiment of the invention, the Unreal Engine conversion server can import the intermediate format file into the Unreal Engine by calling the official Datasmith plugin, and construct the whole three-dimensional scene in the Unreal Engine.
Of course, the information required for constructing the three-dimensional scene comprises at least the model information of the plurality of components; if the intermediate format file also contains attribute information of the components, that attribute information can be attached to the corresponding component models for the user to view.
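The import step can be scripted in the Unreal Editor, for example with Python. The sketch below assumes the Datasmith Importer plugin is enabled and uses the Datasmith scripting API as exposed in recent Unreal Engine releases; method names and the destination content path may differ between engine versions, so treat this as an illustration rather than the patent's exact implementation.

```python
import unreal

def import_datasmith_scene(udatasmith_path, destination="/Game/ImportedScene"):
    # Parse the intermediate-format (.udatasmith) file into an in-memory scene.
    scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(udatasmith_path)
    if scene is None:
        raise RuntimeError(f"Could not parse {udatasmith_path}")

    # Import meshes, materials and actors into the project and the current level.
    result = scene.import_scene(destination)
    scene.destroy_scene()  # release the temporary scene object
    if not result.import_succeed:
        raise RuntimeError("Datasmith import failed")
    return result
```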
S30, upgrading the materials of the component models in the three-dimensional scene, and adding illumination for the three-dimensional scene.
Because the material system in Revit is relatively simple, after the intermediate format file is imported into the Unreal Engine the materials of the component models in the three-dimensional scene perform poorly, for example with unrealistic reflection and roughness. Therefore, the embodiment of the invention upgrades the material of the corresponding component model, replacing the simple material used by the component model with a material of the same type that has been produced in advance in the Unreal Engine.
In addition, when adding illumination to the three-dimensional scene, directional light and sky light can be added automatically based on rules; the directional light simulates sunlight, thereby reproducing real outdoor natural illumination. Moreover, the directional light and sky light can be added according to a set time of day, since sunlight and sky light differ at different times.
In a specific implementation, step S30 "upgrading the material of each component model in the three-dimensional scene" may include the following steps, with the method flowchart shown in FIG. 2:
s3011, calling a pre-established material library, wherein the material library comprises materials which correspond to different material types and are physically rendered.
In the embodiment of the invention, the Unreal Engine conversion server maintains a material library for the Unreal Engine in the background in advance; the materials of all material types in the library are produced with physically based rendering, so their effect is more realistic than that of the Revit materials.
In addition, the material type of each material in the material library can be matched with a Revit material type; for example, a "stainless steel" material from Revit can be replaced with the "stainless steel" material in the material library.
Furthermore, material types can be distinguished by material name, so that a more realistic material with the same name can be retrieved from the material library using the name of the material used by the component model.
S3012, analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target material type of a material used by the component model.
In the embodiment of the present invention, steps S3012 to S3013 are performed for each component model in the three-dimensional scene. For one component model, the target material name can be obtained by analyzing its model information; the material library is then searched for a material with that name, and if one exists, it is used to replace the material of the component model. Of course, if no material with the target name is retrieved from the material library, no subsequent material replacement is performed.
It should be noted that a component model may use multiple materials. In that case, for each material of the component model, the corresponding target material is searched for in the material library separately, and that material is replaced according to its own search result.
S3013, replacing the materials used by the component model by the materials corresponding to the target material types in the material library.
Based on this method, as long as a component model uses a material that exists in the material library, the Unreal Engine conversion server can upgrade it to a better-looking, physically based material. Through this material upgrade, simple materials are automatically upgraded into materials closer to the real effect.
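A name-based material upgrade of this kind could be scripted in the Unreal Editor roughly as follows. The library path and the rule "same name means same material type" are assumptions for illustration; the editor Python calls follow the standard scripting API, but exact availability varies by engine version.

```python
import unreal

MATERIAL_LIBRARY = "/Game/MaterialLibrary"  # assumed path to the pre-built PBR material library

def upgrade_materials():
    # Walk every static-mesh component that the Datasmith import created in the level.
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        for comp in actor.get_components_by_class(unreal.StaticMeshComponent):
            for slot in range(comp.get_num_materials()):
                current = comp.get_material(slot)
                if current is None:
                    continue
                # Material types are matched by name, e.g. the Revit "stainless_steel"
                # material is replaced by the library's physically based "stainless_steel".
                candidate = f"{MATERIAL_LIBRARY}/{current.get_name()}"
                if unreal.EditorAssetLibrary.does_asset_exist(candidate):
                    pbr_material = unreal.EditorAssetLibrary.load_asset(candidate)
                    comp.set_material(slot, pbr_material)
                # If no same-named material exists in the library, the original is kept,
                # matching the behaviour described in the text.
```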
In other embodiments, the Revit models may also be upgraded, replacing simple models with models closer to the intended presentation effect. The Unreal Engine-based processing method shown in FIG. 1 then further comprises the following step, which is preferably performed after the material upgrade and before the illumination is added:
and upgrading the component models in the three-dimensional scene.
Similar to the material upgrade, the component model is replaced with a model of the same type that was produced in advance in the Unreal Engine.
In the specific implementation process, the step of upgrading the component models in the three-dimensional scene can be as follows:
invoking a pre-established model library, wherein the model library comprises real models corresponding to different model types; analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target component type of the component model; the component model is replaced with a real model in the model library corresponding to the target component type.
In the embodiment of the invention, the Unreal Engine conversion server maintains a model library for the Unreal Engine in the background in advance; the effect of the real models in the model library is more realistic than that of the Revit models.
In addition, it should be noted that the model type of each real model in the model library can be matched with a Revit model type; for example, a "multi-seat sofa" model from Revit can be replaced with the "multi-seat sofa" model in the model library.
Furthermore, model types can be distinguished by model number, so that a model with the same number and a more realistic effect can be retrieved from the model library using the number of the component model.
It should be further noted that the above steps are performed for each component model in the three-dimensional scene. For one component model, the target number can be obtained by analyzing its model information; the model library is then searched for a real model corresponding to that number, and if one exists, it is used to replace the component model. Of course, if no real model with the target number is retrieved from the model library, no subsequent model replacement is performed.
For example, consider a component model of a multi-seat sofa numbered 01; the sofa then carries the number "01" in all of the different Revit documents. After the import into the Unreal Engine, the component model of the multi-seat sofa can be identified by this number and replaced with a more detailed real multi-seat sofa model from the model library.
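The number-based model upgrade can be sketched the same way. How the component number survives the Revit-to-Datasmith import is not specified here, so the helper below simply assumes it was written into the actor's tags; the library path and asset naming convention are likewise illustrative.

```python
import unreal

MODEL_LIBRARY = "/Game/ModelLibrary"  # assumed path to the library of higher-fidelity meshes

def get_component_number(actor):
    """Hypothetical helper: read the component number (e.g. "01") carried over from
    the Revit/Datasmith metadata; here it is assumed to be stored as a tag "number:01"."""
    for tag in actor.tags:
        text = str(tag)
        if text.startswith("number:"):
            return text.split(":", 1)[1]
    return None

def upgrade_models():
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        number = get_component_number(actor)
        if number is None:
            continue
        candidate = f"{MODEL_LIBRARY}/SM_{number}"  # assumed asset naming convention
        if not unreal.EditorAssetLibrary.does_asset_exist(candidate):
            continue  # no library model with this number: keep the imported one
        real_mesh = unreal.EditorAssetLibrary.load_asset(candidate)
        for comp in actor.get_components_by_class(unreal.StaticMeshComponent):
            comp.set_static_mesh(real_mesh)  # swap in the higher-fidelity mesh
```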
In addition to adding natural illumination to the three-dimensional scene, the embodiment of the invention can further add indoor illumination. In this case, the intermediate format file is required to further include attribute information of the plurality of components. The indoor illumination adding process comprises the following steps, with the method flowchart shown in FIG. 3:
s3021, analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model.
In the embodiment of the present invention, the steps S3021 to S3023 are executed for each component model in the three-dimensional scene.
For one component model, by analyzing the attribute information corresponding to the component, the component category can be obtained, such as dining table, sofa, downlight, or spotlight.
S3022, if the component type of the component model belongs to the indoor light source, acquiring element parameters corresponding to the component type.
In the embodiment of the invention, a number of component categories belonging to indoor light sources, such as downlight and spotlight, can be determined in advance. If the component category of the component model belongs to one of these predetermined categories, it can be concluded that the component model belongs to an indoor light source, and the corresponding element parameters, such as power, illuminance, and color temperature, are further acquired.
Furthermore, if the component type of the component model does not belong to an indoor light source, no subsequent steps are performed.
S3023, generating indoor illumination corresponding to the component model according to the position and the direction of the component model in the three-dimensional scene and the component parameters.
In the embodiment of the invention, the position and direction of the component model in the whole three-dimensional scene are identified, and a light source with the corresponding illuminance and color temperature is generated at that position and in that direction, so that a light source is added at each corresponding lamp fixture to simulate indoor illumination.
It should be noted that if the component parameters include the illuminance, the light source can be generated directly from that actual value. If the component parameters do not include the illuminance but do include the power, the illuminance can be estimated from the power according to the component category of the component model, for example using an average luminous efficacy of 80 lm/W for an LED lamp.
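To illustrate steps S3021 to S3023, the sketch below spawns a point light at each lamp component's position and orientation, taking the intensity from the illuminance when it is available and otherwise estimating it from the power at 80 lm/W. The category names and the (actor, category, params) structure are assumptions; the Unreal Python calls are standard editor scripting, though property names can differ by engine version.

```python
import unreal

INDOOR_LIGHT_CATEGORIES = {"downlight", "spotlight"}  # assumed category names
LED_EFFICACY_LM_PER_W = 80.0                          # average value used in the text

def add_indoor_lights(lamp_actors):
    """lamp_actors: list of (actor, category, params) tuples prepared by parsing the
    component attribute information; this structure is an assumption for illustration."""
    for actor, category, params in lamp_actors:
        if category not in INDOOR_LIGHT_CATEGORIES:
            continue  # not an indoor light source: skip the subsequent steps
        # Prefer an explicit luminous value; otherwise estimate it from the power.
        lumens = params.get("illuminance")
        if lumens is None and "power" in params:
            lumens = params["power"] * LED_EFFICACY_LM_PER_W
        if lumens is None:
            continue
        # Spawn a point light at the lamp's position and orientation.
        light = unreal.EditorLevelLibrary.spawn_actor_from_class(
            unreal.PointLight,
            actor.get_actor_location(),
            actor.get_actor_rotation())
        comp = light.get_component_by_class(unreal.PointLightComponent)
        comp.set_editor_property("intensity_units", unreal.LightUnits.LUMENS)
        comp.set_intensity(lumens)
        if "color_temperature" in params:
            comp.set_editor_property("use_temperature", True)
            comp.set_editor_property("temperature", params["color_temperature"])
```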
S40, calling the Unreal Engine to perform lighting building and packaging operations to obtain the corresponding executable file, wherein the executable file is the basis for real-time engine display.
In the embodiment of the invention, lighting building and packaging are built-in functions of the Unreal Engine.
Whether natural or indoor illumination is added, effects such as inter-reflection and refraction between lights, and the result of the lights acting on the component models, are not yet computed; therefore the Unreal Engine is invoked automatically from the command line to build lighting, so that realistic lighting information is calculated. In addition, for display quality, the lights of the components use static lighting, and the lighting information is stored in the lightmap of each component model in the scene.
Similarly, for the packaging operation, the Unreal Engine can be invoked automatically from the command line to package the whole three-dimensional scene into an executable file, such as a file with an .exe extension in a Windows environment. So that the result can be viewed by other people, the packaged executable file is automatically compressed into a zip archive and uploaded to a server, and a download link for the zip is provided to the front end to support walkthrough (roaming) viewing.
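A command-line automation of the lighting build and packaging might look like the following sketch, which drives the editor's ResavePackages commandlet (with -BuildLighting) and the RunUAT BuildCookRun script. Paths, project and map names are placeholders; the editor binary is named UE4Editor-Cmd.exe in UE4 and UnrealEditor-Cmd.exe in UE5, and the exact flags vary between engine versions, so this is only an outline of the approach described above.

```python
import subprocess

ENGINE = r"C:\UE\Engine"                   # assumed engine install path
PROJECT = r"D:\Jobs\Scene\Scene.uproject"  # assumed generated project
MAP_NAME = "SceneMap"                      # assumed level name

# Lighting build via the ResavePackages commandlet (flags vary by engine version).
subprocess.run([
    rf"{ENGINE}\Binaries\Win64\UE4Editor-Cmd.exe", PROJECT,
    "-run=ResavePackages", "-BuildLighting", "-AllowCommandletRendering",
    f"-Map={MAP_NAME}",
], check=True)

# Packaging into a Windows executable via the BuildCookRun automation script.
subprocess.run([
    rf"{ENGINE}\Build\BatchFiles\RunUAT.bat", "BuildCookRun",
    f"-project={PROJECT}", "-noP4", "-platform=Win64",
    "-clientconfig=Shipping", "-cook", "-build", "-stage", "-pak",
    "-archive", r"-archivedirectory=D:\Jobs\Scene\Output",
], check=True)
```

The archived build can then be zipped and uploaded to the file server so the front end can offer the download link mentioned above.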
In other embodiments, other Unreal Engine settings may be applied before step S40 is performed, including fixing the exposure value and adjusting ambient occlusion, to ensure that the display effect remains consistent across different designs. Specifically, the Unreal Engine can adjust exposure dynamically, which suppresses highlights and lifts detail in dark areas; however, in order to reproduce the real lighting effect, the exposure value is fixed at a constant value, and parameters such as ambient occlusion are adjusted to improve the display effect.
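Fixing the exposure and adjusting ambient occlusion can likewise be scripted, for example by spawning an unbound post-process volume and overriding the relevant settings. The property names below follow the editor Python API's snake_case convention and the chosen values are only examples; they are not values prescribed by the patent.

```python
import unreal

def apply_fixed_exposure_and_ao():
    # Spawn an unbound post-process volume so the overrides affect the whole scene.
    volume = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.PostProcessVolume, unreal.Vector(0, 0, 0))
    volume.set_editor_property("unbound", True)

    settings = volume.settings
    # Lock auto exposure to a fixed (manual) value for consistent results.
    settings.override_auto_exposure_method = True
    settings.auto_exposure_method = unreal.AutoExposureMethod.AEM_MANUAL
    settings.override_auto_exposure_bias = True
    settings.auto_exposure_bias = 0.0            # example fixed exposure compensation
    # Example ambient-occlusion adjustment to improve the display effect.
    settings.override_ambient_occlusion_intensity = True
    settings.ambient_occlusion_intensity = 0.7
    volume.set_editor_property("settings", settings)
```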
In other embodiments, a default viewing angle may also be set; through identification of the entrance door and the related computation, the default view is guaranteed to be at the entryway position just inside the entrance door. In this case, the intermediate format file is required to further include attribute information of the plurality of components, and the embodiment of the present invention further includes the following steps, with the method flowchart shown in FIG. 4:
s501, analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model.
In the embodiment of the present invention, the specific implementation manner of step S501 may refer to the disclosure portion of step S3021, which is not described herein.
S502, determining a first component model belonging to the door and a second component model belonging to the ceiling according to the component type of each component model in the three-dimensional scene.
S503, determining two candidate indoor position points based on the position and the direction of the first component model in the three-dimensional scene.
In the embodiment of the invention, when the scene is entered in the Unreal Engine, the viewer has a default viewing angle, and the direction perpendicular to the entrance door can be determined from the position and direction of the first component model in the three-dimensional scene. Referring to the scene diagram shown in FIG. 5, along the two directions perpendicular to the door (indicated by the two arrows), two points are selected at a certain distance from the door (generally 1 meter) and at a height of 1.5 meters; these two points are the candidate indoor position points.
S504, casting a ray upward from each of the two candidate indoor position points, taking the candidate indoor position point whose ray intersects the second component model as the actual indoor position point, and using the actual indoor position point as the basis for determining the default viewing-angle position.
In the embodiment of the invention, a ray is cast upward from each of the two candidate indoor position points. For the point that is actually indoors, the ray intersects a second component model whose component category is ceiling; for the point that is outdoors, the ray cannot intersect the second component model. The actual indoor position point can therefore be determined by finding the candidate point whose ray intersects the second component model. Further, the default viewing-angle position may be selected within the spatial region where the actual indoor position point is located; of course, the actual indoor position point may also be used directly as the default viewing-angle position.
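The indoor/outdoor test can be approximated without a full mesh ray trace by checking whether a vertical ray from each candidate point passes under a ceiling actor's bounding box, as in the sketch below. Using bounding boxes instead of an exact ray-mesh intersection is a simplification for illustration; the candidate points and ceiling actors are assumed to have been prepared by the earlier steps (S501 to S503).

```python
import unreal

def pick_indoor_point(candidate_points, ceiling_actors):
    """candidate_points: two unreal.Vector positions derived from the door model.
    ceiling_actors: actors whose component category is 'ceiling'.
    Returns the candidate whose upward ray hits a ceiling, i.e. the indoor one."""
    for point in candidate_points:
        for ceiling in ceiling_actors:
            origin, extent = ceiling.get_actor_bounds(False)
            # A vertical ray from `point` hits this ceiling's bounding box if the
            # point lies under the box in X/Y and the box is above the point in Z.
            inside_xy = (abs(point.x - origin.x) <= extent.x and
                         abs(point.y - origin.y) <= extent.y)
            above = (origin.z - extent.z) >= point.z
            if inside_xy and above:
                return point   # actual indoor position point
    return None                # neither candidate lies under a ceiling
```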
In summary, the Unreal Engine-based processing method provided by the embodiment of the invention automatically performs material and model upgrading, illumination addition, other settings, default viewing-angle calculation, lighting building, and packaging, thereby converting a Revit design into an executable file for a real-time engine in a short time and supporting walkthrough viewing; the whole process requires no human intervention, and the efficiency of effect production is improved.
Based on the Unreal Engine-based processing method provided by the above embodiment, the embodiment of the present invention correspondingly provides an apparatus for executing the Unreal Engine-based processing method; a schematic structural diagram of the apparatus is shown in FIG. 6, and the apparatus includes:
an obtaining module 10, configured to obtain an intermediate format file of the target Revit file, where the intermediate format file includes model information of a plurality of components;
a creation module 20, configured to generate a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, where the three-dimensional scene is composed of component models corresponding to the plurality of components, and one component corresponds to one component model;
the processing module 30 is used for upgrading the materials of the component models in the three-dimensional scene and adding illumination for the three-dimensional scene;
and a calling module 40, configured to call the Unreal Engine to perform lighting building and packaging operations to obtain the corresponding executable file, where the executable file is the basis for real-time engine display.
Optionally, the acquiring module 10 is specifically configured to:
converting the target Revit file by a Revit conversion server to obtain an intermediate format file; and receiving the intermediate format file output by the Revit conversion server.
Optionally, the process of upgrading the material of each component model in the three-dimensional scene by the processing module 30 includes:
invoking a pre-established material library, wherein the material library comprises physically rendered materials corresponding to different material types; analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target material type of a material used by the component model; and replacing the materials used by the component model by using the materials corresponding to the target material type in the material library.
Optionally, the intermediate format file further includes attribute information of a plurality of components, and the processing module 30 adds illumination to the three-dimensional scene, including:
adding natural illumination and indoor illumination for the three-dimensional scene;
the indoor illumination adding process comprises the following steps:
analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model;
if the component type of the component model belongs to the indoor light source, acquiring element parameters corresponding to the component type;
and generating indoor illumination corresponding to the component model according to the position and the direction of the component model in the three-dimensional scene and the component parameters.
Optionally, the intermediate format file further includes attribute information of a plurality of components, and the processing module 30 is further configured to:
analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model; determining a first component model belonging to the door and a second component model belonging to the ceiling according to the component type of each component model in the three-dimensional scene; determining two candidate indoor position points based on the position and direction of the first component model in the three-dimensional scene; and casting a ray upward from each of the two candidate indoor position points, taking the candidate indoor position point whose ray intersects the second component model as the actual indoor position point, and using the actual indoor position point as the basis for determining the default viewing-angle position.
Optionally, the processing module 30 is further configured to:
and upgrading the component models in the three-dimensional scene.
Optionally, the process of upgrading each component model in the three-dimensional scene by the processing module 30 includes:
invoking a pre-established model library, wherein the model library comprises real models corresponding to different model types; analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target component type of the component model; the component model is replaced with a real model in the model library corresponding to the target component type.
The Unreal Engine-based processing apparatus provided by the embodiment of the invention automatically performs material and model upgrading, illumination addition, other settings, default viewing-angle calculation, lighting building, and packaging, thereby converting a Revit design into an executable file for a real-time engine in a short time and supporting walkthrough viewing; the whole process requires no human intervention, and the efficiency of effect production is improved.
Based on the Unreal Engine-based processing method provided in the above embodiment, the embodiment of the present invention further provides an electronic device, comprising: at least one memory and at least one processor; the memory stores a program, and the processor calls the program stored in the memory, where the program is used to implement any one of the above Unreal Engine-based processing methods.
Based on the Unreal Engine-based processing method provided in the above embodiment, the embodiment of the present invention further provides a storage medium having computer-executable instructions stored therein, the computer-executable instructions being used to perform any one of the above Unreal Engine-based processing methods.
The Unreal Engine-based processing method, apparatus, electronic device, and storage medium provided by the invention have been described in detail above, and specific examples have been used herein to explain the principle and implementation of the invention; the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for those skilled in the art, there will be variations in the specific implementation and application scope according to the ideas of the present invention; in view of the above, the content of this specification should not be construed as limiting the present invention.
It should be noted that, in this specification, the embodiments are described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and for identical or similar parts between the embodiments, reference may be made to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in that embodiment, its description is relatively brief, and for relevant details reference may be made to the description of the method section.
It is further noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. An Unreal Engine-based processing method, the method comprising:
obtaining an intermediate format file of a target file, wherein the intermediate format file contains model information of a plurality of components and attribute information of the plurality of components;
generating a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, wherein the three-dimensional scene consists of component models corresponding to the plurality of components, and one component corresponds to one component model;
upgrading the material of each component model in the three-dimensional scene, and adding illumination for the three-dimensional scene;
the upgrading processing of the material of each component model in the three-dimensional scene comprises the following steps: invoking a pre-established material library, wherein the material library comprises physically rendered materials corresponding to different material types; analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target material type of a material used by the component model; replacing the material used by the component model by using the material corresponding to the target material type in the material library;
wherein the adding illumination to the three-dimensional scene comprises: adding natural illumination and indoor illumination to the three-dimensional scene;
the indoor illumination adding process comprises the following steps: analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model; if the component type of the component model belongs to an indoor light source, acquiring element parameters corresponding to the component type; generating indoor illumination corresponding to the component model according to the position and the direction of the component model in the three-dimensional scene and the component parameters;
and calling the Unreal Engine to perform lighting building and packaging operations to obtain a corresponding executable file, wherein the executable file is the basis for real-time engine display.
2. The method of claim 1, wherein obtaining the intermediate format file of the target file comprises:
converting the target file through a conversion server to obtain the intermediate format file;
and receiving the intermediate format file output by the conversion server.
3. The method according to claim 1, wherein the method further comprises:
analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model;
determining a first component model belonging to a door and a second component model belonging to a ceiling according to the component type of each component model in the three-dimensional scene;
determining two candidate indoor location points based on the position and direction of the first component model in the three-dimensional scene;
and casting a ray upward from each of the two candidate indoor position points, taking the candidate indoor position point whose ray intersects the second component model as the actual indoor position point, and using the actual indoor position point as the basis for determining the default viewing-angle position.
4. The method according to claim 1, wherein the method further comprises:
and upgrading each component model in the three-dimensional scene.
5. The method of claim 4, wherein the upgrading of each component model in the three-dimensional scene comprises:
invoking a pre-established model library, wherein the model library comprises real models corresponding to different model types;
analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target component type of the component model;
and replacing the component model by using a real model corresponding to the target component type in the model library.
6. An Unreal Engine-based processing apparatus, the apparatus comprising:
an acquisition module, configured to acquire an intermediate format file of a target file, wherein the intermediate format file contains model information of a plurality of components and attribute information of the plurality of components;
a creation module, configured to generate a three-dimensional scene in the Unreal Engine based on the model information of the plurality of components, wherein the three-dimensional scene is composed of component models corresponding to the plurality of components, and one component corresponds to one component model;
the processing module is used for upgrading the materials of each component model in the three-dimensional scene and adding illumination for the three-dimensional scene;
the processing module performs upgrading processing on the material of each component model in the three-dimensional scene, and the processing module comprises the following steps: invoking a pre-established material library, wherein the material library comprises physically rendered materials corresponding to different material types; analyzing model information corresponding to each component model in the three-dimensional scene to obtain a target material type of a material used by the component model; replacing the material used by the component model by using the material corresponding to the target material type in the material library;
the processing module adds illumination to the three-dimensional scene, and the processing module comprises the following steps: adding natural illumination and indoor illumination to the three-dimensional scene;
the indoor illumination adding process comprises the following steps: analyzing attribute information corresponding to each component model in the three-dimensional scene to obtain the component category of the component model; if the component type of the component model belongs to an indoor light source, acquiring element parameters corresponding to the component type; generating indoor illumination corresponding to the component model according to the position and the direction of the component model in the three-dimensional scene and the component parameters;
and a calling module, configured to call the Unreal Engine to perform lighting building and packaging operations to obtain a corresponding executable file, wherein the executable file is the basis for real-time engine display.
7. An electronic device, comprising: at least one memory and at least one processor; the memory stores a program, and the processor calls the program stored in the memory, where the program is used to implement the Unreal Engine-based processing method according to any one of claims 1 to 5.
8. A storage medium having computer-executable instructions stored therein, the computer-executable instructions being used to perform the Unreal Engine-based processing method according to any one of claims 1 to 5.
CN202011607969.7A 2020-12-30 2020-12-30 Processing method and device based on illusion engine, electronic equipment and storage medium Active CN112596713B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011607969.7A CN112596713B (en) 2020-12-30 2020-12-30 Processing method and device based on illusion engine, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011607969.7A CN112596713B (en) 2020-12-30 2020-12-30 Processing method and device based on illusion engine, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112596713A CN112596713A (en) 2021-04-02
CN112596713B true CN112596713B (en) 2024-02-06

Family

ID=75206247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011607969.7A Active CN112596713B (en) 2020-12-30 2020-12-30 Processing method and device based on illusion engine, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112596713B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113398594A (en) * 2021-05-07 2021-09-17 深圳市灼华互娱科技有限公司 Character model light creating method, device, equipment and storage medium
CN113628328B (en) * 2021-08-12 2024-05-10 深圳须弥云图空间科技有限公司 Model rendering method and device for joint members
CN113835736A (en) * 2021-08-18 2021-12-24 华建数创(上海)科技有限公司 Digital-analog real-time linkage implementation mechanism for illusion engine
CN113538706B (en) * 2021-09-16 2021-12-31 深圳须弥云图空间科技有限公司 Digital sand table-based house scene display method, device, equipment and storage medium
CN113791821B (en) * 2021-09-18 2023-11-17 广州博冠信息科技有限公司 Animation processing method and device based on illusion engine, medium and electronic equipment
CN115115766B (en) * 2022-05-17 2023-03-24 清华大学 Multispectral scene data generation method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105005473A (en) * 2015-06-29 2015-10-28 乐道互动(天津)科技有限公司 Game engine system for developing 3D game
CN105741194A (en) * 2016-01-28 2016-07-06 赵云 Unreal engine technology-based home decoration system
CN109753276A (en) * 2018-12-29 2019-05-14 北京天际启游科技有限公司 A kind of control method and relevant apparatus based on illusory engine
US10665011B1 (en) * 2019-05-31 2020-05-26 Adobe Inc. Dynamically estimating lighting parameters for positions within augmented-reality scenes based on global and local features
CN111243068A (en) * 2019-12-09 2020-06-05 佛山欧神诺云商科技有限公司 Automatic rendering method and device for 3D model scene and storage medium
CN111359219A (en) * 2020-03-16 2020-07-03 网易(杭州)网络有限公司 File processing method, device, equipment and storage medium of illusion engine

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9230508B2 (en) * 2012-06-27 2016-01-05 Pixar Efficient feedback-based illumination and scatter culling

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105005473A (en) * 2015-06-29 2015-10-28 乐道互动(天津)科技有限公司 Game engine system for developing 3D game
CN105741194A (en) * 2016-01-28 2016-07-06 赵云 Unreal engine technology-based home decoration system
CN109753276A (en) * 2018-12-29 2019-05-14 北京天际启游科技有限公司 A kind of control method and relevant apparatus based on illusory engine
US10665011B1 (en) * 2019-05-31 2020-05-26 Adobe Inc. Dynamically estimating lighting parameters for positions within augmented-reality scenes based on global and local features
CN111243068A (en) * 2019-12-09 2020-06-05 佛山欧神诺云商科技有限公司 Automatic rendering method and device for 3D model scene and storage medium
CN111359219A (en) * 2020-03-16 2020-07-03 网易(杭州)网络有限公司 File processing method, device, equipment and storage medium of illusion engine

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of the Unreal game engine in fulldome animation production; Teng Xu; China Master's Theses Full-text Database, Information Science and Technology (Issue 5); pp. I138-177 *

Also Published As

Publication number Publication date
CN112596713A (en) 2021-04-02

Similar Documents

Publication Publication Date Title
CN112596713B (en) Processing method and device based on illusion engine, electronic equipment and storage medium
US9019269B1 (en) Interactive rendering of building information model data
US11640672B2 (en) Method and system for wireless ultra-low footprint body scanning
US10628666B2 (en) Cloud server body scan data system
US10628729B2 (en) System and method for body scanning and avatar creation
CN110363839B (en) Model rendering method, device, computer equipment and storage medium
CN110262865B (en) Method and device for constructing game scene, computer storage medium and electronic equipment
US20130339403A1 (en) Interoperability format translation and transformation between ifc architectural design file and simulation file formats
CN112182700A (en) BIM three-dimensional building model display method based on Web end
CN110433495B (en) Configuration method and device of virtual scene in game, storage medium and electronic equipment
CN112184880A (en) Building three-dimensional model processing method and device, computer equipment and storage medium
CN111539054A (en) Interior decoration design system based on AR virtual reality technology
CN107679141A (en) Data storage method, device, equipment and computer-readable recording medium
CN113538706A (en) Digital sand table-based house scene display method, device, equipment and storage medium
CN116310143A (en) Three-dimensional model construction method, device, equipment and storage medium
CN114693611A (en) Rendering quality evaluation method, device, computer equipment and medium
US9898873B2 (en) Methods and systems for processing 3D graphic objects at a content processor
CN112807695A (en) Game scene generation method and device, readable storage medium and electronic equipment
US9615009B1 (en) Dynamically adjusting a light source within a real world scene via a light map visualization manipulation
CN116980714A (en) Background image generation method and device, electronic equipment and storage medium
CN114365137A (en) Augmented reality method and system for design
CN114241116A (en) Method and device for previewing illumination effect in real time, electronic equipment and medium
Calabuig-Barbero et al. Computational model for hyper-realistic image generation using uniform shaders in 3D environments
CN114820968A (en) Three-dimensional visualization method and device, robot, electronic device and storage medium
CN114461959A (en) WEB side online display method and device of BIM data and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant