CN115049811B - Editing method, system and storage medium for digital twin virtual three-dimensional scene - Google Patents

Editing method, system and storage medium for digital twin virtual three-dimensional scene

Info

Publication number
CN115049811B
Authority
CN
China
Prior art keywords
dimensional
dimensional scene
scene
digital twin
editing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210701427.9A
Other languages
Chinese (zh)
Other versions
CN115049811A (en)
Inventor
邓潇
汪璞
刘磊
刘宏春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Digital Hail Technology Co ltd
Original Assignee
Beijing Digital Hail Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Digital Hail Technology Co ltd
Priority to CN202210701427.9A
Publication of CN115049811A
Application granted
Publication of CN115049811B
Legal status: Active (current)
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an editing method, system and storage medium for a digital twin virtual three-dimensional scene, wherein the method comprises the following steps: acquiring a file of a three-dimensional model; editing a three-dimensional scene; setting the material of the three-dimensional model; editing motion rules and motion parameters of the movable part of the three-dimensional model; configuring an animation effect of the movable part of the three-dimensional model; configuring geographic position coordinates of the three-dimensional scene; configuring view angle parameters and visual special effect parameters of the three-dimensional scene; adding ornaments to the three-dimensional scene; establishing parameter snapshot information of the three-dimensional scene; and saving and publishing the three-dimensional scene. The invention completes the entire workflow of constructing a digital twin scene from various kinds of raw data within the browser, realizing one-stop construction of digital twin scenes and improving the working efficiency of digital twin scene construction.

Description

Editing method, system and storage medium for digital twin virtual three-dimensional scene
Technical Field
The invention relates to the field of digital twin visualization, in particular to a method, a system and a storage medium for editing a digital twin virtual three-dimensional scene.
Background
In the field of digital twin visualization, a core technology is to construct a digital twin object corresponding to a physical entity and to visualize it, so as to realize the mapping of actions and attributes between the digital twin object and the physical entity in the real world. Constructing the digital twin scene is an important step in realizing digital twin visualization: the generated three-dimensional scene is combined with real-world geographic coordinates to reflect the actions and state changes of real-world objects, and it should offer rich visual effects while providing enough flexibility and convenience to meet the requirements of rapid creation and editing.
To meet these requirements of rapid creation and editing, the prior art adopts a complex production workflow: individual models are first built in three-dimensional or industrial design software suited to single-object modeling, the individual models are then assembled in three-dimensional modeling software suited to scene editing, and the materials of the scene are then adjusted. This process involves operating several pieces of professional software and a large amount of programming and development; at the same time, the result is not intuitive and requires repeated adjustment, so the working efficiency is low.
Disclosure of Invention
In view of the above, the present invention proposes a method, a system and a storage medium for editing a digital twin virtual three-dimensional scene, so as to overcome or at least partially solve the above problems.
According to an aspect of the present invention, there is provided an editing method of a digital twin virtual three-dimensional scene including:
step 1: acquiring a file of a three-dimensional model;
step 2: configuring the three-dimensional scene, and constructing a three-dimensional scene through the three-dimensional model file;
step 3: establishing parameter snapshot information in the three-dimensional scene, obtaining a three-dimensional scene parameter snapshot, and adding the parameter snapshot information into the three-dimensional scene;
step 4: and storing and releasing the three-dimensional scene.
Optionally, the step 2: the configuring of the three-dimensional scene specifically comprises:
step 2.1: setting the material of the three-dimensional model;
step 2.2: editing motion rules and motion parameters of the movable part of the three-dimensional model;
step 2.3: configuring an animation effect of the movable part of the three-dimensional model;
step 2.4: configuring geographic position coordinates of the three-dimensional scene;
step 2.5: configuring view angle parameters and visual special effect parameters of the three-dimensional scene;
step 2.6: adding ornaments of the three-dimensional scene.
Optionally, the acquiring of the file of the three-dimensional model specifically includes:
using a built-in three-dimensional model from the system's built-in asset library;
and exporting model files from three-dimensional modeling software such as 3DMAX or MAYA for loading into the three-dimensional scene.
Optionally, the setting of the material of the three-dimensional model specifically includes:
performing parameter editing on the materials of the model nodes in the three-dimensional model, wherein the parameters available for modification comprise a texture map, metalness, single/double-sidedness, highlight attributes, a reflection map, a bump map, transparency, self-illumination, a lightmap, a material animation and texture coordinates;
replacing the map of the material, and replacing the whole material with a dynamic rendering material realized by shader programs of the graphics card;
and adjusting performance attributes of the materials, designating whether reflection is calculated, the spatial range of the reflection calculation, and whether occlusion is calculated, and balancing the rendering effect and the rendering performance.
Optionally, the editing the motion rules and motion parameters of the movable part of the three-dimensional model specifically includes:
the movable part of the three-dimensional model comprises a plurality of model nodes, and the model nodes are created as a joint object;
the joint object comprises corresponding control variables and is used for controlling the movement of the joint object, and the types of the control variables comprise numerical type, boolean type and enumeration type;
establishing a motion mapping relation between the model node and the control variable, wherein the motion mapping comprises translation, rotation, scaling, color change and transparency change;
the motion map includes concurrent motion of at least one of the joint objects.
Optionally, the configuring the animation effect of the movable part of the three-dimensional model specifically includes:
establishing an animation container for accommodating a plurality of joints for being driven in an animation, wherein the animation container comprises an animation time line, a circulation attribute and a playing speed;
adding a moving joint object in the animation container, and setting time parameters and motion parameters of the joint object;
and adding playing elements of the animation into the animation time line for playing control according to the animation time line, wherein the playing elements comprise starting time and playing duration.
Optionally, the configuring the geographic position coordinates of the three-dimensional scene specifically includes:
acquiring the three-dimensional model as a display object, wherein the display object is a target to be registered;
the method comprises the steps of obtaining a reference object of a scene coordinate system as a registration reference object, wherein the scene coordinate system is a coordinate system universal to the field of digital twin visualization industry, and the registration reference object comprises a two-dimensional map based on a longitude and latitude coordinate system, an engineering drawing based on a plane orthogonal rectangular coordinate system or a two-dimensional map with a longitude and latitude coordinate system from an internet map service provider;
constructing a registration operation picture, superposing the display object on the registration reference object, and adjusting the position relationship between the display object and the registration reference object by a user through parameters to realize spatial position matching and obtain the spatial position corresponding relationship between the display object and the registration reference object;
the spatial position correspondence relationship includes longitude, latitude, altitude, rotation angle and scaling factor of the origin of the display object coordinate system.
Optionally, the configuring the three-dimensional scene visual angle parameter and the visual special effect parameter specifically includes:
setting a camera visual angle parameter, a main light source, an auxiliary light source, a global special effect parameter, a global rendering parameter, a decorative effect animation and a background effect parameter which are overlapped in the three-dimensional scene.
Optionally, the adding the ornament of the three-dimensional scene specifically includes:
placing vegetation in the three-dimensional scene;
placing a custom model ornament in the three-dimensional scene;
and placing icons in the scene.
Optionally, the establishing parameter snapshot information in the three-dimensional scene specifically includes:
generating a snapshot of all configuration parameters of the generated digital twin scene;
the snapshot is stored in a system background in the form of a configuration information file and has a unique snapshot name;
and taking the snapshot according to the snapshot name, and restoring the scene to the state of the snapshot time.
Optionally, the saving and publishing of the three-dimensional scene specifically includes:
storing the three-dimensional scene parameter snapshot;
and the three-dimensional scene parameter snapshot is stored in a file and/or on a server and published as an online service, and a JS interface is provided to directly operate and control the parameter snapshot and the elements of the three-dimensional scene.
The invention also provides an editing system of the digital twin virtual three-dimensional scene, which comprises:
the three-dimensional model acquisition module is used for acquiring a file of the three-dimensional model;
the three-dimensional scene configuration module is used for constructing a complete three-dimensional scene through the three-dimensional model file;
the parameter snapshot information establishing module is used for establishing parameter snapshot information in the three-dimensional scene and obtaining a three-dimensional scene parameter snapshot;
and the three-dimensional scene issuing module is used for storing and issuing the three-dimensional scene.
Optionally, the three-dimensional scene configuration module specifically includes:
a material setting unit for setting the material of the three-dimensional model;
a motion parameter editing unit for editing motion rules and motion parameters of the movable part of the three-dimensional model;
an animation effect configuration unit for configuring an animation effect of the movable part of the three-dimensional model;
a position coordinate configuration unit for configuring geographic position coordinates of the three-dimensional scene;
a special effect parameter configuration unit for configuring the view angle parameters and the visual special effect parameters of the three-dimensional scene;
and an ornament adding unit for adding ornaments to the three-dimensional scene.
the invention also provides a computer readable storage medium, wherein the computer readable storage medium stores computer executable instructions, and the computer executable instructions can cause at least one processor to execute the method for editing the digital twin virtual three-dimensional scene.
The invention provides a method, a system and a storage medium for editing a digital twin virtual three-dimensional scene, wherein the method comprises the following steps: acquiring a file of a three-dimensional model; setting the material of the three-dimensional model; editing motion rules and motion parameters of the movable part of the three-dimensional model; configuring an animation effect of the movable part of the three-dimensional model; configuring geographic position coordinates of the three-dimensional scene; configuring view angle parameters and visual special effect parameters of the three-dimensional scene; adding ornaments of the three-dimensional scene; establishing parameter snapshot information of the three-dimensional scene; and saving and releasing the three-dimensional scene. The browser is utilized to complete a series of processes for constructing the digital twin scene through various original data, so that one-stop construction of the digital twin scene is realized, and the working efficiency of the construction of the digital twin scene is improved.
The foregoing description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented in accordance with the content of the description, and in order to make the above and other objects, features and advantages of the present invention more readily apparent, the detailed description of the invention is set forth below together with the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an editing method of a digital twin virtual three-dimensional scene provided by an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terms "comprising" and "having", and any variations thereof, in the description, claims and drawings of the embodiments of the invention are intended to cover a non-exclusive inclusion, for example a series of steps or elements.
The technical scheme of the invention is further described in detail below with reference to the accompanying drawings and the examples.
As shown in fig. 1, an editing method of a digital twin virtual three-dimensional scene includes:
step 100: acquiring a file of a three-dimensional model;
the specific mode of acquiring the three-dimensional model file comprises the steps of using an existing model in a built-in asset library of the system, deriving the three-dimensional model file from 3DMAX and MAYA conventional three-dimensional modeling software, and loading the three-dimensional model file into the three-dimensional scene. The three-dimensional model will form part of the scene.
The model file should include three-dimensional elements constituting a three-dimensional scene, and the used material should be a basic material, and the format is a format directly analyzed by the WEB browser through JavaScript. In the three-dimensional model, each part of the model can be grouped and named in three-dimensional modeling software according to the structural relation of the three-dimensional object, and the three-dimensional model is used as a model node to facilitate setting operation on each part in a subsequent process. The structural relation of the three-dimensional object is a logical relation tree formed by organizing all the components forming the three-dimensional object in a tree structure in three-dimensional modeling software. Sub-branches of the attribute structure are grouped and named. In order to facilitate loading of the model in the browser, the format of the model file is GLB format.
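As an illustration only, the following minimal sketch shows how such a GLB model file could be loaded in the browser and its named model nodes enumerated. It assumes the open-source Three.js library, which the patent does not name, and the file path is hypothetical.

    // Minimal sketch: load a GLB model file in the browser and list its named model nodes.
    // Assumption: Three.js is used as the WebGL library; the patent does not specify one.
    import * as THREE from 'three';
    import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

    const scene = new THREE.Scene();
    const loader = new GLTFLoader();

    loader.load('models/pump_station.glb', (gltf) => {      // hypothetical file exported from 3DMAX/MAYA
      scene.add(gltf.scene);
      // Each named group created in the modeling software becomes a model node that later
      // steps (material setting, joints, animation) can address by name.
      gltf.scene.traverse((node) => {
        if (node.name) console.log('model node:', node.name);
      });
    });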
Step 200: configuring the three-dimensional scene;
Using the file of the three-dimensional model, a three-dimensional scene is constructed through editing operations. The configuration comprises six sub-steps in total, as follows:
step 210: setting the material of the three-dimensional model;
parameters that can be used for modification include texture mapping, metaliness, single-sided, high-gloss properties, reflection mapping, relief mapping, transparency, self-luminescence, illumination mapping, texture animation, and mapping coordinates.
And replacing the mapping of the material, and integrally replacing the material with a dynamic rendering material which is realized by depending on a loader function of a display card, wherein the dynamic rendering material comprises a water surface material and a glass material.
And adjusting performance attributes of the materials, such as whether to calculate reflection, whether to calculate the spatial range of reflection calculation and whether to calculate shielding, so that the three-dimensional scene balances the rendering effect and the rendering performance.
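For illustration, the sketch below shows how such node-level material parameters might be edited programmatically. It assumes the Three.js material model used in the earlier sketch; the node name and parameter set are hypothetical.

    // Sketch of step 210: edit material parameters of a named model node.
    // Assumption: Three.js MeshStandardMaterial-style materials; names and values are illustrative.
    import * as THREE from 'three';

    function editNodeMaterial(root, nodeName, params) {
      const node = root.getObjectByName(nodeName);
      if (!node || !node.material) return;
      const m = node.material;
      if (params.map !== undefined) m.map = params.map;                   // texture map
      if (params.metalness !== undefined) m.metalness = params.metalness; // metalness
      if (params.doubleSided) m.side = THREE.DoubleSide;                  // single/double-sidedness
      if (params.opacity !== undefined) { m.transparent = true; m.opacity = params.opacity; }
      if (params.emissive !== undefined) m.emissive.set(params.emissive); // self-illumination
      m.needsUpdate = true;
    }

    // Example: make a roof node semi-transparent and slightly metallic (names are hypothetical).
    // editNodeMaterial(gltf.scene, 'roof_panel', { metalness: 0.6, opacity: 0.5 });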
Step 220: editing motion rules and motion parameters of the movable part of the three-dimensional model;
A group of model nodes forms a movable part of the three-dimensional model, and the group of model nodes forming a movable part is created as a joint object.
The movement of the joint object is controlled by the control variable associated with it. The types of control variables include numerical, Boolean and enumeration types.
A joint object contains its model nodes and a control variable, and a motion mapping relationship is established between the model nodes and the control variable. The motion mapping methods include translation, rotation, scaling, color change and transparency change.
Each model node inside a joint object can independently establish a motion mapping relationship with the control variable, and the motion mapping relationship can contain multiple groups of concurrent motions at the same time.
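For illustration, the sketch below shows one possible shape of such a joint object and how its control variable could drive several concurrent motion mappings. The structure, node names and value ranges are hypothetical, and the node lookup assumes the Three.js scene graph from the earlier sketches; the patent does not prescribe this representation.

    // Illustrative joint object: a valve whose wheel rotates while its stem translates,
    // both driven by one numeric control variable in the range 0..1.
    const valveJoint = {
      nodes: ['valve_wheel', 'valve_stem'],                                    // model nodes forming the movable part
      controlVariable: { name: 'opening', type: 'number', min: 0, max: 1 },    // numeric / Boolean / enum
      mappings: [
        // concurrent motions driven by the same control variable
        { node: 'valve_wheel', motion: 'rotation',    axis: 'y', from: 0, to: Math.PI * 4 },
        { node: 'valve_stem',  motion: 'translation', axis: 'y', from: 0, to: 0.05 },
      ],
    };

    // Apply the mapping: interpolate each node's transform from the control value (0..1).
    function applyJoint(root, joint, value) {
      for (const m of joint.mappings) {
        const node = root.getObjectByName(m.node);            // Three.js Object3D lookup
        const v = m.from + (m.to - m.from) * value;
        if (m.motion === 'rotation') node.rotation[m.axis] = v;
        if (m.motion === 'translation') node.position[m.axis] = v;
      }
    }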
Step 230: configuring the animation effect of the movable part of the three-dimensional model, which specifically comprises the following steps:
Based on the joint objects defined in step 220, the control variables of the joint objects are driven automatically according to a set of rules, so that the joint objects move automatically and the animation of the movable part is realized. This specifically includes:
An animation container is established to accommodate each joint object that needs to be driven in the animation. The animation container comprises an animation timeline, a loop attribute and a playback speed.
Each joint object to be moved is added to the animation container, and the time parameters and motion parameters of the joint object are set, such as the start value and end value of the control variable, the animation start time and the playing duration. The animation start time and playing duration are added as time elements to the timeline of the animation container, which facilitates subsequent playback control according to the timeline.
After the animation has been configured it can be played back: the control variables of all joint objects are driven in sequence according to the timeline configured in the animation container, producing the motion of the joint objects. Playing the animation and observing the motion helps to judge whether the animation configuration is correct.
During playback the configured joints perform the actual motion according to the configured parameters and motion modes, and the operator observes the motion to judge whether the configuration is correct and reasonable. If it is, the subsequent steps proceed; if not, the configuration is modified until it is correct.
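Continuing the sketch above, an animation container could record, for each joint object, when and how its control variable changes along a shared timeline. The field names and timings are illustrative assumptions, and the code reuses applyJoint and valveJoint from the previous sketch.

    // Illustrative animation container: a timeline of control-variable changes for joint objects.
    const animation = {
      loop: true,                  // loop attribute
      playbackSpeed: 1.0,          // playback speed
      timeline: [
        { joint: valveJoint, startTime: 0.0, duration: 2.0, fromValue: 0, toValue: 1 },   // open
        { joint: valveJoint, startTime: 3.0, duration: 2.0, fromValue: 1, toValue: 0 },   // close
      ],
    };

    // Drive all timeline entries that are active at the given elapsed time (in seconds).
    function playAnimation(root, anim, elapsedSeconds) {
      const t = elapsedSeconds * anim.playbackSpeed;
      for (const item of anim.timeline) {
        const progress = (t - item.startTime) / item.duration;
        if (progress < 0 || progress > 1) continue;           // entry not active at this moment
        const value = item.fromValue + (item.toValue - item.fromValue) * progress;
        applyJoint(root, item.joint, value);                  // reuses the joint sketch above
      }
    }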
Step 240: configuring geographic position coordinates of the three-dimensional scene;
The three-dimensional model imported in step 100 is obtained as a display object, and the display object is the target to be registered.
A reference object with a coordinate system commonly used in the digital twin visualization industry is obtained as the registration reference object. The registration reference object includes a two-dimensional map based on a longitude-latitude coordinate system or an engineering drawing based on another planar orthogonal rectangular coordinate system, for example a two-dimensional map with a longitude-latitude coordinate system from an internet map service provider.
A registration operation view is constructed in which the display object is superimposed on the registration reference object, and the user adjusts the positional relationship between the display object and the registration reference object through parameters until their spatial positions match correctly and reasonably. Matching can be performed quickly with mouse dragging, translation, rotation and scaling, and the user can choose to operate only the display object, only the registration reference object, or both independently, so that the registration is completed more quickly.
The registration parameters of the display object can also be described by a set of values entered by the user.
Through the registration operation, the spatial position correspondence between the display object and the registration reference object is obtained. The spatial position correspondence comprises the longitude, latitude, altitude, rotation angle and scaling factor of the origin of the display object's coordinate system, which form the registration result.
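For illustration, the registration result described above could be recorded and applied as sketched below. The field names and numeric values are hypothetical, and the local tangent-plane conversion is a deliberate simplification; production systems typically use a full map projection.

    // Illustrative registration result: where the origin of the display object's coordinate
    // system sits on the earth, plus its rotation and scale relative to the reference map.
    const registration = {
      longitude: 116.397, latitude: 39.908, altitude: 45.0,   // example coordinates
      rotationDegrees: 30,                                     // rotation around the up axis
      scale: 1.0,
    };

    // Offset in metres of a longitude/latitude point relative to the registered origin,
    // using a simple equirectangular (local tangent-plane) approximation.
    function geoOffsetMetres(origin, lon, lat) {
      const R = 6378137;                                       // WGS84 equatorial radius in metres
      const toRad = Math.PI / 180;
      const east  = (lon - origin.longitude) * toRad * R * Math.cos(origin.latitude * toRad);
      const north = (lat - origin.latitude) * toRad * R;
      return { east, north };
    }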
Step 250: configuring view angle parameters and visual special effect parameters of the three-dimensional scene;
Based on the digital twin scene generated in the preceding steps, lights and special effects at the global scene level are added, which specifically includes:
Setting camera view parameters of the three-dimensional scene, including the view direction, distance and angle, and optionally enabling automatic rotation of the scene;
Setting a main light source and auxiliary light sources in the three-dimensional scene, where each light source has position, orientation, beam angle, intensity, color and shadow-projection parameters. The settings can be stored as preset templates for quick recall.
Setting global special effect parameters in the three-dimensional scene, where the special effects act on the whole picture and include a film grain effect, a depth-of-field blur effect, a vignette (corner blur) effect, a sharpening effect, a chromatic aberration effect, a glow effect and LUT color mapping.
Setting global rendering parameters in the three-dimensional scene, where the global rendering parameters act on the global rendering pipeline and control the working parameters of the rendering engine, including global shadow parameters, reflection parameters and anti-aliasing parameters.
Setting decorative effect animations superimposed on the scene, such as rain, snow, randomly floating particles and random flow lines.
Setting background effect parameters of the three-dimensional scene so that the scene and the background are harmonious and unified, for example by using an environment map or a solid color as the background.
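As an illustration of these global settings, the sketch below configures a camera, a main and an auxiliary light source, and a solid-colour background, again assuming Three.js; all values are examples rather than the patent's parameters.

    // Illustrative global configuration (assumption: Three.js; values are examples).
    import * as THREE from 'three';

    const scene = new THREE.Scene();
    scene.background = new THREE.Color(0x10131a);                // solid-colour background effect

    const camera = new THREE.PerspectiveCamera(45, 16 / 9, 0.1, 5000);
    camera.position.set(120, 80, 120);                           // viewing distance and angle
    camera.lookAt(0, 0, 0);                                      // viewing direction

    const mainLight = new THREE.DirectionalLight(0xffffff, 1.0); // main light source
    mainLight.position.set(100, 200, 100);
    mainLight.castShadow = true;                                 // shadow-projection parameter
    const fillLight = new THREE.AmbientLight(0x8899aa, 0.4);     // auxiliary (fill) light source
    scene.add(mainLight, fillLight);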
Step 260: adding ornaments to the three-dimensional scene;
Based on the digital twin scene generated in the preceding steps, ornaments are added to enrich the detail of the scene and improve its appearance. This specifically includes:
Placing vegetation in the three-dimensional scene. Specifically:
Vegetation consists of three-dimensional models of various trees, which are placed on the three-dimensional models in the scene through interactive operations to serve as ornaments.
Vegetation can be placed at single points by clicking on the screen, or in batches by painting on the surface of a model, as illustrated in the sketch following this step. In painting mode the placement density can be set, and a random spacing error is introduced so that the spacing between trees varies within a certain range, keeping the final result close to the growth pattern of natural trees.
Placing custom model ornaments in the scene. Specifically: a custom model ornament is a three-dimensional model imported from an external file and placed in the scene as decoration, for example a billboard, a signal light or a vehicle. The method for adding a custom model ornament to the scene is the same as for adding vegetation.
Placing icons in the scene: an icon is a two-dimensional marker which, when placed in the three-dimensional scene, is always displayed facing the camera.
To facilitate rapid addition and use of the ornaments, the ornaments are classified and stored in a built-in model library from which the user can quickly select them.
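For illustration of the batch vegetation painting described above, the sketch below generates tree positions inside a circular brush; the grid-plus-jitter strategy and all parameters are assumptions, not the patent's algorithm.

    // Illustrative brush placement: fill a circular brush with trees at a nominal density,
    // then add a random spacing error so the result looks closer to natural growth.
    function paintVegetation(center, radius, density, jitter) {
      const spacing = 1 / Math.sqrt(density);                  // nominal distance between trees
      const positions = [];
      for (let x = -radius; x <= radius; x += spacing) {
        for (let z = -radius; z <= radius; z += spacing) {
          if (x * x + z * z > radius * radius) continue;       // keep inside the painted brush
          positions.push({
            x: center.x + x + (Math.random() - 0.5) * spacing * jitter,  // random interval error
            z: center.z + z + (Math.random() - 0.5) * spacing * jitter,
          });
        }
      }
      return positions;                                        // one tree instance per position
    }

    // Example: paintVegetation({ x: 0, z: 0 }, 10, 0.5, 0.6) scatters trees in a 10 m brush.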
Step 300: establishing parameter snapshot information of the three-dimensional scene;
Based on the digital twin scene generated in the preceding steps, a snapshot of all configuration parameters is generated. The snapshot is stored in the system back end in the form of a configuration information file and has a unique snapshot name. The snapshot can be called up quickly by its name, rapidly restoring the scene to its state at the moment the snapshot was taken.
When switching between two snapshots, a visually smooth transition is provided; for example, the camera moves, rotates and zooms smoothly from the position and parameters of the first snapshot to those of the second snapshot.
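For illustration, a snapshot could be stored and restored as sketched below. The back-end endpoints are hypothetical; the patent only states that snapshots are stored in the system back end as configuration information files with unique names.

    // Illustrative snapshot save/restore (the '/api/snapshots' endpoint is hypothetical).
    function takeSnapshot(name, sceneConfig) {
      const snapshot = { name, takenAt: Date.now(), config: structuredClone(sceneConfig) };
      return fetch('/api/snapshots', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(snapshot),
      });
    }

    async function restoreSnapshot(name, applyConfig) {
      const res = await fetch(`/api/snapshots/${encodeURIComponent(name)}`);  // hypothetical endpoint
      const snapshot = await res.json();
      applyConfig(snapshot.config);                            // put the scene back to the snapshot state
    }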
Step 400: saving and publishing the three-dimensional scene.
Based on the digital twin scene generated in the preceding steps, the scene can be saved to a file for use by other systems.
The digital twin scene can also be stored on a server and published as an online service, and a JS interface is provided for directly operating and controlling the scene and the elements within it.
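How the published service's JS interface is consumed is not detailed in the patent; purely as an illustration, a consuming page might look like the sketch below, where the SDK module, URL and every method name are hypothetical.

    // Purely illustrative consumer of a published scene's JS interface (all names hypothetical).
    import { loadPublishedScene } from 'twin-scene-sdk';       // hypothetical SDK module

    const scene = await loadPublishedScene('https://example.com/scenes/pump-station');
    scene.applySnapshot('night-view');                         // switch to a saved parameter snapshot
    scene.setControlVariable('valve-1', 'opening', 0.75);      // drive a joint from live twin data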
The beneficial effects are that: the entire process of constructing a digital twin scene from various kinds of raw data is completed in the browser, realizing one-stop construction of digital twin scenes and greatly improving the working efficiency of digital twin scene construction. Technicians need no programming knowledge and no additional software, and can construct a digital twin scene from existing three-dimensional model data. The overall construction process is simpler, the resulting digital twin scene looks better, the constructed scene can be published as a service, and a JS interface is provided to directly control the objects in the scene.
The foregoing detailed description of the invention has been presented for purposes of illustration and description, and it should be understood that the invention is not limited to the particular embodiments disclosed, but is intended to cover all modifications, equivalents, alternatives, and improvements within the spirit and principles of the invention.

Claims (10)

1. A method for editing a digital twin virtual three-dimensional scene, the method comprising:
step 1: acquiring a file of a three-dimensional model;
step 2: configuring the three-dimensional scene, and constructing a three-dimensional scene through the three-dimensional model file;
step 2.1: setting the material of the three-dimensional model;
step 2.2: editing motion rules and motion parameters of the movable part of the three-dimensional model;
step 2.3: configuring an animation effect of the movable part of the three-dimensional model;
step 2.4: configuring geographic position coordinates of the three-dimensional scene;
acquiring the three-dimensional model as a display object, wherein the display object is a target to be registered;
the method comprises the steps of obtaining a reference object of a scene coordinate system as a registration reference object, wherein the scene coordinate system is a coordinate system universal to the field of digital twin visualization industry, and the registration reference object comprises a two-dimensional map based on a longitude and latitude coordinate system, an engineering drawing based on a plane orthogonal rectangular coordinate system or a two-dimensional map with a longitude and latitude coordinate system from an internet map service provider;
constructing a registration operation picture, superposing the display object on the registration reference object, and adjusting the position relationship between the display object and the registration reference object by a user through parameters to realize spatial position matching and obtain the spatial position corresponding relationship between the display object and the registration reference object;
the spatial position corresponding relation comprises longitude, latitude, altitude, rotation angle and scaling factor of the origin of the display object coordinate system;
step 2.5: configuring view angle parameters and visual special effect parameters of the three-dimensional scene;
step 2.6: adding ornaments of the three-dimensional scene;
step 3: establishing parameter snapshot information in the three-dimensional scene, obtaining a three-dimensional scene parameter snapshot, and adding the parameter snapshot information into the three-dimensional scene;
step 4: and storing and releasing the three-dimensional scene.
2. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the obtaining the file of the three-dimensional model specifically comprises:
using a built-in three-dimensional model in a built-in asset library of the system;
model files are derived from 3DMAX and MAYA three-dimensional modeling software for loading into the three-dimensional scene.
3. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the setting of the material of the three-dimensional model specifically comprises:
performing parameter editing on the materials of the model nodes in the three-dimensional model, wherein the parameters available for modification comprise a texture map, metalness, single/double-sidedness, highlight attributes, a reflection map, a bump map, transparency, self-illumination, a lightmap, a material animation and texture coordinates;
replacing the map of the material, and replacing the whole material with a dynamic rendering material realized by shader programs of the graphics card;
and adjusting performance attributes of the materials, designating whether reflection is calculated, the spatial range of the reflection calculation, and whether occlusion is calculated, and balancing the rendering effect and the rendering performance.
4. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the editing motion rules and motion parameters of the movable part of the three-dimensional model specifically comprises:
the movable part of the three-dimensional model comprises a plurality of model nodes, and the model nodes are created as a joint object;
the joint object comprises corresponding control variables and is used for controlling the movement of the joint object, and the types of the control variables comprise numerical type, boolean type and enumeration type;
establishing a motion mapping relation between the model node and the control variable, wherein the motion mapping comprises translation, rotation, scaling, color change and transparency change;
the motion map includes concurrent motion of at least one of the joint objects.
5. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein said configuring the animation effect of the movable part of the three-dimensional model comprises:
establishing an animation container for accommodating a plurality of joints for being driven in an animation, wherein the animation container comprises an animation time line, a circulation attribute and a playing speed;
adding a moving joint object in the animation container, and setting time parameters and motion parameters of the joint object;
and adding playing elements of the animation into the animation time line for playing control according to the animation time line, wherein the playing elements comprise starting time and playing duration.
6. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the configuring of the viewing angle parameters and visual effect parameters of the three-dimensional scene specifically comprises:
setting a camera visual angle parameter, a main light source, an auxiliary light source, a global special effect parameter, a global rendering parameter, a decorative effect animation and a background effect parameter which are overlapped in the three-dimensional scene.
7. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the adding the decoration of the three-dimensional scene specifically comprises:
placing vegetation in the three-dimensional scene;
placing a custom model ornament in the three-dimensional scene;
and placing icons in the three-dimensional scene.
8. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the step of creating parameter snapshot information in the three-dimensional scene specifically comprises:
generating a snapshot of all configuration parameters of the generated digital twin scene;
the snapshot is stored in a system background in the form of a configuration information file and has a unique snapshot name;
and taking the snapshot according to the snapshot name, and restoring the scene to the state of the snapshot time.
9. The method for editing a digital twin virtual three-dimensional scene according to claim 1, wherein the storing and publishing the three-dimensional scene specifically comprises:
storing the three-dimensional scene parameter snapshot;
and the three-dimensional scene parameter snapshot is stored in a file and/or a server, is distributed into an online service, and provides a JS interface to directly perform operation control on the parameter snapshot information and elements of the three-dimensional scene.
10. A computer readable storage medium having stored thereon computer executable instructions which when executed by at least one processor cause the at least one processor to perform the method of editing a digital twin virtual three dimensional scene as claimed in any of claims 1 to 9.
CN202210701427.9A 2022-06-20 2022-06-20 Editing method, system and storage medium for digital twin virtual three-dimensional scene Active CN115049811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210701427.9A CN115049811B (en) 2022-06-20 2022-06-20 Editing method, system and storage medium for digital twin virtual three-dimensional scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210701427.9A CN115049811B (en) 2022-06-20 2022-06-20 Editing method, system and storage medium for digital twin virtual three-dimensional scene

Publications (2)

Publication Number Publication Date
CN115049811A CN115049811A (en) 2022-09-13
CN115049811B (en) 2023-08-15

Family

ID=83163854

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210701427.9A Active CN115049811B (en) 2022-06-20 2022-06-20 Editing method, system and storage medium for digital twin virtual three-dimensional scene

Country Status (1)

Country Link
CN (1) CN115049811B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457224B (en) * 2022-09-16 2023-07-14 武汉大海信息系统科技有限公司 Three-dimensional geospatial digital twin architecture method and system
CN117095135B (en) * 2023-10-19 2024-01-02 云南三耳科技有限公司 Industrial three-dimensional scene modeling arrangement method and device capable of being edited online
CN117292079B (en) * 2023-11-27 2024-03-05 浙江城市数字技术有限公司 Multi-dimensional scene coordinate point position conversion and mapping method applied to digital twin
CN117475041B (en) * 2023-12-28 2024-03-29 湖南视觉伟业智能科技有限公司 Digital twin shore bridge simulation method based on RCMS
CN117893648A (en) * 2024-01-23 2024-04-16 北京当境科技有限责任公司 Method and system for setting up animation interaction based on three-dimensional scene
CN117876642B (en) * 2024-03-08 2024-06-11 杭州海康威视系统技术有限公司 Digital model construction method, computer program product and electronic equipment


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150178988A1 (en) * 2012-05-22 2015-06-25 Telefonica, S.A. Method and a system for generating a realistic 3d reconstruction model for an object or being

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101694615A (en) * 2009-09-30 2010-04-14 成都九门科技有限公司 Browser-based construction system of three-dimensional ultra-large scene
CN110738739A (en) * 2019-10-22 2020-01-31 同济大学 Construction system of robot-assembly-oriented digital twin system
CN111611702A (en) * 2020-05-15 2020-09-01 深圳星地孪生科技有限公司 Digital twin scene creation method, apparatus, device and storage medium
CN112669454A (en) * 2021-03-16 2021-04-16 浙江明度智控科技有限公司 Three-dimensional scene construction method, system, device and storage medium for digital factory
CN112927361A (en) * 2021-03-25 2021-06-08 武汉中创普华科技有限公司 Programmable three-dimensional simulation design system and method for industrial automation
CN113963100A (en) * 2021-10-25 2022-01-21 广东工业大学 Three-dimensional model rendering method and system for digital twin simulation scene
CN113987835A (en) * 2021-11-16 2022-01-28 上海柏楚电子科技股份有限公司 Method and device for constructing digital twin scene of welding scene and automatically generating welding track

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张海程 (Zhang Haicheng) et al. Design and Implementation of a Three-Dimensional Web Game Based on Unity3D [基于Unity3D的三维页游的设计与实现]. 微型机与应用 (Microcomputer & Its Applications). 2016, Vol. 35, pp. 49-51. *

Also Published As

Publication number Publication date
CN115049811A (en) 2022-09-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant