CN109934909B - Method and device for reconstructing 3D model, electronic equipment and storage medium - Google Patents

Method and device for reconstructing 3D model, electronic equipment and storage medium

Info

Publication number
CN109934909B
CN109934909B (application CN201910160385.0A)
Authority
CN
China
Prior art keywords
model
unit
restoring
queue
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910160385.0A
Other languages
Chinese (zh)
Other versions
CN109934909A (en)
Inventor
柯冠强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Miniwan Technology Co ltd
Original Assignee
Shenzhen Miniwan Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Miniwan Technology Co ltd filed Critical Shenzhen Miniwan Technology Co ltd
Priority to CN201910160385.0A priority Critical patent/CN109934909B/en
Publication of CN109934909A publication Critical patent/CN109934909A/en
Application granted granted Critical
Publication of CN109934909B publication Critical patent/CN109934909B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a method, an apparatus, an electronic device and a storage medium for reconstructing a 3D model, and relates to the field of 3D model processing. The method comprises the following steps: restoring a 3D model based on pre-acquired 3D model data, wherein the 3D model is composed of unit models that support independent control; in response to a selection instruction for a restored sub-model in the 3D model, placing each unit model contained in the sub-model in a queue; and in response to a transformation instruction for the sub-model, independently performing the corresponding transformation on each unit model in the queue, so as to reconstruct the 3D model. The method improves the flexibility of processing the 3D model.

Description

Method and device for reconstructing 3D model, electronic equipment and storage medium
Technical Field
The present invention relates to the field of 3D model processing, and in particular, to a method, an apparatus, an electronic device, and a storage medium for reconstructing a 3D model.
Background
With information processing technology now highly developed, the processing of 3D models is involved in many fields, for example the fields of interior design and archaeological restoration. In the prior art, when a 3D model is restored, the 3D model is restored directly as a whole according to the acquired 3D model data. A 3D model restored in this way is monolithic, offers little flexibility for further processing, and cannot meet a user's need to simulate the specific construction process of the corresponding scene or to make custom changes to the structure of the 3D model.
Disclosure of Invention
Based on the above, in order to solve the technical problem of low flexibility in processing 3D models in the related art, the invention provides a method, an apparatus, an electronic device and a storage medium for reconstructing a 3D model.
In a first aspect, a method of reconstructing a 3D model is provided, comprising:
restoring a 3D model based on pre-acquired 3D model data, wherein the 3D model consists of unit models supporting independent control;
in response to a selection instruction for a restored sub-model in the 3D model, placing each unit model contained in the sub-model in a queue;
and in response to a transformation instruction for the sub-model, independently performing the corresponding transformation on each unit model in the queue, so as to reconstruct the 3D model.
In an exemplary embodiment of the present disclosure, the 3D model data is stored in a binary-serialized form.
In an exemplary embodiment of the present disclosure, the 3D model data is stored in a form of XML file serialization.
In an exemplary embodiment of the present disclosure, the unit model supports independent control by:
a unified data structure is designed in advance for each unit model, wherein the data structure comprises a unique identification of the unit model, position information of the unit model relative to a restoration base point and a required unit time length for restoring the unit model.
In an exemplary embodiment of the present disclosure, restoring a 3D model based on pre-acquired 3D model data includes:
determining a total time length required for restoring the 3D model from preconfigured time-consuming parameters for restoring in response to an instruction for restoring the 3D model;
determining the unit time length required for restoring each unit model in the 3D model based on the total time length required for restoring the 3D model;
and based on the 3D model data, sequentially restoring each unit model in the 3D model according to the unit time length.
In an exemplary embodiment of the present disclosure, determining a unit duration required to restore each unit model in the 3D model based on a total duration required to restore the 3D model includes:
determining a total number of unit models in the 3D model;
dividing the total time length by the total number of the unit models to obtain the unit time length.
In an exemplary embodiment of the present disclosure, sequentially restoring each unit model in the 3D model by the unit duration based on the 3D model data includes:
determining a restoration base point of the 3D model;
and sequentially restoring each unit model in the 3D model according to the unit time length and the identification sequence of each unit model by taking the restoration base point as a reference based on the position information of each unit model relative to the restoration base point.
In an exemplary embodiment of the present disclosure, in response to a selected instruction for a restored submodel in the 3D model, placing in a queue each unit model contained by the submodel comprises:
and sequentially arranging the unit models contained in the sub-models into a queue according to the sequence from small to large of the corresponding unique identifiers.
In an exemplary embodiment of the disclosure, in response to a transformation instruction for the submodel, performing, independently, a corresponding transformation on each unit model in the queue to implement reconstruction of the 3D model, including:
determining the transformation instruction of the sub model as transformation operation to be performed on each unit model in the queue;
and carrying out the transformation operation on each unit model in the queue sequentially according to the front-back sequence in the queue.
According to a second aspect of the present disclosure, there is provided an apparatus for reconstructing a 3D model, comprising:
the restoration module is used for restoring the 3D model based on the pre-acquired 3D model data, wherein the 3D model consists of unit models supporting independent control;
a queue establishing module, configured to respond to a selected instruction for a restored sub-model in the 3D model, and place each unit model included in the sub-model in a queue;
and the transformation module is used for responding to the transformation instruction of the submodel, and independently carrying out corresponding transformation on each unit model in the queue so as to reconstruct the 3D model.
According to a third aspect of the present disclosure, there is provided an electronic device for reconstructing a 3D model, comprising:
a memory configured to store executable instructions; and
a processor configured to execute the executable instructions stored in the memory to perform the method described above.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium storing computer program instructions which, when executed by a computer, cause the computer to perform the method described above.
In the prior art, when a 3D model is restored, the whole 3D model is restored directly from the 3D model data. A 3D model restored in this way is monolithic and inseparable, and therefore cannot support independent control, such as translation or rotation, of the sub-models within it. Accordingly, embodiments of the present disclosure propose a method of reconstructing a 3D model. In this method, the 3D model is composed of unit models that each support independent control. Transformation of a sub-model of the 3D model is achieved through independent control of each of its unit models, thereby reconstructing the 3D model and improving the flexibility of processing the 3D model.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
Fig. 1 shows a flowchart of reconstructing a 3D model according to an example embodiment of the present disclosure.
Fig. 2 illustrates a detailed flow diagram for restoring a 3D model based on pre-acquired 3D model data according to an example embodiment of the present disclosure.
Fig. 3 shows a block diagram of an apparatus for reconstructing a 3D model according to an example embodiment of the present disclosure.
Fig. 4 illustrates an electronic device diagram for reconstructing a 3D model according to an example embodiment of the present disclosure.
Fig. 5 illustrates a computer-readable storage medium diagram for reconstructing a 3D model according to an example embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more example embodiments. In the following description, numerous specific details are provided to give a thorough understanding of example embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, steps, etc. In other instances, well-known structures, methods, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The embodiments of the present disclosure are carried out by a terminal capable of implementing the method, for example a server or a personal computer. A terminal implementing the method can support the transformation of any sub-model in the 3D model, thereby enabling flexible processing of the 3D model.
The process of the embodiments of the present disclosure is described below with reference to the accompanying drawings.
Fig. 1 shows a flowchart of a method of reconstructing a 3D model according to an example embodiment of the present disclosure, the method comprising:
step S100: restoring a 3D model based on pre-acquired 3D model data, wherein the 3D model consists of unit models supporting independent control;
step S110: placing each unit model contained in the sub-model in a queue in response to selected instructions for the restored sub-model in the 3D model;
step S120: and responding to the transformation instruction of the sub-model, and independently carrying out corresponding transformation on each unit model in the queue so as to reconstruct the 3D model.
In step S100, a 3D model is restored based on pre-acquired 3D model data, wherein the 3D model is composed of unit models each supporting independent control.
The 3D model data refers to data describing a specific structure of the 3D model in a numerical manner.
A unit model is the smallest unit of the 3D model that has a unified structure. The unit models are predefined by the constructor of the 3D model data according to their own requirements. For example, if the 3D model is a building and, by definition, the smallest units of the building are the bricks it is built from, then each brick model in the 3D model is a unit model of the 3D model.
In one embodiment, the 3D model data is stored in binary serialized form.
In this embodiment, the class data related to the 3D model data object instance, from the top-level superclass attribute values down to the subclass attribute values, is written out in advance as a binary byte stream and stored in the 3D model data file for the terminal to read.
This embodiment has the advantage that the terminal can read the 3D model data more quickly, since the step of parsing the 3D model data is omitted, so the 3D model is restored more efficiently.
In one embodiment, the 3D model data is stored in a serialized form of an XML file.
In this embodiment, the 3D model data is stored in advance as an XML file, which is serialized into a byte stream and stored in the 3D model data file for the terminal to read.
An advantage of this embodiment is that 3D model data serialized as an XML file can be shared across multiple system platforms (e.g., WINDOWS, ANDROID), improving the compatibility of the 3D model data.
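As a rough illustration of the two storage options above, the following Python sketch serializes a hypothetical list of unit-model records both as a binary byte stream and as an XML file. The record fields and the choice of pickle and ElementTree are assumptions made for illustration, not the serialization format prescribed by the patent.

```python
# Sketch only: serialize hypothetical 3D model data in both forms.
import pickle
import xml.etree.ElementTree as ET

model_data = [
    {"id": 1, "offset": (0, 0, 0), "unit_duration": 0.5},
    {"id": 2, "offset": (0, 10, 0), "unit_duration": 0.5},
]

# Binary serialization: a byte stream the terminal can read back directly,
# with no text-parsing step.
with open("model.bin", "wb") as f:
    pickle.dump(model_data, f)

# XML serialization: readable and shareable across platforms.
root = ET.Element("model")
for rec in model_data:
    unit = ET.SubElement(root, "unit", id=str(rec["id"]),
                         duration=str(rec["unit_duration"]))
    x, y, z = rec["offset"]
    ET.SubElement(unit, "offset", x=str(x), y=str(y), z=str(z))
ET.ElementTree(root).write("model.xml", encoding="utf-8")
```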
The structure of each unit model in the 3D model is described below.
In one embodiment, the unit model supports independent control by:
a unified data structure is designed in advance for each unit model, wherein the data structure comprises a unique identification of the unit model, position information of the unit model relative to a restoration base point and a required unit time length for restoring the unit model.
The restoration base point is a reference point in the three-dimensional coordinate system when restoring the 3D model.
In this embodiment, the 3D model data stores a data structure corresponding to each unit model. The data structure of each unit model includes, but is not limited to, a unique identification of the unit model (e.g., a serial-number ID), position information of the unit model relative to a restoration base point (e.g., the coordinates of each vertex of the unit model relative to the restoration base point), and the unit duration required to restore the unit model. The terminal can manipulate the data structure of each unit model to achieve independent control of that unit model (e.g., alter the position information to translate or rotate the unit model).
This embodiment has the advantage of supporting independent control of each unit model in the 3D model, so that the terminal can change the specific part of the 3D model indicated by a received instruction without affecting the other parts of the 3D model.
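As a concrete, purely illustrative sketch of such a per-unit data structure, one might write something like the following in Python; the class and field names are assumptions, not the patent's prescribed layout.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class UnitModel:
    """One independently controllable unit of the 3D model (illustrative sketch)."""
    uid: int                # unique identification, e.g. a serial-number ID
    vertices: List[Vec3]    # vertex coordinates relative to the restoration base point
    unit_duration: float    # time needed to restore this unit model, in seconds

    def translate(self, dx: float, dy: float, dz: float) -> None:
        """Independent control example: move only this unit model."""
        self.vertices = [(x + dx, y + dy, z + dz) for x, y, z in self.vertices]
```

Because each unit model carries its own identification, geometry and timing, operating on one instance leaves every other unit model untouched.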
The specific procedure for restoring the 3D model is described below.
In one embodiment, as shown in fig. 2, step S100 includes:
step S1001: determining a total time length required for restoring the 3D model from preconfigured time-consuming parameters for restoring in response to an instruction for restoring the 3D model;
step S1002: determining the unit time length required for restoring each unit model in the 3D model based on the total time length required for restoring the 3D model;
step S1003: and based on the 3D model data, sequentially restoring each unit model in the 3D model according to the unit time length.
In an embodiment, the instruction for restoring the 3D model may be triggered by clicking a restore instruction button in a preset interface of the terminal.
In one embodiment, the restoration time parameter may be configured by a user in a parameter configuration interface of the terminal. If no restoration time parameter configured through the parameter configuration interface is received, a default restoration time is taken as the total duration required for restoring the 3D model.
In an embodiment, after receiving an instruction for restoring the 3D model, the terminal determines the total duration required for restoring the 3D model from the configured restoration time parameter, and further determines the unit duration required for restoring each unit model in the 3D model, thereby restoring each unit model in the 3D model at that unit duration.
This has the advantage that, by restoring the 3D model unit model by unit model, the process of generating each structure of the 3D model can be displayed intuitively, meeting the user's need to simulate the construction process of the corresponding scene.
In one embodiment, determining the unit time length required to restore each unit model in the 3D model based on the total time length required to restore the 3D model includes:
determining a total number of unit models in the 3D model;
dividing the total time length by the total number of the unit models to obtain the unit time length.
In this embodiment, the total number of unit models in the 3D model is determined from the 3D model data, thereby determining the unit time length required to restore each unit model in the 3D model.
In an embodiment, based on the 3D model data, sequentially restoring each unit model in the 3D model with the unit duration includes:
determining a restoration base point of the 3D model;
and sequentially restoring each unit model in the 3D model according to the unit time length and the identification sequence of each unit model by taking the restoration base point as a reference based on the position information of each unit model relative to the restoration base point.
In this embodiment, the order in which the unit models are restored is predetermined in the 3D model data according to the identifications of the unit models, for example: restoration is performed sequentially in ascending order of unit model ID.
When the terminal restores the 3D model, it first determines a restoration base point. The restoration base point can be determined by receiving a restoration base point parameter configured by the user in a configuration interface of the terminal. For example, if the restoration base point entered by the user on the configuration interface is (0, 10, 10), the terminal takes the point (0, 10, 10) in the three-dimensional coordinate system as the restoration base point of the 3D model. If no restoration base point parameter configured by the user is received, a default restoration base point is used, for example the origin (0, 0, 0) of the three-dimensional coordinate system.
Taking the restoration base point as a reference, the terminal then restores each unit model in the 3D model in the determined order, based on the position information of each unit model relative to the restoration base point and the unit duration. For example, suppose the restoration base point is (0, 10, 10), the unit duration is 0.5 s, and the unit models with IDs 001 to 009 are restored sequentially in ascending order of ID. The data structure of the unit model with ID 001 defines: its ID is 001, the coordinates of its eight vertices relative to the restoration base point are (0, 0, 0), (0, 10, 0), (10, 10, 0), (10, 0, 0), (0, 0, 10), (0, 10, 10), (10, 10, 10) and (10, 0, 10), and its unit duration is 0.5 s. When the unit model with ID 001 is restored, it is therefore placed at the eight absolute vertices (0, 10, 10), (0, 20, 10), (10, 20, 10), (10, 10, 10), (0, 10, 20), (0, 20, 20), (10, 20, 20) and (10, 10, 20), taking 0.5 s.
This has the advantage that the construction process of the scene corresponding to the 3D model can be simulated accurately, at whatever restoration speed the user requires.
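Continuing the illustrative UnitModel sketch above, the paced, ID-ordered restoration described in this example could be sketched as follows; the sleep-based pacing and the function name are assumptions, not the patent's actual implementation.

```python
import time
from typing import Iterable, List

# UnitModel is the illustrative dataclass sketched earlier
# (fields: uid, vertices, unit_duration).

def restore_model(units: Iterable["UnitModel"], total_duration: float,
                  base_point=(0.0, 0.0, 0.0)) -> List[list]:
    """Restore unit models one by one in ascending ID order, pacing the
    whole process so that it takes roughly total_duration seconds."""
    ordered = sorted(units, key=lambda u: u.uid)    # identification order
    unit_duration = total_duration / len(ordered)   # total time / number of units
    restored = []
    for unit in ordered:
        # Absolute vertex = restoration base point + stored relative coordinate.
        absolute = [(base_point[0] + x, base_point[1] + y, base_point[2] + z)
                    for (x, y, z) in unit.vertices]
        restored.append(absolute)
        time.sleep(unit_duration)                   # one unit model per time slice
    return restored
```

With a base point of (0, 10, 10) and nine unit models restored over 4.5 s, each unit model would appear 0.5 s after the previous one, matching the worked example above.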
The reconstruction process for the 3D model is described below.
In step S110, in response to a selection instruction for a restored sub-model in the 3D model, each unit model contained in the sub-model is placed in a queue.
A sub-model is a part of the 3D model; it may be a single unit model of the 3D model or a combination of several unit models of the 3D model.
In an embodiment, the selection instruction for the sub-model may be entered through a console of the terminal, or the sub-model may be selected by circling it with the mouse on a selection interface.
In one embodiment, in response to selected instructions for a restored submodel in the 3D model, placing in a queue each unit model contained by the submodel comprises:
and sequentially arranging the unit models contained in the sub-models into a queue according to the sequence from small to large of the corresponding unique identifiers.
In this embodiment, after a selection instruction for a restored sub-model in the 3D model is received, each unit model of the selected sub-model is placed into a queue in ascending order of its identification. For example, if the unit models included in the selected sub-model are model 001, model 006, model 002 and model 007, the resulting queue, ordered by ascending identification, is: model 001, model 002, model 006, model 007.
An advantage of this embodiment is that, when a transformation instruction for the sub-model is subsequently received, the transformation of the sub-model is achieved by performing the corresponding transformation on each unit model in the queue in turn, without affecting the parts of the 3D model outside the sub-model.
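A minimal sketch of this queue-building step, reusing the illustrative UnitModel fields assumed earlier:

```python
from collections import deque

def build_queue(selected_units):
    """Arrange the unit models of the selected sub-model into a queue,
    ordered by ascending unique identification."""
    return deque(sorted(selected_units, key=lambda u: u.uid))

# E.g. selecting models 001, 006, 002 and 007 yields the queue 001, 002, 006, 007.
```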
In step S120, in response to the transformation instruction for the sub-model, the corresponding transformation is independently performed on each unit model in the queue, so as to reconstruct the 3D model.
In the embodiment of the disclosure, since the 3D model is composed of unit models supporting independent control, transformation of the sub-models can be achieved by performing corresponding transformation on each unit model in the sub-models.
In an embodiment, the transformation instructions for the submodel include, but are not limited to: replication, translation, rotation.
In an embodiment, in response to a transformation instruction for the sub-model, performing corresponding transformation on each unit model in the queue independently to implement reconstruction of the 3D model, including:
determining the transformation instruction of the sub model as transformation operation to be performed on each unit model in the queue;
and carrying out the transformation operation on each unit model in the queue sequentially according to the front-back sequence in the queue.
In this embodiment, after receiving the transformation instruction for the sub-model, the terminal determines from it the transformation operation to be performed on each unit model in the queue. For example, if the received transformation instruction for the sub-model is "rotate the sub-model 70 degrees counterclockwise about the positive X-axis", the transformation operation to be performed on each unit model in the queue is "rotate the unit model 70 degrees counterclockwise about the positive X-axis".
After the transformation operation for each unit model in the queue has been determined, the transformation operation is performed on each unit model in the queue in turn, from front to back, thereby transforming the sub-model, that is, reconstructing the 3D model. For example, suppose the 3D model is a bedroom model and the terminal receives a transformation instruction for a cup model in the bedroom model: "translate the sub-model 10 units along the positive X-axis". Each unit model in the queue corresponding to the cup model is then translated 10 units along the positive X-axis, fulfilling the transformation instruction for the cup model and thereby reconstructing the bedroom model.
This embodiment has the advantage that the reconstruction of the entire 3D model is very flexible, since the unit models can be controlled independently.
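The per-unit transformation step described above might be sketched as follows; the translate call and the queue type continue the hypothetical helpers above rather than reflecting the patent's actual code.

```python
def transform_submodel(queue, dx, dy, dz):
    """Apply the same transformation operation to every unit model in the queue,
    front to back, so that only the selected sub-model changes."""
    for unit in queue:                 # front-to-back order of the queue
        unit.translate(dx, dy, dz)     # independent control of a single unit model

# Example: translating a selected cup model 10 units along the positive X axis
# would be transform_submodel(cup_queue, 10, 0, 0); the rest of the 3D model is
# untouched because only the queued unit models are modified.
```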
According to an embodiment of the present disclosure, as shown in fig. 3, there is also provided an apparatus for reconstructing a 3D model, including:
a restoration module 210, configured to restore a 3D model based on pre-acquired 3D model data, where the 3D model is composed of unit models supporting independent control;
a queue establishment module 220, configured to place each unit model included in the sub-model into a queue in response to a selected instruction for the restored sub-model in the 3D model;
and the transformation module 230 is configured to respond to the transformation instruction for the sub-model, and independently perform corresponding transformation on each unit model in the queue, so as to reconstruct the 3D model.
In one embodiment, the 3D model data is stored in binary serialized form.
In one embodiment, the 3D model data is stored in a serialized form of an XML file.
In one embodiment, the unit model supports independent control by:
a unified data structure is designed in advance for each unit model, wherein the data structure comprises a unique identification of the unit model, position information of the unit model relative to a restoration base point and a required unit time length for restoring the unit model.
In an embodiment, restoring the 3D model based on pre-acquired 3D model data includes:
determining a total time length required for restoring the 3D model from preconfigured time-consuming parameters for restoring in response to an instruction for restoring the 3D model;
determining the unit time length required for restoring each unit model in the 3D model based on the total time length required for restoring the 3D model;
and based on the 3D model data, sequentially restoring each unit model in the 3D model according to the unit time length.
In one embodiment, determining the unit time length required to restore each unit model in the 3D model based on the total time length required to restore the 3D model includes:
determining a total number of unit models in the 3D model;
dividing the total time length by the total number of the unit models to obtain the unit time length.
In an embodiment, based on the 3D model data, sequentially restoring each unit model in the 3D model with the unit duration includes:
determining a restoration base point of the 3D model;
and sequentially restoring each unit model in the 3D model according to the unit time length and the identification sequence of each unit model by taking the restoration base point as a reference based on the position information of each unit model relative to the restoration base point.
In one embodiment, in response to selected instructions for a restored submodel in the 3D model, placing in a queue each unit model contained by the submodel comprises:
and sequentially arranging the unit models contained in the sub-models into a queue according to the sequence from small to large of the corresponding unique identifiers.
In an embodiment, in response to a transformation instruction for the sub-model, performing corresponding transformation on each unit model in the queue independently to implement reconstruction of the 3D model, including:
determining the transformation instruction of the sub model as transformation operation to be performed on each unit model in the queue;
and carrying out the transformation operation on each unit model in the queue sequentially according to the front-back sequence in the queue.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order or that all illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a mobile terminal, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, a method, or a program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 400 according to such an embodiment of the invention is described below with reference to fig. 4. The electronic device 400 shown in fig. 4 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 4, the electronic device 400 is embodied in the form of a general purpose computing device. The components of electronic device 400 may include, but are not limited to: the at least one processing unit 410, the at least one memory unit 420, and a bus 430 connecting the various system components, including the memory unit 420 and the processing unit 410.
Wherein the storage unit stores program code that is executable by the processing unit 410 such that the processing unit 410 performs steps according to various exemplary embodiments of the present invention described in the above-described "exemplary methods" section of the present specification. For example, the processing unit 410 may perform step S100 as shown in fig. 1: restoring a 3D model based on pre-acquired 3D model data, wherein the 3D model consists of unit models supporting independent control; step S110: placing each unit model contained in the sub-model in a queue in response to selected instructions for the restored sub-model in the 3D model; step S120: and responding to the transformation instruction of the sub-model, and independently carrying out corresponding transformation on each unit model in the queue so as to reconstruct the 3D model.
The storage unit 420 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 4201 and/or cache memory 4202, and may further include Read Only Memory (ROM) 4203.
The storage unit 420 may also include a program/utility 4204 having a set (at least one) of program modules 4205, such program modules 4205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 430 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 400 may also communicate with one or more external devices 500 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 400, and/or any device (e.g., router, modem, etc.) that enables the electronic device 400 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 450. Also, electronic device 400 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 460. As shown, the network adapter 460 communicates with other modules of the electronic device 400 over the bus 430. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 400, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this specification, when said program product is run on the terminal device.
Referring to fig. 5, a program product 600 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (8)

1. A method of reconstructing a 3D model, the method comprising:
determining a total time length required for restoring the 3D model from preconfigured time-consuming parameters for restoring in response to an instruction for restoring the 3D model; wherein the 3D model is composed of unit models supporting independent control;
determining the unit time length required for restoring each unit model in the 3D model based on the total time length required for restoring the 3D model;
determining a restoration base point of the 3D model based on the 3D model data, and sequentially restoring each unit model in the 3D model according to the unit time length and the identification sequence of each unit model by taking the restoration base point as a reference based on the position information of each unit model relative to the restoration base point; the 3D model data refers to data describing a specific structure of the 3D model in a numerical mode, and the unit model refers to the minimum unit of the 3D model with a unified structure; the 3D model data is stored in a binarization sequence form, class data related to a 3D model data object instance is output into a binary byte stream from superclass attribute value description to subclass attribute value description at the topmost layer, and the binary byte stream is stored in a 3D model data file; the 3D model data is pre-stored as an XML file and serialized into a byte stream to be stored in the 3D model data file;
placing each unit model contained in the sub-model in a queue in response to selected instructions for the restored sub-model in the 3D model;
and responding to the transformation instruction of the sub-model, and independently carrying out corresponding transformation on each unit model in the queue so as to reconstruct the 3D model.
2. The method of claim 1, wherein the unit model supports independent control by:
a unified data structure is designed in advance for each unit model, wherein the data structure comprises a unique identification of the unit model, position information of the unit model relative to a restoration base point and a required unit time length for restoring the unit model.
3. The method of claim 2, wherein the data structure is stored in a binary serialized form.
4. The method of claim 1, wherein said placing each unit model contained in a sub-model in a queue in response to selected instructions for a restored sub-model in the 3D model comprises:
and sequentially arranging the unit models contained in the sub-models into a queue according to the sequence from small to large of the corresponding unique identifiers.
5. The method according to claim 1, wherein said independently transforming each unit model in said queue in response to a transformation instruction for said sub-model to effect reconstruction of said 3D model comprises:
determining the transformation instruction of the sub model as transformation operation to be performed on each unit model in the queue;
and carrying out the transformation operation on each unit model in the queue sequentially according to the front-back sequence in the queue.
6. An apparatus for reconstructing a 3D model, comprising:
a restoration module, configured to respond to an instruction for restoring the 3D model, and determine a total time length required for restoring the 3D model from preconfigured restoration time-consuming parameters; wherein the 3D model is composed of unit models supporting independent control; determining the unit time length required for restoring each unit model in the 3D model based on the total time length required for restoring the 3D model; determining a restoration base point of the 3D model based on the 3D model data, and sequentially restoring each unit model in the 3D model according to the unit time length and the identification sequence of each unit model by taking the restoration base point as a reference based on the position information of each unit model relative to the restoration base point; the 3D model data refers to data describing a specific structure of the 3D model in a numerical mode, and the unit model refers to the minimum unit of the 3D model with a unified structure; the 3D model data is stored in a binarization sequence form, class data related to a 3D model data object instance is output into a binary byte stream from superclass attribute value description to subclass attribute value description at the topmost layer, and the binary byte stream is stored in a 3D model data file; the 3D model data is pre-stored as an XML file and serialized into a byte stream to be stored in the 3D model data file;
a queue establishing module, configured to respond to a selected instruction for a restored sub-model in the 3D model, and place each unit model included in the sub-model in a queue;
and the transformation module is used for responding to the transformation instruction of the submodel, and independently carrying out corresponding transformation on each unit model in the queue so as to reconstruct the 3D model.
7. An electronic device for reconstructing a 3D model, comprising:
a memory configured to store executable instructions;
a processor configured to execute executable instructions stored in the memory to perform the method according to any one of claims 1-5.
8. A computer readable storage medium, characterized in that it stores computer program instructions, which when executed by a computer, cause the computer to perform the method according to any of claims 1-5.
CN201910160385.0A 2019-03-04 2019-03-04 Method and device for reconstructing 3D model, electronic equipment and storage medium Active CN109934909B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910160385.0A CN109934909B (en) 2019-03-04 2019-03-04 Method and device for reconstructing 3D model, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910160385.0A CN109934909B (en) 2019-03-04 2019-03-04 Method and device for reconstructing 3D model, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109934909A CN109934909A (en) 2019-06-25
CN109934909B true CN109934909B (en) 2023-05-23

Family

ID=66986341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910160385.0A Active CN109934909B (en) 2019-03-04 2019-03-04 Method and device for reconstructing 3D model, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109934909B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090065965A (en) * 2007-12-18 2009-06-23 주식회사 케이티 3d image model generation method and apparatus, image recognition method and apparatus using the same and recording medium storing program for performing the method thereof
US9158796B1 (en) * 2013-03-11 2015-10-13 Ca, Inc. Data source modeling methods for heterogeneous data sources and related computer program products and systems
CN109254965A (en) * 2018-08-22 2019-01-22 中国平安人寿保险股份有限公司 Model treatment method and system, storage medium and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8335757B2 (en) * 2009-01-26 2012-12-18 Microsoft Corporation Extracting patterns from sequential data
KR102184766B1 (en) * 2013-10-17 2020-11-30 삼성전자주식회사 System and method for 3D model reconstruction
CN105989198B (en) * 2015-01-29 2019-10-25 中交宇科(北京)空间信息技术有限公司 Highway parametrization method for automatic modeling and system based on BIM
KR101934645B1 (en) * 2017-08-09 2019-01-02 한국동서발전(주) System and method for managing construction activity of 4-dimension using virtual construction simulation
CN108197376A (en) * 2017-12-28 2018-06-22 中铁二局集团有限公司 A kind of temporary building design and modeling method and system based on BIM

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090065965A (en) * 2007-12-18 2009-06-23 주식회사 케이티 3d image model generation method and apparatus, image recognition method and apparatus using the same and recording medium storing program for performing the method thereof
US9158796B1 (en) * 2013-03-11 2015-10-13 Ca, Inc. Data source modeling methods for heterogeneous data sources and related computer program products and systems
CN109254965A (en) * 2018-08-22 2019-01-22 中国平安人寿保险股份有限公司 Model treatment method and system, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN109934909A (en) 2019-06-25

Similar Documents

Publication Publication Date Title
WO2016082311A1 (en) System and method for generating machining control data of numerical control machine tool
CN110855482A (en) Three-dimensional dynamic communication network analog simulation method, system and storage medium
CN113641413B (en) Target model loading updating method and device, readable medium and electronic equipment
CN110968305A (en) Applet visualization generation method, device, equipment and storage medium
CN110478898B (en) Configuration method and device of virtual scene in game, storage medium and electronic equipment
CN111045675A (en) Page generation method, device, equipment and storage medium based on Flutter
CN109388843B (en) Visualization system and method of truss antenna based on VTK (virtual terminal K), and terminal
CN110825807A (en) Data interaction conversion method, device, equipment and medium based on artificial intelligence
CN109766319B (en) Compression task processing method and device, storage medium and electronic equipment
CN110807111A (en) Three-dimensional graph processing method and device, storage medium and electronic equipment
CN107817962B (en) Remote control method, device, control server and storage medium
CN111428165A (en) Three-dimensional model display method and device and electronic equipment
CN111124409A (en) Sketch-based business page generation method, device, equipment and storage medium
CN112807695B (en) Game scene generation method and device, readable storage medium and electronic equipment
CN110764864A (en) Terraform-based visual resource arrangement method
CN110825802A (en) Multi-type database data backup method, device, equipment and storage medium
CN111462269B (en) Image processing method and device, storage medium and electronic equipment
CN109934909B (en) Method and device for reconstructing 3D model, electronic equipment and storage medium
CN112162822A (en) Mirror image construction method, device, equipment and readable storage medium
CN116956402A (en) Forward design method and device for building information model, medium and electronic equipment
CN108667902B (en) Remote control system, method, device and equipment of iOS equipment
CN115544622B (en) Urban and rural participated three-dimensional planning design platform, method, equipment and storage medium
CN110704766A (en) Interface rendering optimization method and device based on real-time snapshot and electronic equipment
US11797277B2 (en) Neural network model conversion method server, and storage medium
CN110465093B (en) Method and device for analyzing inclusion redundant resources based on Unity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant