CN111986311A - Template data conversion system and method - Google Patents


Info

Publication number
CN111986311A
CN111986311A
Authority
CN
China
Prior art keywords
data, unit, scene, module, dimensional space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010785477.0A
Other languages
Chinese (zh)
Inventor
周安斌 (Zhou Anbin)
王野 (Wang Ye)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Jindong Digital Creative Co ltd
Original Assignee
Shandong Jindong Digital Creative Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Jindong Digital Creative Co ltd
Priority to CN202010785477.0A
Publication of CN111986311A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a template data conversion system and method, belonging to the technical field of data conversion. The system comprises an acquisition module, a data processing module and a conversion module, and works with the rendering output of tools such as Autodesk 3ds Max and Adobe After Effects so that the two applications combine and interact seamlessly. This solves the problem that traditional production cannot be linked with post-production, eliminates a large amount of back-and-forth interaction between different software packages, shortens production time and reduces production cost. It also provides a complete three-dimensional space for the post-production stage of a project, greatly reducing the time spent on test renders of the project's spatial relationships and thereby lowering the project's production cost.

Description

Template data conversion system and method
Technical Field
The invention belongs to the technical field of data conversion, and particularly relates to a template data conversion system and a template data conversion method.
Background
A 3D image is an image with a stereoscopic effect, and a 3D film is a film with a stereoscopic effect. Ordinary drawings and photographs, including three-dimensional animation produced on a computer, are flat images that convey depth only through monocular cues such as light and shadow, brightness and perceived realism; they do not exploit binocular vision. A true 3D image is a computer-generated work that makes use of the stereoscopic vision of the two human eyes. In the prior art, however, the process of producing 3D spatial data is cumbersome, resulting in long production cycles and high production costs.
Disclosure of Invention
The embodiment of the invention provides a template data conversion system and a template data conversion method, and aims to solve the problems of long manufacturing period and high manufacturing cost caused by the fact that the existing 3D space data manufacturing process is relatively complicated.
In view of the above problems, the technical solution proposed by the present invention is:
a template data conversion system comprises an acquisition module, a data processing module and a conversion module;
the acquisition module is used for importing corresponding scene data, processing the scene data and transmitting the processed scene data to the data processing module;
the data processing module is used for receiving the scene data of the acquisition module, processing the scene data according to the set corresponding data and transmitting the processed scene data to the conversion module;
and the conversion module is used for receiving the processed scene data, converting the scene data, and storing the converted scene data.
As a preferred technical solution of the present invention, the obtaining module includes an importing unit, and the importing unit is configured to import corresponding scene data, process the scene data, and transmit the processed scene data to the data processing module.
As a preferred technical solution of the present invention, the data processing module includes a data setting unit and a data processing unit. The data setting unit is configured to receive the scene data imported by the importing unit together with the corresponding numerical values set by a user, and to transmit them to the data processing unit. The data processing unit is configured to receive the scene data and the set numerical values from the data setting unit, integrate and scale them to generate fused data, and transmit the fused data to the conversion module.
As a preferred technical solution of the present invention, the conversion module includes a data conversion unit and a storage unit. The data conversion unit is configured to receive the fused data from the data processing unit, convert it into three-dimensional space data, and transmit the three-dimensional space data to the storage unit. The storage unit is configured to receive, process, and store the three-dimensional space data.
In a second aspect, an embodiment of the present invention provides a method based on the template data conversion system, including the following steps:
and S1, importing the scene data, importing the corresponding scene data by the importing unit, processing the scene data and transmitting the processed scene data to the data setting unit.
And S2, setting and processing the spatial parameters, wherein the data setting unit receives the scene data imported by the import unit, simultaneously receives corresponding numerical values set by a user and transmits the numerical values to the data processing unit, and the data processing unit receives the scene data transmitted by the data setting unit and the set corresponding numerical values, integrates and scales the scene data and the corresponding data to generate fused data and transmits the fused data to the data conversion unit.
And S3, converting and storing the three-dimensional space data, wherein the data conversion unit receives the fusion data of the data processing unit, converts the fusion data into the three-dimensional space data and transmits the three-dimensional space data to the storage unit, and the storage unit receives the three-dimensional space data of the data conversion unit, processes the three-dimensional space data and stores the three-dimensional space data.
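As a rough illustration of steps S1 to S3, the pipeline can be modelled as three plain functions, one per module. All names, data shapes and default values below are hypothetical; the patent does not disclose a concrete data format.

```python
from dataclasses import dataclass, field

@dataclass
class SceneData:
    # object name -> per-object parameters (positions, animation values, ...)
    objects: dict
    meta: dict = field(default_factory=dict)

def import_scene(raw: dict) -> SceneData:
    """S1: the importing unit reads the scene and hands it on."""
    return SceneData(objects=dict(raw))

def set_and_process(scene: SceneData, spatial_scale: float,
                    resolution: tuple) -> dict:
    """S2: the data setting unit attaches the user-set values; the data
    processing unit integrates them with the scene data into fused data."""
    return {name: {**params, "scale": spatial_scale, "resolution": resolution}
            for name, params in scene.objects.items()}

def convert_and_store(fused: dict, store: dict) -> dict:
    """S3: the conversion unit produces 3D-space records; the storage
    unit keeps them."""
    for name, params in fused.items():
        store[name] = {"space": "3d", **params}
    return store

store: dict = {}
scene = import_scene({"cam01": {"x": 10.0, "y": 20.0}})
fused = set_and_process(scene, spatial_scale=1.5, resolution=(1920, 1080))
convert_and_store(fused, store)
```

Each function hands a plain dictionary to the next stage, mirroring the module-to-module data transmission described in the steps above.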
The technical scheme provided by the embodiment of the invention has the beneficial effects that at least:
(1) By working with the rendering output of Autodesk 3ds Max, Adobe After Effects and similar tools, the system lets Autodesk 3ds Max and Adobe After Effects combine and interact seamlessly, solves the problem that traditional production cannot be linked with post-production, eliminates a great deal of back-and-forth interaction between different software packages, shortens production time, and reduces production cost.
(2) It provides a complete three-dimensional space for the post-production of a project, greatly reducing the time spent on test renders of the project's spatial relationships and lowering the project's production cost.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
FIG. 1 is a schematic diagram of a template data transformation system according to the present disclosure;
FIG. 2 is a flow chart of a method of a template data transformation system as disclosed herein.
Description of reference numerals: 100-acquisition module, 110-import unit, 200-data processing module, 210-data setting unit, 220-data processing unit, 300-conversion module, 310-data conversion unit and 320-storage unit.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings of the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Example one
Referring to the attached figure 1, the invention provides a technical scheme: a template data conversion system comprises an acquisition module 100, a data processing module 200 and a conversion module 300.
The acquisition module 100 is configured to import corresponding scene data, process the scene data, and transmit the processed scene data to the data processing module 200.
The data processing module 200 is configured to receive the scene data from the obtaining module 100, process the scene data according to the set corresponding data, and transmit the processed scene data to the conversion module 300.
The conversion module 300 is configured to receive the processed scene data, convert the scene data, and store the converted scene data.
Further, the obtaining module 100 includes an importing unit 110, where the importing unit 110 is configured to import corresponding scene data, process the scene data, and transmit the processed scene data to the data processing module 200.
Specifically, the corresponding scene data is imported by the importing unit 110, processed and transmitted to the data setting unit 210.
Further, the data processing module 200 includes a data setting unit 210 and a data processing unit 220, the data setting unit 210 is configured to receive the scene data imported by the importing unit 110, receive a corresponding numerical value set by a user, and transmit the corresponding numerical value to the data processing unit 220, and the data processing unit 220 is configured to receive the scene data transmitted by the data setting unit 210 and the set corresponding numerical value, perform integration and scaling on the scene data and the corresponding data to generate fused data, and transmit the fused data to the conversion module 300.
Specifically, after the data setting unit 210 obtains the scene data transmitted by the importing unit 110, the operator sets a spatial scaling value and a resolution value, which are transmitted to the data processing unit 220. After receiving the scene data, the spatial scaling value, and the resolution value, the data processing unit 220 integrates them, deletes unnecessary data, and transmits the result to the data conversion unit 310.
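The integrate-and-prune step described above can be sketched as a single function: merge the user-set spatial scaling value and resolution into each object's record, apply the scale to positional values, and drop parameters the later stages do not need. The key names (`position`, `rotation`, `debug_id`) are illustrative assumptions, not the patent's actual schema.

```python
def fuse(scene_data: dict, spatial_scale: float, resolution: tuple,
         keep_keys: tuple = ("position", "rotation")) -> dict:
    # Integrate each object's parameters with the user-set spatial scaling
    # and resolution values, and delete keys later stages do not need.
    fused = {}
    for obj, params in scene_data.items():
        kept = {k: v for k, v in params.items() if k in keep_keys}
        if "position" in kept:
            # apply the spatial scaling value to positional coordinates
            kept["position"] = tuple(c * spatial_scale for c in kept["position"])
        kept["resolution"] = resolution
        fused[obj] = kept
    return fused

result = fuse({"box": {"position": (2.0, 4.0, 6.0),
                       "rotation": (0.0, 0.0, 0.0),
                       "debug_id": 7}},
              spatial_scale=0.5, resolution=(1280, 720))
```

Here `debug_id` plays the role of the "unnecessary data" the patent says is deleted before hand-off to the data conversion unit.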
Further, the conversion module 300 includes a data conversion unit 310 and a storage unit 320, the data conversion unit 310 is configured to receive the fusion data of the data processing unit 220, convert the fusion data into three-dimensional space data, and transmit the three-dimensional space data to the storage unit 320, and the storage unit 320 is configured to receive the three-dimensional space data of the data conversion unit 310, process the three-dimensional space data, and store the three-dimensional space data.
Specifically, the data conversion unit 310 receives the relevant data, converts the data into three-dimensional space data, transmits the three-dimensional space data to the storage unit 320, and stores the three-dimensional space data through the storage unit 320.
Example two
The embodiment of the invention also discloses a method of the template data conversion system, which is shown by referring to the attached figure 2 and comprises the following steps:
S1, scene data import: the importing unit 110 imports the corresponding scene data, processes it, and transmits the processed scene data to the data setting unit 210.
Specifically, after the importing unit 110 imports the scene data, the scene data is exported from 3ds Max to After Effects using a MAXScript script and transmitted to the data setting unit 210.
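The patent performs this hand-off with a MAXScript exporter inside 3ds Max; since it discloses no script, the following Python sketch only illustrates the shape of such an interchange step, writing the scene to a JSON file that the compositing side could read back. The file layout and field names are assumptions.

```python
import json
import os
import tempfile

def export_scene(scene: dict, path: str) -> None:
    # Serialize the scene to an interchange file for the compositing side,
    # tagging where the data came from and where it is headed.
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"source": "3dsMax", "target": "AfterEffects",
                   "scene": scene}, f)

def load_scene(path: str) -> dict:
    # Read the interchange file back, returning just the scene payload.
    with open(path, encoding="utf-8") as f:
        return json.load(f)["scene"]

handoff_path = os.path.join(tempfile.gettempdir(), "scene_handoff.json")
export_scene({"cam01": {"x": 10, "y": 20}}, handoff_path)
roundtrip = load_scene(handoff_path)
```

A real implementation would instead emit whatever format the After Effects side expects; the point is only that the export step reduces to serializing per-object parameters to a file.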
S2, spatial parameter setting and processing: the data setting unit 210 receives the scene data imported by the importing unit 110 together with the corresponding values set by the user and transmits them to the data processing unit 220; the data processing unit 220 integrates and scales the scene data with the set values to generate fused data and transmits the fused data to the data conversion unit 310.
Specifically, after the data setting unit 210 obtains the scene data transmitted by the importing unit 110, the operator sets a spatial scaling value and a resolution value, which are transmitted to the data processing unit 220. After receiving the scene data, the spatial scaling value, and the resolution value, the data processing unit 220 integrates them, deletes unnecessary data, and transmits the result to the data conversion unit 310.
S3, three-dimensional space data conversion and storage: the data conversion unit 310 receives the fused data from the data processing unit 220, converts it into three-dimensional space data, and transmits it to the storage unit 320; the storage unit 320 receives, processes, and stores the three-dimensional space data.
Specifically, the data conversion unit 310 receives the relevant data, converts the data into three-dimensional space data, transmits the three-dimensional space data to the storage unit 320, and stores the three-dimensional space data through the storage unit 320.
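A minimal sketch of the conversion-and-storage step, under the assumption (not stated in the patent) that "converting into three-dimensional space data" amounts to promoting each fused record to a 3D-space record; all class and field names are illustrative.

```python
def to_three_d(fused: dict) -> list:
    # Stand-in for the data conversion unit (310): promote each fused
    # record to a 3D-space record (a missing z coordinate defaults to 0.0).
    records = []
    for name, params in fused.items():
        x, y, *rest = params.get("position", (0.0, 0.0))
        z = rest[0] if rest else 0.0
        records.append({"name": name, "position": (x, y, z)})
    return records

class StorageUnit:
    """Stand-in for the storage unit (320): keeps converted records
    keyed by object name."""
    def __init__(self):
        self._db = {}

    def save(self, records: list) -> None:
        for rec in records:
            self._db[rec["name"]] = rec

    def load(self, name: str) -> dict:
        return self._db[name]

storage = StorageUnit()
storage.save(to_three_d({"cam01": {"position": (3.0, 4.0)}}))
```

The separation into a conversion function and a storage class mirrors the patent's split between the data conversion unit 310 and the storage unit 320.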
The invention works with the rendering output of Autodesk 3ds Max, Adobe After Effects and similar tools so that Autodesk 3ds Max and Adobe After Effects combine and interact seamlessly. It solves the problem that traditional production cannot be linked with post-production, eliminates a large amount of back-and-forth interaction between different software packages, shortens production time, and reduces production cost. It also provides a complete three-dimensional space for the post-production of a project, greatly reducing the time spent on test renders of the project's spatial relationships and lowering the project's production cost.
It should be understood that the specific order or hierarchy of steps in the processes disclosed is an example of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not intended to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, invention lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate preferred embodiment of the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. Of course, the processor and the storage medium may reside as discrete components in a user terminal.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in memory units and executed by processors. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art may recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in the specification of the claims is intended to mean a "non-exclusive or".

Claims (5)

1. A template data conversion system, characterized by comprising an acquisition module (100), a data processing module (200) and a conversion module (300);
the acquisition module (100) is used for importing corresponding scene data, processing the scene data and then transmitting the processed scene data to the data processing module (200);
the data processing module (200) is used for receiving the scene data from the acquisition module (100), processing the scene data according to the set corresponding data and transmitting the processed scene data to the conversion module (300);
the conversion module (300) is used for receiving the processed scene data, converting the scene data, and storing the converted scene data.
2. The template data conversion system according to claim 1, wherein the obtaining module (100) includes an importing unit (110), and the importing unit (110) is configured to import corresponding scene data, process the scene data, and transmit the processed scene data to the data processing module (200).
3. The template data conversion system according to claim 1, wherein the data processing module (200) comprises a data setting unit (210) and a data processing unit (220), the data setting unit (210) is configured to receive the scene data imported by the importing unit (110) together with corresponding values set by a user and transmit them to the data processing unit (220), and the data processing unit (220) is configured to receive the scene data and the set corresponding values transmitted by the data setting unit (210), integrate and scale them to generate fused data, and transmit the fused data to the conversion module (300).
4. The template data conversion system according to claim 1, wherein the conversion module (300) comprises a data conversion unit (310) and a storage unit (320), the data conversion unit (310) is configured to receive the fused data from the data processing unit (220), convert the fused data into three-dimensional space data, and transmit the three-dimensional space data to the storage unit (320), and the storage unit (320) is configured to receive the three-dimensional space data from the data conversion unit (310), process the three-dimensional space data, and store the three-dimensional space data.
5. A method of a template data conversion system, applied to the template data conversion system of claim 1, comprising the steps of:
and S1, importing scene data, wherein the corresponding scene data is imported by the importing unit (110), processed and transmitted to the data setting unit (210).
S2, setting and processing spatial parameters, receiving scene data imported by the import unit (110) by the data setting unit (210), receiving corresponding values set by a user and transmitting the values to the data processing unit (220), receiving the scene data transmitted by the data setting unit (210) and the set corresponding values by the data processing unit (220), integrating and scaling the scene data and the corresponding data to generate fused data, and transmitting the fused data to the data conversion unit (310).
And S3, converting and storing the three-dimensional space data, wherein the data conversion unit (310) receives the fusion data of the data processing unit (220), converts the fusion data to generate the three-dimensional space data, transmits the three-dimensional space data to the storage unit (320), and the storage unit (320) receives the three-dimensional space data of the data conversion unit (310), processes the three-dimensional space data and stores the three-dimensional space data.
CN202010785477.0A 2020-08-06 2020-08-06 Template data conversion system and method Pending CN111986311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010785477.0A CN111986311A (en) 2020-08-06 2020-08-06 Template data conversion system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010785477.0A CN111986311A (en) 2020-08-06 2020-08-06 Template data conversion system and method

Publications (1)

Publication Number Publication Date
CN111986311A true CN111986311A (en) 2020-11-24

Family

ID=73445255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010785477.0A Pending CN111986311A (en) 2020-08-06 2020-08-06 Template data conversion system and method

Country Status (1)

Country Link
CN (1) CN111986311A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8274506B1 (en) * 2008-04-28 2012-09-25 Adobe Systems Incorporated System and methods for creating a three-dimensional view of a two-dimensional map
CN104484522A (en) * 2014-12-11 2015-04-01 西南科技大学 Method for building robot simulation drilling system based on reality scene
US20190035129A1 (en) * 2017-07-28 2019-01-31 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
WO2019058266A1 (en) * 2017-09-21 2019-03-28 Varghese Thombra Sobin A system and method for conversion of a floor plan to a 3d scene for creation & rendering of virtual reality architectural scenes, walk through videos and images
CN109558643A (en) * 2018-11-05 2019-04-02 西南交通大学 A kind of modeling of traffic scene and model monomerization approach
CN109816772A (en) * 2018-12-28 2019-05-28 南京维伍网络科技有限公司 A kind of processing method quickly generating virtual reality scenario by CAD house type file
CN110415343A (en) * 2019-08-05 2019-11-05 中国电建集团北京勘测设计研究院有限公司 A kind of engineering BIM visualization of 3 d automotive engine system
CN111047672A (en) * 2019-11-26 2020-04-21 湖南龙诺数字科技有限公司 Digital animation generation system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
老潘 (Lao Pan): "So there is actually a Max-to-AE export plugin; quite interesting", pages 1 - 2, Retrieved from the Internet <URL:https://mp.weixin.qq.com/s/WSmcW2bKqMwWNTTif_wYfw> *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination