CN115115766A - Multispectral scene data generation method and device - Google Patents

Multispectral scene data generation method and device

Info

Publication number
CN115115766A
CN115115766A
Authority
CN
China
Prior art keywords
rendering
scene
spectrum
data
spectral
Prior art date
Legal status
Granted
Application number
CN202210542253.6A
Other languages
Chinese (zh)
Other versions
CN115115766B (en)
Inventor
季向阳
杨楚皙
魏恒璐
连晓聪
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202210542253.6A
Publication of CN115115766A
Application granted
Publication of CN115115766B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a multispectral scene data generation method and device. The method includes the following steps: constructing a three-dimensional geometric model of each object in a simulation scene; setting material properties according to the actual surface properties of each object, and determining the arrangement of light sources in the scene; performing scene rendering in a preset spectral rendering mode based on the three-dimensional geometric model, material properties and light source arrangement of each object, and outputting multispectral scene data from the rendering result after rendering is finished. This addresses the technical problem in the related art that full-pipeline spectral rendering with multispectral data output is difficult to combine with a high-quality rendering effect.

Description

Multispectral scene data generation method and device
Technical Field
The present application relates to the field of image data processing or generation technologies, and in particular, to a method and an apparatus for generating multispectral scene data.
Background
In recent years, with the rise of technologies such as autonomous driving and simulation imaging, simulation scene generation has become increasingly important. The purpose of scene simulation is to reproduce the light distribution entering the lens from a scene in a real environment. To reduce the domain gap between real scene data and simulated scene data as much as possible, a simulation scene generation module should preferably have the following characteristics: 1. flexible and diverse three-dimensional modeling support; 2. physically realistic rendering; 3. support for spectral rendering and spectral data output. Diverse three-dimensional modeling capability allows the geometry of the real world to be reconstructed more faithfully, and physically realistic rendering makes the appearance of objects more lifelike. In particular, in the imaging process, steps such as lens filtering and sensor color filtering all require spectral data, so compared with RGB data, multispectral scene data can effectively improve the simulation accuracy of the lens and the sensor; input/output interfaces for spectral data and support for spectral rendering are therefore especially important for a scene generation module.
Current rendering methods can be mainly divided into rasterization rendering and physically based ray tracing, and realistic rendering is mainly achieved by the latter. Mainstream three-dimensional modeling software includes 3DS Max, Cinema4D, Maya and the like; although such software supports ray-traced rendering, it generally only supports RGB three-channel rendering, and no spectral-rendering extension currently exists for it, so it cannot output multispectral data. Some physically based ray tracing engines support spectral rendering, such as Arion, FluidRay and MaverickRender; although these support spectral computation in the rendering stage, the final output is still RGB three-channel data. In addition, there are some academically oriented rendering engines, such as PBRT and Mitsuba2, which support spectral rendering and output of multispectral data, but their material settings are simple and their combination forms are not flexible enough, so the rendering result sometimes fails to meet expectations. Moreover, PBRT and Mitsuba2 take geometry, illumination and material settings all as text input, provide no visual modeling and editing interface, and offer an unsatisfactory interactive experience. Although related plug-ins support exporting Cinema4D or Blender models to the PBRT file format, problems remain: incomplete material translation, no support for complex geometric transformations, and errors in UV mapping export.
In summary, in the related art it is difficult to combine full-pipeline spectral rendering and multispectral data output with a high-quality rendering effect; only one of the two can be obtained, and visual modeling and interface-based editing cannot be achieved at the same time, so a solution is urgently needed.
Disclosure of Invention
The application provides a multispectral scene data generation method and device, aiming to solve the technical problem in the related art that full-pipeline spectral rendering and multispectral data output are difficult to achieve simultaneously with a high-quality rendering effect.
An embodiment of a first aspect of the present application provides a method for generating multispectral scene data, including the following steps: constructing a three-dimensional geometric model of each object in the simulation scene; setting material properties according to the actual surface properties of the objects, and determining the arrangement of light sources in the scene; and based on the three-dimensional geometric model of each object, the material property and the light source arrangement, performing scene rendering by adopting a preset spectrum rendering mode, and outputting multispectral scene data according to a rendering result after the rendering is finished.
Optionally, in an embodiment of the present application, the setting a material property according to an actual surface property of each object includes: detecting a representation type of input data; and performing data processing in a corresponding data processing mode according to the representation type so that the material attribute meets a spectrum rendering condition.
Optionally, in an embodiment of the present application, when the preset spectral rendering mode is the Spectral Cycles engine, performing scene rendering in the preset spectral rendering mode includes: during rendering, accumulating, one by one through a preset spectral output interface, the spectral rendering results obtained from each random sampling, to obtain the rendering result.
Optionally, in an embodiment of the present application, accumulating, one by one through the preset spectral output interface, the spectral rendering results obtained from each random sampling includes: obtaining the rendering result of the corresponding wavelength channel according to the spatial ray and wavelength of each sample; and storing the spectral data of each pixel by its corresponding wavelength channel.
An embodiment of a second aspect of the present application provides a multispectral scene data generation apparatus, including: the modeling module is used for constructing a three-dimensional geometric model of each object in the simulation scene; the setting module is used for setting material properties according to the actual surface properties of the objects and determining the light source arrangement in the scene; and the generation module is used for rendering the scene by adopting a preset spectrum rendering mode based on the three-dimensional geometric model, the material property and the light source arrangement of each object, and outputting multispectral scene data according to a rendering result after rendering is finished.
Optionally, in an embodiment of the present application, the setting module includes: a detection unit for detecting a representation type of input data; and the processing unit is used for performing data processing in a corresponding data processing mode according to the representation type so as to enable the material attribute to meet a spectrum rendering condition.
Optionally, in an embodiment of the present application, when the preset spectral rendering mode is the Spectral Cycles engine, the generation module includes: an accumulation unit, configured to accumulate, one by one through a preset spectral output interface during rendering, the spectral rendering results obtained from each random sampling, to obtain the rendering result.
Optionally, in an embodiment of the present application, the accumulation unit includes: a calculation subunit, configured to obtain the rendering result of the corresponding wavelength channel according to the spatial ray and wavelength of each sample; and a storage subunit, configured to store the spectral data of each pixel by its corresponding wavelength channel.
An embodiment of a third aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the program to implement the multispectral scene data generation method as described in the above embodiments.
A fourth aspect of the present application provides a computer-readable storage medium, on which a computer program is stored, the program being executed by a processor for implementing the method for generating multispectral scene data according to any one of claims 1 to 4.
With the method and device of the embodiments of the present application, material properties can be set based on the actual surface properties of each object in the simulation scene; the flexible combination of material properties helps achieve the expected rendering result. Scene rendering is performed by combining the corresponding three-dimensional geometric model and the light source arrangement, and multispectral scene data are output from the rendering result, realizing full-pipeline spectral rendering and improving the rendering effect. This addresses the technical problem in the related art that full-pipeline spectral rendering and multispectral data output are difficult to combine with a high-quality rendering effect.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a multispectral scene data generation method according to an embodiment of the present disclosure;
FIG. 2 is a flow diagram of a method of multispectral scene data generation according to an embodiment of the present application;
fig. 3 is a spectrum file input node of a multispectral scene data generation method according to an embodiment of the present application;
FIG. 4 is a flowchart of reading material reflectance properties at corresponding wavelengths during a spectral rendering process according to one embodiment of the present disclosure;
fig. 5 is a spectral reflectance curve corresponding to RGB three channels generated by using ILLSS algorithm according to an embodiment of the present application;
fig. 6 is a schematic storage manner of a spectral data saving method of the multispectral scene data generation method according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a visual rendering result obtained by modeling an MCC chart scene in a D65 lighting environment according to an embodiment of the application;
FIG. 8 is a graph illustrating the results of performing spectral verification on an MCC chart scene in a D65 lighting environment according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a multispectral scene data generation apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
The multispectral scene data generation method and device according to the embodiments of the present application are described below with reference to the drawings. To solve the technical problem mentioned in the Background that full-pipeline spectral rendering and multispectral data output are difficult to achieve simultaneously with a high-quality rendering effect in the related art, the present application provides a multispectral scene data generation method, thereby addressing that problem.
Specifically, fig. 1 is a schematic flowchart of a multispectral scene data generation method according to an embodiment of the present disclosure.
As shown in fig. 1, the multispectral scene data generation method includes the following steps:
in step S101, a three-dimensional geometric model of each object in the simulation scene is constructed.
In the actual execution process, the embodiment of the present application may complete visual three-dimensional geometric modeling of each object in the simulation scene with modeling software, obtaining a three-dimensional mesh representation of the object geometry. For example, Blender has strong geometric modeling capabilities and supports various geometric transformations, particle systems and the like, which facilitates creating film-grade, finely detailed three-dimensional scene models.
In step S102, material properties are set according to the actual surface properties of each object, and the light source arrangement in the scene is determined.
Specifically, the embodiment of the present application may set material properties, such as reflectance, refractive index and texture information, for each object in the scene according to its actual surface properties. Compared with the RGB rendering adopted in the related art, the spectral rendering adopted here requires that the reflection properties of the material (including reflectance, refractive index, and the like) all be wavelength-dependent.
Further, the embodiments of the present application may perform light source arrangement in a scene, wherein the types of light sources applicable to the embodiments of the present application may include point light sources, spotlights, surface light sources, ambient lighting, and the like. The spectrum of the light source can be set by setting the color temperature or inputting a spectrum file.
Optionally, in an embodiment of the present application, setting a material property according to an actual surface property of each object includes: detecting a representation type of input data; and performing data processing in a corresponding data processing mode according to the representation type so that the material attribute meets the spectrum rendering condition.
In an actual implementation process, the embodiment of the present application may perform data processing in a corresponding manner according to the representation type of the input data.
When the input data is represented in RGB, the embodiment of the present application may convert the RGB values into spectral data. In fact, spectral estimation from RGB values is an ill-posed problem: objects with different spectral properties may have the same RGB color appearance. Therefore, the embodiment of the present application takes the reflectance spectrum curves of objects whose RGB values are [255,0,0], [0,255,0] and [0,0,255], denoted C_r, C_g and C_b respectively, as the base curves of the spectral conversion process.
For any RGB value, the corresponding spectral estimate can be regarded as a linear combination of the base curves C_r, C_g and C_b:
S = R_linear × C_r + G_linear × C_g + B_linear × C_b,
where R_linear, G_linear and B_linear are the linear RGB (linear-RGB) values obtained by degamma transformation of the sRGB values. The conversion between sRGB and linear-RGB is:
C_linear = g(C_sRGB),
where C_linear denotes the red, green or blue channel in linear-RGB, C_sRGB denotes the corresponding sRGB channel, and g(K) is the transfer function. When K > 0.04045,
g(K) = ((K + 0.055) / 1.055)^2.4,
otherwise
g(K) = K / 12.92.
The sRGB values need to be normalized to the range [0, 1] before conversion.
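As an illustrative sketch of the conversion described above, the following Python fragment applies the sRGB degamma transfer function and forms the linear combination of base curves. The base curves C_r, C_g and C_b here are hypothetical analytic placeholders; in the method of the present application they would be the reflectance curves generated by the ILLSS algorithm.

```python
import numpy as np

# Hypothetical base curves sampled at 400-700 nm in 10 nm steps; the actual
# method uses ILLSS-generated reflectance curves for pure red, green and blue.
WAVELENGTHS = np.arange(400, 701, 10)
C_r = 1.0 / (1.0 + np.exp(-(WAVELENGTHS - 590) / 15.0))   # red-ish ramp
C_g = np.exp(-((WAVELENGTHS - 540) / 40.0) ** 2)          # green-ish bump
C_b = np.exp(-((WAVELENGTHS - 460) / 40.0) ** 2)          # blue-ish bump

def srgb_to_linear(c):
    """Degamma an sRGB channel value normalized to [0, 1] (the g(K) above)."""
    c = np.asarray(c, dtype=float)
    return np.where(c > 0.04045, ((c + 0.055) / 1.055) ** 2.4, c / 12.92)

def rgb_to_spectrum(rgb_8bit):
    """Estimate S = R_linear*C_r + G_linear*C_g + B_linear*C_b."""
    r, g, b = srgb_to_linear(np.asarray(rgb_8bit) / 255.0)  # normalize first
    return r * C_r + g * C_g + b * C_b

spectrum = rgb_to_spectrum([255, 128, 0])  # an orange-ish material
print(spectrum.shape)                      # → (31,): one value per sampled wavelength
```

Since the base curves are shared by every RGB-defined material, they can be precomputed once per render.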
When the input data is represented as a spectrum, that is, when the input end directly defines the reflection property of an object with spectral data, the embodiment of the present application may first consider whether the wavelength range of the input data is consistent with the wavelength range of the target output; when a wavelength value sampled during rendering does not coincide with a wavelength sampling point of the input data, the embodiment of the present application also needs to handle data interpolation.
For the problem of wavelength-range consistency, the embodiment of the present application may sample wavelengths during rendering with reference to the wavelength range of the target output and feed them back to the input data reading end. If a sampled wavelength is contained in the band range of the input data, the sample is retained, and binary search is used to determine which wavelength interval of the input data the sampled wavelength falls in. Assuming the sampled wavelength λ_i lies in the interval [λ_a, λ_b) of the input data, the spectral value at λ_i is obtained by linear interpolation:
S_i = S_a + ((λ_i − λ_a) / (λ_b − λ_a)) × (S_b − S_a),
where S_a and S_b denote the spectral values of the input data at wavelengths λ_a and λ_b, respectively. If the sampled wavelength is not contained in the band range of the input data, the spectral value at the front or back end point of the input data band is taken as the output spectral value at λ_i.
In step S103, a preset spectral rendering mode is adopted to perform scene rendering based on the three-dimensional geometric model, the material attributes, and the light source arrangement of each object, and after the rendering is finished, multispectral scene data is output according to the rendering result.
In an actual execution process, the embodiment of the present application may perform scene rendering in a preset spectral rendering mode based on the three-dimensional geometric modeling, material property settings and light source arrangement completed in the preceding steps, and output multispectral scene data from the rendering result after rendering is finished, thereby realizing full-pipeline spectral rendering, enabling export of spectral data, and improving the rendering effect.
It should be noted that the spectral rendering manner may be set by a person skilled in the art according to actual requirements, and is not limited herein.
Optionally, in an embodiment of the present application, when the preset spectral rendering mode is the Spectral Cycles engine, performing scene rendering in the preset spectral rendering mode includes: during rendering, accumulating, one by one through a preset spectral output interface, the spectral rendering results obtained from each random sampling, to obtain the rendering result.
For example, when the Spectral Cycles engine is adopted as the preset spectral rendering mode, a spectral data storage and output process may be added on top of Spectral Cycles so that full-pipeline spectral rendering is supported. During rendering, Spectral Cycles samples wavelengths randomly and non-uniformly, and the probability that a sampled wavelength exactly matches a target output wavelength is very low, so data storage methods designed for uniform sampling are not suitable for storing its spectral data. The embodiment of the present application therefore accumulates, one by one through the preset spectral output interface, the spectral rendering results obtained from each random sampling to obtain the rendering result, realizing full-pipeline spectral rendering while ensuring the rendering effect.
Optionally, in an embodiment of the present application, accumulating, one by one through the preset spectral output interface, the spectral rendering results obtained from each random sampling includes: obtaining the rendering result of the corresponding wavelength channel according to the spatial ray and wavelength of each sample; and storing the spectral data of each pixel by its corresponding wavelength channel.
Specifically, the embodiment of the present application may divide a band around each output wavelength as the data receiving range of that center wavelength, where the lower and upper limit wavelengths of the receiving range are:
B_i− = B_i − BW/2,
B_i+ = B_i + BW/2,
where B_i denotes the target output wavelength, the subscript i denotes its channel index, and BW = B_i − B_{i−1} denotes the data bandwidth of the spectral output.
According to this division principle, the spectral output value at wavelength B_i is the mean of all sample values within the band range [B_i−, B_i+).
The final output result can be expressed as:
D(B_i) = (1 / n_[B_i−, B_i+)) × Σ_{λ_j ∈ [B_i−, B_i+)} S(λ_j),
where D(x) denotes the spectral value corresponding to the target wavelength x, S(λ_j) denotes a sampled spectral value, and n_[a, b) denotes the number of samples received within the band range [a, b).
It can be understood that, because the sampling times in different bands are different, the embodiment of the application can dynamically record the number of received sampling points in each band while storing spectral data.
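A minimal sketch of this band-binned accumulation with per-band sample counting, assuming uniformly spaced output channels; the function names are illustrative, not the Spectral Cycles interface.

```python
import numpy as np

def make_accumulator(output_wavelengths):
    """Per-pixel spectral accumulator for randomly sampled wavelengths.
    Channel B_i receives samples from [B_i - BW/2, B_i + BW/2), where BW is
    the channel spacing; counts are tracked per band because random sampling
    hits different bands a different number of times."""
    b = np.asarray(output_wavelengths, dtype=float)
    bw = b[1] - b[0]                  # assumes uniform channel spacing
    sums = np.zeros_like(b)
    counts = np.zeros_like(b)

    def add_sample(lam, value):
        i = int(np.floor((lam - (b[0] - bw / 2)) / bw))
        if 0 <= i < len(b):           # samples outside every band are dropped
            sums[i] += value
            counts[i] += 1

    def result():
        # Channel output = mean of all samples received in its band.
        return np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)

    return add_sample, result

add_sample, result = make_accumulator([450, 460, 470])
add_sample(452.3, 0.8)   # lands in the 450 nm band [445, 455)
add_sample(447.1, 0.6)   # also the 450 nm band
add_sample(463.9, 0.5)   # 460 nm band
print(result())          # per-channel means; empty channels stay 0
```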
The working principle of the multispectral scene data generation method according to the embodiment of the present application is described in detail with reference to fig. 2 to 7.
As shown in fig. 2, the embodiment of the present application may include the following steps:
step S201: and (4) visual geometric modeling. In the geometric modeling stage, the embodiment of the application can complete visual three-dimensional geometric modeling by means of Blender three-dimensional modeling software to obtain the three-dimensional mesh representation of the object geometry.
Step S202: material property setting. Material properties can be set by interconnecting various material nodes in the Blender shading editor; because the embodiment of the present application adopts spectral rendering, the material setting module is required to support reflection property definitions in spectral form.
As shown in fig. 3, the embodiment of the present application provides a spectrum file input node, which supports accurate setting and editing of material reflection properties at each wavelength in text form and enables reflection data reading and interpolation during spectral rendering.
As shown in fig. 4, during rendering, the embodiment of the present application may read data at the target wavelength with different data processing modes according to how the material reflection properties are specified. In the data reading process shown in fig. 4, to obtain a more realistic spectral estimate for the RGB-to-spectrum conversion operation, the embodiment of the present application may generate three reflectance base curves C_r, C_g and C_b with the ILLSS algorithm, representing the reflectance spectrum curves of objects whose RGB values are [255,0,0], [0,255,0] and [0,0,255], respectively. The three reflectance curves generated by the ILLSS algorithm are shown in fig. 5.
Step S203: illumination setting. The scene generation method provided by the embodiment of the present application supports multiple light source types, such as point light sources, spotlights, area light sources and ambient light, and the spectrum of a light source can be set either by specifying a color temperature or by inputting a spectrum file.
When the spectrum of a light source is defined by color temperature, the spectrum at the target color temperature is defined as the radiation spectrum of a black body at the corresponding temperature, so the radiance of the light source at the target wavelength can be calculated from Planck's law of black-body radiation:
I_λ = (2hc² / λ⁵) × 1 / (e^(hc/(λkT)) − 1),
where I_λ denotes the radiance corresponding to the target wavelength λ, T is the color temperature of the light source, h is the Planck constant, c is the speed of light, and k is the Boltzmann constant.
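The Planck-law evaluation can be sketched directly from this formula (CODATA constant values; the normalization at the end is an illustrative choice, not part of the method):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(lam_nm, temp_k):
    """Black-body spectral radiance I = 2hc^2 / (lam^5 (e^(hc/(lam k T)) - 1))
    at wavelength lam_nm (nanometres) and color temperature temp_k (kelvin)."""
    lam = lam_nm * 1e-9  # nm -> m
    return (2.0 * H * C ** 2) / (lam ** 5 * (math.exp(H * C / (lam * K * temp_k)) - 1.0))

# Relative spectrum of a 6500 K source over the visible range, normalized so
# the peak of the sampled range equals 1 (an illustrative normalization).
samples = {lam: planck_radiance(lam, 6500.0) for lam in range(400, 701, 10)}
peak = max(samples.values())
relative = {lam: v / peak for lam, v in samples.items()}
```

Consistent with Wien's displacement law, a 6500 K body peaks near 446 nm, so the relative spectrum declines toward the red end of the visible range.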
When a spectrum file is used to define the spectrum of a light source, the spectrum file input node provided by the embodiment of the present application can be connected to the light source's emission property node to guide the reading of the spectral data, making value setting more accurate and editing more convenient.
Step S204: spectral rendering. When the scene geometry, object materials and light sources are all set, the embodiment of the present application can perform scene rendering with the Spectral Cycles rendering engine.
Spectral Cycles renders the scene with a physically based path tracing algorithm, sampling many times during rendering. In spectral rendering, in addition to spatial ray sampling, the wavelength is also sampled, and Spectral Cycles samples wavelengths randomly and non-uniformly.
As shown in fig. 6, the embodiment of the present application may divide a band around each output wavelength B_i as the data receiving range of that center wavelength, denoted [B_i−, B_i+), where the lower and upper limit wavelengths of the receiving range may be calculated as:
B_i− = B_i − BW/2,
B_i+ = B_i + BW/2,
where B_i denotes the target output wavelength, the subscript i denotes its channel index, and BW = B_i − B_{i−1} denotes the data bandwidth of the spectral output.
During rendering, if a sampled wavelength λ_n satisfies B_i− ≤ λ_n < B_i+, the sample is considered to contribute to the spectral value of the target output wavelength B_i.
Considering that the number of samples differs across bands, the embodiment of the present application also dynamically records the number of sampling points received in each band while storing the spectral data. After rendering, the final spectral output value at the target wavelength B_i is the average of all samples within the band range [B_i−, B_i+); the final output can be expressed as:
D(B_i) = (1 / n_[B_i−, B_i+)) × Σ_{λ_j ∈ [B_i−, B_i+)} S(λ_j),
where D(x) denotes the spectral value corresponding to the target wavelength x, S(λ_j) denotes a sampled spectral value, and n_[a, b) denotes the number of samples received within the band range [a, b).
Step S205: spectrum output. The spectral data of each pixel can be stored by wavelength channel and output as an EXR file, enabling export of the scene spectral data.
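As a sketch of the per-channel storage layout, the spectral image can be held as a (height, width, channel) array with one named plane per wavelength; the channel-naming scheme here is an illustrative assumption, and actual EXR writing is omitted.

```python
import numpy as np

H_PX, W_PX = 4, 6
wavelengths = list(range(450, 651, 50))        # 450, 500, 550, 600, 650 nm

# One plane per wavelength channel, analogous to named channels in an EXR file.
cube = np.zeros((H_PX, W_PX, len(wavelengths)), dtype=np.float32)
channel_names = [f"S.{lam}nm" for lam in wavelengths]   # hypothetical naming

# Write one pixel's spectrum, then read a single wavelength plane back.
cube[2, 3, :] = [0.1, 0.4, 0.7, 0.5, 0.2]
plane_550 = cube[:, :, channel_names.index("S.550nm")]
print(plane_550[2, 3])                          # → 0.7 (stored as float32)
```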
In addition, the simulation scene generation method provided by the embodiment of the present application can be verified with a standard 24-color MCC (Macbeth Color Checker) chart scene as the test scene.
The scene is set up as follows: a neutral gray backboard is placed at the center of the scene, and the test chart is attached to the board with its upper and lower edges kept horizontal; two D65 area light sources are then placed symmetrically, facing the color chart at 45° angles.
During capture, the optical axis of the camera system passes perpendicularly through the center of the color chart, and the shooting distance is 1 m.
A real scene and a simulation scene are built according to these settings, and the agreement between the measured scene radiance and the simulated scene radiance is compared.
A visualization of the rendering result of the simulation scene established according to the method of the embodiment of the present application is shown in fig. 7. The comparison between the measured and simulated radiance spectra of the 24 color patches of the MCC chart is shown in fig. 8; as can be seen from fig. 8, the simulated scene data and the real scene data have a high degree of spectral agreement.
According to the multispectral scene data generation method provided by the embodiment of the application, material attributes can be set based on the actual surface attributes of each object in the simulation scene, and the material attributes can be combined flexibly to achieve the expected rendering result. Scene rendering is performed by combining the corresponding three-dimensional geometric model and light source arrangement, and multispectral scene data are output according to the rendering result, achieving full-pipeline spectral rendering and improving the rendering effect. This solves the technical problem in the related art that full-pipeline spectral rendering is difficult to reconcile with the rendered-data output form and the rendering effect.
Next, a multispectral scene data generation apparatus proposed according to an embodiment of the present application is described with reference to the drawings.
Fig. 9 is a block diagram of a multispectral scene data generation apparatus according to an embodiment of the present application.
As shown in fig. 9, the multispectral scene data generation apparatus 10 includes: a modeling module 100, a setup module 200, and a generation module 300.
Specifically, the modeling module 100 is used for constructing a three-dimensional geometric model of each object in the simulation scene.
The setting module 200 is configured to set material attributes according to the actual surface attributes of each object and to determine the light source arrangement in the scene.
The generating module 300 is configured to perform scene rendering in a preset spectral rendering manner based on the three-dimensional geometric model, material attributes, and light source arrangement of each object, and to output multispectral scene data according to the rendering result after rendering is completed.
Optionally, in an embodiment of the present application, the setting module 200 includes: a detection unit and a processing unit.
The detection unit is used for detecting the representation type of the input data.
The processing unit is used for performing data processing in the data-processing mode corresponding to that representation type, so that the material attributes meet the spectral rendering condition.
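One plausible sketch of this type-dependent processing is given below. The specific representation types handled and the RGB-to-spectrum lifting are assumptions, since the patent only states that processing depends on the detected type:

```python
import numpy as np

WAVELENGTHS = np.arange(400, 701, 10)  # illustrative target sampling grid, nm

def to_spectral_attribute(value):
    """Normalize an input material attribute onto the spectral grid.

    Handled representation types (illustrative):
    - scalar                    -> flat spectrum (uniform reflectance)
    - (wavelengths, values)     -> tabulated spectrum, resampled to the grid
    - length-3 sequence         -> RGB, lifted to a smooth spectrum via three
                                   crude Gaussian basis functions (a placeholder
                                   for a real RGB-to-spectrum upsampling method)
    """
    if np.isscalar(value):
        return np.full(WAVELENGTHS.shape, float(value))
    if isinstance(value, tuple) and len(value) == 2:
        wl, v = value
        return np.interp(WAVELENGTHS, wl, v)
    rgb = np.asarray(value, dtype=float)
    if rgb.shape == (3,):
        centers, widths = (610.0, 540.0, 465.0), (50.0, 45.0, 40.0)
        basis = np.stack([np.exp(-0.5 * ((WAVELENGTHS - c) / w) ** 2)
                          for c, w in zip(centers, widths)])
        return rgb @ basis
    raise ValueError("unsupported material attribute representation")
```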
Optionally, in an embodiment of the present application, when the preset spectrum rendering manner is a spectrum Cycles engine, the generating module 300 includes: and an accumulation unit.
The accumulation unit is used for accumulating, one by one through a preset spectrum output interface during rendering, the spectral rendering results obtained from each random sample, to obtain the rendering result.
Optionally, in an embodiment of the present application, the accumulating unit includes: a calculation subunit and a storage subunit.
The computing subunit is used for obtaining the rendering result of the corresponding wavelength channel according to the spatial light ray and wavelength of each sample;
and the storage subunit stores the spectral data of each pixel according to the corresponding wavelength channel.
It should be noted that the explanation of the embodiment of the multispectral scene data generation method is also applicable to the multispectral scene data generation device of the embodiment, and details are not repeated here.
According to the multispectral scene data generation device provided by the embodiment of the application, material attributes can be set based on the actual surface attributes of each object in the simulation scene, and the material attributes can be combined flexibly to achieve the expected rendering result. Scene rendering is performed by combining the corresponding three-dimensional geometric model and light source arrangement, and multispectral scene data are output according to the rendering result, achieving full-pipeline spectral rendering and improving the rendering effect. This solves the technical problem in the related art that full-pipeline spectral rendering is difficult to reconcile with the rendered-data output form and the rendering effect.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device may include:
memory 1001, processor 1002, and computer programs stored on memory 1001 and executable on processor 1002.
The processor 1002, when executing the program, implements the multispectral scene data generation method provided in the above-described embodiments.
Further, the electronic device further includes:
a communication interface 1003 for communicating between the memory 1001 and the processor 1002.
A memory 1001 for storing computer programs that may be run on the processor 1002.
Memory 1001 may include high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, the communication interface 1003, the memory 1001, and the processor 1002 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 10, but this is not intended to represent only one bus or type of bus.
Optionally, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on one chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete communication with each other through an internal interface.
The processor 1002 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present Application.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor implements the multispectral scene data generation method as above.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or N wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A multispectral scene data generation method is characterized by comprising the following steps:
constructing a three-dimensional geometric model of each object in the simulation scene;
setting material properties according to the actual surface properties of the objects, and determining light source arrangement in a scene; and
and based on the three-dimensional geometric model of each object, the material property and the light source arrangement, performing scene rendering in a preset spectral rendering mode, and outputting multispectral scene data according to a rendering result after rendering is finished.
2. The method of claim 1, wherein said setting material properties according to the actual surface properties of said respective objects comprises:
detecting a representation type of input data;
and performing data processing in a corresponding data processing mode according to the representation type so as to enable the material attribute to meet a spectrum rendering condition.
3. The method according to claim 1, wherein when the preset Spectral rendering mode is a Spectral Cycles engine, the scene rendering by the preset Spectral rendering mode comprises:
and in the rendering process, accumulating the spectrum rendering results obtained by random sampling each time one by utilizing a preset spectrum output interface to obtain the rendering results.
4. The method according to claim 3, wherein accumulating the spectrum rendering results obtained from each random sampling one by using a preset spectrum output interface comprises:
obtaining a rendering result of a corresponding wavelength channel according to the spatial light ray and wavelength of each sample;
and storing the spectral data of each pixel according to the corresponding wavelength channel.
5. An apparatus for generating multispectral scene data, comprising:
the modeling module is used for constructing a three-dimensional geometric model of each object in the simulation scene;
the setting module is used for setting material properties according to the actual surface properties of the objects and determining the light source arrangement in the scene; and
and the generation module is used for rendering the scene by adopting a preset spectrum rendering mode based on the three-dimensional geometric model, the material property and the light source arrangement of each object, and outputting multispectral scene data according to a rendering result after rendering is finished.
6. The apparatus of claim 5, wherein the setup module comprises:
a detection unit for detecting a representation type of input data;
and the processing unit is used for performing data processing in a corresponding data processing mode according to the representation type so as to enable the material attribute to meet a spectrum rendering condition.
7. The apparatus of claim 5, wherein when the predetermined Spectral rendering mode is a Spectral Cycles engine, the generating module comprises:
and the accumulation unit is used for accumulating the spectrum rendering results obtained by random sampling each time one by utilizing a preset spectrum output interface in the rendering process to obtain the rendering results.
8. The apparatus of claim 7, wherein the accumulating unit comprises:
the computing subunit is used for obtaining a rendering result of the corresponding wavelength channel according to the spatial light ray and wavelength of each sample;
and the storage subunit stores the spectral data of each pixel according to the corresponding wavelength channel.
9. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the multispectral scene data generation method of any one of claims 1-4.
10. A computer-readable storage medium, on which a computer program is stored, the program being executable by a processor for implementing the method of generating multispectral scene data as claimed in any one of claims 1 to 4.
CN202210542253.6A 2022-05-17 2022-05-17 Multispectral scene data generation method and device Active CN115115766B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210542253.6A CN115115766B (en) 2022-05-17 2022-05-17 Multispectral scene data generation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210542253.6A CN115115766B (en) 2022-05-17 2022-05-17 Multispectral scene data generation method and device

Publications (2)

Publication Number Publication Date
CN115115766A true CN115115766A (en) 2022-09-27
CN115115766B CN115115766B (en) 2023-03-24

Family

ID=83325727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210542253.6A Active CN115115766B (en) 2022-05-17 2022-05-17 Multispectral scene data generation method and device

Country Status (1)

Country Link
CN (1) CN115115766B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467752A (en) * 2010-11-05 2012-05-23 上海威塔数字科技有限公司 Physical real-time rendering 3D scene method and system thereof
CN108550178A (en) * 2018-04-19 2018-09-18 深浅度视觉科技(大连)有限公司 The virtual glasses texturing rendering intents of AR and system
US20190287216A1 (en) * 2018-03-19 2019-09-19 Mitsubishi Electric Research Laboratories, Inc. Systems and Methods for Multi-Spectral Image Super-Resolution
US20200090398A1 (en) * 2018-09-18 2020-03-19 Microsoft Technology Licensing, Llc Multi-spectral rendering for synthetics
CN111340928A (en) * 2020-02-19 2020-06-26 杭州群核信息技术有限公司 Ray tracing-combined real-time hybrid rendering method and device for Web end and computer equipment
CN112596713A (en) * 2020-12-30 2021-04-02 深圳须弥云图空间科技有限公司 Processing method and device based on illusion engine, electronic equipment and storage medium
CN113674389A (en) * 2021-10-25 2021-11-19 深圳须弥云图空间科技有限公司 Scene rendering method and device, electronic equipment and storage medium
CN113963100A (en) * 2021-10-25 2022-01-21 广东工业大学 Three-dimensional model rendering method and system for digital twin simulation scene

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467752A (en) * 2010-11-05 2012-05-23 上海威塔数字科技有限公司 Physical real-time rendering 3D scene method and system thereof
US20190287216A1 (en) * 2018-03-19 2019-09-19 Mitsubishi Electric Research Laboratories, Inc. Systems and Methods for Multi-Spectral Image Super-Resolution
CN108550178A (en) * 2018-04-19 2018-09-18 深浅度视觉科技(大连)有限公司 The virtual glasses texturing rendering intents of AR and system
US20200090398A1 (en) * 2018-09-18 2020-03-19 Microsoft Technology Licensing, Llc Multi-spectral rendering for synthetics
CN111340928A (en) * 2020-02-19 2020-06-26 杭州群核信息技术有限公司 Ray tracing-combined real-time hybrid rendering method and device for Web end and computer equipment
CN112596713A (en) * 2020-12-30 2021-04-02 深圳须弥云图空间科技有限公司 Processing method and device based on illusion engine, electronic equipment and storage medium
CN113674389A (en) * 2021-10-25 2021-11-19 深圳须弥云图空间科技有限公司 Scene rendering method and device, electronic equipment and storage medium
CN113963100A (en) * 2021-10-25 2022-01-21 广东工业大学 Three-dimensional model rendering method and system for digital twin simulation scene

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHAO TANG: "Multi-spectral imaging system based on light field rendering", SPIE *
JING Hailong et al.: "Optical remote sensing imaging simulation based on PBRT", Microcomputer Applications *
LI Hongning et al.: "Spectrum-based rendering technology and its application in multispectral color reproduction", Laser & Optoelectronics Progress *
MA Chenguang: "Research on high-resolution spectral video acquisition", Acta Electronica Sinica *

Also Published As

Publication number Publication date
CN115115766B (en) 2023-03-24

Similar Documents

Publication Publication Date Title
US7200262B2 (en) 3-dimensional image processing method, 3-dimensional image processing device, and 3-dimensional image processing system
US10726580B2 (en) Method and device for calibration
JP2022032937A (en) Computer vision method and system
US8711171B2 (en) Image processing apparatus, method, and storage medium for performing soft proof processing
CN100571335C (en) Image syncretizing effect real-time estimating method and device based on pixel space relativity
Logothetis et al. A cnn based approach for the near-field photometric stereo problem
CN102985943A (en) Color image processing method, color image processing device, and color image processing program
US10559085B2 (en) Devices, systems, and methods for reconstructing the three-dimensional shapes of objects
Menk et al. Visualisation techniques for using spatial augmented reality in the design process of a car
CN107330966A (en) A kind of rendering intent and device
Pintus et al. State‐of‐the‐art in Multi‐Light Image Collections for surface visualization and analysis
Ciortan et al. A practical reflectance transformation imaging pipeline for surface characterization in cultural heritage
CN113379698A (en) Illumination estimation method based on step-by-step joint supervision
CN113533256A (en) Method, device and equipment for determining spectral reflectivity
Zhao et al. Adaptive light estimation using dynamic filtering for diverse lighting conditions
CN116167932A (en) Image quality optimization method, device, equipment and storage medium
CN105574844B (en) Rdaiation response Function Estimation method and apparatus
CN115115766B (en) Multispectral scene data generation method and device
CN112330654A (en) Object surface material acquisition device and method based on self-supervision learning model
CN111105365A (en) Color correction method, medium, terminal and device for texture image
CN110335219A (en) A kind of bearing calibration, means for correcting and the terminal of pixel distortion
CN110310341A (en) The generation method of default parameters, device, equipment and storage medium in color algorithm
CN114170367B (en) Method, apparatus, storage medium, and device for infinite-line-of-sight pyramidal heatmap rendering
JP7412610B2 (en) Using bidirectional texture functions
Koch et al. Hardware design and accurate simulation for benchmarking of 3D reconstruction algorithms

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant