CN116485704A - Illumination information processing method and device, electronic equipment and storage medium


Info

Publication number
CN116485704A
CN116485704A
Authority
CN
China
Prior art keywords
illumination
real
virtual
scene
lamp
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211484001.9A
Other languages
Chinese (zh)
Inventor
吴卓莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202211484001.9A
Publication of CN116485704A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 Indexing scheme for image rendering
    • G06T2215/16 Using real world measurements to influence rendering
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The application provides an illumination information processing method, an illumination information processing apparatus, an electronic device, a computer program product, and a computer-readable storage medium. The method comprises the following steps: acquiring real illumination information corresponding to a real scene; acquiring virtual illumination information corresponding to a virtual scene; performing comparison calculation on the virtual illumination information and the real illumination information to obtain illumination difference information; performing light attenuation calculation based on the illumination difference information to obtain configuration parameters of at least one supplementary light source, wherein the supplementary light source is used for supplementary illumination; performing matching processing based on the configuration parameters of each supplementary light source and a lamp database to obtain a target lamp in the lamp database corresponding to each supplementary light source; and taking each target lamp and its corresponding configuration parameters as configuration information, and performing scene illumination synchronization based on the configuration information. With the method and apparatus of the application, the accuracy of illumination synchronization between a virtual scene and a real scene can be improved.

Description

Illumination information processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to computer technology, and in particular, to a method and apparatus for processing illumination information, an electronic device, and a storage medium.
Background
Display technologies based on graphics processing hardware have expanded the channels for perceiving the environment and acquiring information. In particular, display technologies for virtual scenes can realize, according to actual application requirements, diversified interactions between virtual objects controlled by users or by artificial intelligence, and they have various typical application scenarios, such as video and film production based on virtual scenes.
Virtual production combines virtual reality, augmented reality, computer-generated imagery (CGI), and game engine technology. In some cases, during virtual production, a real scene needs to be shot to obtain material, and the virtual scene and the real scene need to have the same lighting atmosphere. In the related art, illumination synchronization between a virtual scene and a real scene usually depends on manual adjustment by technicians; the operation efficiency is low, and the accuracy of illumination synchronization is affected by subjective human factors.
In short, the related art lacks an effective way to synchronize the illumination of a virtual scene and a real scene.
Disclosure of Invention
The embodiments of the present application provide an illumination information processing method, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product, which can improve the accuracy of illumination synchronization between virtual scenes and real scenes.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a lighting information processing method, which comprises the following steps:
acquiring real illumination information corresponding to a real scene;
obtaining virtual illumination information corresponding to a virtual scene;
comparing and calculating the virtual illumination information with the real illumination information to obtain illumination difference information;
performing light attenuation calculation processing based on the illumination difference information to obtain configuration parameters of at least one supplementary light source, wherein the supplementary light source is used for supplementary illumination;
performing matching processing based on the configuration parameters of each supplementary light source and a lamp database to obtain a target lamp corresponding to each supplementary light source in the lamp database;
and taking each target lamp and the configuration parameters corresponding to each target lamp as configuration information, and performing scene illumination synchronization based on the configuration information.
The embodiment of the application provides an illumination information processing device, which comprises:
the illumination acquisition module is configured to acquire real illumination information corresponding to a real scene;
the illumination acquisition module is further configured to acquire virtual illumination information corresponding to the virtual scene;
the difference acquisition module is configured to perform comparison calculation on the virtual illumination information and the real illumination information to obtain illumination difference information;
the light source configuration module is configured to perform light attenuation calculation processing based on the illumination difference information to obtain configuration parameters of at least one supplementary light source, wherein the supplementary light source is used for carrying out supplementary illumination;
the light source configuration module is further configured to perform matching processing with a lamp database based on the configuration parameters of each supplementary light source to obtain a target lamp corresponding to each supplementary light source in the lamp database;
and the illumination synchronization module is configured to take each target lamp and the configuration parameters corresponding to each target lamp as configuration information, and perform scene illumination synchronization based on the configuration information.
An embodiment of the present application provides an electronic device, including:
a memory for storing computer executable instructions;
and the processor is used for realizing the illumination information processing method provided by the embodiment of the application when executing the computer executable instructions stored in the memory.
The embodiment of the application provides a computer readable storage medium, which stores computer executable instructions for implementing the illumination information processing method provided by the embodiment of the application when being executed by a processor.
The embodiment of the application provides a computer program product, which comprises a computer program or computer executable instructions, and when the computer program or the computer executable instructions are executed by a processor, the illumination information processing method provided by the embodiment of the application is realized.
The embodiment of the application has the following beneficial effects:
Illumination difference information is obtained by acquiring the illumination information in the virtual scene and the real scene and performing comparison calculation on them, which improves the accuracy of acquiring the illumination difference between the virtual scene and the real scene. The configuration parameters of the supplementary light sources are obtained based on the illumination difference information, the target lamps used for supplementary lighting are obtained from the lamp database, and illumination synchronization is performed, based on the configuration information, on the real scene or virtual scene whose illumination is to be supplemented. This improves the accuracy and efficiency of illumination synchronization and saves the cost required to perform it.
Drawings
Fig. 1A is a schematic diagram of an application mode of an illumination information processing method according to an embodiment of the present application;
fig. 1B is a schematic diagram of an application mode of an illumination information processing method according to an embodiment of the present application;
fig. 2A is a schematic structural diagram of a server 200 according to an embodiment of the present application;
Fig. 2B is a schematic structural diagram of a terminal device 400 provided in an embodiment of the present application;
fig. 3A to 3G are schematic flow diagrams of an illumination information processing method according to an embodiment of the present application;
fig. 4 is an interactive flow diagram of an illumination information processing method according to an embodiment of the present application;
FIG. 5A is a schematic illustration of a reference provided in an embodiment of the present application;
FIG. 5B is a schematic diagram of a handheld spectrometer provided in an embodiment of the present application;
fig. 5C to 5D are schematic diagrams of a man-machine interaction interface of the handheld spectrometer provided in the embodiments of the present application;
FIG. 5E is a luminance simulation diagram of illumination information provided in an embodiment of the present application;
fig. 6A to 6B are schematic flow diagrams of an illumination information processing method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a real scene provided in an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of protection of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", "third", and the like are merely used to distinguish similar objects and do not imply a particular ordering of the objects. It should be understood that, where permitted, the specific order or sequence may be interchanged so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
It should be noted that, when the embodiments of the present application involving related data such as user information and user feedback data are applied to specific products or technologies, user permission or consent needs to be obtained, and the collection, use, and processing of such data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
Before further describing embodiments of the present application in detail, the terms and expressions that are referred to in the embodiments of the present application are described, and are suitable for the following explanation.
1) Virtual scene: a scene, output by a device, that differs from the real world. Visual perception of a virtual scene can be formed by the naked eye or with the assistance of a device, for example a two-dimensional image output by a display screen, or a three-dimensional image output by three-dimensional display technologies such as stereoscopic projection, virtual reality, and augmented reality; in addition, various perceptions simulating the real world, such as auditory, tactile, olfactory, and motion perception, can be formed through various possible hardware. In the embodiments of the present application, the virtual scene may be a digital scene produced in a game engine according to technicians' requirements or modeled on a real scene, or a scene produced for a virtual production shooting studio.
2) Virtual production: a broad term referring to the various methods of producing visual film content with the aid of computers. Weta Digital defines virtual production as "the region where the real world and the digital world blend". The Moving Picture Company (MPC) further refines the technical details of this definition: virtual production combines virtual reality, augmented reality, computer-generated imagery (CGI), and game engine technology, so that producers can see scenes unfold in front of them as if they were being composited and shot live.
3) Light-Emitting Diode (LED) curtain wall, hereinafter referred to as the LED curtain wall: a large LED screen used in a virtual production shooting studio to display virtual content.
4) Virtual film studio: a real studio used for shooting virtual production films. The common virtual studio is a green-screen studio that combines on-site lighting, wire work, and special props with post-production special effects; the newer studios combine an LED curtain wall, motion capture technology, and camera tracking.
5) Live camera: a camera used to capture the fused picture of the LED screen and the scene in front of the screen. In the embodiments of the present application, the live camera is referred to as the real camera.
6) Physical lighting: light emitted by the physical lamps (real lamps) used to illuminate persons and scenery in a virtual shooting studio.
7) Unreal Engine (UE): a game engine developed by Epic Games, one of the most widely licensed game engines in the world, widely applied to content production beyond the game field.
8) Gray sphere: a sphere used as a reference object to indicate the direction of light and the character of shadows (for example, hard shadows or soft shadows), and to ensure consistent light levels, color temperature of the light, and so on.
9) Chrome sphere: a reference sphere with a smooth mirror surface, used for capturing reflections and illumination and for aligning high dynamic range images.
10) Standard color plate, also called a color difference meter: developed based on the three primary colors of colorimetry and formed by sintering colored-glaze porcelain blocks through a special process, it comprises at least the five colors red, yellow, green, blue, and white. The standard color plate has a smooth surface, uniform color, high color saturation, and stable optical, physical, and chemical properties; it covers the entire visible spectrum and can be used as a standard measuring instrument for color measurement.
11) High Dynamic Range (HDR): an image that provides a greater dynamic range and more image detail than an ordinary image. A high dynamic range image can be synthesized by acquiring low dynamic range (LDR) images at different exposure times and using the LDR image with the best detail at each exposure time. A high dynamic range image better reflects the visual effect of the real environment.
12) High Dynamic Range image (HDRi): a format for digitally storing images, used to store high dynamic range images. An HDRi image covers a very wide brightness range and stores more brightness data than other formats. It also records brightness differently from a conventional picture: instead of nonlinearly compressing the brightness information into an 8-bit or 16-bit color space, it records the brightness information in direct correspondence.
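To illustrate the difference in storage (a minimal Python sketch, not tied to any particular HDR file format; the gamma value 2.2 is an illustrative assumption):

```python
import numpy as np

# Scene luminance in arbitrary linear units; values above 1.0
# would be clipped by a conventional 8-bit encoding.
luminance = np.array([0.05, 0.5, 1.0, 3.0, 30.0], dtype=np.float32)

# Conventional LDR storage: gamma-compress (nonlinear), clip to [0, 1],
# then quantize to 8-bit integers. Bright detail above 1.0 is lost.
ldr = np.round(np.clip(luminance, 0.0, 1.0) ** (1 / 2.2) * 255).astype(np.uint8)

# HDR-style storage: keep the linear luminance directly as floats,
# so the full brightness range survives (direct correspondence).
hdr = luminance.copy()

print(ldr)  # [ 65 186 255 255 255] -> 3.0 and 30.0 become indistinguishable
print(hdr)  # [ 0.05  0.5  1.   3.  30. ] -> all values preserved
```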
13) Digital Multiplex (DMX) signal console: a console that controls the lighting of luminaires through digital signals, commonly used for stage lighting, studios, and the like.
The embodiments of the present application provide an illumination information processing method, an illumination information processing apparatus, an electronic device, a computer-readable storage medium, and a computer program product, which can improve the accuracy of illumination synchronization between virtual scenes and real scenes.
An exemplary application of the electronic device provided in the embodiments of the present application is described below. The electronic device may be implemented as a notebook computer, a tablet computer, a desktop computer, a set-top box, a mobile device (for example, a mobile phone, a portable music player, a personal digital assistant, a dedicated messaging device, or a portable game device), a vehicle-mounted terminal, a Virtual Reality (VR) device, an Augmented Reality (AR) device, or various other types of user terminal, and may also be implemented as a server. Exemplary applications in which the device is implemented as a terminal device or a server are described below.
The virtual scene may be a game virtual scene. Before describing fig. 1A, the game modes involved when the terminal device and the server cooperate are first described. For a scheme implemented cooperatively by a terminal device and a server, two game modes are mainly involved: a local game mode and a cloud game mode. In the local game mode, the terminal device and the server cooperatively run the game processing logic: operation instructions input by the player on the terminal device are partly processed by the game logic running on the terminal device and partly by the game logic running on the server, where the game logic run by the server is usually more complex and consumes more computing power. In the cloud game mode, the server runs the game logic, and a cloud server renders the game scene data into audio and video streams that are transmitted to the terminal device for display; the terminal device only needs basic streaming media playback capability and the capability of acquiring the player's operation instructions and sending them to the server.
Referring to fig. 1A, fig. 1A is a schematic diagram of an application mode of an illumination information processing method according to an embodiment of the present application; for example, fig. 1A relates to a server 200, a network 300, a terminal device 400, and a luminaire device 500. The terminal device 400 is connected to the server 200 through the network 300; the terminal device 400 is connected to the lamp device 500 through the network 300, or the terminal device 400 is directly connected to the lamp device 500, and the network 300 may be a wide area network or a local area network, or a combination of the two.
By way of example, the real scene is a shooting studio for virtual production, the virtual scene is the virtual scene of a virtual production film, the luminaire device 500 is a luminaire in the studio, and the server 200 runs the game engine and is adapted to complete the virtual scene calculation using the computing power of the server 200 and to output the virtual scene on the terminal device 400; the terminal device 400 integrates a digital dimming signal console for controlling the luminaire device 500. This is explained below in connection with the above examples.
In some embodiments, the server 200 receives the real illumination information corresponding to the real scene sent by the terminal device 400 and obtains the virtual illumination information corresponding to the virtual scene; it compares the two and determines the target luminaires, and their configuration parameters, for the scene whose illumination is to be supplemented. When the virtual scene is to be supplemented with illumination, the server 200 synchronizes the illumination of the virtual scene and sends the synchronized virtual scene picture to the terminal device 400, which displays the corresponding picture. When the real scene is to be supplemented with illumination, the server 200 sends the target luminaires and their configuration parameters to the terminal device 400; the terminal device 400 generates a digital dimming signal based on the configuration parameters and sends it to the luminaire devices 500 corresponding to the target luminaires, so that the illumination in the real scene is synchronized with the virtual scene.
Before describing fig. 1B, its application scenario is explained: fig. 1B is suitable for application modes in which the calculation of virtual scene data is completed entirely by the graphics processing hardware computing capability of the terminal device 400, for example games in stand-alone/offline mode, where the output of the virtual scene is completed through various types of terminal device 400 such as smart phones, tablet computers, and virtual reality/augmented reality devices. By way of example, the types of graphics processing hardware include the Central Processing Unit (CPU) and the Graphics Processing Unit (GPU).
In some embodiments, referring to fig. 1B, fig. 1B is a schematic diagram of an application mode of an illumination information processing method provided in an embodiment of the present application; for example, fig. 1B relates to a network 300, a terminal device 400, and a luminaire device 500. The terminal device 400 is connected to the lamp device 500 through the network 300, or the terminal device 400 is directly connected to the lamp device 500, and the network 300 may be a wide area network or a local area network, or a combination of the two.
By way of example, the real scene is a shooting studio for virtual production, the virtual scene is the virtual scene of a virtual production film, the luminaire device 500 is a luminaire in the studio, the terminal device 400 runs the engine of the virtual scene, and the terminal device 400 integrates a digital dimming signal console for controlling the luminaire device 500. This is explained below in connection with the above examples.
In some embodiments, the terminal device 400 obtains the real illumination information corresponding to the real scene and the virtual illumination information corresponding to the virtual scene; it compares the two and determines the target luminaires, and their configuration parameters, for the scene whose illumination is to be supplemented. When the virtual scene is to be supplemented with illumination, the terminal device 400 synchronizes the illumination of the virtual scene and displays the synchronized virtual scene picture. When the real scene is to be supplemented with illumination, the terminal device 400 generates a digital dimming signal based on the target luminaires and their configuration parameters and sends it to the luminaire devices 500 corresponding to the target luminaires, so that the illumination in the real scene is synchronized with the virtual scene.
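As an illustration of packing configuration parameters into a digital dimming signal, the following is a minimal sketch; the 4-channel fixture profile, the normalization, and the channel layout are assumptions for illustration only, since real DMX fixtures define their own channel maps:

```python
from dataclasses import dataclass

@dataclass
class LuminaireConfig:
    address: int      # hypothetical DMX start address (1-based) of the fixture
    intensity: float  # luminous intensity, normalized to 0.0..1.0
    r: float          # light source color, normalized RGB components
    g: float
    b: float

def build_dmx_frame(configs: list[LuminaireConfig]) -> bytes:
    """Pack configs into a single 512-channel DMX frame.

    Assumes a hypothetical 4-channel fixture profile (dimmer, red,
    green, blue) starting at each fixture's address.
    """
    channels = bytearray(512)
    for cfg in configs:
        base = cfg.address - 1  # DMX addresses are conventionally 1-based
        for offset, value in enumerate((cfg.intensity, cfg.r, cfg.g, cfg.b)):
            channels[base + offset] = min(255, max(0, round(value * 255)))
    return bytes(channels)

# Example: one target luminaire at address 1, warm light at half intensity.
frame = build_dmx_frame([LuminaireConfig(1, 0.5, 1.0, 0.85, 0.6)])
print(frame[:8])  # first channel values of the frame sent toward the fixtures
```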
The embodiments of the present application may be implemented with blockchain technology: the luminaire data used by the embodiments may be uploaded to a blockchain for storage, and a consensus algorithm guarantees the reliability of illumination synchronization. Blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain, essentially a decentralized database, is a chain of data blocks generated in association by cryptographic means, each data block containing a batch of network transaction information used to verify the validity of the information (anti-counterfeiting) and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product services layer, and an application services layer.
The embodiments of the present application may be implemented with database technology. A database can be regarded, in short, as an electronic filing cabinet, that is, a place for storing electronic files, in which a user can add, query, update, and delete the data in the files. A "database" is a collection of data that is stored together in a way that can be shared with multiple users, has as little redundancy as possible, and is independent of applications.
A Database Management System (DBMS) is a computer software system designed for managing databases, and generally has basic functions such as storage, retrieval, security, and backup. Database management systems may be classified according to the database models they support, such as relational or XML (Extensible Markup Language); according to the types of computer they support, such as server clusters or mobile phones; according to the query languages used, such as Structured Query Language (SQL) or XQuery; according to their performance emphasis, such as maximum scale or maximum operating speed; or by other classification schemes. Regardless of the classification used, some DBMSs can span categories, for example supporting multiple query languages simultaneously.
The embodiments of the present application may also be implemented through cloud technology. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology, and the like applied under the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support. Background services of technical network systems, such as video websites, picture websites, and other portal websites, require a large amount of computing and storage resources. Along with the development of the internet industry and the demands of search services, social networks, mobile commerce, open collaboration, and the like, every article may carry its own hash code identifier, which needs to be transmitted to a background system for logical processing; data of different levels are processed separately, and all kinds of industry data require strong system backing support, which can only be realized through cloud computing.
In some embodiments, the server 200 in fig. 1A may be an independent physical server, a server cluster or distributed system composed of a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and basic cloud computing services such as big data and artificial intelligence platforms. The terminal device may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal device and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiments of the present application.
Referring to fig. 2A, fig. 2A is a schematic structural diagram of the server 200 provided in an embodiment of the present application. The server 200 shown in fig. 2A includes: at least one processor 410, a memory 450, and at least one network interface 420. The various components in the server 200 are coupled together by a bus system 440. It is understood that the bus system 440 is used to enable connected communication between these components. In addition to the data bus, the bus system 440 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are labeled as bus system 440 in fig. 2A.
The processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (for example, a microprocessor or any conventional processor), a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 450 optionally includes one or more storage devices physically remote from processor 410.
Memory 450 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (RAM, random Access Memory). The memory 450 described in the embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
an operating system 451, including system programs such as a framework layer, a core library layer, and a driver layer, used to handle various basic system services and perform hardware-related tasks, thereby implementing various basic services and processing hardware-based tasks;
a network communication module 452 for reaching other electronic devices via one or more (wired or wireless) network interfaces 420, exemplary network interfaces 420 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like.
In some embodiments, the apparatus provided in the embodiments of the present application may be implemented in software, and fig. 2A shows the illumination information processing apparatus 455 stored in the memory 450, which may be software in the form of a program, a plug-in, or the like, including the following software modules: the illumination acquisition module 4551, the difference acquisition module 4552, the light source configuration module 4553, and the illumination synchronization module 4554. These modules are logical, and may therefore be arbitrarily combined or further split according to the functions implemented. The functions of the respective modules are described below.
Referring to fig. 2B, fig. 2B is a schematic structural diagram of the terminal device 400 provided in an embodiment of the present application. The terminal 400 shown in fig. 2B includes: at least one processor 410, a memory 450, at least one network interface 420, and a user interface 430. The various components in the terminal 400 are coupled together by a bus system 440. It is understood that the bus system 440 is used to enable connected communication between these components. In addition to the data bus, the bus system 440 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are labeled as bus system 440 in fig. 2B.
The processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (for example, a microprocessor or any conventional processor), a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The user interface 430 includes one or more output devices 431, including one or more speakers and/or one or more visual displays, that enable presentation of the media content. The user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
Memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 450 optionally includes one or more storage devices physically remote from processor 410.
Memory 450 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (RAM, random Access Memory). The memory 450 described in the embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
an operating system 451, including system programs such as a framework layer, a core library layer, and a driver layer, used to handle various basic system services and perform hardware-related tasks, thereby implementing various basic services and processing hardware-based tasks;
a network communication module 452 for reaching other electronic devices via one or more (wired or wireless) network interfaces 420, exemplary network interfaces 420 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like;
A presentation module 453 for enabling presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 431 (e.g., a display screen, speakers, etc.) associated with the user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the apparatus provided in the embodiments of the present application may be implemented in software, and fig. 2B shows the illumination information processing apparatus 455 stored in the memory 450, which may be software in the form of a program, a plug-in, or the like, including the following software modules: the illumination acquisition module 4551, the difference acquisition module 4552, the light source configuration module 4553, and the illumination synchronization module 4554. These modules are logical, and may therefore be arbitrarily combined or further split according to the functions implemented. The functions of the respective modules are described below.
The illumination information processing method provided by the embodiment of the application will be described with reference to an exemplary application and implementation of the terminal provided by the embodiment of the application.
Next, the illumination information processing method provided in the embodiments of the present application is described. As mentioned above, the electronic device implementing the method may be a terminal device, a server, or a combination of the two. Referring to fig. 3A, fig. 3A is a schematic flow chart of an illumination information processing method according to an embodiment of the present application; the steps shown in fig. 3A are described below.
In step 301, real illumination information corresponding to a real scene is acquired.
For example, the illumination information may be obtained by acquiring a high dynamic range image. The illumination information includes at least the following parameters: illumination intensity, color temperature, illumination color, and illumination direction.
In some embodiments, the real scene includes at least one real reference object, and the real reference objects include: a gray sphere, a chrome sphere, and a standard color plate. Referring to fig. 5A, fig. 5A is a schematic diagram of a reference object provided in an embodiment of the present application; fig. 5A shows a real reference object, and a corresponding virtual reference object can be simulated in the virtual scene. The real reference object includes: a gray sphere 503A, a chrome sphere 501A, a standard color plate 502A, and a bracket 504A. The following parameters can be obtained from the HDRi image corresponding to the gray sphere: the direction of the light, the character of the shadows (for example, hard shadows or soft shadows, while ensuring consistent light levels), the color temperature of the light, and so on. The following can be obtained from the HDRi image corresponding to the chrome sphere: reflections, illumination, and the alignment of high dynamic range images. The color plate serves as a color standard reference; it is affected by illumination and performs differently under different spectra, and can be used as a reference when reconstructing the on-site ambient illumination. The bracket 504A is used to support and integrate the three reference objects.
In some embodiments, referring to fig. 3B, fig. 3B is a schematic flow chart of a light information processing method provided in the embodiments of the present application, and step 301 may be implemented by the following steps 3011 to 3014, which are specifically described below.
In step 3011, when real lamps are set in the real scene, sub-illumination information is detected at the reference object position of each real reference object, and the pieces of sub-illumination information are combined to obtain the real illumination information of the real scene.
For example, the real scene may be a studio, and the reference object position of a real reference object may be the position of the object to be photographed during film shooting. The sub-illumination information of a reference object position may be obtained through a spectrometer and includes: the reference position, illumination intensity, illumination color, color temperature, and illumination direction. Each piece of sub-illumination information can be treated as a vector, and the vectors are combined into an illumination information matrix; this matrix is the real illumination information of the real scene, as sketched below.
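The following is a minimal sketch of building such a matrix (the field layout and encodings, e.g. color as a linear RGB triple and direction as a unit vector, are illustrative assumptions):

```python
import numpy as np

def sub_illumination_vector(position, intensity_lux, color_rgb, color_temp_k, direction):
    """Flatten one measurement at a reference object position into a vector."""
    d = np.asarray(direction, dtype=np.float32)
    d = d / np.linalg.norm(d)  # store the illumination direction as a unit vector
    return np.concatenate([
        np.asarray(position, dtype=np.float32),   # reference position (x, y, z)
        [np.float32(intensity_lux)],              # illumination intensity
        np.asarray(color_rgb, dtype=np.float32),  # illumination color
        [np.float32(color_temp_k)],               # color temperature
        d,                                        # illumination direction
    ])

# One row per reference object position; the stacked matrix is the
# real illumination information of the scene.
measurements = [
    sub_illumination_vector((0.0, 0.0, 1.0), 800.0, (1.0, 0.95, 0.9), 5600.0, (0, -1, 0)),
    sub_illumination_vector((2.0, 0.0, 1.0), 450.0, (1.0, 0.8, 0.6), 3200.0, (-1, -1, 0)),
]
real_illumination = np.stack(measurements)  # shape: (num_references, 11)
```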
In step 3012, when no real lamp is set in the real scene, the following processing is performed for each real reference object in the real scene: the real reference object is shot with different exposure times to obtain a plurality of first low dynamic range images corresponding to the real reference object.
Step 3012 is implemented, for example, by a real camera. The position of the real camera may be the camera position used during film shooting; the real camera shoots the same real reference object from the same position with different exposure times, obtaining a plurality of first low dynamic range images with different exposure times.
In step 3013, a first high dynamic range image is synthesized based on each of the first low dynamic range images.
For example, the first high dynamic range image carries sub-illumination information of a reference object position corresponding to the real reference object, and a pixel bit depth of the first high dynamic range image is higher than a pixel bit depth of the first low dynamic range image.
The following illustrates the process of synthesizing a first high dynamic range image. For example, the following three first low dynamic range images are acquired for a real reference object: 1. an underexposed image, used to capture the very bright portions of the scene; 2. a normally exposed image, captured by the camera according to the illuminance predicted from natural light; 3. an overexposed image, used to capture the very dark portions of the scene. The exposure levels of the three images rise in sequence. The three images are aligned to eliminate ghosting in the high dynamic range image, and the aligned images are merged into one high dynamic range image, which carries the illumination information.
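A minimal sketch of such a merge using OpenCV (file names and exposure times are illustrative assumptions; Debevec merging is one common choice, not necessarily the method used in the embodiments):

```python
import cv2
import numpy as np

# Three LDR exposures of the same real reference object (hypothetical files).
files = ["under.jpg", "normal.jpg", "over.jpg"]
times = np.array([1 / 250, 1 / 60, 1 / 15], dtype=np.float32)  # exposure times in seconds
images = [cv2.imread(f) for f in files]

# Align the exposures to suppress ghosting before merging.
cv2.createAlignMTB().process(images, images)

# Recover the camera response curve and merge into one HDR image
# whose float pixels are proportional to scene radiance.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

cv2.imwrite("reference.hdr", hdr)  # Radiance .hdr preserves the float range
```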
In step 3014, the sub-illumination information of each first high dynamic range image is combined to obtain real illumination information of the real scene.
For example, the real scene includes a plurality of real references, each real reference corresponds to at least one first high dynamic range image, sub-illumination information corresponding to each first high dynamic range image is characterized as a vector form, the vectors are combined into a matrix, and the matrix is used as real illumination information of the real scene.
The embodiments of the present application improve the accuracy of acquiring illumination information by acquiring it at a plurality of positions in the real scene. Acquiring illumination information by shooting high dynamic range images improves the efficiency and accuracy of the acquisition; moreover, a high dynamic range image can be read directly by a computer, saving the computing resources required to acquire the illumination information.
In some embodiments, prior to step 301, environmental parameters in the real scene are acquired, wherein the environmental parameters include the size of the real scene; and constructing a virtual scene based on the environment parameters of the real scene, wherein the virtual scene corresponds to the real scene one by one.
For example, the virtual scene may be a virtual scene for performing virtual production, the real scene is a studio, and the virtual scene is a scene simulated according to a scale of the real scene. The relative position of the virtual reference in the virtual scene is the same as the relative position of the real reference in the real scene. And the positions of the virtual camera and the real camera are also in one-to-one correspondence.
In step 302, virtual lighting information corresponding to a virtual scene is obtained.
For example, the virtual scene includes at least one virtual reference, the virtual reference is a virtual device simulated according to a real reference, and the virtual reference includes: the virtual gray ball, the virtual chrome ball and the virtual color plate, wherein the position of each virtual reference object in the virtual scene corresponds to the position of each real reference object in the real scene one by one.
Before step 302, a virtual reference object and a virtual camera are configured in the virtual scene, performing the following for each virtual reference object: the real position of the real camera used to shoot the corresponding real reference object in the real scene is acquired, where the relative position of the real reference object in the real scene is the same as the relative position of the virtual reference object in the virtual scene; the virtual position corresponding to the real position is then acquired in the virtual scene, and a virtual camera is set at that virtual position, where the virtual camera is used to acquire the second low dynamic range images corresponding to the virtual reference object.
For example: a real three-dimensional coordinate system is established in the real scene, and the same virtual three-dimensional coordinate system is established in the virtual scene. The real camera A and the virtual camera a in the virtual scene correspond to each other, and the coordinate values of the real camera A in the real three-dimensional coordinate system are the same as the coordinate values of the virtual camera a in the virtual three-dimensional coordinate system. Based on this setting, the virtual camera a and the real camera A can acquire low dynamic range images of their corresponding reference objects.
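A minimal sketch of this one-to-one position mapping (the uniform scale factor is an assumption for scenes modeled at a different unit scale; with identical coordinate systems it is simply 1.0):

```python
import numpy as np

class SceneMapping:
    """Maps positions between the real studio and the virtual scene.

    Assumes both scenes share one origin and axis orientation, differing
    at most by a uniform scale (1.0 when the virtual scene is built 1:1).
    """

    def __init__(self, scale: float = 1.0):
        self.scale = scale

    def real_to_virtual(self, p):
        return np.asarray(p, dtype=np.float64) * self.scale

    def virtual_to_real(self, p):
        return np.asarray(p, dtype=np.float64) / self.scale

mapping = SceneMapping(scale=1.0)
real_camera_A = (3.0, 1.5, 2.0)                            # position of real camera A
virtual_camera_a = mapping.real_to_virtual(real_camera_A)  # place virtual camera a here
```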
Referring to fig. 3C, fig. 3C is a schematic flow chart of a light information processing method provided in an embodiment of the present application, and step 302 may be implemented by the following steps 3021 to 3023, which are specifically described below.
In step 3021, the virtual camera corresponding to the virtual reference object shoots the virtual reference object with different exposure times, obtaining a plurality of second low dynamic range images corresponding to the virtual reference object.
By way of example, step 3021 is implemented by a virtual camera, with a terminal device or a server as the execution subject; the principle of step 3021 is the same as that of step 3012 and is not repeated here.
In step 3022, a second high dynamic range image is synthesized based on each of the second low dynamic range images.
For example, the second high dynamic range image carries the sub-virtual-illumination information of the reference object position corresponding to the virtual reference object, and the pixel bit depth of the second high dynamic range image is higher than that of the second low dynamic range image. Referring to fig. 5E, fig. 5E is a luminance simulation diagram of illumination information provided in an embodiment of the present application, illustrating the luminance information (illumination intensity) carried by a second high dynamic range image corresponding to the virtual scene. The illuminance scale 502E indicates that the illuminance range corresponding to the high dynamic range image 501E is 0 to 3000 lux; the high dynamic range image 501E is shown schematically, and the shade of each pixel in it represents the illumination intensity at that pixel.
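A minimal sketch of such a visualization (the 0 to 3000 lux range follows the figure description; the conversion from RGB radiance to illuminance via Rec. 709 luma weights is an assumption):

```python
import numpy as np

def luminance_map(hdr_rgb: np.ndarray, max_lux: float = 3000.0) -> np.ndarray:
    """Collapse an HDR image to per-pixel illuminance and normalize it
    to [0, 1] for display as shades (darker = less illumination)."""
    # Rec. 709 luma weights as a stand-in for a photometric conversion.
    lux = hdr_rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    return np.clip(lux / max_lux, 0.0, 1.0)
```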
The execution of step 3022 is the same as step 3013, and will not be described here.
In step 3023, the pieces of sub-virtual illumination information are combined to obtain the virtual illumination information of the virtual scene.
The execution of step 3023 is the same as step 3014, and will not be described here.
In some embodiments, the illumination information of the reference object position in the virtual scene can also be directly obtained through a virtual engine running the virtual scene.
The embodiments of the present application acquire illumination information at a plurality of positions in the virtual scene corresponding to positions in the real scene, which improves the accuracy of acquiring the illumination information and the consistency of the illumination information acquired for the two scenes, facilitating illumination synchronization. Acquiring illumination information by shooting high dynamic range images improves the efficiency and accuracy of the acquisition; moreover, a high dynamic range image can be read directly by a computer, saving the computing resources required to acquire the illumination information.
With continued reference to fig. 3A, in step 303, the virtual illumination information and the real illumination information are compared and calculated, so as to obtain illumination difference information.
By way of example, image comparison calculation is performed on the high dynamic range image of the real scene and the high dynamic range image of the virtual scene to obtain the illumination difference information.
In some embodiments, the real illumination information is carried by a first high dynamic range image of the real scene; the virtual illumination information is carried by a second high dynamic range image of the virtual scene; step 303 may be implemented by: performing image comparison processing on the first high dynamic range image and the second high dynamic range image to obtain a difference high dynamic range image; and taking the illumination information carried by the image with the high dynamic range difference as illumination difference information.
By way of example, the image comparison processing may be implemented as follows: subtraction is performed on corresponding pixels of the two images, which yields a difference high dynamic range image carrying the difference information between the two images. Alternatively, the illumination information corresponding to the high dynamic range images of the real scene and the virtual scene can be represented as matrices, and the two matrices are subtracted to obtain the illumination difference information. Both variants are sketched below.
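A minimal sketch of both variants with NumPy (array shapes and the random stand-in data are illustrative):

```python
import numpy as np

# Variant 1: pixel-wise difference of the two HDR images.
# Positive values mean the real scene is brighter at that pixel,
# negative values mean the virtual scene is brighter.
real_hdr = np.random.rand(720, 1280, 3).astype(np.float32)     # stand-in data
virtual_hdr = np.random.rand(720, 1280, 3).astype(np.float32)  # stand-in data
difference_hdr = real_hdr - virtual_hdr

# Variant 2: difference of the illumination information matrices
# (one row per reference object position, as built earlier).
real_illumination = np.random.rand(2, 11).astype(np.float32)
virtual_illumination = np.random.rand(2, 11).astype(np.float32)
illumination_difference = real_illumination - virtual_illumination
```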
In step 304, light attenuation calculation is performed based on the illumination difference information, so as to obtain configuration parameters of at least one supplemental light source.
For example, the supplemental light source is used to supplement the illumination to synchronize the illumination effects of the real scene and the virtual scene. The supplemental light source may be a virtual luminaire in a virtual scene, or a real luminaire in a real scene, as explained below for different scenes.
In some embodiments, the illumination difference information includes at least one of the following for the reference object: a differential illumination direction, a differential illumination intensity, and a differential illumination color temperature. When the virtual scene is to be supplemented with illumination, referring to fig. 3D, which is a schematic flow chart of the illumination information processing method provided in an embodiment of the present application, step 304 may be implemented through the following steps 3041D to 3046D, described in detail below.
In step 3041D, a real luminaire location for each supplemental light source is determined based on the first luminaire location and the differential illumination direction for each real luminaire in the real scene, and each real luminaire location is mapped to a virtual luminaire location in the virtual scene.
For example, the differential illumination direction may be determined from the gray sphere in the high dynamic range image, which is used to indicate the direction of light. After the real luminaire position is obtained, the position in the virtual scene with the same relative position as the real luminaire position is taken as the virtual luminaire position.
In some embodiments, when the virtual scene is to be supplemented with illumination, there are two cases (virtual lamps already set, or no virtual lamps set), and step 3041D is implemented as follows: when no virtual lamp is set in the virtual scene, the first luminaire position of each real lamp in the real scene is taken as the real luminaire position corresponding to each supplementary light source, and each real luminaire position is mapped to a virtual luminaire position in the virtual scene.
For example: and directly mapping the first lamp position of each real lamp in the real scene into the virtual scene to obtain the virtual lamp position corresponding to each supplementary light source respectively.
When the virtual lamps are set in the virtual scene, the first lamp position of each real lamp in the real scene and the reference object position of the real reference object are obtained, the direction of each first lamp position towards the reference object position is respectively used as the real illumination direction, the real illumination direction parallel to the differential illumination direction is used as the target direction, the first lamp position of the target direction is used as the real lamp position of the supplementary light source, and each real lamp position is mapped into the virtual lamp position in the virtual scene.
For example, the position of the virtual reference object corresponds to the position of the real reference object, and the differential illumination direction is the direction of illumination that the virtual scene lacks compared with the real scene; the real illumination direction parallel to the differential illumination direction is the illumination direction corresponding to the supplemental light source, that is, the target direction. The first luminaire position in the real scene that intersects the target direction is the real luminaire position of the supplemental light source.
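One way to realize the parallel-direction test is a unit-vector dot product with a tolerance; the sketch below is an assumption about the geometry (the vector conventions and the tolerance are not specified in the patent):

```python
import numpy as np

def luminaires_in_target_direction(luminaire_positions, reference_pos, diff_direction, tol=0.99):
    """Return indices of real luminaires whose direction toward the
    reference object is (nearly) parallel to the differential
    illumination direction; these are candidate supplemental sources."""
    d = np.asarray(diff_direction, dtype=np.float64)
    d = d / np.linalg.norm(d)
    hits = []
    for i, pos in enumerate(luminaire_positions):
        v = np.asarray(reference_pos, dtype=np.float64) - np.asarray(pos, dtype=np.float64)
        v = v / np.linalg.norm(v)
        if np.dot(v, d) >= tol:  # cosine of the angle close to 1 means parallel
            hits.append(i)
    return hits
```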
The following steps 3042D to 3046D are performed for each supplemental light source.
In step 3042D, a first distance between the real luminaire position and the reference object position is obtained.
For example, the reference object position is the position of the real reference object in the real scene, and the direction from the real luminaire position toward the reference object position is the target direction. The first distance between the real luminaire position and the reference object position in the real scene is acquired.
In step 3043D, light attenuation calculation processing is performed based on the differential illumination intensity, the target direction and the first distance, so as to obtain the light emission intensity of the supplemental light source.
In some embodiments, step 3043D is implemented as follows: determining, based on the differential illumination intensity and the target direction, a first illumination intensity of the supplemental light source in the target direction for the reference object position; and obtaining the square of the first distance and multiplying it by the first illumination intensity to obtain the luminous intensity of the supplemental light source.
For example, the illumination intensity is direction-dependent data: the differential illumination intensity extracted from the illumination difference information includes illumination intensities in a plurality of directions, and the illumination intensity in the direction parallel to the target direction is used as the first illumination intensity of the supplemental light source for the reference object position. Light attenuation can follow the inverse square law, which states that the intensity of an effect from an object or particle decays with the square of the distance, i.e. the effect is inversely proportional to the square of the distance. The relationship between the luminous intensity of the supplemental light source and the first illumination intensity may be characterized by the following formula (1):
I = E₁ × L₁²    (1)

where I is the luminous intensity of the supplemental light source, L₁ is the first distance, and E₁ is the first illumination intensity. Based on formula (1), the luminous intensity can be determined as the product of the square of the first distance L₁ and the first illumination intensity E₁.
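Formula (1) can be computed directly; a hedged sketch (function and argument names are illustrative, not from the patent):

```python
def supplemental_luminous_intensity(first_illumination_lx: float, first_distance_m: float) -> float:
    """Invert the inverse square law: producing E (lux) at distance L (m)
    requires luminous intensity I = E * L**2 (candela), assuming a point
    source and no attenuation other than distance."""
    return first_illumination_lx * first_distance_m ** 2

# Example: 120 lx of missing illumination at a reference 2.5 m away
# requires 120 * 2.5**2 = 750 cd from the supplemental light source.
```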
In step 3044D, the light source color of the supplemental light source in the target direction is determined based on the differential illumination color corresponding to each differential illumination direction and the target direction.
For example, the differential illumination color extracted from the illumination differential information includes illumination colors of a plurality of directions, and the illumination color of a direction parallel to the target direction is extracted as the light source color of the supplemental light source.
In step 3045D, the light color temperature of the supplemental light source in the target direction is determined based on the differential illumination color temperature and the target direction corresponding to each differential illumination direction.
For example, the differential illumination color temperature extracted from the illumination difference information includes illumination color temperatures in a plurality of directions, and the illumination color temperature of the direction parallel to the target direction is extracted as the light color temperature of the supplemental light source.
In step 3046D, the target direction, the color temperature of the light, the color of the light source, the luminous intensity, and the virtual lamp position are combined to obtain the configuration parameters of the supplemental light source.
For example, the target direction, the light color temperature, the light source color, the luminous intensity and the virtual luminaire position of the same supplemental light source can each be used as a parameter of one dimension, and the parameters of all dimensions combined into a vector; that is, the configuration parameters of the supplemental light source are characterized in vector form.
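A sketch of such a vector-form configuration record (field names, units, and the tuple layout are assumptions for illustration):

```python
from dataclasses import dataclass, astuple

@dataclass
class SupplementalLightConfig:
    target_direction: tuple        # unit vector (x, y, z)
    color_temperature_k: float     # light color temperature in kelvin
    light_color_rgb: tuple         # light source color, e.g. (255, 255, 255)
    luminous_intensity_cd: float   # luminous intensity in candela
    virtual_position: tuple        # virtual luminaire position (x, y, z)

config = SupplementalLightConfig((0.0, -1.0, 0.0), 2800.0, (255, 255, 255), 750.0, (1.2, 3.0, 2.4))
vector = astuple(config)  # one dimension per parameter, combined into a (nested) tuple
```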
In some embodiments, the illumination difference information includes: a differential illumination direction, a differential illumination intensity, a differential illumination color, and a differential illumination color temperature for the reference object; when the real scene is to be supplemented with illumination, referring to fig. 3E, which is a flowchart of the illumination information processing method provided in the embodiments of the present application, step 304 may be implemented through the following steps 3041E to 3046E.
In step 3041E, a virtual luminaire position for each supplemental light source is determined based on the second luminaire position and the differential illumination direction for each virtual luminaire in the virtual scene, and each virtual luminaire position is mapped to a real luminaire position in the real scene.
In some embodiments, step 3041E may be implemented as follows: when no real luminaire is set in the real scene, the second luminaire position of each virtual luminaire in the virtual scene is used as the virtual luminaire position of the corresponding supplemental light source; when real luminaires have been set in the real scene, the direction from each second luminaire position toward the reference object position is taken as a virtual illumination direction, the virtual illumination direction parallel to the differential illumination direction is taken as the target direction, and the second luminaire position in the target direction is taken as the virtual luminaire position of the supplemental light source.
For example, obtaining a virtual luminaire position mirrors obtaining a real luminaire position; an implementation of step 3041E may refer to step 3041D.
The following steps 3042E to 3046E are performed for each supplemental light source.
In step 3042E, a second distance between the virtual luminaire position and the reference object position is obtained.
For example, the reference object position is the position of the virtual reference object in the virtual scene, and the direction from the virtual luminaire position toward the reference object position is the target direction. The second distance L₂ between the virtual luminaire position and the reference object position in the virtual scene is acquired.
In step 3043E, light attenuation calculation processing is performed based on the differential illumination intensity, the target direction, and the second distance, so as to obtain the light emission intensity of the supplemental light source.
In some embodiments, step 3043E may be implemented as follows: determining, based on the differential illumination intensity and the target direction, a second illumination intensity of the supplemental light source in the target direction for the reference object position; and obtaining the square of the second distance and multiplying it by the second illumination intensity to obtain the luminous intensity of the supplemental light source.
For example, the illumination intensity is direction-dependent data: the differential illumination intensity extracted from the illumination difference information includes illumination intensities in a plurality of directions, and the illumination intensity in the direction parallel to the target direction is used as the second illumination intensity of the supplemental light source for the reference object position. Light attenuation can follow the inverse square law, which states that the intensity of an effect from an object or particle decays with the square of the distance, i.e. the effect is inversely proportional to the square of the distance. The relationship between the luminous intensity of the supplemental light source and the second illumination intensity may be characterized by the following formula (2):
I = E₂ × L₂²    (2)

where I is the luminous intensity of the supplemental light source, L₂ is the second distance, and E₂ is the second illumination intensity. Based on formula (2), the luminous intensity can be determined as the product of the square of the second distance L₂ and the second illumination intensity E₂.
In step 3044E, when no real lamp is set in the real scene, the color temperature of the lamp and the color of the light source of the virtual lamp at the virtual lamp position are obtained.
For example, when no real luminaire exists at the real luminaire position corresponding to the supplemental light source in the real scene, the light color temperature and light source color of the virtual luminaire in the virtual scene can be directly obtained as configuration parameters of the supplemental light source.
In step 3045E, when a real luminaire has been set in the real scene, the light source color of the supplemental light source is determined based on the differential illumination color, the target direction, and the light color temperature of the supplemental light source is determined based on the differential illumination color temperature.
For example, when a real luminaire already exists at the real luminaire position corresponding to the supplemental light source in the real scene, a corresponding luminaire needs to be supplemented at that real luminaire position, and the corresponding configuration parameters are then determined based on the illumination difference information. The principle of step 3045E is the same as that of steps 3044D to 3045D.
In step 3046E, the target direction, the color temperature of the light, the color of the light source, the luminous intensity, and the actual lamp position are combined to obtain the configuration parameters of the supplemental light source.
In the embodiments of the present application, the configuration parameters of the light source are acquired in different ways for different cases, which improves the accuracy of acquiring the configuration parameters of the light source and thus the effect of the illumination synchronization processing.
With continued reference to fig. 3A, in step 305, a matching process is performed with the luminaire database based on the configuration parameters of each supplemental light source, so as to obtain a target luminaire corresponding to each supplemental light source in the luminaire database.
For example, the luminaire database stores performance parameters of a plurality of luminaires, where a performance parameter is a value interval of a configuration parameter. For example, luminaire A is a luminaire with adjustable illumination intensity whose performance parameter is an illumination intensity of 220 Lx to 340 Lx, where one lux (Lx) is the illuminance produced by a luminous flux of one lumen distributed uniformly over an area of 1 square meter.
In some embodiments, referring to fig. 3F, fig. 3F is a flowchart illustrating a method for processing illumination information according to an embodiment of the present application, and step 305 may be described in detail below by the following steps 3051 to 3053.
In step 3051, performance parameters of a luminaire in a luminaire database are obtained.
For example, the performance parameters of a luminaire include value intervals of different types of illumination parameters, such as illumination intensity and illumination color temperature. In some embodiments, the value interval of the positions where the luminaire can be set may also be considered a performance parameter. For example, a luminaire may be mounted on a track on the ceiling of the studio, in which case the position interval corresponding to the track is the value interval of the luminaire position; or a luminaire may be mounted on a mechanical arm of the studio, in which case the movable range of the mechanical arm is the value interval of the luminaire position.
The following steps 3052 to 3053 are performed for the configuration parameters of each supplemental light source.
In step 3052, different types of illumination parameters in the configuration parameters are obtained.
Exemplary illumination parameters of the supplemental light source include: illumination intensity, illumination color temperature.
In step 3053, each type of illumination parameter is matched with the performance parameters of different lamps, and the lamps meeting the matching conditions are used as target lamps corresponding to the supplementary light sources.
For example, the matching condition includes: each type of illumination parameter of the supplemental light source belongs to the value interval of the corresponding illumination parameter of the luminaire.
For example: the configuration parameters of the supplemental light source Y are: color temperature 2800K (Kelvin), illumination color white, illumination intensity 500Lx; the performance parameter of the lamp X is obtained from the lamp database, and the color temperature is fixed to 2800K; the illumination color can be changed, including white, yellow, beige; the illumination intensity is 350 Lx-750-Lx, each configuration parameter of the supplemental light source Y belongs to the interval of the corresponding performance parameter of the lamp X, and the lamp X is the target lamp.
In some embodiments, when multiple target luminaires satisfy the matching condition, the most energy-efficient, least bulky, or movable luminaire may be selected among them as the final target luminaire. For example, when the supplemental light source is a real luminaire, the target luminaire whose position performance interval covers the real luminaire position required for the supplemental light source is selected.
According to the embodiments of the present application, the required target luminaire can be accurately obtained by matching in the luminaire database based on the configuration parameters, which improves shooting efficiency in the real scene, saves computing resources, and reduces the labor cost required for virtual production.
With continued reference to fig. 3A, in step 306, each target luminaire and the configuration parameters corresponding to each target luminaire are used as configuration information, and scene illumination synchronization is performed based on the configuration information.
In some embodiments, referring to fig. 3G, fig. 3G is a schematic flow chart of a light information processing method provided in an embodiment of the present application, and step 306 may be specifically described below through the following steps 3061 to 3062.
In step 3061, when the virtual scene is to be supplemented with illumination, setting corresponding virtual lamps in the virtual scene based on the configuration information, and configuring parameters corresponding to each virtual lamp.
For example, when the virtual scene is to be supplemented with illumination, the virtual engine of the virtual scene is called to set the corresponding virtual lamp based on the configuration parameters, so that illumination synchronization of the virtual scene is realized.
In step 3062, when the real scene is to be supplemented with illumination, a luminaire control signal of a real luminaire corresponding to each target luminaire is generated based on the configuration information, and each luminaire control signal is sent to each real luminaire in the real scene.
For example, the luminaire control signal, used to set the configuration parameters of the real luminaire, may be a digital dimming (Digital Multiplex, DMX) signal.
In some embodiments, when supplemental illumination is needed in the real scene, the real luminaire corresponding to the target luminaire can be controlled by the digital dimming signal to turn on and move to the corresponding position in the real scene.
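For illustration, a DMX512 data packet carries a zero start code followed by up to 512 one-byte channel levels; which channel drives which luminaire parameter (dimmer, pan, tilt, color temperature) depends on the fixture's channel map, which is assumed here rather than taken from the patent:

```python
def build_dmx_frame(channel_values: dict) -> bytes:
    """Assemble a DMX512 frame: byte 0 is the start code (0x00), bytes
    1..512 are 8-bit channel levels for the addressed fixtures."""
    frame = bytearray(513)
    for channel, level in channel_values.items():
        if not 1 <= channel <= 512:
            raise ValueError("DMX channels are numbered 1..512")
        frame[channel] = max(0, min(255, int(level)))
    return bytes(frame)

# e.g. channel 1 = dimmer at roughly 59 %, channel 2 = a position preset
packet = build_dmx_frame({1: 150, 2: 32})
```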
According to the embodiment of the application, the light supplementing operation is carried out in different modes aiming at different scenes, so that the illumination of the virtual scene and the real scene is synchronous, the efficiency of illumination synchronization is improved, the computing resources required for illumination synchronization are saved, and the shooting efficiency of the virtual film making is improved.
In some embodiments, the embodiments of the present application are implemented by cooperation of a server and a terminal device, and referring to fig. 4, fig. 4 is a flowchart of an illumination information processing method provided in the embodiments of the present application, which illustrates an interaction process between the server 200 and the terminal device 400, the luminaire device 500, and the detection device 600.
The detection device 600 performs step 401 of transmitting real illumination information of the real scene to the server 200.
By way of example, the detection device 600 may be a camera, a spectrometer, or the like, for acquiring illumination information in the real scene. For example, the camera acquires a high dynamic range image of the real scene carrying illumination information, or the spectrometer acquires the illumination intensity and illumination spectrum in the real scene.
The server 200 performs step 402 of determining configuration information based on the virtual lighting information and the real lighting information.
For example, the specific process of step 402 performed by server 200 may refer to steps 301 through 305 in fig. 3A above.
The server 200 performs step 403, and transmits configuration information to the terminal device 400 when the real scene is to be supplemented with illumination.
The terminal device 400 performs step 404 of transmitting a digital dimming signal to the luminaire device 500 based on the configuration information.
For example, the terminal device 400 may be a computer integrated with a digital dimming console, provided with a display screen and a user interface, and a technician may control a real lamp in a real scene through the terminal device 400 or obtain virtual lighting information formed by a virtual lamp in a virtual scene through the terminal device 400.
The server 200 performs step 405, when the virtual scene is to be supplemented with illumination, transmitting the virtual scene image after the virtual scene is supplemented with illumination to the terminal device 400. The terminal device performs step 406 to display the virtual scene image.
According to the embodiments of the present application, the illumination difference information is obtained by acquiring the illumination information in the virtual scene and the real scene and performing comparison calculation on them, which improves the accuracy of acquiring the illumination difference between the virtual scene and the real scene; the configuration parameters corresponding to the supplemental light source are obtained based on the illumination difference information, the target luminaire for supplementing light is obtained from the luminaire database, and illumination synchronization is performed, based on the configuration information, for the real scene or virtual scene to be supplemented with illumination, thereby improving the accuracy and efficiency of illumination synchronization and saving the cost of illumination synchronization.
Next, an exemplary application of the illumination information processing method according to the embodiment of the present application in an actual application scenario will be described.
The shooting process of virtual production differs from traditional shooting means: because the shooting shed contains self-luminous LED devices (such as an LED curtain wall), the lighting of the virtual scene and the real scene must be migrated to each other. Owing to the shooting requirements of the real scene in virtual production, the real scene contains many devices other than the lighting equipment that emit illumination, while corresponding devices need not be arranged in the virtual scene. How to quickly match and balance the virtual light with the real light and restore the effect in the virtual scene therefore becomes very important. In the related art, the light configuration of the virtual scene requires manual setting by technicians, and illumination matching in the real scene likewise requires configuration by technicians; the illumination matching is inaccurate, and the operation difficulty of illumination matching is high.
According to the embodiments of the present application, information intercommunication between virtual light and real light in virtual shooting (including synchronizing the illumination of the virtual scene to the real scene and synchronizing the illumination of the real scene to the virtual scene) can be achieved through the illumination information carried by high dynamic range images, overcoming the pain point of repeated manual debugging of virtual and real light in virtual shooting. The illumination information carried by a high dynamic range image can be quickly reproduced in the virtual scene of the virtual engine and in the shooting shed (real scene), which effectively saves light debugging time in the virtual production process and achieves uniformity of the illumination effect.
For example, high dynamic range image acquisition techniques are commonly used in post-production: an image in high dynamic range format records the illumination information of the pictured environment and can be used to set the scene illumination in a virtual scene, that is, to "illuminate" the scene. Many high dynamic range image files are provided in panorama form and can serve as an environment background to create reflection and refraction. A high dynamic range image is essentially different from a panoramic image: a panoramic image is an ordinary image containing a 360-degree scene, may be in JPG, BMP, TGA or similar formats, belongs to the low dynamic range images, and carries no illumination information.
In the embodiments of the present application, the technical schemes of synchronizing the light of the virtual scene to the real scene and synchronizing the light of the real scene to the virtual scene are interchangeable and reciprocal; this embodiment is described taking the synchronization of the light of the virtual scene to the real scene as an example. Synchronizing the light of the virtual scene to the real scene covers two cases: first, no luminaire device is arranged in the real scene; second, luminaire devices have already been set in the real scene.
The first case will be explained below, referring to fig. 6A, taking the terminal device 400 as an execution subject, the terminal device 400 may be a computer running a virtual engine. Fig. 6A is a flowchart of a method for processing illumination information according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 6A.
In step 601A, illumination information of a key point in a virtual environment is acquired.
By way of example, the key point is the location in the virtual scene where the virtual reference is placed. The virtual scene is constructed based on the environment information of the real scene.
After the virtual scene generation and production are completed (i.e. the illumination effect has already been set in the virtual scene), a virtual reference object is placed at the reference object position corresponding to each virtual camera in the virtual scene. The virtual reference objects include: gray spheres, chrome spheres, and standard color plates. The illumination information of each reference object position in the virtual scene can be obtained through the virtual engine; the illumination information includes various parameters, such as illumination direction, illumination intensity, illumination color, color temperature, light source position, etc.
Referring to fig. 5A, fig. 5A is a schematic diagram of a reference object provided in an embodiment of the present application. Fig. 5A shows a real reference object; a corresponding device can be simulated in the virtual scene. The real reference object includes gray sphere 503A, chrome sphere 501A, standard color plate 502A, and bracket 504A. The following parameters are obtained through the gray sphere of the reference object: the direction of the light; the intensity of the shadow, such as a hard shadow or a soft shadow, ensuring the same light level; the color temperature of the light, etc. The following parameters are obtained through the chrome sphere of the reference object: reflection, illumination, and alignment of the high dynamic range image. The color plate serves as a color standard reference: it is affected by illumination and behaves differently under different spectra, and can be used as a reference when reconstructing the on-site environmental illumination.
For example, referring to fig. 5E, fig. 5E is an illumination information brightness simulation diagram provided in an embodiment of the present application, which illustrates brightness information (illumination intensity) carried by a second high dynamic range image corresponding to a virtual scene.
In step 602A, initial illumination information of a key point in a real scene is acquired.
In the embodiments of the present application, a shooting shed for virtual production is taken as an example of the real scene; referring to fig. 7, fig. 7 is a schematic diagram of the real scene provided in an embodiment of the present application. The shooting shed 702 for virtual production is provided with various electronic devices that have display screens, and the display screens emit light; for example, the LED curtain wall 701 is used for playing background images (or videos) and emits illumination when displaying content. In this case, even if the same luminaires with the same parameters are adopted in the real scene and the virtual scene, the illumination of the two scenes differs. When no luminaire for illumination is arranged in the real scene, the initial illumination information formed by the natural illumination of the electronic devices in the real scene is acquired.
For example, taking the case where only the LED curtain wall emits illumination in the real scene: the same gray sphere, chrome sphere and standard color plate are placed at the reference object positions matching those in the virtual scene, and the basic illumination information of the real environment when only the LED curtain wall illuminates is obtained through a real camera, so that the influence of the illumination information contributed by the LED curtain wall can be eliminated. The real camera samples the real scene to obtain a high dynamic range image file carrying illumination information, and the file is transmitted to the terminal device in real time.
For example, a spectrometer may be used to measure data at the reference location, where the spectrometer may obtain information on brightness, color temperature, spectrum, color, etc. at the location in real time. The spectrometer is a device for measuring the intensities of spectral lines at different wavelength positions by using a light detector such as a photomultiplier tube.
Referring to fig. 5B, fig. 5B is a schematic diagram of a handheld spectrometer provided in an embodiment of the present application. The illumination intensity at different reference object positions in the shooting shed can be measured by the handheld spectrometer 501B; the picture displayed in the human-machine interaction interface 502B of the handheld spectrometer 501B is a schematic diagram of the interface for acquiring environmental color spectrum information, and the illumination intensity and other information are input into the terminal device for illumination information processing.
Referring to fig. 5C and 5D, fig. 5C and 5D are schematic diagrams of the human-machine interaction interface of the handheld spectrometer provided in an embodiment of the present application. The human-machine interface 501C in fig. 5C represents the interface for obtaining ambient brightness information, where LUX 218.6 indicates that the handheld spectrometer measures an illumination intensity of 218.6 lux. Fig. 5D illustrates the interface for obtaining environmental color spectrum information: in the human-machine interface 501D, CIE1976 indicates that the color space characterizing the color spectrum information is the uniform color space recommended by the International Commission on Illumination (CIE) in 1976, and u', v' are the chromaticity coordinates of the reference object position measured by the handheld spectrometer; in the human-machine interface 502D, X, Y are the coordinate values, in the uniform color space, of the color at the reference object position measured by the handheld spectrometer.
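For reference, the u', v' coordinates displayed by the spectrometer relate to CIE XYZ tristimulus values through the standard CIE 1976 UCS relations; a minimal sketch (this conversion is standard colorimetry, not patent-specific):

```python
def xyz_to_uv_prime(x: float, y: float, z: float) -> tuple:
    """CIE 1976 UCS chromaticity: u' = 4X/(X+15Y+3Z), v' = 9Y/(X+15Y+3Z)."""
    denom = x + 15.0 * y + 3.0 * z
    if denom == 0.0:
        raise ValueError("tristimulus values must not all be zero")
    return 4.0 * x / denom, 9.0 * y / denom

# D65 white (X=95.047, Y=100.0, Z=108.883) gives u' ~ 0.1978, v' ~ 0.4683.
print(xyz_to_uv_prime(95.047, 100.0, 108.883))
```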
For example, step 602A and step 601A may be performed synchronously.
In step 603A, the high dynamic range images of the key points corresponding to the virtual scene and the real scene are compared to obtain illumination difference information.
The terminal device performs calculation, simulation and comparison on the illumination information carried by the high dynamic range image file of the real scene and the high dynamic range images of the gray sphere, chrome sphere and color plate placed in the virtual scene, to obtain the illumination difference information. The illumination information of each reference object position in the virtual scene is known, including direction, brightness, color temperature, color, etc. Through image comparison, a high dynamic range image carrying the illumination difference information can be obtained.
In step 604A, based on the illumination difference information, a three-dimensional light matrix diagram corresponding to the real scene is simulated according to the performance parameters of the luminaires and the light attenuation formula.
For example, the light library includes luminaire information commonly used on the market; how a luminaire should be placed and the corresponding effect can be obtained through calculation, as explained below.
The terminal device has built-in luminaire information (a light library) commonly used on the market, including but not limited to the brightness, color temperature and hue range of each luminaire, and automatically simulates the lighting effect in the virtual scene. Specific brightness, spectrum, color temperature and similar information is associated with the gray sphere, chrome sphere and color plate serving as reference objects, and the difference between the illumination information is obtained by comparison calculation on the high dynamic range images of the reference objects. By simulating in the virtual scene, through light attenuation, the type and position of each luminaire that should be placed in the real scene, the effect of presetting the values can be achieved, which improves shooting efficiency and serves the goal of light matching. Simulating, through light attenuation, each luminaire type and position that should be placed in the real scene includes: acquiring the illumination direction in the illumination information difference and determining the light source position; and performing light attenuation calculation on the light intensity corresponding to the reference object based on the distance between the light source position and the reference object, so as to obtain the luminous intensity (brightness) of the light source.
For example, light attenuation may follow the inverse square law, meaning that the intensity of an effect from an object or particle decays with the square of the distance, i.e. it is inversely proportional to the square of the distance: illumination intensity at the reference object surface = luminous intensity of the light source / square of the distance between the reference object and the light source. The luminous intensity (brightness) corresponding to the light source can be obtained from this formula.
By way of example, the terminal device outputs a three-dimensional light matrix after calculation; the dimensions of the light matrix include: light rigging information (the specific position of each luminaire in the real scene), luminaire brand and type, the configuration parameters corresponding to the luminaires, and the like.
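One plausible representation of a light matrix entry (brand names, coordinates, and field names are hypothetical):

```python
light_matrix = [
    {
        "position_m": (4.0, 2.8, -1.5),        # rigging position in shed coordinates
        "fixture": "BrandA Spot-3",            # hypothetical luminaire database entry
        "parameters": {
            "luminous_intensity_cd": 750.0,
            "color_temperature_k": 2800,
            "color": "white",
            "direction": (0.0, -1.0, 0.2),     # aiming vector toward the reference
        },
    },
    # ... one entry per luminaire to be placed in the real scene
]
```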
In step 605A, a digital dimming signal is generated based on the three-dimensional light matrix diagram, and the digital dimming signal is sent to a corresponding real luminaire in the real scene.
For example, a luminaire may be mounted on a specific track and, when the digital dimming signal is received, moved to the target position carried by the signal. A luminaire device may also be placed into the real scene manually.
For example, after the real luminaires in the real scene are set up, the illumination of the two scenes may be checked to determine whether synchronization is complete. The verification may be implemented as follows: after the real luminaires of the real scene are arranged, take a random reference object position within any shooting range of the real scene (different from the reference object position corresponding to step 601A or step 602A), perform data acquisition and measurement, and compare the consistency of the illumination information in the two scenes. If the illumination information is consistent, the illumination effects are synchronized accurately; if not, the values the luminaires need to output, such as brightness, color temperature and color, are recalculated and adjusted through the terminal device until the information is consistent.
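The check-and-adjust cycle can be sketched as a loop over measurements; every callable below is a hypothetical placeholder for the measurement and dimming machinery described above:

```python
def synchronize_until_consistent(measure_real, measure_virtual, adjust_real,
                                 rel_tolerance=0.05, max_rounds=10):
    """Compare illumination readings at a spare reference position in both
    scenes and adjust the real luminaires until the readings agree."""
    for _ in range(max_rounds):
        real, virtual = measure_real(), measure_virtual()  # dicts of parameters
        deltas = {k: virtual[k] - real[k] for k in virtual}
        if all(abs(d) <= rel_tolerance * max(abs(virtual[k]), 1e-9)
               for k, d in deltas.items()):
            return True            # illumination effects are synchronized
        adjust_real(deltas)        # e.g. send updated brightness/color-temperature values
    return False
```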
In some embodiments, luminaires are already provided in the real scene: the light types are designated in advance, the on-site lights are set according to the lighting engineer's plan, and the positions of the lights are known through light positioning. In some embodiments, 4 to 5 common camera positions (a common camera position being, for example, any one of the 9 conventional camera positions of cinematology) may be arranged in the real scene; gray spheres, chrome spheres, color plates and colorimeters are placed, and the illumination information is acquired through the gray spheres, chrome spheres, color plates and the spectrometer.
Referring to fig. 6B, with the terminal device 400 as the execution subject (the terminal device 400 may be a computer running a virtual engine), fig. 6B is a flowchart of the illumination information processing method provided in an embodiment of the present application and will be described with reference to the steps shown therein. Fig. 6B shows the process of synchronizing illumination information in the case where luminaires have already been provided in the real scene. The execution of steps 601B to 603B may refer to steps 601A to 603A above.
In step 601B, illumination information of key points in a virtual scene is acquired.
In step 602B, illumination information of key points in a real scene is acquired.
After step 602B and step 601B, step 603B is executed to compare the high dynamic range images of the key points corresponding to the virtual scene and the real scene respectively, so as to obtain illumination difference information.
In step 604B, based on the illumination difference information, the light parameters corresponding to the real scene are simulated according to the performance parameters and the light attenuation formula of the lamp.
For example, in the case where the luminaire types of the real scene have been acquired, the light parameters of the real-scene luminaires that need adjustment can be calculated through the light attenuation formula, for example the illumination intensity to be reduced, the illumination color temperature to be adjusted, etc.
In step 605B, a digital dimming signal is generated based on the light parameters and sent to the real light fixtures already configured in the real scene.
The terminal device running the virtual engine sends the parameters corresponding to the lights to the digital dimming console, which transmits them to the real luminaires through the digital dimming signal, so that the corresponding lights are adjusted to the target values.
For example, the digital dimming console may be a stand-alone terminal device, or, when a computer has the function of transmitting digital dimming signals, it may be integrated into the computer.
In step 606B, it is checked whether illumination difference information exists between the high dynamic range images of the key points corresponding to the virtual scene and the real scene. When the result of step 606B is yes, step 607B is executed to obtain the type and position of the luminaire to be added based on the illumination difference information. When the result of step 606B is no, the light synchronization ends.
After the real luminaires of the real scene are adjusted, take a random reference object position within any shooting range of the real scene (different from the reference object position corresponding to step 601B or step 602B), perform data acquisition and measurement, and compare the consistency of the illumination information in the two scenes. If the illumination information is consistent, the illumination effects are synchronized accurately. If not, the currently set real luminaires cannot meet the requirement of synchronized illumination effects: the real luminaires to be supplemented and their configuration parameters are calculated through the terminal device, and the corresponding real luminaires are turned on, controlled to move to the corresponding positions, and configured with the corresponding configuration parameters. Alternatively, the lighting engineer sets the added luminaires at the corresponding positions and configures the configuration parameters of the added real luminaires through the digital dimming signal.
According to the embodiments of the present application, illumination synchronization between the virtual scene and the real scene can be achieved, on-site shooting efficiency can be improved, a light scheme can be set quickly for teams unfamiliar with virtual shooting, and the illumination effect in the virtual scene can be restored, saving much debugging and light-placing time and improving shooting efficiency and effect.
Continuing with the description of an exemplary structure of the illumination information processing apparatus 455 provided in the embodiments of the present application implemented as software modules, in some embodiments, as shown in fig. 2A or fig. 2B, the software modules of the illumination information processing apparatus 455 stored in the memory 440 may include: the illumination acquisition module 4551, configured to acquire real illumination information corresponding to the real scene, and further configured to acquire virtual illumination information corresponding to the virtual scene; the difference acquisition module 4552, configured to perform comparison calculation processing on the virtual illumination information and the real illumination information to obtain illumination difference information; the light source configuration module 4553, configured to perform light attenuation calculation processing based on the illumination difference information to obtain configuration parameters of at least one supplemental light source, where the supplemental light source is used for supplementing illumination, and further configured to perform matching processing with the luminaire database based on the configuration parameters of each supplemental light source to obtain the target luminaire corresponding to each supplemental light source in the luminaire database; and the illumination synchronization module 4554, configured to take each target luminaire and the configuration parameters corresponding to each target luminaire as configuration information and perform scene illumination synchronization based on the configuration information.
In some embodiments, the real scene includes at least one real reference; the illumination acquisition module 4551 is configured to detect sub-illumination information of a reference object position where each real reference object is located when a real lamp is set in the real scene, and combine each sub-illumination information to obtain real illumination information of the real scene; when no real luminaire is set in the real scene, the following processing is performed for each real reference in the real scene: shooting the real reference object according to different exposure time to obtain a plurality of first low dynamic range images corresponding to the real reference object; synthesizing a first high dynamic range image based on each first low dynamic range image, wherein the first high dynamic range image carries sub-illumination information of a reference object position corresponding to a real reference object, and the pixel bit depth of the first high dynamic range image is higher than that of the first low dynamic range image; and combining the sub-illumination information of each first high dynamic range image to obtain the real illumination information of the real scene.
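Multi-exposure HDR synthesis of this kind can be realized, for example, with OpenCV's Debevec calibration and merge; the patent does not prescribe a specific merging algorithm, and the file names and exposure times below are illustrative:

```python
import cv2
import numpy as np

def merge_exposures_to_hdr(paths, exposure_times_s):
    """Synthesize one HDR image (float32, higher pixel bit depth) from
    several LDR shots of the same reference object taken at different
    exposure times."""
    images = [cv2.imread(p) for p in paths]              # 8-bit LDR frames, same size
    times = np.asarray(exposure_times_s, dtype=np.float32)
    response = cv2.createCalibrateDebevec().process(images, times)
    return cv2.createMergeDebevec().process(images, times, response)

hdr = merge_exposures_to_hdr(
    ["ref_1_60.jpg", "ref_1_250.jpg", "ref_1_1000.jpg"],
    [1 / 60, 1 / 250, 1 / 1000],
)
```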
In some embodiments, the virtual scene includes at least one virtual reference, wherein a position of each virtual reference in the virtual scene corresponds one-to-one with a position of each real reference in the real scene; the illumination acquisition module 4551 is configured to perform, for each virtual reference, the following processing before acquiring virtual illumination information corresponding to a virtual scene: acquiring a real position of a real camera used for shooting a real reference object in a real scene, wherein the relative position of the real reference object in the real scene is the same as the relative position of a virtual reference object in a virtual scene; and acquiring a virtual position corresponding to the real position in the virtual scene, and setting a virtual camera at the virtual position, wherein the virtual camera is used for acquiring a second low dynamic range image corresponding to the virtual reference object.
In some embodiments, the illumination acquisition module 4551 is configured to perform the following processing for each virtual reference in the virtual scene: shooting the virtual reference object according to different exposure time by a virtual camera corresponding to the virtual reference object to obtain a plurality of second low dynamic range images corresponding to the virtual reference object; synthesizing a second high dynamic range image based on each second low dynamic range image, wherein the second high dynamic range image carries sub-virtual illumination information of a reference object position corresponding to a virtual reference object, and the pixel bit depth of the second high dynamic range image is higher than that of the second low dynamic range image; and obtaining the virtual illumination information of the virtual scene from each piece of sub-virtual illumination information.
In some embodiments, the real illumination information is carried by a first high dynamic range image of the real scene; the virtual illumination information is carried by a second high dynamic range image of the virtual scene; the difference acquisition module 4552 is configured to perform image comparison processing on the first high dynamic range image and the second high dynamic range image to obtain a difference high dynamic range image; and taking the illumination information carried by the image with the high dynamic range difference as illumination difference information.
In some embodiments, the illumination difference information includes at least one of: a differential illumination direction, a differential illumination intensity, a differential illumination color, and a differential illumination color temperature for the reference object; the light source configuration module 4553 is configured to determine a real luminaire position of each supplemental light source based on the first luminaire position of each real luminaire in the real scene and the differential illumination direction when the virtual scene is to be supplemented with illumination, and map each real luminaire position to a virtual luminaire position in the virtual scene; and to perform the following processing for each supplemental light source: acquiring a first distance between the real luminaire position and the reference object position, wherein the reference object position is the position of a real reference object in the real scene, and the direction from the real luminaire position toward the reference object position is the target direction; performing light attenuation calculation processing based on the differential illumination intensity, the target direction and the first distance to obtain the luminous intensity of the supplemental light source; determining the light source color of the supplemental light source in the target direction based on the differential illumination color corresponding to each differential illumination direction and the target direction; determining the light color temperature of the supplemental light source in the target direction based on the differential illumination color temperature corresponding to each differential illumination direction and the target direction; and combining the target direction, the light color temperature, the light source color, the luminous intensity and the virtual luminaire position to obtain the configuration parameters of the supplemental light source.
In some embodiments, the light source configuration module 4553 is configured to, when no virtual light fixture is set in the virtual scene, use a first light fixture position of each real light fixture in the real scene as a real light fixture position corresponding to each supplemental light source respectively; when the virtual lamps are set in the virtual scene, the first lamp position of each real lamp in the real scene and the reference object position of the real reference object are obtained, the direction of each first lamp position towards the reference object position is respectively used as the real illumination direction, the real illumination direction parallel to the differential illumination direction is used as the target direction, and the first lamp position of the target direction is used as the real lamp position of the supplementary light source.
In some embodiments, the light source configuration module 4553 is configured to determine a first illumination intensity of the supplemental light source of the target direction for the reference position based on the differential illumination intensity, the target direction; and obtaining the square of the first distance, and multiplying the square by the first illumination intensity to obtain the luminous intensity of the supplementary light source.
In some embodiments, the illumination difference information includes: a differential illumination direction, a differential illumination intensity, a differential illumination color, and a differential illumination color temperature for the reference object; the light source configuration module 4553 is configured to determine a virtual luminaire position of each supplemental light source based on the second luminaire position of each virtual luminaire in the virtual scene and the differential illumination direction when the real scene is to be supplemented with illumination, and map each virtual luminaire position to a real luminaire position in the real scene; and to perform the following processing for each supplemental light source: acquiring a second distance between the virtual luminaire position and the reference object position, wherein the reference object position is the position of a virtual reference object in the virtual scene, and the direction from the virtual luminaire position toward the reference object position is the target direction; performing light attenuation calculation processing based on the differential illumination intensity, the target direction and the second distance to obtain the luminous intensity of the supplemental light source; when no real luminaire is set in the real scene, acquiring the light color temperature and the light source color of the virtual luminaire at the virtual luminaire position; when a real luminaire has been set in the real scene, determining the light source color of the supplemental light source based on the differential illumination color and the target direction, and determining the light color temperature of the supplemental light source based on the differential illumination color temperature; and combining the target direction, the light color temperature, the light source color, the luminous intensity and the real luminaire position to obtain the configuration parameters of the supplemental light source.
In some embodiments, the light source configuration module 4553 is configured to, when no real light fixture is set in the real scene, use the second light fixture position of each virtual light fixture in the virtual scene as the virtual light fixture position corresponding to each supplementary light source respectively; when the real lamps are set in the real scene, the direction of each second lamp position towards the reference object position is used as a virtual illumination direction, the virtual illumination direction parallel to the differential illumination direction is used as a target direction, and the second lamp position in the target direction is used as the virtual lamp position of the supplementary light source.
In some embodiments, the light source configuration module 4553 is configured to determine a second illumination intensity of the supplemental light source of the target direction for the reference position based on the differential illumination intensity, the target direction; and obtaining the square of the second distance, and multiplying the square by the second illumination intensity to obtain the luminous intensity of the supplementary light source.
In some embodiments, the light source configuration module 4553 is configured to obtain performance parameters of the luminaires in the luminaire database, wherein the performance parameters of a luminaire include value intervals of different types of illumination parameters; and to perform the following processing for the configuration parameters of each supplemental light source: obtaining the different types of illumination parameters in the configuration parameters, wherein the illumination parameters include illumination intensity and illumination color temperature; and matching each type of illumination parameter with the performance parameters of different luminaires, and taking a luminaire satisfying the matching condition as the target luminaire corresponding to the supplemental light source, wherein the matching condition includes: each type of illumination parameter of the supplemental light source belongs to the value interval of the corresponding illumination parameter of the luminaire.
In some embodiments, the illumination synchronization module 4554 is configured to set a corresponding virtual luminaire in the virtual scene based on the configuration information and configure a parameter corresponding to each virtual luminaire when the virtual scene is to be supplemented with illumination; when the real scene is to be supplemented with illumination, generating lamp control signals of the real lamps corresponding to each target lamp based on the configuration information, and sending each lamp control signal to each real lamp in the real scene, wherein the lamp control signals are used for setting configuration parameters of the real lamps.
In some embodiments, the illumination synchronization module 4554 is configured to obtain an environmental parameter in the real scene before obtaining the real illumination information corresponding to the real scene, where the environmental parameter includes a size of the real scene; and constructing a virtual scene based on the environment parameters of the real scene, wherein the virtual scene corresponds to the real scene one by one.
Embodiments of the present application provide a computer program product comprising a computer program or computer-executable instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer-executable instructions from the computer-readable storage medium, and the processor executes the computer-executable instructions, so that the electronic device executes the illumination information processing method according to the embodiment of the application.
The present embodiments provide a computer-readable storage medium storing computer-executable instructions, which when executed by a processor, cause the processor to perform the illumination information processing method provided by the embodiments of the present application, for example, the illumination information processing method as shown in fig. 3A.
In some embodiments, the computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; but may be a variety of devices including one or any combination of the above memories.
In some embodiments, computer-executable instructions may be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, in the form of programs, software modules, scripts, or code, and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, computer-executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, such as in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or, alternatively, on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
In summary, according to the embodiments of the present application, the illumination difference information is obtained by acquiring the illumination information in the virtual scene and the real scene and performing comparison calculation on them, which improves the accuracy of acquiring the illumination difference between the virtual scene and the real scene; the configuration parameters corresponding to the supplemental light source are obtained based on the illumination difference information, the target luminaire for supplementing light is obtained from the luminaire database, and illumination synchronization is performed, based on the configuration information, for the real scene or virtual scene to be supplemented with illumination, thereby improving the accuracy and efficiency of illumination synchronization and saving the cost of illumination synchronization.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and scope of the present application are intended to be included within the scope of the present application.

Claims (18)

1. A method for processing illumination information, the method comprising:
acquiring real illumination information corresponding to a real scene;
obtaining virtual illumination information corresponding to a virtual scene;
comparing the virtual illumination information with the real illumination information to obtain illumination difference information;
performing light attenuation calculation processing based on the illumination difference information to obtain configuration parameters of at least one supplemental light source, wherein the supplemental light source is used for supplemental illumination;
matching the configuration parameters of each supplemental light source against a lamp database to obtain a target lamp in the lamp database corresponding to each supplemental light source; and
taking each target lamp and the configuration parameters corresponding to each target lamp as configuration information, and performing scene illumination synchronization based on the configuration information.
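Read as a data flow, claim 1 above is a fixed six-step pipeline. The Python sketch below strings the steps together under simplifying assumptions (illumination carried as HDR images, intensity reduced to a scene-wide mean, a lamp database of plain intensity ranges); every name is illustrative, and the sketches after claims 2, 5, 8 and 12 expand the individual stages.

    import numpy as np

    def run_illumination_sync(real_hdr, virtual_hdr, lamp_positions_m,
                              reference_pos_m, lamp_database):
        configuration_info = []
        # Steps 1-3: acquire both illumination captures and compare them (see claim 5).
        difference = virtual_hdr.astype(np.float32) - real_hdr.astype(np.float32)
        deficit = float(np.mean(-difference))  # > 0 means the virtual scene is darker

        if deficit > 0.0:
            for lamp_pos in lamp_positions_m:
                # Step 4: light-attenuation calculation (see claim 8; inverse-square law).
                d = float(np.linalg.norm(np.asarray(lamp_pos) - np.asarray(reference_pos_m)))
                params = {"intensity_cd": deficit * d ** 2}
                # Step 5: match the configuration parameters against the lamp database (see claim 12).
                for lamp in lamp_database:
                    lo, hi = lamp["intensity_cd"]
                    if lo <= params["intensity_cd"] <= hi:
                        # Step 6: the target lamp plus its parameters form the configuration information.
                        configuration_info.append({"lamp": lamp["model"], "params": params})
                        break
        return configuration_info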
2. The method of claim 1, wherein:
the real scene comprises at least one real reference object;
the obtaining of the real illumination information corresponding to the real scene comprises:
when a real lamp is set in the real scene, detecting sub-illumination information at the reference object position of each real reference object, and combining the pieces of sub-illumination information to obtain the real illumination information of the real scene; and
when no real lamp is set in the real scene, performing the following processing for each real reference object in the real scene:
shooting the real reference object with different exposure times to obtain a plurality of first low dynamic range images corresponding to the real reference object;
synthesizing a first high dynamic range image based on the first low dynamic range images, wherein the first high dynamic range image carries the sub-illumination information of the reference object position corresponding to the real reference object, and the pixel bit depth of the first high dynamic range image is higher than that of the first low dynamic range images; and
combining the sub-illumination information of the first high dynamic range images to obtain the real illumination information of the real scene.
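One plausible realization of the multi-exposure synthesis in claim 2 is OpenCV's Debevec HDR pipeline, sketched below. The file names and exposure times are invented for the example; the disclosure does not mandate this particular algorithm.

    import cv2
    import numpy as np

    # Bracketed shots of one real reference object (illustrative paths and times).
    paths = ["ref_exp_short.jpg", "ref_exp_mid.jpg", "ref_exp_long.jpg"]
    exposure_times = np.array([1/250.0, 1/60.0, 1/15.0], dtype=np.float32)
    ldr_stack = [cv2.imread(p) for p in paths]  # 8-bit low dynamic range frames

    # Recover the camera response curve, then merge the stack into a 32-bit
    # float HDR image: higher pixel bit depth than the inputs, with values
    # proportional to scene radiance (the sub-illumination information).
    response = cv2.createCalibrateDebevec().process(ldr_stack, exposure_times)
    hdr = cv2.createMergeDebevec().process(ldr_stack, exposure_times, response)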
3. The method of claim 2, wherein the virtual scene comprises at least one virtual reference object, and the position of each virtual reference object in the virtual scene corresponds one-to-one with the position of a real reference object in the real scene;
before the virtual illumination information corresponding to the virtual scene is obtained, the method further comprises:
performing the following for each virtual reference object: acquiring the real position, in the real scene, of the real camera used for shooting the corresponding real reference object, wherein the relative position of the real reference object in the real scene is the same as the relative position of the virtual reference object in the virtual scene; and
acquiring the virtual position corresponding to the real position in the virtual scene, and setting a virtual camera at the virtual position, wherein the virtual camera is used for acquiring second low dynamic range images corresponding to the virtual reference object.
4. The method of claim 3, wherein the obtaining of the virtual illumination information corresponding to the virtual scene comprises:
performing the following processing for each virtual reference object in the virtual scene:
shooting the virtual reference object with different exposure times using the virtual camera corresponding to the virtual reference object, to obtain a plurality of second low dynamic range images corresponding to the virtual reference object;
synthesizing a second high dynamic range image based on the second low dynamic range images, wherein the second high dynamic range image carries sub virtual illumination information of the reference object position corresponding to the virtual reference object, and the pixel bit depth of the second high dynamic range image is higher than that of the second low dynamic range images; and
obtaining the virtual illumination information of the virtual scene from the pieces of sub virtual illumination information.
5. The method of claim 1, wherein the real illumination information is carried by a first high dynamic range image of the real scene; the virtual illumination information is carried by a second high dynamic range image of the virtual scene;
the comparing of the virtual illumination information with the real illumination information to obtain the illumination difference information comprises:
performing image comparison processing on the first high dynamic range image and the second high dynamic range image to obtain a difference high dynamic range image; and
taking the illumination information carried by the difference high dynamic range image as the illumination difference information.
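Since both captures are radiance-proportional HDR images, the comparison in claim 5 can be as direct as a per-pixel subtraction. A minimal sketch, assuming aligned images of equal size:

    import numpy as np

    def illumination_difference(real_hdr: np.ndarray, virtual_hdr: np.ndarray) -> np.ndarray:
        # Per-pixel radiance difference between the two HDR captures.
        # Positive values: the virtual scene is brighter at that pixel;
        # negative values: the real scene is brighter.
        return virtual_hdr.astype(np.float32) - real_hdr.astype(np.float32)

The sign of the difference then indicates which scene is the one "to be supplemented with illumination" in claims 6 and 9.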
6. The method of claim 1, wherein the illumination difference information comprises at least one of: a differential illumination direction, a differential illumination intensity, a differential illumination color, and a differential illumination color temperature for the reference object;
when the virtual scene is to be supplemented with illumination, the performing of the light attenuation calculation processing based on the illumination difference information to obtain the configuration parameters of the at least one supplemental light source comprises:
determining the real lamp position of each supplemental light source based on the first lamp position of each real lamp in the real scene and the differential illumination direction, and mapping each real lamp position to a virtual lamp position in the virtual scene; and
performing the following processing for each supplemental light source:
acquiring a first distance between the real lamp position and a reference object position, wherein the reference object position is the position of a real reference object in the real scene, and the direction from the real lamp position towards the reference object position is a target direction;
performing light attenuation calculation processing based on the differential illumination intensity, the target direction, and the first distance to obtain the luminous intensity of the supplemental light source;
determining the light source color of the supplemental light source in the target direction based on the target direction and the differential illumination color corresponding to each differential illumination direction;
determining the light color temperature of the supplemental light source in the target direction based on the target direction and the differential illumination color temperature corresponding to each differential illumination direction; and
combining the target direction, the light color temperature, the light source color, the luminous intensity, and the virtual lamp position to obtain the configuration parameters of the supplemental light source.
7. The method of claim 6, wherein the determining of the real lamp position of each supplemental light source based on the first lamp position of each real lamp in the real scene and the differential illumination direction comprises:
when no virtual lamp is set in the virtual scene, taking the first lamp position of each real lamp in the real scene as the real lamp position of the corresponding supplemental light source; and
when a virtual lamp is set in the virtual scene, acquiring the first lamp position of each real lamp in the real scene and the reference object position of the real reference object, taking the direction from each first lamp position towards the reference object position as a real illumination direction, taking the real illumination direction parallel to the differential illumination direction as the target direction, and taking the first lamp position in the target direction as the real lamp position of the supplemental light source.
8. The method of claim 6, wherein the performing of the light attenuation calculation processing based on the differential illumination intensity, the target direction, and the first distance to obtain the luminous intensity of the supplemental light source comprises:
determining, based on the differential illumination intensity and the target direction, a first illumination intensity of the supplemental light source in the target direction for the reference object position; and
multiplying the square of the first distance by the first illumination intensity to obtain the luminous intensity of the supplemental light source.
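The arithmetic in claim 8 is the inverse-square law run backwards: a point source of luminous intensity I produces illuminance E = I / d^2 at distance d, so the source needed to contribute an extra E at the reference position is I = E * d^2. A sketch (the unit names are assumptions; the claim itself does not fix units):

    def supplemental_luminous_intensity(diff_illuminance_lux: float,
                                        distance_m: float) -> float:
        # Undo the inverse-square attenuation over the lamp-to-reference distance.
        return diff_illuminance_lux * distance_m ** 2

    # Example: adding 120 lux at a reference object 2.5 m from the lamp
    # position calls for a source of 120 * 2.5**2 = 750 candela.
    assert supplemental_luminous_intensity(120.0, 2.5) == 750.0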
9. The method of claim 1, wherein the illumination difference information comprises: a differential illumination direction, a differential illumination intensity, a differential illumination color, and a differential illumination color temperature for the reference object;
when the real scene is to be supplemented with illumination, the performing of the light attenuation calculation processing based on the illumination difference information to obtain the configuration parameters of the at least one supplemental light source comprises:
determining the virtual lamp position of each supplemental light source based on the second lamp position of each virtual lamp in the virtual scene and the differential illumination direction, and mapping each virtual lamp position to a real lamp position in the real scene; and
performing the following processing for each supplemental light source:
acquiring a second distance between the virtual lamp position and a reference object position, wherein the reference object position is the position of a virtual reference object in the virtual scene, and the direction from the virtual lamp position towards the reference object position is a target direction;
performing light attenuation calculation processing based on the differential illumination intensity, the target direction, and the second distance to obtain the luminous intensity of the supplemental light source;
when no real lamp is set in the real scene, acquiring the light color temperature and the light source color of the virtual lamp at the virtual lamp position;
when a real lamp is set in the real scene, determining the light source color of the supplemental light source based on the differential illumination color and the target direction, and determining the light color temperature of the supplemental light source based on the differential illumination color temperature; and
combining the target direction, the light color temperature, the light source color, the luminous intensity, and the real lamp position to obtain the configuration parameters of the supplemental light source.
10. The method of claim 9, wherein the determining of the virtual lamp position of each supplemental light source based on the second lamp position of each virtual lamp in the virtual scene and the differential illumination direction comprises:
when no real lamp is set in the real scene, taking the second lamp position of each virtual lamp in the virtual scene as the virtual lamp position of the corresponding supplemental light source; and
when a real lamp is set in the real scene, taking the direction from each second lamp position towards the reference object position as a virtual illumination direction, taking the virtual illumination direction parallel to the differential illumination direction as the target direction, and taking the second lamp position in the target direction as the virtual lamp position of the supplemental light source.
11. The method of claim 9, wherein the performing of the light attenuation calculation processing based on the differential illumination intensity, the target direction, and the second distance to obtain the luminous intensity of the supplemental light source comprises:
determining, based on the differential illumination intensity and the target direction, a second illumination intensity of the supplemental light source in the target direction for the reference object position; and
multiplying the square of the second distance by the second illumination intensity to obtain the luminous intensity of the supplemental light source.
12. The method of claim 1, wherein the matching of the configuration parameters of each supplemental light source with the lamp database to obtain the target lamp in the lamp database corresponding to each supplemental light source comprises:
acquiring performance parameters of the lamps in the lamp database, wherein the performance parameters of a lamp comprise value ranges of different types of illumination parameters;
performing the following processing for the configuration parameters of each supplemental light source:
acquiring the different types of illumination parameters in the configuration parameters, wherein the illumination parameters comprise illumination intensity and illumination color temperature; and
matching each type of illumination parameter against the performance parameters of the different lamps, and taking a lamp that satisfies a matching condition as the target lamp corresponding to the supplemental light source;
wherein the matching condition comprises: each type of illumination parameter of the supplemental light source falls within the value range of the corresponding illumination parameter of the lamp.
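A minimal sketch of the matching condition in claim 12, with a hypothetical two-lamp database in which every performance parameter is a (min, max) value range:

    LAMP_DATABASE = [
        {"model": "FresnelA", "intensity_cd": (100.0, 900.0), "color_temp_k": (2700.0, 6500.0)},
        {"model": "PanelB",   "intensity_cd": (50.0, 300.0),  "color_temp_k": (3200.0, 5600.0)},
    ]

    def match_target_lamp(config: dict, database: list):
        # A lamp satisfies the matching condition when every illumination
        # parameter of the supplemental light source falls within the lamp's
        # value range for that parameter type.
        for lamp in database:
            if all(lamp[name][0] <= value <= lamp[name][1]
                   for name, value in config.items() if name in lamp):
                return lamp
        return None

    print(match_target_lamp({"intensity_cd": 750.0, "color_temp_k": 5600.0}, LAMP_DATABASE))
    # -> the FresnelA record; 750 cd exceeds PanelB's 300 cd ceiling.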
13. The method of claim 1, wherein the performing of scene illumination synchronization based on the configuration information comprises:
when the virtual scene is to be supplemented with illumination, setting the corresponding virtual lamps in the virtual scene based on the configuration information, and configuring the parameters corresponding to each virtual lamp; and
when the real scene is to be supplemented with illumination, generating a lamp control signal for the real lamp corresponding to each target lamp based on the configuration information, and sending each lamp control signal to the corresponding real lamp in the real scene, wherein the lamp control signals are used for setting the configuration parameters of the real lamps.
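For the real-scene branch of claim 13, a lamp control signal might be serialized as below. The JSON-over-bytes wire format and field names are assumptions made for illustration; a physical rig would more likely drive fixtures over DMX512 or Art-Net, which the disclosure does not specify.

    import json

    def build_lamp_control_signal(lamp_id: str, params: dict) -> bytes:
        # Pack one target lamp's configuration parameters into a message
        # that sets the corresponding real fixture's parameters.
        message = {
            "lamp_id": lamp_id,
            "intensity_cd": params["intensity_cd"],
            "color_temp_k": params.get("color_temp_k"),
            "color_rgb": params.get("color_rgb", [255, 255, 255]),
        }
        return json.dumps(message).encode("utf-8")

    # Each signal is then sent to its lamp in the real scene, for example
    # over the lighting console's network API.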
14. The method of claim 1, wherein before the obtaining of the real illumination information corresponding to the real scene, the method further comprises:
acquiring environment parameters of the real scene, wherein the environment parameters comprise the size of the real scene; and
constructing the virtual scene based on the environment parameters of the real scene, wherein the virtual scene corresponds one-to-one with the real scene.
15. An illumination information processing apparatus, comprising:
an illumination acquisition module, configured to acquire real illumination information corresponding to a real scene;
the illumination acquisition module being further configured to acquire virtual illumination information corresponding to a virtual scene;
a difference acquisition module, configured to compare the virtual illumination information with the real illumination information to obtain illumination difference information;
a light source configuration module, configured to perform light attenuation calculation processing based on the illumination difference information to obtain configuration parameters of at least one supplemental light source, wherein the supplemental light source is used for supplemental illumination;
the light source configuration module being further configured to match the configuration parameters of each supplemental light source against a lamp database to obtain a target lamp in the lamp database corresponding to each supplemental light source; and
an illumination synchronization module, configured to take each target lamp and the configuration parameters corresponding to each target lamp as configuration information, and to perform scene illumination synchronization based on the configuration information.
16. An electronic device, the electronic device comprising:
a memory for storing computer executable instructions;
a processor for implementing the illumination information processing method according to any one of claims 1 to 14 when executing computer-executable instructions stored in the memory.
17. A computer-readable storage medium storing computer-executable instructions, wherein the computer-executable instructions, when executed by a processor, implement the illumination information processing method of any one of claims 1 to 14.
18. A computer program product comprising a computer program or computer executable instructions which, when executed by a processor, implement the illumination information processing method of any one of claims 1 to 14.
CN202211484001.9A 2022-11-24 2022-11-24 Illumination information processing method and device, electronic equipment and storage medium Pending CN116485704A (en)

Priority Applications (1)

Application Number: CN202211484001.9A; Priority Date: 2022-11-24; Filing Date: 2022-11-24; Title: Illumination information processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202211484001.9A; Priority Date: 2022-11-24; Filing Date: 2022-11-24; Title: Illumination information processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number: CN116485704A; Publication Date: 2023-07-25

Family ID: 87216614

Family Applications (1)

Application Number: CN202211484001.9A; Publication: CN116485704A (en); Status: Pending; Title: Illumination information processing method and device, electronic equipment and storage medium

Country Status (1)

Country: CN; Reference: CN116485704A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116740201A (en) * 2023-08-14 2023-09-12 苏州深捷信息科技有限公司 HDR field intensity calculation method, device and storage medium based on LDR image
CN116740201B (en) * 2023-08-14 2023-10-13 苏州深捷信息科技有限公司 HDR field intensity calculation method, device and storage medium based on LDR image
CN117496095A (en) * 2023-10-27 2024-02-02 神力视界(深圳)文化科技有限公司 Virtual-real light alignment method and device, electronic equipment and medium
CN118365839A (en) * 2024-06-19 2024-07-19 杭州群核信息技术有限公司 Method and device for generating lamplight display effect graph based on two-dimensional image

Similar Documents

Publication Publication Date Title
CN116485704A (en) Illumination information processing method and device, electronic equipment and storage medium
US20150338722A1 (en) System and method for re-configuring a lighting arrangement
CN111833423A (en) Presentation method, presentation device, presentation equipment and computer-readable storage medium
WO2016161486A1 (en) A controller for and a method for controlling a lighting system having at least one light source
WO2019127317A1 (en) Control method and control system for light fixture, and electronic device
CN116506993A (en) Light control method and storage medium
CN111698391B (en) Method for controlling real-time change of light parameters through simulated environment light parameters
US12015851B2 (en) System and method for visual enhancement of a scene during capture thereof
CN116486048A (en) Virtual-real fusion picture generation method, device, equipment and system
US10121451B2 (en) Ambient light probe
Pomaska Stereo vision applying opencv and raspberry pi
US20200257831A1 (en) Led lighting simulation system
US10621769B2 (en) Simplified lighting compositing
KR102657733B1 (en) Multi-spectral volumetric capture
US11979692B2 (en) Systems and methods for optimal color calibration for LED volume stages
US9615009B1 (en) Dynamically adjusting a light source within a real world scene via a light map visualization manipulation
JPH04212193A (en) Illumination control method
US20080247727A1 (en) System for creating content for video based illumination systems
CN114022646A (en) Virtual film production light synchronization method and device, storage medium and electronic equipment
CN115861502A (en) Weather rendering method and device in virtual environment, storage medium and electronic equipment
KR102677114B1 (en) Lighting matching system for real and virtual environments based on in-camera visual effects
CN117424970B (en) Light control method and device, mobile terminal and storage medium
CN110493540A (en) A kind of scene dynamics illumination real-time collecting method and device
Navvab et al. Evaluation of historical museum interior lighting system using fully immersive virtual luminous environment
Wang Measurement and application of spectrum curve for virtual digital studio simulation scene

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
REG: Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40091423; Country of ref document: HK)