CN117478860A - Method for establishing color calibration mapping relation, virtual shooting system and related device - Google Patents

Method for establishing color calibration mapping relation, virtual shooting system and related device

Info

Publication number
CN117478860A
Authority
CN
China
Prior art keywords: color, acquisition, image, shooting, rendering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311395095.7A
Other languages
Chinese (zh)
Inventor
李晓阳
陈石平
刘灏
华伟彤
梅大为
张欢
张中杰
刘杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenli Vision Shenzhen Cultural Technology Co ltd
Original Assignee
Shenli Vision Shenzhen Cultural Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenli Vision Shenzhen Cultural Technology Co ltd
Priority to CN202311395095.7A
Publication of CN117478860A
Legal status: Pending

Classifications

    • H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/327 Stereoscopic video systems; Image reproducers; Calibration thereof
    • H04N 13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N 13/15 Processing image signals for colour aspects of image signals
    • H04N 13/324 Image reproducers; Colour aspects
    • H04N 5/2224 Studio circuitry; Studio devices; Studio equipment related to virtual studio applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The present disclosure provides a method for establishing a color calibration mapping relationship, a virtual shooting system and a related device. A master control end maintains at least one conversion rule, each of which converts an acquisition signal in a specific format into an image in a specific color space. The method comprises: acquiring original colors and sending them to a rendering engine, which renders them into rendered images based on a target color space, the rendered images being displayed on a display screen; receiving acquisition signals transmitted by a shooting device while it shoots the display screen displaying the rendered images; acquiring the acquisition format of the acquisition signals and the target color space, and querying the adapted target conversion rule; converting the acquisition signals transmitted by the shooting device into acquisition images corresponding to the rendered images in the target color space based on the target conversion rule; and acquiring, from the converted acquisition images, the acquisition color corresponding to each original color, so as to establish the color calibration mapping relationship.

Description

Method for establishing color calibration mapping relation, virtual shooting system and related device
Technical Field
The disclosure relates to the technical field of virtual shooting, and in particular relates to a method for establishing a color calibration mapping relationship, a virtual shooting system and a related device.
Background
At present, some film and television productions adopt virtual shooting technology. On a shooting site, a rendering device renders a virtual scene and displays it on a display screen; actors perform in front of the display screen against the virtual scene it displays, and a shooting device shoots both the actors and the virtual scene. Before formal shooting, a color calibration mapping relationship between the rendering colors of the rendering device and the acquisition colors of the shooting device can be established in advance. This is because, under the influence of factors such as the performance of the shooting device or the external ambient light, the acquired colors in the image obtained by shooting the display screen may differ from the rendered colors in the rendered image displayed on the display screen. How to establish the color calibration mapping relationship quickly and conveniently is therefore a technical problem to be solved.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a method for establishing a color calibration mapping relationship, a virtual photographing system, and related devices.
According to a first aspect of embodiments of the present disclosure, a method for establishing a color calibration mapping relationship is provided, where the method is applied to a master control end in a virtual shooting system, where the master control end maintains at least one conversion rule, and each conversion rule is used for converting an acquisition signal in a specific format into an image in a specific color space; the virtual shooting system also comprises a rendering engine, a display screen and shooting equipment, and the method comprises the following steps:
acquiring one or more original colors and sending the one or more original colors to the rendering engine, rendering the one or more original colors into one or more rendered images by the rendering engine based on a target color space used by the rendering engine, and displaying each of the rendered images by the display screen;
receiving acquisition signals transmitted by the shooting equipment in the process of shooting the display screen to display each rendered image;
acquiring an acquisition format of an acquisition signal transmitted by the shooting equipment and acquiring a target color space used by the rendering engine, and further inquiring a target conversion rule adapting to the acquisition format and the target color space from the maintained at least one conversion rule;
based on the target conversion rule, converting the acquisition signals transmitted by the shooting equipment into acquisition images corresponding to each rendering image in the target color space;
and acquiring the acquired color corresponding to each original color from the acquired image obtained through conversion so as to further establish a color calibration mapping relation.
According to a second aspect of embodiments of the present disclosure, there is provided an apparatus for establishing a color calibration mapping relationship, where the apparatus is applied to a master terminal in a virtual photographing system, and the master terminal maintains at least one conversion rule, where each conversion rule is used to convert an acquisition signal in a specific format into an image in a specific color space; the virtual shooting system further comprises a rendering engine, a display screen and shooting equipment, and the device comprises:
the sending module is used for acquiring one or more original colors and sending the one or more original colors to the rendering engine, the rendering engine renders the one or more original colors into one or more rendered images based on a target color space used by the rendering engine, and the display screen displays each rendered image;
a receiving module for: receiving acquisition signals transmitted by the shooting equipment in the process of shooting the display screen to display each rendered image;
a query module for: acquiring an acquisition format of an acquisition signal transmitted by the shooting equipment and acquiring a target color space used by the rendering engine, and further inquiring a target conversion rule adapting to the acquisition format and the target color space from the maintained at least one conversion rule;
a conversion module for: based on the target conversion rule, converting the acquisition signals transmitted by the shooting equipment into acquisition images corresponding to each rendering image in the target color space;
the establishing module is used for: and acquiring the acquired color corresponding to each original color from the acquired image obtained through conversion so as to further establish a color calibration mapping relation.
According to a third aspect of embodiments of the present specification, there is provided a virtual photographing system including a main control computer, a rendering device, a display screen, and a photographing device; the main control computer runs a main control program, and the main control program realizes the steps of the method embodiment of the first aspect when being executed by the processor.
According to a fourth aspect of embodiments of the present specification, there is provided a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method embodiments of the first aspect are implemented when the computer program is executed by the processor.
According to a fifth aspect of embodiments of the present specification, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method embodiments of the first aspect described above.
The technical scheme provided by the embodiment of the specification can comprise the following beneficial effects:
in the embodiments of this specification, the master control end maintains at least one conversion rule, each of which converts an acquisition signal in a specific format into an image in a specific color space. The master control end can therefore acquire one or more original colors and send them to the rendering engine; the rendering engine renders the one or more original colors into one or more rendered images based on the target color space it uses, and the display screen displays each rendered image. The master control end receives the acquisition signals transmitted by the shooting device while it shoots the display screen displaying each rendered image; acquires the acquisition format of those signals and the target color space used by the rendering engine, and queries, from the maintained conversion rules, the target conversion rule adapted to that format and color space; converts the acquisition signals into acquisition images corresponding to each rendered image in the target color space based on the target conversion rule; and acquires, from the converted acquisition images, the acquisition color corresponding to each original color, so as to establish the color calibration mapping relationship. Because the master control end maintains multiple conversion rules, it can adapt to multiple different shooting devices, and it can both convert the acquisition signals transmitted by the shooting device into the color space of the rendering engine and establish the color calibration mapping relationship. No third-party color-grading software needs to be introduced, and the user does not need to manually copy data from the storage medium of the shooting device, which improves the efficiency of establishing the color calibration mapping relationship.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the specification and together with the description, serve to explain the principles of the disclosure.
Fig. 1A is a schematic view of a scene of a virtual shot according to an exemplary embodiment of the present disclosure.
Fig. 1B is a scene diagram illustrating establishment of a color calibration mapping relationship in the related art according to an exemplary embodiment of the present specification.
Fig. 2A is a flowchart illustrating a method for establishing a color calibration mapping relationship according to an exemplary embodiment of the present disclosure.
Fig. 2B is a schematic diagram of an output color calibration comparison result according to an exemplary embodiment of the present disclosure.
Fig. 2C is a schematic diagram of a color calibration comparison result before and after calibration of a certain original color according to an exemplary embodiment of the present disclosure.
Fig. 2D is a schematic diagram of a scenario illustrating the establishment of a color calibration mapping relationship according to an exemplary embodiment of the present disclosure.
Fig. 3 is a hardware configuration diagram of a computer device where a device for establishing a color calibration mapping relationship is located according to an exemplary embodiment of the present disclosure.
Fig. 4 is a block diagram of a device for establishing a color calibration map according to an exemplary embodiment of the present specification.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present specification. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present description as detailed in the accompanying claims.
The terminology used in the description presented herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this specification to describe various information, the information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, without departing from the scope of this specification, the first information may also be referred to as second information, and similarly, the second information may also be referred to as first information. The word "if" as used herein may, depending on the context, be interpreted as "when", "upon", or "in response to determining".
User information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in this disclosure are information and data authorized by the user or fully authorized by all parties; the collection, use and processing of the relevant data must comply with the relevant laws, regulations and standards of the relevant countries and regions, and corresponding operation entries are provided for the user to choose to authorize or refuse.
Virtual shooting is a shooting technique that constructs a virtual background using display screens. It combines real-time rendering with LED (Light Emitting Diode) display technology and can present a realistic virtual environment on site in real time, replacing traditional green-screen or blue-screen shooting. The main principle of virtual shooting is to digitize the background and render it in real time onto a set of large display screens. The displays are arranged around the shooting area, and the displayed content can be adjusted in real time as required to present a realistic virtual background.
As shown in Fig. 1A, which is a schematic diagram of a virtual shooting scene according to an exemplary embodiment of the present disclosure, the scene may include a virtual shooting system formed by one or more computer devices. As an example, the virtual shooting scene may include a combination of one or more of the following devices: one or more main control computers 011, one or more rendering devices 021 (which may also be referred to as on-screen machines or screen loaders), one or more broadcast control processing devices 031, one or more display screens 040 (three are shown: display screen 041, display screen 042 and display screen 043), and one or more shooting devices 051. The number of each kind of device can be flexibly configured according to actual needs, which is not limited in this embodiment. In practical applications, the virtual shooting system may further include other devices as needed, such as mobile terminals or network devices, which is likewise not limited in this embodiment.
Optionally, each main control computer 011 may be connected to one or more rendering devices 021; the specific connection manner can be selected according to actual requirements and the compatibility of the devices. As an example, a wired or wireless connection may be made through a local area network or the Internet, and communication may use a network transmission protocol. As an example, the main control computer may send various control instructions to the rendering devices connected to it, for example control instructions containing specific image information.
Optionally, each rendering device 021 may be connected to one or more broadcast control processing devices 031; the specific connection mode can be selected according to the actual requirements and the compatibility of the equipment. As an example, a DP (DisplayPort, a digital display interface standard) connection may be included, which may be used to transmit high quality audio and video signals. Or HDMI (High-Definition Multimedia Interface, a High-definition digital audio-video interface standard), which can combine audio, video and control signals for transmission over one cable. As an example, the rendering device 021 may send various control instructions to a broadcast control processing device connected thereto, for example, the rendering device may serve as an image signal source, send a control instruction containing a rendering image, and the like.
In practice, the broadcast control processing device 031 is optional and may be omitted in some scenarios. Optionally, each broadcast control processing device 031 may be connected to one or more display screens 040 (display screen 041, display screen 042 and display screen 043 are shown in the figure); the specific connection manner can be selected according to actual requirements and the compatibility of the devices. As examples, DP or HDMI connections may be used, as well as USB (Universal Serial Bus) or network connections. The broadcast control processing device may be used to control and manage the display screens connected to it. As an example, the broadcast control processing device 031 may be used for data transmission and decoding; for instance, it may receive a signal from an external source (e.g., an on-screen machine, a computer, a mobile terminal or a media player) and decode it into a format suitable for display on the display screen. It may also be used for display control, such as the overall control and scheduling of a display screen, including brightness adjustment, color correction and gray-scale control; and it may be used for partition management of the display screen, whereby the display screen is divided into several independent areas, each of which can display different content.
Optionally, each main control computer 011 may be connected to one or more shooting devices 051; the specific connection manner can be selected according to actual requirements and the compatibility of the devices. As examples, a wired connection such as HDMI or SDI (Serial Digital Interface, a digital video transmission standard) may be used, or a wireless connection such as Wi-Fi (Wireless Fidelity) or RF (Radio Frequency). The shooting device 051 may transmit the captured data to the main control computer.
Alternatively, the display screen 040 may be an LED screen, a liquid crystal screen, or other types, and may be a curved screen or a flat screen, and it should be understood that, according to actual needs, those skilled in the art may set the type, number, size, resolution, etc. of the display screen in the virtual shooting system in a user-defined manner, which is not limited in this embodiment of the present disclosure. It should be understood that the embodiments of this specification are not limited in the manner in which the devices communicate with each other.
Virtual shooting requires the display screen to present a rendered background picture, and the acquired colors obtained by shooting the display screen with the shooting device may be inconsistent with the rendering colors received by the display screen. The reasons include the following. First, different brands of display screens, and even different batches of the same brand, differ in color reproduction accuracy because their quality control standards are inconsistent. Second, the spectral distribution of an LED or similar display screen is relatively narrow, so small differences in the response of the sensor in the shooting device may cause the picture acquired by the shooting device to show obvious color differences from the rendered picture on the display screen, and different shooting devices behave inconsistently on the same display screen. Therefore, there is a certain distortion between the rendering colors displayed by the display screen and the acquired colors shot by the shooting device, so color calibration needs to be performed for each display screen and shooting device on the virtual shooting site to ensure that the acquired colors obtained by the shooting device are consistent with the rendering colors received by the display screen.
Some rendering devices may run a rendering engine, such as UE (Unreal Engine), to render virtual scenes in real time. The rendering engine has its own color space, and shooting devices of different brands or models also have their own color processing systems; a shooting device typically records in a Raw (raw image format) format. Fig. 1B is a scene diagram of establishing a color calibration mapping relationship in the related art, according to an exemplary embodiment of this specification. In that solution, the rendering engine renders a self-luminous RGB image in a no-illumination mode and displays it on an LED screen; after the shooting device shoots the picture on the LED screen, the user is required to copy the Raw data collected by the shooting device from the storage medium of the shooting device, for example an SD (Secure Digital) card, into third-party color-grading software A adapted to the shooting device. Because the Raw data are the original data read directly from the image sensor of the shooting device during shooting, the third-party color-grading software A is needed to convert the Raw data into the color space of the rendering engine; and because different shooting devices employ different color processing algorithms, different software may be required to convert the Raw data of different shooting devices into the color space of the rendering engine. In addition, after the conversion, a color calibration mapping relationship LUT (Look-Up Table) is generated based on the rendering colors and the acquisition colors. The aforementioned third-party color-grading software A adapted to the shooting device may not have a LUT-generation function, in which case other third-party color-grading software B may also need to be run to generate the color calibration mapping relationship LUT. Finally, the generated color calibration mapping relationship LUT is imported into the rendering engine for subsequent use by the rendering engine.
It can be seen that, in a virtual shooting scene, the process of obtaining the color calibration mapping relationship during color calibration is cumbersome. For example, the user needs to manually import the data recorded on the SD card by the shooting device into the third-party color-grading software; in addition, different third-party color-grading software may need to be used several times to complete the conversion of the acquired colors and the output of the color calibration mapping relationship, which is inefficient and error-prone.
On this basis, the embodiments of this specification provide a method for establishing a color calibration mapping relationship, which can be applied to a master control end in a virtual shooting system; the virtual shooting system further comprises a rendering engine, a display screen and a shooting device. Fig. 2A is a flowchart of a method for establishing a color calibration mapping relationship according to an exemplary embodiment of this specification. The method embodiment is applicable to the master control end in the virtual shooting system, where the master control end maintains at least one conversion rule. The method may comprise the following steps:
step 202, obtaining one or more original colors and sending the obtained one or more original colors to the rendering engine, rendering the one or more original colors into one or more rendered images by the rendering engine based on a target color space used by the rendering engine, and displaying each rendered image by the display screen.
Step 204, receiving an acquisition signal transmitted by the shooting device in the process of shooting the display screen to display each rendered image.
Step 206, acquiring an acquisition format of an acquisition signal transmitted by the shooting device and acquiring a target color space used by the rendering engine, and further inquiring a target conversion rule adapting to the acquisition format and the target color space from the maintained at least one conversion rule.
Step 208, converting the acquired signals transmitted by the shooting device into acquired images corresponding to each of the rendered images in the target color space based on the target conversion rule.
Step 210, acquiring an acquisition color corresponding to each original color from the acquired image obtained by conversion, so as to further establish a color calibration mapping relationship.
In some examples, the master control end to which the method of this embodiment is applied may be a software program running on the main control computer in the embodiment shown in Fig. 1A, where the main control computer may specifically be a computer device, including but not limited to a server, a cloud server, a server cluster, a tablet computer, a personal digital assistant, a laptop computer, or a desktop computer.
The original colors in step 202 may be configured by a user; for example, the master control end may provide a configuration interface through which one or more original colors configured by the user are obtained. Alternatively, the master control end may automatically generate a number of original colors using a color generation rule. This embodiment does not limit the number of original colors or the specific colors, which can be configured as needed in practical applications.
As an example, a color may be represented by a color value, such as an RGB (Red, Green, Blue) value, where each color channel may take a value between 0 and 255; alternatively, the values 0 to 255 may be normalized before subsequent processing. As an example, n different values may be uniformly sampled for each color channel to form all the original colors, where the value of n can be flexibly configured. Taking n = 16 as an example, there are a total of 4096 (16 × 16 × 16) different original colors.
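Purely as an illustration of the uniform sampling just described, a minimal Python sketch follows; the helper name, the normalization to [0, 1] and the printout are assumptions added for clarity, not part of the disclosure.

```python
import numpy as np

def generate_original_colors(n: int = 16) -> np.ndarray:
    """Uniformly sample n code values per RGB channel and return all n**3 original colors,
    normalized to [0, 1]; with n = 16 this yields 4096 colors."""
    levels = np.linspace(0, 255, n) / 255.0          # n evenly spaced values in 0..255, normalized
    r, g, b = np.meshgrid(levels, levels, levels, indexing="ij")
    return np.stack([r.ravel(), g.ravel(), b.ravel()], axis=1)   # shape (n**3, 3)

colors = generate_original_colors(16)
print(colors.shape)  # (4096, 3)
```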
The rendering engine of this embodiment may be installed on the rendering device of the embodiment shown in Fig. 1A. In practical applications, the rendering engine may be of various kinds, for example the aforementioned real-time rendering engine UE (Unreal Engine), which is not limited in this embodiment. The rendering device may also be loaded with other software as needed; for example, nDisplay, a high-performance multi-screen display management system based on Unreal Engine, may be loaded. nDisplay is a plug-in of UE, mainly used to render the same frame of one virtual scene synchronously on multiple devices, with each device rendering a part of the picture, ensuring that the pictures rendered by all devices have correct view frustums and are seamlessly joined.
In step 202, the master control end may send each original color to the rendering engine. All the original colors may be sent to the rendering engine at once, and the rendering engine, after receiving all the original colors, renders a rendered image for each original color; or the master control end may send each original color separately, which is not limited in this embodiment. A rendered image may be obtained by rendering one original color value or by rendering several original color values, which is also not limited in this embodiment. The rendering engine has its own color space, and when it renders an image based on an original color, it renders based on the target color space it uses. In practical applications, different rendering engines can use different color spaces, and the same rendering engine may be configured with several different color spaces, one of which is selected for use based on the user's configuration or the like; the specific color space is not limited in this embodiment and may include, for example, Rec.709 or Rec.2020. Rec.709 refers to the ITU-R BT.709 standard, where ITU-R stands for International Telecommunication Union - Radiocommunication Sector and BT is an abbreviation of Broadcasting Television; the ITU-R BT.709 recommendation, issued by the International Telecommunication Union for the broadcast television field, defines the chromaticity coordinates of the red, green and blue primaries, the luminance range, and the range of digitally encoded values used to represent an image. The standard specifies an accurate representation of color and brightness in high-definition television so that images remain consistent across different devices. Rec.2020 refers to the ITU-R BT.2020 standard, also known as the UHDTV or ultra-high-definition television standard; it is a color space widely used for High Dynamic Range (HDR) and Wide Color Gamut (WCG) content. The color gamut of Rec.2020 is wider than that of Rec.709, and its description of the luminance range is also more precise.
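For reference, the chromaticity coordinates that the two recommendations assign to the red, green and blue primaries (with a D65 white point) can be written down directly; the sketch below records these standard values and compares the areas of the two gamut triangles. The dictionary layout and the area comparison are illustrative only.

```python
# CIE xy chromaticity coordinates of the primaries and the D65 white point
REC_709 = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),
}

REC_2020 = {
    "red":   (0.708, 0.292),
    "green": (0.170, 0.797),
    "blue":  (0.131, 0.046),
    "white": (0.3127, 0.3290),
}

def gamut_area(primaries: dict) -> float:
    """Area of the RGB triangle in the xy plane, a rough proxy for gamut size."""
    (x1, y1), (x2, y2), (x3, y3) = primaries["red"], primaries["green"], primaries["blue"]
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

print(gamut_area(REC_2020) / gamut_area(REC_709))  # roughly 1.9: Rec.2020 covers a much wider gamut
```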
In this embodiment, the master control end may receive an acquisition signal transmitted by the capturing device in a process of capturing each of the rendered images displayed on the display screen. For example, the main control computer where the main control terminal is located can be connected with the shooting equipment in a wired or wireless mode, so that the acquisition signal transmitted by the shooting equipment is acquired through the connection between the main control computer and the shooting equipment.
In some examples, the main control terminal operates on a main control computer, the main control computer is configured with a video acquisition card, and the video acquisition card is connected with the shooting device through a transmission line.
The receiving the acquisition signal transmitted by the shooting device in the process of shooting the display screen to display each rendered image may include:
and receiving acquisition signals transmitted by the shooting equipment in the process of shooting the display screen to display each rendered image through the video acquisition card and the transmission line.
The transmission line may be configured as needed; for example, it may be an SDI transmission line or an HDMI transmission line, which is not limited in this embodiment. The video acquisition card configured on the main control computer corresponds to the transmission line: for example, if the main control computer is configured with an SDI video acquisition card, it is connected to the shooting device through an SDI transmission line. In this way, the shooting device can transmit the captured video stream signal to the main control computer in real time. In some examples, SDI may be chosen for the virtual shooting scene of this embodiment, since the SDI interface supports long-distance transmission and ensures stable signal quality.
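A minimal sketch of receiving the camera's video stream on the main control computer, assuming the SDI capture card is exposed to the operating system as an ordinary video device that OpenCV can open; the device index and the number of frames collected are hypothetical, and a vendor capture SDK could be used instead.

```python
import cv2  # assumes the SDI capture card is visible as a system video device

CAPTURE_DEVICE_INDEX = 0  # hypothetical index of the capture card

cap = cv2.VideoCapture(CAPTURE_DEVICE_INDEX)
if not cap.isOpened():
    raise RuntimeError("capture card not available")

frames = []
while len(frames) < 100:          # collect a short burst of frames for illustration
    ok, frame = cap.read()        # each frame is an image array as delivered by the card
    if not ok:
        break
    frames.append(frame)

cap.release()
print(f"received {len(frames)} frames")
```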
The shooting device can output acquisition signals in various formats; in practical applications, it may be configured to output a Raw signal, a Log signal, or the like. Different shooting devices may output acquisition signals in different formats.
A Log signal is an image encoding based on a logarithmic transformation, used to record a large dynamic range. The sensor of the shooting device can capture a fairly broad range of brightness, but standard display devices (e.g., televisions, monitors) cannot reproduce such a large range at once because their display capability is limited. To solve this problem, the shooting device can convert the captured Raw data into a Log signal, compressing a larger range of luminance information into a range that the display device can reproduce. Through the Log signal, the darker and brighter details captured by the shooting device can be better preserved and adjusted in post-processing. Shooting devices from different manufacturers or of different models may adopt different Log function curves, so different shooting devices may output Log signals in different formats.
In this embodiment, the master control end maintains at least one conversion rule, where each conversion rule is used to convert an acquisition signal in a specific format into an image in a specific color space. For example, the Log signal of a shooting device is obtained by compressing the Raw data into a smaller range through a built-in Log function, so that it can be better processed and color-graded later. However, some image information is lost in this process, especially in terms of brightness and color detail. Therefore, when performing post color grading, compositing, output and so on, the image information needs to be restored and the Log signal converted into a standard color space, such as Rec.709 or Rec.2020. To achieve the conversion of Log signals into a standard color space, camera manufacturers may provide conversion LUTs (look-up tables). These LUTs contain pre-calculated values used in post-processing software to convert the footage into a specific color space. Different camera manufacturers provide different LUTs to match the Log functions and color spaces they use.
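The idea of a Log curve and of restoring linear light from it can be sketched with a generic logarithmic transfer function; the constants below are illustrative and do not correspond to any vendor's actual Log curve.

```python
import numpy as np

A, B = 5.0, 0.6  # illustrative constants only; real cameras use vendor-specific curves

def log_encode(linear: np.ndarray) -> np.ndarray:
    """Compress linear scene light into a smaller code-value range."""
    return B * np.log1p(A * np.clip(linear, 0.0, None)) / np.log1p(A)

def log_decode(code: np.ndarray) -> np.ndarray:
    """Invert the curve to recover linear light before converting to a standard color space."""
    return np.expm1(code / B * np.log1p(A)) / A

x = np.linspace(0.0, 1.0, 5)
print(np.allclose(log_decode(log_encode(x)), x))  # True: the curve is invertible
```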
On this basis, in this embodiment the master control end may maintain multiple conversion rules for different shooting devices, so as to support converting acquisition signals in multiple different formats into multiple different color spaces. In an actual implementation, the conversion rules from the acquisition signals of several shooting devices to different color spaces can be obtained in advance and configured on the master control end. For example, the conversion rules maintained by the master control end may include: a conversion rule from the acquisition signal of format 1 to color space 1; a conversion rule from the acquisition signal of format 1 to color space 2; a conversion rule from the acquisition signal of format 1 to color space 3; a conversion rule from the acquisition signal of format 2 to color space 1; a conversion rule from the acquisition signal of format 2 to color space 2; and so on.
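A minimal sketch of how such a set of conversion rules might be organized and queried on the master control end; the format names, the dictionary layout and the placeholder conversion functions are assumptions for illustration.

```python
from typing import Callable, Dict, Tuple
import numpy as np

# A conversion rule turns a frame in a given acquisition format into an image
# in a given color space; the registry is keyed by (acquisition format, color space).
ConversionRule = Callable[[np.ndarray], np.ndarray]

CONVERSION_RULES: Dict[Tuple[str, str], ConversionRule] = {
    ("vendorA_log", "Rec.709"):  lambda frame: frame,  # placeholder for vendor A's Log-to-709 conversion
    ("vendorA_log", "Rec.2020"): lambda frame: frame,  # placeholder for vendor A's Log-to-2020 conversion
    ("vendorB_log", "Rec.709"):  lambda frame: frame,  # placeholder for vendor B's Log-to-709 conversion
}

def query_target_rule(acquisition_format: str, target_color_space: str) -> ConversionRule:
    """Look up the conversion rule adapted to the camera format and the engine's color space."""
    try:
        return CONVERSION_RULES[(acquisition_format, target_color_space)]
    except KeyError:
        raise ValueError(
            f"no conversion rule maintained for {acquisition_format} -> {target_color_space}"
        ) from None

rule = query_target_rule("vendorA_log", "Rec.709")
```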
Thus, in step 206, during the current color calibration process, the master control end may acquire the acquisition format of the acquisition signal transmitted by the current shooting device and the target color space used by the rendering engine, and then query the adapted target conversion rule from the maintained conversion rules.
The acquisition format of the acquisition signal transmitted by the current shooting device can be acquired in various modes. For example, the master may provide a format configuration interface for configuring the format of the acquisition signal; the acquiring the acquisition format of the acquisition signal transmitted by the shooting equipment comprises the following steps: and responding to a received call request for the format configuration interface initiated by a user, and acquiring the format of an acquisition signal of the shooting equipment configured by the user, wherein the acquisition signal is contained in the call request.
The format configuration interface can be specifically realized through a graphical user interface, so that the configuration of the format of the collected signals is provided for a user. The user can input the format of the acquisition signal through the interaction element (such as a text box, a drop-down menu, a check box and the like) on the interface, and the interface generates a call request based on the configuration of the user, so that the main control terminal can acquire the format of the acquisition signal of the shooting device configured by the user, which is contained in the call request.
The master control end may provide a configuration interface for configuring the target color space used by the rendering engine, through which the information on the target color space used by the rendering engine, as submitted by the user, is acquired. Alternatively, the master control end can also connect directly to the rendering engine: the rendering engine may provide an interactive interface, the master control end calls that interface, and the rendering engine feeds back the target color space it uses through the interface.
In step 208, the master control end may convert the acquired signal transmitted by the capturing device into an acquired image corresponding to each of the rendered images in the target color space based on the target conversion rule. For example, when the display screen displays the rendered images, the photographing device photographs and transmits the rendered images to the main control terminal in real time, the main control terminal can determine the time when each rendered image is displayed on the display screen, further extract video frame images corresponding to each rendered image from the acquired signals, and then convert the video frame images into the acquired images under the target color space based on the target conversion rule.
In some examples, the display screen displays each of the rendered images at a set time; the acquisition signal is a video acquisition signal.
The converting, based on the target conversion rule, the acquired signal transmitted by the capturing device into the acquired image corresponding to each of the rendered images in the target color space may include:
based on the display time of each rendering image, acquiring video frame images corresponding to each rendering image from video acquisition signals transmitted by the shooting equipment;
and based on the target conversion rule, converting each acquired video frame image into an acquisition image corresponding to each rendering image in the target color space.
In this embodiment, while the display screen displays each rendered image, the shooting device captures a video acquisition signal, i.e., a video stream. The master control end can extract, from the captured video, the video frame image corresponding to each rendered image according to the moment at which that rendered image is displayed, and use it as the acquisition image. The master control end needs to accurately determine the moment at which each rendered image is displayed on the display screen. For example, the display screen displays rendered image 1 at time t1, rendered image 2 at time t2, and rendered image 3 at time t3. The display screen displays each rendered image in real time, and the shooting device acquires and transmits signals to the master control end in real time. In this way, the master control end can obtain the display time of each rendered image, obtain the video frame image corresponding to each rendered image from the video acquisition signal transmitted by the shooting device, and then, based on the target conversion rule, convert each obtained video frame image into the acquisition image corresponding to each rendered image in the target color space.
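A sketch of this frame-matching step, under the assumptions that each captured frame carries a timestamp, that the display time of each rendered image is known to the master control end, and that a small settle delay is acceptable; all names and the delay value are illustrative.

```python
from typing import Callable, List, Tuple
import numpy as np

Frame = Tuple[float, np.ndarray]  # (capture timestamp in seconds, image array)

def match_frames_to_rendered_images(
    frames: List[Frame],
    display_times: List[float],
    rule: Callable[[np.ndarray], np.ndarray],
    settle: float = 0.2,
) -> List[np.ndarray]:
    """For each rendered image, pick the captured frame closest to its display time
    (plus a small settle delay) and convert it into the target color space."""
    acquisition_images = []
    for t in display_times:
        target = t + settle                                   # let the screen/camera pipeline settle
        _, frame = min(frames, key=lambda f: abs(f[0] - target))
        acquisition_images.append(rule(frame))                # apply the target conversion rule
    return acquisition_images
```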
Further, in step 210, the acquisition color corresponding to each original color may be obtained from the converted acquisition images, so as to establish the color calibration mapping relationship. For example, if there are m acquisition colors and m original colors, m one-to-one correspondences between acquisition colors and original colors are obtained, and the color calibration mapping relationship, i.e. a LUT table, can then be generated from these m correspondences. The specific generation process can be implemented by configuring a LUT generation algorithm on the master control end, which the master control end calls to generate the LUT.
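The disclosure leaves the LUT-generation algorithm to a module built into the master control end. Purely to illustrate the idea, the sketch below builds a 3D LUT from the m color pairs by nearest-neighbour inversion of the measured rendering-to-acquisition response; the grid size and the method are assumptions, and a production generator would interpolate between samples.

```python
import numpy as np
from scipy.spatial import cKDTree

def build_calibration_lut(original: np.ndarray, acquired: np.ndarray, size: int = 17) -> np.ndarray:
    """Build a size x size x size 3D LUT that maps a desired color to the color the engine
    should render so that the camera acquires (approximately) that desired color.

    original, acquired: (m, 3) arrays of color values in [0, 1]."""
    axis = np.linspace(0.0, 1.0, size)
    r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([r.ravel(), g.ravel(), b.ravel()], axis=1)   # (size**3, 3) query colors

    _, idx = cKDTree(acquired).query(grid)  # nearest measured acquisition color for each grid point
    lut = original[idx]                     # output the rendering color of that measured pair
    return lut.reshape(size, size, size, 3)
```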
In some examples, the method may further comprise:
and sending the established color calibration mapping relation to the rendering engine so that the rendering engine can locally complete the configuration of the received color calibration mapping relation.
In this embodiment, the color calibration mapping relationship established by the master control end may be sent to the rendering engine. In a specific sending process, the rendering engine may provide a relevant configuration interface; the master control end generates a call request for that configuration interface, where the call request contains the established color calibration mapping relationship; the rendering engine obtains the call request through the interface, obtains the color calibration mapping relationship established by the master control end from the request, and completes the configuration of the received color calibration mapping relationship locally. Afterwards, the rendering engine can convert the virtual scene images to be displayed based on the configured color calibration mapping relationship and output them to the display screen for display.
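A sketch of handing the established mapping to the rendering engine, assuming, hypothetically, that the engine exposes an HTTP configuration endpoint; the URL, endpoint name and payload layout are invented for illustration, and the actual configuration interface is engine-specific.

```python
import json
import urllib.request
import numpy as np

def send_lut_to_engine(lut: np.ndarray, engine_url: str = "http://render-host:8080/config/lut") -> None:
    """POST the 3D LUT to the rendering engine's (hypothetical) configuration interface."""
    payload = json.dumps({
        "size": lut.shape[0],                  # e.g. 17 for a 17x17x17 LUT
        "table": lut.reshape(-1, 3).tolist(),  # flattened RGB triples
    }).encode("utf-8")
    req = urllib.request.Request(
        engine_url, data=payload, headers={"Content-Type": "application/json"}, method="POST"
    )
    with urllib.request.urlopen(req) as resp:  # the engine then configures the LUT locally
        resp.read()
```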
In some examples, to intuitively present the calibration result of the color calibration mapping to the user, the method may further include:
after the color calibration mapping relation is established, outputting a color calibration comparison result to a user; wherein, the color calibration comparison result may include: comparing the original color with the collected color corresponding to the original color and comparing the original color with the collected color corresponding to the calibration color; the calibration color is a color obtained by calibrating the original color through the color calibration mapping relation.
The output color calibration comparison result may involve the comparison of one original color, or may include comparisons of multiple original colors; the specific number of colors is not limited in this embodiment.
For example, Fig. 2B is a schematic diagram of a color calibration comparison result in this embodiment; gray scales are used here for illustration, while color patches would be used in practical applications. The figure shows a comparison of 64 original colors. The comparison chart on the left side of Fig. 2B contains 64 pairs of color patches, each pair representing an original color and the acquisition color corresponding to that original color: the left patch in each pair represents the original color (not calibrated by the color calibration mapping relationship), and the right patch represents the acquisition color corresponding to the original color. The comparison chart on the right side of Fig. 2B also contains 64 pairs of color patches, each pair representing an original color and the acquisition color corresponding to its calibration color: the left patch in each pair represents the original color, and the right patch represents the acquisition color corresponding to the calibration color (i.e., the color obtained by calibrating the original color through the color calibration mapping relationship).
The description is made with reference to Fig. 2C, a schematic comparison of one of the original colors, which can be understood as the pair of color patches at the same position taken from the 64 pairs on the left side and the 64 pairs on the right side of Fig. 2B respectively. In Fig. 2C, the left pair of patches contains an original color C and an acquisition color C': without calibration, the rendering engine renders an image of the original color C and displays it on the display screen, the shooting device shoots that image, and the acquisition color C' is obtained; this acquisition color C' can be obtained through the process of the foregoing embodiment of the method for establishing the color calibration mapping relationship.
The right pair of patches in Fig. 2C contains the original color C and an acquisition color C1'. Specifically, the master control end may convert the original color C into a calibration color C1 using the established color calibration mapping relationship. The calibration color C1 can be understood as follows: during formal shooting, if the master control end sends the original color C to the rendering engine, the rendering engine converts it, based on the color calibration mapping relationship configured on it, into the calibration color C1, which is rendered and displayed on the display screen. The master control end further needs to obtain the acquisition color C1' corresponding to the calibration color C1; since the foregoing embodiment already performed data acquisition before the color calibration mapping relationship was established and the shooting device acquired a number of acquisition colors, it is sufficient to query the acquisition color C1' corresponding to C1. In this way, the comparison between the original color C and the acquisition color C1' can be displayed; and by comparing C' with C1', the user can also perceive the difference between the camera's acquisition colors before and after calibration.
In some examples, the color calibration comparison result may further include:
color difference between the original color and the collected color corresponding to the original color; and a color difference between the original color and the collected color corresponding to the calibration color.
In this embodiment, the color difference between the two color values can be calculated from the color value of the original color and the color value of the acquisition color corresponding to the original color. Similarly, the color difference between the two color values can be calculated from the color value of the original color and the color value of the acquisition color corresponding to the calibration color. For example, for Fig. 2C, the color difference between C and C' and the color difference between C and C1' can be calculated and output, so that the user obtains an objective measure of the color calibration result.
When several original colors are involved in the color calibration comparison result, the color difference between each original color and its corresponding acquisition color can be calculated, yielding several color difference values that are averaged to obtain a first average color difference. Similarly, the color difference between each original color and the acquisition color corresponding to its calibration color can be calculated, yielding several color difference values that are averaged to obtain a second average color difference. Finally, the two average color differences are output. For example, in Fig. 2B, for the 64 original colors on the left side, 64 color difference values are obtained based on the color difference between each original color and its corresponding acquisition color, and averaged to give a first average color difference of 0.039 (in practical applications, the color values may be normalized if necessary); similarly, a similar calculation on the right side gives a second average color difference of 0.002.
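The disclosure does not fix the color-difference formula; as a simple sketch, the mean Euclidean distance between normalized RGB values is used below, and a perceptual metric such as CIEDE2000 could be substituted.

```python
import numpy as np

def average_color_difference(reference: np.ndarray, measured: np.ndarray) -> float:
    """Mean Euclidean distance between two sets of normalized RGB colors of shape (m, 3)."""
    return float(np.linalg.norm(reference - measured, axis=1).mean())

# originals:       the 64 original colors, shape (64, 3), values in [0, 1]
# before_acquired: acquisition colors of the uncalibrated original colors
# after_acquired:  acquisition colors of the calibrated colors
# first_avg  = average_color_difference(originals, before_acquired)   # e.g. about 0.039
# second_avg = average_color_difference(originals, after_acquired)    # e.g. about 0.002
```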
In this embodiment, when the color calibration comparison result is output, the two color differences are also output to the user, so that the user can intuitively perceive the calibration effect of the color calibration mapping relationship.
Fig. 2D is a schematic diagram of a scenario of establishing a color calibration mapping relationship according to an exemplary embodiment of the present disclosure. In this embodiment, the master control end obtains the acquisition signal of the shooting device through a transmission line such as SDI, which avoids manually copying the acquired data via an SD card, simplifies the acquisition process, and improves the efficiency of color calibration. Second, this solution parses the acquisition signal of the shooting device directly into acquisition colors in the target color space used by the rendering engine, within the software on the main control computer (i.e., the master control end in this embodiment), generates the calibration LUT from the set of acquisition colors and the set of rendering colors, and outputs the LUT directly from the master control end to the rendering engine on the on-screen machine. This avoids the problems of using third-party color-grading software, improves the efficiency of color calibration, and increases flexibility of use. Third, by outputting the comparison results before and after calibration, the solution provides a subjective evaluation criterion for the color calibration effect; and by outputting the average color differences between the rendering color values and the acquisition color values before and after calibration, it provides an objective evaluation criterion for the color calibration effect.
Corresponding to the foregoing embodiments of the method for establishing a color calibration mapping relationship, the present disclosure further provides embodiments of a device for establishing a color calibration mapping relationship and a computer to which the device is applied.
The embodiments of the apparatus for establishing a color calibration mapping relationship in this specification can be applied to computer devices, such as servers or terminal devices. The apparatus embodiments may be implemented by software, or by hardware or a combination of hardware and software. Taking software implementation as an example, the apparatus in a logical sense is formed by the processor of the device where it is located reading the corresponding computer program instructions from a non-volatile memory into memory and running them. In terms of hardware, Fig. 3 is a hardware structure diagram of the computer device where the apparatus for establishing a color calibration mapping relationship in this specification is located. In addition to the processor 310, memory 330, network interface 320 and non-volatile memory 340 shown in Fig. 3, the computer device where the apparatus 331 for establishing a color calibration mapping relationship is located may generally include other hardware according to the actual functions of the computer device, which is not described here again.
As shown in fig. 4, fig. 4 is a block diagram of a device for establishing a color calibration mapping relationship, which is applied to a master terminal in a virtual photographing system and maintains at least one conversion rule, each of which is used to convert an acquisition signal in a specific format into an image in a specific color space, according to an exemplary embodiment of the present disclosure; the virtual shooting system further comprises a rendering engine, a display screen and shooting equipment, and the device comprises:
a transmitting module 41, configured to acquire one or more original colors and transmit the one or more original colors to the rendering engine, wherein the rendering engine renders the one or more original colors into one or more rendered images based on a target color space used by the rendering engine, and the display screen displays each of the rendered images;
a receiving module 42 for: receiving acquisition signals transmitted by the shooting equipment in the process of shooting the display screen to display each rendered image;
a query module 43 for: acquiring an acquisition format of an acquisition signal transmitted by the shooting equipment and acquiring a target color space used by the rendering engine, and further inquiring a target conversion rule adapting to the acquisition format and the target color space from the maintained at least one conversion rule;
a conversion module 44 for: based on the target conversion rule, converting the acquisition signals transmitted by the shooting equipment into acquisition images corresponding to each rendering image in the target color space;
a setup module 45 for: and acquiring the acquired color corresponding to each original color from the acquired image obtained through conversion so as to further establish a color calibration mapping relation.
In some examples, the main control end provides a format configuration interface for configuring the format of the acquisition signal;
the acquiring the acquisition format of the acquisition signal transmitted by the shooting device includes:
in response to receiving a call request for the format configuration interface initiated by a user, acquiring the acquisition-signal format of the shooting device configured by the user, where the format is contained in the call request.
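As a sketch of how the maintained conversion rules and the format configuration interface could fit together (the disclosure does not prescribe a data structure; the dictionary keying, the sample format strings and the simplified gamma conversion below are assumptions):

```python
from typing import Callable, Dict, Tuple
import numpy as np

ConversionRule = Callable[[np.ndarray], np.ndarray]

def _rec709_to_linear(frame: np.ndarray) -> np.ndarray:
    # Simplified stand-in rule: undo a pure 2.4 gamma to reach a linear
    # working space (not the disclosed rule set).
    return np.clip(frame, 0.0, 1.0) ** 2.4

# Hypothetical registry maintained on the main control end, keyed by
# (acquisition format, target color space).
CONVERSION_RULES: Dict[Tuple[str, str], ConversionRule] = {
    ("SDI_Rec709_10bit", "linear_Rec709"): _rec709_to_linear,
}

_configured_format = {"value": "SDI_Rec709_10bit"}

def configure_acquisition_format(call_request: dict) -> None:
    """Format configuration interface: the user-initiated call request carries
    the acquisition-signal format of the shooting device."""
    _configured_format["value"] = call_request["acquisition_format"]

def lookup_target_rule(target_color_space: str) -> ConversionRule:
    """Query the target conversion rule adapted to the configured acquisition
    format and the color space used by the rendering engine."""
    return CONVERSION_RULES[(_configured_format["value"], target_color_space)]
```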
In some examples, the apparatus further comprises an output module for:
after the color calibration mapping relationship is established, outputting a color calibration comparison result to the user; where the color calibration comparison result comprises: a comparison between the original color and the acquired color corresponding to the original color, and a comparison between the original color and the acquired color corresponding to the calibration color; the calibration color is a color obtained by calibrating the original color through the color calibration mapping relationship.
In some examples, the color calibration comparison result further includes:
a color difference between the original color and the acquired color corresponding to the original color; and a color difference between the original color and the acquired color corresponding to the calibration color.
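The objective criterion above can be pictured with a short sketch that averages the per-color differences before and after calibration; Euclidean distance in the working color space is used here only as a stand-in, since the disclosure does not name a specific color-difference formula, and the numbers are toy values for illustration.

```python
import numpy as np

def average_color_difference(original_colors, acquired_colors):
    """Mean per-color difference; Euclidean distance is an assumed metric."""
    o = np.asarray(original_colors, dtype=np.float64)
    a = np.asarray(acquired_colors, dtype=np.float64)
    return float(np.linalg.norm(o - a, axis=1).mean())

# Toy values: acquired colors drift before calibration and sit closer to the
# originals once the calibration mapping is applied by the rendering engine.
originals       = [(0.18, 0.18, 0.18), (0.90, 0.10, 0.10), (0.10, 0.10, 0.90)]
acquired_before = [(0.21, 0.17, 0.16), (0.84, 0.14, 0.12), (0.12, 0.13, 0.82)]
acquired_after  = [(0.19, 0.18, 0.18), (0.89, 0.10, 0.11), (0.10, 0.11, 0.89)]

print("before:", round(average_color_difference(originals, acquired_before), 4))
print("after: ", round(average_color_difference(originals, acquired_after), 4))
```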
In some examples, the main control end runs on a main control computer, the main control computer is provided with a video capture card, and the video capture card is connected with the shooting device through a transmission line;
the receiving the acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images includes:
receiving, through the video capture card and the transmission line, the acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images.
In some examples, the video capture card comprises an SDI video capture card and the transmission line comprises an SDI transmission line.
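As an illustration of pulling the acquisition signal in through a capture card, the sketch below reads frames from a card exposed as a video device via OpenCV; the device index and the choice of OpenCV are assumptions, and any capture SDK that delivers frames while the screen shows each rendered image would serve the same role.

```python
import cv2

def read_frames_from_capture_card(device_index=0, frame_count=120):
    """Read a batch of frames from a capture card exposed as a video device
    (e.g. an SDI card whose driver registers it as such -- an assumption)."""
    cap = cv2.VideoCapture(device_index)
    frames = []
    try:
        while len(frames) < frame_count:
            ok, frame = cap.read()
            if not ok:
                break
            frames.append(frame)
    finally:
        cap.release()
    return frames
```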
In some examples, the display screen displays each of the rendered images for a set duration; the acquisition signal is a video acquisition signal;
the converting, based on the target conversion rule, the acquired signal transmitted by the photographing device into an acquired image corresponding to each of the rendered images in the target color space, includes:
acquiring, based on the set duration, a video frame image corresponding to each of the rendered images from the video acquisition signal transmitted by the shooting device;
and converting, based on the target conversion rule, each acquired video frame image into an acquisition image corresponding to each of the rendered images in the target color space.
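The frame-selection step can be sketched as follows: knowing the set display duration of each rendered image and the capture frame rate, pick the frame in the middle of each display window and push it through the queried conversion rule. The center-sampling choice, the 8-bit normalization and the parameter names are assumptions.

```python
import numpy as np

def acquisition_images_from_video(frames, fps, display_duration, num_images,
                                  conversion_rule):
    """Select one representative captured frame per rendered image and convert
    it into the target color space using the queried conversion rule."""
    frames_per_image = int(round(fps * display_duration))
    images = []
    for i in range(num_images):
        # Middle frame of the i-th display window, to avoid transition frames
        # where the screen is switching between two rendered images.
        idx = i * frames_per_image + frames_per_image // 2
        frame = np.asarray(frames[idx], dtype=np.float64) / 255.0
        images.append(conversion_rule(frame))
    return images
```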
In some examples, the apparatus further comprises a transmitting module for:
and sending the established color calibration mapping relation to the rendering engine so that the rendering engine can locally complete the configuration of the received color calibration mapping relation.
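Finally, the established mapping has to reach the rendering engine so it can be configured locally. The disclosure does not fix a file format or transport; the .cube-style serialization below is a common LUT convention used here purely as an assumption, paired with a LUT array such as the one produced by the earlier bake_lut sketch.

```python
def serialize_lut_as_cube(lut) -> str:
    """Serialize a size x size x size x 3 LUT (indexed [r, g, b]) into
    .cube-style text; red varies fastest, the usual .cube ordering. Whether
    the target rendering engine loads this exact format is an assumption."""
    size = lut.shape[0]
    lines = [f"LUT_3D_SIZE {size}"]
    for b in range(size):
        for g in range(size):
            for r in range(size):
                rr, gg, bb = lut[r, g, b]
                lines.append(f"{rr:.6f} {gg:.6f} {bb:.6f}")
    return "\n".join(lines)

# The serialized text could then be written to a file or pushed to the
# rendering engine over whatever channel the system already uses.
```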
For the implementation process of the functions and roles of each module in the above device for establishing a color calibration mapping relationship, reference may be made to the implementation process of the corresponding steps in the method for establishing a color calibration mapping relationship, which will not be described again here.
Correspondingly, the embodiments of this specification further provide a virtual shooting system, which comprises a main control computer, a rendering device, a display screen and a shooting device; the main control computer runs a main control program, and the main control program, when executed by a processor, implements the steps of the foregoing method embodiments for establishing a color calibration mapping relationship.
Accordingly, the embodiments of the present specification further provide a computer program product, which includes a computer program, where the computer program when executed by a processor implements the steps of the foregoing method embodiment for setting up a color calibration mapping relationship.
Correspondingly, the embodiments of this specification further provide a computer device, which comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the program, implements the steps of the foregoing method embodiments for establishing a color calibration mapping relationship.
Accordingly, the present embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method embodiments of establishing a color calibration mapping relationship.
Since the device embodiments essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant points. The device embodiments described above are merely illustrative: the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, that is, they may be located in one place or distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions in this specification. Those of ordinary skill in the art can understand and implement them without creative effort.
The above-described embodiments may be applied to one or more computer devices. A computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA), digital signal processors (Digital Signal Processor, DSP), embedded devices, and the like.
The computer device may be any electronic product capable of human-computer interaction with a user, such as a personal computer, a tablet computer, a smart phone, a personal digital assistant (Personal Digital Assistant, PDA), a game console, an interactive internet protocol television (Internet Protocol Television, IPTV), a smart wearable device, and the like.
The computer device may also include a network device and/or a user device. The network device includes, but is not limited to, a single network server, a server group composed of a plurality of network servers, or a cloud composed of a large number of hosts or network servers based on cloud computing (Cloud Computing).
The network in which the computer device is located includes, but is not limited to, the internet, a wide area network, a metropolitan area network, a local area network, a virtual private network (Virtual Private Network, VPN), and the like.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The above division of the method steps is made only for clarity of description; when implemented, the steps may be combined into one step or a single step may be split into multiple steps, and as long as the same logical relationship is preserved, such implementations fall within the protection scope of this patent. Adding insignificant modifications to, or introducing insignificant designs into, the algorithm or flow, without changing the core design of the algorithm and flow, also falls within the protection scope of this application.
In this specification, reference to the description of the terms "a specific example", "some examples", and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this specification. The schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Other embodiments of the present description will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This specification is intended to cover any variations, uses, or adaptations of the specification following, in general, the principles of the specification and including such departures from the present disclosure as come within known or customary practice within the art to which the specification pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the specification being indicated by the following claims.
It is to be understood that the present description is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present description is limited only by the appended claims.
The foregoing description of the preferred embodiments is provided for the purpose of illustration only, and is not intended to limit the scope of the disclosure, since any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the disclosure are intended to be included within the scope of the disclosure.

Claims (12)

1. A method for establishing a color calibration mapping relationship, applied to a main control end in a virtual shooting system, wherein the main control end maintains at least one conversion rule, and each conversion rule is used for converting an acquisition signal in a specific format into an image in a specific color space; the virtual shooting system further comprises a rendering engine, a display screen and a shooting device; the method comprises:
acquiring one or more original colors and sending the one or more original colors to the rendering engine, the rendering engine rendering the one or more original colors into one or more rendered images based on a target color space used by the rendering engine, and the display screen displaying each of the rendered images;
receiving an acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images;
acquiring an acquisition format of the acquisition signal transmitted by the shooting device and the target color space used by the rendering engine, and querying, from the maintained at least one conversion rule, a target conversion rule adapted to the acquisition format and the target color space;
converting, based on the target conversion rule, the acquisition signal transmitted by the shooting device into an acquisition image corresponding to each of the rendered images in the target color space;
and acquiring the acquired color corresponding to each original color from the acquisition images obtained through conversion, so as to establish a color calibration mapping relationship.
2. The method of claim 1, wherein the main control end provides a format configuration interface for configuring the format of the acquisition signal;
the acquiring the acquisition format of the acquisition signal transmitted by the shooting device comprises:
in response to receiving a call request for the format configuration interface initiated by a user, acquiring the acquisition-signal format of the shooting device configured by the user, the format being contained in the call request.
3. The method of claim 1, further comprising:
after the color calibration mapping relationship is established, outputting a color calibration comparison result to a user; wherein the color calibration comparison result comprises: a comparison between the original color and the acquired color corresponding to the original color, and a comparison between the original color and the acquired color corresponding to the calibration color; the calibration color is a color obtained by calibrating the original color through the color calibration mapping relationship.
4. The method of claim 3, wherein the color calibration comparison result further comprises:
a color difference between the original color and the acquired color corresponding to the original color; and a color difference between the original color and the acquired color corresponding to the calibration color.
5. The method of claim 1, wherein the main control end runs on a main control computer, the main control computer is provided with a video capture card, and the video capture card is connected with the shooting device through a transmission line;
the receiving the acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images comprises:
receiving, through the video capture card and the transmission line, the acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images.
6. The method of claim 5, the video capture card comprising an SDI video capture card, the transmission line comprising an SDI transmission line.
7. The method of claim 1, wherein the display screen displays each of the rendered images for a set duration, and the acquisition signal is a video acquisition signal;
the converting, based on the target conversion rule, the acquisition signal transmitted by the shooting device into an acquisition image corresponding to each of the rendered images in the target color space comprises:
acquiring, based on the display duration of each rendered image, a video frame image corresponding to each of the rendered images from the video acquisition signal transmitted by the shooting device;
and converting, based on the target conversion rule, each acquired video frame image into an acquisition image corresponding to each of the rendered images in the target color space.
8. The method of claim 1, the method further comprising:
and sending the established color calibration mapping relation to the rendering engine so that the rendering engine can locally complete the configuration of the received color calibration mapping relation.
9. A device for establishing a color calibration mapping relationship, applied to a main control end in a virtual shooting system, wherein the main control end maintains at least one conversion rule, and each conversion rule is used for converting an acquisition signal in a specific format into an image in a specific color space; the virtual shooting system further comprises a rendering engine, a display screen and a shooting device; the device comprises:
a sending module, configured to acquire one or more original colors and send the one or more original colors to the rendering engine, the rendering engine rendering the one or more original colors into one or more rendered images based on a target color space used by the rendering engine, and the display screen displaying each of the rendered images;
a receiving module, configured to receive an acquisition signal transmitted by the shooting device in the process of shooting the display screen displaying each of the rendered images;
a query module, configured to acquire an acquisition format of the acquisition signal transmitted by the shooting device and the target color space used by the rendering engine, and query, from the maintained at least one conversion rule, a target conversion rule adapted to the acquisition format and the target color space;
a conversion module, configured to convert, based on the target conversion rule, the acquisition signal transmitted by the shooting device into an acquisition image corresponding to each of the rendered images in the target color space;
and an establishing module, configured to acquire the acquired color corresponding to each original color from the acquisition images obtained through conversion, so as to establish a color calibration mapping relationship.
10. A virtual shooting system, comprising a main control computer, a rendering device, a display screen and a shooting device; wherein the main control computer runs a main control program, and the main control program, when executed by a processor, implements the steps of the method of any one of claims 1 to 8.
11. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any of claims 1 to 8 when the computer program is executed.
12. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of claims 1 to 8.
CN202311395095.7A 2023-10-25 2023-10-25 Method for establishing color calibration mapping relation, virtual shooting system and related device Pending CN117478860A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311395095.7A CN117478860A (en) 2023-10-25 2023-10-25 Method for establishing color calibration mapping relation, virtual shooting system and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311395095.7A CN117478860A (en) 2023-10-25 2023-10-25 Method for establishing color calibration mapping relation, virtual shooting system and related device

Publications (1)

Publication Number Publication Date
CN117478860A true CN117478860A (en) 2024-01-30

Family

ID=89624930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311395095.7A Pending CN117478860A (en) 2023-10-25 2023-10-25 Method for establishing color calibration mapping relation, virtual shooting system and related device

Country Status (1)

Country Link
CN (1) CN117478860A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013122663A (en) * 2011-12-09 2013-06-20 Sharp Corp Display system, calibration method, computer program, and recording medium
US20200413018A1 (en) * 2018-03-12 2020-12-31 Hewlett-Packard Development Company, L.P. Generating a color mapping
CN116489328A (en) * 2022-11-22 2023-07-25 腾讯科技(深圳)有限公司 Shooting parameter-based color lookup table generation method and device and computer equipment
CN116485979A (en) * 2023-04-28 2023-07-25 北京优酷科技有限公司 Mapping relation calculation method, color calibration method and electronic equipment
CN116540963A (en) * 2023-04-21 2023-08-04 北京优酷科技有限公司 Mapping relation calculation method, color calibration method, device and electronic equipment
CN116800941A (en) * 2023-07-20 2023-09-22 神力视界(深圳)文化科技有限公司 Color calibration method, device and storage medium


Similar Documents

Publication Publication Date Title
JP7150127B2 (en) Video signal processing method and apparatus
US8860785B2 (en) Stereo 3D video support in computing devices
US11032579B2 (en) Method and a device for encoding a high dynamic range picture, corresponding decoding method and decoding device
WO2021004176A1 (en) Image processing method and apparatus
KR101680254B1 (en) Method of calibration of a target color reproduction device
WO2023016035A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023016037A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2021073304A1 (en) Image processing method and apparatus
CN116489328A (en) Shooting parameter-based color lookup table generation method and device and computer equipment
US20160286090A1 (en) Image processing method, image processing apparatus, and image processing program
CN117478860A (en) Method for establishing color calibration mapping relation, virtual shooting system and related device
WO2023016040A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN112309312A (en) Image display method and device, receiving card, sending card and LED display system
TWI543598B (en) Television system
KR102489894B1 (en) Method and device for determining a characteristic of a display device
CN117499559A (en) Virtual shooting system, device configuration method, device, equipment and storage medium
TWI837469B (en) Method and device for removeing the background
US20240037795A1 (en) Color calibration method and apparatus, computer device, and computer-readable storage medium
CN114449244B (en) Image quality adjusting method and device
Sharma Exploring the basic concepts of HDR: dynamic range, gamma curves, and wide color gamut
WO2023016041A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN117478861A (en) Method for establishing color calibration mapping relation, virtual shooting system and related device
CN116501273A (en) Picture display method, device, computer equipment and storage medium
Bae et al. Study on HDR/WCG Service Model for UHD Service
WO2024020356A1 (en) Multiple-intent composite image encoding and rendering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination