CN117560577A - Virtual-real material color alignment method, device, equipment and storage medium - Google Patents


Info

Publication number
CN117560577A
CN117560577A (application CN202311406175.8A)
Authority
CN
China
Prior art keywords
color
real
virtual
scene object
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311406175.8A
Other languages
Chinese (zh)
Inventor
韩洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenli Vision Shenzhen Cultural Technology Co ltd
Original Assignee
Shenli Vision Shenzhen Cultural Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenli Vision Shenzhen Cultural Technology Co ltd
Priority: CN202311406175.8A
Publication: CN117560577A
Legal status: Pending

Classifications

    • H04N 23/84: Camera processing pipelines; components thereof for processing colour signals
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object
    • G06F 3/04817: GUI interaction techniques using icons
    • G06F 3/04845: GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T 11/001: 2D image generation: texturing; colouring; generation of texture or colour
    • G06V 20/60: Scenes; scene-specific elements: type of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a virtual-real material color alignment method, apparatus, device, and storage medium. The method includes: an electronic device, in response to an interactive operation by a user, acquires and displays a real-time shooting picture; in the real-time shooting picture, it determines a virtual scene object and a real scene object to be aligned; and it performs color adjustment on the virtual scene object based on a first color of the real scene object. In the present application, the electronic device displays the real-time shooting picture of the camera based on the user's interactive operation, determines the virtual scene object and the real scene object to be aligned in that picture, and then adjusts the color of the virtual scene object according to the first color of the real scene object, so that the virtual scene object and the real scene object remain consistent in appearance. In this way, the electronic device automatically aligns the material colors of the real scene and the virtual scene in the virtual shooting picture without requiring the user to adjust colors manually, which improves the efficiency of color adjustment, saves the user's time, and also improves the accuracy of color alignment.

Description

Virtual-real material color alignment method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for color alignment of virtual and real materials.
Background
With the continuous development of the internet, artificial-intelligence algorithms, and related technologies, virtual shooting is being applied more and more widely. Virtual shooting refers to displaying a virtual scene on a display screen, such as a light-emitting diode (LED) wall, at the shooting site, and directly shooting the fused picture.
To keep the virtual scene consistent with the real scene, in the related art a user generally has to manually adjust the colors (e.g., the picture rendering) of the virtual-scene assets. Such manual adjustment is inefficient, consumes a great deal of the user's time, and, because it is based only on manual experience, is not very accurate.
Disclosure of Invention
Various aspects of the present application provide a method, apparatus, device, and storage medium for color alignment of virtual and real materials, which are used to improve the efficiency of color adjustment, save the time of the user, and improve the accuracy of color adjustment.
In a first aspect, an embodiment of the present application provides a method for color alignment of virtual and real materials, including:
responding to the interactive operation of the user, and acquiring and displaying a real-time shooting picture;
in the real-time shooting picture, determining a virtual scene object and a real scene object to be aligned;
and performing color adjustment on the virtual scene object based on the first color of the real scene object.
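The three claimed steps can be summarized as a minimal Python sketch; all names here, such as `SceneObject` and `align_colors`, are hypothetical illustrations and not part of the claimed implementation:

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    """Minimal stand-in for an object detected in the real-time shooting picture."""
    name: str
    color: tuple       # RGB color, each channel in 0-255
    is_virtual: bool   # True if rendered on the LED wall, False if physical

def align_colors(real_obj: SceneObject, virtual_obj: SceneObject) -> SceneObject:
    """Set the virtual object's color to the real object's first color."""
    virtual_obj.color = real_obj.color
    return virtual_obj

# Hypothetical pair: physical sand on set and a virtual desert backdrop
sand = SceneObject("sand", (194, 178, 128), is_virtual=False)
desert = SceneObject("desert", (210, 160, 100), is_virtual=True)
align_colors(sand, desert)
print(desert.color)  # (194, 178, 128)
```

In practice the adjustment would go through the rendering engine rather than a plain attribute assignment, but the data flow, real color in, virtual color out, is the same.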
In one possible implementation manner, the acquiring and displaying the real-time shooting picture in response to the interactive operation of the user includes:
responding to the starting operation of a user, and displaying a preset color alignment control;
and responding to the touch operation of the user on the preset color alignment control, and acquiring and displaying a real-time shooting picture.
In a possible implementation manner, the determining, in the real-time shooting picture, the virtual scene object and the real scene object to be aligned includes:
in the real-time shooting picture, responding to a selection operation of a user, and determining an object to be aligned corresponding to the selection operation;
and determining virtual scene objects and real scene objects in the objects to be aligned.
In a possible implementation manner, the determining, in the real-time shooting picture, the virtual scene object and the real scene object to be aligned includes:
in the real-time shooting picture, determining an object to be aligned in the real-time shooting picture according to a preset image recognition algorithm;
and determining virtual scene objects and real scene objects in the objects to be aligned.
In a possible implementation manner, the determining the virtual scene object and the real scene object in the objects to be aligned includes:
responding to a labeling operation of a user, and determining a virtual scene object or a real scene object in the objects to be aligned according to the labeling operation; or,
modifying a preset attribute of the virtual assets; determining an object in the objects to be aligned whose preset attribute changed as a virtual scene object, and an object whose preset attribute did not change as a real scene object; and then recovering the preset attribute of the virtual scene object.
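The attribute-toggling alternative above can be sketched as follows. The callbacks `sample_color`, `set_attribute`, and `restore_attribute` are hypothetical stand-ins for the plug-in's frame sampling and UE asset manipulation:

```python
def classify_by_attribute_toggle(objects, sample_color, set_attribute, restore_attribute):
    """Split objects to be aligned into virtual and real scene objects by
    briefly modifying a preset attribute of every asset on the LED wall:
    only objects rendered on the wall change in the camera picture."""
    before = {obj: sample_color(obj) for obj in objects}
    set_attribute()                       # e.g. apply a hue offset to all UE assets
    after = {obj: sample_color(obj) for obj in objects}
    restore_attribute()                   # recover the preset attribute afterwards

    virtual = [o for o in objects if before[o] != after[o]]   # changed -> virtual
    real = [o for o in objects if before[o] == after[o]]      # unchanged -> real
    return virtual, real

# Hypothetical usage with a simulated frame: "desert" is on the LED wall,
# so its sampled color shifts while the attribute is toggled.
state = {"toggled": False}
colors = {"desert": (210, 160, 100), "sand": (194, 178, 128)}

def sample_color(obj):
    r, g, b = colors[obj]
    if state["toggled"] and obj == "desert":   # only the virtual object reacts
        return (r + 30, g, b)
    return (r, g, b)

virtual, real = classify_by_attribute_toggle(
    ["desert", "sand"], sample_color,
    set_attribute=lambda: state.update(toggled=True),
    restore_attribute=lambda: state.update(toggled=False),
)
print(virtual, real)  # ['desert'] ['sand']
```

A real implementation would compare per-pixel regions with some tolerance rather than exact tuples, since camera noise makes exact equality unreliable.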
In a possible implementation manner, the color adjustment of the virtual scene object based on the first color of the real scene object includes:
setting the virtual scene color of the virtual scene object as a first color value corresponding to the first color; or,
adding a color layer corresponding to the virtual scene object; the color value of the color layer is the first color value.
In one possible embodiment, the method further comprises:
acquiring a second color of the adjusted virtual scene object;
determining a color difference of the first color and the second color;
under the condition that the color difference is smaller than a preset threshold value, determining the virtual scene object of the second color as a target virtual scene object;
and under the condition that the color difference is larger than or equal to a preset threshold value, continuing to perform color adjustment on the virtual scene object until the color difference is smaller than the preset threshold value.
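A minimal sketch of the threshold-driven re-adjustment loop above, assuming Euclidean RGB distance as the color-difference metric (the text does not specify a metric) and a hypothetical renderer callback:

```python
def color_difference(c1, c2):
    """Euclidean distance in RGB space, one simple choice of difference metric."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def adjust_until_aligned(first_color, get_second_color, adjust,
                         threshold=5.0, max_iters=20):
    """Re-adjust the virtual scene object until the difference between the real
    object's first color and the rendered second color is below the preset
    threshold (rendering may not reproduce the set value exactly)."""
    for _ in range(max_iters):
        second = get_second_color()
        if color_difference(first_color, second) < threshold:
            return second            # target virtual scene object reached
        adjust(first_color)          # nudge the rendering toward first_color
    return get_second_color()

# Hypothetical renderer that only moves halfway to the requested color each pass
rendered = [(210.0, 160.0, 100.0)]

def get_second_color():
    return rendered[0]

def adjust(target):
    rendered[0] = tuple((c + t) / 2 for c, t in zip(rendered[0], target))

final = adjust_until_aligned((194, 178, 128), get_second_color, adjust)
print(color_difference((194, 178, 128), final) < 5.0)  # True
```

The `max_iters` cap is an added safety assumption so the loop terminates even if the renderer cannot converge below the threshold.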
In a second aspect, an embodiment of the present application provides a color alignment device for virtual and real materials, including:
the display module is used for responding to the interactive operation of the user and acquiring and displaying a real-time shooting picture;
the determining module is used for determining virtual scene objects and real scene objects to be aligned in the real-time shooting picture;
and the adjustment module is used for carrying out color adjustment on the virtual scene object based on the first color of the real scene object.
In a possible implementation manner, the display module is specifically configured to:
responding to the starting operation of a user, and displaying a preset color alignment control;
and responding to the touch operation of the user on the preset color alignment control, and acquiring and displaying a real-time shooting picture.
In a possible implementation manner, the determining module is specifically configured to:
in the real-time shooting picture, responding to a selection operation of a user, and determining an object to be aligned corresponding to the selection operation;
and determining virtual scene objects and real scene objects in the objects to be aligned.
In a possible implementation manner, the determining module is specifically configured to:
in the real-time shooting picture, determining an object to be aligned in the real-time shooting picture according to a preset image recognition algorithm;
and determining virtual scene objects and real scene objects in the objects to be aligned.
In a possible implementation manner, the determining module is specifically configured to:
responding to a labeling operation of a user, and determining a virtual scene object or a real scene object in the objects to be aligned according to the labeling operation; or,
modifying a preset attribute of the virtual assets; determining an object in the objects to be aligned whose preset attribute changed as a virtual scene object, and an object whose preset attribute did not change as a real scene object; and then recovering the preset attribute of the virtual scene object.
In a possible embodiment, the adjusting module is specifically configured to:
setting the virtual scene color of the virtual scene object as a first color value corresponding to the first color; or,
adding a color layer corresponding to the virtual scene object; the color value of the color layer is the first color value.
In one possible embodiment, the apparatus is further for:
acquiring a second color of the adjusted virtual scene object;
determining a color difference of the first color and the second color;
under the condition that the color difference is smaller than a preset threshold value, determining the virtual scene object of the second color as a target virtual scene object;
and under the condition that the color difference is larger than or equal to a preset threshold value, continuing to perform color adjustment on the virtual scene object until the color difference is smaller than the preset threshold value.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory and a processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory, such that the processor executes the method for color alignment of virtual and real materials according to any one of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein computer-executable instructions for implementing the virtual-to-real texture color alignment method of any one of the first aspects when the computer-executable instructions are executed by a processor.
In a fifth aspect, embodiments of the present application provide a computer program product, including a computer program, where the computer program when executed by a processor implements the method for color alignment of virtual and real materials according to any one of the first aspect.
In the embodiment of the present application, the electronic device acquires and displays a real-time shooting picture in response to an interactive operation by the user; in the real-time shooting picture, it determines a virtual scene object and a real scene object to be aligned; and it performs color adjustment on the virtual scene object based on a first color of the real scene object. In the present application, the electronic device displays the real-time shooting picture of the camera based on the user's interactive operation, determines the virtual scene object and the real scene object to be aligned in that picture, and then adjusts the color of the virtual scene object according to the first color of the real scene object, so that the virtual scene object and the real scene object remain consistent in appearance. In this way, the electronic device automatically aligns the material colors of the real scene and the virtual scene in the virtual shooting picture without requiring the user to adjust colors manually, which improves the efficiency of color adjustment, saves the user's time, and also improves the accuracy of color alignment.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a schematic diagram of an application scenario provided in an exemplary embodiment of the present application;
fig. 2 is a flowchart of a virtual-to-real material color alignment method according to an exemplary embodiment of the present application;
fig. 3 is a flowchart illustrating another color alignment method for virtual and real materials according to an exemplary embodiment of the present application;
fig. 4 is a schematic structural diagram of a color alignment device for virtual and real materials according to an exemplary embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
For the purposes, technical solutions, and advantages of the present application to be clearer, the technical solutions of the present application will be described clearly and completely below with reference to specific embodiments of the present application and the corresponding drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art without creative effort based on the present disclosure fall within the scope of the present disclosure. Any user information (including but not limited to user equipment information and user personal information) and data (including but not limited to data for analysis, stored data, and presented data) referred to herein are information and data authorized by the user or fully authorized by all parties; the collection, use, and processing of such data must comply with the relevant laws, regulations, and standards of the relevant countries and regions, and corresponding operation portals are provided for the user to grant or deny authorization.
Conventional film and television production usually uses a green screen: the green screen serves as the background during shooting, and virtual scenes built in Unreal Engine (UE) are composited into the shot picture in post-production, i.e., the green screen is replaced by the virtual scene. With the continuous development of internet technology, artificial intelligence, and related technologies, virtual shooting is being applied more and more widely. Virtual shooting refers to placing a screen (such as an LED wall) on site and playing the asset materials of the UE virtual scene on it, so that the fused picture can be viewed and shot directly in real time.
To keep the virtual scene consistent in look and feel with the real scene, in the related art the production crew (i.e., the user) generally has to adjust colors, such as the picture rendering, of the virtual scene manually. This manual adjustment is inefficient, consumes a great deal of the user's time, and, because it is based on manual experience or subjective feeling, is not very accurate.
To solve these problems, the electronic device in the present application displays the real-time shooting picture of the camera based on the user's interactive operation, determines the virtual scene object and the real scene object to be aligned in the picture, and then adjusts the color of the virtual scene object according to the first color of the real scene object so that the two remain consistent in appearance. In this way, the electronic device automatically processes the UE assets and modifies the UE rendering according to the picture shot by the camera, aligning the colors of the virtual and real objects in the shot picture so that their look and feel are consistent; this improves the efficiency of color adjustment, saves the user's time, and also improves the accuracy of color alignment.
Fig. 1 is a schematic diagram of an application scenario provided in an exemplary embodiment of the present application. As shown in fig. 1, the scenario includes a user 101 and an electronic device 102. The electronic device 102 may be a mobile phone, a computer, or the like. In the related art, when performing virtual shooting, the user 101 usually adjusts the material colors of the virtual scene manually; this method of color adjustment is inefficient, consumes a lot of the user's time, and is not very accurate.
In the embodiment of the application, the electronic device 102 responds to the interactive operation of the user, acquires the real-time shooting picture, determines the real-scene object and the virtual scene object to be aligned in the real-time shooting picture, and then can perform color adjustment on the virtual scene object based on the first color of the real scene object, so that automatic alignment of the virtual scene color and the real scene color in the virtual shooting scene can be realized, the user is not required to manually perform color adjustment, the adjustment efficiency is higher, the time of the user is saved, and the accuracy of the color adjustment is also higher.
The technical scheme shown in the application is described in detail through specific embodiments. It should be noted that the following embodiments may exist alone or in combination with each other, and for the same or similar content, the description will not be repeated in different embodiments.
Fig. 2 is a flowchart of a virtual-to-real material color alignment method according to an exemplary embodiment of the present application. Referring to fig. 2, the method for color alignment of virtual and real materials may include:
s201, responding to interactive operation of a user, and acquiring and displaying a real-time shooting picture.
The execution body of the embodiment of the application may be an electronic device, or may be a virtual-real material color alignment device disposed in the electronic device. The virtual and real material color alignment device can be realized by software or by the combination of software and hardware. For ease of understanding, hereinafter, an execution body will be described as an example of an electronic device.
In this embodiment of the present application, the interactive operation may refer to a virtual-real material color alignment start operation of a user, specifically may refer to a click operation and a sliding operation of the user in a display screen of an electronic device, or an input operation based on an input device such as a mouse and a keyboard, which is not limited by a specific type of the interactive operation. The real-time shooting picture can be a real-time image picture shot by a camera in the virtual shooting scene, and the real-time shooting picture can comprise the illusive scene in a display screen such as an LED and the like and also comprises the actual scene of the virtual shooting.
In this step, in the virtual shooting scene, the color alignment of virtual and real materials can be realized by a UE plug-in on the electronic device. The UE plug-in may be implemented in a programming language such as C or C++. A material in UE is a class that expresses an object's appearance attributes, typically including base color, metalness, roughness, and so on; the material file can be saved to disk and reused. When the user needs to perform virtual-real material color alignment, the UE plug-in may be started on the electronic device, for example by touching the plug-in's icon or by pressing a preset key (or key combination); the embodiment of the present application does not limit the starting mode. In response to the user's starting operation, the electronic device starts the UE plug-in, which displays a color alignment control; when the user touches the control, the virtual-real material color alignment process starts, and the electronic device accordingly acquires and displays the real-time shooting picture of the camera.
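The material description above (base color, metalness, roughness; saved to disk and reused) can be illustrated with a hypothetical Python data class. A real UE material is a C++/Blueprint asset, so this is only a schematic of the attributes involved:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Material:
    """Schematic of a material's basic appearance attributes."""
    base_color: tuple   # RGB base color
    metalness: float    # 0.0 dielectric .. 1.0 fully metallic
    roughness: float    # 0.0 mirror-like .. 1.0 fully diffuse

    def save(self, path):
        """Save the material to disk so it can be reused later."""
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path):
        """Load a previously saved material from disk."""
        with open(path) as f:
            d = json.load(f)
        return cls(tuple(d["base_color"]), d["metalness"], d["roughness"])
```

Color alignment then amounts to rewriting `base_color` (or layering over it) for the virtual object's material.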
S202, in a real-time shooting picture, determining a virtual scene object and a real scene object to be aligned.
In this embodiment, the virtual scene object and the real scene object to be aligned may refer to the same object that exists both in the virtual scene played on the LED display screen and in the actual scene. For example, the virtual scene played on the LED screen may contain a background such as a desert, while the actual set is usually dressed with a corresponding physical object such as sand. During virtual shooting, the desert in the virtual scene and the sand in the actual scene must look consistent, so their colors need to be aligned; the virtual-scene instance and the real-scene instance of such an object can therefore serve as the virtual scene object and the real scene object to be aligned.
Specifically, in this step the electronic device can automatically determine the virtual scene object and the real scene object to be aligned in the real-time shooting picture through a preset image recognition algorithm, which enables automatic selection of the objects to be aligned and improves the efficiency of color adjustment. Alternatively or additionally, the electronic device can determine the objects to be aligned in response to the user's selection operation and then identify the virtual scene object and the real scene object among them, which better matches the user's actual needs and simplifies the user's operation to some extent.
S203, performing color adjustment on the virtual scene object based on the first color of the real scene object.
In the embodiment of the present application, the first color may be an actual color in the real scene object, and may be specifically represented by a red, green, blue (RGB) color value or the like, and the specific representation form of the first color is not limited in the embodiment of the present application. After determining the virtual scene object and the real scene object to be aligned, in order to ensure that the appearance of the same object in the virtual scene is consistent, the electronic device may align the color of the virtual scene object with the color of the real scene object. Because the color of the real scene object is fixed, the electronic equipment can adjust the color of the virtual scene object, so that the adjusted color of the virtual scene object is consistent with the color of the real scene object.
Specifically, the electronic device may determine the first color of the real scene object, and perform color adjustment on the virtual scene object, where the specific adjustment mode may be to directly set the color of the virtual scene object as the first color, or add a light or a layer of the first color to the virtual scene object, so that the adjusted virtual scene object also presents the first color, and it is ensured that the color impression of the same object in the virtual scene is consistent.
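The two adjustment modes mentioned above, directly setting the color and adding a color layer, can be sketched as follows; the dictionary-based objects and the `opacity` parameter are illustrative assumptions:

```python
def set_object_color(virtual_obj, first_color):
    """Mode 1: directly set the virtual scene object's color to the
    first color value of the real scene object."""
    virtual_obj["color"] = first_color
    return virtual_obj

def add_color_layer(virtual_obj, first_color, opacity=1.0):
    """Mode 2: overlay a color layer whose color value is the first color;
    the displayed color is the layer blended over the original color."""
    virtual_obj.setdefault("layers", []).append(first_color)
    virtual_obj["displayed"] = tuple(
        round(opacity * lc + (1 - opacity) * oc)
        for lc, oc in zip(first_color, virtual_obj["color"])
    )
    return virtual_obj

desert = {"color": (210, 160, 100)}
add_color_layer(desert, (194, 178, 128), opacity=1.0)
print(desert["displayed"])  # (194, 178, 128)
```

Mode 2 leaves the original asset untouched, which makes the adjustment easy to revert; mode 1 is simpler but overwrites the asset's own color.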
Based on the above embodiments, fig. 3 is a schematic flow chart of another color alignment method for virtual and real materials according to an exemplary embodiment of the present application. Referring to fig. 3, the method for color alignment of virtual and real materials may include:
s301, responding to a starting operation of a user, and displaying a preset color alignment control; and responding to the touch operation of the user on the preset color alignment control, and acquiring and displaying a real-time shooting picture.
In this embodiment of the present application, the start operation may refer to an operation of a user to start a UE plug-in, and specifically may refer to an operation of a user to touch an icon of the UE plug-in, an operation of a user to press a physical key (or a physical key combination), an operation of a user to input voice, or the like. The preset color alignment control may refer to a trigger control for color alignment that is preset.
In the step, when a user needs to perform color alignment on virtual and real materials, a starting operation can be executed in the electronic equipment to start the UE plugin, and the electronic equipment responds to the starting operation of the user to start the UE plugin and display a preset color alignment control; the user can touch the preset color alignment control, the virtual and real material color alignment process is triggered, and the electronic equipment responds to the touch operation of the user on the preset color alignment control, and can acquire and display a real-time shooting picture of the camera. Of course, the triggering mode of color alignment of the virtual and real materials may be other modes, and other triggering controls may be correspondingly set.
S302, in a real-time shooting picture, responding to a selection operation of a user, and determining an object to be aligned corresponding to the selection operation; or in the real-time shooting picture, determining the object to be aligned in the real-time shooting picture according to a preset image recognition algorithm.
In the embodiment of the present application, the selection operation may refer to an interactive operation by which the user selects the object to be aligned, and may specifically be a pointing operation, a sliding operation, or an input operation based on an input device such as a mouse or keyboard. The preset image recognition algorithm may refer to a preset image recognition model used to recognize the same object in the real-time shooting picture, such as a convolutional neural network or another deep learning model.
The object to be aligned may refer to the same object that exists in both the virtual and real scenes and needs color alignment. For example, in a virtual shooting scene, a desert exists in the virtual background and sand also exists on the real set; because the colors of the two may not be consistent, the electronic device may select the desert in the virtual background and the sand in the real scene as objects to be aligned, so as to ensure continuity between the virtual and real scenes, and may then further determine the virtual scene object (i.e., the desert in the virtual background) and the real scene object (i.e., the sand in the real scene) among the objects to be aligned.
In this step, when determining the object to be aligned, the electronic device may do so based on a selection operation of the user, or automatically based on a preset image recognition algorithm. Specifically, in one possible implementation, the user performs a selection operation in the real-time shooting picture to select the object needing color alignment, and the electronic device determines, in response to that selection operation, the corresponding object to be aligned. In another possible implementation, the electronic device automatically determines the object to be aligned in the real-time shooting picture based on the preset image recognition algorithm, which can improve the efficiency of determining the object to be aligned. Of course, the electronic device may also combine the two modes: on the basis of automatic recognition by the preset image recognition algorithm, the user may make manual adjustments according to actual requirements, further improving flexibility; the embodiment of the present application does not limit this.
S303, determining virtual scene objects and real scene objects in the objects to be aligned.
In this embodiment of the present application, after determining the object to be aligned, the electronic device may further determine the virtual scene object and the real scene object among the objects to be aligned. The determination may be based on a labeling operation of the user, or the electronic device may determine them based on a characteristic of the virtual scene.
In one possible implementation, step S303 may be specifically implemented by:
and responding to the labeling operation of the user, and determining a virtual scene object or a real scene object in the objects to be aligned according to the labeling operation.
In this embodiment of the present application, the labeling operation may refer to an operation by which the user adds a virtual or real label to an object to be aligned. After determining the objects to be aligned, the electronic device may determine, in response to the labeling operation of the user, the virtual scene object and the real scene object among the objects to be aligned based on that labeling operation.
In another possible implementation manner, step S303 may be specifically implemented in the following another manner:
modifying a preset attribute of the virtual image, determining an object whose preset attribute changed among the objects to be aligned as the virtual scene object, and determining an object whose preset attribute did not change among the objects to be aligned as the real scene object; and restoring the preset attribute of the virtual scene object.
In this embodiment of the present application, the preset attribute may refer to an attribute of the virtual image played by the LED display screen in the virtual scene, and may specifically be color, brightness, contrast, or the like. In a virtual shooting scene, attributes of objects in the real scene are not adjustable, while attributes of objects in the virtual image are. The electronic device may therefore adjust a preset attribute of the objects to be aligned, for example their color; it may then determine an object whose preset attribute changed as the virtual scene object, determine an object whose preset attribute did not change as the real scene object, and finally restore the preset attribute of the virtual scene object to its original state. In this way, by modifying and verifying the preset attribute of the objects to be aligned, the electronic device can improve the efficiency of determining the virtual scene object and the real scene object, and further simplify the user's operation.
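The modify-and-verify check just described can be sketched as follows (a minimal Python illustration; the frame representation and per-object pixel lists are hypothetical simplifications of the camera picture, not part of the embodiment):

```python
# Frames are nested lists of RGB tuples indexed as frame[y][x]; each object to
# be aligned is given as the pixel coordinates it occupies in the picture.
def classify_objects(frame_before, frame_after, objects):
    """Split objects into (virtual, real): an object whose pixels changed after
    the preset attribute of the virtual image was perturbed is rendered by the
    LED wall (virtual scene object); an unchanged object is a real one."""
    virtual, real = [], []
    for name, pixels in objects.items():
        changed = any(frame_before[y][x] != frame_after[y][x] for x, y in pixels)
        (virtual if changed else real).append(name)
    return virtual, real

before = [[(10, 10, 10), (50, 50, 50)]]
after = [[(10, 10, 10), (60, 50, 50)]]  # second pixel shifted by the perturbation
v, r = classify_objects(before, after, {"desert": [(1, 0)], "sand": [(0, 0)]})
# v == ["desert"], r == ["sand"]
```

After classification, the perturbed attribute would be restored to its original value, as the step above specifies.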
It should be noted that the electronic device may also determine the virtual scene object and the real scene object among the objects to be aligned in other manners, for example by an image detection algorithm, which is not limited in the embodiments of the present application.
S304, performing color adjustment on the virtual scene object based on the first color of the real scene object.
In one possible implementation, step S304 may be specifically implemented by:
setting the virtual scene color of the virtual scene object as a first color value corresponding to the first color; or,
adding a color layer corresponding to the virtual scene object; the color value of the color layer is a first color value.
In this embodiment of the present application, the first color value may refer to the specific value of the first color of the real scene object, and may be an RGB color value or a color value in another color space; the representation of the first color value is not limited. For example, when the first color value is an RGB value, it may be a statistic such as the mean or mode of the RGB values of all pixels in the real scene object, or the RGB value of a particular pixel in the real scene object. The virtual scene color may refer to the color attribute value of the virtual scene object. The color layer may refer to a layer covering the virtual scene object that makes the virtual scene object present the layer's color. When adjusting the color of the virtual scene object, the electronic device may directly set the virtual scene color to the first color value corresponding to the first color, so that the adjusted virtual scene object is aligned with the color of the real scene object. During adjustment, the electronic device may iteratively adjust the base color of the virtual scene object along a gamma curve until it reaches the first color value, or may perform color replacement with a 3D lookup table (LUT); the specific manner of color adjustment is not limited in the embodiments of the present application.
In addition, the electronic device may instead add a color layer corresponding to the virtual scene object, with the color value of the layer set to the first color value, so that the virtual scene object covered by the color layer is likewise aligned with the color of the real scene object, ensuring consistency of the appearance of virtual and real materials in the virtual shooting scene.
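The mean and mode statistics mentioned above for deriving a first color value from the real scene object's pixels can be sketched as follows (helper names are illustrative):

```python
from statistics import mode

def mean_rgb(pixels):
    """First color value as the per-channel mean over the object's pixels."""
    n = len(pixels)
    return tuple(round(sum(p[c] for p in pixels) / n) for c in range(3))

def mode_rgb(pixels):
    """First color value as the most frequent pixel color in the object."""
    return mode(pixels)

# Pixels sampled from the real scene object (e.g. the sand on set).
pixels = [(200, 180, 130), (190, 176, 126), (192, 178, 128), (192, 178, 128)]
first_color = mean_rgb(pixels)   # (194, 178, 128)
most_common = mode_rgb(pixels)   # (192, 178, 128)
```

Either statistic can serve as the first color value; the mean smooths sensor noise, while the mode favors the dominant surface color.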
In another possible implementation manner, step S304 may be specifically implemented in another manner as follows:
displaying the first color of the real scene object; determining, in response to a touch operation of the user on a color adjustment control, a color parameter corresponding to the touch operation; and performing color adjustment on the virtual scene object based on the color parameter.
In the embodiment of the present application, the color adjustment control may refer to a preset interactive control for color adjustment, and the color parameter may refer to a color value set by the user. Specifically, the electronic device may display the first color of the real scene object on the screen, for example as a color block or as the corresponding first color value; based on the displayed first color, the user may perform a touch operation on the color adjustment control and input a color parameter. The electronic device may then adjust the color of the virtual scene object based on that color parameter, for example by directly setting the virtual scene color to the color parameter or by adding a color layer to the virtual scene object, further improving the flexibility of color alignment.
S305, acquiring a second color of the adjusted virtual scene object, and determining a color difference between the first color and the second color.
S306, determining the virtual scene object of the second color as the target virtual scene object when the color difference is smaller than a preset threshold; and continuing to perform color adjustment on the virtual scene object when the color difference is greater than or equal to the preset threshold, until the color difference is smaller than the preset threshold.
In this embodiment of the present application, the second color may refer to the color of the adjusted virtual scene object. The color difference may refer to the difference between the first color of the real scene object and the second color of the adjusted virtual scene object, specifically a color value difference such as an RGB difference; of course, the color difference may also be represented in other forms, which is not limited in the embodiments of the present application. The preset threshold may be a preset critical value of the color difference: when the color difference between the first color and the second color is smaller than the preset threshold, the electronic device may determine that the color impressions of the virtual scene object and the real scene object are consistent; when it is not smaller than the preset threshold, the electronic device may determine that they are inconsistent. The target virtual scene object may refer to the finally determined virtual scene object whose color impression matches that of the real scene object.
In this step, after the electronic device performs color adjustment on the virtual scene object based on the first color, the rendered image changes and the color of the virtual scene object displayed on the screen is updated; for example, the virtual scene object adjusted in step S304 may be re-rendered to the screen. The electronic device may then determine the adjusted second color of the virtual scene object in the re-shot picture, and further determine the color difference between the second color of the virtual scene object and the first color of the real scene object. The electronic device then compares the color difference with the preset threshold. If the color difference is smaller than the preset threshold, it determines that the color impression of the adjusted virtual scene object is consistent with that of the real scene object, and the adjusted virtual scene object of the second color can be determined as the final target virtual scene object. If the color difference is not smaller than the preset threshold, it determines that the color impressions are still inconsistent, continues the color alignment operation, and adjusts the virtual scene color of the virtual scene object again, for example by continuing color iteration along the gamma curve, or by stepping the second color value up or down by a preset step, until the color difference is smaller than the preset threshold.
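The closed-loop adjustment in steps S305–S306 can be sketched as follows (a minimal Python illustration; the simple max-channel difference metric, the `render` callback modeling re-rendering plus re-shooting, and all numeric values are assumptions, not prescribed by the embodiment):

```python
def color_diff(c1, c2):
    """Color difference as the largest per-channel absolute RGB difference
    (a deliberately simple metric; the embodiment leaves the metric open)."""
    return max(abs(a - b) for a, b in zip(c1, c2))

def align_loop(first_color, render, threshold=4, step=2, max_iters=200):
    """render(color) models re-rendering plus re-shooting: it returns the
    second color actually observed on screen for a requested color."""
    requested = list(first_color)
    for _ in range(max_iters):
        observed = render(tuple(requested))
        if color_diff(first_color, observed) < threshold:
            break  # color impressions are now consistent
        # nudge each requested channel toward the target by the preset step
        for c in range(3):
            delta = first_color[c] - observed[c]
            if delta:
                requested[c] += step if delta > 0 else -step
    return tuple(requested), observed

# Toy screen/camera model: the display dims every channel by 10 levels.
target = (194, 178, 128)
req, obs = align_loop(target, lambda c: tuple(v - 10 for v in c))
# obs ends within the threshold of target; req overshoots to compensate
```

Note that the loop compensates for the whole render-and-capture chain: the requested color deliberately drifts away from the target so that the color observed by the camera converges to it.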
In this way, by detecting the second color of the adjusted virtual scene object, comparing it with the first color of the real scene object, and deciding whether to continue color alignment based on the comparison of the color difference with the preset threshold, the electronic device can further improve the accuracy of color alignment of virtual and real materials and ensure consistency of the color impression of the virtual scene object and the real scene object.
In addition, the final target virtual scene object can be considered to have a base color aligned with the color of the real scene object. On this basis, the color of the target virtual scene object may be further adjusted according to depth of field, height, virtual illumination, and other information, so as to reflect the influence of shadows, terrain undulation, distance, and the like on color, and special-effect adjustments such as blurring may also be performed. Taking the desert as the virtual scene object as an example, after further adjustment, the color of the desert can vary with undulation, distance, and other factors, or form special effects such as perspective blurring. The further-adjusted target virtual scene object can then be rendered to the screen for virtual shooting.
Fig. 4 is a schematic structural diagram of a virtual-real material color alignment device according to an exemplary embodiment of the present application. Referring to fig. 4, the virtual-real material color alignment device 40 includes:
a display module 41 for acquiring and displaying a real-time photographing picture in response to an interactive operation of a user;
a determining module 42, configured to determine, in a real-time shot frame, a virtual scene object and a real scene object to be aligned;
the adjustment module 43 is configured to perform color adjustment on the virtual scene object based on the first color of the real scene object.
In one possible implementation, the display module 41 is specifically configured to:
responding to the starting operation of a user, and displaying a preset color alignment control;
and responding to the touch operation of the user on the preset color alignment control, and acquiring and displaying a real-time shooting picture.
In one possible implementation, the determining module 42 is specifically configured to:
in a real-time shooting picture, responding to a selection operation of a user, and determining an object to be aligned corresponding to the selection operation;
virtual scene objects and real scene objects among the objects to be aligned are determined.
In one possible implementation, the determining module 42 is specifically configured to:
in the real-time shooting picture, determining an object to be aligned in the real-time shooting picture according to a preset image recognition algorithm;
virtual scene objects and real scene objects among the objects to be aligned are determined.
In one possible implementation, the determining module 42 is specifically configured to:
responding to the labeling operation of a user, and determining a virtual scene object or a real scene object in the objects to be aligned according to the labeling operation; or,
modifying preset attributes of the virtual images, determining an object with changed preset attributes in the objects to be aligned as a virtual scene object, and determining an object with unchanged preset attributes in the objects to be aligned as a real scene object; and restoring the preset attribute of the virtual scene object.
In one possible implementation, the adjustment module 43 is specifically configured to:
setting the virtual scene color of the virtual scene object as a first color value corresponding to the first color; or,
adding a color layer corresponding to the virtual scene object; the color value of the color layer is a first color value.
In one possible embodiment, the apparatus 40 is further configured to:
acquiring a second color of the adjusted virtual scene object;
determining a color difference of the first color and the second color;
under the condition that the color difference is smaller than a preset threshold value, determining the virtual scene object with the second color as a target virtual scene object;
and under the condition that the color difference is larger than or equal to a preset threshold value, continuing to perform color adjustment on the virtual scene object until the color difference is smaller than the preset threshold value.
The virtual-real material color alignment device 40 provided in the embodiments of the present application may execute the technical solution shown in the foregoing method embodiments; its implementation principle and beneficial effects are similar and will not be described here again.
Fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, and referring to fig. 5, the electronic device 50 may include a processor 51 and a memory 52. The processor 51, the memory 52, and the like are illustratively interconnected by a bus 53.
Memory 52 stores computer-executable instructions;
the processor 51 executes computer-executable instructions stored in the memory 52, so that the processor 51 executes the virtual-to-real material color alignment method as in the method embodiment described above.
Accordingly, embodiments of the present application provide a computer readable storage medium, in which computer executable instructions are stored, for implementing the virtual-to-real material color alignment method of the above method embodiments when the computer executable instructions are executed by a processor.
Accordingly, the embodiments of the present application may also provide a computer program product, including a computer program, which, when executed by a processor, may implement the virtual-real material color alignment method shown in the foregoing method embodiments.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors, input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM) and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A virtual-real material color alignment method, characterized by comprising:
responding to the interactive operation of the user, and acquiring and displaying a real-time shooting picture;
in the real-time shooting picture, determining a virtual scene object and a real scene object to be aligned;
and performing color adjustment on the virtual scene object based on the first color of the real scene object.
2. The method of claim 1, wherein the acquiring and displaying the real-time photographed picture in response to the user's interactive operation comprises:
responding to the starting operation of a user, and displaying a preset color alignment control;
and responding to the touch operation of the user on the preset color alignment control, and acquiring and displaying a real-time shooting picture.
3. The method according to claim 1, wherein determining virtual and real objects to be aligned in the real-time shot comprises:
in the real-time shooting picture, responding to a selection operation of a user, and determining an object to be aligned corresponding to the selection operation;
and determining virtual scene objects and real scene objects in the objects to be aligned.
4. The method according to claim 1, wherein determining virtual and real objects to be aligned in the real-time shot comprises:
in the real-time shooting picture, determining an object to be aligned in the real-time shooting picture according to a preset image recognition algorithm;
and determining virtual scene objects and real scene objects in the objects to be aligned.
5. The method according to claim 3 or 4, wherein said determining virtual and real objects of the objects to be aligned comprises:
responding to a labeling operation of a user, and determining a virtual scene object or a real scene object in the objects to be aligned according to the labeling operation; or,
modifying preset attributes of the virtual images, determining an object with changed preset attributes in the object to be aligned as a virtual scene object, and determining an object with unchanged preset attributes in the object to be aligned as a real scene object; and recovering the preset attribute of the virtual scene object.
6. The method of claim 1, wherein the color adjusting the virtual scene object based on the first color of the real scene object comprises:
setting the virtual scene color of the virtual scene object as a first color value corresponding to the first color; or,
adding a color layer corresponding to the virtual scene object; the color value of the color layer is the first color value.
7. The method according to claim 1, wherein the method further comprises:
acquiring a second color of the adjusted virtual scene object;
determining a color difference of the first color and the second color;
under the condition that the color difference is smaller than a preset threshold value, determining the virtual scene object of the second color as a target virtual scene object;
and under the condition that the color difference is larger than or equal to a preset threshold value, continuing to perform color adjustment on the virtual scene object until the color difference is smaller than the preset threshold value.
8. A virtual-to-real material color alignment apparatus, comprising:
the display module is used for responding to the interactive operation of the user and acquiring and displaying a real-time shooting picture;
the determining module is used for determining virtual scene objects and real scene objects to be aligned in the real-time shooting picture;
and the adjustment module is used for carrying out color adjustment on the virtual scene object based on the first color of the real scene object.
9. An electronic device, comprising: a memory and a processor;
the memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory, causing the processor to perform the virtual-to-real texture color alignment method of any one of claims 1 to 7.
10. A computer readable storage medium having stored therein computer executable instructions for implementing the virtual-to-real texture color alignment method of any one of claims 1 to 7 when the computer executable instructions are executed by a processor.
CN202311406175.8A 2023-10-26 2023-10-26 Virtual-real material color alignment method, device, equipment and storage medium Pending CN117560577A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311406175.8A CN117560577A (en) 2023-10-26 2023-10-26 Virtual-real material color alignment method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117560577A true CN117560577A (en) 2024-02-13

Family

ID=89810133

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311406175.8A Pending CN117560577A (en) 2023-10-26 2023-10-26 Virtual-real material color alignment method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117560577A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination