CN113706719A - Virtual scene generation method and device, storage medium and electronic equipment - Google Patents


Info

Publication number: CN113706719A
Authority: CN (China)
Prior art keywords: light source, virtual, data, source data, virtual scene
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202111011155.1A
Other languages: Chinese (zh)
Inventors: 庄宇轩, 马若超, 詹澍祺
Current Assignee: Guangzhou Boguan Information Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Guangzhou Boguan Information Technology Co Ltd
Application filed by Guangzhou Boguan Information Technology Co Ltd
Priority to CN202111011155.1A
Publication of CN113706719A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure belongs to the technical field of live video and relates to a virtual scene generation method and apparatus, a storage medium, and an electronic device. The method comprises: acquiring virtual light source data in a virtual scene and displaying the virtual light source data on a live broadcast interface; and adjusting the virtual light source data in response to an adjustment trigger operation acting on the live broadcast interface. Because the virtual light source data is displayed on the live broadcast interface, the anchor terminal can view the light source data in the virtual scene, which provides a data basis for the anchor's autonomous interaction and personalized settings. Because the virtual light source data in the virtual scene is adjusted according to the adjustment trigger operation, the anchor can interactively change the lighting effect of the virtual scene and thereby customize a personalized live broadcast scene. This approach offers a high degree of automation and intelligence, shortens the generation cycle of the virtual scene, reduces the time and labor cost of changing virtual scenes, satisfies users' demand for fresh and diverse live broadcast scenes, and improves the viewing experience of the audience.

Description

Virtual scene generation method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of live video technologies, and in particular, to a method and an apparatus for generating a virtual scene, a computer-readable storage medium, and an electronic device.
Background
In a live broadcast scenario, the background of the live broadcast room is an important viewing element besides the anchor. The quality of the live broadcast scene directly affects the user's viewing experience, and in turn affects user activity in the live broadcast room and the atmosphere of the live broadcast room.
However, in live-action broadcasts, the number of live broadcast rooms that can be physically arranged is limited. Meanwhile, in virtual broadcasts, each required virtual scene must be produced individually according to the anchor's requirements, so the production cycle is long and the cost of replacing a virtual scene is high. Consequently, neither live-action nor virtual broadcast scenes readily satisfy users' demand for fresh and diverse live broadcast scenes.
In view of this, there is a need in the art to develop a new method and apparatus for generating a virtual scene.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure aims to provide a virtual scene generation method, a virtual scene generation apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problems of monotonous live broadcast scenes, high scene production costs, and long production cycles caused by the limitations of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the embodiments of the present invention, a method for generating a virtual scene is provided, in which a live broadcast interface is provided through an anchor terminal, the live broadcast interface includes a video display area, and the video display area displays the virtual scene. The method includes:
acquiring virtual light source data in the virtual scene, and displaying the virtual light source data on the live broadcast interface;
and responding to an adjustment trigger operation acting on the live broadcast interface, and adjusting the virtual light source data.
In an exemplary embodiment of the present invention, the acquiring virtual light source data in the virtual scene includes:
providing a light source adjusting area on the live broadcast interface;
and responding to function trigger operation acting in the light source adjusting area to acquire virtual light source data in the virtual scene.
In an exemplary embodiment of the present invention, the virtual light source data includes: light source type data, light source location data, and light source attribute parameters.
In an exemplary embodiment of the present invention, the displaying the virtual light source data on the live interface includes:
and generating a lamp control panel area on the live broadcast interface so as to display the virtual light source data in the lamp control panel area.
In an exemplary embodiment of the present invention, the displaying the virtual light source data in the lamp control panel region includes:
performing position projection processing on the light source position data to obtain projection position data;
and displaying light source attribute parameters and the projection position data corresponding to the light source type data in the lamp control panel area based on the light source type data.
In an exemplary embodiment of the invention, the light source property parameters include: a light source information parameter and a light source color parameter.
In an exemplary embodiment of the invention, the method further comprises:
and displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a specified sliding path corresponding to the virtual light source identifier in the lamp control panel area;
and in response to the adjustment trigger operation acting on the virtual light source identifier, sliding the virtual light source identifier along the specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
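The position constraint described above can be sketched in code. This is an illustrative assumption only (the function name and the representation of the path as a list of sampled 2D points are not taken from the patent): the current sliding position may be obtained by snapping the dragged identifier to the closest point on the specified sliding path.

```python
import math

def snap_to_sliding_path(path, drag_point):
    """Return the point on the sampled sliding path (a list of 2D
    points) closest to the dragged virtual light source identifier,
    i.e. the current sliding position used to update the projection
    position data."""
    return min(path, key=lambda p: math.dist(p, drag_point))

# A horizontal three-point path; a drag near the middle snaps to (1.0, 0.0).
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
current_position = snap_to_sliding_path(path, (1.2, 0.5))
```

The projection position data would then be set to the returned point, keeping the light source on its mapping track.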
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a virtual light source sliding rod corresponding to the virtual light source identifier in the lamp control panel area, and acquiring the mapping relationship of the virtual light source identifier and the current identifier parameter of the virtual light source identifier;
and in response to an adjustment trigger operation acting on the virtual light source sliding rod, adjusting the light source information parameter according to the mapping relationship and the current identifier parameter.
In an exemplary embodiment of the present invention, the mapping relationship is established according to the light source identifier parameter and the light source attribute parameter of the virtual light source identifier.
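The patent does not specify the form of the mapping relationship; one plausible minimal sketch (the function and parameter names below are assumptions) is a linear map from the normalized sliding-rod position to a light intensity range:

```python
def slider_to_intensity(slider_value, min_intensity=0.0, max_intensity=1000.0):
    """Map a normalized sliding-rod position (0.0-1.0) to a light
    source information parameter (intensity) via a linear mapping
    relation; positions outside the slider range are clamped."""
    slider_value = max(0.0, min(1.0, slider_value))
    return min_intensity + slider_value * (max_intensity - min_intensity)
```

With this sketch, dragging the rod to its midpoint would set the light to half of the intensity range.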
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a palette control corresponding to the virtual light source identifier in the lamp control panel area;
and adjusting the light source color parameter in response to an adjustment trigger operation acting on the palette control.
In an exemplary embodiment of the invention, the adjusting the light source color parameter includes:
acquiring color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file;
and adjusting the light source color parameters by using the image color file.
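The patent does not define the color value mapping processing. A common approach for a palette (color wheel) control, shown here purely as an assumed sketch with illustrative names, is to convert the picked hue and saturation into an RGB value for the light's color parameter:

```python
import colorsys

def palette_pick_to_rgb(hue, saturation, value=1.0):
    """Convert a palette-control pick (hue, saturation, value each
    normalized to 0.0-1.0) into an 8-bit RGB triple that can be
    written into the light source color parameter."""
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, value)
    return (round(r * 255), round(g * 255), round(b * 255))
```

For example, a pick at hue 0 with full saturation yields pure red, while zero saturation yields white.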
In an exemplary embodiment of the invention, the method further comprises:
acquiring anchor display data under the irradiation of the adjusted virtual light source data, and determining an anchor display rule of the adjusted virtual light source data;
and adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data.
In an exemplary embodiment of the present invention, the adjusting the entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data includes:
acquiring light source data to be adjusted corresponding to entity light source data, and performing light source data calculation on the light source data to be adjusted and the anchor display data to obtain a fusion display difference value;
and acquiring a fusion display threshold value of the anchor display rule, and adjusting entity light source data corresponding to the virtual scene according to the fusion display difference value and the fusion display threshold value.
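As a hedged sketch of the comparison described above (the patent does not define the difference formula; the scalar light quantities and names below are assumptions), the fusion display difference could be an absolute difference compared against the rule's threshold:

```python
def needs_entity_adjustment(to_adjust_value, anchor_display_value, fusion_threshold):
    """Compute the fusion display difference between the entity light
    source data to be adjusted and the anchor display data, and check
    it against the fusion display threshold of the anchor display
    rule; True means the entity light should be adjusted."""
    fusion_difference = abs(to_adjust_value - anchor_display_value)
    return fusion_difference > fusion_threshold
```

Under this sketch, the physical lights are only driven when the discrepancy between virtual lighting and the anchor's real appearance exceeds the allowed threshold.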
In an exemplary embodiment of the present invention, the anchor display rule includes: the same display rule and the complementary display rule.
In an exemplary embodiment of the invention, the method further comprises:
and generating result identification data according to the adjustment result of the entity light source data, and displaying the result identification data on the live broadcast interface.
In an exemplary embodiment of the invention, the method further comprises:
and adjusting the virtual light source data again according to the result identification data.
In an exemplary embodiment of the invention, the readjusting the virtual light source data according to the result identification data includes:
acquiring target light source data after the virtual light source data and the entity light source data are adjusted, and acquiring original light source data before the entity light source data are adjusted;
performing light source mean calculation on the original light source data and the target light source data to obtain light source mean data;
and when the result identification data is that the adjustment of the entity light source data is successful, adjusting the virtual light source data again according to the light source mean value data.
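A minimal sketch of the light source mean calculation above, assuming the light data are numeric tuples (e.g. per-channel color or intensity values; the names are illustrative, not from the patent):

```python
def light_source_mean(original_data, target_data):
    """Per-component mean of the original entity light source data and
    the adjusted target light source data; the result is the light
    source mean data used to re-adjust the virtual light source after
    a successful entity light adjustment."""
    return tuple((o + t) / 2 for o, t in zip(original_data, target_data))
```

Averaging the before and after states is one simple way to bring the virtual lighting back toward consistency with the physical lights.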
According to a second aspect of the embodiments of the present invention, a virtual scene generation apparatus is provided, in which a live broadcast interface is provided through an anchor terminal, the live broadcast interface includes a video display area, and the video display area displays a virtual scene. The apparatus includes:
the data display module is configured to acquire virtual light source data in the virtual scene and display the virtual light source data on the live broadcast interface;
a data adjustment module configured to adjust the virtual light source data in response to an adjustment trigger operation acting on the live interface.
According to a third aspect of embodiments of the present invention, there is provided an electronic apparatus including: a processor and a memory; wherein the memory has stored thereon computer readable instructions, which when executed by the processor, implement the method for generating a virtual scene in any of the above exemplary embodiments.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of generating a virtual scene in any of the above-described exemplary embodiments.
As can be seen from the foregoing technical solutions, the virtual scene generation method, the virtual scene generation apparatus, the computer storage medium, and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the method and the device provided by the exemplary embodiment of the disclosure, the virtual light source data is displayed on the live interface, so that the anchor can view the light source data in the virtual scene, and a data basis is provided for the autonomous interaction and the personality setting of the anchor. And virtual light source data in the virtual scene are adjusted according to the adjustment triggering operation, and the anchor automatically changes the light distribution effect in the virtual scene in an interactive mode, so that the customization of the personalized live broadcast scene is completed, the automation and intelligence degrees are high, the generation period of the virtual scene is accelerated, the time cost and the labor cost for changing the virtual scene are reduced, the requirements of users on the freshness and diversification of the live broadcast scene are met, and the watching experience of the audiences is optimized.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically illustrates a flow chart of a method for generating a virtual scene in an exemplary embodiment of the present disclosure;
fig. 2 schematically illustrates a flow chart of a method of acquiring virtual light source data in an exemplary embodiment of the present disclosure;
fig. 3 schematically illustrates a flow chart of a method of displaying virtual light source data in an exemplary embodiment of the present disclosure;
fig. 4 schematically illustrates a flow chart of a method of adjusting virtual light source data in an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates a flow chart of another method for adjusting virtual light source data in an exemplary embodiment of the disclosure;
fig. 6 schematically illustrates a flowchart of a method for adjusting virtual light source data according to another exemplary embodiment of the disclosure;
FIG. 7 schematically illustrates a flow chart of a method of adjusting a color parameter of a light source in an exemplary embodiment of the disclosure;
fig. 8 schematically illustrates a flow chart of a method of adjusting physical light source data in an exemplary embodiment of the present disclosure;
fig. 9 schematically illustrates a flow chart of a method of further adjusting physical light source data in an exemplary embodiment of the present disclosure;
fig. 10 schematically illustrates a flow chart of a method of readjusting virtual light source data in an exemplary embodiment of the present disclosure;
fig. 11 is a schematic interface diagram illustrating a virtual broadcast of a host in an application scenario in an exemplary embodiment of the present disclosure;
FIG. 12 is a schematic interface diagram illustrating a lamp control panel area in an application scenario according to an exemplary embodiment of the disclosure;
FIG. 13 is a schematic diagram illustrating an interface for specifying a slide path in an application scenario according to an exemplary embodiment of the disclosure;
fig. 14 schematically illustrates a virtual live broadcast effect diagram after virtual light source data is adjusted in an application scenario in an exemplary embodiment of the present disclosure;
fig. 15 is a schematic structural diagram of a virtual scene generation apparatus according to an exemplary embodiment of the present disclosure;
fig. 16 schematically illustrates an electronic device for implementing a method of generating a virtual scene in an exemplary embodiment of the present disclosure;
fig. 17 schematically illustrates a computer-readable storage medium for implementing a virtual scene generation method in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Aiming at the problems in the related art, the method for generating the virtual scene is provided by the disclosure. Fig. 1 shows a flow chart of a method for generating a virtual scene, as shown in fig. 1, the method for generating a virtual scene at least comprises the following steps:
and S110, acquiring virtual light source data in a virtual scene, and displaying the virtual light source data on a live broadcast interface.
And S120, responding to the adjustment trigger operation acted on the live broadcast interface, and adjusting the virtual light source data.
In an exemplary embodiment of the disclosure, the virtual light source data is displayed on the live broadcast interface, so that the anchor can view the light source data in the virtual scene, which provides a data basis for the anchor's autonomous interaction and personalized settings. Further, the virtual light source data in the virtual scene is adjusted according to the adjustment trigger operation, and the anchor interactively changes the lighting effect in the virtual scene, thereby completing the customization of a personalized live broadcast scene. The degree of automation and intelligence is high, the generation cycle of the virtual scene is shortened, the time and labor cost of changing the virtual scene is reduced, users' demand for fresh and diverse live broadcast scenes is satisfied, and the viewing experience of the audience is improved.
The following describes each step of the virtual scene generation method in detail.
In step S110, virtual light source data in the virtual scene is acquired, and the virtual light source data is displayed on the live interface.
In an exemplary embodiment of the present disclosure, the anchor may start a virtual broadcast through the virtual broadcast function of the live streaming software.
Specifically, the anchor sits in front of a green screen and enters the process by clicking the virtual broadcast option in the broadcast background. Further, a UE (Unreal Engine) virtual background is selected so that the chroma-keyed effect can be previewed in real time. Then, parameters such as the camera angle are adjusted and the start-broadcast control is clicked to complete the virtual broadcast, after which the anchor is in a virtual broadcast state.
It should be noted that after the UE virtual background is selected, the customization of the personalized scene may be adjusted before or during the broadcasting.
Correspondingly, a user who enters the anchor's virtual live broadcast room can watch the live broadcast normally and can observe the customized virtual scene adjusted by the anchor.
In an alternative embodiment, fig. 2 shows a flowchart of a method for acquiring virtual light source data, as shown in fig. 2, the method at least includes the following steps: in step S210, a light source adjustment area is provided on the live interface.
The light source adjustment area may be an area that determines whether a function triggering operation of the anchor is valid. Also, the light source adjustment area may be an area of any size or any shape within the live broadcast interface, which is not particularly limited in this exemplary embodiment.
In step S220, virtual light source data in the virtual scene is acquired in response to a function triggering operation acting in the light source adjustment region.
The anchor can trigger an interaction event, such as a click, on the virtual live broadcast preview picture (i.e., the live broadcast interface); this interaction is the function trigger operation. Besides a click, the function trigger operation may also be another type of trigger operation, such as a long press or a slide, which is not particularly limited in this exemplary embodiment.
Further, the position information of the function trigger operation performed by the anchor can be checked: only when the action position of the function trigger operation falls within the light source adjustment area is the virtual light source data in the virtual scene acquired.
For example, the live broadcast platform's anchor terminal may send the position information of the function trigger operation to the anchor-side UE instance, which determines whether the action position falls within the light source adjustment area.
When the action position of the function trigger operation is not within the light source adjustment area, the anchor terminal continues to preview the current virtual live broadcast picture; when it falls within the light source adjustment area, the invocation logic of the scene light control is satisfied. Specifically, the anchor-side UE instance reads the virtual light source data in the current virtual scene and sends it to the live broadcast platform's anchor terminal.
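The area check above amounts to a hit test. As a sketch only (the rectangular representation of the area and the names are assumptions, not from the patent):

```python
def in_light_adjustment_area(click_pos, area_rect):
    """Return True when the action position of a function trigger
    operation falls inside the light source adjustment area, given as
    (x, y, width, height) in live-interface coordinates."""
    cx, cy = click_pos
    ax, ay, w, h = area_rect
    return ax <= cx <= ax + w and ay <= cy <= ay + h
```

Only clicks for which this test succeeds would cause the UE instance to read and return the virtual light source data.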
In an alternative embodiment, the virtual light source data includes: light source type data, light source location data, and light source attribute parameters.
The light source type data may include a Directional light source (Directional), a Point light source (Point), a Spot light source (Spot), and Sky light (Sky); the light source position data may include position data of the types of light sources included in the virtual scene, which may be in the form of three-dimensional point coordinates; the light source property parameters may include parameters such as light intensity of each light source, light source color, and influence scene. In addition, the virtual light source data may include other data, and the light source type data, the light source position data, and the light source attribute parameters may further include other data, which is not particularly limited in the present exemplary embodiment.
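To make the structure of the virtual light source data concrete, the description above might be modeled as follows. This is an illustration only; the class and field names are hypothetical, not taken from the patent or from any engine API:

```python
from dataclasses import dataclass
from enum import Enum

class LightType(Enum):
    """Light source type data named in the description."""
    DIRECTIONAL = "Directional"
    POINT = "Point"
    SPOT = "Spot"
    SKY = "Sky"

@dataclass
class VirtualLightSource:
    light_type: LightType   # light source type data
    position: tuple         # light source position data: 3D point (x, y, z)
    intensity: float        # light source information parameter (light intensity)
    color: tuple            # light source color parameter as 8-bit RGB

# A hypothetical scene with one point light and one directional light.
scene_lights = [
    VirtualLightSource(LightType.POINT, (1.0, 2.0, 3.0), 500.0, (255, 240, 220)),
    VirtualLightSource(LightType.DIRECTIONAL, (0.0, 0.0, 10.0), 10.0, (255, 255, 255)),
]
```

Each record bundles the three kinds of data the method reads, displays, and adjusts.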
In this exemplary embodiment, whether the virtual light source data is acquired is gated by the light source adjustment area. The interaction logic is complete and provides a functional entry point for the anchor's autonomous interaction and settings.
Furthermore, virtual light source data can be displayed on a live broadcast interface.
In an alternative embodiment, a light control panel area is generated on the live interface to display virtual light source data in the light control panel area.
After receiving the virtual light source data, the live broadcast platform anchor terminal can generate a corresponding lamp control panel area on a live broadcast interface. The lamp control panel area is used for displaying the acquired virtual light source data.
In an alternative embodiment, fig. 3 shows a flow chart of a method for displaying virtual light source data, as shown in fig. 3, the method at least comprises the following steps: in step S310, the light source position data is subjected to position projection processing to obtain projection position data.
In order to display the light source position data in the lamp control panel region, the light source position data needs to be subjected to position projection processing.
Specifically, the light source position data of each light source in the virtual scene may be read, and the light source position data may be projected onto a plane formed by XY coordinate axes in sequence to obtain two-dimensional projection position data.
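The projection step described above reduces to an orthographic projection onto the XY plane. A minimal sketch (the function name is an assumption) simply discards the Z component:

```python
def project_to_xy(light_position):
    """Project a 3D light source position onto the plane formed by the
    XY coordinate axes by discarding the Z component, yielding the 2D
    projection position data shown in the lamp control panel area."""
    x, y, _z = light_position
    return (x, y)
```

Applied in sequence to each light's position data, this produces the two-dimensional points laid out on the panel.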
In step S320, based on the light source type data, light source attribute parameters and projection position data corresponding to the light source type data are displayed in the lamp control panel region.
When the virtual light source data are displayed in the lamp control panel area, the light source type data can be read in sequence, and corresponding light source attribute parameters and projection position data are set for different light source type data.
Therefore, the light source attribute parameters and the projection position data can be displayed in the lamp control panel region according to the light source type data.
In an alternative embodiment, the light source attribute parameters include: a light source information parameter and a light source color parameter.
The light source color parameter may include a light source color value and a light source color temperature, and may also include other data reflecting the light source color, which is not particularly limited in this exemplary embodiment. The light source information parameter may include the light intensity, and may also include other light source information, which is not particularly limited in this exemplary embodiment.
To show the light source attributes and projection position data corresponding to the different light source types more prominently to the anchor terminal, a virtual light source identifier can be generated.
In an optional embodiment, a virtual light source identifier corresponding to the light source type data is displayed on the live interface.
The virtual light source identification may be generated for different light source type settings. Moreover, the virtual light source identifier may be in a sun pattern, a bulb pattern, or a desk lamp pattern, which is not limited in this exemplary embodiment.
To display the light source attribute parameters, they can be read in sequence; controls such as a light movement track mapping and a light attribute and color information mapping are set for the different light source attribute parameters, and the parameters are displayed in the lamp control panel area in the form of a sliding rod, a color palette, or input parameter values.
In this exemplary embodiment, the converted virtual light source data can be displayed in the lamp control panel area. Displaying the virtual light source data in this way makes it easier for the anchor to view and adjust, provides the anchor with a light source interaction mode, and enriches the interaction dimensions of the virtual live broadcast scene.
In step S120, the virtual light source data is adjusted in response to an adjustment trigger operation applied to the live interface.
In an exemplary embodiment of the present disclosure, after the virtual light source data is displayed on the live interface, the corresponding virtual light source data may be adjusted by an adjustment trigger operation.
In an alternative embodiment, fig. 4 is a flowchart illustrating a method for adjusting virtual light source data, and as shown in fig. 4, the method at least includes the following steps: in step S410, a prescribed sliding path corresponding to the virtual light source identifier is provided in the light control area panel.
A prescribed sliding path can be correspondingly set for the light source type data in the virtual scene. The prescribed sliding path is a mapping track of the light movement, for example a 3D-to-2D mapping track, which restricts the position of the corresponding light source, so that the virtual light source data can be displayed on the prescribed sliding path for the anchor to act on the virtual light source identifier. In addition, to make viewing easy for the anchor end, a dotted-line track matching the prescribed sliding path can be generated and projected in the preview window of the anchor end, so that the anchor can drag along the dotted-line track.
In step S420, in response to the adjustment trigger operation applied to the virtual light source identifier, the virtual light source identifier is slid along the prescribed sliding path to obtain the current sliding position, and the projection position data is adjusted according to the current sliding position.
After the prescribed sliding path is provided, the anchor can apply the adjustment trigger operation according to the requirements of different types of light sources and the indication of the virtual light source identifier. The adjustment trigger operation may be a click operation, a long-press operation, or a sliding operation, which is not particularly limited in this exemplary embodiment.
The virtual light source identification can be slid to the current sliding position according to the adjustment triggering operation, and the projection position data is adjusted according to the current sliding position, so that the adjustment of the virtual light source data is realized.
The anchor can apply the adjustment trigger operation through the anchor end of the virtual live broadcast; the anchor end UE instance then changes the light position of the corresponding light source in the virtual scene in real time according to the prescribed sliding path and updates the picture video stream to the preview window in real time. At this time, the anchor can view the effect after the light has moved.
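One possible way to constrain the dragged identifier to the sliding path is to snap the drag position to the nearest sampled point of the path. This is only a sketch under that assumption; the actual path representation used by the engine is not specified here:

```python
import math

def snap_to_path(drag_pos, path_points):
    """Clamp a dragged identifier position to the closest sampled point of
    the prescribed sliding path, so the light source cannot leave the path."""
    return min(path_points, key=lambda p: math.dist(p, drag_pos))
```

The engine would then map the snapped 2D position back to the corresponding 3D light position.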
In this exemplary embodiment, the position data in the virtual light source data can be adjusted through the prescribed sliding path corresponding to the virtual light source identifier, which provides the anchor with an interactive mode for adjusting the light source position, ensures the personalized effect of the virtual scene from the perspective of the light source position, and is simple and easy to operate.
In an alternative embodiment, fig. 5 is a schematic flow chart of another method for adjusting virtual light source data, and as shown in fig. 5, the method at least includes the following steps: in step S510, a virtual light source sliding bar corresponding to the virtual light source identifier is provided in the light control area panel, and a mapping relationship of the virtual light source identifier and a current identifier parameter of the virtual light source identifier are obtained.
The anchor UE instance may read the corresponding attribute parameter interface to obtain the corresponding current identifier parameter for the currently displayed virtual light source identifier.
A virtual light source sliding bar can be generated in the lamp control area panel and used for adjusting the light source information parameter. Further, a mapping relationship of the virtual light source identifier is established.
In an alternative embodiment, the mapping relationship is established according to the light source identification parameter and the light source attribute parameter of the virtual light source identification.
The mapping relationship is established between the light source information parameter and the sliding bar value of the virtual light source sliding bar. For example, the light intensity may take values in [0,1000], and the virtual light source sliding bar may correspond to [0,100]. Thus, a mapping between the light intensity range [0,1000] and the sliding bar value range [0,100] can be established. The mapping relationship may be linear or non-linear, and this exemplary embodiment is not particularly limited in this respect.
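The linear case of this mapping can be sketched as follows, using the ranges from the example above:

```python
def slider_to_intensity(slider_value, slider_max=100.0, intensity_max=1000.0):
    """Map a sliding bar value in [0, slider_max] to a light intensity in
    [0, intensity_max] via the linear mapping described above."""
    return slider_value / slider_max * intensity_max

def intensity_to_slider(intensity, slider_max=100.0, intensity_max=1000.0):
    """Inverse mapping, used to position the sliding bar for a given light."""
    return intensity / intensity_max * slider_max
```

A non-linear mapping (for example, a gamma curve on the slider value) could be substituted without changing the interface.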
In step S520, in response to the adjustment triggering operation acting on the virtual light source sliding bar, the light source information parameter is adjusted according to the mapping relationship and the current identification parameter.
Furthermore, the anchor can apply an adjustment trigger operation to the virtual light source sliding bar to adjust the light source information parameter. The adjustment trigger operation may be a click operation, a long-press operation, or a sliding operation, which is not particularly limited in this exemplary embodiment.
After receiving data of adjustment triggering operation of the anchor terminal, the anchor terminal UE instance may change light source information parameters in the virtual scene in real time according to the mapping relationship, and update the picture video stream to a preview window in real time, so that the anchor terminal views the effect after the light value is changed.
In this exemplary embodiment, the light source information parameter can be adjusted in real time through the virtual light source sliding bar, which provides the anchor with an interactive mode for adjusting the light value, ensures the personalized effect of the virtual scene from the perspective of the light value, and is simple and easy to operate.
In an alternative embodiment, fig. 6 is a flowchart illustrating a further method for adjusting virtual light source data, and as shown in fig. 6, the method at least includes the following steps: in step S610, a palette control corresponding to the virtual light source identifier is provided in the light control area panel.
The palette control may be a control that exhibits a range of color values. Moreover, the palette controls of different virtual light source identifiers may be the same or different, and this exemplary embodiment is not particularly limited to this.
In step S620, the light source color parameters are adjusted in response to an adjustment trigger operation acting on the palette control.
The anchor end UE instance may read the attribute parameter interface of the current virtual light source identifier, and the anchor end may return the color value range (sRGB, standard Red Green Blue) of the palette control currently selected by the anchor. The anchor end UE instance can then adjust the light source color parameter according to the received color value data determined by the adjustment trigger operation.
In an alternative embodiment, fig. 7 shows a flow chart of a method of adjusting color parameters of a light source, which, as shown in fig. 7, comprises at least the following steps: in step S710, color value data corresponding to the adjustment trigger operation is acquired, and color value mapping processing is performed on the color value data to obtain an image color file.
After the color value data is obtained, color value mapping processing may be performed on the color value data.
Specifically, the color value data may be mapped to a scene LUT (Lookup Table) graph to obtain an image color file.
In step S720, the light source color parameters are adjusted by using the image color file.
After the image color file is obtained, the overall lighting in the virtual scene can be adjusted by using the LUT, and the anchor end can view the effect after the light source color parameter is changed.
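A minimal per-channel sketch of LUT-based color remapping is shown below; a real scene LUT would be a 3D table managed by the engine, and the "warming" curve here is a made-up illustration:

```python
def build_warming_lut(size=256):
    """Hypothetical per-channel lookup table that warms the scene by
    boosting red and damping blue. Purely illustrative."""
    return {
        "r": [min(size - 1, round(i * 1.1)) for i in range(size)],
        "g": list(range(size)),
        "b": [round(i * 0.9) for i in range(size)],
    }

def apply_lut(rgb, lut):
    # Remap each channel of an 8-bit RGB color through the lookup table.
    r, g, b = rgb
    return (lut["r"][r], lut["g"][g], lut["b"][b])
```
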
In the exemplary embodiment, the color parameters of the light source can be adjusted in real time through the setting of the palette control, an interactive mode for adjusting the color parameters is provided for the anchor, the personalized effect of the virtual scene is ensured from the angle of the color of the light, and the adjustment mode is simple and easy to operate.
After the virtual light source data is adjusted by the anchor UE instance, the entity light source data in the real environment where the virtual scene is located can be changed to adapt to the light change in the virtual scene.
In an alternative embodiment, fig. 8 shows a flow chart of a method for adjusting physical light source data, as shown in fig. 8, the method at least includes the following steps: in step S810, anchor display data illuminated by the adjusted virtual light source data is acquired, and an anchor display rule associated with the adjusted virtual light source data is determined.
After the virtual light source data is adjusted, the illumination from the virtual light source affects the display effect of the anchor's face and other parts or areas, so anchor display data under the illumination of the virtual light source data can be acquired.
The anchor display data may be a color temperature value of the anchor face portion, or may be other data, which is not particularly limited in this exemplary embodiment.
Further, the anchor side UE instance may further obtain a corresponding anchor display rule, that is, a fusion template of the virtual scene and the entity light.
In an alternative embodiment, the anchor display rules include: the same display rule and the complementary display rule.
The same display rule may be a rule under which, when the anchor display data is cool-tone data, the entity light source data is also adjusted to cool-tone data, that is, a rule of matching color temperature values; the complementary display rule may be a rule under which the entity light source data is adjusted to contrast with the anchor display data, that is, to a complementary color temperature value. Besides, the same display rule and the complementary display rule may also be other rules set according to actual requirements, and this exemplary embodiment is not particularly limited in this respect.
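Treating the anchor display data as a single color temperature value, the two rules could be sketched as follows. The reflection around a neutral point of 5500 K used for the complementary rule is an assumption for illustration only:

```python
NEUTRAL_TEMP_K = 5500  # hypothetical neutral white point

def target_color_temperature(anchor_temp_k, rule):
    """Pick the target color temperature for the entity light source:
    'same' mirrors the anchor's color temperature, 'complementary' picks
    a contrasting one by reflecting around the neutral point."""
    if rule == "same":
        return anchor_temp_k
    if rule == "complementary":
        return 2 * NEUTRAL_TEMP_K - anchor_temp_k
    raise ValueError(f"unknown anchor display rule: {rule}")
```
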
In step S820, entity light source data corresponding to the virtual scene is adjusted according to the anchor display rule and the anchor display data.
In an alternative embodiment, fig. 9 shows a flow chart of a method for further adjusting entity light source data, and as shown in fig. 9, the method at least includes the following steps: in step S910, to-be-adjusted light source data corresponding to the entity light source data is obtained, and light source data calculation is performed on the to-be-adjusted light source data and the anchor display data to obtain a fusion display difference.
The to-be-adjusted light source data may be the target value to which the entity light source data needs to be adjusted.
Further, light source data calculation can be performed on the to-be-adjusted light source data and the anchor display data. Specifically, the difference between the to-be-adjusted light source data and the anchor display data may be calculated to obtain the corresponding fusion display difference.
In step S920, a fusion display threshold of the anchor display rule is obtained, and the entity light source data corresponding to the virtual scene is adjusted according to the fusion display difference and the fusion display threshold.
The fusion display threshold is a threshold for judging whether the entity light source data needs to be adjusted.
When the fusion display difference is greater than the fusion display threshold, the corresponding entity light source data can be adjusted according to the fusion display difference and the fusion display threshold, for example adjusting the entity light source data to a warm tone when the virtual light source data is a warm tone, thereby avoiding color distortion of the anchor's face.
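The difference-and-threshold logic of steps S910 and S920 can be sketched with scalar values (the scalar modelling is an assumption; the actual data may be multi-dimensional):

```python
def adjust_entity_light(target_value, anchor_display, threshold):
    """Compute the fusion display difference between the to-be-adjusted
    target value and the anchor display data, and only return an adjustment
    when the difference exceeds the fusion display threshold."""
    diff = abs(target_value - anchor_display)  # fusion display difference
    if diff > threshold:
        return target_value  # push the entity light toward the target value
    return None  # within tolerance; no adjustment needed
```
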
The anchor end UE instance can send the entity light source data to be adjusted to an entity light control in the offline scene through an entity light control plug-in and a wireless communication template. When the entity light control receives the entity light source data to be adjusted, it can parse the data and adjust the entity light source data according to the corresponding values.
In this exemplary embodiment, the corresponding entity light source data is adjusted under the indication of the virtual light source data, so that a fusion display effect combining the virtual scene and the real scene into one whole is achieved, and the customization of the personalized scene at the anchor end is completed.
In the process of adjusting the entity light source data, the adjustment may succeed or fail, so corresponding result identification data may be generated as an indication.
In an optional embodiment, the result identification data is generated according to the adjustment result of the entity light source data, and the result identification data is displayed on the live broadcast interface.
The result identification data may be data capable of showing the adjustment result of the entity light source data; for example, it may be text data such as "success" or "failure", or symbol data such as a check mark or a cross, which is not particularly limited in this exemplary embodiment. Moreover, the result identification data may be returned to the live end UE instance.
And after the adjustment of the entity light source data is successful, the virtual light source data can be further finely adjusted.
In an alternative embodiment, the virtual light source data is adjusted again in dependence of the resulting identification data.
At this time, the result identification data may be data showing that the adjustment of the entity light source data has succeeded.
In an alternative embodiment, fig. 10 shows a flowchart of a method for readjusting virtual light source data, as shown in fig. 10, the method at least includes the following steps: in step S1010, target light source data after the virtual light source data and the entity light source data are adjusted is obtained, and original light source data before the entity light source data are adjusted is obtained.
The target light source data may be current light source data of the physical light source.
In step S1020, a light source mean value calculation is performed on the original light source data and the target light source data to obtain light source mean value data.
After the original light source data and the target light source data are obtained, the target light source data and the original light source data can be averaged to obtain light source average data before and after the entity light source data are adjusted.
In step S1030, when the result identification data is that the adjustment of the entity light source data is successful, the virtual light source data is adjusted again according to the light source mean data.
After the entity light source data is successfully adjusted, the virtual light source data can be adjusted again by using the light source mean data of the entity light source data before and after adjustment; that is, the virtual light source data is brought close to this mean data, so that the virtual scene and the offline environment stay consistent. Moreover, the picture video stream can be updated to the preview window in real time, so that the anchor end can view the effect after the light value is changed.
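As a minimal sketch (assuming each light source parameter is a scalar keyed by name, which is an illustrative modelling choice), the mean calculation of steps S1010 to S1020 could look like:

```python
def light_source_mean(original, target):
    """Per-parameter mean of the entity light source data before (original)
    and after (target) adjustment; the virtual light source data is then
    pulled toward these mean values."""
    return {key: (original[key] + target[key]) / 2 for key in original}
```
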
In this exemplary embodiment, the virtual light source data may be further finely adjusted according to the adjustment result of the entity light source data, so as to maximize the effect of the interaction between the virtual light source data and the entity light source data, and to ensure the scene display effect of the virtual-real combination to the greatest extent.
After the adjustment of the virtual light source data, the adjustment of the entity light source data, and the fine adjustment of the virtual light source data are completed, the anchor can click a save button at the anchor end to synchronize the adjusted personalized scene to the user end, so that the personalized scene is presented in the live broadcast client of the user end and the display of the customized virtual scene is completed.
The following describes a method for generating a virtual scene in the embodiment of the present disclosure in detail with reference to an application scene.
Fig. 11 shows an interface schematic diagram of a virtual broadcast at an anchor terminal in an application scenario, and as shown in fig. 11, the anchor can complete the virtual broadcast through the virtual broadcast function of the live broadcast software.
Specifically, the anchor sits in front of the green screen and enters the process by clicking the virtual broadcast option in the broadcast background. Further, a UE virtual background is selected so that the effect after matting can be previewed in real time. Then, the camera angle and other parameters are adjusted and the start-broadcast control is clicked to complete the virtual broadcast, after which the anchor is in a virtual broadcast state.
It should be noted that after the UE virtual background is selected, the customization of the personalized scene may be adjusted before or during the broadcasting.
Correspondingly, a user who enters the live broadcast room of the virtual broadcast anchor can watch the live broadcast normally and observe the customized virtual scene adjusted by the anchor.
Fig. 12 is a schematic interface diagram of a lamp control panel area in an application scenario, and as shown in fig. 12, first, a light source adjustment area is provided on a live interface.
The light source adjustment area may be an area that determines whether a function triggering operation of the anchor is valid. Also, the light source adjustment area may be an area of any size or any shape within the live broadcast interface, which is not particularly limited in this exemplary embodiment.
And responding to the function trigger operation acting in the light source adjusting area to acquire virtual light source data in the virtual scene.
The anchor can trigger an interaction event such as a click on the virtual live broadcast preview picture, that is, the live broadcast interface; this is the function trigger operation. Besides the click operation, the function trigger operation may also be another type of trigger operation such as a long press or a slide, which is not particularly limited in this exemplary embodiment.
Further, the position information of the function triggering operation acted by the anchor can be judged. Only when the action position of the function triggering operation belongs to the light source adjustment area, the virtual light source data in the virtual scene can be acquired.
The anchor terminal of the live broadcast platform may send the position information of the function trigger operation to the anchor terminal UE instance, and the anchor terminal UE instance judges whether the action position of the function trigger operation belongs to the light source adjustment area.
When the action position of the function triggering operation is not located in the light source adjusting area, the anchor end continues to preview the current virtual live broadcast picture; when the action position of the function triggering operation belongs to the light source adjusting area, the calling logic of the scene light control is satisfied. Specifically, the anchor UE instance reads the virtual light source data in the current virtual scene. And further, the virtual light source data is sent to a main broadcasting end of the live broadcasting platform.
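This position check can be sketched as a simple hit test. Modelling the light source adjustment area as an axis-aligned rectangle is an assumption here, since the disclosure allows the area to have any shape:

```python
def in_adjustment_area(pos, area):
    """Return True when the action position of a function trigger operation
    falls inside the light source adjustment area, modelled as an
    axis-aligned rectangle (left, top, right, bottom)."""
    x, y = pos
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom
```

Only when this test passes would the anchor end UE instance read the virtual light source data of the current scene.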
Wherein the virtual light source data may include: light source type data, light source location data, and light source attribute parameters.
The light source type data may include directional light sources, point light sources, spotlight sources, and sky light; the light source position data may include position data of the types of light sources included in the virtual scene, which may be in the form of three-dimensional point coordinates; the light source property parameters may include parameters such as light intensity of each light source, light source color, and influence scene. In addition, the virtual light source data may include other data, and the light source type data, the light source position data, and the light source attribute parameters may further include other data, which is not particularly limited in the present exemplary embodiment.
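A hypothetical container for the virtual light source data enumerated above could look like the following; the field names and default values are illustrative assumptions, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class VirtualLightSource:
    """Illustrative bundle of light source type data, light source position
    data, and light source attribute parameters."""
    light_type: str                   # e.g. "directional", "point", "spot", "sky"
    position: tuple                   # three-dimensional point coordinates (x, y, z)
    intensity: float = 1000.0         # light source information parameter
    color: tuple = (255, 255, 255)    # light source color parameter (RGB)
```
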
Then, the virtual light source data can also be displayed on the live interface.
Specifically, a lamp control panel area is generated on the live broadcast interface, so that virtual light source data is displayed in the lamp control panel area.
After receiving the virtual light source data, the live broadcast platform anchor terminal can generate a corresponding lamp control panel area on a live broadcast interface. The lamp control panel area is used for displaying the acquired virtual light source data.
And carrying out position projection processing on the position data of the light source to obtain projection position data.
In order to display the light source position data in the lamp control panel region, the light source position data needs to be subjected to position projection processing.
Specifically, the light source position data of each light source in the virtual scene may be read, and the light source position data may be projected onto a plane formed by XY coordinate axes in sequence to obtain two-dimensional projection position data.
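The projection step amounts to dropping the Z component, assuming engine coordinates in which the panel plane is XY:

```python
def project_to_xy(position_3d):
    """Position projection as described above: map a 3D light position onto
    the XY plane of the lamp control panel by discarding the Z coordinate."""
    x, y, _z = position_3d
    return (x, y)
```
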
And displaying the light source attribute parameters and the projection position data corresponding to the light source type data in the lamp control panel area based on the light source type data.
When the virtual light source data are displayed in the lamp control panel area, the light source type data can be read in sequence, and corresponding light source attribute parameters and projection position data are set for different light source type data.
Therefore, the light source attribute parameters and the projection position data can be displayed in the lamp control panel region according to the light source type data.
Wherein the light source attribute parameters may include: a light source information parameter and a light source color parameter.
The light source color parameter may include a light source color value and a light source color temperature, and may also include other data reflecting the light source color, which is not particularly limited in this exemplary embodiment. The light source information parameter may include the light intensity, and may also include other light source information, which is not particularly limited in this exemplary embodiment.
In order to more remarkably show the light source attributes and the projection position data corresponding to different light source types for the anchor terminal, a virtual light source identifier can be generated.
And displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
A virtual light source identifier may be generated for each light source type. Also, the virtual light source identifier may be in a bulb pattern.
In order to display the light source attribute parameters, the light source attribute parameters can be read in sequence, controls such as a light movement track mapping and a light attribute and color information mapping can be set for the different light source attribute parameters, and the light source attribute parameters can be displayed in the lamp control panel area in the form of a sliding bar, a color wheel, or input parameter values.
Furthermore, after the virtual light source data are displayed on the live broadcast interface, the corresponding virtual light source data can be adjusted through adjusting the trigger operation.
A prescribed sliding path corresponding to the virtual light source identification is provided in the light control area panel.
Fig. 13 is a schematic diagram of an interface with a prescribed sliding path in an application scene, and as shown in fig. 13, a prescribed sliding path may be correspondingly set for the bulb-pattern light source type data in the virtual scene. The prescribed sliding path is a mapping track of the light movement, for example a 3D-to-2D mapping track, which restricts the position of the corresponding light source, so that the virtual light source data can be displayed on the prescribed sliding path for the anchor to act on the virtual light source identifier. In addition, to make viewing easy for the anchor end, a dotted-line track matching the prescribed sliding path can be generated and projected in the preview window of the anchor end, so that the anchor can drag along the dotted-line track.
And responding to the adjustment triggering operation acted on the virtual light source identifier, sliding the virtual light source identifier according to a specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
After the prescribed sliding path is provided, the anchor can apply the adjustment trigger operation according to the requirements of different types of light sources and the indication of the virtual light source identifier. The adjustment trigger operation may be a click operation, a long-press operation, or a sliding operation, which is not particularly limited in this exemplary embodiment.
The virtual light source identification can be slid to the current sliding position according to the adjustment triggering operation, and the projection position data is adjusted according to the current sliding position, so that the adjustment of the virtual light source data is realized.
The anchor can apply the adjustment trigger operation through the anchor end of the virtual live broadcast; the anchor end UE instance then changes the light position of the corresponding light source in the virtual scene in real time according to the prescribed sliding path and updates the picture video stream to the preview window in real time. At this time, the anchor can view the effect after the light has moved.
And providing a virtual light source sliding rod corresponding to the virtual light source identification in the lamp control area panel to obtain the mapping relation of the virtual light source identification and the current identification parameter of the virtual light source identification.
The anchor UE instance may read the corresponding attribute parameter interface to obtain the corresponding current identifier parameter for the currently displayed virtual light source identifier.
And a virtual light source sliding rod can be generated in the lamp control area panel and used for adjusting the light source information parameters. Further, a mapping relation of the virtual light source identifier is established.
The mapping relationship is established between the light source information parameter and the sliding bar value of the virtual light source sliding bar. For example, the light intensity may take values in [0,1000], and the virtual light source sliding bar may correspond to [0,100]. Thus, a mapping between the light intensity range [0,1000] and the sliding bar value range [0,100] can be established. The mapping relationship may be linear or non-linear, and this exemplary embodiment is not particularly limited in this respect.
And responding to the adjustment triggering operation acted on the virtual light source sliding rod, and adjusting the light source information parameters according to the mapping relation and the current identification parameters.
Furthermore, the anchor can apply an adjustment trigger operation to the virtual light source sliding bar to adjust the light source information parameter. The adjustment trigger operation may be a click operation, a long-press operation, or a sliding operation, which is not particularly limited in this exemplary embodiment.
After receiving data of adjustment triggering operation of the anchor terminal, the anchor terminal UE instance may change light source information parameters in the virtual scene in real time according to the mapping relationship, and update the picture video stream to a preview window in real time, so that the anchor terminal views the effect after the light value is changed.
And providing a palette control corresponding to the virtual light source identification in the light control area panel.
The palette control may be a control that exhibits a range of color values. Moreover, the palette controls of different virtual light source identifiers may be the same or different, and this exemplary embodiment is not particularly limited to this.
And responding to the adjustment trigger operation acted on the color palette control to adjust the color parameters of the light source.
The anchor end UE instance may read the attribute parameter interface of the current virtual light source identifier, and the anchor end may return the color value range of the palette control currently selected by the anchor. The anchor end UE instance can then adjust the light source color parameter according to the received color value data determined by the adjustment trigger operation.
And acquiring color value data corresponding to the adjustment trigger operation, and performing color value mapping processing on the color value data to obtain an image color file.
After the color value data is obtained, color value mapping processing may be performed on the color value data.
Specifically, the color value data may be mapped to a scene LUT (Lookup Table) graph to obtain an image color file.
And adjusting the color parameters of the light source by using the image color file.
After the image color file is obtained, the overall lighting in the virtual scene can be adjusted by using the LUT, and the anchor end can view the effect after the light source color parameter is changed.
Fig. 14 shows a virtual live broadcast effect diagram after the virtual light source data is adjusted in an application scene, and as shown in fig. 14, the anchor can freely customize the required light scene by adjusting the virtual light source data in the virtual live broadcast scene. With the virtual light source data adjusted, the light distribution effect in the virtual scene is better, and the virtual-real combination effect is better.
According to the method for generating the virtual scene in this application scenario, the virtual light source data is displayed on the live broadcast interface, so that the anchor end can view the light source data in the virtual scene, which provides a data basis for the anchor end's autonomous interaction and personalized setting. Moreover, the virtual light source data in the virtual scene is adjusted according to the adjustment trigger operation, and the anchor changes the light distribution effect in the virtual scene autonomously and interactively, thereby completing the customization of the personalized live broadcast scene with a high degree of automation and intelligence, shortening the generation period of the virtual scene, reducing the time and labor cost of changing the virtual scene, meeting users' demands for fresh and diverse live broadcast scenes, and optimizing the viewing experience of the audience.
In addition, an exemplary embodiment of the present disclosure further provides a device for generating a virtual scene, wherein a live interface is provided through an anchor terminal, the live interface includes a video display area, and the video display area displays the virtual scene. Fig. 15 is a schematic structural diagram of the virtual scene generation apparatus. As shown in Fig. 15, the virtual scene generation apparatus 1500 may include a data display module 1510 and a data adjustment module 1520. Wherein:
a data display module 1510 configured to acquire virtual light source data within the virtual scene and display the virtual light source data on the live broadcast interface; a data adjustment module 1520 configured to adjust the virtual light source data in response to an adjustment trigger operation applied to the live interface.
In an exemplary embodiment of the present invention, the acquiring virtual light source data in the virtual scene includes:
providing a light source adjusting area on the live broadcast interface;
and responding to function trigger operation acting in the light source adjusting area to acquire virtual light source data in the virtual scene.
In an exemplary embodiment of the present invention, the virtual light source data includes: light source type data, light source location data, and light source attribute parameters.
In an exemplary embodiment of the present invention, the displaying the virtual light source data on the live interface includes:
and generating a lamp control panel area on the live broadcast interface so as to display the virtual light source data in the lamp control panel area.
In an exemplary embodiment of the present invention, the displaying the virtual light source data in the lamp control panel region includes:
performing position projection processing on the light source position data to obtain projection position data;
and displaying light source attribute parameters and the projection position data corresponding to the light source type data in the lamp control panel area based on the light source type data.
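The position projection processing above can be sketched as a pinhole-style perspective projection of the 3D light source position onto 2D panel coordinates. The camera convention, focal length, and panel size below are assumptions for illustration; the patent does not specify a projection model.

```python
def project_light_position(world_pos, focal_length=1.0, panel_size=(400, 300)):
    """Project a 3D light source position (x, y, z) in camera space onto
    2D coordinates in the lamp control panel area via an assumed
    pinhole-style perspective projection."""
    x, y, z = world_pos
    if z <= 0:
        raise ValueError("light source must be in front of the camera")
    # Normalized image-plane coordinates (assumed convention).
    u = focal_length * x / z
    v = focal_length * y / z
    w, h = panel_size
    # Map to panel pixels with the origin at the panel centre.
    return (w / 2 * (1 + u), h / 2 * (1 - v))
```

A light source on the optical axis projects to the centre of the panel, and off-axis lights spread outward in proportion to their angular offset.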
In an exemplary embodiment of the invention, the light source property parameters include: a light source information parameter and a light source color parameter.
In an exemplary embodiment of the invention, the method further comprises:
and displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a specified sliding path corresponding to the virtual light source identifier in the lamp control panel area;
and responding to the adjustment triggering operation acted on the virtual light source identification, sliding the virtual light source identification according to the specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
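Sliding the identifier along a specified path can be sketched as clamping the dragged position onto the path and reading back the current sliding position. The choice of a circular path around the panel centre is an assumption for illustration.

```python
import math

def slide_along_path(drag_pos, center=(200.0, 150.0), radius=100.0):
    """Clamp a dragged identifier position to the nearest point on an
    assumed circular sliding path; the returned current sliding position
    would then be used to update the projection position data."""
    dx = drag_pos[0] - center[0]
    dy = drag_pos[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        # Degenerate drag onto the centre: pick an arbitrary path point.
        return (center[0] + radius, center[1])
    return (center[0] + radius * dx / dist, center[1] + radius * dy / dist)
```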
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a virtual light source sliding rod corresponding to the virtual light source identification in the lamp control panel area, and acquiring the mapping relation of the virtual light source identification and the current identification parameter of the virtual light source identification;
and responding to an adjustment trigger operation acted on the virtual light source sliding rod, and adjusting the light source information parameters according to the mapping relation and the current identification parameters.
In an exemplary embodiment of the present invention, the mapping relationship is established according to the light source identification parameter and the light source attribute parameter of the virtual light source identification.
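One simple reading of the mapping relationship between the slider's identifier parameter and a light source information parameter is a clamped linear map. The parameter ranges below (slider 0-100, intensity 0-5000) are assumed for illustration only.

```python
def make_mapping(param_min, param_max, slider_min=0.0, slider_max=100.0):
    """Create a mapping relationship from a slider identifier parameter
    to a light source information parameter (e.g. intensity)."""
    def to_param(slider_value):
        t = (slider_value - slider_min) / (slider_max - slider_min)
        t = min(max(t, 0.0), 1.0)  # clamp to the valid slider range
        return param_min + t * (param_max - param_min)
    return to_param

# Hypothetical example: slider position 0-100 mapped to intensity 0-5000.
intensity_of = make_mapping(0.0, 5000.0)
```

With such a mapping established per identifier, an adjustment trigger on the sliding rod only needs to read the current identifier parameter and evaluate the map.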
In an exemplary embodiment of the present invention, the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface includes:
providing a palette control corresponding to the virtual light source identification in the lamp control panel area;
and responding to an adjustment trigger operation acted on the palette control to adjust the light source color parameters.
In an exemplary embodiment of the invention, the adjusting the light source color parameter includes:
acquiring color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file;
and adjusting the light source color parameters by using the image color file.
In an exemplary embodiment of the invention, the method further comprises:
acquiring anchor display data under the irradiation of the adjusted virtual light source data, and determining an anchor display rule of the adjusted virtual light source data;
and adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data.
In an exemplary embodiment of the present invention, the adjusting the entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data includes:
acquiring light source data to be adjusted corresponding to entity light source data, and performing light source data calculation on the light source data to be adjusted and the anchor display data to obtain a fusion display difference value;
and acquiring a fusion display threshold value of the anchor display rule, and adjusting entity light source data corresponding to the virtual scene according to the fusion display difference value and the fusion display threshold value.
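The fusion display difference and fusion display threshold could, under one simple reading, be computed as follows. Treating the data as scalar brightness values and using a proportional correction step are assumptions for illustration; the patent does not fix the difference metric.

```python
def adjust_entity_light(entity_brightness, anchor_brightness,
                        fusion_threshold=10.0, step=0.5):
    """Compare the physical (entity) light source brightness with the
    anchor's displayed brightness under the virtual light. When the
    fusion display difference exceeds the fusion display threshold,
    move the entity brightness toward the anchor's by a proportional
    step; otherwise leave it unchanged."""
    diff = entity_brightness - anchor_brightness  # fusion display difference
    if abs(diff) <= fusion_threshold:
        return entity_brightness  # already within the fusion threshold
    return entity_brightness - step * diff
```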
In an exemplary embodiment of the present invention, the anchor display rule includes: the same display rule and the complementary display rule.
In an exemplary embodiment of the invention, the method further comprises:
and generating result identification data according to the adjustment result of the entity light source data, and displaying the result identification data on the live broadcast interface.
In an exemplary embodiment of the invention, the method further comprises:
and adjusting the virtual light source data again according to the result identification data.
In an exemplary embodiment of the invention, the readjusting the virtual light source data according to the result identification data includes:
acquiring target light source data after the virtual light source data and the entity light source data are adjusted, and acquiring original light source data before the entity light source data are adjusted;
performing light source mean calculation on the original light source data and the target light source data to obtain light source mean data;
and when the result identification data is that the adjustment of the entity light source data is successful, adjusting the virtual light source data again according to the light source mean value data.
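The light source mean calculation above can be sketched as a per-parameter average of the original light source data and the target light source data; representing the light source data as a parameter dictionary is an assumption for illustration.

```python
def light_source_mean(original, target):
    """Average the original light source data and the adjusted target
    light source data parameter by parameter; the resulting mean data
    would be used to readjust the virtual light source."""
    return {key: (original[key] + target[key]) / 2.0
            for key in original.keys() & target.keys()}
```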
The specific details of the virtual scene generating apparatus 1500 have been described in detail in the corresponding virtual scene generating method, and therefore are not described herein again.
It should be noted that although several modules or units of the virtual scene generation apparatus 1500 are mentioned in the above detailed description, such division is not mandatory. Indeed, according to the embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 1600 according to such an embodiment of the invention is described below with reference to fig. 16. The electronic device 1600 shown in fig. 16 is only an example and imposes no limitation on the functions or scope of use of the embodiments of the present invention.
As shown in fig. 16, the electronic device 1600 takes the form of a general-purpose computing device. Components of the electronic device 1600 may include, but are not limited to: at least one processing unit 1610, at least one storage unit 1620, a bus 1630 connecting the different system components (including the storage unit 1620 and the processing unit 1610), and a display unit 1640.
The storage unit stores program code that can be executed by the processing unit 1610, so that the processing unit 1610 performs the steps according to various exemplary embodiments of the present invention described in the "Exemplary Method" section of this specification.
The storage unit 1620 may include readable media in the form of volatile storage, such as a random access memory (RAM) 1621 and/or a cache memory 1622, and may further include a read-only memory (ROM) 1623.
The storage unit 1620 may also include a program/utility 1624 having a set (at least one) of program modules 1625, such program modules 1625 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The bus 1630 may be any of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 1600 may also communicate with one or more external devices 1800 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1600, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1650. Furthermore, the electronic device 1600 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 1660. As shown, the network adapter 1660 communicates with the other modules of the electronic device 1600 over the bus 1630. It should be appreciated that, although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with the necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present description, when said program product is run on the terminal device.
Referring to fig. 17, a program product 1700 for implementing the above method according to an embodiment of the present invention is described. The program product may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (21)

1. A method for generating a virtual scene, wherein a live interface is provided through an anchor terminal, the live interface comprises a video display area, and the video display area displays the virtual scene, the method comprising:
acquiring virtual light source data in the virtual scene, and displaying the virtual light source data on the live broadcast interface;
and responding to an adjustment trigger operation acting on the live broadcast interface, and adjusting the virtual light source data.
2. The method for generating a virtual scene according to claim 1, wherein the acquiring virtual light source data in the virtual scene comprises:
providing a light source adjusting area on the live broadcast interface;
and responding to function trigger operation acting in the light source adjusting area to acquire virtual light source data in the virtual scene.
3. The method for generating a virtual scene according to claim 1, wherein the virtual light source data comprises: light source type data, light source location data, and light source attribute parameters.
4. The method for generating a virtual scene according to claim 3, wherein the displaying the virtual light source data on the live interface comprises:
and generating a lamp control panel area on the live broadcast interface so as to display the virtual light source data in the lamp control panel area.
5. The method for generating a virtual scene according to claim 4, wherein the displaying the virtual light source data in the light control panel area comprises:
performing position projection processing on the light source position data to obtain projection position data;
and displaying light source attribute parameters and the projection position data corresponding to the light source type data in the lamp control panel area based on the light source type data.
6. The method for generating a virtual scene according to claim 5, wherein the light source attribute parameters comprise: a light source information parameter and a light source color parameter.
7. The method for generating a virtual scene according to claim 5, further comprising:
and displaying a virtual light source identifier corresponding to the light source type data on the live broadcast interface.
8. The method for generating a virtual scene according to claim 7, wherein the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface comprises:
providing a specified sliding path corresponding to the virtual light source identifier in the lamp control panel area;
and responding to the adjustment triggering operation acted on the virtual light source identification, sliding the virtual light source identification according to the specified sliding path to obtain a current sliding position, and adjusting the projection position data according to the current sliding position.
9. The method for generating a virtual scene according to claim 7, wherein the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface comprises:
providing a virtual light source sliding rod corresponding to the virtual light source identification in the lamp control panel area, and acquiring the mapping relation of the virtual light source identification and the current identification parameter of the virtual light source identification;
and responding to an adjustment trigger operation acted on the virtual light source sliding rod, and adjusting the light source information parameters according to the mapping relation and the current identification parameters.
10. The method of claim 9, wherein the mapping relationship is established according to the light source identification parameter of the virtual light source identification and the light source attribute parameter.
11. The method for generating a virtual scene according to claim 7, wherein the adjusting the virtual light source data in response to an adjustment trigger operation acting on the live interface comprises:
providing a palette control corresponding to the virtual light source identification in the lamp control panel area;
and responding to an adjustment trigger operation acted on the palette control to adjust the light source color parameters.
12. The method for generating a virtual scene according to claim 11, wherein the adjusting the light source color parameters comprises:
acquiring color value data corresponding to the adjustment triggering operation, and performing color value mapping processing on the color value data to obtain an image color file;
and adjusting the light source color parameters by using the image color file.
13. The method for generating a virtual scene according to claim 1, further comprising:
acquiring anchor display data under the irradiation of the adjusted virtual light source data, and determining an anchor display rule of the adjusted virtual light source data;
and adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data.
14. The method for generating a virtual scene according to claim 13, wherein said adjusting entity light source data corresponding to the virtual scene according to the anchor display rule and the anchor display data comprises:
acquiring light source data to be adjusted corresponding to entity light source data, and performing light source data calculation on the light source data to be adjusted and the anchor display data to obtain a fusion display difference value;
and acquiring a fusion display threshold value of the anchor display rule, and adjusting entity light source data corresponding to the virtual scene according to the fusion display difference value and the fusion display threshold value.
15. The method of generating a virtual scene of claim 14, wherein the anchor display rule comprises: the same display rule and the complementary display rule.
16. The method for generating a virtual scene according to claim 13, further comprising:
and generating result identification data according to the adjustment result of the entity light source data, and displaying the result identification data on the live broadcast interface.
17. The method for generating a virtual scene according to claim 16, further comprising:
and adjusting the virtual light source data again according to the result identification data.
18. The method of claim 17, wherein said readjusting the virtual light source data according to the result identification data comprises:
acquiring target light source data after the virtual light source data and the entity light source data are adjusted, and acquiring original light source data before the entity light source data are adjusted;
performing light source mean calculation on the original light source data and the target light source data to obtain light source mean data;
and when the result identification data is that the adjustment of the entity light source data is successful, adjusting the virtual light source data again according to the light source mean value data.
19. A device for generating a virtual scene, wherein a live interface is provided through an anchor terminal, the live interface comprises a video display area, and the video display area displays a virtual scene, the device comprising:
the data display module is configured to acquire virtual light source data in the virtual scene and display the virtual light source data on the live broadcast interface;
a data adjustment module configured to adjust the virtual light source data in response to an adjustment trigger operation acting on the live interface.
20. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method for generating a virtual scene according to any one of claims 1 to 18.
21. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of generating a virtual scene of any one of claims 1-18 via execution of the executable instructions.
CN202111011155.1A 2021-08-31 2021-08-31 Virtual scene generation method and device, storage medium and electronic equipment Pending CN113706719A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111011155.1A CN113706719A (en) 2021-08-31 2021-08-31 Virtual scene generation method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113706719A true CN113706719A (en) 2021-11-26

Family

ID=78657906

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111011155.1A Pending CN113706719A (en) 2021-08-31 2021-08-31 Virtual scene generation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113706719A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105794196A (en) * 2013-10-21 2016-07-20 诺基亚技术有限公司 Method, apparatus and computer program product for modifying illumination in an image
CN109785423A (en) * 2018-12-28 2019-05-21 广州华多网络科技有限公司 Image light compensation method, device and computer equipment
CN111050189A (en) * 2019-12-31 2020-04-21 广州酷狗计算机科技有限公司 Live broadcast method, apparatus, device, storage medium, and program product
CN111756956A (en) * 2020-06-23 2020-10-09 网易(杭州)网络有限公司 Virtual light control method and device, medium and equipment in virtual studio
US20200368616A1 (en) * 2017-06-09 2020-11-26 Dean Lindsay DELAMONT Mixed reality gaming system
CN112116695A (en) * 2020-09-24 2020-12-22 广州博冠信息科技有限公司 Virtual light control method and device, storage medium and electronic equipment
CN112188228A (en) * 2020-09-30 2021-01-05 网易(杭州)网络有限公司 Live broadcast method and device, computer readable storage medium and electronic equipment
CN112562056A (en) * 2020-12-03 2021-03-26 广州博冠信息科技有限公司 Control method, device, medium and equipment for virtual light in virtual studio
CN112770135A (en) * 2021-01-21 2021-05-07 腾讯科技(深圳)有限公司 Live broadcast-based content explanation method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HUI-NING WU, XUE-MIN WANG, et al.: "Rendering a virtual light source to seem like a realistic light source in an electronic display: A critical band of luminance gradients for the perception of self-luminosity", DISPLAYS, vol. 59, 30 September 2019 (2019-09-30) *
WEN GUANGQUAN: "Application of an AI-based enhanced virtual studio", Radio and Television Technology, no. 01, 15 January 2020 (2020-01-15) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114201095A (en) * 2021-12-14 2022-03-18 广州博冠信息科技有限公司 Control method and device for live interface, storage medium and electronic equipment
CN114554240A (en) * 2022-02-25 2022-05-27 广州博冠信息科技有限公司 Interaction method and device in live broadcast, storage medium and electronic equipment
CN115243065A (en) * 2022-07-19 2022-10-25 广州博冠信息科技有限公司 Method and device for scheduling light and package and electronic equipment
CN117424969A (en) * 2023-10-23 2024-01-19 神力视界(深圳)文化科技有限公司 Light control method and device, mobile terminal and storage medium
CN117440184A (en) * 2023-12-20 2024-01-23 深圳市亿莱顿科技有限公司 Live broadcast equipment and control method thereof
CN117440184B (en) * 2023-12-20 2024-03-26 深圳市亿莱顿科技有限公司 Live broadcast equipment and control method thereof

Similar Documents

Publication Publication Date Title
CN113706719A (en) Virtual scene generation method and device, storage medium and electronic equipment
CN111698390B (en) Virtual camera control method and device, and virtual studio implementation method and system
US20220014709A1 (en) Display And Image Processing Method
CN101438579B (en) Adaptive rendering of video content based on additional frames of content
US9881584B2 (en) System and method for presenting content within virtual reality environment
CN109887066B (en) Lighting effect processing method and device, electronic equipment and storage medium
CN103649904A (en) Adaptive presentation of content
CN111756956B (en) Virtual light control method and device, medium and equipment in virtual studio
TW200527920A (en) Projector and method of projecting an image having multiple image sizes
CN112449229B (en) Sound and picture synchronous processing method and display equipment
CN112543344B (en) Live broadcast control method and device, computer readable medium and electronic equipment
CN111683260A (en) Program video generation method, system and storage medium based on virtual anchor
CN105912116A (en) Intelligent projection method and projector
CN112399263A (en) Interaction method, display device and mobile terminal
CN114092671A (en) Virtual live broadcast scene processing method and device, storage medium and electronic equipment
CN114302221B (en) Virtual reality equipment and screen-throwing media asset playing method
CN112017264B (en) Display control method and device for virtual studio, storage medium and electronic equipment
WO2021088890A1 (en) Display system and display method
CN113407289A (en) Wallpaper switching method, wallpaper generation method, device and storage medium
CN112269553A (en) Display system, display method and computing device
CN107430841B (en) Information processing apparatus, information processing method, program, and image display system
CN112470486B (en) Modifying playback of alternate content in response to detection of a remote control signal modifying operation of a playback device
CN113676690A (en) Method, device and storage medium for realizing video conference
CN112153472A (en) Method and device for generating special picture effect, storage medium and electronic equipment
CN114339174B (en) Projection equipment and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination