CN114520903B - Rendering display method, rendering display device, electronic equipment and storage medium - Google Patents

Rendering display method, rendering display device, electronic equipment and storage medium

Info

Publication number
CN114520903B
CN114520903B CN202210148247.2A
Authority
CN
China
Prior art keywords
led screens
display screen
display
led
virtual shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210148247.2A
Other languages
Chinese (zh)
Other versions
CN114520903A (en)
Inventor
唐继正
贾晨曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenli Vision Shenzhen Cultural Technology Co ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd filed Critical Alibaba China Co Ltd
Priority to CN202210148247.2A priority Critical patent/CN114520903B/en
Publication of CN114520903A publication Critical patent/CN114520903A/en
Application granted granted Critical
Publication of CN114520903B publication Critical patent/CN114520903B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the present application provide a rendering display method, apparatus, storage medium, and computer program product. The method includes: obtaining shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the size and position information of the at least two LED screens; obtaining, according to the sizes of the at least two LED screens and the size of a target display screen, the display range within which the at least two LED screens are projected onto the target display screen; establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens; and rendering and displaying images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.

Description

Rendering display method, rendering display device, electronic equipment and storage medium
Technical Field
Embodiments of the present application relate to the field of electronic information technology, and in particular, to a rendering display method, apparatus, storage medium, and computer program product.
Background
With the development of film and television production, virtual shooting based on LED screens has emerged: video images are displayed on purpose-built LED screens so that the screens serve as the set for a shoot. Because location scouting is no longer required, both the cost and the production cycle of film and television shooting are reduced.
In existing virtual shooting systems, the images shown on the LED screens are processed by a real-time rendering engine and output to the screens, and the engine relies on complex post-production software to prepare those images. As a result, shooting video with existing virtual shooting technology is cumbersome to operate and inefficient to produce.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a rendering display method, apparatus, storage medium, and computer program product to at least partially solve the above-mentioned problems.
According to a first aspect of embodiments of the present application, there is provided a rendering display method based on virtual shooting, the method including: obtaining shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the size and position information of the at least two LED screens; obtaining, according to the sizes of the at least two LED screens and the size of a target display screen, the display range within which the at least two LED screens are projected onto the target display screen; establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens; and rendering and displaying images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
According to a second aspect of embodiments of the present application, there is provided an electronic device, including a processor, a memory, a communication interface, and a communication bus, where the processor, the memory, and the communication interface communicate with each other through the communication bus; the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform operations corresponding to the virtual shooting-based rendering display method according to the first aspect.
According to a third aspect of embodiments of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the virtual shooting-based rendering display method as in the first aspect.
According to a fourth aspect of embodiments of the present application, there is provided a computer program product which, when executed by a processor, implements the virtual shooting-based rendering display method of the first aspect.
According to the virtual shooting-based rendering display scheme provided by the embodiments of the present application, the display range within which the at least two LED screens are projected onto the target display screen is obtained from the sizes of the at least two LED screens and the size of the target display screen, and at least two display areas corresponding to the at least two LED screens are then established in the target display screen according to that display range and the positions of the screens. Images of the LED screens and the virtually shot object are rendered and displayed in the at least two display areas of the target display screen. Because the display states of the at least two LED screens are restored directly in the display areas of the target screen, the target screen does not need to be configured manually, which makes film and television shooting with the virtual shooting technique simpler to operate and improves production efficiency.
Drawings
To describe the embodiments of the present application or the prior-art technical solutions more clearly, the drawings required for the embodiments or for the description of the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them.
Fig. 1 is a schematic view of a rendering display scene based on virtual shooting according to an embodiment of the present application;
fig. 2 is a flowchart of a rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 3 is a schematic diagram of a screen based on virtual shooting according to an embodiment of the present application;
fig. 4 is a schematic view of another screen based on virtual shooting according to an embodiment of the present application;
fig. 5 is a schematic page diagram of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 6 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 7 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 8 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 9 is a schematic page diagram of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 10 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 11 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 12 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 13 is a schematic page diagram of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 14 is a flowchart of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 15 is a schematic page diagram of still another rendering display method based on virtual shooting according to an embodiment of the present application;
fig. 16 is a block diagram of a rendering display device based on virtual shooting according to an embodiment of the present application;
fig. 17 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To provide a better understanding of the technical solutions in the embodiments of the present application, those solutions are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art on the basis of these embodiments shall fall within the protection scope of the embodiments of the present application.
Embodiments of the present application are further described below with reference to the accompanying drawings of embodiments of the present application.
An application scenario of the virtual shooting-based rendering display method provided in the embodiments of the present application is described first with reference to fig. 1, which is a schematic diagram of such a scenario. The method runs on the electronic device 101, which is a device capable of executing the virtual shooting-based rendering display method provided in the embodiments of the present application. The virtually shot images are transmitted to the electronic device 101 in real time, and the electronic device 101 runs the rendering display method on them.
The electronic device 101 may be a terminal device such as a smart phone, a tablet computer, a notebook computer, or a vehicle-mounted terminal, or it may be a network device such as a server; this is, of course, only illustrative and not limiting.
The electronic device 101 may access a network, connect to the cloud through the network and exchange data, or the electronic device 101 may itself be a device in the cloud. In the present application, the network includes local area networks (LAN), wide area networks (WAN), and mobile communication networks such as the World Wide Web (WWW), Long Term Evolution (LTE) networks, 2G, 3G, and 5G networks, and the like. The cloud may include various devices connected through a network, for example servers, relay devices, and Device-to-Device (D2D) devices. Of course, this description is intended to be illustrative only and is not to be taken as limiting the present application.
With reference to the system shown in fig. 1, the virtual shooting-based rendering display method provided in the embodiments of the present application is described in detail below. It should be noted that fig. 1 shows only one application scenario of the method and does not imply that the method must be applied to the scenario shown in fig. 1.
Referring to fig. 2, a rendering display method based on virtual shooting is provided in an embodiment of the present application. The method comprises the following steps:
Step 201: obtain shooting environment information of at least two LED screens used for virtual shooting.
Specifically, the shooting environment information includes the size and position information of the at least two LED screens.
The sizes of the at least two LED screens characterize how large the screens are, and their position information includes information describing the positional relationship between them, such as the angle between the screens and whether they overlap.
In some implementations of the embodiment of the present application, referring to fig. 3, the at least two LED screens used for virtual shooting are three LED screens: the first LED screen 31 is laid on the ground, the second LED screen 32 is perpendicular to the third LED screen 33, and the second LED screen 32 stands on the first LED screen 31. The virtually shot object is located within the enclosure formed by the three LED screens, so that the video images they display form the set for film and television shooting.
In other implementations of this embodiment, referring to fig. 4, the at least two LED screens used for virtual shooting are two LED screens: the fourth LED screen 41 is laid on the ground, and the fifth LED screen 42 is a curved screen standing vertically on the fourth LED screen 41. The virtually shot object is located within the enclosure formed by the two LED screens, so that the video images they display form the set for film and television shooting.
Other arrangements of at least two LED screens surrounding the virtually shot object may also be used; this is not limited in the present application.
By using at least two LED screens for virtual shooting, the embodiment of the present application can form a set that surrounds the virtually shot object.
Step 202: obtain the display range of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen.
In the embodiment of the present application, the display range within which the at least two LED screens are projected onto the target display screen is determined from the sizes of the at least two LED screens and the size of the target display screen, so the user does not need to operate the at least two LED screens manually in order to determine their display range in the target display screen.
Specifically, the display range within which the at least two LED screens are projected onto the target display screen is determined according to the proportional relation between the sizes of the at least two LED screens and the size of the target display screen.
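Purely as an illustration, the following Python sketch shows one way such a proportional relation could be computed; the ScreenSize structure, the side-by-side layout assumption, the metre unit, and the margin value are assumptions made for this sketch and are not taken from the embodiment.

from dataclasses import dataclass

@dataclass
class ScreenSize:
    width_m: float   # physical width of one LED screen, in metres (assumed unit)
    height_m: float  # physical height of one LED screen, in metres

def display_range_scale(led_screens, target_w_px, target_h_px, margin=0.05):
    # Bounding extent of the LED screens, assuming they are laid out side by side.
    total_w = sum(s.width_m for s in led_screens)
    max_h = max(s.height_m for s in led_screens)
    usable_w = target_w_px * (1.0 - margin)
    usable_h = target_h_px * (1.0 - margin)
    # Proportional relation: the scale is limited by the tighter of the two dimensions.
    return min(usable_w / total_w, usable_h / max_h)

# Example: three LED walls previewed on a 1920 x 1080 target display screen.
scale_px_per_m = display_range_scale(
    [ScreenSize(8.0, 4.5), ScreenSize(10.0, 4.5), ScreenSize(8.0, 4.5)], 1920, 1080)
print(round(scale_px_per_m, 1), "pixels per metre")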
Step 203: establish, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens.
To restore the at least two LED screens within the display range of the target display screen, the embodiment of the present application takes a plurality of first feature points from the at least two LED screens, maps the coordinate values of these first feature points in the three-dimensional coordinate system of the at least two LED screens, and obtains the second feature points that correspond to the first feature points within the display range of the target display screen. The at least two display areas corresponding to the at least two LED screens are then obtained from the plurality of second feature points.
By using the at least two display areas to restore the at least two LED screens, the embodiment of the present application can faithfully reproduce the virtual shooting environment so that it can be tested and adjusted.
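The Python sketch below illustrates the idea of mapping first feature points to second feature points; the orthographic-style projection, the scale value, and the function names are assumptions made for illustration, not the concrete mapping used in the embodiment.

import numpy as np

def map_feature_points(first_points_3d, scale_px_per_m, origin_px):
    """Map first feature points (x, y, z in the LED screens' 3D coordinate system)
    to second feature points (pixel coordinates) inside the target display range."""
    pts = np.asarray(first_points_3d, dtype=float)
    # Drop the depth axis, apply the scale derived from the proportional relation,
    # then translate the result to the chosen origin of the display range.
    return pts[:, :2] * scale_px_per_m + np.asarray(origin_px, dtype=float)

# Corner points of one LED screen (metres) become the polygon bounding its display area.
corners = [[0.0, 0.0, 0.0], [8.0, 0.0, 0.0], [8.0, 4.5, 0.0], [0.0, 4.5, 0.0]]
display_area = map_feature_points(corners, scale_px_per_m=60.0, origin_px=(100, 80))
print(display_area)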
Step 204: render and display images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
For example, referring to fig. 5, the embodiment of the present application shows, in the target display screen, three display areas consistent with the three LED screens used for virtual shooting, and displays the images of the LED screens and of the virtually shot object in those three display areas.
By rendering and displaying the images of the LED screens and the virtually shot object in the at least two display areas, the display states of the at least two LED screens are restored directly in the target screen, so the target screen does not need to be configured manually; this makes film and television shooting with the virtual shooting technique simpler to operate and improves production efficiency.
Referring to fig. 6, in some further specific implementations of the embodiments of the present application, the method further includes:
step 205, according to the received first user instruction, respectively adjusting the light distribution states of the at least two display areas to simulate the light distribution effect of the virtual shooting.
Illustratively, referring to FIG. 5, an embodiment of the present application has a light distribution state control 51. The user can input a first user instruction through the light distribution state control 51, adjust the light distribution state, and simulate the light distribution effect of virtual shooting in advance.
According to the method and the device, the virtual shooting light distribution effect is simulated in advance, so that the virtual shooting light distribution is adjusted in advance, and the better virtual shooting effect is achieved.
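As a rough illustration only, the Python sketch below applies a light distribution state to the image of one display area; the LightState fields and the simple gain-and-tint model are assumptions and do not describe the actual simulation used in the embodiment.

from dataclasses import dataclass
import numpy as np

@dataclass
class LightState:
    intensity: float = 1.0           # overall brightness gain
    color: tuple = (1.0, 1.0, 1.0)   # per-channel RGB multipliers

def apply_light_state(region_image, state):
    """Simulate the light distribution effect on one display area's image."""
    tinted = region_image.astype(np.float32) * state.intensity * np.asarray(state.color)
    return np.clip(tinted, 0, 255).astype(np.uint8)

# A first user instruction might, for example, warm up and dim one display area.
region = np.full((4, 4, 3), 200, dtype=np.uint8)
preview = apply_light_state(region, LightState(intensity=0.8, color=(1.0, 0.9, 0.8)))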
Referring to fig. 7, in still other specific implementations of the embodiments of the present application, the method further includes:
and 206, respectively adjusting the time axes of the at least two display areas according to the received second user instruction so as to monitor the playing effect of the simulated shooting environment.
Illustratively, referring to FIG. 5, the present embodiment has a timeline control 52 associated with each of the three display areas. The user may input a second user command through the three time axis controls 52, and monitor the playing effects of the at least two display areas, so as to perform various adjustment operations on the virtual shot screen image.
According to the embodiment of the application, the playing effect of the at least two display areas is monitored, so that various adjustment operations are conducted on the virtual shooting, and better virtual shooting effect is achieved.
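The sketch below shows, under assumptions, how each display area could keep an independent time axis that a second user instruction scrubs; the TimeAxis class is hypothetical and only illustrates the per-area bookkeeping.

from dataclasses import dataclass

@dataclass
class TimeAxis:
    position_s: float = 0.0   # current playhead position, in seconds
    rate: float = 1.0         # playback speed multiplier

    def scrub(self, delta_s):
        # Move the playhead; never before the start of the clip.
        self.position_s = max(0.0, self.position_s + delta_s)

# One time axis per display area; a second user instruction scrubs only the second area.
time_axes = [TimeAxis() for _ in range(3)]
time_axes[1].scrub(2.5)
print([axis.position_s for axis in time_axes])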
Referring to fig. 8, in still other specific implementations of the embodiments of the present application, the method further includes:
step 207, the display area is divided into at least one sub-display area, and image processing operation is performed on each sub-display area.
In the virtual shooting process, the images may have differences in depth of field such as foreground, middle scene, background and the like. Therefore, aiming at each corresponding LED screen, the display area in the embodiment of the application cuts more sub-display areas, allows the sub-display areas to be subjected to image processing operations such as masking, sorting, scaling, deforming and the like so as to construct more various depth of field effects, and supports more film and television shooting scenes.
For example, referring to fig. 9, in the embodiment of the present application, a display area is divided into three sub-display areas (sub-display area 1, sub-display area 2, and sub-display area 3), and depth controls 91 corresponding to the three sub-display areas respectively may be adjusted, so that image processing operations such as masking, sorting, scaling, deforming, etc. are performed on the sub-display areas to construct more diverse depth effects.
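The following sketch shows one possible way to cut a display area into sub-display areas and apply a per-layer operation; the horizontal-band split, the use of OpenCV, and the blur-by-depth rule are illustrative assumptions rather than the embodiment's actual image processing.

import cv2
import numpy as np

def split_into_sub_areas(region_image, n_layers=3):
    """Cut a display area into horizontal sub-display areas and blur the farther
    layers more strongly to suggest foreground / middle ground / background depth."""
    h = region_image.shape[0]
    band = h // n_layers
    sub_areas = []
    for i in range(n_layers):
        sub = region_image[i * band:(i + 1) * band].copy()
        kernel = 2 * (n_layers - i) + 1   # odd kernel size, larger for far layers
        sub_areas.append(cv2.GaussianBlur(sub, (kernel, kernel), 0))
    return sub_areas

# Example: split one rendered display area into three depth layers.
layers = split_into_sub_areas(np.zeros((300, 400, 3), dtype=np.uint8))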
Referring to fig. 10, in still other specific implementations of the embodiments of the present application, the method further includes:
and step 208, adding a virtual shooting element, and displaying the added attribute of the virtual shooting element, wherein the virtual shooting element is an element adopted for virtual shooting.
For example, referring to fig. 5, in the embodiment of the present application, a user may add a virtual shooting element 53 such as a camera, a light, and the like, and when the virtual shooting element is selected, display an attribute 54 of the virtual shooting element.
According to the embodiment of the application, the virtual shooting elements can be intuitively adjusted through adding the virtual shooting elements and displaying the attributes of the added virtual shooting elements, so that a better virtual shooting effect is achieved.
Referring to fig. 11, in still other specific implementations of the embodiments of the present application, the method further includes:
and 209, performing image brightness reduction processing on the part, corresponding to the joint of the at least two LED screens, of the display area.
According to the embodiment of the application, the image brightness is reduced at the part corresponding to the joint of the at least two LED screens, so that the unnatural feeling caused by the joint of the at least two LED screens is weakened.
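As a rough illustration only, the Python sketch below darkens a vertical strip of the display area around the position of a joint; the linear falloff, strip width, and minimum gain are assumptions and not the embodiment's actual processing.

import numpy as np

def soften_seam(region_image, seam_x, half_width=20, min_gain=0.7):
    """Reduce image brightness in the strip of the display area that corresponds to
    the joint between two LED screens, fading back to full brightness at the edges."""
    out = region_image.astype(np.float32)
    width_px = region_image.shape[1]
    for x in range(max(0, seam_x - half_width), min(width_px, seam_x + half_width + 1)):
        # Gain is lowest exactly at the joint and returns to 1.0 at the strip edge.
        gain = min_gain + (1.0 - min_gain) * abs(x - seam_x) / half_width
        out[:, x] *= gain
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: the joint between two screens falls at column 640 of this display area.
blended = soften_seam(np.full((360, 1280, 3), 180, dtype=np.uint8), seam_x=640)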
Referring to fig. 12, in still other specific implementations of the embodiments of the present application, the method further includes:
and 210, respectively adjusting parameters of a camera adopted by the virtual shooting according to the received third user instruction.
Illustratively, referring to fig. 13, the user adjusts the focus control, aperture control, photosensitive control, zoom control of the first camera via a third user instruction.
According to the method and the device for adjusting the parameters of the video camera, parameters of the video camera adopted by virtual shooting can be adjusted according to the third user instruction, so that the video camera can be adjusted in real time according to the moving condition of the virtual shooting object, and the more optimal virtual shooting effect is achieved.
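The sketch below is a hypothetical illustration of the state carried by a third user instruction, using the camera parameters named above (focus, aperture, photosensitivity, zoom); the CameraParams structure and handler are assumptions for illustration and are not part of the patented method.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CameraParams:
    focus_m: float = 3.0     # focus distance, metres
    aperture_f: float = 2.8  # aperture as an f-number
    iso: int = 800           # photosensitivity
    zoom_mm: float = 35.0    # focal length, millimetres

def on_third_user_instruction(current, **changes):
    """Apply only the fields the instruction touches and return the updated state."""
    return replace(current, **changes)

# Example: the user stops down and zooms in while the virtually shot object moves away.
camera = on_third_user_instruction(CameraParams(), aperture_f=4.0, zoom_mm=50.0)
print(camera)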
Referring to fig. 14, in still other specific implementations of the embodiments of the present application, the method further includes:
step 211, respectively adjusting parameters of the lamplight adopted by the virtual shooting according to the received fourth user instruction.
According to the embodiment of the application, the parameters of the lamplight adopted by the virtual shooting can be adjusted according to the fourth user instruction, so that the lamplight can be adjusted in real time according to the moving condition of the virtual shooting object, and the more optimal virtual shooting effect is realized.
Illustratively, referring to fig. 15, the user adjusts the color control, brightness control, contrast control, area control of the first light via a fourth user instruction.
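Similarly, a fourth user instruction could carry the light parameters named above (color, brightness, contrast, area); the sketch below, with its hypothetical LightParams structure and clamping rule, is for illustration only.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class LightParams:
    color_temp_k: int = 5600   # color, expressed here as a color temperature
    brightness: float = 1.0    # 0.0 .. 1.0
    contrast: float = 1.0      # relative contrast
    area: float = 1.0          # relative size of the lit area

def on_fourth_user_instruction(current, **changes):
    """Update only the requested light parameters, clamping brightness to [0, 1]."""
    updated = replace(current, **changes)
    return replace(updated, brightness=min(1.0, max(0.0, updated.brightness)))

# Example: dim the first light and narrow its area as the object moves closer to it.
light = on_fourth_user_instruction(LightParams(), brightness=0.6, area=0.5)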
In summary, the second, third, and fourth user instructions of the embodiments of the present application adjust the virtually shot screen images, the camera, and the lights respectively, so that the screen images, camera, and lights in the virtual shoot change along with the movement of the virtually shot object, providing better tracking of that object.
Based on the method described in the foregoing embodiments, and referring to fig. 16, an embodiment of the present application further provides a rendering display device based on virtual shooting, where the device includes:
an information obtaining module 161, configured to obtain shooting environment information of at least two LED screens used for virtual shooting, where the shooting environment information includes the size and position information of the at least two LED screens;
a range determining module 162, configured to obtain the display range of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen;
a region determining module 163, configured to establish, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens;
and an image display module 164, configured to render and display images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
According to the virtual shooting-based rendering display scheme provided by the embodiments of the present application, the display range within which the at least two LED screens are projected onto the target display screen is obtained from the sizes of the at least two LED screens and the size of the target display screen, and at least two display areas corresponding to the at least two LED screens are then established in the target display screen according to that display range and the positions of the screens. Images of the LED screens and the virtually shot object are rendered and displayed in the at least two display areas of the target display screen. Because the display states of the at least two LED screens are restored directly in the display areas of the target screen, the target screen does not need to be configured manually, which makes film and television shooting with the virtual shooting technique simpler to operate and improves production efficiency.
Based on the methods described in the foregoing embodiments, an embodiment of the present application provides an electronic device configured to perform those methods. Referring to fig. 17, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown; the specific embodiments of the present application do not limit the specific implementation of the electronic device.
As shown in fig. 17, the electronic device 170 may include: a processor 1702, a communication interface (Communications Interface) 1704, a memory 1706, and a communication bus 1708.
Wherein:
processor 1702, communication interface 1704, and memory 1706 communicate with each other over a communication bus 1708.
A communication interface 1704 for communicating with other electronic devices or servers.
The processor 1702 is configured to execute the program 1710, and may specifically perform relevant steps in the embodiment of the rendering display method based on virtual shooting.
In particular, the program 1710 may include program code including computer operating instructions.
The processor 1702 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application. The one or more processors included in the device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs together with one or more ASICs.
The memory 1706 is configured to store the program 1710. The memory 1706 may include high-speed RAM and may also include non-volatile memory, such as at least one disk memory.
The program 1710 may be specifically configured to cause the processor 1702 to perform the virtual shooting-based rendering display method described in the above embodiments. For the specific implementation of each step in the program 1710, reference may be made to the corresponding steps and the corresponding descriptions of the units in the embodiments of the rendering display method based on virtual shooting, which are not repeated here. Those skilled in the art will appreciate that, for convenience and brevity of description, the specific working procedures of the apparatus and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments, which are likewise not repeated here.
Based on the methods described in the above embodiments, the present embodiments provide a computer storage medium having a computer program stored thereon, which when executed by a processor, implements the methods described in the above embodiments.
Based on the methods described in the above embodiments, the present application provides a computer program product that, when executed by a processor, implements the methods described in the above embodiments.
It should be noted that, according to implementation requirements, each component/step described in the embodiments of the present application may be split into more components/steps, and two or more components/steps or part of operations of the components/steps may be combined into new components/steps, so as to achieve the purposes of the embodiments of the present application.
The above-described methods according to embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded through a network to be stored in a local recording medium, so that the methods described herein can be processed by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be understood that a computer, processor, microprocessor controller, or programmable hardware includes a storage component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the methods described herein. Further, when a general-purpose computer accesses code for implementing the methods shown herein, execution of that code converts the general-purpose computer into a special-purpose computer for executing those methods.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are only for illustrating the embodiments of the present application, but not for limiting the embodiments of the present application, and various changes and modifications can be made by one skilled in the relevant art without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also fall within the scope of the embodiments of the present application, and the scope of the embodiments of the present application should be defined by the claims.

Claims (11)

1. A rendering display method based on virtual shooting, the method comprising:
obtaining shooting environment information of at least two LED screens for virtual shooting, wherein the shooting environment information comprises: the size and position information of the at least two LED screens;
obtaining a display range of the at least two LED screens projected to a target display screen according to the sizes of the at least two LED screens and the size of the target display screen, wherein the target display screen is used for restoring the at least two LED screens so as to test and adjust a shooting environment of virtual shooting;
establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens;
rendering and displaying images of the LED screens and the virtually shot object in the at least two display areas of the target display screen;
wherein the obtaining of the display range of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen comprises: determining the display range of the at least two LED screens projected to the target display screen according to the proportional relation between the sizes of the at least two LED screens and the size of the target display screen;
and wherein the establishing, in the target display screen, of the at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens comprises: taking a plurality of first feature points from the at least two LED screens, mapping the coordinate values of the plurality of first feature points in the three-dimensional coordinate system of the at least two LED screens to obtain second feature points corresponding to the plurality of first feature points within the display range of the target display screen, and obtaining the at least two display areas corresponding to the at least two LED screens according to the plurality of second feature points.
2. The method of claim 1, wherein the method further comprises:
respectively adjusting the light distribution states of the at least two display areas according to a received first user instruction, so as to simulate the light distribution effect of the virtual shooting.
3. The method of claim 1, wherein the method further comprises:
respectively adjusting the time axes of the at least two display areas according to a received second user instruction, so as to monitor the playback effect of the virtual shooting.
4. The method of claim 1, wherein the method further comprises:
dividing the display area into at least one sub-display area, and respectively performing an image processing operation on each sub-display area.
5. The method of claim 1, wherein the method further comprises:
adding a virtual shooting element and displaying the attribute of the added virtual shooting element, wherein the virtual shooting element is an element used in the virtual shooting.
6. The method of claim 1, wherein the method further comprises:
performing image brightness reduction processing on the part of the display area that corresponds to the joint between the at least two LED screens.
7. The method of claim 1, wherein the method further comprises:
respectively adjusting parameters of a camera used for the virtual shooting according to a received third user instruction.
8. The method of claim 1, wherein the method further comprises:
respectively adjusting parameters of the lights used for the virtual shooting according to a received fourth user instruction.
9. A virtual shooting-based rendering display apparatus, the apparatus comprising:
an information obtaining module, configured to obtain shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the size and position information of the at least two LED screens;
a range determining module, configured to obtain the display range of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen, wherein the target display screen is used to restore the at least two LED screens so as to test and adjust the shooting environment of the virtual shooting;
a region determining module, configured to establish, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens;
and an image display module, configured to render and display images of the LED screens and the virtually shot object in the at least two display areas of the target display screen;
wherein the range determining module is configured to determine the display range of the at least two LED screens projected to the target display screen according to the proportional relation between the sizes of the at least two LED screens and the size of the target display screen;
and the region determining module is configured to take a plurality of first feature points from the at least two LED screens, map the coordinate values of the plurality of first feature points in the three-dimensional coordinate system of the at least two LED screens to obtain second feature points corresponding to the plurality of first feature points within the display range of the target display screen, and obtain the at least two display areas corresponding to the at least two LED screens according to the plurality of second feature points.
10. An electronic device, comprising: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform operations corresponding to the method of any one of claims 1-8.
11. A storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of claims 1-8.
CN202210148247.2A 2022-02-17 2022-02-17 Rendering display method, rendering display device, electronic equipment and storage medium Active CN114520903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210148247.2A CN114520903B (en) 2022-02-17 2022-02-17 Rendering display method, rendering display device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210148247.2A CN114520903B (en) 2022-02-17 2022-02-17 Rendering display method, rendering display device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114520903A CN114520903A (en) 2022-05-20
CN114520903B true CN114520903B (en) 2023-08-08

Family

ID=81598817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210148247.2A Active CN114520903B (en) 2022-02-17 2022-02-17 Rendering display method, rendering display device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114520903B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117440110A (en) * 2023-10-23 2024-01-23 神力视界(深圳)文化科技有限公司 Virtual shooting control method, medium and mobile terminal

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107948466A (en) * 2017-11-23 2018-04-20 北京德火新媒体技术有限公司 A kind of three-dimensional scene construction method and system for video program production
CN109961495A (en) * 2019-04-11 2019-07-02 深圳迪乐普智能科技有限公司 A kind of implementation method and VR editing machine of VR editing machine
CN110806847A (en) * 2019-10-30 2020-02-18 支付宝(杭州)信息技术有限公司 Distributed multi-screen display method, device, equipment and system
CN111766951A (en) * 2020-09-01 2020-10-13 北京七维视觉科技有限公司 Image display method and apparatus, computer system, and computer-readable storage medium
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method
CN113129814A (en) * 2021-04-23 2021-07-16 浙江博采传媒有限公司 Color correction method and system applied to virtual production of LED (light-emitting diode) ring screen
US11107195B1 (en) * 2019-08-23 2021-08-31 Lucasfilm Entertainment Company Ltd. Motion blur and depth of field for immersive content production systems
CN113556443A (en) * 2021-07-20 2021-10-26 北京星光影视设备科技股份有限公司 LED screen real-time correction method facing virtual playing environment and virtual playing system
CN113810612A (en) * 2021-09-17 2021-12-17 上海傲驰广告文化集团有限公司 Analog live-action shooting method and system
CN113905145A (en) * 2021-10-11 2022-01-07 浙江博采传媒有限公司 LED circular screen virtual-real camera focus matching method and system
CN114051129A (en) * 2021-11-09 2022-02-15 北京电影学院 Film virtualization production system and method based on LED background wall

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10496353B2 (en) * 2016-09-29 2019-12-03 Jiang Chang Three-dimensional image formation and color correction system and method
US11580616B2 (en) * 2020-04-29 2023-02-14 Lucasfilm Entertainment Company Ltd. Photogrammetric alignment for immersive content production

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107948466A (en) * 2017-11-23 2018-04-20 北京德火新媒体技术有限公司 A kind of three-dimensional scene construction method and system for video program production
CN109961495A (en) * 2019-04-11 2019-07-02 深圳迪乐普智能科技有限公司 A kind of implementation method and VR editing machine of VR editing machine
US11107195B1 (en) * 2019-08-23 2021-08-31 Lucasfilm Entertainment Company Ltd. Motion blur and depth of field for immersive content production systems
CN110806847A (en) * 2019-10-30 2020-02-18 支付宝(杭州)信息技术有限公司 Distributed multi-screen display method, device, equipment and system
CN111766951A (en) * 2020-09-01 2020-10-13 北京七维视觉科技有限公司 Image display method and apparatus, computer system, and computer-readable storage medium
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method
CN113129814A (en) * 2021-04-23 2021-07-16 浙江博采传媒有限公司 Color correction method and system applied to virtual production of LED (light-emitting diode) ring screen
CN113556443A (en) * 2021-07-20 2021-10-26 北京星光影视设备科技股份有限公司 LED screen real-time correction method facing virtual playing environment and virtual playing system
CN113810612A (en) * 2021-09-17 2021-12-17 上海傲驰广告文化集团有限公司 Analog live-action shooting method and system
CN113905145A (en) * 2021-10-11 2022-01-07 浙江博采传媒有限公司 LED circular screen virtual-real camera focus matching method and system
CN114051129A (en) * 2021-11-09 2022-02-15 北京电影学院 Film virtualization production system and method based on LED background wall

Also Published As

Publication number Publication date
CN114520903A (en) 2022-05-20

Similar Documents

Publication Publication Date Title
US11272165B2 (en) Image processing method and device
US11074755B2 (en) Method, device, terminal device and storage medium for realizing augmented reality image
CN107945112B (en) Panoramic image splicing method and device
US11330172B2 (en) Panoramic image generating method and apparatus
CN108939556B (en) Screenshot method and device based on game platform
JP2011239361A (en) System and method for ar navigation and difference extraction for repeated photographing, and program thereof
CN113223130B (en) Path roaming method, terminal equipment and computer storage medium
CN112634414B (en) Map display method and device
CN108961423B (en) Virtual information processing method, device, equipment and storage medium
CN109840946B (en) Virtual object display method and device
WO2023207963A1 (en) Image processing method and apparatus, electronic device, and storage medium
US20240208419A1 (en) Display method and display system of on-vehicle avm, electronic device, and storage medium
CN114520903B (en) Rendering display method, rendering display device, electronic equipment and storage medium
CN113724391A (en) Three-dimensional model construction method and device, electronic equipment and computer readable medium
US20220044560A1 (en) Roadside sensing method, electronic device, storage medium, and roadside equipment
CN108510541B (en) Information adjusting method, electronic equipment and computer readable storage medium
CN111862240B (en) Panoramic camera and calibration method thereof, panoramic image splicing method and storage medium
CN115908218A (en) Third-view shooting method, device, equipment and storage medium for XR scene
CN111917986A (en) Image processing method, medium thereof, and electronic device
CN115988322A (en) Method and device for generating panoramic image, electronic equipment and storage medium
US20240046554A1 (en) Presenting virtual representation of real space using spatial transformation
CN114066731A (en) Method and device for generating panorama, electronic equipment and storage medium
CN114782611A (en) Image processing method, image processing device, storage medium and electronic equipment
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN112258435A (en) Image processing method and related product

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TA01 Transfer of patent application right

Effective date of registration: 20230804

Address after: Room 602, Building S1, Alibaba Cloud Building, No. 3239 Keyuan Road, Ulan Coast Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518054

Applicant after: Shenli Vision (Shenzhen) Cultural Technology Co.,Ltd.

Address before: Room 508, 5 / F, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: Alibaba (China) Co.,Ltd.

TA01 Transfer of patent application right