CN113810612A - Simulated live-action shooting method and system - Google Patents

Simulated live-action shooting method and system

Info

Publication number
CN113810612A
CN113810612A
Authority
CN
China
Prior art keywords
real
virtual
display screen
camera
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111095033.5A
Other languages
Chinese (zh)
Inventor
崔非
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aochi Advertising Culture Group Co ltd
Original Assignee
Shanghai Aochi Advertising Culture Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aochi Advertising Culture Group Co ltd
Priority to CN202111095033.5A
Publication of CN113810612A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 - Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 - Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • G06T 15/205 - Image-based rendering
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a simulated live-action shooting method and system, relating to the technical field of photography. The method comprises the following steps: replicating the virtual shooting environment in virtual scene creation software; setting the parameters of the virtual display screen and applying them to the real display screen; acquiring, for different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera mounted on the real camera is reflected from the real display screen, and computing spatial three-dimensional coordinates from these reflection positions to complete tracking correction of the virtual camera; calculating the positional relation between the real camera and the real display screen from structured light and synchronizing it to the virtual display screen and the virtual camera; and loading and running the virtual scene on the virtual display screen, the virtual scene creation software projecting a virtual background picture onto the real display screen according to the real-time shooting position and lens information of the real camera. The sense of 'distortion', in which background, environment and figures are difficult to fuse, is avoided, and a vivid live-action shooting effect is achieved in the studio.

Description

Simulated live-action shooting method and system
Technical Field
The invention relates to the technical field of photography, and in particular to a simulated live-action shooting method and system.
Background
During shooting, achieving the expected background picture ordinarily requires live-action shooting with on-location framing. This is not only constrained by the limitations of the actual scene, but also consumes a large amount of manpower, material resources, money and time travelling to the location, and the location cannot guarantee that the background at the time of shooting meets the shooting requirements.
Therefore, to achieve a desired background picture, a technique of compositing a live-action picture image with a figure picture image is generally adopted. That is, background-light parameters and camera parameters are adjusted in front of a blue or green backdrop to simulate the actual picture, and the figure is then shot to acquire a figure picture image. An image of the actual shooting location, or an existing live-action picture image, is then used as the live-action background image for post-processing together with the figure picture image: the figure is first extracted from the figure image and then composited with the live-action image to form the final image, producing the effect of the figure having been shot on location. Because the final image must be synthesized manually from two images, the operation is cumbersome. Moreover, the direction and intensity of the light in the scene where the live-action picture was shot can hardly be made to match those in the scene where the figure picture was shot, so the figure and the background in the composited image are difficult to unify, the image quality is low, a sense of distortion in which the composited image and the environment fail to fuse is unavoidable, and a vivid live-action shooting effect cannot be achieved in the studio.
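For illustration, the extraction step of this prior-art workflow amounts to a chroma key: pixels dominated by the backdrop colour are masked out and replaced by the background image. The following is a minimal numpy sketch of that idea; the threshold value and the float-image convention are assumptions added here, not taken from the patent.

```python
import numpy as np

def chroma_key_composite(person_rgb, background_rgb, green_thresh=0.4):
    # Float images in [0, 1], shape (H, W, 3). A pixel counts as green
    # screen where green dominates both other channels by the threshold.
    r, g, b = person_rgb[..., 0], person_rgb[..., 1], person_rgb[..., 2]
    is_backdrop = (g - np.maximum(r, b)) > green_thresh
    alpha = (~is_backdrop).astype(person_rgb.dtype)[..., None]  # 1 keeps the figure
    return alpha * person_rgb + (1.0 - alpha) * background_rgb
```

This is precisely the step whose lighting mismatch the patent addresses: no keying threshold can reconcile a backdrop lit differently from the intended background.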
Disclosure of Invention
The invention aims to provide a simulated live-action shooting method and system capable of achieving a vivid live-action shooting effect in a studio.
The embodiments of the invention are realized as follows:
In a first aspect, an embodiment of the present application provides a simulated live-action shooting method, comprising the following steps. A virtual shooting environment is replicated in virtual scene creation software from the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises a real display screen and a real camera, and the virtual shooting environment comprises a virtual display screen and a virtual camera. The parameters of the virtual display screen are set and applied to the real display screen, so that the parameters of the real display screen match those of the virtual display screen. The shooting position of the real camera is acquired in real time. For the different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera mounted on the real camera is reflected from the real display screen are acquired, and spatial three-dimensional coordinates are computed from these reflection positions to complete tracking correction of the virtual camera. The positional relation between the real camera and the real display screen is computed from structured light and synchronized to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and virtual display screen. The virtual scene is loaded and run on the virtual display screen, and the virtual scene creation software computes a virtual background picture with the correct perspective relation from the real-time shooting position and lens information of the real camera and projects it onto the real display screen.
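As an illustration only, the six steps can be read as a control loop around the tracked camera. The sketch below assumes hypothetical interfaces (scene_software, tracker, real_screen, real_camera and all their method names); the patent names no concrete engine or API.

```python
# A minimal orchestration sketch of the six claimed steps; every name
# below is an assumption made for illustration.

def simulated_live_action_shot(scene_software, tracker, real_screen, real_camera):
    # Steps 1-2: replicate the environment and push the screen parameters.
    virtual_screen, virtual_camera = scene_software.replicate(real_screen, real_camera)
    real_screen.apply(virtual_screen.parameters)
    # Step 5 (once per setup): spatial calibration via structured light.
    relation = tracker.structured_light_pose(real_camera, real_screen)
    scene_software.synchronize(relation, virtual_screen, virtual_camera)
    while real_camera.is_rolling():
        # Steps 3-4: track the real camera and correct the virtual one.
        position = tracker.shooting_position(real_camera)
        virtual_camera.correct(tracker.reflection_coordinates(real_screen))
        # Step 6: render a perspective-correct background and project it.
        frame = scene_software.render_background(position, real_camera.lens_info())
        real_screen.display(frame)
```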
In some embodiments of the present invention, before the step of projecting the virtual background picture onto the real display screen, the method further comprises: rendering the virtual background picture in real time.
In a second aspect, an embodiment of the present application provides a simulated live-action shooting system, comprising: a virtual scene construction module, configured to replicate a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct the virtual scene, wherein the real-world shooting environment comprises a real display screen and a real camera, and the virtual shooting environment comprises a virtual display screen and a virtual camera; a display screen matching module, configured to set the parameters of the virtual display screen and apply them to the real display screen so that the parameters of the real display screen match those of the virtual display screen; a shooting position acquisition module, configured to acquire the shooting position of the real camera in real time; a virtual camera tracking correction module, configured to acquire, for the different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera is reflected from the real display screen, and to compute spatial three-dimensional coordinates from these reflection positions to complete tracking correction of the virtual camera; a spatial calibration module, configured to calculate the positional relation between the real camera and the real display screen from structured light and to synchronize it to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and virtual display screen; and a virtual background picture projection module, configured to load and run the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with the correct perspective relation from the real-time shooting position and lens information of the real camera and projecting it onto the real display screen.
In some embodiments of the present invention, the simulated live-action shooting system further comprises a rendering module, configured to render the virtual background picture in real time.
In some embodiments of the invention, the simulated live-action shooting system further includes a lighting unit, and the lighting unit includes a plurality of scene lights arranged at the shooting site.
In some embodiments of the present invention, the real display screen is an LED screen.
In some embodiments of the present invention, the simulated live-action shooting system further comprises a track, and the real camera is disposed on the track and can move along it.
In some embodiments of the present invention, a rotating mechanism is disposed below the real camera and is configured to drive the real camera to rotate.
In a third aspect, an embodiment of the present application provides an electronic device comprising a memory for storing one or more programs, and a processor. The one or more programs, when executed by the processor, implement the method according to any one of the first aspect described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the method according to any one of the first aspect described above.
Compared with the prior art, the embodiment of the invention has at least the following advantages or beneficial effects:
The invention provides a simulated live-action shooting method and system comprising the following steps: replicating a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises a real display screen and a real camera, and the virtual shooting environment comprises a virtual display screen and a virtual camera; setting the parameters of the virtual display screen and applying them to the real display screen so that the parameters of the real display screen match those of the virtual display screen; acquiring the shooting position of the real camera in real time; acquiring, for the different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera on the real camera is reflected from the real display screen, and computing spatial three-dimensional coordinates from these reflection positions to complete tracking correction of the virtual camera; calculating the positional relation between the real camera and the real display screen from structured light and synchronizing it to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and virtual display screen; and loading and running the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with the correct perspective relation from the real-time shooting position and lens information of the real camera and projecting it onto the real display screen.
The method and system replicate the virtual display screen in the virtual scene creation software according to the specification and size of the real display screen, set the parameters of the virtual display screen in the software, and apply them to the real display screen to complete its parameter setting, so that the real display screen matches the virtual display screen.
The shooting position of the real camera is acquired in real time and fed back to the virtual scene creation software. When the real camera shoots from different positions, the infrared light emitted by its infrared camera strikes different positions on the real display screen, yielding different reflection positions; tracking correction of the virtual camera is completed from the spatial three-dimensional coordinates computed from these reflection positions, so that the virtual camera matches the real camera and the replication of the virtual camera in the virtual shooting environment is complete. When the infrared camera projects a specific light signal onto the real display screen, the signal is captured by the lens of the real camera. Information such as the position and depth of the real display screen is computed from the change of the light signal, giving the positional relation between the real camera and the real display screen, which is synchronized to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual shooting environment. Thus, when the real camera shoots the real display screen, the virtual camera shoots the virtual display screen synchronously.
The virtual scene is loaded into the virtual display screen and run there. When a user walks in front of the real display screen to be filmed, the real camera follows the user; its shooting position is acquired in real time and its lens information is obtained via its infrared camera. Since the virtual camera shoots the virtual display screen in synchrony with the real camera, a virtual background picture with the correct perspective relation is calculated and projected onto the real display screen for live-action shooting. The sense of 'distortion', in which background, environment and figures are difficult to fuse, is thereby avoided, and a vivid live-action shooting effect is achieved in the studio.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting the scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a flowchart of a simulated live-action shooting method according to an embodiment of the present invention;
fig. 2 is a block diagram of a simulated live-action shooting system according to an embodiment of the present invention;
fig. 3 is a schematic view of a real-world shooting environment according to an embodiment of the present invention;
fig. 4 is a partial enlarged view of portion A in fig. 3;
fig. 5 is a schematic structural block diagram of an electronic device according to an embodiment of the present invention.
Icon: 100-simulated live-action shooting system; 110-virtual scene construction module; 120-display screen matching module; 130-shooting position acquisition module; 140-virtual camera tracking correction module; 150-spatial calibration module; 160-virtual background picture projection module; 1-real camera; 2-scene light; 3-track; 4-rotating mechanism; 5-real display screen; 101-memory; 102-processor; 103-communication interface.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of an element identified by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the description of the present application, it should be noted that if the terms "upper", "lower", "inner", "outer", etc. are used to indicate an orientation or positional relationship based on that shown in the drawings or that the application product is usually placed in use, the description is merely for convenience and simplicity, and it is not intended to indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and therefore should not be construed as limiting the present application.
In the description of the present application, it should also be noted that, unless otherwise explicitly stated or limited, the terms "disposed" and "connected" should be interpreted broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the individual features of the embodiments can be combined with one another without conflict.
Examples
Referring to fig. 1, fig. 3 and fig. 4: fig. 1 is a flowchart illustrating a simulated live-action shooting method according to an embodiment of the present disclosure, fig. 3 is a real-world shooting environment according to the embodiment of the present disclosure, and fig. 4 is a partial enlarged view of portion A in fig. 3. The simulated live-action shooting method comprises the following steps:
s110: repeatedly engraving a virtual shooting environment in virtual scene making software according to the shooting environment in the real world to construct a virtual scene, wherein the shooting environment in the real world comprises a real display screen 5 and a real camera 1, and the virtual shooting environment comprises a virtual display screen and a virtual camera;
illustratively, the virtual display screen in the virtual scene may be duplicated in the virtual scene creation software according to the specification and size of the real display screen 5 described above. The camera tracking system can track the shooting position of the real-world camera 1 in real time and feed the shooting position of the real-world camera 1 back to the virtual scene creation software. When the real camera 1 shoots at different shooting positions, infrared rays emitted by an infrared camera of the real camera 1 irradiate different positions of the real display screen 5 to obtain different reflecting positions on the real display screen 5, and tracking and correction of the virtual camera are completed according to spatial three-dimensional coordinates calculated according to the reflecting positions, so that the virtual camera is matched with the real camera 1, namely the virtual camera and the real camera 1 act consistently.
In the implementation process, the virtual shooting environment is synthesized from the real-world shooting environment using the virtual scene creation software together with a real-time rendering technique, thereby constructing the virtual scene.
S120: setting the parameters of the virtual display screen and applying them to the real display screen 5, so that the parameters of the real display screen 5 match the parameters of the virtual display screen;
specifically, the parameters of the virtual display screen are set in the virtual scene making software, and the parameters of the virtual display screen are put into the real display screen 5, so that the setting of the parameters of the real display screen 5 is completed, and the real display screen 5 is matched with the virtual display screen. So that the virtual camera can shoot the virtual display screen synchronously when the real camera 1 shoots the real display screen 5.
For example, one parameter of the virtual display screen may be the grid layout parameter; a sketch of such a parameter set follows.
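The sketch below shows the kind of parameter record that might be shared between the two screens. Only the grid layout is named by the patent; every other field, and the controller interface, is an assumption added for illustration.

```python
from dataclasses import dataclass

@dataclass
class ScreenParams:
    # Grid layout of LED cabinets (the one parameter the patent names).
    grid_rows: int
    grid_cols: int
    # Illustrative extras a real deployment would also need (assumptions).
    pixel_pitch_mm: float   # distance between LED centres
    width_m: float          # physical screen width
    height_m: float         # physical screen height

def apply_to_real_screen(params: ScreenParams, controller) -> None:
    # 'controller' stands in for whatever interface drives the LED
    # processor; it is hypothetical, not an API defined by the patent.
    controller.configure(grid=(params.grid_rows, params.grid_cols),
                         pitch_mm=params.pixel_pitch_mm,
                         size_m=(params.width_m, params.height_m))
```

Keeping one such record authoritative on the virtual side and pushing it to the hardware is what guarantees that the two screens stay matched.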
S130: acquiring the shooting position of the real camera 1 in real time;
illustratively, the camera tracking system may track the shooting position of the real-world camera 1 in real time and feed back the shooting position of the real-world camera 1 to the virtual scene creation software in real time, thereby acquiring the shooting position of the real-world camera 1.
S140: acquiring, for the different shooting positions of the real camera 1, the positions at which infrared light emitted by the infrared camera on the real camera 1 is reflected from the real display screen 5, and computing spatial three-dimensional coordinates from these reflection positions to complete tracking correction of the virtual camera;
specifically, the real-world camera 1 may be provided with a plurality of infrared cameras. When the real camera 1 shoots at different shooting positions, infrared rays emitted by the infrared camera irradiate different positions on the real display screen 5 to obtain point cloud data, wherein the point cloud data is a set of a group of vectors in a three-dimensional coordinate, a reflection position obtained by scanning the infrared camera is recorded in a point cloud mode, and each point cloud comprises a three-dimensional coordinate, so that a space three-dimensional coordinate is obtained according to the reflection position, and tracking correction of the virtual camera is completed.
S150: calculating the positional relation between the real camera 1 and the real display screen 5 according to the structured light, and synchronizing it to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and the virtual display screen;
the structured light is a system structure including an infrared camera and the real-world camera 1. After the infrared camera projects a specific optical signal to the real display screen 5, the optical signal is collected by the lens of the real camera 1. And calculating information such as the position and the depth of the real display screen 5 according to the change of the optical signal of the real display screen 5, further obtaining the position relation between the real camera 1 and the real display screen 5, and synchronizing the position relation to the virtual display screen and the virtual camera to finish the space calibration of the virtual shooting environment. So that the virtual camera can shoot the virtual display screen synchronously when the real camera 1 shoots the real display screen 5.
S160: loading and running the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with the correct perspective relation according to the real-time shooting position and lens information of the real camera 1, and projecting the virtual background picture onto the real display screen 5.
Specifically, a virtual scene is selected according to the user's requirements and loaded into the virtual display screen. The camera tracking system acquires the shooting position of the real camera 1 in real time, and the lens information of the real camera, including focal length and aperture, is acquired via the infrared camera of the real camera 1. When the real camera 1 shoots the real display screen 5, the virtual camera shoots the virtual display screen synchronously; the perspective relation is thereby computed, a correct virtual background picture is synthesized by real-time rendering, and the virtual background picture is projected onto the real display screen 5. One standard way to obtain such a perspective-correct picture for a flat screen is an off-axis projection, sketched below.
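The patent does not state which projection math its scene software uses; a widely used formulation for rendering a perspective-correct image on a flat screen seen by a tracked camera is Kooima's generalized (off-axis) perspective projection, sketched here. The screen-corner inputs and the OpenGL-style matrix convention are assumptions.

```python
import numpy as np

def off_axis_projection(cam_pos, screen_ll, screen_lr, screen_ul, near, far):
    # cam_pos: tracked camera position; screen_ll/lr/ul: lower-left,
    # lower-right and upper-left screen corners, all in world coordinates.
    vr = screen_lr - screen_ll; vr = vr / np.linalg.norm(vr)  # screen right axis
    vu = screen_ul - screen_ll; vu = vu / np.linalg.norm(vu)  # screen up axis
    vn = np.cross(vr, vu);      vn = vn / np.linalg.norm(vn)  # screen normal
    va = screen_ll - cam_pos                                  # camera to corners
    vb = screen_lr - cam_pos
    vc = screen_ul - cam_pos
    d = -(vn @ va)                                            # camera-screen distance
    l = (vr @ va) * near / d                                  # frustum extents at
    r = (vr @ vb) * near / d                                  # the near plane
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d
    # Standard OpenGL off-axis frustum; the full method additionally
    # rotates the view into the screen's axes and translates by cam_pos.
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0]])
```

As the tracked camera moves, recomputing this frustum every frame keeps the background's perspective locked to the real camera, which is what makes the screen read as a real location on film.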
In the implementation process, the virtual display screen in the virtual scene is replicated in the virtual scene creation software according to the specification and size of the real display screen 5; the parameters of the virtual display screen are set in the software and applied to the real display screen 5, completing the parameter setting of the real display screen 5 so that it matches the virtual display screen. The shooting position of the real camera 1 is acquired in real time and fed back to the virtual scene creation software. When the real camera 1 shoots from different positions, the infrared light emitted by its infrared camera strikes different positions on the real display screen 5, yielding different reflection positions; tracking correction of the virtual camera is completed from the spatial three-dimensional coordinates computed from these reflection positions, so that the virtual camera matches the real camera 1 and the replication of the virtual camera in the virtual shooting environment is complete. When the infrared camera projects a specific light signal onto the real display screen 5, the signal is captured by the lens of the real camera 1. Information such as the position and depth of the real display screen 5 is computed from the change of the light signal, giving the positional relation between the real camera 1 and the real display screen 5, which is synchronized to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual shooting environment. Thus, when the real camera 1 shoots the real display screen 5, the virtual camera shoots the virtual display screen synchronously. The virtual scene is loaded into the virtual display screen and run there; when a user walks in front of the real display screen 5 to be filmed, the real camera 1 follows the user, its shooting position is acquired in real time, and its lens information is obtained via its infrared camera. The sense of 'distortion', in which background, environment and figures are difficult to fuse, is thereby avoided, and a vivid live-action shooting effect is achieved in the studio.
In some embodiments of this embodiment, before the step of projecting the virtual background picture onto the real display screen 5, the method further comprises rendering the virtual background picture in real time. The rendered virtual background picture is closer to the real environment and therefore more realistic, making the live-action shooting effect in the studio more vivid.
Referring to fig. 2, fig. 2 is a block diagram illustrating a simulated live-action shooting system 100 according to an embodiment of the present disclosure. The simulated live-action shooting system 100 comprises: the virtual scene construction module 110, configured to replicate a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises the real display screen 5 and the real camera 1, and the virtual shooting environment comprises the virtual display screen and the virtual camera; the display screen matching module 120, configured to set the parameters of the virtual display screen and apply them to the real display screen 5 so that the parameters of the real display screen 5 match those of the virtual display screen; the shooting position acquisition module 130, configured to acquire the shooting position of the real camera 1 in real time; the virtual camera tracking correction module 140, configured to acquire, for the different shooting positions of the real camera 1, the positions at which the infrared camera of the real camera 1 illuminates the real display screen 5 and the light is reflected, and to compute spatial three-dimensional coordinates from these reflection positions to complete the virtual camera tracking correction; the spatial calibration module 150, configured to calculate the positional relation between the real camera 1 and the real display screen 5 from structured light and to synchronize it to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and virtual display screen; and the virtual background picture projection module 160, configured to load and run the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with the correct perspective relation from the real-time shooting position and lens information of the real camera 1 and projecting it onto the real display screen 5. Specifically, the virtual shooting scene is constructed from the specification and size of the real display screen 5 using a real-time rendering technique in cooperation with the virtual scene creation software; the parameters of the virtual display screen are set in the software and applied to the real display screen 5, completing the parameter setting of the real display screen 5 so that it matches the virtual display screen. The shooting position of the real camera 1 is acquired in real time and fed back to the virtual scene creation software. When the real camera 1 shoots from different positions, the infrared light emitted by its infrared camera strikes different positions on the real display screen 5, yielding different reflection positions; tracking correction of the virtual camera is completed from the spatial three-dimensional coordinates computed from these reflection positions, so that the virtual camera matches the real camera 1 and the replication of the virtual camera in the virtual shooting environment is complete.
When the infrared camera projects a specific light signal onto the real display screen 5, the signal is captured by the lens of the real camera 1. Information such as the position and depth of the real display screen 5 is computed from the change of the light signal, giving the positional relation between the real camera 1 and the real display screen 5, which is synchronized to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual shooting environment. Thus, when the real camera 1 shoots the real display screen 5, the virtual camera shoots the virtual display screen synchronously. The virtual scene is loaded into the virtual display screen and run there; when a user walks in front of the real display screen 5 to be filmed, the real camera 1 follows the user, its shooting position is acquired in real time, and its lens information is obtained via its infrared camera. The sense of 'distortion', in which background, environment and figures are difficult to fuse, is thereby avoided, and a vivid live-action shooting effect is achieved in the studio.
In some embodiments of the present embodiment, the simulated live-action shooting system 100 further comprises a rendering module, configured to render the virtual background picture in real time. The rendered virtual background picture is closer to the real environment and therefore more realistic, making the live-action shooting effect in the studio more vivid.
In some embodiments of the present embodiment, the simulated live-action shooting system 100 further comprises a lighting unit, and the lighting unit comprises a plurality of scene lights 2 arranged at the shooting site. Specifically, the scene lights 2 are arranged precisely at the shooting site so that real lighting effects are reproduced and the shot can be completed in a single take, saving shooting time and the time for post-production image processing.
In some embodiments of the present embodiment, the real display screen 5 is an LED screen. Specifically, the full-coverage LED screen provides the users with realistic environmental shadows and tones.
In some embodiments of the present embodiment, the simulated live-action shooting system 100 further comprises a track 3, and the real camera 1 is disposed on the track 3 and can move along it. Specifically, since the real camera 1 can move freely along the track 3, it is easy to reposition, avoiding the inconvenience of moving the camera by hand.
In some embodiments of the present embodiment, a rotating mechanism 4 is disposed below the real camera 1 and is configured to drive the real camera 1 to rotate. Specifically, the rotating mechanism 4 can rotate the lens of the real camera 1, increasing the flexibility of the lens and giving a wider shooting angle.
Referring to fig. 5, fig. 5 is a schematic structural block diagram of an electronic device according to an embodiment of the present disclosure. The electronic device comprises a memory 101, a processor 102 and a communication interface 103, which are electrically connected to one another, directly or indirectly, to realize data transmission or interaction. For example, the components may be electrically connected via one or more communication buses or signal lines. The memory 101 may be used to store software programs and modules, such as the program instructions/modules corresponding to the simulated live-action shooting system 100 provided in the embodiment of the present application; the processor 102 executes the software programs and modules stored in the memory 101, thereby executing various functional applications and data processing. The communication interface 103 may be used to communicate signaling or data with other node devices.
The memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capability. The processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
It will be appreciated that the configuration shown in fig. 5 is merely illustrative and that the electronic device may include more or fewer components than shown in fig. 5 or have a different configuration than shown in fig. 5. The components shown in fig. 5 may be implemented in hardware, software, or a combination thereof.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portions thereof that substantially contribute to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program code.
To sum up, the simulated live-action shooting method and system provided by the embodiments of the present application comprise the following steps: replicating a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises a real display screen 5 and a real camera 1, and the virtual shooting environment comprises a virtual display screen and a virtual camera; setting the parameters of the virtual display screen and applying them to the real display screen 5 so that the parameters of the real display screen 5 match those of the virtual display screen; acquiring the shooting position of the real camera 1 in real time; acquiring, for the different shooting positions of the real camera 1, the positions at which infrared light emitted by the infrared camera on the real camera 1 is reflected from the real display screen 5, and computing spatial three-dimensional coordinates from these reflection positions to complete tracking correction of the virtual camera; calculating the positional relation between the real camera 1 and the real display screen 5 from structured light and synchronizing it to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and virtual display screen; and loading and running the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with the correct perspective relation from the real-time shooting position and lens information of the real camera 1 and projecting it onto the real display screen 5.
The virtual display screen in the virtual scene is replicated in the virtual scene creation software according to the specification and size of the real display screen 5, and the parameters set for it are applied to the real display screen 5 so that the two screens match. The shooting position of the real camera 1 is acquired in real time and fed back to the virtual scene creation software; the reflection positions of the infrared light on the real display screen 5 yield the spatial three-dimensional coordinates that complete the tracking correction of the virtual camera, so that the virtual camera matches the real camera 1. The structured-light measurements give the positional relation between the real camera 1 and the real display screen 5, which is synchronized to the virtual pair to complete the spatial calibration, so that the virtual camera shoots the virtual display screen in synchrony with the real camera 1. The virtual scene is loaded into the virtual display screen and run there; when a user walks in front of the real display screen 5 to be filmed, the real camera 1 follows the user, its shooting position is acquired in real time, and its lens information is obtained via its infrared camera. The sense of 'distortion', in which background, environment and figures are difficult to fuse, is thereby avoided, and a vivid live-action shooting effect is achieved in the studio.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A simulated live-action shooting method, characterized by comprising the following steps:
replicating a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises a real display screen and a real camera, and the virtual shooting environment comprises a virtual display screen and a virtual camera;
setting parameters of the virtual display screen, and applying the parameters of the virtual display screen to the real display screen so that the parameters of the real display screen match the parameters of the virtual display screen;
acquiring the shooting position of a real camera in real time;
acquiring, for different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera on the real camera is reflected from the real display screen, and computing spatial three-dimensional coordinates from the reflection positions to complete tracking correction of the virtual camera;
calculating the positional relation between the real camera and the real display screen according to the structured light, and synchronizing the positional relation to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and the virtual display screen;
loading and running the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with a correct perspective relation according to the real-time shooting position and lens information of the real camera, and projecting the virtual background picture onto the real display screen.
2. The simulated live-action shooting method according to claim 1, wherein before the step of projecting the virtual background picture onto the real display screen, the method further comprises:
rendering the virtual background picture in real time.
3. A simulated live-action shooting system, characterized by comprising:
a virtual scene construction module, configured to replicate a virtual shooting environment in virtual scene creation software according to the real-world shooting environment to construct a virtual scene, wherein the real-world shooting environment comprises a real display screen and a real camera, and the virtual shooting environment comprises a virtual display screen and a virtual camera;
a display screen matching module, configured to set parameters of the virtual display screen and apply them to the real display screen so that the parameters of the real display screen match the parameters of the virtual display screen;
a shooting position acquisition module, configured to acquire the shooting position of the real camera in real time;
a virtual camera tracking correction module, configured to acquire, for different shooting positions of the real camera, the positions at which infrared light emitted by the infrared camera on the real camera is reflected from the real display screen, and to compute spatial three-dimensional coordinates from the reflection positions to complete tracking correction of the virtual camera;
a spatial calibration module, configured to calculate the positional relation between the real camera and the real display screen according to the structured light and to synchronize the positional relation to the virtual display screen and the virtual camera to complete the spatial calibration of the virtual camera and the virtual display screen;
a virtual background picture projection module, configured to load and run the virtual scene on the virtual display screen, the virtual scene creation software calculating a virtual background picture with a correct perspective relation according to the real-time shooting position and lens information of the real camera and projecting the virtual background picture onto the real display screen.
4. The simulated live-action shooting system according to claim 3, further comprising:
a rendering module, configured to render the virtual background picture in real time.
5. The simulated live-action shooting system according to claim 3, further comprising a lighting unit, the lighting unit comprising a plurality of scene lights arranged at the shooting site.
6. The simulated live-action shooting system according to claim 3, wherein the real display screen is an LED screen.
7. The simulated live-action shooting system according to claim 3, further comprising a track, wherein the real camera is disposed on the track and is movable along the track.
8. The simulated live-action shooting system according to claim 3, wherein a rotating mechanism is arranged below the real camera, and the rotating mechanism is configured to drive the real camera to rotate.
9. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method as recited in claims 1-2.
10. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the method according to any one of claims 1 to 2.
CN202111095033.5A 2021-09-17 2021-09-17 Simulated live-action shooting method and system Pending CN113810612A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111095033.5A CN113810612A (en) 2021-09-17 2021-09-17 Simulated live-action shooting method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111095033.5A CN113810612A (en) 2021-09-17 2021-09-17 Simulated live-action shooting method and system

Publications (1)

Publication Number Publication Date
CN113810612A (en) 2021-12-17

Family

ID=78895855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111095033.5A Pending CN113810612A (en) 2021-09-17 2021-09-17 Simulated live-action shooting method and system

Country Status (1)

Country Link
CN (1) CN113810612A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114222067A (en) * 2022-01-05 2022-03-22 广州博冠信息科技有限公司 Scene shooting method and device, storage medium and electronic equipment
CN114520903A (en) * 2022-02-17 2022-05-20 阿里巴巴(中国)有限公司 Rendering display method, device, storage medium and computer program product
CN115460395A (en) * 2022-06-24 2022-12-09 北京电影学院 Camera registration tracking method based on LED background wall time-sharing multiplexing
CN116260956A (en) * 2023-05-15 2023-06-13 四川中绳矩阵技术发展有限公司 Virtual reality shooting method and system
CN116320363A (en) * 2023-05-25 2023-06-23 四川中绳矩阵技术发展有限公司 Multi-angle virtual reality shooting method and system
CN117527995A (en) * 2023-11-06 2024-02-06 中影电影数字制作基地有限公司 Simulated live-action shooting method and system based on space simulated shooting
CN117528237A (en) * 2023-11-01 2024-02-06 神力视界(深圳)文化科技有限公司 Adjustment method and device for virtual camera
EP4378756A1 (en) * 2022-12-01 2024-06-05 Steffen Gero Mobile image capture studio and method of capturing images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296683A (en) * 2016-08-09 2017-01-04 深圳迪乐普数码科技有限公司 A kind of generation method of virtual screen curtain wall and terminal
CN110675348A (en) * 2019-09-30 2020-01-10 杭州栖金科技有限公司 Augmented reality image display method and device and image processing equipment
CN111766951A (en) * 2020-09-01 2020-10-13 北京七维视觉科技有限公司 Image display method and apparatus, computer system, and computer-readable storage medium
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method
CN112311965A (en) * 2020-10-22 2021-02-02 北京虚拟动点科技有限公司 Virtual shooting method, device, system and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296683A (en) * 2016-08-09 2017-01-04 深圳迪乐普数码科技有限公司 A kind of generation method of virtual screen curtain wall and terminal
CN110675348A (en) * 2019-09-30 2020-01-10 杭州栖金科技有限公司 Augmented reality image display method and device and image processing equipment
CN111766951A (en) * 2020-09-01 2020-10-13 北京七维视觉科技有限公司 Image display method and apparatus, computer system, and computer-readable storage medium
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method
CN112311965A (en) * 2020-10-22 2021-02-02 北京虚拟动点科技有限公司 Virtual shooting method, device, system and storage medium

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114222067A (en) * 2022-01-05 2022-03-22 广州博冠信息科技有限公司 Scene shooting method and device, storage medium and electronic equipment
CN114222067B (en) * 2022-01-05 2024-04-26 广州博冠信息科技有限公司 Scene shooting method and device, storage medium and electronic equipment
CN114520903A (en) * 2022-02-17 2022-05-20 阿里巴巴(中国)有限公司 Rendering display method, device, storage medium and computer program product
CN114520903B (en) * 2022-02-17 2023-08-08 阿里巴巴(中国)有限公司 Rendering display method, rendering display device, electronic equipment and storage medium
CN115460395A (en) * 2022-06-24 2022-12-09 北京电影学院 Camera registration tracking method based on LED background wall time-sharing multiplexing
EP4378756A1 (en) * 2022-12-01 2024-06-05 Steffen Gero Mobile image capture studio and method of capturing images
CN116260956A (en) * 2023-05-15 2023-06-13 四川中绳矩阵技术发展有限公司 Virtual reality shooting method and system
CN116320363A (en) * 2023-05-25 2023-06-23 四川中绳矩阵技术发展有限公司 Multi-angle virtual reality shooting method and system
CN117528237A (en) * 2023-11-01 2024-02-06 神力视界(深圳)文化科技有限公司 Adjustment method and device for virtual camera
CN117527995A (en) * 2023-11-06 2024-02-06 中影电影数字制作基地有限公司 Simulated live-action shooting method and system based on space simulated shooting

Similar Documents

Publication Publication Date Title
CN113810612A (en) Simulated live-action shooting method and system
CN103945210B (en) A kind of multi-cam image pickup method realizing shallow Deep Canvas
CN109658365B (en) Image processing method, device, system and storage medium
US20180293774A1 (en) Three dimensional acquisition and rendering
Alexander et al. The digital emily project: Achieving a photorealistic digital actor
CN106027855B (en) A kind of implementation method and terminal of virtual rocker arm
US10275898B1 (en) Wedge-based light-field video capture
CN110517355A (en) Environment for illuminating mixed reality object synthesizes
CN108470370A (en) The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds
CN108765542A (en) Image rendering method, electronic equipment and computer readable storage medium
US11425283B1 (en) Blending real and virtual focus in a virtual display environment
CN113436343A (en) Picture generation method and device for virtual studio, medium and electronic equipment
US11526067B2 (en) Lighting assembly for producing realistic photo images
US11048925B1 (en) Active marker device for performance capture
CN110090437A (en) Video acquiring method, device, electronic equipment and storage medium
Gu et al. 3dunderworld-sls: an open-source structured-light scanning system for rapid geometry acquisition
WO2015174885A1 (en) Method for constructing a three-dimensional color image and device for the implementation thereof
CN108460824A (en) The determination method, apparatus and system of three-dimensional multimedia messages
CN109446945A (en) Threedimensional model treating method and apparatus, electronic equipment, computer readable storage medium
CN115222927A (en) Stepping virtual roaming scene construction method
Stork et al. Image analysis of paintings by computer graphics synthesis: an investigation of the illumination in Georges de la Tour's Christ in the carpenter's studio
CN112562057B (en) Three-dimensional reconstruction system and method
Boe et al. Human Photogrammetry: Foundational Techniques for Creative Practitioners
KR20230022153A (en) Single-image 3D photo with soft layering and depth-aware restoration
CN113327329A (en) Indoor projection method, device and system based on three-dimensional model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination