CN114257705A - Virtual shooting technology specific implementation method based on unreal engine - Google Patents

Virtual shooting technology specific implementation method based on unreal engine

Info

Publication number
CN114257705A
CN114257705A
Authority
CN
China
Prior art keywords
virtual
engine
shooting
implementation method
base station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111525292.7A
Other languages
Chinese (zh)
Inventor
温訸
许震雷
杨正淏
肖彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Special Effects Xiaoge Culture Communication Co ltd
Original Assignee
Hangzhou Special Effects Xiaoge Culture Communication Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Special Effects Xiaoge Culture Communication Co ltd filed Critical Hangzhou Special Effects Xiaoge Culture Communication Co ltd
Priority to CN202111525292.7A
Publication of CN114257705A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224: Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/2228: Video assist systems used in motion picture production, e.g. video cameras connected to viewfinders of motion picture cameras or related video signal processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of virtual shooting and discloses an implementation method of virtual shooting technology based on the Unreal Engine, aimed at productions such as web films and advertisements that need a low-cost, stable, and reliable virtual shooting workflow. The method comprises erecting VR locators and a camera fitted with a tracker, calibrating and dividing the virtual space, configuring a virtual camera, and starting virtual shooting. The background requires only an ordinary stage-grade LED wall, so the original shooting cost is not significantly increased; the VR equipment is based on mass-market consumer products, which, compared with professional products, can be serviced and replaced far more quickly when problems arise; and the Unreal Engine is simpler to operate than other scene-hosting platforms, giving the method higher operability. By combining these three advantages, the method is a solution with wide applicability for small and medium-sized shooting teams.

Description

Virtual shooting technology specific implementation method based on unreal engine
Technical Field
The invention relates to the technical field of virtual shooting, and in particular to a specific implementation method of virtual shooting technology based on the Unreal Engine.
Background
The application of real-time ray tracing (RTX) technology in game engines is becoming increasingly widespread, and virtual shooting technology built on RTX-capable engines has emerged accordingly. Virtual shooting, also known as virtual production, is a shooting scheme that exploits the powerful, fast rendering capability of a game engine: the shooting background is created in the game engine, the shooting environment is built from an LED wall, and VR equipment synchronizes the motion trajectories of the real camera and the virtual camera, so that footage with effects-made backgrounds can be shot rapidly.
The virtual shooting technologies currently implemented around the world on the basis of this theory and scheme are very diverse. Large-scale technical schemes impose usage conditions that exclude small and medium-sized teams and individuals, while small-scale shooting setups, limited by the size of their development teams or for other reasons, struggle with software bugs and equipment limitations and rarely achieve a stable, reliable technical scheme.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a specific implementation method of virtual shooting technology based on the Unreal Engine. It has the advantages of low cost, reliable technology, rapid problem resolution, and high operability, and it solves both the problem that large-scale technical schemes exclude small and medium-sized teams and individuals, and the problem that small-scale shooting setups, limited by the size of their development teams or for other reasons, struggle with software bugs and equipment limitations.
In order to achieve this purpose, the invention provides the following technical scheme: an implementation method of virtual shooting technology based on the Unreal Engine, comprising the following steps. Step S1: erecting VR locators and a camera with a tracker;
First, a central origin of the performance area is established. The origin is both the center of the performance area and the world-coordinate origin of the virtual world in the Unreal Engine; once established, it must not be changed. Locator base stations are arranged around the performance area, ensuring that the performance area remains within their effective distance. The detection angle of a single locator base station is 120 degrees, so when a base station is mounted high it should be tilted downward by 35 to 45 degrees. The base stations must be installed stably; once calibrated and in use, they must not be moved or shaken, and all base stations should come from the same manufacturer. Finally, a camera with a tracker is erected at the shooting site;
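The preferred locator layout stated later in this disclosure, locators evenly distributed on an arc of radius 3 to 4.5 meters around the origin with the distribution distance kept near 4.5 meters, can be sketched numerically. The snippet below is an illustrative sketch only; the function names are ours, not the patent's.

```python
import math

def arc_layout(n_locators: int, radius_m: float):
    """Evenly distribute locators on a circle around the stage origin.

    Returns a list of (x, y) positions in metres, with the origin at the
    centre of the performance area.
    """
    positions = []
    for i in range(n_locators):
        theta = 2 * math.pi * i / n_locators
        positions.append((radius_m * math.cos(theta),
                          radius_m * math.sin(theta)))
    return positions

def neighbour_spacing(positions):
    """Chord distance between two adjacent locators on the circle."""
    (x0, y0), (x1, y1) = positions[0], positions[1]
    return math.hypot(x1 - x0, y1 - y0)

# Six locators on a 4.5 m radius form a regular hexagon, so the adjacent
# spacing equals the radius: exactly the 4.5 m distribution distance
# that the maximum 5 m effective range allows.
layout = arc_layout(6, 4.5)
print(round(neighbour_spacing(layout), 2))  # 4.5
```

With fewer locators on the same radius the chord grows beyond 4.5 m, which is one way to see why the radius and locator count trade off against the base stations' 5-meter limit.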
step S2: establishing space calibration and division;
calibrating and dividing the space according to the designated steps of SteamVR, and after the space is determined, the head-mounted display should be placed at a shooting site and is ensured to be observed by any locator base station;
step S3: a virtual camera configuration within the illusion engine;
the virtual engine is internally provided with a template project of a virtual camera, the template is utilized to build a proper virtual camera, a self-defined game mode is created, a new Pann class is used as a Pann class of the self-defined game mode, a self-defined game controller is newly built, parameters of the controller are set to be proper operation values, the plane size of a canvas of the virtual camera template is modified, the size of the canvas plane is completely consistent with the size and the radian angle of an LED screen of a shooting site, a display cluster configuration file attached to the template is modified, default parameters are set as LED resolution parameters, the resolution parameter size can be adjusted according to the proportion by the performance of an operation server, LIVELLINK is started, LIVELLINKXR is added as a source, a control handle or a tracker of VR is linked, the linkage is confirmed to be effective and real-time in the engine, if phenomena of blocking, frame dropping and the like occur, the post-processing effect in the engine can be reduced or the size and the resolution of the capture range of the virtual camera can be modified, after the virtual camera is set, the parameter value of World Transform under a livelink parameter panel is canceled, and the required scene level is used as the default loading level of the engine project;
step S4: starting virtual shooting;
the nDisplayLauter is started from the ghost Engine installation location \ UE _4.26\ Engine \ Binaries \ DotNET path. And respectively loading the illusion engine engineering file and the display cluster configuration file of the Config Files, and clicking run to run.
Preferably, in step S1, the maximum effective distance of a locator base station cannot exceed 5 meters, and the distribution distance between locators is controlled at around 4.5 meters.
Preferably, in step S1, the locators are evenly distributed on an arc with a radius of 3 to 4.5 meters, centered on the central origin.
Preferably, the locators in step S1 are arranged on both sides of the performance area, forming a 3-meter by 3-meter rectangle.
Preferably, in step S4, attention should be paid during shooting to the receiving distance of the Bluetooth signal receiver of the VR tracking device, keeping the receiver close enough to the tracker, and the head-mounted display of the VR equipment should be kept running at the shooting site throughout the shoot.
Preferably, if stuttering or frame drops occur in step S3, the post-processing effects in the engine may be reduced, or the size and resolution of the virtual camera's capture range, that is, the two parameter values FOV Multiplier and Resolution Multiplier, may be modified.
In summary, the invention includes at least one of the following advantages:
1. With the specific implementation method of virtual shooting technology based on the Unreal Engine, the background during shooting requires only an ordinary stage-grade LED wall, so the original shooting cost is not increased and the overall cost of the scheme is reduced.
2. The VR equipment used by the method is based on mass-market consumer products, which, compared with professional products, can be repaired or replaced far more quickly when problems arise.
3. The Unreal Engine is simpler to operate than other scene-hosting platforms, giving the method higher operability.
Drawings
FIG. 1 is a schematic flow diagram of the present invention;
fig. 2 is a schematic top view of a field layout according to an embodiment of the present invention.
Wherein: s1, erecting a VR positioner and a camera with a tracker; s2, establishing space calibration and division; s3, virtual camera configuration within the illusion engine; and S4, starting virtual shooting.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the implementation method of virtual shooting technology based on the Unreal Engine includes four steps; the details of each step are as follows.
Step S1: First, a central origin of the performance area is established. The origin is both the center of the performance area and the world-coordinate origin of the virtual world in the Unreal Engine; once established, it cannot be changed. Locators are arranged around the performance area and must stay within effective distance: the maximum effective distance of a locator base station cannot exceed 5 meters, so the distribution distance is preferably controlled at around 4.5 meters. The preferred arrangement distributes the locators evenly on an arc of radius 3 to 4.5 meters centered on the origin; alternatively, the locators can be arranged in two opposing rows forming a 3-meter by 3-meter rectangle. The locator base stations should be mounted sufficiently high, preferably at about 3 meters, must not be walked through during shooting, and must remain unobstructed and within effective detection distance. The detection angle of a single locator base station is 120 degrees, so a base station set up high should be tilted downward by 35 to 45 degrees. The base stations can also be distributed in a square, but to avoid occlusion during shooting, even with more than four base stations the space should still be kept at 10 meters by 10 meters. The base stations must be installed stably; once calibrated and in use, they must not be moved or shaken. All base stations must come from a single manufacturer, because locators produced by different vendors do not coordinate with each other. Finally, a camera with a tracker is installed at the shooting site;
Step S2: Calibrate and divide the space according to the steps specified by SteamVR. After the space is determined, the head-mounted display should be placed at the shooting site and remain observable by a locator base station. To ensure a real-time link for the VR equipment and to avoid interference from the complex wireless signals at a shooting site, the Bluetooth signal receiver of the tracker should be extended from the operating server with an extension cable and placed as close to the tracker as possible;
Step S3: The Unreal Engine provides a virtual camera template project with which a suitable virtual camera can be built quickly. Create a custom Game Mode, create a new Pawn class as the Pawn class of the custom Game Mode, create a custom game controller, and set its parameters to suitable operating values. Modify the canvas plane size of the virtual camera template to match the size and arc angle of the LED screen at the shooting site exactly. Modify the display cluster configuration file supplied with the template, setting the default parameters to the LED resolution; the resolution may be scaled proportionally according to the performance of the operating server. Start LiveLink, add LiveLinkXR as a source, link the VR control handle or tracker, and confirm in the engine that the link is effective in real time; if stuttering or frame drops occur, reduce the post-processing effects in the engine or modify the size and resolution of the virtual camera's capture range. After the virtual camera is set up, clear the World Transform parameter value in the LiveLink parameter panel, and set the required scene level as the default loading level of the engine project;
Step S4: Start the nDisplayLauncher from the \UE_4.26\Engine\Binaries\DotNET path under the Unreal Engine installation location, load the Unreal Engine project file and the display cluster configuration file under Config Files respectively, and click Run.
It should be noted that the receiving distance of the Bluetooth signal receiver of the VR tracking device should be watched during shooting, keeping the receiver close enough to the tracker, and that the head-mounted display of the VR equipment should remain running at the shooting site throughout the shoot.
Referring to fig. 2, owing to site limitations, the LED wall is 15 meters long and 7.5 meters high. The VR locators are arranged in 2 rows perpendicular to the LED wall, to its left and right; the 2 rows of 3 VR locators face each other at a distance of 3 meters. The locators are mounted on brackets 2 meters high and tilted downward at about 40 degrees. After the space is defined, the head-mounted display is placed under the locator bracket farthest from the LED wall, and the Bluetooth signal receiver is extended from the operating computer on an extension cable to a position under the middle locator bracket to ensure normal tracking;
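The embodiment's mounting numbers can be sanity-checked against the base-station range limit given earlier. The sketch below is illustrative only and the function name is ours: for a base station on a 2-meter bracket tilted down about 40 degrees, it computes where the optical axis meets the floor and the slant distance along that axis, which should stay well inside the 5-meter maximum effective distance.

```python
import math

def axis_ground_hit(height_m: float, tilt_deg: float):
    """Where the tilted optical axis of a base station meets the ground.

    Returns (forward reach along the floor, line-of-sight slant distance),
    both in metres.
    """
    t = math.radians(tilt_deg)
    horizontal = height_m / math.tan(t)  # forward reach of the axis
    slant = height_m / math.sin(t)       # line-of-sight distance
    return horizontal, slant

# 2 m bracket, ~40 degree down-tilt as in the embodiment.
horiz, slant = axis_ground_hit(2.0, 40.0)
print(round(horiz, 2), round(slant, 2))  # 2.38 3.11
```

The 3.11 m slant distance is comfortably under the 5 m limit, consistent with the 3-meter spacing between the two facing rows of locators.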
It should be noted that this specific arrangement is based on Unreal Engine version 4.26, the Pro version of the Vive SteamVR equipment, and motion tracker version 2.0;
In the configuration stage of the virtual camera, in the operating parameters of the custom controller, the "Input Yaw Scale" parameter is changed to 0.5, "Input Pitch Scale" and "Input Roll Scale" are both changed to 0, and "Smooth Target View Rotation Speed" is set to 20. The native pixel resolution of the on-site LED wall is 4068 x 2688; limited by the hardware performance of the on-site computer, the resolution is changed to 2304 x 896 in the display cluster configuration file. Because this is lower than the resolution of the LED wall, the capture range parameters of the virtual camera are modified: "FOV Multiplier" is set to 3.3 and "Resolution Multiplier" to 0.5. The nDisplayLauncher is started from \EpicGames\UE_4.26\Engine\Binaries\DotNET on the computer's C drive, the modified files are loaded, and normal shooting begins after clicking Run;
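The pixel-load saving behind the embodiment's display-cluster change can be computed directly. This check is ours, not part of the patent:

```python
# Native LED wall resolution vs. the reduced display-cluster resolution
# chosen in the embodiment because of on-site hardware limits.
native_pixels = 4068 * 2688
reduced_pixels = 2304 * 896

load_ratio = reduced_pixels / native_pixels
print(round(load_ratio, 2))  # 0.19
```

The cluster renders only about 19% of the native pixel count, which is what makes real-time operation feasible on the on-site computer; the FOV Multiplier and Resolution Multiplier then compensate for the mismatched capture range.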
During shooting, the VR head-mounted display is used regularly to check whether the locators have been moved and whether the virtual-space ground plane remains consistent with the ground plane of the shooting site.
The specific implementation method of virtual shooting technology based on the Unreal Engine provided by the embodiment of the invention is based on mass-market consumer VR equipment, requires only an ordinary stage-grade LED wall, does not increase the original shooting cost, involves only simple programming, is easy to operate, and is broadly suitable for the usage conditions of small teams and individuals.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. An implementation method of virtual shooting technology based on the Unreal Engine, characterized by comprising the following steps:
step S1: erecting a VR positioner and a camera with a tracker;
A. First, a central origin of the performance area is established; the origin is both the center of the performance area and the world-coordinate origin of the virtual world in the Unreal Engine, and once established it cannot be changed; locators are arranged around the performance area and kept within the effective distance of the locator base stations;
B. The detection angle of a single locator base station is 120 degrees, so a base station set up high must be tilted downward by 35 to 45 degrees; the locator base stations must be installed stably, and once calibrated and in use they must not be moved or shaken;
C. a camera with a tracker is erected on a shooting site;
step S2: establishing space calibration and division;
calibrating and dividing the space according to the designated steps of SteamVR, and after the space is determined, the head-mounted display should be placed at a shooting site and is ensured to be observed by any locator base station;
step S3: virtual camera configuration within the Unreal Engine;
A. The Unreal Engine provides a virtual camera template project, and a suitable virtual camera is built using the template;
B. A custom Game Mode is created, a new Pawn class is created as the Pawn class of the custom Game Mode, a custom game controller is created, and the controller's parameters are set to suitable operating values;
C. The canvas plane size of the virtual camera template is modified to match the size and arc angle of the on-site LED screen exactly;
D. The display cluster configuration file supplied with the template is modified, setting the default parameters to the LED resolution; the resolution may be scaled proportionally according to the performance of the operating server;
E. LiveLink is started, LiveLinkXR is added as a source, the VR control handle or tracker is linked, and the link is confirmed in the engine to be effective in real time;
F. After the virtual camera is set up, the World Transform parameter value in the LiveLink parameter panel is cleared;
G. taking the required scene level as a default loading level of the engine project;
step S4: starting virtual shooting;
Start the nDisplayLauncher from the \UE_4.26\Engine\Binaries\DotNET path under the Unreal Engine installation location, load the Unreal Engine project file and the display cluster configuration file under Config Files respectively, and click Run.
2. The implementation method of virtual shooting technology based on the Unreal Engine of claim 1, wherein: in step S1, the maximum effective distance of a locator base station cannot exceed 5 meters, and the distribution distance between locators is controlled at 4.5 meters.
3. The implementation method of virtual shooting technology based on the Unreal Engine of claim 1, wherein: in step S1, the locators are evenly distributed on an arc with a radius of 3 to 4.5 meters, centered on the central origin.
4. The implementation method of virtual shooting technology based on the Unreal Engine of claim 1, wherein: in step S1, the locators are arranged on both sides of the performance area, forming a 3-meter by 3-meter rectangle.
5. The implementation method of virtual shooting technology based on the Unreal Engine of claim 1, wherein: in step S4, the receiving distance of the Bluetooth signal receiver of the VR tracking device is controlled during shooting, and the head-mounted display of the VR equipment is kept running at the shooting site throughout the shoot.
6. The implementation method of virtual shooting technology based on the Unreal Engine of claim 1, wherein: in step S3, if stuttering or frame drops occur, the post-processing effects in the engine are reduced, or the size and resolution of the virtual camera's capture range, that is, the two parameter values FOV Multiplier and Resolution Multiplier, are modified.
CN202111525292.7A 2021-12-14 2021-12-14 Virtual shooting technology specific implementation method based on unreal engine Pending CN114257705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111525292.7A CN114257705A (en) 2021-12-14 2021-12-14 Virtual shooting technology specific implementation method based on unreal engine


Publications (1)

Publication Number Publication Date
CN114257705A true CN114257705A (en) 2022-03-29

Family

ID=80795032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111525292.7A Pending CN114257705A (en) 2021-12-14 2021-12-14 Virtual shooting technology specific implementation method based on unreal engine

Country Status (1)

Country Link
CN (1) CN114257705A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102243064A (en) * 2011-04-22 2011-11-16 深圳迪乐普数码科技有限公司 Determination system and method for position and posture of video camera in virtual studio
US20130344916A1 (en) * 2009-06-10 2013-12-26 Digimarc Corporation Content sharing methods and systems
KR20150114103A (en) * 2014-03-31 2015-10-12 우덕명 3D Real-Time Virtual Stereo Studio System And Method For Producting Virtual Stereo Studio Image In Real-Time Virtual Stereo Studio System
WO2020125839A1 (en) * 2018-12-18 2020-06-25 GRID INVENT gGmbH Electronic element and electrically controlled display element
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method
CN113628314A (en) * 2021-08-30 2021-11-09 中国人民解放军国防科技大学 Visualization method, device and equipment for photographic measurement model in illusion engine
WO2021243313A1 (en) * 2020-05-29 2021-12-02 Medtronic, Inc. Extended reality (xr) applications for cardiac blood flow procedures


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Jun (陈军): "Integration and Development of a Film Virtual Production System Based on an LED Background Wall", Journal of Beijing Film Academy, pages 114-120 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination