CN114222067A - Scene shooting method and device, storage medium and electronic equipment - Google Patents

Scene shooting method and device, storage medium and electronic equipment

Info

Publication number
CN114222067A
Authority
CN
China
Prior art keywords
real
virtual
shooting
equipment
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210005639.3A
Other languages
Chinese (zh)
Other versions
CN114222067B (en)
Inventor
王毅
刘旺
谭建明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202210005639.3A priority Critical patent/CN114222067B/en
Publication of CN114222067A publication Critical patent/CN114222067A/en
Application granted granted Critical
Publication of CN114222067B publication Critical patent/CN114222067B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16MFRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00Stands or trestles as supports for apparatus or articles placed thereon ; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02Heads
    • F16M11/04Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/043Allowing translations
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16MFRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00Stands or trestles as supports for apparatus or articles placed thereon ; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02Heads
    • F16M11/04Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/06Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16MFRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00Stands or trestles as supports for apparatus or articles placed thereon ; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02Heads
    • F16M11/18Heads with mechanism for moving the apparatus relatively to the stand
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure provides a scene shooting method, a scene shooting device, a storage medium, and electronic equipment, and relates to the field of computer technologies. The scene shooting method comprises the following steps: acquiring equipment parameters of real shooting equipment in a real scene, and determining, based on the equipment parameters of the real shooting equipment, a motion range of virtual shooting equipment corresponding to the real shooting equipment in a virtual scene; in response to a control operation on the virtual shooting equipment within the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting, and acquiring a motion track of the virtual shooting equipment; and determining data to be moved of the real shooting equipment according to the motion track of the virtual shooting equipment, and controlling the real shooting equipment to carry out motion shooting in the real scene based on the data to be moved. The method and the device can effectively improve the convenience of shooting a virtual scene and a real scene synchronously, and can reduce the shooting cost.

Description

Scene shooting method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a scene shooting method, a scene shooting apparatus, a computer-readable storage medium, and an electronic device.
Background
To improve the visual effect of shot footage, compositing virtual scenes with real scenes has become increasingly common. In this compositing process, synchronizing the picture shot in the virtual scene with the picture shot in the real scene is a critically important step.
In the related art, a device with a positioning function is needed to acquire motion data of the real shooting equipment, so that the motion can be converted into a corresponding motion amplitude of the virtual shooting equipment in the virtual scene and the shooting perspective angles of the virtual scene and the real scene can be synchronized. In this approach, however, a dedicated camera operator must control the real shooting equipment, which makes synchronous shooting difficult and costly.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a scene shooting method, a scene shooting device, a computer-readable storage medium, and an electronic device, thereby solving, at least to a certain extent, the problems of high difficulty and high cost of synchronous shooting in the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a scene shooting method, the method comprising: acquiring equipment parameters of real shooting equipment in a real scene, and determining, based on the equipment parameters of the real shooting equipment, a motion range of virtual shooting equipment corresponding to the real shooting equipment in a virtual scene; in response to a control operation on the virtual shooting equipment within the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting, and acquiring a motion track of the virtual shooting equipment; and determining data to be moved of the real shooting equipment according to the motion track of the virtual shooting equipment, and controlling the real shooting equipment to carry out motion shooting in the real scene based on the data to be moved.
In an exemplary embodiment of the present disclosure, the determining a motion range of a virtual camera corresponding to the real camera in a virtual scene based on device parameters of the real camera includes: determining a motion range of the real photographing apparatus based on the apparatus parameters of the real photographing apparatus; and mapping the motion range of the real shooting equipment to the virtual scene according to the space mapping relation between the virtual scene and the real scene so as to determine the motion range of the virtual shooting equipment.
In an exemplary embodiment of the present disclosure, the controlling the virtual photographing apparatus to perform motion photographing and acquiring a motion trajectory of the virtual photographing apparatus in response to a control operation on the virtual photographing apparatus within a motion range of the virtual photographing apparatus includes: determining a target mirror moving arrangement form of the virtual shooting equipment from a preset mirror moving library according to the motion range of the virtual shooting equipment; and responding to the mirror moving trigger operation of the target mirror moving arrangement form, controlling the virtual shooting equipment to carry out motion shooting, and acquiring a motion track of the virtual shooting equipment.
In an exemplary embodiment of the present disclosure, the determining data to be moved of the real photographing apparatus according to the motion trajectory of the virtual photographing apparatus includes: and mapping the motion trail of the virtual shooting equipment to the real scene according to the spatial mapping relation between the virtual scene and the real scene so as to determine the data to be moved of the real shooting equipment.
In an exemplary embodiment of the present disclosure, the mapping, according to a spatial mapping relationship between the virtual scene and the real scene, a motion trajectory of the virtual shooting device into the real scene to determine data to be moved of the real shooting device includes: performing three-dimensional decomposition on the motion track of the virtual shooting equipment in a virtual scene to obtain three-dimensional decomposition data; and determining the data to be moved of the real shooting equipment in the real scene according to the three-dimensional decomposition data.
In an exemplary embodiment of the present disclosure, before determining data to be moved of the real photographing apparatus in the real scene according to the three-dimensional decomposition data, the method further includes: and in response to the updating operation of the three-dimensional decomposition data, updating the three-dimensional decomposition data, and controlling the virtual shooting equipment to carry out motion shooting based on the updated three-dimensional decomposition data.
In an exemplary embodiment of the present disclosure, the three-dimensional decomposition data includes a movement distance and a movement speed of the virtual photographing apparatus in each dimension in the virtual scene; the updating the three-dimensional decomposition data in response to the update operation on the three-dimensional decomposition data includes: and updating the movement distance or the movement speed of the virtual shooting device in any dimension in response to the adjustment operation of the movement distance or the movement speed of the virtual shooting device in any dimension.
In an exemplary embodiment of the present disclosure, the controlling the real shooting device to perform motion shooting in the real scene based on the data to be moved includes: and sending the data to be moved to the real shooting equipment by calling a data interface of the real shooting equipment so that the real shooting equipment carries out movement shooting based on the data to be moved.
In an exemplary embodiment of the present disclosure, the method further comprises: and calling an equipment driving library according to the equipment parameters of the real shooting equipment so as to be connected with the real shooting equipment.
According to a second aspect of the present disclosure, there is provided a scene capture apparatus, the apparatus comprising: the parameter acquisition module is used for acquiring equipment parameters of real shooting equipment in a real scene and determining the motion range of virtual shooting equipment corresponding to the real shooting equipment in a virtual scene based on the equipment parameters of the real shooting equipment; the first shooting module is used for responding to control operation on the virtual shooting equipment in the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting and acquiring a motion track of the virtual shooting equipment; and the second shooting module is used for determining the data to be moved of the real shooting equipment according to the motion track of the virtual shooting equipment and controlling the real shooting equipment to carry out motion shooting in the real scene based on the data to be moved.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described scene capturing method.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above-described scene capture method via execution of the executable instructions.
The technical scheme of the disclosure has the following beneficial effects:
in the scene shooting process, on the one hand, the motion track of the virtual shooting equipment is converted into data to be moved for the real shooting equipment, and the real shooting equipment is controlled to carry out motion shooting based on that data, so that the motion tracks of the virtual shooting equipment and the real shooting equipment are synchronized; the pictures shot by the two can therefore achieve synchronized perspective angles, which expands the virtual visual experience and meets the shooting requirements of large-range camera moves without increasing the shooting difficulty. On the other hand, because the motion shooting of the real shooting equipment is driven by the converted data to be moved, shooting with the real shooting equipment is automated: no dedicated camera operator is required to control the real shooting equipment, and only the virtual shooting equipment needs to be controlled, which greatly reduces the shooting difficulty and the shooting cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is apparent that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings can be obtained from those drawings without inventive effort for a person skilled in the art.
FIG. 1 illustrates a system architecture diagram of an environment in which the present exemplary embodiment operates;
fig. 2 shows a flowchart of a scene capture method in the present exemplary embodiment;
fig. 3 shows a flowchart for determining the range of motion of the virtual camera in the present exemplary embodiment;
fig. 4 shows a flowchart of controlling the virtual photographing apparatus to perform motion photographing in the present exemplary embodiment;
fig. 5 shows an example diagram of a virtual photographing apparatus mirror-motion simulation in the X-axis dimension in the present exemplary embodiment;
fig. 6 shows a block diagram of a scene capture device in the present exemplary embodiment;
fig. 7 shows an electronic device for implementing the above method in the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, so their repeated description will be omitted. Some of the block diagrams shown in the drawings are functional entities and do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Herein, "first", "second", etc. are labels for specific objects, and do not limit the number or order of the objects.
In the related art, in order to enable the virtual shooting equipment and the real shooting equipment to shoot synchronously, the real shooting equipment is generally made to cooperate with the camera movement of the virtual shooting equipment through positioning or slight lens push-pull. In such a mode, however, when the virtual shooting equipment performs a large-range camera movement, the real shooting equipment can hardly keep up synchronously, and the perspective angles of the two pictures end up mismatched. This approach is therefore only suitable for small-range, slow camera movements and can hardly meet practical shooting requirements.
In view of one or more of the above problems, exemplary embodiments of the present disclosure provide a scene shooting method, which may be applied to scenarios such as virtual live broadcast, movie shooting, and virtual studio recording. Fig. 1 shows a system architecture diagram of the environment in which the method operates, including a terminal device 110, real shooting equipment 120, and a network 130. The terminal device 110 may be an intelligent device such as a mobile phone, a tablet, or a computer configured with a virtual engine, and may be configured to execute the scene shooting method. The real shooting equipment 120 may be shooting auxiliary equipment for a physical camera, such as a motorized slide rail or a motorized pan-tilt head, and may be used to control the physical camera installed on it to move and shoot. The real shooting equipment 120 can mainly consist of lightweight, common devices, so a striking visual effect can be achieved at a low production cost. The medium used by the network 130 to provide a communication link between the terminal device 110 and the real shooting equipment 120 may include various connection types, such as wired or wireless communication links.
Fig. 2 shows a scene shooting method in the present exemplary embodiment, which specifically includes the following steps S210 to S230:
Step S210, acquiring equipment parameters of real shooting equipment in a real scene, and determining, based on the equipment parameters of the real shooting equipment, a motion range of virtual shooting equipment corresponding to the real shooting equipment in a virtual scene;
Step S220, in response to a control operation on the virtual shooting equipment within the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting, and acquiring the motion track of the virtual shooting equipment;
Step S230, determining data to be moved of the real shooting equipment according to the motion track of the virtual shooting equipment, and controlling the real shooting equipment to carry out motion shooting in the real scene based on the data to be moved.
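As a rough illustration of how these three steps fit together, the following minimal Python sketch wires them up. Every name in it (RealRig, VirtualCamera, shoot, and their members) is a hypothetical stand-in; the disclosure does not prescribe any particular implementation.

```python
# Hypothetical sketch of steps S210-S230; names and structure are
# illustrative assumptions, not the actual implementation.

class RealRig:
    """Stub for real shooting equipment such as a motorized slide rail."""
    def __init__(self, travel_m: float):
        self.travel_m = travel_m  # physically movable distance in metres

    def execute(self, to_be_moved):
        print("real rig moving:", to_be_moved)

class VirtualCamera:
    """Stub for the virtual shooting equipment in the virtual engine."""
    def __init__(self):
        self.motion_range_m = 0.0

    def record_move(self):
        # In practice the engine records the user-controlled camera move;
        # here we return a fixed (axis, distance, speed) trajectory.
        return [("x", min(1.5, self.motion_range_m), 0.5)]

def shoot(rig: RealRig, cam: VirtualCamera, scale: float) -> None:
    # S210: virtual motion range derived from real equipment parameters.
    cam.motion_range_m = rig.travel_m * scale
    # S220: user-controlled motion shooting inside that range.
    trajectory = cam.record_move()
    # S230: convert the trajectory into to-be-moved data and drive the rig.
    rig.execute([(axis, d / scale, v / scale) for axis, d, v in trajectory])

shoot(RealRig(travel_m=2.0), VirtualCamera(), scale=1.0)
```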
In the scene shooting process, on the one hand, the motion track of the virtual shooting equipment is converted into data to be moved for the real shooting equipment, and the real shooting equipment is controlled to carry out motion shooting based on that data, so that the motion tracks of the virtual shooting equipment and the real shooting equipment are synchronized; the pictures shot by the two can therefore achieve synchronized perspective angles, which expands the virtual visual experience and meets the shooting requirements of large-range camera moves without increasing the shooting difficulty. On the other hand, because the motion shooting of the real shooting equipment is driven by the converted data to be moved, shooting with the real shooting equipment is automated: no dedicated camera operator is required to control the real shooting equipment, and only the virtual shooting equipment needs to be controlled, which greatly reduces the shooting difficulty and the shooting cost.
Each step in fig. 2 will be described in detail below.
Step S210, obtaining device parameters of a real shooting device in a real scene, and determining a motion range of a virtual shooting device corresponding to the real shooting device in a virtual scene based on the device parameters of the real shooting device.
The real scene refers to a scene that actually exists, and may be a green-screen scene specially constructed for shooting so as to facilitate chroma-key compositing in post-production. The real scene may include shot objects, such as a live streamer, actors, props, and other real objects.
The real shooting equipment refers to shooting auxiliary equipment for a physical camera, such as a motorized slide rail or a motorized pan-tilt head, that can be used for shooting. Such equipment is installed in the real scene, and its specific installation position and angle are determined by the target shooting position in the real scene to be shot.
The motorized slide rail carries the physical camera for shooting: the camera is fixed on the slide rail, and its movement along the rail can be controlled by an electronic program. The motorized pan-tilt head is a supporting device on which the physical camera is installed and fixed, and the camera can be controlled to rotate through control signals from a controller.
The device parameters of the real photographing device refer to parameters that can be used to distinguish real photographing devices having different functions, and may be related parameters such as the type, brand, and model of the photographing device.
The virtual scene may be a virtual scene previously created by the virtual engine, and may include elements such as scenes, characters, lights, and the like, for example, scenes such as a virtual live room, a virtual studio, and the like.
The virtual shooting equipment is used to simulate the real shooting equipment and shoot the virtual scene so as to acquire the virtual picture in the virtual scene. The motion range of the virtual shooting equipment refers to the range within which it can move in the virtual scene. For example, when the real shooting equipment is determined from its equipment parameters to be a transverse motorized slide rail, the transversely movable distance of that slide rail gives the motion range of the virtual shooting equipment.
It should be noted that, before step S210 is executed, the user may connect the real shooting equipment to the terminal device in advance through a wired or wireless physical connection, so that the terminal device and the real shooting equipment can communicate subsequently.
In an optional implementation, after the user connects the real shooting equipment to the terminal device by wire or wirelessly, the terminal device may call a device driver library according to the equipment parameters of the real shooting equipment to connect to the real shooting equipment, so that the terminal device can subsequently communicate with and control the real shooting equipment.
In this process, a communicable connection is established between the terminal device and the real shooting equipment by calling the driver, so as to drive the control system of the real shooting equipment; the terminal device can thereby control the real shooting equipment and the physical camera installed on it.
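As a loose sketch of what such a device driver library might look like, the following fragment keys drivers by brand and model. The driver names, the (brand, model) key scheme, and the connection flow are assumptions for illustration, not the disclosure's actual driver library.

```python
# Hypothetical device driver library keyed by (brand, model); the real
# driver names and connection protocol are not specified by the disclosure.

DRIVER_LIBRARY = {
    ("acme", "rail-x1"): "acme_rail_driver",
    ("acme", "head-3ax"): "acme_head_driver",
}

def connect(brand: str, model: str) -> str:
    """Look up and 'load' the driver matching the entered parameters."""
    driver = DRIVER_LIBRARY.get((brand.lower(), model.lower()))
    if driver is None:
        # If the entered parameters do not match the connected equipment,
        # the terminal device cannot identify it (see the note below).
        raise ValueError(f"no driver for {brand} {model}")
    print(f"connecting to {brand} {model} via {driver}")
    return driver

connect("Acme", "Rail-X1")
```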
In an optional implementation manner, acquiring device parameters of a real shooting device in a real scene includes: and responding to the equipment parameter input operation of the real shooting equipment in the real scene, and acquiring the equipment parameters of the real shooting equipment.
The device parameter entry operation may be an operation of configuring device parameters such as a brand, a type, and a model of the real shooting device in the terminal device.
In the process, the device parameters of the corresponding real shooting device are obtained by reading the recorded device parameters, so that on one hand, a parameter basis is provided for determining the motion range of the virtual shooting device, and on the other hand, a device connection object is provided for the terminal device, so that the terminal device and the corresponding real shooting device are in communication connection.
It should be noted that, it is necessary to ensure that the recorded device parameters are consistent with the device parameters of the real shooting device that performs physical connection, so as to prevent the terminal device from being unable to identify the connected real shooting device, thereby affecting the normal operation of shooting.
In an optional implementation manner, determining a motion range of a virtual shooting device corresponding to a real shooting device in a virtual scene based on device parameters of the real shooting device may be implemented by the steps shown in fig. 3, which specifically include the following steps S310 to S320:
step S310, determining the motion range of the real shooting equipment based on the equipment parameters of the real shooting equipment;
step S320, mapping the motion range of the real shooting device to the virtual scene according to the spatial mapping relationship between the virtual scene and the real scene, so as to determine the motion range of the virtual shooting device.
The movement range of the real photographing apparatus refers to a spatial range in which a physical camera mounted on the real photographing apparatus can move. For example, when the real photographing apparatus is a transverse motorized slide rail, the movement range thereof may be the maximum transverse movable distance of the physical camera mounted on the transverse motorized slide rail.
The spatial mapping relationship between the virtual scene and the real scene refers to the proportional relationship between their spatial dimensions. When the virtual scene is modeled at a 1:1 ratio relative to the real scene, the motion range of the real shooting equipment can be used directly as the motion range of the virtual shooting equipment. When the virtual scene is not modeled at a 1:1 ratio relative to the real scene, the motion range of the real shooting equipment needs to be converted according to the scene ratio to obtain the motion range of the virtual shooting equipment.
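Under the simple assumption of a uniform scene ratio, this conversion reduces to one multiplication, as in the sketch below; the function name and the metre-based units are illustrative.

```python
def virtual_motion_range(real_range_m: float, scene_ratio: float) -> float:
    """Map the real equipment's movable distance into the virtual scene.

    `scene_ratio` is virtual units per real metre; a ratio of 1.0 means
    the virtual scene is modeled 1:1 against the real scene, so the
    motion range carries over unchanged.
    """
    return real_range_m * scene_ratio

# A 2 m transverse motorized slide rail, in scenes modeled at 1:1 and 3:1:
print(virtual_motion_range(2.0, 1.0))  # 2.0
print(virtual_motion_range(2.0, 3.0))  # 6.0
```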
In the process, the movement range of the virtual shooting equipment is determined through the movement range of the real shooting equipment, so that the movement ranges of the real shooting equipment and the virtual shooting equipment correspond to each other, and the shooting angles of the real shooting equipment and the virtual shooting equipment can be ensured to be consistent in the subsequent shooting process.
Step S220, responding to the control operation of the virtual shooting equipment in the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting, and acquiring the motion track of the virtual shooting equipment;
the control operation for the virtual photographing apparatus may be an operation of performing a drag movement for the corresponding virtual camera in the terminal apparatus. Further, the control operation for the virtual photographing apparatus may also be a drag operation for a virtual scene picture photographed by the virtual photographing apparatus. When the virtual shooting device shoots in a virtual scene, the terminal device can record the motion track of the virtual shooting device in real time.
By acquiring the motion track of the virtual shooting equipment, the above process provides a reference track along which the real shooting equipment can later move synchronously.
In an alternative embodiment, the above-mentioned controlling the virtual shooting device to perform motion shooting and acquiring the motion track of the virtual shooting device in response to the control operation on the virtual shooting device within the motion range of the virtual shooting device may be implemented by steps shown in fig. 4, and specifically includes the following steps S410 to S420:
step S410, determining a target mirror moving arrangement form of the virtual shooting equipment from a preset mirror moving library according to the motion range of the virtual shooting equipment;
and step S420, in response to the mirror movement triggering operation in the target mirror movement arrangement form, controlling the virtual shooting equipment to carry out movement shooting, and acquiring the movement track of the virtual shooting equipment.
The preset mirror moving library refers to a set containing a plurality of mirror moving (camera movement) arrangement forms, configured in the terminal device in advance for calling. The target mirror moving arrangement form refers to a virtual mirror moving arrangement form that conforms to the motion range of the virtual shooting equipment. The mirror moving trigger operation may be a trigger operation performed on one of the mirror moving arrangement forms configured in the virtual mirror moving grouping of the corresponding type; it may be triggered by the user at the terminal device, and once triggered, the virtual shooting equipment moves according to the triggered mirror moving arrangement form.
Selecting, from the preset mirror moving library, a mirror moving grouping that conforms to the motion range of the virtual shooting equipment provides a movement strategy for the camera moves of the virtual shooting equipment in the virtual scene, and improves the convenience of controlling the virtual shooting equipment.
Different types of real shooting equipment, and/or combinations of different types of real shooting equipment, can also correspond to different virtual mirror moving groupings. Each mirror moving grouping is configured with at least one mirror moving arrangement form for the user to select and trigger, and the triggerable target mirror moving arrangement forms are determined by identifying the mirror moving grouping corresponding to the real shooting equipment. Different types of real shooting equipment include, for example: a transverse motorized slide rail, an arc motorized slide rail, an ordinary motorized pan-tilt head, a three-axis motorized pan-tilt head, and the like. Combinations of different types of real shooting equipment include, for example: a transverse motorized slide rail plus an ordinary motorized pan-tilt head, a transverse motorized slide rail plus a three-axis motorized pan-tilt head, and the like. Table 1 below lists common real shooting equipment types and their corresponding virtual mirror moving groupings.
TABLE 1
(Table 1 is published as an image in the original document; it lists common real shooting equipment types and their corresponding virtual mirror moving groupings.)
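Because Table 1 survives only as an image, the mapping below is a hypothetical reconstruction of its shape (an equipment type or combination mapped to the mirror moving groupings it unlocks); the grouping names are invented for illustration and are not the table's actual contents.

```python
# Hypothetical shape of Table 1; grouping names are invented.
MIRROR_GROUPINGS = {
    frozenset({"transverse slide rail"}): ["horizontal pan"],
    frozenset({"arc slide rail"}): ["arc orbit"],
    frozenset({"transverse slide rail", "three-axis pan-tilt head"}):
        ["horizontal pan", "pan with follow rotation"],
}

def available_groupings(*equipment: str) -> list:
    """Return the mirror moving groupings for a type or combination."""
    return MIRROR_GROUPINGS.get(frozenset(equipment), [])

print(available_groupings("transverse slide rail", "three-axis pan-tilt head"))
# -> ['horizontal pan', 'pan with follow rotation']
```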
In addition, the software development kits of the supported real shooting equipment types can be called and made compatible in advance, so that the terminal device operates normally when real shooting equipment of a corresponding type is connected.
And step S230, determining to-be-moved data of the real shooting equipment according to the movement track of the virtual shooting equipment, and controlling the real shooting equipment to carry out movement shooting in a real scene based on the to-be-moved data.
The data to be moved of the real photographing apparatus may include data of a direction, a speed, a distance, and a rotation angle in which the real photographing apparatus is to be moved.
In an optional implementation manner, determining data to be moved of the real shooting device according to the motion track of the virtual shooting device includes: and mapping the motion trail of the virtual shooting equipment to the real scene according to the space mapping relation between the virtual scene and the real scene so as to determine the data to be moved of the real shooting equipment.
For example, when the real shooting equipment is a transverse motorized slide rail and the motion track of the virtual shooting equipment is a transverse movement along the X axis, the moving distance of the virtual shooting equipment in the positive X direction is converted to obtain the transverse moving distance of the slide rail along the positive X direction.
The data to be moved of the real shooting equipment is obtained through conversion according to the space mapping relation between the virtual scene and the real scene, so that the real shooting equipment can be matched with the moving track of the virtual shooting equipment when moving, and the effect of synchronizing the shot pictures is achieved.
In an optional implementation manner, the mapping the motion trajectory of the virtual shooting device to the real scene according to the spatial mapping relationship between the virtual scene and the real scene to determine the data to be moved of the real shooting device may be implemented by: performing three-dimensional decomposition on the motion track of the virtual shooting equipment in a virtual scene to obtain three-dimensional decomposition data; and determining the data to be moved of the real shooting equipment in the real scene according to the three-dimensional decomposition data.
The three-dimensional decomposition refers to the decomposition of the spatial position of the motion trail of the virtual shooting device in the virtual scene in three dimensions of an X axis, a Y axis and a Z axis. The three-dimensional decomposition data refers to the motion state of the virtual photographing apparatus in each dimension obtained through decomposition. The motion state of the virtual shooting device in each dimension can be converted according to the space mapping relation between the virtual scene and the real scene, so that the to-be-moved data of the real shooting device in the real scene is obtained, and the motion tracks of the real shooting device and the virtual shooting device are ensured to be synchronous.
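A minimal sketch of such a decomposition follows, assuming the recorded track is a list of sampled (x, y, z, t) points and that the per-axis displacement and average speed are the quantities of interest; both assumptions are illustrative, not mandated by the disclosure.

```python
def decompose(samples):
    """Decompose a sampled track into per-axis distance and average speed.

    `samples` is a list of (x, y, z, t) tuples recorded in the virtual
    scene; this representation is an assumption for illustration.
    """
    (x0, y0, z0, t0), (x1, y1, z1, t1) = samples[0], samples[-1]
    duration = t1 - t0
    return {axis: {"distance": d, "speed": d / duration}
            for axis, d in zip("xyz", (x1 - x0, y1 - y0, z1 - z0))}

# A move of 4 m along +X and 1 m along +Z over 2 seconds:
print(decompose([(0, 0, 0, 0.0), (4, 0, 1, 2.0)]))
# -> x: 4 m at 2 m/s, y: 0, z: 1 m at 0.5 m/s
```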
In an optional implementation manner, before determining data to be moved of the real shooting device in the real scene according to the three-dimensional decomposition data, the three-dimensional decomposition data can be updated in response to an updating operation on the three-dimensional decomposition data, and the virtual shooting device is controlled to carry out motion shooting based on the updated three-dimensional decomposition data.
By updating the three-dimensional decomposition data, the function of finely adjusting the motion trail of the virtual camera is realized, so that a better visual shooting effect is achieved.
In an alternative embodiment, the three-dimensional decomposition data may include a movement distance and a movement speed of the virtual shooting device in each dimension of the virtual scene; updating the three-dimensional decomposition data in response to an update operation on the three-dimensional decomposition data, including: and updating the movement distance or the movement speed of the virtual shooting device in any dimension in response to the adjustment operation of the movement distance or the movement speed of the virtual shooting device in any dimension.
It should be noted that the above-mentioned moving distance may include a forward moving distance along the shaft and a backward moving distance along the shaft.
Taking adjustment of the movement distance or movement speed in the X-axis dimension as an example, refer to fig. 5. The curve in fig. 5 is a mirror moving simulation curve in the X-axis dimension; the horizontal axis represents time, and the vertical axis represents the relative position of the virtual shooting equipment along the X axis. Key nodes may be set on the curve. The movement distance can be adjusted by dragging a key node up or down, and the movement speed of the track can be adjusted by dragging it left or right: for example, dragging a key node to the left produces an acceleration adjustment, and dragging it to the right produces a deceleration adjustment. The Y-axis and Z-axis dimensions can be adjusted in a similar manner, which is not repeated here.
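The key-node editing can be thought of as adjusting (time, position) pairs on the per-axis curve: a vertical drag changes the position reached (movement distance), and a horizontal drag changes when it is reached (movement speed). The node layout below is an invented illustration of that idea, not the disclosure's data model.

```python
# Invented key-node model for the X-axis mirror moving curve of fig. 5.
nodes = [
    (0.0, 0.0),  # (time in seconds, X position in metres)
    (2.0, 1.0),
    (4.0, 1.0),
]

def drag_node(nodes, i, dt=0.0, dx=0.0):
    """Drag node i: dx (up/down) adjusts the movement distance, while
    dt (left/right) adjusts when the position is reached, i.e. the speed."""
    t, x = nodes[i]
    nodes[i] = (t + dt, x + dx)
    return nodes

# Dragging the middle node to the left reaches x = 1.0 half a second
# sooner, i.e. an acceleration adjustment:
print(drag_node(nodes, 1, dt=-0.5))  # [(0.0, 0.0), (1.5, 1.0), (4.0, 1.0)]
```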
By adjusting the movement distance and the movement speed, not only can the motion of the virtual camera be tuned more finely, but the naturalness and smoothness of the camera movement can also be changed, which improves the visual experience of the shot picture.
In an optional implementation manner, the real shooting device is controlled to perform motion shooting in a real scene based on the data to be moved, which may be specifically implemented by the following means: and sending the data to be moved to the real shooting equipment by calling a data interface of the real shooting equipment so that the real shooting equipment performs movement shooting based on the data to be moved.
In the process, the terminal device and the real shooting device can be in real-time communication, and the data to be moved are sent to the real shooting device in real time, so that the real shooting device and the virtual shooting device can move synchronously.
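As a sketch of that real-time hand-off, suppose (purely as an assumption) that the equipment's data interface accepts newline-delimited JSON over TCP; the wire format, host, and port below are invented, and the actual interface would come from the device's SDK.

```python
import json
import socket

def send_to_rig(host: str, port: int, to_be_moved: list) -> None:
    """Push to-be-moved data to the real shooting equipment's interface.

    Assumes, hypothetically, newline-delimited JSON over TCP; the real
    format depends on the equipment's SDK.
    """
    payload = json.dumps(to_be_moved).encode() + b"\n"
    with socket.create_connection((host, port), timeout=1.0) as conn:
        conn.sendall(payload)

# Example (host/port invented): move 1.2 m along +X at 0.4 m/s, then tilt
# the pan-tilt head by 15 degrees.
# send_to_rig("192.168.1.50", 9000, [
#     {"axis": "x", "distance": 1.2, "speed": 0.4},
#     {"rotate": "tilt", "degrees": 15},
# ])
```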
Exemplary embodiments of the present disclosure also provide a scene photographing apparatus, as shown in fig. 6, the scene photographing apparatus 600 may include:
the parameter acquiring module 610 is configured to acquire device parameters of real shooting devices in a real scene, and determine a motion range of a virtual shooting device corresponding to the real shooting device in a virtual scene based on the device parameters of the real shooting devices;
a first shooting module 620, configured to control the virtual shooting device to perform motion shooting in response to a control operation on the virtual shooting device within a motion range of the virtual shooting device, and obtain a motion trajectory of the virtual shooting device;
the second shooting module 630 is configured to determine data to be moved of the real shooting device according to the motion trajectory of the virtual shooting device, and control the real shooting device to perform motion shooting in a real scene based on the data to be moved.
In an optional implementation manner, the parameter obtaining module 610 determines a motion range of a virtual shooting device corresponding to the real shooting device in the virtual scene based on device parameters of the real shooting device, and may be further configured to: determining a motion range of the real shooting device based on the device parameters of the real shooting device; and mapping the motion range of the real shooting equipment to the virtual scene according to the space mapping relation between the virtual scene and the real scene so as to determine the motion range of the virtual shooting equipment.
In an alternative embodiment, the first photographing module 620 may be configured to: determining a target mirror moving arrangement form of the virtual shooting equipment from a preset mirror moving library according to the movement range of the virtual shooting equipment; and controlling the virtual shooting equipment to carry out motion shooting in response to the mirror moving triggering operation in the arrangement form of the target mirror moving, and acquiring the motion track of the virtual shooting equipment.
In an optional implementation, the second capturing module 630 further includes: and the to-be-moved data determining module is used for mapping the movement track of the virtual shooting equipment to the real scene according to the space mapping relation between the virtual scene and the real scene so as to determine the to-be-moved data of the real shooting equipment.
In an optional implementation manner, the to-be-moved data determining module may include: the data decomposition module is used for performing three-dimensional decomposition on the motion track of the virtual shooting equipment in a virtual scene to obtain three-dimensional decomposition data; and the to-be-moved data determining submodule is used for determining the to-be-moved data of the real shooting equipment in the real scene according to the three-dimensional decomposition data.
In an optional implementation manner, the to-be-moved data determining module further includes: and the data updating module is used for responding to the updating operation of the three-dimensional decomposition data, updating the three-dimensional decomposition data and controlling the virtual shooting equipment to carry out motion shooting based on the updated three-dimensional decomposition data.
In an optional embodiment, the three-dimensional decomposition data includes a movement distance and a movement speed of the virtual shooting device in each dimension of the virtual scene; a data update module, which may be configured to: in response to an adjustment operation of the movement distance or the movement speed of the virtual photographing apparatus in any dimension, the movement distance or the movement speed of the virtual photographing apparatus in any dimension is updated.
In an alternative embodiment, the second shooting module 630, which controls the real shooting device to perform motion shooting in the real scene based on the data to be moved, may be configured to: and sending the data to be moved to the real shooting equipment by calling a data interface of the real shooting equipment so that the real shooting equipment performs movement shooting based on the data to be moved.
In an optional implementation, the scene capturing apparatus 600 further includes: and the equipment connection module is used for calling the equipment driving library according to the equipment parameters of the real shooting equipment so as to be connected with the real shooting equipment.
The details of the parts of the scene capturing apparatus 600 are described in detail in the method part, and the details that are not disclosed may refer to the contents of the method part, and thus are not described again.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described scene capturing method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing an electronic device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the electronic device. The program product may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Exemplary embodiments of the present disclosure also provide an electronic device capable of implementing the above-described scene shooting method. An electronic device 700 according to such an exemplary embodiment of the present disclosure is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 700 may take the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: at least one processing unit 710, at least one memory unit 720, a bus 730 that connects the various system components (including the memory unit 720 and the processing unit 710), and a display unit 740.
The memory unit 720 stores program code that may be executed by the processing unit 710 to cause the processing unit 710 to perform steps according to various exemplary embodiments of the present disclosure as described in the "exemplary methods" section above in this specification. For example, processing unit 710 may perform any one or more of the method steps of fig. 2-4.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 721 and/or a cache memory unit 722, and may further include a read-only memory unit (ROM) 723.
The memory unit 720 may also include programs/utilities 724 having a set (at least one) of program modules 725, such program modules 725 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 730 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 800 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 760. As shown, the network adapter 760 communicates with the other modules of the electronic device 700 via the bus 730. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the exemplary embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, according to exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit", "module", or "system". Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the following claims.

Claims (12)

1. A method for scene capture, the method comprising:
acquiring equipment parameters of real shooting equipment in a real scene, and determining a motion range of virtual shooting equipment corresponding to the real shooting equipment in a virtual scene based on the equipment parameters of the real shooting equipment;
in response to a control operation on the virtual shooting equipment within the motion range of the virtual shooting equipment, controlling the virtual shooting equipment to carry out motion shooting, and acquiring a motion track of the virtual shooting equipment;
and determining the data to be moved of the real shooting equipment according to the motion track of the virtual shooting equipment, and controlling the real shooting equipment to carry out motion shooting in the real scene based on the data to be moved.
2. The method according to claim 1, wherein the determining, based on the device parameters of the real shooting device, the motion range of the virtual shooting device corresponding to the real shooting device in the virtual scene comprises:
determining a motion range of the real shooting device based on the device parameters of the real shooting device; and
mapping the motion range of the real shooting device to the virtual scene according to a spatial mapping relationship between the virtual scene and the real scene, so as to determine the motion range of the virtual shooting device.
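The spatial mapping relationship recited in claim 2 is left open by the claim; a minimal sketch, assuming the simplest plausible form (a per-axis scale plus offset between real-world metres and virtual units), might look as follows.

# Assumed mapping: uniform per-axis scale plus offset; the claim does not
# mandate this particular form.
def map_range_to_virtual(real_min: float, real_max: float,
                         scale: float, offset: float) -> tuple:
    # Map one axis of the real motion range [real_min, real_max] (metres)
    # into virtual-scene units, keeping the bounds ordered.
    lo = real_min * scale + offset
    hi = real_max * scale + offset
    return (min(lo, hi), max(lo, hi))

# Example: a rig travelling 0-2 m on one axis, at 100 virtual units per
# metre, with the virtual origin shifted by 50 units.
print(map_range_to_virtual(0.0, 2.0, scale=100.0, offset=50.0))  # (50.0, 250.0)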
3. The method according to claim 1, wherein the controlling the virtual shooting device to perform motion shooting and acquiring the motion trajectory of the virtual shooting device in response to the control operation on the virtual shooting device within the motion range of the virtual shooting device comprises:
determining a target camera-movement arrangement for the virtual shooting device from a preset camera-movement library according to the motion range of the virtual shooting device; and
in response to a camera-movement trigger operation on the target camera-movement arrangement, controlling the virtual shooting device to perform motion shooting, and acquiring the motion trajectory of the virtual shooting device.
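The preset camera-movement library of claim 3 could, under one reading, be a simple table of named moves keyed to the travel each requires, filtered against the virtual motion range; the structure below is purely an assumed illustration.

# Hypothetical camera-movement library: move name -> travel it requires,
# in virtual-scene units.
CAMERA_MOVE_LIBRARY = {
    "slow_dolly_in": 80.0,
    "lateral_truck": 150.0,
    "long_crane_sweep": 400.0,
}

def candidate_moves(range_extent: float) -> list:
    # Keep only the moves whose required travel fits inside the motion range.
    return [name for name, travel in CAMERA_MOVE_LIBRARY.items()
            if travel <= range_extent]

print(candidate_moves(200.0))  # ['slow_dolly_in', 'lateral_truck']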
4. The method according to claim 1, wherein the determining the movement data of the real shooting device according to the motion trajectory of the virtual shooting device comprises:
mapping the motion trajectory of the virtual shooting device to the real scene according to the spatial mapping relationship between the virtual scene and the real scene, so as to determine the movement data of the real shooting device.
5. The method according to claim 4, wherein the mapping the motion trajectory of the virtual shooting device to the real scene according to the spatial mapping relationship between the virtual scene and the real scene, so as to determine the movement data of the real shooting device, comprises:
performing three-dimensional decomposition on the motion trajectory of the virtual shooting device in the virtual scene to obtain three-dimensional decomposition data; and
determining the movement data of the real shooting device in the real scene according to the three-dimensional decomposition data.
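Claims 4 and 5 map the virtual trajectory back to the real scene via a per-axis (three-dimensional) decomposition; a sketch under the same assumed uniform scale, with hypothetical timestamped samples:

# Assumed input: (t_seconds, x, y, z) samples in virtual-scene units.
def decompose(trajectory: list) -> tuple:
    # Per-axis travel and average speed between the first and last samples.
    t0, *p0 = trajectory[0]
    t1, *p1 = trajectory[-1]
    duration = t1 - t0
    distances = [abs(b - a) for a, b in zip(p0, p1)]
    speeds = [d / duration for d in distances]
    return distances, speeds  # ([dx, dy, dz], [vx, vy, vz])

def to_real(distances: list, speeds: list, units_per_metre: float = 100.0) -> tuple:
    # Rescale virtual units to real-world metres and metres per second.
    return ([d / units_per_metre for d in distances],
            [v / units_per_metre for v in speeds])

track = [(0.0, 0.0, 0.0, 0.0), (4.0, 200.0, 0.0, 50.0)]
print(to_real(*decompose(track)))  # ([2.0, 0.0, 0.5], [0.5, 0.0, 0.125])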
6. The method according to claim 5, wherein before the determining the movement data of the real shooting device in the real scene according to the three-dimensional decomposition data, the method further comprises:
in response to an adjustment operation on the three-dimensional decomposition data, updating the three-dimensional decomposition data, and controlling the virtual shooting device to perform motion shooting based on the updated three-dimensional decomposition data.
7. The method according to claim 6, wherein the three-dimensional decomposition data comprises a movement distance and a movement speed of the virtual shooting device in each dimension of the virtual scene; and
the updating the three-dimensional decomposition data in response to the adjustment operation on the three-dimensional decomposition data comprises:
in response to an adjustment operation on the movement distance or the movement speed of the virtual shooting device in any dimension, updating the movement distance or the movement speed of the virtual shooting device in that dimension.
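The per-dimension adjustment of claims 6 and 7 amounts to overwriting one distance or speed entry and re-driving the virtual shooting device before any movement data is sent out; a sketch, again with an assumed data layout:

# Assumed layout of the three-dimensional decomposition data.
decomposition = {
    "distance": {"x": 2.0, "y": 0.0, "z": 0.5},    # metres
    "speed":    {"x": 0.5, "y": 0.0, "z": 0.125},  # metres per second
}

def adjust(data: dict, quantity: str, axis: str, value: float) -> dict:
    # Apply an operator adjustment to one quantity on one axis.
    if quantity not in data or axis not in data[quantity]:
        raise ValueError(f"unknown quantity/axis: {quantity!r}/{axis!r}")
    data[quantity][axis] = value
    return data

adjust(decomposition, "speed", "z", 0.25)
# The virtual shooting device would then be re-driven with the updated data
# before the movement data is derived for the real device.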
8. The method according to claim 1, wherein the controlling the real shooting device to perform motion shooting in the real scene based on the movement data comprises:
sending the movement data to the real shooting device by calling a data interface of the real shooting device, so that the real shooting device performs motion shooting based on the movement data.
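Claim 8 leaves the data interface of the real shooting device unspecified; purely for illustration, the sketch below assumes a rig that accepts one JSON document per TCP connection, with a made-up address and payload schema.

import json
import socket

def send_movement_data(host: str, port: int, movement_data: dict) -> None:
    # Push the movement data to a (hypothetical) camera-rig data interface.
    payload = json.dumps(movement_data).encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(payload)

# Hypothetical usage; the address and schema are assumptions:
# send_movement_data("192.0.2.10", 9000,
#                    {"distance": {"x": 2.0, "y": 0.0, "z": 0.5},
#                     "speed":    {"x": 0.5, "y": 0.0, "z": 0.125}})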
9. The method according to claim 1, further comprising:
calling a device driver library according to the device parameters of the real shooting device, so as to establish a connection with the real shooting device.
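The device driver library of claim 9 can be pictured as a registry keyed by a model identifier taken from the device parameters; everything below, including the model name, is a hypothetical placeholder.

# Hypothetical driver registry: device model -> connect routine.
def connect_generic(params: dict) -> str:
    return "connected to {} via generic driver".format(params.get("model", "unknown"))

DRIVER_LIBRARY = {
    "crane_v2": connect_generic,  # a real deployment would register per-model drivers
}

def connect(params: dict) -> str:
    # Pick the driver matching the device model, falling back to the generic one.
    driver = DRIVER_LIBRARY.get(params.get("model"), connect_generic)
    return driver(params)

print(connect({"model": "crane_v2", "reach_m": 2.0}))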
10. A scene shooting apparatus, the apparatus comprising:
a parameter acquisition module, configured to acquire device parameters of a real shooting device in a real scene, and to determine, based on the device parameters of the real shooting device, a motion range of a virtual shooting device corresponding to the real shooting device in a virtual scene;
a first shooting module, configured to control, in response to a control operation on the virtual shooting device within the motion range of the virtual shooting device, the virtual shooting device to perform motion shooting, and to acquire a motion trajectory of the virtual shooting device; and
a second shooting module, configured to determine movement data of the real shooting device according to the motion trajectory of the virtual shooting device, and to control the real shooting device to perform motion shooting in the real scene based on the movement data.
11. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method according to any one of claims 1 to 9 by executing the executable instructions.
CN202210005639.3A 2022-01-05 2022-01-05 Scene shooting method and device, storage medium and electronic equipment Active CN114222067B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210005639.3A CN114222067B (en) 2022-01-05 2022-01-05 Scene shooting method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN114222067A (en) 2022-03-22
CN114222067B (en) 2024-04-26

Family

ID=80707804

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210005639.3A Active CN114222067B (en) 2022-01-05 2022-01-05 Scene shooting method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114222067B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105488457A (en) * 2015-11-23 2016-04-13 北京电影学院 Virtual simulation method and system of camera motion control system in film shooting
KR101879166B1 (en) * 2017-03-06 2018-07-23 (주)잼투고 A real-world studio system capable of producing contents using the control of a virtual studio and its operating method
CN110691175A (en) * 2019-08-19 2020-01-14 深圳市励得数码科技有限公司 Video processing method and device for simulating motion tracking of camera in studio
CN111698390A (en) * 2020-06-23 2020-09-22 网易(杭州)网络有限公司 Virtual camera control method and device, and virtual studio implementation method and system
CN111970453A (en) * 2020-09-01 2020-11-20 太仓中科信息技术研究院 Virtual shooting system and method for camera robot
CN113810612A (en) * 2021-09-17 2021-12-17 上海傲驰广告文化集团有限公司 Analog live-action shooting method and system

Similar Documents

Publication Publication Date Title
CN111698390B (en) Virtual camera control method and device, and virtual studio implementation method and system
US10499035B2 (en) Method and system of displaying a popping-screen
JP2012523269A (en) Virtual camera motion simulation
US11044398B2 (en) Panoramic light field capture, processing, and display
EP2174299A2 (en) Method and system for producing a sequence of views
CN110502097B (en) Motion control portal in virtual reality
US20210342971A1 (en) Photogrammetric alignment for immersive content production
CN113115110A (en) Video synthesis method and device, storage medium and electronic equipment
US11948257B2 (en) Systems and methods for augmented reality video generation
US20230405475A1 (en) Shooting method, apparatus, device and medium based on virtual reality space
CN110572717A (en) Video editing method and device
CN114615556B (en) Virtual live broadcast enhanced interaction method and device, electronic equipment and storage medium
EP3963464A1 (en) Apparatus for multi-angle screen coverage analysis
CN115687106A Hardware-in-the-loop automatic testing method and device based on reinjection
CN112017264A (en) Display control method and device for virtual studio, storage medium and electronic equipment
KR102474451B1 (en) Apparatus, method, system and program for recording data in virtual production
CN114222067B (en) Scene shooting method and device, storage medium and electronic equipment
CN109636917B (en) Three-dimensional model generation method, device and hardware device
CN118540452B (en) Shooting system based on Unreal Engine and use method thereof
US20240096035A1 (en) Latency reduction for immersive content production systems
CN117437343A (en) Virtual object rendering method and device, readable storage medium and electronic equipment
CN115174993B (en) Method, apparatus, device and storage medium for video production
Xia et al. Intelligent Construction of Aesthetic Evaluation System Based on Multi-camera Automatic Switching Algorithm of Director
WO2024148992A1 (en) Method and apparatus for generating cover image of virtual object, device, medium, and program
CN116774902A (en) Virtual camera configuration method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant