CN116260956B - Virtual reality shooting method and system - Google Patents

Virtual reality shooting method and system

Info

Publication number
CN116260956B
CN116260956B (granted publication of application CN202310541066.0A)
Authority
CN
China
Prior art keywords
shooting
virtual
camera
display window
virtual camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310541066.0A
Other languages
Chinese (zh)
Other versions
CN116260956A (en)
Inventor
陈政权 (Chen Zhengquan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Zhongsheng Matrix Technology Development Co., Ltd.
Original Assignee
Sichuan Zhongsheng Matrix Technology Development Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Zhongsheng Matrix Technology Development Co., Ltd.
Priority to CN202310541066.0A
Publication of CN116260956A
Application granted
Publication of CN116260956B
Legal status: Active
Anticipated expiration


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 — Image signal generators
    • H04N13/204 — Image signal generators using stereoscopic image cameras
    • H04N13/246 — Calibration of cameras
    • H04N13/296 — Synchronisation thereof; Control thereof

Abstract

The invention relates to a virtual reality shooting method and system. The shooting method comprises: obtaining a real scene picture and displaying it in a first display window; obtaining a virtual scene picture and displaying it in a second display window; calibrating a world coordinate system and the shooting angle so that the first virtual camera, the second virtual camera, and the real camera occupy the same position in the world coordinate system and share the same shooting angle; acquiring the position and shooting parameters of the first virtual camera, the shooting parameters including the shooting angle; and, based on the position and shooting parameters of the first virtual camera, controlling the real camera and the second virtual camera to shoot synchronously and to update their positions and shooting parameters synchronously. Combined with the virtual reality shooting system, the shooting method achieves the final virtual reality shot. The whole shooting process is convenient to operate, the shooting experience is good, and the resulting virtual reality picture has a stronger sense of realism.

Description

Virtual reality shooting method and system
Technical Field
The invention relates to the field of virtual reality shooting, in particular to a virtual reality shooting method and system.
Background
In virtual reality film and television production, one approach shoots part of the picture content, such as the actors, in front of a green screen, and obtains the real-object footage by removing the green background with a matting technique; the other part of the content, such as the virtual scene, is shot virtually to obtain a virtual scene image. The two parts are produced independently and then fused into a single frame to obtain the virtual reality shot.
To reduce the complexity and difficulty of such post-processing, a second approach replaces the traditional green screen with an LED screen: pre-recorded background material is played on the LED screen and the actors perform in front of it. This reduces the post-processing burden, but because the shape and area of the LED screen are limited, the shooting angle is constrained and all-around shooting is difficult.
A third approach shoots synchronously with a real camera and a virtual camera: the real camera shoots the real picture while its position and shooting angle are acquired and applied synchronously to the virtual camera in the virtual scene, improving the realism of the fused picture. However, acquiring the position and shooting angle of the real camera is difficult, and because the technique is limited and the control error is large, the synchronisation effect is poor.
A fourth approach works in the opposite direction: the position and shooting angle of the virtual camera that shoots the virtual scene are acquired, and the position and shooting angle of the real camera are controlled on that basis, reducing the technical limitation. Even so, the synchronisation and the resulting picture angle remain unsatisfactory.
Disclosure of Invention
The technical problem to be solved by the application is to provide a virtual reality shooting method and system whose virtual reality shot pictures are more realistic.
In a first aspect, an embodiment provides a virtual reality shooting method applied to a virtual reality shooting system. The system includes an intelligent terminal and a real camera; the intelligent terminal includes a display module used to display a first display window and a second display window, with a first virtual camera disposed in the first display window and a second virtual camera disposed in the second display window. The shooting method comprises the following steps:
acquiring a real scene picture and displaying the real scene picture in a first display window;
obtaining a virtual scene picture and displaying the virtual scene picture in a second display window;
calibrating a world coordinate system so that the positions of the first virtual camera, the second virtual camera and the real camera in the world coordinate system are the same;
calibrating shooting angles so that the shooting angles of the first virtual camera, the second virtual camera and the real camera are the same;
acquiring the position and shooting parameters of a first virtual camera, wherein the shooting parameters comprise shooting angles;
and controlling the real camera and the second virtual camera to synchronously shoot based on the position and shooting parameters of the first virtual camera, and synchronously updating the position and shooting parameters.
In one embodiment, the acquiring the real scene picture and displaying the real scene picture in the first display window includes:
and acquiring part or all of real scene pictures shot by the real camera and displaying the real scene pictures in a first display window in real time.
In one embodiment, the acquiring the position and the shooting parameter of the first virtual camera includes:
and acquiring a control signal of the position change and/or shooting angle change of the first virtual camera, and calculating to obtain a position vector and/or an angle vector of the next stepping point of the first virtual camera.
In one embodiment, the controlling the real camera and the second virtual camera to synchronously shoot and synchronously update the position and the shooting parameters based on the position and the shooting parameters of the first virtual camera includes:
and controlling the position change track and/or the shooting angle change track of the first virtual camera at the next moment based on the calculated position vector and/or angle vector of the next stepping point of the first virtual camera, and synchronously controlling the position change track and/or the shooting angle change track of the real camera and the second virtual camera.
In one embodiment, the photographing method further comprises:
acquiring a virtual scene shot by a second virtual camera;
acquiring a real object synchronously shot by a real camera;
and fusing the synchronously shot virtual scene and the real object to obtain a virtual reality fusion image.
In a second aspect, in one embodiment, a virtual reality shooting system is provided, for virtual reality image shooting, the system includes:
the real camera is used for shooting a real scene picture;
the intelligent terminal comprises a display module and a processor, wherein the display module is used for displaying a first display window and a second display window; the first display window is used for displaying a shot real scene picture, and a first virtual camera is arranged in the first display window; the second display window is used for displaying a virtual scene picture, and a second virtual camera is arranged in the second display window; the processor is used for controlling the real camera and the second virtual camera to synchronously shoot based on the position and shooting parameters of the first virtual camera, and synchronously updating the position and shooting parameters; the shooting parameters comprise shooting angles;
and the control module is used for controlling the position and shooting parameters of the first virtual camera.
In one embodiment, the first display window and the second display window are displayed on the same display module, either split-screen or as floating windows; in the floating display, the first display window floats over the second display window, or the second display window floats over the first, or both windows float.
In one embodiment, the smart terminal includes a first smart terminal and a second smart terminal, wherein the first display window is displayed on a display module of the first smart terminal, and the second display window is displayed on a display module of the second smart terminal.
In an embodiment, the control module includes a touch unit of the display module where the first display window is located, and the position and the shooting parameters of the first virtual camera in the world coordinate system are controlled by the touch unit.
In one embodiment, the control module comprises an operation control part in communication connection with the intelligent terminal, and the position and shooting angle of the first virtual camera in a world coordinate system are controlled through the operation control part.
The beneficial effects of the invention are as follows:
Because the first display window to which the first virtual camera belongs displays the picture shot by the real camera, and the position and shooting parameters of the real camera are controlled based on those of the first virtual camera, an operator can more conveniently and intuitively follow the focus of attention in the real scene and perceive the shooting effect, which improves shooting convenience. Because the real camera and the second virtual camera are controlled to shoot synchronously based on the control of the first virtual camera, and their positions and shooting parameters are updated synchronously, the synchronism between shooting the virtual scene and shooting the real scene is improved, which in turn improves the realism of the composited picture.
Drawings
Fig. 1 is a schematic flow chart of a virtual reality shooting method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a virtual reality shooting method according to an embodiment of the present application;
FIG. 3 is a schematic view of a virtual reality shooting system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a virtual reality shooting system according to an embodiment of the present application.
Detailed Description
The invention will be described in further detail below with reference to the drawings by means of specific embodiments, in which like elements in different embodiments bear like reference numbers. In the following embodiments, numerous specific details are set forth to provide a better understanding of the present application. However, one skilled in the art will readily recognise that some of the features may be omitted, or replaced by other elements, materials, or methods, in different situations. In some instances, operations associated with the present application are not shown or described in the specification to avoid obscuring its core; a detailed description of such operations is unnecessary, since a person skilled in the art can understand them from the description herein together with general knowledge in the field.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The terms "coupled" and "connected," as used herein, are intended to encompass both direct and indirect coupling (coupling), unless otherwise indicated.
For convenience of explanation of the inventive concept of the present application, a brief explanation of the virtual reality shooting technique is provided below.
For virtual reality film production, one virtual reality shooting method shoots the real scene and the virtual scene simultaneously. To improve the convenience of shooting both scenes and to reduce shooting cost, the motion trail of the virtual camera shooting the virtual scene is acquired first, and the motion trail of the real camera shooting the real scene is then controlled based on it. This improves shooting convenience to a certain extent, and the shooting cost is lower because the real camera is driven by the virtual camera's motion trail. However, the following disadvantages remain:
First, because the motion trail of the virtual camera is acquired first and the real camera is then driven from it, the synchronism between shooting the virtual scene and shooting the real scene is still poor, which affects the realism of the composited picture.
Second, the objects or people in the real scene are the focus of attention and largely determine the desired shooting angle. When the camera's motion trail is controlled from the virtual scene, the shooting angle on the focus of attention cannot be determined well, so the shot images are often unsatisfactory; in practice this makes the method awkward to operate and gives a poor shooting experience.
Based on the above-mentioned problems, in one embodiment of the present application, a virtual reality shooting method is provided, and the virtual reality shooting method may be applied to movie shooting, small video production, studio recording, virtual reality live broadcasting, and the like. The virtual reality shooting method is applied to a virtual reality shooting system, and the virtual reality shooting system comprises an intelligent terminal and a real camera. The intelligent terminal is provided with signal processing capability, a virtual engine can be configured in the intelligent terminal, and the intelligent terminal can be a computer, a mobile phone, a tablet and the like. The intelligent terminal comprises a display module, wherein the display module is used for a first display window and a second display window, a first virtual camera is arranged in the first display window, and a second virtual camera is arranged in the second display window.
Referring to fig. 1, the virtual reality shooting method includes:
step S10, a real scene picture is acquired and displayed in a first display window.
The real scene picture is the picture shot by the real camera; it is displayed in the first display window in real time, so that the real scene can be viewed clearly through the first display window.
In one embodiment, all pictures shot within the real camera shooting view angle can be displayed in the first display window.
In some embodiments, only part of the picture captured within the real camera's viewing angle may be displayed in the first display window. For example, only the actor and/or physical object that is the focus of attention may be shown; other picture content may be added; or the actor and/or object may first be processed and then displayed, e.g. showing only the actor's outline or skeleton. Displaying the focus object alone in the first display window removes the interference of other factors when observing the shooting angle and shooting effect.
It can therefore be understood that in step S10 the first display window may display the entire shot picture, part of it, or even a processed version of the shot picture.
Step S20, a virtual scene picture is acquired and displayed in a second display window.
It should be noted that steps S10 and S20 need not be executed in a strict order: either may be executed first. The step numbering in the embodiments of the present application does not limit the scope of protection; it is an aid to clarity of description and does not imply a mandatory sequence.
In step S20, the second display window is used for displaying the virtual scene images that need to be subjected to virtual reality fusion, and the virtual scene images under the shooting view angle of the second virtual camera can be dynamically displayed based on the movement track and the shooting view angle of the second virtual camera set in the second display window.
Step S30, calibrating the world coordinate system so that the positions of the first virtual camera, the second virtual camera and the real camera in the world coordinate system are the same.
The virtual reality shooting requirement demands consistency of shooting range and shooting angle. The initial position and shooting angle of the real camera are adjusted; based on the world coordinates corresponding to the real camera, the first virtual camera is set in the first display window at those world coordinates, so that the first virtual camera and the real camera occupy the same position in the world coordinate system. Based on the world coordinates of the first virtual camera, the second virtual camera is set in the second display window at the same coordinates, so that the second virtual camera and the first virtual camera occupy the same position. The world coordinate system is thus unified, guaranteeing that the initial shooting positions of the first virtual camera, the second virtual camera, and the real camera are identical.
In the present application, the world coordinate system is a world coordinate system of three or more dimensions, and the position is a spatial position of three or more dimensions.
And S40, calibrating shooting angles so that the shooting angles of the first virtual camera, the second virtual camera and the real camera are the same.
Under the condition that the initial shooting angle of the real camera is adjusted, based on the requirement of visual angle consistency, the shooting angle of the first virtual camera in the first display window is set based on the shooting angle of the real camera, so that the shooting angles of the first virtual camera and the real camera are the same. And setting the shooting angle of the second virtual camera in the second display window based on the set shooting angle of the first virtual camera, so that the shooting angles of the second virtual camera and the first virtual camera are the same.
It will be appreciated that the shooting position and shooting angle determine the shot's angle of view: in the world coordinate system, cameras at the same position with the same shooting angle share the same angle of view, and in the same scene the range and picture captured under the same angle of view are identical. In this way the initial shooting angles of view of the first virtual camera, the second virtual camera, and the real camera are made identical, so the pictures shot by the three cameras coincide spatially.
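The calibration in steps S30 and S40 can be sketched as a simple pose copy in a shared three-dimensional world coordinate system. The `CameraPose` type and its field names below are illustrative assumptions, not from the patent.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CameraPose:
    # Position (x, y, z) in the shared world coordinate system.
    position: tuple
    # Shooting angle (yaw, pitch, roll) in degrees.
    angles: tuple

def calibrate(real_pose: CameraPose):
    """Steps S30/S40: give the first and second virtual cameras the
    same initial position and shooting angle as the real camera."""
    first_virtual = replace(real_pose)        # copy of the real camera's pose
    second_virtual = replace(first_virtual)   # second virtual camera matches the first
    return first_virtual, second_virtual

real = CameraPose(position=(1.0, 0.5, 2.0), angles=(30.0, -5.0, 0.0))
v1, v2 = calibrate(real)
```

After this step all three cameras start from the same pose, so their initial angles of view coincide as described above.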
Step S50, the position and shooting parameters of the first virtual camera are acquired, and the shooting parameters comprise shooting angles.
When shooting starts, the shooting position and shooting parameters of the first virtual camera are controlled. In some embodiments, only the position of the first virtual camera may be changed, for example moving closer to or farther from the subject along the line of sight of the current shooting angle (the camera's pose is unchanged). Alternatively, only the shooting parameters may be changed; the shooting parameters include the shooting angle, focal length, shooting mode, and other parameters that adjust the camera's pose and built-in settings. The shooting position and the shooting parameters may also be changed together.
In an embodiment, to improve the consistency of the three cameras in the time domain, the position vector and/or angle vector of the first virtual camera's next stepping point is calculated from the acquired control signals describing its position and/or shooting-angle change, yielding the camera's next target position and/or pose. To improve real-time behaviour and reduce delay, milliseconds can be used as the stepping time unit, for example calculating the position vector and/or angle vector for the next 5 ms step. Because the control signal precedes the actual change, the target shooting position and/or shooting parameters of the first virtual camera can be obtained from the control signal before the change takes effect.
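The stepping computation above can be sketched as linear interpolation toward the target pose decoded from the control signal. The function name and the linear-interpolation choice are assumptions; the 5 ms step unit comes from the description's example.

```python
import numpy as np

STEP_MS = 5  # stepping time unit; the description gives 5 ms as an example

def next_step_vectors(position, angles, target_position, target_angles,
                      remaining_ms):
    """Compute the position vector and angle vector for the next
    stepping point of the first virtual camera by dividing the
    remaining motion evenly over the remaining 5 ms steps."""
    steps_left = max(1, remaining_ms // STEP_MS)
    pos_vec = (np.asarray(target_position, float) - np.asarray(position, float)) / steps_left
    ang_vec = (np.asarray(target_angles, float) - np.asarray(angles, float)) / steps_left
    return pos_vec, ang_vec

# Move 1 m along x and yaw 20 degrees over the next 20 ms (4 steps).
pos_vec, ang_vec = next_step_vectors(
    position=(0.0, 0.0, 0.0), angles=(0.0, 0.0, 0.0),
    target_position=(1.0, 0.0, 0.0), target_angles=(20.0, 0.0, 0.0),
    remaining_ms=20)
```

Each returned vector is the per-step increment that is then broadcast to the real camera and the second virtual camera.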
In addition, not only the position vector and/or the angle vector, but also parameters capable of adjusting built-in parameters of the camera such as the focal length and the shooting mode of the first virtual camera can be obtained through control signals, so that setting of built-in parameters of the camera such as the focal length and the shooting mode can be synchronized.
Step S60, based on the position and shooting parameters of the first virtual camera, controlling the real camera and the second virtual camera to synchronously shoot, and synchronously updating the position and shooting parameters.
And controlling the real camera and the second virtual camera to synchronously shoot based on the acquired position and shooting parameters of the first virtual camera, and controlling the real camera and the second virtual camera to synchronously update shooting positions and shooting parameters.
In one embodiment of the present application, based on the calculated position vector and/or angle vector of the next stepping point of the first virtual camera, the position change track and/or the shooting angle change track of the next moment of the first virtual camera are controlled, and the position change track and/or the shooting angle change track of the real camera and the second virtual camera are synchronously controlled.
In one embodiment of the present application, the built-in parameters of the real camera and the second virtual camera are set synchronously based on the setting of the built-in parameters of the first virtual camera in the control signal of the first virtual camera.
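The synchronous control of step S60 can be sketched as broadcasting each computed step to all three cameras in the same control tick. The `SyncController` class and its dictionary-based camera representation are illustrative assumptions.

```python
class SyncController:
    """Sketch of step S60: each step computed for the first virtual
    camera is applied to the real camera and the second virtual
    camera within the same control tick."""

    def __init__(self, cameras):
        # cameras: list of dicts holding "position", "angles",
        # and optional built-in parameters (focal length, mode, ...)
        self.cameras = cameras

    def apply_step(self, pos_vec, ang_vec, intrinsics=None):
        for cam in self.cameras:
            cam["position"] = [p + d for p, d in zip(cam["position"], pos_vec)]
            cam["angles"] = [a + d for a, d in zip(cam["angles"], ang_vec)]
            if intrinsics:  # e.g. {"focal_length_mm": 50}, set synchronously
                cam.update(intrinsics)

# first virtual, real, and second virtual camera, already calibrated
rig = [{"position": [0, 0, 0], "angles": [0, 0, 0]} for _ in range(3)]
controller = SyncController(rig)
controller.apply_step([0.25, 0.0, 0.0], [5.0, 0.0, 0.0], {"focal_length_mm": 50})
```

Because the same increment reaches every camera in one tick, the three pose trajectories stay aligned up to the hardware and signal delays discussed below.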
It can be understood that, because the first display window to which the first virtual camera belongs displays the picture shot by the real camera, and the real camera's position and shooting parameters are controlled from those of the first virtual camera, an operator can judge directly from the first display window whether the picture obtained under the first virtual camera's control is the desired one. The operator can thus follow the focus of attention in the real scene more conveniently, perceive the shooting effect intuitively, and control the real camera's shooting position, shooting parameters, and shooting effect more easily.
Controlling the first virtual camera's position-change track and/or shooting-angle-change track at the next moment from the calculated step vectors, while synchronously controlling the same tracks of the real camera, improves the synchronism between the first virtual camera and the real camera. The change of the first virtual camera's position and/or shooting parameters therefore tracks the change of the picture in the first display window, the shooting effect is perceived closer to real time, the operator's shooting experience improves, and shooting efficiency can be further increased.
The position change track and/or the shooting angle change track of the second virtual camera are synchronously controlled while the position change track and/or the shooting angle change track of the real camera are synchronously controlled, so that the consistency of the three cameras in the time domain is improved.
Because the real camera and the second virtual camera are controlled to shoot synchronously based on the control of the first virtual camera (rather than the real camera being driven by the second virtual camera), and their positions and shooting parameters are updated synchronously, the synchronism between shooting the virtual scene and shooting the real scene is improved, which in turn improves the realism of the composited picture.
It will be appreciated that, since the purpose of the first virtual camera is to synchronise the control of the real camera and the second virtual camera, in some embodiments the first virtual camera does not need to output an actual shot image; only its control parameters need to be obtained.
It will also be appreciated that the synchronisation here is not absolute: because signal delays are inevitable, it can only be approached asymptotically, never achieved perfectly.
Referring to fig. 2, in an embodiment of the present application, the virtual reality shooting method further includes:
step S70, obtaining a virtual scene shot by the second virtual camera.
Step S80, obtaining a real object synchronously shot by a real camera.
In one embodiment, the real object can be extracted from a picture of the real scene shot against a green-screen backdrop, or acquired by other methods known in the art.
Step S90, fusing the synchronously shot virtual scene and the real object to obtain a virtual reality fusion image.
In some embodiments, the frame obtained by fusing the virtual scene and the real object may be displayed directly in the second display window or in a separate window; the fusion may be performed synchronously while the second virtual camera and the real camera shoot, or later in post-production.
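The fusion of step S90 can be sketched as a naive chroma key over the green-screen footage mentioned above. The dominance threshold and function name are assumptions; a production matting pipeline would be considerably more sophisticated.

```python
import numpy as np

def fuse(real_frame, virtual_frame, dominance=1.3):
    """Step S90 sketch: pixels whose green channel strongly dominates
    red and blue are treated as green-screen background and replaced
    by the synchronously shot virtual scene."""
    r, g, b = (real_frame[..., i].astype(float) for i in range(3))
    background = (g > dominance * r) & (g > dominance * b)
    fused = real_frame.copy()
    fused[background] = virtual_frame[background]
    return fused

# 1x2 RGB frames: an "actor" pixel and a green-screen pixel.
real = np.array([[[200, 30, 40], [10, 250, 20]]], dtype=np.uint8)
virtual = np.array([[[0, 0, 255], [0, 0, 255]]], dtype=np.uint8)
out = fuse(real, virtual)  # actor pixel kept, green pixel replaced
```

Because the two frames were shot from identical poses, replacing background pixels directly yields a spatially consistent composite.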
Referring to fig. 3 and 4, in one embodiment of the present application, a virtual reality shooting system is provided for virtual reality image shooting, the system includes:
a real camera 01 for shooting a real scene picture.
The real camera is an image-acquisition device. It may be a CCD or CMOS camera, a professional shooting device, a portable mobile intelligent terminal with a camera such as a mobile phone or tablet, or a wearable shooting device with an image-acquisition function.
In some embodiments, the system may also include shooting auxiliary devices for the real camera, for example an electric slide rail and an electric gimbal carrying the camera, or an unmanned aerial vehicle carrying it, with the gimbal mounted on the UAV and the real camera fixed on the gimbal.
When the slide rail carries the real camera, the camera's position and shooting angle are controlled through the electric slide rail and the electric gimbal. When a UAV carries the real camera, the camera's position is controlled through the UAV's position, and its shooting angle through the pose of the gimbal on the UAV.
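Mapping one synchronised step onto this rig hardware can be sketched as splitting the step into a translation command for the slide rail and a rotation command for the gimbal. The command names and dictionary protocol are hypothetical; real rigs expose vendor-specific interfaces.

```python
def to_rig_commands(pos_vec, ang_vec):
    """Hypothetical mapping of one synchronised step onto the rig:
    the electric slide rail receives the translation increment and
    the electric gimbal receives the rotation increment."""
    dx, dy, dz = pos_vec
    yaw, pitch, roll = ang_vec
    return {
        "rail": {"dx": dx, "dy": dy, "dz": dz},
        "gimbal": {"yaw": yaw, "pitch": pitch, "roll": roll},
    }

cmd = to_rig_commands((0.25, 0.0, 0.0), (5.0, 0.0, 0.0))
```

A UAV-mounted rig would route the translation to the flight controller instead of the rail, keeping the gimbal command unchanged.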
The intelligent terminal 02 comprises a display module 021 and a processor 022, wherein the display module is used for displaying a first display window and a second display window; the first display window is used for displaying a photographed real scene picture, and a first virtual camera is arranged in the first display window; the second display window is used for displaying a virtual scene picture, and a second virtual camera is arranged in the second display window; the processor is used for controlling the real camera and the second virtual camera to synchronously shoot based on the position and shooting parameters of the first virtual camera, and synchronously updating the position and shooting parameters; wherein the shooting parameters include shooting angles.
In one embodiment, the processor of the intelligent terminal 02 sends control signals based on the position and shooting parameters of the first virtual camera to drive the real camera and the second virtual camera to shoot synchronously, where synchronous shooting includes synchronously updating the positions and shooting parameters of the real camera and the second virtual camera.
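The synchronous-update loop can be sketched as follows; the `Pose` type and the `set_pose` follower interface are assumptions for illustration, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in the shared world coordinate system
    angle: tuple     # (yaw, pitch, roll) shooting angle in degrees

class SyncController:
    """Mirrors every pose update of the first (preview) virtual camera
    onto the real camera and the second virtual camera, so all three
    stay aligned in the calibrated world coordinate system."""

    def __init__(self, real_cam, second_virtual_cam):
        self.followers = [real_cam, second_virtual_cam]

    def on_first_camera_update(self, pose: Pose):
        # Push the same position and shooting angle to both followers.
        for cam in self.followers:
            cam.set_pose(pose)  # assumed follower interface
```

In practice the real camera's `set_pose` would translate into slide-rail and pan-tilt commands, while the second virtual camera's would update a render viewpoint.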
In some embodiments, there may be only one intelligent terminal, or two. When there is only one, it may be configured with two display modules (for example, two display screens) or with a single display module. With a single terminal and a single display module, the first display window and the second display window are shown on the same display module, either split-screen or as floating windows. In floating display, the first display window floats over the second, or the second floats over the first, or both are displayed floating without overlapping.
When there is only one intelligent terminal and it is configured with two or more display modules (e.g., two display screens), the first display window and the second display window may each be displayed on a separate display module, or both on the same one.
When there are two or more intelligent terminals, the first display window and the second display window may each be displayed on the display module of a different intelligent terminal.
The control module 03 is used to control the position and shooting parameters of the first virtual camera.
In one embodiment, referring to fig. 3, the control module 03 includes a touch unit of the display module where the first display window is located; the position and shooting parameters of the first virtual camera in the world coordinate system are controlled through the touch unit.
In another embodiment, referring to fig. 4, the control module 03 includes an operation control component communicatively connected to the intelligent terminal, through which the position and shooting parameters of the first virtual camera in the world coordinate system are controlled.
The operation control component is external to the intelligent terminal 02 and communicates with it over a wired or wireless link. It may be a control handle, any other control component convenient for changing the position and shooting parameters of the first virtual camera, or even an intelligent control component.
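The next-stepping-point computation described in the claims (a position vector and an angle vector derived from a control signal) might look like this minimal sketch; the signal layout and the gain constants are hypothetical:

```python
def next_step(pose, signal, dt, speed=1.0, turn_rate=30.0):
    """Compute the position and shooting angle of the next stepping
    point from a control signal.

    `pose`   is ((x, y, z), (yaw, pitch)) of the first virtual camera.
    `signal` is ((sx, sy, sz), (syaw, spitch)) with components in
    [-1, 1], e.g. from a touch gesture or a control handle axis.
    `speed` (units/s) and `turn_rate` (deg/s) are assumed gains.
    """
    (pos, ang), (move, turn) = pose, signal
    # Integrate the control signal over the time step dt.
    new_pos = tuple(p + s * speed * dt for p, s in zip(pos, move))
    new_ang = tuple(a + s * turn_rate * dt for a, s in zip(ang, turn))
    return new_pos, new_ang
```

The returned pair would then be broadcast to the real camera and the second virtual camera so their position and shooting-angle trajectories track the first virtual camera's.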
Based on the virtual reality shooting system above, combined with the shooting method, the final virtual reality shooting is achieved. The whole shooting process is easy to operate, offers a good experience, and the resulting virtual reality picture has a stronger sense of realism.
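The final fusion of the synchronously shot virtual scene and real object (claim 4) can be illustrated by a mask-based per-pixel merge; a real implementation would operate on image arrays with soft mattes, so this is only a toy sketch:

```python
def fuse(virtual_frame, real_frame, mask):
    """Per-pixel fusion: where `mask` is 1 keep the real object's
    pixel, elsewhere keep the virtual scene's pixel.  Frames are
    equally shaped nested lists standing in for image arrays."""
    return [
        [r if m else v for v, r, m in zip(vrow, rrow, mrow)]
        for vrow, rrow, mrow in zip(virtual_frame, real_frame, mask)
    ]
```

Here `mask` stands for a segmentation of the real object out of the real scene picture; producing that matte is a separate problem not covered by this sketch.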
The foregoing description of the invention is presented for purposes of illustration and description and is not intended to be limiting. A person skilled in the art to which the invention pertains may also make simple deductions, modifications, or substitutions based on the idea of the invention.

Claims (9)

1. The virtual reality shooting method is characterized by being applied to a virtual reality shooting system, wherein the virtual reality shooting system comprises an intelligent terminal and a real camera, the intelligent terminal comprises a display module, the display module is used for displaying a first display window and a second display window, a first virtual camera is arranged in the first display window, and a second virtual camera is arranged in the second display window; the shooting method comprises the following steps:
acquiring a real scene picture and displaying the real scene picture in a first display window;
obtaining a virtual scene picture and displaying the virtual scene picture in a second display window;
calibrating a world coordinate system so that the positions of the first virtual camera, the second virtual camera and the real camera in the world coordinate system are the same;
calibrating shooting angles so that the shooting angles of the first virtual camera, the second virtual camera and the real camera are the same;
acquiring the position and shooting parameters of a first virtual camera, wherein the shooting parameters comprise shooting angles;
controlling the real camera and the second virtual camera to synchronously shoot based on the position and shooting parameters of the first virtual camera, and synchronously updating the position and shooting parameters;
the step of obtaining the position and shooting parameters of the first virtual camera comprises the following steps:
and acquiring a control signal of the position change and/or shooting angle change of the first virtual camera, and calculating to obtain a position vector and/or an angle vector of the next stepping point of the first virtual camera.
2. The method of claim 1, wherein the step of obtaining the real scene picture and displaying the real scene picture in the first display window comprises:
and acquiring part or all of real scene pictures shot by the real camera and displaying the real scene pictures in a first display window in real time.
3. The virtual reality shooting method of claim 1, wherein controlling the real camera and the second virtual camera to synchronously shoot and synchronously update the position and the shooting parameters based on the position and the shooting parameters of the first virtual camera comprises:
and controlling the position change track and/or the shooting angle change track of the first virtual camera at the next moment based on the calculated position vector and/or angle vector of the next stepping point of the first virtual camera, and synchronously controlling the position change track and/or the shooting angle change track of the real camera and the second virtual camera.
4. A virtual reality shooting method as claimed in any one of claims 1 to 3, further comprising:
acquiring a virtual scene shot by a second virtual camera;
acquiring a real object synchronously shot by a real camera;
and fusing the synchronously shot virtual scene and the real object to obtain a virtual reality fusion image.
5. A virtual reality shooting system for virtual reality image shooting, the system comprising:
a real camera (01) for shooting a real scene picture;
the intelligent terminal (02) comprises a display module (021) and a processor (022), wherein the display module is used for displaying a first display window and a second display window; the first display window is used for displaying a shot real scene picture, and a first virtual camera is arranged in the first display window; the second display window is used for displaying a virtual scene picture, and a second virtual camera is arranged in the second display window; the processor is used for controlling the real camera and the second virtual camera to synchronously shoot based on the position and shooting parameters of the first virtual camera, and synchronously updating the position and shooting parameters; the shooting parameters comprise shooting angles;
a control module (03) for controlling the position and shooting parameters of the first virtual camera;
the position and shooting parameters of the first virtual camera are acquired as follows: acquiring control signals for the position change and shooting-angle change of the first virtual camera, and calculating a position vector and an angle vector of the next stepping point of the first virtual camera.
6. The virtual reality shooting system of claim 5, wherein the first display window and the second display window are displayed on the same display module, split-screen or as floating windows; in floating display, the first display window floats over the second display window, or the second display window floats over the first display window, or both the first display window and the second display window are displayed floating.
7. The virtual reality shooting system of claim 5, wherein the smart terminal comprises a first smart terminal and a second smart terminal, the first display window is displayed on a display module of the first smart terminal, and the second display window is displayed on a display module of the second smart terminal.
8. The virtual reality shooting system of any one of claims 5-7, wherein the control module includes a touch unit of a display module where the first display window is located, and the position and shooting parameters of the first virtual camera in a world coordinate system are controlled by the touch unit.
9. The virtual reality shooting system of any one of claims 5-7, wherein the control module comprises an operational control component communicatively coupled to the intelligent terminal, whereby the position and shooting parameters of the first virtual camera in a world coordinate system are controlled by the operational control component.
CN202310541066.0A 2023-05-15 2023-05-15 Virtual reality shooting method and system Active CN116260956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310541066.0A CN116260956B (en) 2023-05-15 2023-05-15 Virtual reality shooting method and system


Publications (2)

Publication Number Publication Date
CN116260956A CN116260956A (en) 2023-06-13
CN116260956B true CN116260956B (en) 2023-07-18

Family

ID=86681089

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310541066.0A Active CN116260956B (en) 2023-05-15 2023-05-15 Virtual reality shooting method and system

Country Status (1)

Country Link
CN (1) CN116260956B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117527995A (en) * 2023-11-06 2024-02-06 中影电影数字制作基地有限公司 Simulated live-action shooting method and system based on space simulated shooting

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101680367B1 (en) * 2015-10-29 2016-11-28 동서대학교산학협력단 CG image product system by synchronization of simulator camera and virtual camera
WO2019041351A1 (en) * 2017-09-04 2019-03-07 艾迪普(北京)文化科技股份有限公司 Real-time aliasing rendering method for 3d vr video and virtual three-dimensional scene
CN111698390A (en) * 2020-06-23 2020-09-22 网易(杭州)网络有限公司 Virtual camera control method and device, and virtual studio implementation method and system
CN113426117A (en) * 2021-06-23 2021-09-24 网易(杭州)网络有限公司 Virtual camera shooting parameter acquisition method and device, electronic equipment and storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4231216B2 (en) * 2001-07-31 2009-02-25 日本放送協会 Virtual scene shooting method and apparatus
WO2017149441A1 (en) * 2016-02-29 2017-09-08 Nokia Technologies Oy Adaptive control of image capture parameters in virtual reality cameras
JP2019152899A (en) * 2018-02-28 2019-09-12 株式会社バンダイナムコスタジオ Simulation system and program
KR102192412B1 (en) * 2019-06-25 2020-12-16 주식회사 소울엑스 Method for compositing real time video in 3d virtual space and apparatus using the same
CN110602383B (en) * 2019-08-27 2021-06-29 深圳市华橙数字科技有限公司 Pose adjusting method and device for monitoring camera, terminal and storage medium
CN114520868B (en) * 2020-11-20 2023-05-12 华为技术有限公司 Video processing method, device and storage medium
CN112933606B (en) * 2021-03-16 2023-05-09 天津亚克互动科技有限公司 Game scene conversion method and device, storage medium and computer equipment
JP2022182119A (en) * 2021-05-27 2022-12-08 キヤノン株式会社 Image processing apparatus, control method thereof, and program
CN113810612A (en) * 2021-09-17 2021-12-17 上海傲驰广告文化集团有限公司 Analog live-action shooting method and system
CN114845147B (en) * 2022-04-29 2024-01-16 北京奇艺世纪科技有限公司 Screen rendering method, display screen synthesizing method and device and intelligent terminal
CN115591234A (en) * 2022-10-14 2023-01-13 网易(杭州)网络有限公司(Cn) Display control method and device for virtual scene, storage medium and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant