CN117527992B - Camera correction method and system for space simulation shooting

Info

Publication number: CN117527992B
Authority: CN (China)
Prior art keywords: camera, virtual, real, picture, shooting
Legal status: Active
Application number: CN202311465226.4A
Other languages: Chinese (zh)
Other versions: CN117527992A
Inventors: Ma Ping (马平), Sun Jing (孙靖), Jiang Wen (姜文), An Na (安娜)
Assignee (current and original): China Film Digital Production Base Co., Ltd.
Filing history: application filed by China Film Digital Production Base Co., Ltd.; priority to CN202311465226.4A; published as CN117527992A, then granted and published as CN117527992B.


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects


Abstract

The embodiments of the invention disclose a camera correction method and system for space simulation shooting, together with a storage medium and an electronic device. The method comprises the following steps. Live-action shooting: capture real-world scene and environment pictures. Virtual environment construction: convert the real-world scene and environment pictures into models and textures in a virtual environment. Virtual camera setup: shoot the simulated scene and environment pictures in the virtual environment. Camera correction: based on the position data, angle data, position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera, automatically correct, calibrate, spatially map and color-correct the camera and lens. By performing these corrections automatically, the invention avoids un-captured motion tracks appearing in the shot picture, as well as color difference, color cast and distortion.

Description

Camera correction method and system for space simulation shooting
Technical Field
The present invention relates to the field of image processing technologies, and in particular to a camera correction method and system for space simulation shooting, a storage medium, and an electronic device.
Background
When a virtual camera is used, correction is a critical step because it directly affects tracking quality. Space simulation shooting uses a computer to generate a virtual space that simulates a real environment; users interact with objects in the virtual environment through the necessary equipment, engaging sight, hearing, touch and the like, so that the virtual reality exhibits interactivity, imagination and immersion.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a camera correction method and a system for space simulation shooting.
In order to solve the technical problems, the technical scheme adopted by the invention is a camera correction method for space simulation shooting, which comprises the following steps:
capturing a scene picture and an environment picture of the real world;
Converting the real-world scene picture into a model in the virtual environment, and converting the real-world environment picture into textures in the virtual environment;
Shooting the model and textures in the virtual environment; synchronously controlling the position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera using the position data and angle data of the two cameras calculated by the intelligent terminal and the tracking system; and, based on those position data, angle data, position movement tracks and shooting angle adjustment tracks, having the intelligent terminal automatically perform camera correction, calibration, spatial mapping and color correction on the virtual camera, so that the virtual camera and the real camera shoot interactively;
Matching the components of the automatic correction tool with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time and performs spatial correction of the virtual camera and the virtual display screen in real time;
based on the interactive shooting of the virtual camera and the real camera, fusing the live-action image with the scene of the virtual environment.
Preferably, the color correction includes color difference correction and color cast correction.
Preferably, the camera calibration comprises calibration with an automatic calibration tool comprising a sensor, an image processing unit and a calibration algorithm.
Preferably, the spatial mapping includes geometrically transforming the picture, based on data from the virtual camera and the real camera, to correct a distorted picture.
Preferably, when the position data and angle data of the virtual camera and the real camera differ, the intelligent terminal automatically generates a calibration for any motion track missing from the shot picture, and automatically adjusts the position movement track and shooting angle adjustment track of the virtual camera to match those of the real camera.
According to another aspect of the present invention, there is provided a camera correction system for space simulation shooting, including:
The live-action shooting module is used for capturing a scene picture and an environment picture of the real world;
The virtual environment construction module is used for converting the real-world scene picture into a model in the virtual environment and the real-world environment picture into textures in the virtual environment;
The virtual picture shooting module is used for shooting the model and textures in the virtual environment, synchronously controlling the position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera using the position data and angle data calculated by the intelligent terminal and the tracking system, and having the intelligent terminal automatically perform camera correction, calibration, spatial mapping and color correction on the virtual camera based on those data and tracks, so that the virtual camera and the real camera shoot interactively;
The camera correction module is used for matching the components of the automatic correction tool with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time and performs spatial correction of the virtual camera and the virtual display screen in real time;
The virtual picture and real picture fusion module is used for fusing the live-action image with the scene of the virtual environment, based on the interactive shooting of the virtual camera and the real camera;
The intelligent terminal module comprises a processing module, a storage module and a control module.
Preferably, the virtual picture shooting module further comprises a camera sliding track for moving the real camera and a rotating device for adjusting the angle.
Preferably, the system further comprises a display module comprising a virtual picture display screen, a real picture display screen and a fusion picture display screen. The virtual picture display screen displays the picture and parameters shot by the virtual camera; the real picture display screen displays the picture and parameters shot by the real camera; and the fusion picture display screen displays the picture obtained by fusing the pictures shot by the two cameras, together with the parameters of the virtual camera and the real camera.
Preferably, the spatial mapping includes geometrically transforming the picture, based on data from the virtual camera and the real camera, to correct a distorted picture.
Preferably, when the position data and angle data of the virtual camera and the real camera differ, the intelligent terminal automatically generates a calibration for any motion track missing from the shot picture, and automatically adjusts the position movement track and shooting angle adjustment track of the virtual camera to match those of the real camera.
According to another aspect of the present invention, there is provided an electronic apparatus including: a memory and a processor, the memory and the processor coupled; the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the method of any of the above.
According to another aspect of the present invention, there is provided a computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method described above.
After the above technical scheme is adopted, the invention has the following beneficial effects compared with the prior art; of course, any product implementing the invention does not necessarily need to achieve all of the following advantages at the same time:
Through the intelligent terminal and the tracking system, the camera and lens are automatically corrected, calibrated, spatially mapped and color-corrected, which avoids color difference, color cast and distortion in the shot picture. When the position data and angle data of the virtual camera and the real camera differ, the intelligent terminal automatically generates a calibration and adjusts the position movement track and shooting angle adjustment track of the virtual camera to match those of the real camera, avoiding un-captured motion tracks in the shot picture.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description of embodiments of the present invention with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the invention, are incorporated in and constitute a part of this specification, and illustrate the invention together with its embodiments without limiting it. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 is a flowchart of a camera correction method according to an exemplary embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a camera correction system according to an exemplary embodiment of the present invention.
Detailed Description
Hereinafter, exemplary embodiments according to the present invention will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present invention and not all embodiments of the present invention, and it should be understood that the present invention is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present invention are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present invention, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data or structure referred to in an embodiment of the invention may generally be understood as one or more, unless the context explicitly limits it or indicates otherwise.
Fig. 1 is a flowchart of a camera correction method according to an exemplary embodiment of the present invention.
The method 100 comprises the following steps:
Step 101, capturing a scene picture and an environment picture of the real world.
Step 102, converting the real-world scene picture into a model in the virtual environment and the real-world environment picture into a texture in the virtual environment.
Step 103, shooting the models and textures in the virtual environment; synchronously controlling the position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera using the position data and angle data calculated by the intelligent terminal and the tracking system; and having the intelligent terminal automatically perform camera correction, calibration, spatial mapping and color correction on the virtual camera based on those data and tracks, so that the virtual camera and the real camera shoot interactively.
Step 104, matching the components of the automatic correction tool with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time and performs spatial correction of the virtual camera and the virtual display screen in real time.
Step 105, fusing the live-action image with the scene of the virtual environment based on the interactive shooting of the virtual camera and the real camera.
In another form, the method includes:
Step one: capture real-world scene and environment pictures, shooting the real scene in a live-action studio with professional shooting equipment and techniques.
Step two: convert images or video of the real-world scene and environment into scene and environment pictures in the virtual environment. The intelligent terminal processes the real-world scene and environment pictures, converting the real-world scene picture into a model in the virtual environment and the real-world environment picture into a texture in the virtual environment.
Step three: control the parameters and effects of the simulated scene and environment pictures in the virtual environment through a plurality of cameras, the tracking system and the intelligent terminal's graphics rendering equipment, and shoot the models and textures in the virtual environment.
Step four: the intelligent terminal and the tracking system calculate the position data and angle data of the virtual camera and the real camera, and the position movement tracks and shooting angle adjustment tracks of the two cameras are synchronously controlled. Based on those position data, angle data, position movement tracks and shooting angle adjustment tracks, the intelligent terminal automatically corrects, calibrates, spatially maps and color-corrects the virtual camera, so that the virtual camera and the real camera shoot interactively. A sketch of this pose synchronization follows.
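As an illustrative sketch only, and not part of the original disclosure, the per-frame pose synchronization in step four could look like the following Python fragment; the `VirtualCamera` class and `synchronize` helper are hypothetical names introduced here for illustration. Recording the shared trajectory as poses arrive is what later allows the two cameras' tracks to be compared.

```python
import numpy as np

class VirtualCamera:
    """Hypothetical stand-in for the engine-side virtual camera."""

    def __init__(self):
        self.position = np.zeros(3)  # world-space position, metres
        self.rotation = np.zeros(3)  # pan/tilt/roll in degrees

    def set_pose(self, position, rotation):
        self.position = np.asarray(position, dtype=float)
        self.rotation = np.asarray(rotation, dtype=float)


def synchronize(virtual_cam, tracked_pos, tracked_rot, trajectory):
    """Drive the virtual camera from the real camera's tracked pose and
    record the shared position/angle trajectory for later comparison."""
    virtual_cam.set_pose(tracked_pos, tracked_rot)
    trajectory.append((np.asarray(tracked_pos, float),
                       np.asarray(tracked_rot, float)))


# Per-frame usage; in practice the pose would come from the tracking system.
cam, trajectory = VirtualCamera(), []
synchronize(cam, [1.2, 0.0, 3.5], [0.0, 15.0, 0.0], trajectory)
```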
The camera color correction comprises color difference correction and color cast correction of the virtual camera and the real camera.
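The patent does not specify the color correction algorithm. A minimal sketch, assuming a global per-channel gain (gray-world style) is enough to remove a cast between the two pictures; `correct_color_cast` is a hypothetical helper name:

```python
import numpy as np

def correct_color_cast(real_frame, virtual_frame):
    """Scale each channel of the virtual frame so its mean matches the
    real frame's, removing a global color cast between the two pictures.
    Frames are H x W x 3 uint8 arrays."""
    real = real_frame.astype(np.float32)
    virt = virtual_frame.astype(np.float32)
    gains = real.mean(axis=(0, 1)) / np.maximum(virt.mean(axis=(0, 1)), 1e-6)
    return np.clip(virt * gains, 0, 255).astype(np.uint8)
```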
The camera correction comprises an automatic correction tool whose components include a sensor, an image processing unit and a correction algorithm. These components are matched with the tracking module, so that the intelligent terminal can monitor and adjust the tracking correction of the virtual camera in real time and perform spatial correction of the virtual camera and the virtual display screen in real time.
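The calibration algorithm itself is not spelled out in the patent; a conventional stand-in, assuming OpenCV checkerboard calibration, shows what the sensor, image processing unit and calibration algorithm pipeline could compute (the intrinsic matrix K and the lens distortion coefficients):

```python
import cv2
import numpy as np

def calibrate_camera(gray_images, board=(9, 6), square_size=0.025):
    """Estimate intrinsics K and distortion coefficients from several
    grayscale views of a checkerboard (board = inner-corner counts)."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_size
    obj_pts, img_pts = [], []
    for gray in gray_images:
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    h, w = gray_images[0].shape[:2]
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
    return K, dist
```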
The spatial mapping correction geometrically transforms the picture, according to the data of the virtual camera and the real camera, to correct a distorted picture.
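A minimal sketch of this geometric re-mapping, assuming the K and dist produced by a calibration like the one above and using OpenCV's undistortion remap:

```python
import cv2

def undistort_frame(frame, K, dist):
    """Geometrically re-map a distorted frame using the calibration data,
    i.e. the spatial mapping correction step."""
    h, w = frame.shape[:2]
    new_K, _ = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), alpha=0)
    map1, map2 = cv2.initUndistortRectifyMap(K, dist, None, new_K, (w, h),
                                             cv2.CV_32FC1)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)
```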
When the position data and angle data of the virtual camera and the real camera differ, an un-shot motion track exists in the shot picture. The intelligent terminal then automatically generates a calibration and adjusts the position movement track and shooting angle adjustment track of the virtual camera to match those of the real camera.
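A sketch of this trajectory matching, under the assumption that both trajectories are recorded as (position, rotation) pairs per frame and that fixed tolerances decide when the poses "differ"; `align_trajectories` is a hypothetical helper name, not the patented procedure:

```python
import numpy as np

def align_trajectories(virtual_traj, real_traj, pos_tol=0.01, ang_tol=0.5):
    """Compare the virtual and real camera trajectories frame by frame.
    Where position (m) or angle (deg) deviates beyond tolerance, replace
    the virtual pose with the real one and flag the frame as un-shot."""
    flagged = []
    for i, ((v_pos, v_rot), (r_pos, r_rot)) in enumerate(
            zip(virtual_traj, real_traj)):
        if (np.linalg.norm(np.subtract(v_pos, r_pos)) > pos_tol or
                np.max(np.abs(np.subtract(v_rot, r_rot))) > ang_tol):
            virtual_traj[i] = (np.asarray(r_pos, float),
                               np.asarray(r_rot, float))
            flagged.append(i)
    return flagged  # indices of frames the virtual camera had missed
```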
Step five: based on the interactive shooting of the virtual camera and the real camera, the live-action image is fused with the scene of the virtual environment. The processing and synthesis software in the intelligent terminal fuses the live-action image and the virtual scene seamlessly, creating a vivid simulated live-action shooting effect.
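The synthesis software is not described further; a common approach, assumed here purely for illustration, is chroma keying of the live-action frame over the virtual render:

```python
import cv2
import numpy as np

def composite(real_frame, virtual_frame,
              lower=(35, 60, 60), upper=(85, 255, 255)):
    """Chroma-key the live-action frame (green backdrop assumed) and blend
    the extracted foreground over the virtual-environment render."""
    hsv = cv2.cvtColor(real_frame, cv2.COLOR_BGR2HSV)
    backdrop = cv2.inRange(hsv, np.array(lower), np.array(upper))  # 255 = green
    foreground = cv2.bitwise_and(real_frame, real_frame,
                                 mask=cv2.bitwise_not(backdrop))
    background = cv2.bitwise_and(virtual_frame, virtual_frame, mask=backdrop)
    return cv2.add(foreground, background)
```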
Fig. 2 is a schematic structural diagram of a camera correction system according to an exemplary embodiment of the present invention.
The system 200 includes:
The live-action shooting module 201 is used for capturing a scene picture and an environment picture of the real world.
The virtual environment construction module 202 is used for converting the real-world scene picture into a model in the virtual environment and the real-world environment picture into a texture in the virtual environment.
The virtual picture shooting module 203 is configured to shoot the model and textures in the virtual environment, to synchronously control the position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera using the position data and angle data calculated by the intelligent terminal and the tracking system, and to have the intelligent terminal automatically perform camera correction, calibration, spatial mapping and color correction on the virtual camera based on those data and tracks, so that the virtual camera and the real camera shoot interactively.
The camera correction module 204 is configured to match the components of the automatic correction tool with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time and performs spatial correction of the virtual camera and the virtual display screen in real time.
The virtual picture and real picture fusion module 205 is configured to fuse the live-action image with the scene of the virtual environment, based on the interactive shooting of the virtual camera and the real camera.
In addition, the system 200 further includes an intelligent terminal module, which includes a processing module, a storage module, and a control module.
Described another way, the system includes:
The live-action shooting module is used for capturing a scene picture and an environment picture of the real world;
The virtual environment construction module is used for converting the real-world scene and environment pictures into models and textures in the virtual environment;
The virtual picture shooting module is used for shooting the simulated scene and environment pictures in the virtual environment;
The camera correction module is used for automatically correcting, calibrating, spatially mapping and color-correcting the camera and lens based on the position data, angle data, position movement tracks and shooting angle adjustment tracks of the virtual camera and the real camera, so that the virtual camera and the real camera shoot interactively;
The virtual picture and real picture fusion module is used for fusing the live-action image with the scene of the virtual environment, based on the interactive shooting of the virtual camera and the real camera;
The intelligent terminal module comprises a processing module, a storage module and a control module.
The virtual picture shooting module of this embodiment further comprises a camera sliding track for moving the real camera and a rotating device for adjusting the shooting angle.
The system of this embodiment further comprises a display module, which includes a virtual picture display screen, a real picture display screen and a fusion picture display screen. The virtual picture display screen displays the picture and parameters shot by the virtual camera; the real picture display screen displays the picture and parameters shot by the real camera; and the fusion picture display screen displays the picture obtained by fusing the pictures shot by the two cameras, together with the parameters of the virtual camera and the real camera.
The camera correction module of this embodiment comprises a spatial mapping correction module, a calibration module, a color correction module, and a movement and camera-angle adjustment module.
The basic principles of the present disclosure have been described above in connection with specific embodiments, but it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different manner from other embodiments, so that the same or similar parts between the embodiments are mutually referred to. For system embodiments, the description is relatively simple as it essentially corresponds to method embodiments, and reference should be made to the description of method embodiments for relevant points.
The block diagrams of the devices, apparatuses and systems referred to in this disclosure are merely illustrative examples and are not intended to require or imply that the connections, arrangements and configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses and systems may be connected, arranged and configured in any manner. Words such as "including", "comprising" and "having" are open-ended words meaning "including but not limited to" and may be used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or", unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to".
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, firmware. The above-described sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence specifically described above unless specifically stated otherwise. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices and methods of the present disclosure, components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered equivalent to the present disclosure. The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (10)

1. A camera calibration method for spatially simulated photography, comprising the steps of:
capturing a scene picture and an environment picture of a real world;
Converting a scene picture of a real world into a model in a virtual environment, and converting an environment picture of the real world into textures in the virtual environment;
Shooting a model and textures in a virtual environment, synchronously controlling the position movement track and the shooting angle adjustment track of the virtual camera and the real camera through the position data and the angle data of the virtual camera and the real camera calculated by the intelligent terminal and the tracking system, and automatically carrying out camera correction, calibration, space mapping and color correction on the virtual camera by the intelligent terminal based on the position data, the angle data, the position movement track and the shooting angle adjustment track of the virtual camera and the real camera so as to enable the virtual camera and the real camera to interactively shoot;
The components of the automatic correction tool are matched with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time, and performs space correction on the virtual camera and the virtual display screen in real time;
Fusing the image shot by the live action with the scene of the virtual environment based on the interactive shooting of the virtual camera and the real camera;
When the position data and the angle data of the virtual camera and the real camera are different, the intelligent terminal automatically generates calibration for the motion track which is not shot in the shooting picture, and automatically adjusts the position moving track and the shooting angle adjusting track of the virtual camera to be matched with the position moving track and the shooting angle adjusting track of the real camera.
2. The method for camera correction for spatially-analog photographing according to claim 1, wherein said color correction includes color difference correction and color shift correction.
3. A camera calibration method for spatially simulated photography as claimed in claim 1, wherein said camera calibration comprises calibration using an automatic calibration tool comprising a sensor, an image processing unit and a calibration algorithm.
4. A camera correction method for spatially simulated photographing as claimed in claim 1, wherein said spatial map comprises geometrically varying the frames to correct distorted frames based on data from the virtual camera and the real camera.
5. A camera correction system for spatially simulated photography, comprising:
The live-action shooting module is used for capturing a scene picture and an environment picture of a real world;
A virtual environment module is constructed and used for converting a scene picture of the real world into a model in a virtual environment and converting an environment picture of the real world into textures in the virtual environment;
The virtual picture shooting module is used for shooting models and textures in a virtual environment, synchronously controlling the position movement track and the shooting angle adjustment track of the virtual camera and the real camera through the position data and the angle data of the virtual camera and the real camera obtained by calculation of the intelligent terminal and the tracking system, and automatically carrying out camera correction, calibration, space mapping and color correction on the virtual camera by the intelligent terminal based on the position data, the angle data, the position movement track and the shooting angle adjustment track of the virtual camera and the real camera so as to enable the virtual camera and the real camera to carry out interactive shooting;
The camera correction module is used for matching the components of the automatic correction tool with the tracking module, so that the intelligent terminal monitors and adjusts the tracking correction of the virtual camera in real time, and performs space correction on the virtual camera and the virtual display screen in real time;
The virtual picture and real picture fusion module is used for fusing the image shot by the real scene with the scene of the virtual environment based on the interactive shooting of the virtual camera and the real camera;
Wherein, the virtual picture shooting module is further used for:
When the position data and the angle data of the virtual camera and the real camera are different, the intelligent terminal automatically generates calibration for the motion track which is not shot in the shot picture, and automatically adjusts the position moving track and the shooting angle adjusting track of the virtual camera to be matched with the position moving track and the shooting angle adjusting track of the real camera.
6. The camera calibration system for spatially-analog shooting of claim 5, wherein the virtual frame shooting module further comprises a camera sliding track for real camera movement and a rotation means for adjusting the angle.
7. The camera calibration system of claim 5, further comprising a display module, wherein the display module comprises a virtual picture display screen, a real picture display screen, and a fused picture display screen, wherein the virtual picture display screen is used for displaying pictures and parameters captured by the virtual camera, the real picture display screen is used for displaying pictures and parameters captured by the real camera, and the fused picture display screen is used for displaying pictures after fusion of pictures captured by the virtual camera and pictures captured by the real camera, and parameters of the virtual camera and the real camera.
8. The camera correction system for spatially simulated photography of claim 5, wherein the spatial mapping comprises geometrically transforming the picture, based on data from the virtual camera and the real camera, to correct a distorted picture.
9. An electronic device, the electronic device comprising: a memory and a processor, the memory and the processor coupled; the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the method of any of claims 1 to 4.
10. A computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1 to 4.
CN202311465226.4A (filed 2023-11-06): Camera correction method and system for space simulation shooting. Status: Active. Granted as CN117527992B.

Priority Applications (1)

Application Number: CN202311465226.4A; Priority/Filing Date: 2023-11-06; Title: Camera correction method and system for space simulation shooting

Publications (2)

CN117527992A, published 2024-02-06
CN117527992B, granted and published 2024-06-21

Family ID: 89743099

Country Status (1)

CN: CN117527992B granted




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant