CN113676711B - Virtual projection method, device and readable storage medium - Google Patents

Virtual projection method, device and readable storage medium

Info

Publication number
CN113676711B
CN113676711B (application CN202111135154.8A)
Authority
CN
China
Prior art keywords
virtual
depth map
projected
projection
viewport
Prior art date
Legal status
Active
Application number
CN202111135154.8A
Other languages
Chinese (zh)
Other versions
CN113676711A (en)
Inventor
任志忠
Current Assignee
Beijing Tiantu Wanjing Technology Co ltd
Original Assignee
Beijing Tiantu Wanjing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Tiantu Wanjing Technology Co ltd
Priority to CN202111135154.8A
Publication of CN113676711A
Application granted
Publication of CN113676711B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor
    • H04N 9/3185 Geometric adjustment, e.g. keystone or convergence
    • H04N 9/3188 Scale or resolution adjustment

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)

Abstract

An embodiment of the invention provides a virtual projection method, a virtual projection device and a readable storage medium, belonging to the technical field of film production. The method is based on a virtual camera and a physical camera and comprises the following steps: determining the size of the imaging viewport of the virtual camera; obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; determining an initial imaging depth map based on the virtual surface to be projected; generating a virtual projection cloth based on the initial imaging depth map; and projecting the video source of the physical camera onto the virtual projection cloth. The invention allows the video source to be placed seamlessly in the virtual scene so that the picture is displayed accurately.

Description

Virtual projection method, device and readable storage medium
Technical Field
The invention relates to the technical field of film production, and in particular to a virtual projection method, a virtual projection device and a readable storage medium.
Background
In science-fiction filmmaking, a large number of special effects are required to present virtual scenes. A video source captured by a physical camera (for example, footage of an actor) can currently be projected directly into a virtual scene so that the actor appears to perform inside it. However, because the virtual scene has no physical substance, placing the video source into it is prone to problems: the video source may, for example, be inserted directly into the virtual ground, so that the picture is displayed inaccurately.
Disclosure of Invention
Embodiments of the invention aim to provide a virtual projection method, a virtual projection device and a readable storage medium that allow a video source to be placed seamlessly in a virtual scene so that the picture is displayed accurately.
To achieve the above object, an embodiment of the present invention provides a virtual projection method based on a virtual camera and a physical camera, comprising: determining the size of the imaging viewport of the virtual camera; obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; determining an initial imaging depth map based on the virtual surface to be projected; generating a virtual projection cloth based on the initial imaging depth map; and projecting the video source of the physical camera onto the virtual projection cloth.
Preferably, after the initial imaging depth map is determined, the method further comprises: extracting the initial imaging depth map; determining the imaging depth map contributed to the rendering by any occluder; and deleting that occluder depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth then being generated based on the final imaging depth map.
Preferably, projecting the video source of the physical camera onto the virtual projection cloth comprises: determining a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera; and scaling and correcting the video source based on the projection ratio so as to project it onto the virtual projection cloth.
Preferably, generating the virtual projection cloth comprises generating a base mesh and enabling a material.
Preferably, the virtual body to be projected includes: at least one of a virtual floor, a virtual wall, and a virtual roof.
An embodiment of the invention also provides a virtual projection device based on a virtual camera and a physical camera, comprising: an imaging viewport determining unit, a virtual surface determining unit, a depth map determining unit, a projection cloth generating unit and a projection unit. The imaging viewport determining unit is used for determining the size of the imaging viewport of the virtual camera; the virtual surface determining unit is used for obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; the depth map determining unit is used for determining an initial imaging depth map based on the virtual surface to be projected; the projection cloth generating unit is used for generating a virtual projection cloth based on the initial imaging depth map; and the projection unit is used for projecting the video source of the physical camera onto the virtual projection cloth.
Preferably, the depth map determining unit is further configured to: extract the initial imaging depth map; determine the imaging depth map contributed to the rendering by any occluder; and delete that occluder depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth then being generated based on the final imaging depth map.
Preferably, the projection unit is further configured to: determine a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera; and scale and correct the video source based on the projection ratio so as to project it onto the virtual projection cloth.
Preferably, the projection cloth generating unit is further configured to generate a base mesh and enable a material.
Embodiments of the present invention also provide a machine-readable storage medium having stored thereon instructions that, when executed, implement the above-described method.
With the above technical scheme, the virtual projection method and device provided by the invention are based on a virtual camera and a physical camera, and the method comprises: determining the size of the imaging viewport of the virtual camera; obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; determining an initial imaging depth map based on the virtual surface to be projected; generating a virtual projection cloth based on the initial imaging depth map; and projecting the video source of the physical camera onto the virtual projection cloth. The invention allows the video source to be placed seamlessly in the virtual scene so that the picture is displayed accurately.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 is a diagram illustrating a situation of inaccurate display in the prior art;
FIG. 2 is a flowchart of a virtual projection method according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a projection scenario provided by an embodiment of the present invention;
FIG. 4 is a flowchart of a virtual projection method according to another embodiment of the present invention;
FIG. 5 is a schematic diagram of a projection scenario provided by another embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a virtual projection apparatus according to an embodiment of the present invention.
Description of the reference numerals
1 imaging viewport determining unit 2 virtual surface determining unit
3 depth map determining unit 4 projection cloth generating unit
5 projection unit.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
Fig. 1 is a schematic diagram illustrating inaccurate display in the prior art. As shown in fig. 1, the video source corresponding to the imaging viewport of the virtual camera is parallel to the screen plane of the lens, and is split into a portion outside the virtual body to be projected and a portion inside it; the portion inside is inserted into the virtual body, which corrupts the displayed picture. It will be appreciated that although a virtual floor is used as the example, the same problem can arise with other virtual planes or corners such as virtual walls and virtual roofs.
Fig. 2 is a flowchart of a virtual projection method according to an embodiment of the present invention. As shown in fig. 2, the method is based on a virtual camera and a physical camera, the method comprising:
step S21, determining the size of the imaging viewport of the virtual camera;
For example, the size of the imaging viewport of the virtual camera is determined first; in fig. 1 it is the region between the two rays leaving the virtual camera lens. The size of the imaging viewport depends on the vertical distance D from the virtual body to be projected to the lens of the virtual camera, the size of the imaging element, and the focal length of the lens, so these three quantities together determine it.
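As a concrete illustration, under a pinhole-camera assumption the viewport size follows from similar triangles. The function name and the sample values (a full-frame 36 x 24 mm sensor, 50 mm lens, surface 5 m away) are hypothetical choices for the sketch, not values taken from the patent.

```python
def viewport_size(sensor_w, sensor_h, focal_length, distance):
    """Width and height of the imaging viewport at a plane `distance`
    in front of the lens, for a pinhole camera: the sensor dimensions
    are scaled by distance / focal_length (similar triangles)."""
    scale = distance / focal_length
    return sensor_w * scale, sensor_h * scale

# Full-frame sensor (36 x 24 mm), 50 mm lens, surface 5 m away:
w, h = viewport_size(0.036, 0.024, 0.050, 5.0)  # roughly 3.6 m x 2.4 m
```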
Step S22, obtaining a virtual body to be projected in the imaging viewport based on the size of the imaging viewport so as to determine a virtual surface to be projected;
For example, as shown in fig. 3, the virtual body to be projected is bent (for example, where a virtual floor meets a virtual wall). Based on the size of the imaging viewport, the part of the virtual body inside the viewport, L1 + L2, is determined first; then a plane L inside the viewport and parallel to the screen plane of the lens is determined; finally the two parts are combined to form the virtual surface to be projected. The virtual body to be projected can be understood as whatever would block a normal projection, for example a virtual plane or corner such as a virtual floor, a virtual wall or a virtual roof.
Step S23, determining an initial imaging depth map based on the virtual surface to be projected;
For example, a depth map of the virtual surface to be projected is determined; depth-map computation is standard and is not described again here.
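As one common way such a depth map can be computed, consider the simplest case of a virtual ground plane below a forward-looking pinhole camera. All names and parameters here are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def ground_plane_depth(rows, cols, cam_height, focal_px, cy):
    """Per-pixel depth (distance along the optical axis) of a horizontal
    ground plane `cam_height` below a pinhole camera looking forward.
    Pixels at or above the horizon never hit the plane and stay at inf."""
    depth = np.full((rows, cols), np.inf)
    for v in range(rows):
        dy = (v - cy) / focal_px   # downward slope of the ray through row v
        if dy > 0:                 # ray points below the horizon
            depth[v, :] = cam_height / dy
    return depth
```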
Step S24, generating a virtual projection cloth based on the initial imaging depth map; and
For example, the depth map is converted into a base mesh, a material is enabled on it to form the "virtual projection cloth", and the "virtual projection cloth" is fitted onto the virtual surface to be projected.
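One way to sketch this depth-map-to-mesh conversion is to unproject every pixel into a 3D vertex and connect neighbouring pixels into quads. The function below is a hypothetical minimal version, not the patent's code; enabling the material is left to the host engine.

```python
import numpy as np

def depth_to_mesh(depth, focal_px, cx, cy):
    """Unproject a depth map into grid vertices plus quad faces:
    a base mesh onto which a material can then be enabled to form
    the 'virtual projection cloth'."""
    rows, cols = depth.shape
    verts = []
    for v in range(rows):
        for u in range(cols):
            z = float(depth[v, u])
            # Pinhole unprojection of pixel (u, v) at depth z.
            verts.append(((u - cx) * z / focal_px, (v - cy) * z / focal_px, z))
    faces = []
    for v in range(rows - 1):
        for u in range(cols - 1):
            i = v * cols + u
            faces.append((i, i + 1, i + cols + 1, i + cols))  # one quad per cell
    return verts, faces
```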
And step S25, projecting the video source of the physical camera onto the virtual projection cloth.
For example, the video source is finally projected onto the virtual projection cloth to complete the projection. A perspective correction is first applied to the video source. Then, because the vertical distance D from the virtual body to be projected to the lens of the virtual camera varies, the size at which the video source fills the imaging viewport must be adjusted accordingly: a projection ratio is determined from the vertical distance D, and the video source is scaled by that ratio.
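The scaling step can be illustrated as follows. The reference distance at which the source fills the viewport one-to-one is a hypothetical calibration value of this sketch, not something the patent specifies.

```python
def projection_ratio(distance, reference_distance):
    """A surface farther from the virtual lens spans a proportionally
    larger viewport, so the video source is scaled by D / D_ref."""
    return distance / reference_distance

def scaled_frame_size(src_w, src_h, distance, reference_distance):
    """Pixel size of the video source after scaling by the projection ratio."""
    s = projection_ratio(distance, reference_distance)
    return round(src_w * s), round(src_h * s)

# A 1920x1080 source projected twice as far as the reference distance
# doubles in each dimension.
```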
Fig. 4 is a flowchart of a virtual projection method according to another embodiment of the present invention, and fig. 5 is a schematic diagram of the corresponding projection scenario. As shown in fig. 5, if occluders also lie within the imaging viewport, between the virtual camera and the virtual body to be projected, then after the initial imaging depth map is determined the method further comprises:
step S41, extracting the initial imaging depth map;
step S42, determining the imaging depth map contributed to the rendering by the occluder;
step S43, deleting the occluder's imaging depth map from the initial imaging depth map to obtain a final imaging depth map, so that the virtual projection cloth is generated based on the final imaging depth map.
For example, the initial imaging depth map is extracted, and the depth contributed by the occluder is deleted from it to obtain the final imaging depth map. The final imaging depth map is more accurate than the initial one, so wherever the steps above use the initial imaging depth map the final one can be substituted, for example to generate the virtual projection cloth.
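A minimal sketch of this deletion, assuming both depth maps are rendered from the same virtual camera and that inf marks "no projection-cloth geometry here" (both assumptions are mine, not the patent's):

```python
import numpy as np

def remove_occluder_depth(initial_depth, occluder_depth):
    """Delete from the initial imaging depth map every pixel where an
    occluder rendered in front of the virtual surface (smaller depth);
    those pixels receive no projection-cloth geometry."""
    final = initial_depth.copy()
    final[occluder_depth < initial_depth] = np.inf
    return final
```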
Fig. 6 is a schematic structural diagram of a virtual projection apparatus according to an embodiment of the present invention. As shown in fig. 6, the apparatus is based on a virtual camera and a physical camera and comprises: an imaging viewport determining unit 1, a virtual surface determining unit 2, a depth map determining unit 3, a projection cloth generating unit 4 and a projection unit 5. The imaging viewport determining unit 1 is used for determining the size of the imaging viewport of the virtual camera; the virtual surface determining unit 2 is used for obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; the depth map determining unit 3 is used for determining an initial imaging depth map based on the virtual surface to be projected; the projection cloth generating unit 4 is used for generating a virtual projection cloth based on the initial imaging depth map; and the projection unit 5 is used for projecting the video source of the physical camera onto the virtual projection cloth.
Preferably, the depth map determining unit 3 is further configured to: extract the initial imaging depth map; determine the imaging depth map contributed to the rendering by any occluder; and delete that occluder depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth then being generated based on the final imaging depth map.
Preferably, the projection unit 5 is further configured to: determine a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera; and scale and correct the video source based on the projection ratio so as to project it onto the virtual projection cloth.
Preferably, the projection cloth generating unit 4 is further configured to generate a base mesh and enable a material.
The virtual projection apparatus described above is similar to the embodiment of the virtual projection method described above, and is not described herein again.
The virtual projection device comprises a processor and a memory. The imaging viewport determining unit 1, the virtual surface determining unit 2, the depth map determining unit 3, the projection cloth generating unit 4, the projection unit 5 and so on are stored in the memory as program units, and the processor executes these program units to realize the corresponding functions.
The processor comprises a kernel, which calls the corresponding program unit from the memory. One or more kernels can be provided, and projection is carried out by adjusting kernel parameters.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the virtual projection method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the virtual projection method is executed when the program runs.
An embodiment of the invention provides a device comprising a processor, a memory, and a program stored on the memory and executable on the processor, wherein the processor, when executing the program, realizes the following steps:
determining the size of the imaging viewport of the virtual camera; obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; determining an initial imaging depth map based on the virtual surface to be projected; generating a virtual projection cloth based on the initial imaging depth map; and projecting the video source of the physical camera onto the virtual projection cloth.
Preferably, after the initial imaging depth map is determined, the method further comprises: extracting the initial imaging depth map; determining the imaging depth map contributed to the rendering by any occluder; and deleting that occluder depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth then being generated based on the final imaging depth map.
Preferably, projecting the video source of the physical camera onto the virtual projection cloth comprises: determining a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera; and scaling and correcting the video source based on the projection ratio so as to project it onto the virtual projection cloth.
Preferably, generating the virtual projection cloth comprises generating a base mesh and enabling a material.
Preferably, the virtual body to be projected includes at least one of a virtual floor, a virtual wall and a virtual roof.
The device herein may be a server, a PC, a tablet (PAD), a mobile phone, or the like.
The present application further provides a computer program product adapted to perform, when executed on a data processing device, a program that initializes the following method steps:
determining the size of the imaging viewport of the virtual camera; obtaining, based on that size, the virtual body to be projected within the imaging viewport, so as to determine a virtual surface to be projected; determining an initial imaging depth map based on the virtual surface to be projected; generating a virtual projection cloth based on the initial imaging depth map; and projecting the video source of the physical camera onto the virtual projection cloth.
Preferably, after the initial imaging depth map is determined, the method further comprises: extracting the initial imaging depth map; determining the imaging depth map contributed to the rendering by any occluder; and deleting that occluder depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth then being generated based on the final imaging depth map.
Preferably, projecting the video source of the physical camera onto the virtual projection cloth comprises: determining a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera; and scaling and correcting the video source based on the projection ratio so as to project it onto the virtual projection cloth.
Preferably, generating the virtual projection cloth comprises generating a base mesh and enabling a material.
Preferably, the virtual body to be projected includes at least one of a virtual floor, a virtual wall and a virtual roof.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A virtual projection method, wherein the method is based on a virtual camera and a physical camera, the method comprising:
determining the size of an imaging viewport of the virtual camera;
obtaining a virtual body to be projected in the imaging viewport based on the size of the imaging viewport so as to determine a virtual surface to be projected;
determining an initial imaging depth map based on the virtual surface to be projected;
generating a virtual projection cloth based on the initial imaging depth map; and
projecting a video source of the physical camera onto the virtual projection cloth;
the obtaining a virtual volume to be projected in the presentation viewport based on the size of the presentation viewport to determine a virtual surface to be projected includes: firstly, determining a virtual body to be projected which is positioned in a visualization viewport, then determining a plane which is positioned in the viewport, is outside the virtual body to be projected and is parallel to a screen surface of a lens, and finally superposing the body to be projected and the plane which is parallel to the screen surface of the lens to form a virtual surface to be projected.
2. The virtual projection method of claim 1, wherein after determining the initial imaging depth map, the method further comprises:
extracting the initial imaging depth map;
determining the imaging depth map contributed to the rendering by an occluder;
and deleting the occluder's imaging depth map from the initial imaging depth map to obtain a final imaging depth map, and generating the virtual projection cloth based on the final imaging depth map.
3. The virtual projection method of claim 1, wherein the projecting the video source of the physical camera onto the virtual projection cloth comprises:
determining a projection ratio according to the vertical distance from the virtual body to be projected to the lens of the virtual camera;
and scaling and correcting the video source based on the projection ratio so as to project it onto the virtual projection cloth.
4. The virtual projection method of any of claims 1-3, wherein generating the virtual projection cloth comprises generating a base mesh body and enabling a material.
5. The virtual projection method according to any one of claims 1 to 3, wherein the virtual body to be projected includes:
at least one of a virtual floor, a virtual wall, and a virtual roof.
6. A virtual projection apparatus, wherein the apparatus is based on a virtual camera and a physical camera, the apparatus comprising:
an imaging viewport determining unit, a virtual surface determining unit, a depth map determining unit, a projection cloth generating unit and a projection unit, wherein,
the imaging viewport determining unit is used for determining the size of the imaging viewport of the virtual camera;
the virtual surface determining unit is used for obtaining the virtual body to be projected in the imaging viewport based on the size of the imaging viewport, so as to determine a virtual surface to be projected;
the depth map determining unit is used for determining an initial imaging depth map based on the virtual surface to be projected;
the projection cloth generating unit is used for generating virtual projection cloth based on the initial imaging depth map; and
the projection unit is used for projecting the video source of the physical camera onto the virtual projection cloth;
the obtaining a virtual volume to be projected in the presentation viewport based on the size of the presentation viewport to determine a virtual surface to be projected includes: firstly, determining a virtual body to be projected which is positioned in a visualization viewport, then determining a plane which is positioned in the viewport, is outside the virtual body to be projected and is parallel to a screen surface of a lens, and finally superposing the body to be projected and the plane which is parallel to the screen surface of the lens to form a virtual surface to be projected.
7. The virtual projection apparatus as claimed in claim 6, wherein the depth map determination unit is further configured to:
extracting the initial imaging depth map;
determining an imaging depth map of an occluding object that participates in rendering;
and deleting the occluding object's imaging depth map from the initial imaging depth map to obtain a final imaging depth map, the virtual projection cloth being generated based on the final imaging depth map.
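The depth-map filtering of claim 7 can be illustrated per pixel (a sketch under the assumption that occluder contributions are identified by matching depth samples; names and the FAR sentinel are hypothetical):

```python
FAR = float("inf")  # sentinel for "no geometry rendered at this pixel"

def final_depth_map(initial: list, occluder: list) -> list:
    """Where a pixel's depth sample came from the occluding object, fall
    back to FAR; elsewhere keep the initial imaging depth. The projection
    cloth is then generated from this occluder-free map."""
    return [FAR if occ is not None and abs(d - occ) < 1e-6 else d
            for d, occ in zip(initial, occluder)]

# Middle pixel was rendered from the occluder, so it is removed.
print(final_depth_map([2.0, 3.0, 5.0], [None, 3.0, None]))  # [2.0, inf, 5.0]
```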
8. The virtual projection device of claim 6, wherein the projection unit is further configured to:
determining a projection ratio according to the perpendicular distance from the virtual body to be projected to the lens of the virtual camera;
and scaling and correcting the video source based on the projection ratio so as to project it onto the virtual projection cloth.
9. The virtual projection apparatus according to any of claims 6-8, wherein the projection cloth generating unit is further configured to generate a base mesh and enable its material.
10. A machine-readable storage medium having stored thereon instructions which, when executed, implement the method of any of claims 1-5.
CN202111135154.8A 2021-09-27 2021-09-27 Virtual projection method, device and readable storage medium Active CN113676711B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111135154.8A CN113676711B (en) 2021-09-27 2021-09-27 Virtual projection method, device and readable storage medium

Publications (2)

Publication Number Publication Date
CN113676711A CN113676711A (en) 2021-11-19
CN113676711B true CN113676711B (en) 2022-01-18

Family

ID=78550259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111135154.8A Active CN113676711B (en) 2021-09-27 2021-09-27 Virtual projection method, device and readable storage medium

Country Status (1)

Country Link
CN (1) CN113676711B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2447060B (en) * 2007-03-01 2009-08-05 Magiqads Sdn Bhd Method of creation of a virtual three dimensional image to enable its reproduction on planar substrates
ES2383976B1 (en) * 2010-12-03 2013-05-08 Alu Group, S.L. METHOD FOR VIRTUAL FOOTWEAR TESTING.
US20120180084A1 (en) * 2011-01-12 2012-07-12 Futurewei Technologies, Inc. Method and Apparatus for Video Insertion
JP6615541B2 (en) * 2015-09-02 2019-12-04 株式会社バンダイナムコアミューズメント Projection system
US10802665B2 (en) * 2016-10-05 2020-10-13 Motorola Solutions, Inc. System and method for projecting graphical objects
US10311630B2 (en) * 2017-05-31 2019-06-04 Verizon Patent And Licensing Inc. Methods and systems for rendering frames of a virtual scene from different vantage points based on a virtual entity description frame of the virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant