CN113727041A - Image matting region determination method and device - Google Patents

Info

Publication number
CN113727041A
Authority
CN
China
Prior art keywords
matting
virtual
parameter information
physical
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111196680.5A
Other languages
Chinese (zh)
Inventor
殷元江
余恩林
鲍成刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7d Vision Technology Co ltd
Original Assignee
Beijing 7d Vision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7d Vision Technology Co ltd filed Critical Beijing 7d Vision Technology Co ltd
Priority to CN202111196680.5A priority Critical patent/CN113727041A/en
Publication of CN113727041A publication Critical patent/CN113727041A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay

Abstract

The invention discloses a method and a device for determining an image matting region. The method comprises the following steps: acquiring physical parameter information of a physical screen; determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation; acquiring a visual angle parameter of a video shooting device; and projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of a virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than that of the physical screen, and the matting region changes along with the movement of the video shooting device. In this process, because the matting region is smaller than the physical screen, less of the screen displays the matting background, which reduces the reverse color; and because the matting region changes as the video shooting device moves, the matting region can be adjusted dynamically, further reducing the reverse color on the photographed object in the matting region.

Description

Image matting region determination method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a device for determining an image matting region.
Background
The term "matting" is obtained from the early television production, which means that a certain color in a picture is taken as a transparent color, and the transparent color is scratched from the picture, so that a background is made to be transparent, and a two-layer picture is formed by superposition and synthesis. Thus, the figure shot indoors is overlapped with various scenes after being scratched to form a peculiar image synthesis effect.
In recent years, companies have used large screens and physical blue boxes as matting areas. Although such a matting area is evenly lit and easy to key, a large screen is self-luminous, so if the matting background is displayed over a large area, the photographed object easily develops a reverse-color phenomenon.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for determining a matting region, so as to solve the problem that when a self-luminous large screen is used as the matting region, displaying the matting background over a large area easily produces a reverse-color phenomenon on the photographed object, even though the matting area is evenly lit and easy to key. The specific scheme is as follows:
a method of determining a matting area, comprising:
acquiring physical parameter information of a physical screen, wherein the physical parameter information comprises: size information and spatial position information;
determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation;
acquiring visible angle parameters of video shooting equipment;
and projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of a virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than the size of the physical screen, and the matting region changes along with the movement of the virtual projection camera.
The above method, optionally, further includes:
taking the remaining area other than the matting region as a light supplement region, and setting a light supplement color of the light supplement region;
and supplementing light to the shooting object in the image matting area based on the light supplementing area.
In the above method, optionally, a preset number of infrared positioning cameras are pre-installed at preset positions on the physical screen, and acquiring the physical parameter information of the physical screen includes:
initializing the infrared positioning cameras of the preset number to obtain a space coordinate system;
determining each coordinate of the preset position based on the space coordinate system and the preset number of infrared positioning cameras;
determining parameter information of the physical screen based on the respective coordinates.
Optionally, the method for acquiring the visual angle parameter of the video shooting device includes:
calibrating the video shooting equipment;
and after the calibration is finished, acquiring the visible angle parameter of the video shooting equipment.
Optionally, the method for projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of the virtual projection camera to obtain a matting area includes:
transmitting the visual angle parameter to a virtual camera, and enabling the virtual camera to determine a projection area in the virtual screen based on the visual angle parameter;
and projecting the image matting background onto the projection area to obtain an image matting area.
A matting area determination apparatus comprising:
the first acquisition module is used for acquiring physical parameter information of a physical screen, wherein the physical parameter information comprises: size information and spatial position information;
the determining and establishing module is used for determining virtual parameter information matched with the physical parameter information based on the physical parameter information and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation;
the second acquisition module is used for acquiring the visible angle parameter of the video shooting equipment;
and the projection module is used for projecting the matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of the virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than the size of the physical screen, and the matting region changes along with the movement of the virtual projection camera.
The above apparatus, optionally, further comprises:
the setting module is used for taking the remaining area other than the matting region as a light supplement region and setting a light supplement color of the light supplement region;
and the light supplementing module is used for supplementing light to the shooting object in the image matting area based on the light supplementing area.
In the above apparatus, optionally, a preset number of infrared positioning cameras are pre-installed at preset positions on the physical screen, and the first acquisition module comprises:
the initialization unit is used for initializing the infrared positioning cameras with the preset number to obtain a space coordinate system;
the coordinate determination unit is used for determining each coordinate of the preset position based on the space coordinate system and the preset number of infrared positioning cameras;
a parameter information determination unit for determining parameter information of the physical screen based on the respective coordinates.
In the foregoing apparatus, optionally, the second obtaining module includes:
the calibration unit is used for calibrating the video shooting equipment;
and the acquisition unit is used for acquiring the visible angle parameter of the video shooting equipment after calibration is finished.
In the above apparatus, optionally, the projection module includes:
the transfer unit is used for transferring the visual angle parameter to a virtual camera, and enabling the virtual camera to determine a projection area in the virtual screen based on the visual angle parameter;
and the projection unit is used for projecting the image matting background onto the projection area to obtain an image matting area.
Compared with the prior art, the invention has the following advantages:
the invention discloses a method and a device for determining an image matting region, wherein the method comprises the following steps: acquiring physical parameter information of a physical screen; determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation; acquiring visible angle parameters of video shooting equipment; and projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of the virtual projection camera to obtain a matting area, wherein the size of the matting area is smaller than that of the physical screen, and the matting area changes along with the movement of the video shooting equipment. Above-mentioned in-process, the size of matting the image size of region is less than the size of physical screen and has reduced the existence of matting the image background, has reduced the reverse degree, and the region of matting is along with the video shooting equipment removes and changes, has realized the adjustment of the region of matting, has further reduced the reverse degree of the regional shooting object of matting.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a method for determining a matting area according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an image matting effect disclosed in an embodiment of the present application;
fig. 3 is a block diagram of a device for determining a matte area according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The invention discloses a method and a device for determining a matting region, which are applied to the process of determining the matting region during matting. Compositing techniques that remove part of an image using a solid-color keying technique have existed for more than twenty years. The term "matting" comes from early television production; in English it is called a "key", meaning that a certain color in the picture is treated as a transparent color and keyed out of the picture, so that the background becomes transparent and two layers of pictures can be superimposed and composited. In this way, a figure shot indoors can be keyed out and overlaid onto various scenes to produce a striking image-composition effect. In recent years, companies have also used large screens and physical blue boxes as matting areas. Although such matting areas are evenly lit and easy to key, a matting large screen is self-luminous, so if the matting color is displayed over a large area, the figure easily develops a reverse-color phenomenon, which seriously affects the matting result; furthermore, a physical blue box occupies a large area, uneven lighting affects the matting result, and the box fades after long-term use. Based on the above problems, the present invention provides a method for determining a matting region. The execution flow of the method is shown in fig. 1 and includes the following steps:
s101, acquiring physical parameter information of a physical screen, wherein the physical parameter information comprises: size information and spatial position information;
In the embodiment of the present invention, the physical screen is a real screen; preferably, the physical screen may be an LED screen. An LED display is a flat-panel display composed of small LED module panels and used for displaying text, images, videos and other information. The physical parameter information of the physical screen may be acquired at specified positions or measured by installed sensors; the specific form of acquisition is not limited in the embodiment of the present invention. The physical parameter information includes size information and spatial position information, and the size information includes the length and width of the physical screen.
S102, determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation;
In this embodiment of the present invention, virtual parameter information matched with the physical parameter information is determined based on the physical parameter information; preferably, the virtual parameter information is the same as the physical parameter information. The virtual screen is constructed based on the virtual parameter information, where the virtual parameter information includes size information and position information of the virtual screen, and the size information of the virtual screen includes the length and width of the virtual screen.
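As an illustrative sketch only (not part of the patent text), the one-to-one mapping between the measured physical screen and the virtual screen could be expressed as follows; the `ScreenParams` class and its field names are hypothetical, and Python is used purely for illustration.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class ScreenParams:
    """Size and spatial position of a (physical or virtual) screen."""
    width_m: float       # screen width (size information)
    height_m: float      # screen height (size information)
    center: np.ndarray   # 3D position of the screen center (spatial position information)
    normal: np.ndarray   # unit normal of the screen plane
    up: np.ndarray       # unit vector along the screen's vertical edge


def build_virtual_screen(physical: ScreenParams) -> ScreenParams:
    """Create the virtual screen matched to the physical screen.

    The description states that the virtual parameter information is preferably
    the same as the physical parameter information, so this is an identity copy.
    """
    return ScreenParams(
        width_m=physical.width_m,
        height_m=physical.height_m,
        center=physical.center.copy(),
        normal=physical.normal / np.linalg.norm(physical.normal),
        up=physical.up / np.linalg.norm(physical.up),
    )
```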
S103, acquiring visible angle parameters of the video shooting equipment;
In the embodiment of the present invention, the visual angle parameter is a parameter of the video shooting device, where the video shooting device may be a camera, a camera lens, or the like. The visual angle parameter may be obtained directly from the angle parameter of the video shooting device, or obtained by calibrating the video shooting device and reading out the calibrated visual angle parameter after the calibration is completed.
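For illustration only, the visual angle (field-of-view) parameters can be derived from the focal length and sensor size obtained during lens calibration using standard pinhole-camera geometry; the numeric values below are hypothetical.

```python
import math


def field_of_view_deg(focal_length_mm: float, sensor_w_mm: float, sensor_h_mm: float):
    """Horizontal and vertical field of view of a pinhole camera: fov = 2 * atan(extent / (2 * f))."""
    fov_h = 2.0 * math.degrees(math.atan(sensor_w_mm / (2.0 * focal_length_mm)))
    fov_v = 2.0 * math.degrees(math.atan(sensor_h_mm / (2.0 * focal_length_mm)))
    return fov_h, fov_v


# Hypothetical example: a 35 mm lens on a Super 35-sized sensor (about 24.9 mm x 14.0 mm).
print(field_of_view_deg(35.0, 24.9, 14.0))  # roughly (39.2, 22.6) degrees
```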
S104, projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of the virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than the size of the physical screen, and the matting region changes along with the movement of the virtual projection camera.
In the embodiment of the present invention, the matting background is the background color behind the photographed object; preferably, the matting background is blue. The visual angle parameter is passed to the virtual projection camera in the rendering engine, and the virtual projection camera projects the matting background onto the virtual screen to obtain the matting region. To reduce the reverse-color effect that the matting background causes on the photographed object, the matting region is kept smaller than the physical screen. The size of the matting region can be adjusted by changing parameters such as the angle and position of the virtual projection camera, and the matting region changes as those parameters change: for example, when the virtual projection camera moves farther away, the displayed matting region grows, and when it moves closer, the matting region shrinks. How exactly the matting region is adjusted can be set based on experience or the specific situation and is not limited in the embodiment of the present invention. The virtual projection camera may be a virtual camera or a virtual video camera; further, the matting region changes with the movement of the virtual projection camera.
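A minimal sketch of the "flashlight" projection described above, reusing the hypothetical `ScreenParams` from the earlier sketch. It assumes a planar screen and a camera pointed roughly toward it, and simplifies the footprint to an axis-aligned rectangle in screen coordinates; the function name and geometry are illustrative, not the patent's actual algorithm.

```python
import math

import numpy as np


def matting_rect_on_screen(cam_pos, cam_dir, fov_h_deg, fov_v_deg, screen):
    """Project the camera frustum onto the screen plane, like a flashlight beam.

    Returns an axis-aligned (u, v) rectangle in screen coordinates, clipped to the
    physical screen so the matting region never exceeds the screen itself.
    `screen` is the ScreenParams sketched earlier; cam_dir must point toward the screen.
    """
    cam_dir = cam_dir / np.linalg.norm(cam_dir)

    # Intersection of the optical axis with the screen plane.
    denom = np.dot(cam_dir, screen.normal)
    if abs(denom) < 1e-6:
        raise ValueError("camera is looking parallel to the screen plane")
    t = np.dot(screen.center - cam_pos, screen.normal) / denom
    hit = cam_pos + t * cam_dir
    distance = abs(t)

    # Beam half-extents at that distance (near-perpendicular approximation).
    half_w = distance * math.tan(math.radians(fov_h_deg) / 2.0)
    half_h = distance * math.tan(math.radians(fov_v_deg) / 2.0)

    # Express the hit point in the screen's own (u, v) coordinates.
    right = np.cross(screen.up, screen.normal)
    u = np.dot(hit - screen.center, right)
    v = np.dot(hit - screen.center, screen.up)

    # Clip to the physical screen so the matting region is always smaller than (or equal to) it.
    u_min = max(u - half_w, -screen.width_m / 2.0)
    u_max = min(u + half_w, screen.width_m / 2.0)
    v_min = max(v - half_h, -screen.height_m / 2.0)
    v_max = min(v + half_h, screen.height_m / 2.0)
    return (u_min, v_min, u_max, v_max)
```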
The invention discloses a method and a device for determining an image matting region. The method comprises the following steps: acquiring physical parameter information of a physical screen; determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation; acquiring a visual angle parameter of a video shooting device; and projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of a virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than that of the physical screen, and the matting region changes along with the movement of the video shooting device. In this process, because the matting region is smaller than the physical screen, less of the screen displays the matting background, which reduces the reverse color; and because the matting region changes as the video shooting device moves, the matting region can be adjusted dynamically, further reducing the reverse color on the photographed object in the matting region.
In the embodiment of the present invention, since the size of the matting region is smaller than that of the physical screen, preferably, the part of the physical screen outside the matting region is used as a light supplement region, and a light supplement color is set for that region; the light supplement color may be set based on experience or the specific situation.
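Purely as an illustration of the light supplement idea, the screen image could be composed by painting the whole screen with the light supplement color and then overwriting the matting rectangle (for example the one computed by the sketch above) with the matting blue; the colors and resolution here are hypothetical.

```python
import numpy as np


def compose_screen_image(res_w, res_h, rect_px, matting_color=(0, 0, 255), fill_color=(128, 128, 128)):
    """Fill the screen with the light supplement color, then paint the matting rectangle.

    rect_px = (x_min, y_min, x_max, y_max) in screen pixels; matting_color is the
    keying blue from the description, fill_color is the chosen light supplement color.
    """
    frame = np.empty((res_h, res_w, 3), dtype=np.uint8)
    frame[:, :] = fill_color                  # everything outside the matting region supplements light
    x0, y0, x1, y1 = rect_px
    frame[y0:y1, x0:x1] = matting_color       # matting region shown only inside the camera's view
    return frame
```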
In the embodiment of the present invention, a preset number of infrared positioning cameras are pre-installed at preset positions on the physical screen. Acquiring the physical parameter information of the physical screen then includes: initializing the preset number of infrared positioning cameras to obtain a spatial coordinate system; determining the coordinates of each preset position based on the spatial coordinate system and the preset number of infrared positioning cameras; and determining the size information and position information of the physical screen by taking differences between corresponding components of those coordinates.
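As a sketch of the coordinate-difference step (again reusing the hypothetical `ScreenParams`), the screen's length, width and pose could be recovered from three tracked corner coordinates like this; the corner names are assumptions.

```python
import numpy as np


def screen_params_from_corners(bottom_left, bottom_right, top_left):
    """Recover screen size and pose from three tracked corner coordinates.

    Each argument is a 3D point in the tracking coordinate system, for example a
    marker position measured by the infrared positioning cameras.
    """
    bottom_left = np.asarray(bottom_left, dtype=float)
    bottom_right = np.asarray(bottom_right, dtype=float)
    top_left = np.asarray(top_left, dtype=float)

    width_vec = bottom_right - bottom_left    # difference of corresponding coordinates -> width
    height_vec = top_left - bottom_left       # difference of corresponding coordinates -> height
    width = float(np.linalg.norm(width_vec))
    height = float(np.linalg.norm(height_vec))

    center = bottom_left + 0.5 * width_vec + 0.5 * height_vec
    normal = np.cross(width_vec, height_vec)
    normal /= np.linalg.norm(normal)
    up = height_vec / height
    return ScreenParams(width_m=width, height_m=height, center=center, normal=normal, up=up)
```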
In the embodiment of the invention, a spatial positioning and tracking technology is combined with a camera-matching technology. The position and angle of the video shooting device are determined, its visual angle parameters are acquired, and a solid-color matting image is displayed on the corresponding large screen. The matting region on the screen changes and moves with the camera motion, which ensures that the color required for matting is always within the shooting range. Using a large screen for this dynamic matting eliminates the reverse-color phenomenon that a fully lit large screen produces on a figure; setting a light supplement color at the edge of the matting region adds dynamic fill light to the matting result; and the dynamic display also saves the energy consumed by the large screen, reducing the temperature and noise of the environment and creating a new, efficient and energy-saving dynamic matting system.
Taking the above matting system as an example, the matting system includes: an LED screen for displaying images; a camera for reading lens files and shooting; a tracking and positioning device for initial positioning and shooting tracking; a rendering server for rendering and displaying images; a color key device for camera input, virtual background input and image composition; and a control machine for tracking debugging, rendering control and tracking control. As an example, and not a limitation, of these components: the rendering server is an HP-Z8, the control machine is an HP-Z4 graphics workstation, and the display is an HP-27F 4K; the network switch is a NETGEAR GS752TP; the test camera is a Sony high-definition camera PXW-280; the large LED screen is a Zhongming LED screen with a pixel pitch of P1.576, and the screen splicing server is a Zhongming MVC-2-203; the tracking and positioning device is an OptiTrack P41 six-camera set, and the color key device is a BMD Ultimatte 12.
The implementation process is as follows. Six OptiTrack infrared positioning cameras are installed at the corners of the test site, and the OptiTrack tracking and positioning system is used to initialize the spatial positioning of the site, obtaining the spatial coordinate system of the site. After the coordinate system is established, the physical screen is matched with the virtual screen: the length, width and position of the physical screen are measured, a virtual screen patch matching the physical screen patch is built in the virtual space, and the parameters of the two are bound.
Next, the camera lens is calibrated with a lens calibration board and its visual angle parameters are determined; the calibrated lens parameters are then applied to the virtual camera of the rendering system. The system's virtual camera projects a pure-blue matting image onto the virtual screen, and the physical screen displays the matting region corresponding to the virtual screen. The size, position and angle of the matting region are calculated from the virtual lens parameters, in a manner similar to a flashlight beam: the matting region is displayed only within the area covered by the lens parameters, while the other areas of the physical screen serve as light supplement regions or display nothing. The display range on the virtual screen changes with the camera parameters: zooming out enlarges the display range and zooming in shrinks it, and light supplement regions can be placed outside the display range to increase the illumination on the faces of the people being shot.
The light supplement region and the matting region are integrated and change together with the projected display area. The camera shoots the picture of the real screen; its output is connected through Serial Digital Interface (SDI) signal lines to the color key device, superimposed with the SDI background image output by the rendering server, composited, and then output through an SDI signal line to a monitor to obtain the final result. The effect is shown schematically in fig. 2: as shown in fig. 2, the physical screen may be a single screen or multiple screens, and the matting region (projection region) may be projected on a single screen or across screens.
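To make the per-frame flow concrete, the following hypothetical orchestration ties together the earlier sketches (tracked camera pose in, LED frame out); `tracker`, `lens`, and `screen_px_per_m` are assumed interfaces and do not correspond to any named component of the described system.

```python
def update_matting_frame(tracker, lens, physical_screen, res_w, res_h, screen_px_per_m):
    """One frame of the dynamic matting loop: camera pose in, screen image out.

    tracker.get_camera_pose() is assumed to return (position, direction) in the
    tracking coordinate system; lens carries the calibrated field of view.
    """
    cam_pos, cam_dir = tracker.get_camera_pose()

    # Flashlight-style projection of the camera frustum onto the screen plane.
    u0, v0, u1, v1 = matting_rect_on_screen(
        cam_pos, cam_dir, lens.fov_h_deg, lens.fov_v_deg, physical_screen
    )

    # Convert screen-plane metres to LED pixel coordinates (origin at the screen centre).
    def to_px(u, v):
        x = int(round((u + physical_screen.width_m / 2.0) * screen_px_per_m))
        y = int(round((physical_screen.height_m / 2.0 - v) * screen_px_per_m))
        return x, y

    x0, y1_ = to_px(u0, v0)
    x1, y0_ = to_px(u1, v1)

    # Blue matting rectangle inside the camera's view, light supplement colour everywhere else.
    return compose_screen_image(res_w, res_h, (x0, y0_, x1, y1_))
```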
In this process, the camera and virtual-space technology, combined with spatial positioning and tracking, realize a matting function in which the display on the physical screen dynamically matches the camera. This eliminates the reverse-color phenomenon produced on a figure when the physical screen displays the matting color over its full area. The system also provides a dynamic light supplement function. In this way the matting result is better, the energy consumption of the large screen is reduced, and an efficient, energy-saving large-screen dynamic matting mode is created.
Based on the above method for determining a matte region, an embodiment of the present invention provides a matte region determining apparatus, where a structural block diagram of the determining apparatus is shown in fig. 3, and the apparatus includes:
a first acquisition module 201, a determination and establishment module 202, a second acquisition module 203 and a projection module 204.
Wherein:
the first obtaining module 201 is configured to obtain physical parameter information of a physical screen, where the physical parameter information includes: size information and spatial position information;
the determining and establishing module 202 is configured to determine, based on the physical parameter information, virtual parameter information matched with the physical parameter information, and establish a virtual screen based on the virtual parameter information, where the physical parameter information and the virtual parameter information have a corresponding relationship;
the second obtaining module 203 is configured to obtain a visible angle parameter of the video shooting device;
the projection module 204 is configured to project a matting background onto the virtual screen using a rendering engine based on a visual angle parameter of the virtual projection camera to obtain a matting region, where the size of the matting region is smaller than the size of the physical screen and the matting region changes with the movement of the virtual projection camera.
The invention discloses a device for determining an image matting region. The device: acquires physical parameter information of a physical screen; determines virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishes a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation; acquires a visual angle parameter of a video shooting device; and projects a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of a virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than that of the physical screen, and the matting region changes along with the movement of the video shooting device. In this process, because the matting region is smaller than the physical screen, less of the screen displays the matting background, which reduces the reverse color; and because the matting region changes as the video shooting device moves, the matting region can be adjusted dynamically, further reducing the reverse color on the photographed object in the matting region.
In this embodiment of the present invention, the determining apparatus further includes:
a setting module 205 and a fill light module 206.
Wherein:
the setting module 205 is configured to use the remaining region other than the matting region as a light supplement region, and set a light supplement color of the light supplement region;
the light supplement module 206 is configured to supplement light for the shooting object in the image matting area based on the light supplement area.
In this embodiment of the present invention, a preset number of infrared positioning cameras are pre-installed at a preset position on the physical screen, and the first obtaining module 201 includes:
an initialization unit 207, a coordinate determination unit 208, and a parameter information determination unit 209.
Wherein:
the initialization unit 207 is configured to initialize the preset number of infrared positioning cameras to obtain a spatial coordinate system;
the coordinate determination unit 208 is configured to determine each coordinate of the preset position based on the spatial coordinate system and the preset number of infrared positioning cameras;
the parameter information determining unit 209 is configured to determine parameter information of the physical screen based on the respective coordinates.
In this embodiment of the present invention, the second obtaining module 203 includes:
a calibration unit 210 and an acquisition unit 211.
Wherein:
the calibration unit 210 is configured to calibrate the video shooting device;
the obtaining unit 211 is configured to obtain a visible angle parameter of the video capturing apparatus after the calibration is completed.
In an embodiment of the present invention, the projection module 204 includes:
a transfer unit 212 and a projection unit 213.
Wherein:
the transfer unit 212 is configured to transfer the viewing angle parameter to a virtual camera, so that the virtual camera determines a projection area in the virtual screen based on the viewing angle parameter;
the projection unit 213 is configured to project the image matting background onto the projection area to obtain an image matting area.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the units may be implemented in the same software and/or hardware or in a plurality of software and/or hardware when implementing the invention.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The method and the device for determining a matting area provided by the invention are described in detail above, a specific example is applied in the text to explain the principle and the implementation of the invention, and the description of the above embodiment is only used to help understanding the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A method for determining a matting region, comprising:
acquiring physical parameter information of a physical screen, wherein the physical parameter information comprises: size information and spatial position information;
determining virtual parameter information matched with the physical parameter information based on the physical parameter information, and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation;
acquiring visible angle parameters of video shooting equipment;
and projecting a matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of a virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than the size of the physical screen, and the matting region changes along with the movement of the virtual projection camera.
2. The method of claim 1, further comprising:
taking the rest areas except the image matting area as light supplement areas, and setting light supplement colors of the light supplement areas;
and supplementing light to the shooting object in the image matting area based on the light supplementing area.
3. The method according to claim 1, wherein a preset number of infrared positioning cameras are pre-installed at preset positions on the physical screen, and acquiring the physical parameter information of the physical screen comprises:
initializing the infrared positioning cameras of the preset number to obtain a space coordinate system;
determining each coordinate of the preset position based on the space coordinate system and the preset number of infrared positioning cameras;
determining parameter information of the physical screen based on the respective coordinates.
4. The method of claim 1, wherein obtaining the viewing angle parameter of the video capture device comprises:
calibrating the video shooting equipment;
and after the calibration is finished, acquiring the visible angle parameter of the video shooting equipment.
5. The method of claim 1, wherein projecting a matting background onto the virtual screen using a rendering engine based on a visual angle parameter of the virtual projection camera to obtain a matting area comprises:
transmitting the visual angle parameter to a virtual camera, and enabling the virtual camera to determine a projection area in the virtual screen based on the visual angle parameter;
and projecting the image matting background onto the projection area to obtain an image matting area.
6. A matting region determining apparatus, comprising:
the first acquisition module is used for acquiring physical parameter information of a physical screen, wherein the physical parameter information comprises: size information and spatial position information;
the determining and establishing module is used for determining virtual parameter information matched with the physical parameter information based on the physical parameter information and establishing a virtual screen based on the virtual parameter information, wherein the physical parameter information and the virtual parameter information have a corresponding relation;
the second acquisition module is used for acquiring the visible angle parameter of the video shooting equipment;
and the projection module is used for projecting the matting background onto the virtual screen by using a rendering engine based on the visual angle parameter of the virtual projection camera to obtain a matting region, wherein the size of the matting region is smaller than the size of the physical screen, and the matting region changes along with the movement of the virtual projection camera.
7. The apparatus of claim 6, further comprising:
the setting module is used for taking the rest areas except the image matting area as light supplement areas and setting light supplement colors of the light supplement areas;
and the light supplementing module is used for supplementing light to the shooting object in the image matting area based on the light supplementing area.
8. The apparatus according to claim 6, wherein a preset number of infrared positioning cameras are pre-installed at preset positions on the physical screen, and the first obtaining module comprises:
the initialization unit is used for initializing the infrared positioning cameras with the preset number to obtain a space coordinate system;
the coordinate determination unit is used for determining each coordinate of the preset position based on the space coordinate system and the preset number of infrared positioning cameras;
a parameter information determination unit for determining parameter information of the physical screen based on the respective coordinates.
9. The apparatus of claim 6, wherein the second obtaining module comprises:
the calibration unit is used for calibrating the video shooting equipment;
and the acquisition unit is used for acquiring the visible angle parameter of the video shooting equipment after calibration is finished.
10. The apparatus of claim 6, wherein the projection module comprises:
the transfer unit is used for transferring the visual angle parameter to a virtual camera, and enabling the virtual camera to determine a projection area in the virtual screen based on the visual angle parameter;
and the projection unit is used for projecting the image matting background onto the projection area to obtain an image matting area.
CN202111196680.5A 2021-10-14 2021-10-14 Image matting region determination method and device Pending CN113727041A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111196680.5A CN113727041A (en) 2021-10-14 2021-10-14 Image matting region determination method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111196680.5A CN113727041A (en) 2021-10-14 2021-10-14 Image matting region determination method and device

Publications (1)

Publication Number Publication Date
CN113727041A true CN113727041A (en) 2021-11-30

Family

ID=78685925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111196680.5A Pending CN113727041A (en) 2021-10-14 2021-10-14 Image matting region determination method and device

Country Status (1)

Country Link
CN (1) CN113727041A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006155442A (en) * 2004-12-01 2006-06-15 Nintendo Co Ltd Image processing program and image processor
KR20090028673A (en) * 2007-09-15 2009-03-19 김영대 Virtual studio posture correction machine
US20180088889A1 (en) * 2016-09-29 2018-03-29 Jiang Chang Three-dimensional image formation and color correction system and method
CN111447340A (en) * 2020-05-29 2020-07-24 深圳市瑞立视多媒体科技有限公司 Mixed reality virtual preview shooting system
CN112040092A (en) * 2020-09-08 2020-12-04 杭州时光坐标影视传媒股份有限公司 Real-time virtual scene LED shooting system and method

Similar Documents

Publication Publication Date Title
Bimber et al. Embedded entertainment with smart projectors
Raskar et al. A low-cost projector mosaic with fast registration
JP3450833B2 (en) Image processing apparatus and method, program code, and storage medium
Majumder et al. Perceptual photometric seamlessness in projection-based tiled displays
CN107341832B (en) Multi-view switching shooting system and method based on infrared positioning system
US10950039B2 (en) Image processing apparatus
CN104954769A (en) Immersion type ultra-high-definition video processing system and method
CN106340064A (en) Mixed-reality sandbox device and method
CN107800979A (en) High dynamic range video image pickup method and filming apparatus
TWI321299B (en) System and method for evaluating a dynamic color deviation of a moving image of lcd
JP2006189708A (en) Display device
CN108363519A (en) Distributed infrared vision-based detection merges the touch control display system of automatic straightening with projection
CN112351266B (en) Three-dimensional visual processing method, device, equipment, display system and medium
US9372530B2 (en) Aspect-ratio independent, multimedia capture and editing systems and methods thereof
CN208506731U (en) Image display systems
Hamasaki et al. Hysar: Hybrid material rendering by an optical see-through head-mounted display with spatial augmented reality projection
Minomo et al. Transforming your shadow into colorful visual media: Multi-projection of complementary colors
CN113727041A (en) Image matting region determination method and device
US20210067732A1 (en) Image processing device, image processing method, program, and projection system
CN112866507B (en) Intelligent panoramic video synthesis method and system, electronic device and medium
JP5645448B2 (en) Image processing apparatus, image processing method, and program
Hilario et al. Occlusion detection for front-projected interactive displays
CN104777700B (en) Height immerses projection multi-projector Optimization deployment method
KR101990252B1 (en) Method for producing virtual reality image, portable device in which VR photographing program for performing the same is installed, and server supplying the VR photographing program to the portable device
CN113542463A (en) Video shooting device and method based on folding screen, storage medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination