WO2014133213A1 - Multi-image control system - Google Patents

Multi-image control system Download PDF

Info

Publication number
WO2014133213A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
dimensional space
image layers
layers
space
Prior art date
Application number
PCT/KR2013/001960
Other languages
French (fr)
Korean (ko)
Inventor
조홍래
김길중
이상훈
안효진
문치우
최윤호
박지현
주영렬
고승진
유태진
김대열
권영철
Original Assignee
(주)바이널
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)바이널 filed Critical (주)바이널
Publication of WO2014133213A1 publication Critical patent/WO2014133213A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g. 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/45Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234327Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Definitions

  • The present invention relates to a multi-image control system, and more particularly, to a multi-image control system for controlling stage equipment that makes up the stage of a musical, a play, or the like.
  • A general multivision system is a system that enlarges or reduces a single image and outputs it to a screen.
  • Several image output devices are connected to form the multivision system, and video signals are output to those devices simultaneously or selectively.
  • Multivision systems are installed at the front of exhibition halls or event venues to output images for the exhibition or event, mainly for advertising, and are also used as display devices that output video and subtitles in places such as karaoke rooms.
  • A stage device is equipment on the stage, such as a background, lighting, or costumes, designed to highlight what is performed in accordance with the director's intention in a play, musical, opera, or the like.
  • The background may be created with various objects, props, plywood, and the like arranged on the stage.
  • Materials used in one performance generally cannot be reused in other performances and are therefore discarded.
  • Korean Patent No. 0936423 discloses a lighting system that can produce various lighting effects according to the stage situation in order to make the stage or set stand out.
  • However, while Patent No. 0936424 can be expected to enhance the staging of a stage or set by producing various lighting effects, it cannot directly produce a stage background.
  • The technical object of the present invention is to provide a multi-image control system that can present, in a three-dimensional space, various image sources that change along a timeline, and that can produce various stage effects for musicals, plays, and the like by projecting a plurality of divided image sources onto the three-dimensional space and controlling each of them.
  • According to one aspect, a multi-image control system for controlling a plurality of projectors that are arranged around a three-dimensional space and each project at least one image layer comprises: an image divider for dividing an image source into a plurality of image layers; a mapping unit for mapping the plurality of image layers onto the three-dimensional space; and a controller for controlling the plurality of projectors, in association with a timeline, such that the mapped image layers are projected onto the three-dimensional space.
  • the timeline may be configured to correspond to a cue sheet.
  • the image source may configure a background on the three-dimensional space that is switched corresponding to the cue sheet.
  • the apparatus may further include a warping processor configured to warp the image source or the plurality of image layers to correspond to the curved surface of the 3D space.
  • the apparatus may further include a blending processor configured to blend the overlapped areas between the plurality of image layers.
  • the mapping unit may adjust at least one of positions, tilts, and sizes of the plurality of image layers to match the space to be mapped.
  • the controller may control at least one of a lighting device and a sound device in association with the timeline.
  • The mapping unit may map the plurality of image layers onto the three-dimensional space as captured by live-action photography, using a camera tracking method.
  • The multi-image control system of the present invention can present, in a three-dimensional space, various image sources that change along a timeline, and can produce various stage effects for musicals, plays, and the like by projecting a plurality of divided image sources onto the three-dimensional space and controlling each of them.
  • FIG. 1 is a block diagram of a multi-image control system according to an embodiment of the present invention.
  • FIG. 2 is a conceptual diagram illustrating the operation of a controller according to an embodiment of the present invention.
  • FIG. 3 illustrates projector projection areas in a three-dimensional space according to an embodiment of the present invention.
  • FIG. 4 illustrates a three-dimensional space onto which a plurality of layers are projected according to an embodiment of the present invention.
  • Terms including ordinal numbers, such as first and second, may be used to describe various components, but the components are not limited by these terms; the terms are used only to distinguish one component from another.
  • For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly a first component may be referred to as a second component.
  • FIG. 1 is a block diagram of a multi-image control system according to an embodiment of the present invention.
  • A multi-image control system may include an image divider 110 for dividing an image source into a plurality of image layers, a mapping unit 120 for mapping the plurality of image layers onto a three-dimensional space, a warping processor 140 for warping the plurality of image layers to correspond to a curved surface in the three-dimensional space, a blending processor 150 for blending overlapping areas between the plurality of image layers, and a controller 130 for controlling a plurality of projectors 200, in association with a timeline, such that the mapped image layers are projected onto the three-dimensional space.
  • the image divider 110 may divide an image source into a plurality of image layers.
  • the image source may be, for example, image data constituting a background which is one of stage apparatuses used for a musical, a theater stage, and the like.
  • The image divider 110 may divide the image source into a preset number of image layers, or may automatically calculate the number of projectors 200 connected to the multi-image control system and the number of image layers allocated to each projector 200 and then divide the image source into the calculated number of layers.
  • the mapping unit 120 may map the plurality of divided image layers on the 3D space.
  • The three-dimensional space may be, for example, a stage on which a musical or play is performed, and may include the screen and any object having a volume that is installed on the stage.
  • The mapping unit 120 may define a virtual projection surface for projecting the plurality of image layers and perform auto-calibration so that each image layer can be projected onto the exact point in the three-dimensional space.
  • the mapping unit 120 may map a plurality of image layers on a three-dimensional space using, for example, a camera tracking method.
  • The mapping unit 120 may extract information about the focal length, motion, view angle, and the like of a camera that actually photographs the three-dimensional space, and set three-dimensional coordinates of the projection surface for the image data captured by the camera.
  • The plurality of image layers may then be mapped to correspond to the set three-dimensional coordinates.
  • the mapping unit 120 may adjust at least one of positions, tilts, and sizes of the plurality of image layers to match the space to be mapped. For example, the mapping unit 120 may adjust at least one of a position, an inclination, and a size of the plurality of image layers to match the projection surface by using coordinates of the projection surface calculated through a camera tracking method.
  • the warping processor 140 may warp the image source or the plurality of image layers to correspond to the curved surface in the 3D space.
  • The warping processor may reduce or enlarge portions of an image layer so that the image projected onto a curved surface in the three-dimensional space is not stretched or distorted, converting the image layer to suit conditions such as the bending angle of the curved surface.
  • The warping processor 140 may apply, for example, per-pixel or per-grid screen adjustment and partial image compression.
  • The blending processor 150 may blend the overlapping areas between the plurality of image layers. When the divided image layers are projected onto the three-dimensional space, two image layers overlap in the overlapping area, so that area becomes brighter and stands out from its surroundings.
  • The blending processor 150 may eliminate this prominence by adjusting the brightness and contrast of the overlapping area.
  • the controller 130 may control the plurality of projectors 200 such that the plurality of image layers mapped in association with the timeline are projected on the 3D space.
  • The controller 130 may control the plurality of projectors 200 arranged around the three-dimensional space so that the plurality of image layers are projected.
  • the plurality of projectors 200 may project at least one image layer, and the number of image layers projected by each projector 200 may be set differently.
  • the controller 130 may control the plurality of projectors 200 to project an image layer in association with the timeline.
  • the timeline may mean a time axis variable that is arbitrarily set.
  • the timeline may be a cue sheet used for a musical, a play, a broadcast, or the like.
  • A cue sheet is a table that arranges various cues according to a timetable when producing a play, a radio program, or a television program.
  • In one embodiment of the present invention, a table of action cues, which serve as the basis for scene changes in a musical or play, arranged according to a timetable is described as an example.
  • the controller 130 may control each projector 200 according to the cue sheet, and the projection time of each image layer may be independently controlled.
  • the controller 130 may control at least one of the lighting device and the sound device in association with the timeline.
  • the lighting device and the sound device may be one of stage devices used in a musical, a play, and the like, and the controller 130 may control at least one of the lighting device and the sound device in association with the cue sheet.
  • FIG. 2 is a conceptual diagram illustrating the operation of the controller 130 according to an embodiment of the present invention.
  • Six projectors are arranged around the three-dimensional space, and each projector is set to project at least one image layer.
  • Each image layer is independently controlled according to an action cue, and the lighting device and the sound device can additionally be controlled.
  • FIG. 3 is a diagram illustrating a projector projection area in a three-dimensional space according to an embodiment of the present invention
  • FIG. 4 is a diagram for describing a three-dimensional space in which a plurality of layers are projected according to an embodiment of the present invention.
  • The three-dimensional space may include a rectangular parallelepiped object of arbitrary volume at the center of an open space formed by four surfaces.
  • Each projector is arranged with a fixed projection angle around the three-dimensional space and is set to project an image layer onto the four surfaces or onto the rectangular parallelepiped object.
  • A plurality of image layers are projected onto the three-dimensional space to reproduce a single image source with a three-dimensional effect.
  • Each image layer has been warped and blended, and is mapped by the camera tracking method so that it is projected at the exact position in the three-dimensional space.
  • The term 'unit' used in this embodiment refers to software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a 'unit' performs certain roles.
  • However, 'unit' is not limited to software or hardware.
  • A 'unit' may be configured to reside on an addressable storage medium or to execute on one or more processors.
  • Thus, as an example, 'unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
  • The functionality provided within the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'.
  • The components and 'units' may be implemented to operate one or more CPUs within a device or a secure multimedia card.

Abstract

A multi-image control system is disclosed. The multi-image control system comprises: an image division part for dividing an image source into a plurality of image layers; a mapping part for mapping the plurality of image layers onto a 3D space; and a controlling part for controlling a plurality of projectors by linking with a timeline such that the plurality of mapped image layers are projected onto the 3D space.

Description

Multi-image control system
The present invention relates to a multi-image control system, and more particularly, to a multi-image control system for controlling stage equipment that makes up the stage of a musical, a play, or the like.
A general multivision system is a system that enlarges or reduces a single image and outputs it to a screen; several image output devices are connected to form the multivision system, and video signals are output to those devices simultaneously or selectively.
Multivision systems are installed at the front of exhibition halls or event venues to output images for the exhibition or event, mainly for advertising, and are also used as display devices that output video and subtitles in places such as karaoke rooms.
A stage device is equipment on the stage, such as a background, lighting, or costumes, designed to highlight what is performed in accordance with the director's intention in a play, musical, opera, or the like.
As a stage device, the background may be created with various objects, props, plywood, and the like arranged on the stage. However, plays, musicals, operas, and the like involve dozens of scene changes during a performance, and the materials that decorate the stage must be changed accordingly. In addition, materials used in one performance generally cannot be reused in other performances and are therefore discarded.
Korean Patent No. 0936423 discloses a lighting system that can produce various lighting effects according to the stage situation in order to make the stage or set stand out. However, while Patent No. 0936424 can be expected to enhance the staging of a stage or set by producing various lighting effects, it cannot directly produce a stage background.
The technical object of the present invention is to provide a multi-image control system that can present, in a three-dimensional space, various image sources that change along a timeline, and that can produce various stage effects for musicals, plays, and the like by projecting a plurality of divided image sources onto the three-dimensional space and controlling each of them.
According to one aspect of the present invention, there is provided a multi-image control system for controlling a plurality of projectors that are arranged around a three-dimensional space and each project at least one image layer, the system comprising: an image divider for dividing an image source into a plurality of image layers; a mapping unit for mapping the plurality of image layers onto the three-dimensional space; and a controller for controlling the plurality of projectors, in association with a timeline, such that the mapped image layers are projected onto the three-dimensional space.
The timeline may be configured to correspond to a cue sheet.
The image source may constitute a background in the three-dimensional space that is switched in correspondence with the cue sheet.
The system may further include a warping processor for warping the image source or the plurality of image layers to correspond to a curved surface in the three-dimensional space.
The system may further include a blending processor for blending overlapping areas between the plurality of image layers.
The mapping unit may adjust at least one of the position, tilt, and size of the plurality of image layers to match the space onto which they are mapped.
The controller may control at least one of a lighting device and a sound device in association with the timeline.
The mapping unit may map the plurality of image layers onto the three-dimensional space as captured by live-action photography, using a camera tracking method.
The multi-image control system of the present invention can present, in a three-dimensional space, various image sources that change along a timeline, and can produce various stage effects for musicals, plays, and the like by projecting a plurality of divided image sources onto the three-dimensional space and controlling each of them.
FIG. 1 is a block diagram of a multi-image control system according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating the operation of a controller according to an embodiment of the present invention.
FIG. 3 illustrates projector projection areas in a three-dimensional space according to an embodiment of the present invention.
FIG. 4 illustrates a three-dimensional space onto which a plurality of layers are projected according to an embodiment of the present invention.
As the invention allows for various changes and numerous embodiments, particular embodiments are illustrated in the drawings and described in detail. However, this is not intended to limit the present invention to specific embodiments, and the invention should be understood to include all modifications, equivalents, and substitutes falling within its spirit and scope.
Terms including ordinal numbers, such as first and second, may be used to describe various components, but the components are not limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly a first component may be referred to as a second component. The term "and/or" includes any combination of a plurality of related listed items or any one of the plurality of related listed items.
When a component is referred to as being "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, or intervening components may be present. In contrast, when a component is referred to as being "directly connected" or "directly coupled" to another component, there are no intervening components.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, terms such as "comprise" or "have" are intended to indicate the presence of features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in this application.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings; the same or corresponding components are given the same reference numerals regardless of the figure numbers, and redundant descriptions thereof are omitted.
FIG. 1 is a block diagram of a multi-image control system according to an embodiment of the present invention.
Referring to FIG. 1, a multi-image control system according to an embodiment of the present invention may include an image divider 110 for dividing an image source into a plurality of image layers, a mapping unit 120 for mapping the plurality of image layers onto a three-dimensional space, a warping processor 140 for warping the plurality of image layers to correspond to a curved surface in the three-dimensional space, a blending processor 150 for blending overlapping areas between the plurality of image layers, and a controller 130 for controlling a plurality of projectors 200, in association with a timeline, such that the mapped image layers are projected onto the three-dimensional space.
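The specification describes these blocks only functionally and provides no source code. Purely as a hedged illustration, the following Python sketch shows one way the divider 110, mapping unit 120, warping processor 140, blending processor 150 and controller 130 could be composed into a pipeline; every class and method name here is an assumption introduced for illustration, not part of the disclosure.

```python
# Illustrative sketch only: the specification defines functional blocks, not code.
# Class and method names are assumptions, not part of the disclosure.
from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class ImageLayer:
    pixels: np.ndarray                      # one slice of the divided image source
    projector_id: int                       # projector 200 assigned to this layer
    placement: Optional[np.ndarray] = None  # 3x3 homography set by the mapping unit

class MultiImageControlSystem:
    """Chains the divider 110, mapper 120, warper 140, blender 150 and controller 130."""

    def __init__(self, divider, mapper, warper, blender, controller):
        self.divider, self.mapper = divider, mapper
        self.warper, self.blender = warper, blender
        self.controller = controller

    def run(self, image_source: np.ndarray, cue_sheet) -> None:
        layers: List[ImageLayer] = self.divider.split(image_source)
        layers = self.mapper.map_to_space(layers)   # fit layers to the 3D space
        layers = self.warper.warp(layers)           # compensate curved surfaces
        layers = self.blender.blend(layers)         # soften overlapping regions
        self.controller.play(layers, cue_sheet)     # project along the timeline
```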
First, the image divider 110 may divide an image source into a plurality of image layers. The image source may be, for example, image data constituting a background, which is one of the stage devices used for a musical, a theater stage, and the like.
The image divider 110 may divide the image source into a preset number of image layers, or may automatically calculate the number of projectors 200 connected to the multi-image control system and the number of image layers allocated to each projector 200 and then divide the image source into the calculated number of layers.
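How the source is actually cut into layers is left open by the specification. A minimal sketch, assuming a simple equal-strip split with one strip per allocated layer, might look as follows; the strip layout and the function name are illustrative assumptions.

```python
# Hypothetical splitting strategy (equal vertical strips): the specification only
# requires that the source be divided into the calculated number of layers.
import numpy as np

def split_image(source: np.ndarray, layers_per_projector: list) -> list:
    """Divide the source image into one strip per allocated layer, left to right."""
    total_layers = sum(layers_per_projector)
    height, width = source.shape[:2]
    edges = np.linspace(0, width, total_layers + 1, dtype=int)
    return [source[:, edges[i]:edges[i + 1]] for i in range(total_layers)]

# Example: six projectors, each allocated a single layer.
# strips = split_image(background_image, [1, 1, 1, 1, 1, 1])
```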
The mapping unit 120 may map the divided image layers onto the three-dimensional space. The three-dimensional space may be, for example, a stage on which a musical or play is performed, and may include the screen and any object having a volume that is installed on the stage.
The mapping unit 120 may define a virtual projection surface for projecting the plurality of image layers and perform auto-calibration so that each image layer can be projected onto the exact point in the three-dimensional space.
The mapping unit 120 may map the plurality of image layers onto the three-dimensional space using, for example, a camera tracking method.
The mapping unit 120 may extract information about the focal length, motion, view angle, and the like of a camera that actually photographs the three-dimensional space, set three-dimensional coordinates of the projection surface for the image data captured by the camera, and map the plurality of image layers to correspond to the set three-dimensional coordinates.
The mapping unit 120 may adjust at least one of the position, tilt, and size of the plurality of image layers to match the space onto which they are mapped. For example, the mapping unit 120 may adjust at least one of the position, tilt, and size of the plurality of image layers so that they coincide with the projection surface, using the coordinates of the projection surface calculated through the camera tracking method.
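As an illustration of this adjustment step, the sketch below fits one layer onto a planar projection surface with a single perspective transform, assuming the camera-tracking step has already produced the four corner coordinates of that surface in the projector's output frame; the use of OpenCV and the function name are assumptions, not something the specification prescribes.

```python
# Sketch of the position/tilt/size adjustment, assuming tracked surface corners.
import cv2
import numpy as np

def fit_layer_to_surface(layer, surface_corners, projector_resolution):
    """Warp one image layer so that it lands on the tracked projection surface."""
    h, w = layer.shape[:2]
    layer_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Homography from the layer rectangle to the surface corners
    # (this single transform covers position, tilt and size at once).
    H = cv2.getPerspectiveTransform(layer_corners, np.float32(surface_corners))
    out_w, out_h = projector_resolution
    return cv2.warpPerspective(layer, H, (out_w, out_h))
```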
The warping processor 140 may warp the image source or the plurality of image layers to correspond to a curved surface in the three-dimensional space. The warping processor may reduce or enlarge portions of an image layer so that the image projected onto the curved surface is not stretched or distorted, converting the image layer to suit conditions such as the bending angle of the curved surface.
The warping processor 140 may apply, for example, per-pixel or per-grid screen adjustment and partial image compression.
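A hedged sketch of such grid-level warping is shown below; it assumes a coarse grid of displacement values measured for the curved surface during calibration, and is only one of many ways the warping processor 140 could be realized.

```python
# One possible grid-level warp; the specification leaves the exact method open.
import cv2
import numpy as np

def grid_warp(layer, grid_dx, grid_dy):
    """Resample the layer through a dense map interpolated from a coarse displacement grid."""
    h, w = layer.shape[:2]
    # Upsample the per-grid displacements to per-pixel maps.
    dx = cv2.resize(grid_dx.astype(np.float32), (w, h), interpolation=cv2.INTER_LINEAR)
    dy = cv2.resize(grid_dy.astype(np.float32), (w, h), interpolation=cv2.INTER_LINEAR)
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32), np.arange(h, dtype=np.float32))
    return cv2.remap(layer, xs + dx, ys + dy, interpolation=cv2.INTER_LINEAR)
```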
The blending processor 150 may blend the overlapping areas between the plurality of image layers. When the divided image layers are projected onto the three-dimensional space, two image layers overlap in the overlapping area, so that area becomes brighter and stands out from its surroundings. The blending processor 150 may eliminate this prominence by adjusting the brightness and contrast of the overlapping area.
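One common way to realize this is a complementary falloff ramp across the overlap, so that the two attenuated edges add up to roughly constant brightness in linear light. The sketch below illustrates that idea for a pair of horizontally adjacent layers; the ramp width and gamma value are assumptions, not values taken from the specification.

```python
# Classic edge-blend ramp as an illustration of the blending step, for H x W x 3 layers.
import numpy as np

def edge_blend_pair(left, right, overlap_px, gamma=2.2):
    """Apply complementary falloff ramps to a horizontally adjacent layer pair."""
    ramp = np.linspace(1.0, 0.0, overlap_px) ** (1.0 / gamma)  # compensate projector gamma
    left = left.astype(np.float32)
    right = right.astype(np.float32)
    left[:, -overlap_px:] *= ramp[None, :, None]        # fade out the right edge of 'left'
    right[:, :overlap_px] *= ramp[::-1][None, :, None]  # fade in the left edge of 'right'
    return left.astype(np.uint8), right.astype(np.uint8)
```

With the projector gamma compensated inside the ramp, the two projected contributions sum to approximately the original brightness across the seam.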
The controller 130 may control the plurality of projectors 200, in association with the timeline, such that the mapped image layers are projected onto the three-dimensional space.
The controller 130 may control the plurality of projectors 200 arranged around the three-dimensional space so that the plurality of image layers are projected. Each projector 200 may project at least one image layer, and the number of image layers projected by each projector 200 may be set differently.
The controller 130 may control the plurality of projectors 200 to project the image layers in association with the timeline. The timeline may be an arbitrarily defined time axis and may be, for example, a cue sheet used for a musical, a play, a broadcast, or the like.
A cue sheet is a table that arranges various cues according to a timetable when producing a play, a radio program, or a television program. In one embodiment of the present invention, a table of action cues, which serve as the basis for scene changes in a musical or play, arranged according to a timetable is described as an example.
The controller 130 may control each projector 200 according to the cue sheet, and the projection time of each image layer may be controlled independently.
The controller 130 may control at least one of a lighting device and a sound device in association with the timeline. The lighting device and the sound device may be stage devices used in a musical, a play, and the like, and the controller 130 may control the operation of at least one of them in association with the cue sheet.
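The disclosure does not fix a data format for the cue sheet. The minimal sketch below assumes the cue sheet is a list of timed action cues, each bound to a callable that switches an image layer, adjusts the lighting device, or triggers the sound device; the data layout and the hypothetical device objects are assumptions.

```python
# Minimal cue-sheet driver as a sketch of the controller 130.
import time
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ActionCue:
    at_seconds: float            # position of the cue on the timeline
    action: Callable[[], None]   # e.g. switch a layer, dim the lights, start audio

def run_cue_sheet(cues: List[ActionCue]) -> None:
    """Fire each action cue when its point on the timeline is reached."""
    start = time.monotonic()
    for cue in sorted(cues, key=lambda c: c.at_seconds):
        delay = cue.at_seconds - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        cue.action()

# Example (hypothetical device objects):
# run_cue_sheet([
#     ActionCue(12.5, lambda: projectors[3].show_layer("scene2_background")),
#     ActionCue(12.5, lambda: lighting.fade_to(0.2)),
# ])
```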
FIG. 2 is a conceptual diagram illustrating the operation of the controller 130 according to an embodiment of the present invention.
Six projectors are arranged around the three-dimensional space, and each projector is set to project at least one image layer.
Each image layer is independently controlled according to an action cue, and the lighting device and the sound device can additionally be controlled.
FIG. 3 illustrates projector projection areas in a three-dimensional space according to an embodiment of the present invention, and FIG. 4 illustrates a three-dimensional space onto which a plurality of layers are projected according to an embodiment of the present invention.
Referring to FIG. 3, the three-dimensional space according to an embodiment of the present invention may include a rectangular parallelepiped object of arbitrary volume at the center of an open space formed by four surfaces.
Each projector is arranged with a fixed projection angle around the three-dimensional space and is set to project an image layer onto the four surfaces or onto the rectangular parallelepiped object.
Referring to FIG. 4, a plurality of image layers are projected onto the three-dimensional space to reproduce a single image source with a three-dimensional effect. Each image layer has been warped and blended, and is mapped by the camera tracking method so that it is projected at the exact position in the three-dimensional space.
The term 'unit' used in this embodiment refers to software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a 'unit' performs certain roles. However, 'unit' is not limited to software or hardware. A 'unit' may be configured to reside on an addressable storage medium or to execute on one or more processors. Thus, as an example, 'unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables. The functionality provided within the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'. In addition, the components and 'units' may be implemented to operate one or more CPUs within a device or a secure multimedia card.
Although the above has been described with reference to preferred embodiments of the present invention, those skilled in the art will understand that the present invention can be variously modified and changed without departing from the spirit and scope of the invention set forth in the following claims.

Claims (8)

  1. A multi-image control system for controlling a plurality of projectors installed to project onto a three-dimensional space in which a screen is arranged, each projector projecting at least one image layer, the system comprising:
    an image divider for dividing an image source into a plurality of image layers;
    a mapping unit for mapping the plurality of image layers onto the three-dimensional space; and
    a controller for controlling the plurality of projectors, in association with a timeline, such that the mapped plurality of image layers are projected onto the three-dimensional space.
  2. The multi-image control system of claim 1, wherein the timeline is configured to correspond to a cue sheet.
  3. The multi-image control system of claim 2, wherein the image source constitutes a background in the three-dimensional space that is switched in correspondence with the cue sheet.
  4. The multi-image control system of claim 1, further comprising a warping processor for warping the image source or the plurality of image layers to correspond to a curved surface in the three-dimensional space.
  5. The multi-image control system of claim 1, further comprising a blending processor for blending overlapping areas between the plurality of image layers.
  6. The multi-image control system of claim 1, wherein the mapping unit adjusts at least one of the position, tilt, and size of the plurality of image layers to match the space onto which they are mapped.
  7. The multi-image control system of claim 1, wherein the controller controls at least one of a lighting device and a sound device in association with the timeline.
  8. The multi-image control system of claim 1, wherein the mapping unit maps the plurality of image layers onto the three-dimensional space as captured by live-action photography, using a camera tracking method.
PCT/KR2013/001960 2013-02-28 2013-03-12 Multi-image control system WO2014133213A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0022408 2013-02-28
KR1020130022408A KR20140108819A (en) 2013-02-28 2013-02-28 Control system for multi-image

Publications (1)

Publication Number Publication Date
WO2014133213A1 true WO2014133213A1 (en) 2014-09-04

Family

ID=51428464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/001960 WO2014133213A1 (en) 2013-02-28 2013-03-12 Multi-image control system

Country Status (2)

Country Link
KR (1) KR20140108819A (en)
WO (1) WO2014133213A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080052351A (en) * 2006-12-06 2008-06-11 한국전자통신연구원 Apparatus and method for displaing video for dome screen
US20080204663A1 (en) * 2004-05-26 2008-08-28 Tibor Balogh Method And Apparatus For Generating 3D Images
KR100863280B1 (en) * 2008-01-15 2008-10-15 (주)옴니레이저 Integral control device of image, sound and special effects and control method thererof
US20080309884A1 (en) * 2005-04-26 2008-12-18 O'dor Matthew Electronic Projection Systems and Methods
KR20120121647A (en) * 2011-04-27 2012-11-06 정영종 System and method for displaying 2D and 3D curved video

Also Published As

Publication number Publication date
KR20140108819A (en) 2014-09-15

Similar Documents

Publication Publication Date Title
US20210235049A1 (en) Method for projecting image content
US7407297B2 (en) Image projection system and method
KR101489262B1 (en) Multi-Projection System for expanding a visual element of main image
US9679369B2 (en) Depth key compositing for video and holographic projection
EA030861B1 (en) Image projection apparatus
BG62114B1 (en) Method and system of building in of an image in a video flow
GB2517730A (en) A method and system for producing a video production
CN110225224B (en) Virtual image guiding and broadcasting method, device and system
IL124539A (en) System for estabilishing a three-dimensional garbage matte which enables simplified adjusting of spatial relationships between physical and virtual scene elements
US20160249027A1 (en) Automatic keystone correction in an automated luminaire
WO2017073095A1 (en) Image projection device, stage installation, and image projection method
JP4914492B2 (en) Method and apparatus for displaying image with production switcher
JP2022058501A (en) Hall display system and event execution method using the same
US20180167596A1 (en) Image capture and display on a dome for chroma keying
CN113692734A (en) System and method for acquiring and projecting images, and use of such a system
WO2014133213A1 (en) Multi-image control system
US20090167949A1 (en) Method And Apparatus For Performing Edge Blending Using Production Switchers
CN209748722U (en) Projection real-time interaction device
JP6848575B2 (en) Control systems, control methods, and control programs
US20120327313A1 (en) Wallpaper Projection System
US20210125535A1 (en) Video lighting apparatus with full spectrum white color
JP2004361880A (en) Projector for planetarium and program for astronomical simulator
JP2002108319A (en) Video switcher and video display control system
WO2023196845A2 (en) System and method for providing dynamic backgrounds in live-action videography
WO2023196850A2 (en) System and method for providing dynamic backgrounds in live-action videography

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13876730

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13876730

Country of ref document: EP

Kind code of ref document: A1