WO2013179992A1 - Stage device and stage facility - Google Patents

Stage device and stage facility

Info

Publication number
WO2013179992A1
WO2013179992A1 · PCT/JP2013/064293 · JP2013064293W
Authority
WO
WIPO (PCT)
Prior art keywords
stage
video
virtual
illumination
video data
Prior art date
Application number
PCT/JP2013/064293
Other languages
French (fr)
Japanese (ja)
Inventor
ラクロワ ポール
健典 星
Original Assignee
Sega Corporation (株式会社セガ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Corporation (株式会社セガ)
Publication of WO2013179992A1 publication Critical patent/WO2013179992A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63J: DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J5/00: Auxiliaries for producing special effects on stages, or in circuses or arenas
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00: Indexing scheme for image rendering
    • G06T2215/16: Using real world measurements to influence rendering

Definitions

  • the present invention relates to a stage apparatus and a stage facility used for performance, presentation, and the like, and more particularly to a stage apparatus and a stage facility in which a display unit for displaying an image on a stage is installed.
  • Patent Document 1 describes installing a display unit (screen) on a stage where a performance such as a play is given, and displaying on that display unit video that is interactive with the performers' performance.
  • In recent years, stage apparatuses and stage facilities are also known in which real performers perform on the stage while video objects, such as animated characters generated by rendering virtual objects, are displayed on a display unit installed on the stage, enabling an effect as if the video objects were performing together with the real performers.
  • The present invention has been made in view of the above situation, and has as its object improving the realism and/or video quality of a video object in a stage apparatus and stage facility that display the video object on a display unit installed on a stage.
  • The present invention is a stage apparatus or stage facility including a stage, illumination capable of irradiating the stage with light, illumination control means for controlling the illumination, a display unit installed on the stage for displaying a video object, and video data generation means for generating video data for displaying the video object on the display unit, wherein the video data generation means adjusts the color of the video object in synchronization with the illumination control performed by the illumination control means.
  • “Stage” in this specification means a place or space where performances and presentations by real people and things are performed.
  • The type and content of the performance or presentation are arbitrary. Examples of performances and presentations include human dance, musical performance, singing, plays, and the display of goods.
  • illumination means a member, equipment, device or the like that can irradiate the stage with light.
  • a halogen lamp, a xenon lamp, an LED illumination, or the like may be used.
  • the number and arrangement of “lights” are arbitrary. “Illumination” includes spotlights, ceiling lights, footlights, front lights, moving lights, and the like depending on their roles, functions, and positional relationship with the stage.
  • "Illumination control means" in this specification is means for controlling the illumination (the irradiation of light onto the stage). Examples of "control" of the lighting include turning the lighting on/off (lighting/extinguishing), changing the light intensity, changing the lighting color, and changing the lighting position and light irradiation direction (directivity). "Control" may be performed manually or automatically. In the case of manual control, the illumination control means can be configured as a console (operation panel) operated by an operator. The console may have operation members such as buttons, switches, and volume knobs for the operator to input operations. When a computer automatically controls the lighting according to a preset program, the computer can constitute the illumination control means.
  • "Display unit" in this specification means a portion, member, device, system, or the like on which video is displayed.
  • the display unit may be a CRT, LCD, plasma display, LED display or screen thereof, a screen for projecting a projector image, or the like.
  • Virtual object in this specification means shape data in a virtual space.
  • a “virtual object” can be defined by a plurality of polygons. The shape of the “virtual object” can be three-dimensional.
  • a “virtual object” can have the shape of a person, animal, fictional character, or other object.
  • Video object in this specification refers to an object such as a person or an object displayed as a video.
  • a “video object” may be generated by rendering a “virtual object”.
  • Color in this specification includes brightness.
  • stage facility may be an indoor facility having an outer wall, a ceiling, or the like, or an outdoor facility installed in a space such as a stadium.
  • Since the color of the video object is adjusted in synchronization with the lighting control by the lighting control means, an effect corresponding to lighting effects, such as the shadows that the lighting produces on real people and objects on the stage, can be displayed on the video object.
  • Examples of the adjustment of the color of the video object in the present invention are as follows. (1) In synchronization with control that irradiates the stage with illumination light from the right, the right side of the video object is brightened and its left side is darkened (this produces an effect as if the right side of a person or object on the stage were bright and the left side dark). (2) In synchronization with control that irradiates the stage with strong illumination light, the color (brightness) of the video object is brightened (this produces an effect as if a person or object on the stage looked bright).
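As an illustrative sketch only (not part of the patent disclosure), adjustments (1) and (2) above could be expressed as a simple brightness rule; the function name and the 0.5 weighting factor are assumptions for illustration:

```python
def adjust_object_brightness(light_direction, light_intensity):
    """Return (left, right) brightness multipliers for the video object.

    light_direction: -1.0 (light from stage left) .. +1.0 (light from stage right)
    light_intensity: 0.0 (off) .. 1.0 (full power)
    """
    base = light_intensity  # (2): stronger light -> brighter object overall
    # (1): light from the right brightens the object's right side and
    # darkens its left side, and vice versa.
    right = base * (1.0 + 0.5 * light_direction)
    left = base * (1.0 - 0.5 * light_direction)
    return left, right

# Light fully on, coming from the right of the stage:
left, right = adjust_object_brightness(light_direction=1.0, light_intensity=1.0)
```

With light from the right at full intensity, the right-side multiplier exceeds the left-side one, matching example (1); with the light off, both sides go dark, matching example (2).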
  • In synchronization with control that changes the color of the illumination light, the corresponding part of the video object can be changed to a color corresponding to that illumination light.
  • Preferably, the video data generation means includes virtual space setting means that arranges a virtual object and a virtual light source in a virtual space, color adjustment means that adjusts the color of the virtual object by reflecting the influence of the virtual light source, and rendering means that generates the video data by rendering the virtual object. This makes it easy to adjust the color of the video object in real time in synchronization with the lighting control by the lighting control means.
  • Preferably, the virtual space setting means arranges the virtual object and the virtual light source in the virtual space in a positional relationship corresponding to the positional relationship between the display unit and the illumination, and/or the number, intensity, color, and/or direction of the virtual light sources correspond to the number, intensity, color, and/or direction of the illumination. This allows the color of the video object to be adjusted faithfully to the control of the illumination by the illumination control means.
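A minimal sketch of this correspondence, assuming a simple parameter set (on/off, intensity, color, direction) per light; the `LightState` class and its field names are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class LightState:
    on: bool
    intensity: float   # 0.0 .. 1.0
    color: tuple       # (r, g, b), each 0.0 .. 1.0
    direction: tuple   # unit vector toward the stage

def mirror_to_virtual(real_lights):
    """Create one virtual light per real light with matching parameters.

    A light that is turned off is mirrored with zero intensity so the
    rendered object goes dark on the corresponding side.
    """
    return [
        LightState(l.on, l.intensity if l.on else 0.0, l.color, l.direction)
        for l in real_lights
    ]
```

The one-to-one mirroring is the simplest way to satisfy the "number, intensity, color and/or direction correspond" condition above.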
  • Preferably, the invention further includes main motion data generation means for generating main motion data by detecting the motion of a person being measured, and the video data generation means generates, based on the main motion data, video data in which the video object moves simultaneously with that motion. This makes it possible to make the video object move in a manner appropriate to the situation at the time (for example, the progress of the stage or the state of the audience area).
  • Preferably, the invention further includes a controller that the person being measured can operate while performing, and sub-motion data generation means for generating sub-motion data in accordance with operations on the controller, and the shape and/or pattern of all or part of the video object is changed based on the sub-motion data.
  • In this invention, the person being measured can input sub-motion data for changing the shape and/or pattern of all or part of the video object by operating the controller while performing the motion that generates the main motion data.
  • For example, when the video object has the shape of a person or the like and the shape of the video object's hands (hand gestures, etc.) or the shape/pattern of its head (facial expression, hair shape, etc.) is changed by sub-motion data, it becomes easy to change the video object's hand gestures or facial expression at an appropriate timing in accordance with the video object's motion.
  • Preferably, preset motion data storage means for storing preset motion data that defines the motion of the video object, and switching means for switching the video data generation means between a first state and a second state, are further provided.
  • Preferably, the video data generation means generates video data in which the video object moves based on the main motion data in the first state, and generates video data in which the video object moves based on the preset motion data in the second state.
  • Because it is possible to switch between a state (the first state) in which video data for moving the video object is generated based on the main motion data and a state (the second state) in which it is generated based on the preset motion data, the video object can continue to move by switching from the first state to the second state even if, for example, the main motion data cannot be generated properly due to a device failure or because the person being measured has moved to a place where motion cannot be detected.
  • FIG. 3 is an explanatory diagram showing a configuration of an exemplary motion capture system 310.
  • FIG. 4 is an explanatory diagram illustrating a functional configuration of the video processing device 450.
  • FIG. 5 is an explanatory diagram illustrating an exemplary virtual space 600 set by a virtual space setting unit 530.
  • FIG. 6 is an explanatory diagram showing a state in which a video object 230 is displayed on a display unit 211 installed on a stage 210.
  • FIG. 7 is an explanatory diagram showing a video object 230 displayed on the display unit 211 when switching is performed by a data switching unit 570.
  • Hereinafter, the stage apparatus 10 and the stage facility 100 according to a preferred embodiment will be described with reference to the drawings.
  • FIG. 1 is an explanatory diagram showing a schematic arrangement of a stage facility 100 having a stage apparatus 10 according to an embodiment of the present invention.
  • FIG. 2 is an explanatory diagram showing the main part of the stage apparatus 10 from the audience area 240 side.
  • the stage facility 100 includes a venue 200, a motion capture room 300, and an operation room 400.
  • the venue 200 has a stage 210 and a spectator area 240.
  • Stage 210 is a place where performers 220 can perform performances and present things.
  • the shape of the stage 210 is arbitrary, and may have a flat surface, a curved surface, or a three-dimensional shape.
  • the display unit 211 is installed on the stage 210.
  • the display unit 211 is an arbitrary part, member, apparatus, or system capable of displaying a video such as the video object 230.
  • In this embodiment, a screen that displays video from the projector 212 is used as the display unit 211.
  • the shape of the display unit 211 is arbitrary, and may have a flat surface, a curved surface, or a three-dimensional shape.
  • The projector 212 is disposed at an arbitrary position from which video can be projected onto the display unit 211. As shown in the figure, when the projector 212 is arranged behind the display unit 211, the projector's image can be prevented from being blocked by a performer 220 performing in front of the display unit 211. In this case, a rear-projection screen made of a transparent sheet-like member or the like can be used as the display unit 211.
  • a single display unit 211 is installed at the approximate center of the stage 210, but the arrangement and number of the display units 211 on the stage 210 are arbitrary.
  • For example, by arranging a plurality of display units so as to surround the stage 210, such as placing display units consisting of three-dimensional models onto which video is projection-mapped on both sides of the display unit 211 shown in the figure, and/or placing a display unit such as an LED screen behind the display unit 211, it is possible to produce an effect in which the performance and presentation on the stage 210 and the video on the display units are integrated.
  • The stage apparatus 10 and stage facility 100 have illuminations 213L and 213R for irradiating the stage 210 with light.
  • In the figure, one illumination 213L and one illumination 213R are shown near the ceiling on the left and right of the stage 210, respectively, but the illuminations can be arranged at any other position from which the stage 210 can be irradiated with light.
  • the number of lights may be one or three or more.
  • The stage apparatus 10 and stage facility 100 can further include stage equipment such as one or more speakers 214 that output sound such as music, a smoke machine 215 for special-effects staging, and a laser device 216.
  • the audience area 240 is a place for the audience to see the performance performed at the stage 210.
  • the audience area 240 may have one or more seats 241 on which the audience can sit.
  • The stage apparatus 10 and stage facility 100 may further include photographing means 217 for photographing the audience area 240.
  • the photographing means 217 is arranged at an appropriate position such as near the ceiling at the center of the stage 210 so that the entire audience area 240 or a specific part can be appropriately photographed.
  • In the motion capture room 300, a motion capture (MC) system 310 is arranged for generating main motion data by detecting (capturing) motions, such as a dance, performed by a person being measured (a motion capture actor) 320.
  • the operation of the measurement subject 320 can be detected by an arbitrary method such as an optical method, a mechanical method, or a magnetic method.
  • FIG. 3 is an explanatory diagram showing a configuration of an exemplary motion capture system 310.
  • The motion capture system 310 includes a plurality of markers 311 attached to appropriate positions on the person being measured 320, such as the limbs, head, and waist, and one or more sensors 313, arranged around a predetermined motion area 312 (the area in which the person being measured 320 moves), capable of detecting the markers 311.
  • The sensors 313 are connected to a processing device 314 wirelessly or by wire, and the processing device 314 generates the main motion data by calculating the three-dimensional position of each marker 311 based on the data from the sensors 313.
  • The main motion data is data for determining the position and posture of the virtual object 610 (see FIG. 5) arranged in the virtual space 600 in accordance with the motion of the person being measured 320. Specifically, data on the position of a reference point of the virtual object (for example, the origin of the object coordinate system fixed to the virtual object 610) and on the relative positions, with respect to that reference point, of the parts of the virtual object 610 (the head 611, hands 612, feet 613, etc.) are generated.
  • the processing device 314 can be composed of a computer having a CPU and a storage device, and the sensor 313 can be composed of photographing means such as a camera.
  • the number and arrangement of the sensors 313 are appropriately set in consideration of the accuracy of position detection of the marker 311 and the like.
  • the motion capture system 310 can have controllers 315L and 315R operated by the person 320 to be measured as means for generating sub motion data.
  • The specifications, number, etc. of the controllers 315L and 315R are arbitrary, but in this embodiment left and right handy controllers 315L and 315R that can each be gripped and operated with one hand are used, so that the person being measured 320 can input sub-motion data from the handy controllers 315L and 315R while performing.
  • the handy controllers 315L and 315R have operation members such as buttons and joysticks.
  • the handy controllers 315L and 315R are preferably connected to the processing device 314 wirelessly.
  • the sub motion data is data for determining the shape and / or pattern of the whole or a part of the virtual object 610.
  • In this embodiment, data input by operating the buttons of the handy controllers 315L and 315R is used as hand gesture data, and data input by operating the joysticks is used as facial expression and hair shape data.
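The button-to-gesture and joystick-to-expression assignments might be sketched as simple lookups; the button labels, gesture names, and joystick thresholds below are illustrative assumptions, not values from the patent:

```python
# Hypothetical mapping: each controller button selects a hand-gesture shape.
BUTTON_TO_GESTURE = {"A": "open_hand", "B": "peace_sign", "C": "fist"}

def joystick_to_expression(x):
    """Map one joystick axis value (-1.0 .. 1.0) to a facial-expression label."""
    if x < -0.33:
        return "sad"
    if x > 0.33:
        return "happy"
    return "neutral"
```

The point of such a mapping is that a small set of one-handed inputs, made without interrupting the captured performance, selects among pre-authored shapes and textures for the hands and head.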
  • a display device 330 that displays live video of the spectator area 240 photographed by the photographing means 217 is disposed in front of the motion area 312.
  • The person being measured 320 can perform while viewing the video on the display device 330, and can operate the handy controllers 315L and 315R while doing so.
  • the person 320 to be measured can input the main motion data and / or the sub-motion data according to the real-time situation of the spectator area 240.
  • For example, it becomes possible to generate video data of the video object 230 that moves according to the situation of the audience area 240, such as the video object 230 interacting with a specific spectator.
  • main motion data and sub-motion data generated by the motion capture system 310 are collectively referred to as “real-time motion data”.
  • The operation room 400 is provided with a lighting console 410, a video console 420, a sound console 430, an SFX console 440, and a video processing device 450.
  • the lighting console 410 is for controlling the lighting 213L and 213R.
  • By operating the lighting console 410, the illuminations 213L and 213R can be individually turned on and off, and control such as changing their intensity (brightness) and/or color can be performed.
  • a control signal S1 based on the operation of the operator 460 is transmitted from the lighting console 410 to the lights 213L and 213R, and the lights 213L and 213R are controlled based on the control signal S1.
  • In addition, a notification signal S2 notifying that the control has been performed, or its contents, is transmitted to the video processing device 450.
  • the notification signal S2 may be the same signal as the control signal S1 or a separate signal, and a signal obtained by branching the control signal S1 can be used as the notification signal S2.
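A minimal sketch of this fan-out, in which the same payload serves as control signal S1 for the lights and, branched off, as notification signal S2 for the video processing side; the class and method names are hypothetical, not from the patent:

```python
class Light:
    """Stand-in for a stage light that responds to control signal S1."""
    def __init__(self):
        self.state = None
    def apply(self, payload):
        self.state = payload

class VideoProcessor:
    """Stand-in for the video processing side receiving notification signal S2."""
    def __init__(self):
        self.last_notice = None
    def notify(self, payload):
        self.last_notice = payload

def send_control(payload, lights, video_processor):
    for light in lights:
        light.apply(payload)          # control signal S1
    video_processor.notify(payload)   # notification signal S2, branched from S1
```

Because S2 carries the same payload as S1, the video side always sees exactly the lighting state that was commanded, which is what makes the synchronized color adjustment possible.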
  • the video console 420 is for performing control related to video data generated by the video generation device 450.
  • By performing predetermined operations on the video console 420, the operator 460 can switch between a first state in which video data for moving the video object 230 is generated based on the real-time motion data and a second state in which video data for moving the video object 230 is generated based on the preset motion data.
  • When such an operation is performed, a notification signal S3 notifying that the control has been performed, or its contents, is transmitted to the video processing device 450.
  • the preset motion data will be described later.
  • the sound console 430 controls the sound output from the speaker 214, and the sound source data stored in an appropriate medium is reproduced from the speaker 214 at the timing, volume, etc. designated by the sound console 430.
  • SFX-related production equipment such as the smoke machine 215 and the laser device 216 is controlled by operating the SFX console 440.
  • The video processing device 450 is connected to the lighting console 410, the video console 420, and the motion capture system 310, and performs processing to generate the video data to be displayed on the display unit 211 based on the notification signals S2 and S3 and the data received from them.
  • the video processing device 450 can be configured by a CPU and a computer including a storage device that stores various programs and data necessary for generating video data.
  • the storage device can be configured by a known storage device such as a ROM, a RAM, a hard disk, or a flash memory, alone or in combination.
  • the stage apparatus 10 of this embodiment includes a stage 210, illuminations 213L and 213R, an illumination console 410, a video processing apparatus 450, and the like.
  • FIG. 4 is an explanatory diagram showing a functional configuration of the video processing device 450.
  • The video processing device 450 includes a video data generation unit 500 comprising an object storage unit 510, a texture storage unit 520, a virtual space setting unit 530, a color adjustment unit 540, a rendering unit 550, a preset motion data storage unit 560, a data switching unit 570, and the like.
  • the object storage unit 510 stores object data that defines one or a plurality of virtual objects 610 to be arranged in the virtual space.
  • In this embodiment, object data for a virtual object 610 having the shape of a person, a character, or the like is stored.
  • the format for defining the virtual object 610 is arbitrary.
  • the virtual object 610 can be defined by the position coordinates of polygons constituting the virtual object 610 in the local coordinate system fixed to the virtual object 610.
  • the posture of the virtual object 610 can be changed by changing the arrangement of the parts (the head 611, the hand 612, the foot 613, etc.) constituting the virtual object 610 in the local coordinate system.
  • the object storage unit 510 stores a plurality of types of shapes for the head 611 and the hand 612.
  • the texture storage unit 520 stores a texture for mapping to the virtual object 610.
  • the texture is image data drawn in the form of a bitmap or the like for defining the pattern, color, transparency, etc. of the virtual object 610.
  • By mapping textures, details such as a facial expression and clothing can be given to the virtual object 610.
  • the texture storage unit 520 stores a plurality of types of textures for the head 611 and the like.
  • the virtual space setting unit 530 executes processing for arranging the virtual object 610, the virtual light sources 620L and 620R, and the virtual viewpoint in the virtual space 600 in the manner illustrated in FIG.
  • the virtual object 610 is arranged in the virtual space at the position and the posture determined by the main motion data supplied from the motion capture system 310 by streaming (in real time). That is, the position coordinates of the reference point of the virtual object 610 in the virtual space 600 and the arrangement of parts such as the head 611, the hand 612, and the foot 613 with respect to the reference point are determined by the real-time main motion data. As a result, the motion of the measured person 320 is reflected in real time on the motion of the virtual object 610, and the virtual object 610 performs the same or corresponding action as the measured person 320 simultaneously with the motion of the measured person 320. Will do.
  • object data and texture data having a shape / pattern corresponding to the sub motion data input from the handy controllers 315L and 315R are applied to the head 611 and the hand 612 of the virtual object 610.
  • the expression of the head 611 and / or the shape of the hair can be changed variously as shown in the frame 621, and the shape (hand gesture) of the hand portion 612 can be changed variously as shown in the frame 622. It becomes.
  • the virtual viewpoint is set at an appropriate position and direction in the virtual space 600.
  • the position and direction of the virtual viewpoint may be fixed or movable, but here, a description will be given assuming that the virtual viewpoint oriented in the direction of the paper surface is set at a predetermined position on the front side of the paper surface in FIG.
  • The arrangement and/or number of the virtual light sources 620L and 620R can be determined in accordance with the control of the illuminations 213L and 213R performed at the lighting console 410. For example, when the lighting console 410 performs control to turn on the illumination 213R on the right side of the stage 210 as viewed from the audience area 240 and to turn off the illumination 213L on the left side, the virtual light source 620R is arranged on the right side of the virtual object 610 as viewed from the virtual viewpoint; when control is performed so that both the left and right illuminations 213L and 213R are turned on, the virtual light sources 620L and 620R can be arranged on both the left and right sides of the virtual object 610.
  • The placement, movement, parameter control, and the like of the virtual light sources 620L and 620R described above are performed in synchronization with the control of the illuminations 213L and 213R at the lighting console 410. In this embodiment, the synchronization is performed automatically based on the notification signal S2 output from the lighting console 410.
  • The color adjustment unit 540 adjusts the color of the virtual object 610 by reflecting the influence of the virtual light sources 620L and 620R (virtual light rays from the virtual light sources 620L and 620R). Specifically, processing such as casting a shadow 614 from the virtual light sources 620L and 620R onto the virtual object 610, or changing the color of the virtual object 610 in correspondence with the colors of the virtual light sources 620L and 620R, is performed. For such color adjustment, a known method such as Phong shading or Gouraud shading can be used.
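As a sketch of the kind of computation such shading methods build on, the Lambertian diffuse term darkens surfaces that face away from a virtual light source; this is a generic graphics formula, not code from the patent:

```python
def diffuse_brightness(normal, light_dir):
    """Lambertian diffuse term: clamped dot product of the unit surface
    normal and the unit direction from the surface toward the light.

    Returns 1.0 for a surface facing the light head-on and 0.0 for a
    surface facing away (i.e. in shadow-side shading).
    """
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)
```

Phong and Gouraud shading differ in whether such per-point lighting is evaluated per pixel or per vertex, but both rely on terms of this form to shade the object consistently with the virtual light sources.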
  • The rendering unit 550 generates the video data of the video object 230 displayed on the display unit 211 by rendering, at every predetermined frame time, an image of the virtual object 610, whose color has been adjusted by the color adjustment unit 540, as viewed from the virtual viewpoint. Rendering can be performed using known methods such as projection conversion, rasterization, and hidden surface removal.
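The projection-conversion step can be illustrated with a minimal pinhole projection; the camera convention assumed here (camera at the origin, looking down the negative z axis, focal length `f`) is an illustration, not the patent's specification:

```python
def project(point3d, f=1.0):
    """Project a 3D point onto the 2D image plane of a pinhole camera
    at the origin looking down -z with focal length f.

    The point's z coordinate must be negative (in front of the camera).
    """
    x, y, z = point3d
    return (f * x / -z, f * y / -z)
```

Rasterization and hidden-surface removal then turn such projected geometry into the per-frame pixel data that is sent to the projector 212.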
  • the video data generated by the rendering unit 550 is output to the projector 212, whereby the video object 230 as a moving image is displayed on the display unit 211.
  • FIG. 6 is an explanatory diagram showing a state where the video object 230 rendered by the rendering unit 550 is displayed on the display unit 211 installed on the stage 210.
  • While an effect such as a shadow 221 is produced on the performer 220 on the stage 210 by the light from the illumination 213R, a corresponding effect such as a shadow 231 is also produced on the video object 230. By producing on the video object 230 an effect as if it were illuminated by the actual illuminations 213L and 213R (that is, as if the performer 220 and the video object 230 were illuminated by common illuminations 213L and 213R), the realism of the video object 230 can be enhanced.
  • the preset motion data storage unit 560 stores one or more types of preset motion data.
  • The preset motion data defines the motion of the virtual object 610 over a predetermined time. For example, data that defines, in time series, the position and posture of the virtual object 610 in the virtual space 600 and the shapes/patterns of the head 611 and hands 612 (hand gestures, facial expressions, hair shapes, etc.) is used.
  • The data switching unit 570 switches the video data generation unit 500 between the first state, in which video data of the video object 230 is generated based on the real-time motion data, and the second state, in which it is generated based on the preset motion data. In the first state, the data switching unit 570 supplies the real-time motion data from the motion capture system 310 to the virtual space setting unit 530; in the second state, it supplies the preset motion data stored in the preset motion data storage unit 560 to the virtual space setting unit 530. The processes executed by the virtual space setting unit 530, the color adjustment unit 540, and the rendering unit 550 in the first state are as described above.
  • In the second state, the placement of the virtual object 610 in the virtual space 600 by the virtual space setting unit 530 is performed based on the preset motion data from the preset motion data storage unit 560. Except for this point, the virtual space setting unit 530, the color adjustment unit 540, and the rendering unit 550 execute processing such as the placement of the virtual light sources 620L and 620R, color adjustment, and rendering in the same manner as in the first state, whereby video data of the video object 230 moving based on the preset motion data is output to the projector 212.
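The switching between the two states can be sketched as selecting one of two frame sources feeding the same downstream pipeline; the function and state names are illustrative assumptions:

```python
def motion_source(state, realtime_frames, preset_frames):
    """Yield motion-data frames from the source selected by `state`.

    state: "live" for the first state (real-time motion data) or
           "preset" for the second state (stored preset motion data).
    """
    frames = realtime_frames if state == "live" else preset_frames
    yield from frames
```

Because the virtual-space setup, color adjustment, and rendering stages consume frames the same way in both states, only the source selection changes when the switch occurs.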
  • FIG. 7 is an explanatory diagram showing a video object 230 that is rendered when switching is performed by the data switching unit 570, and arrows at the bottom of the figure indicate the passage of time.
  • The period from time T0 to time T1 is the first state, in which video objects 2301 and 2302 moving based on the real-time motion data are rendered. At time T1 the data switching unit 570 switches to the second state, and thereafter video objects 2303 to 2305 moving based on the preset motion data are rendered until time T3. At time T3 the data switching unit 570 switches back to the first state, and thereafter video objects 2306 and 2307 moving based on the real-time motion data are rendered.
  • Although the present invention has been described above based on an exemplary embodiment, the members, apparatuses, systems, functional configurations, parameters, and processing contents in the stage apparatus and stage facility of the above embodiment are described only as examples, and can be changed arbitrarily within the scope of the claims.
  • For example, in the above embodiment the illuminations 213L and 213R are controlled via the control signal S1 output from the illumination control means (lighting console 410), but a control signal need not be used for the control of the illuminations 213L and 213R. The illumination control means may control the illuminations 213L and 213R without using a control signal, such as by turning the power supply to the illuminations 213L and 213R on and off, or adjusting the amount of power supplied, through switch operation or volume adjustment. In that case as well, by notifying the video data generation means (video processing device 450, video data generation unit 500) that the control of the illuminations 213L and 213R has been performed, or of the contents of that control, the color of the video object can be adjusted in synchronization with the illumination control in the same manner as in the above embodiment.
  • DESCRIPTION OF SYMBOLS 10 ... Stage apparatus 100 ... Stage facility 210 ... Stage 211 ... Display unit 212 ... Projector 213L, 213R ... Illumination 214 ... Speaker 217 ... Photographing means 230 ... Video object 310 ... Motion capture system 400 ... Operation room 410 ... Lighting console 420 ... Video console 450 ... Video processing device 500 ... Video data generation unit 510 ... Object storage unit 520 ... Texture storage unit 530 ... Virtual space setting unit 540 ... Color adjustment unit 550 ... Rendering unit 560 ... Preset motion data storage unit 570 ... Data switching unit 600 ... Virtual space 610 ... Virtual object 620L, 620R ... Virtual light source


Abstract

In order to increase the realism of an image object (230) in a stage device that displays the image object (230) on a display unit (211) installed on a stage (210), this stage device (10) is equipped with: a stage (210); lights (213L, 213R) capable of shining light on the stage (210); a lighting control means (410) that controls the lights (213L, 213R); a display unit (211) that is installed on the stage (210) and displays an image object (230); and image data generation means (450, 500) that generate image data for the purpose of displaying the image object (230) on the display unit (211). In addition, the color of the image object (230) is adjusted in synchronization with the control of the lights (213L, 213R) by the lighting control means (410).

Description

Stage device and stage facility
The present invention relates to a stage apparatus and a stage facility used for performances, presentations, and the like, and more particularly to a stage apparatus and a stage facility in which a display unit for displaying video is installed on the stage.
Patent Document 1 describes installing a display unit (screen) for displaying video on a stage where a performance such as a play takes place, and displaying on the display unit video that is interactive with the performers' performance.
In recent years, stage apparatuses and stage facilities are also known in which real performers (dancers, musicians, etc.) perform on a stage while video of a character such as an animated character (a video object), generated by rendering a virtual object, is displayed on a display unit installed on the stage, so that the video object appears to be performing together with the real performers.
Patent Document 1: Published Japanese translation of PCT application No. 2003-533235
When a video object is displayed on a display unit installed on a stage, and especially when the staging makes the video object appear to be performing together with the performers, it is desirable for the video object to have a sense of reality, as if it actually existed in the space on the stage. To that end, various efforts have been made to improve image quality, smooth the characters' movements, and so on, but a still greater sense of reality and higher video quality continue to be demanded.
The present invention has been made in view of the above situation, and its object is to improve the sense of reality and/or the video quality of a video object in a stage apparatus and stage facility that display the video object on a display unit installed on a stage.
The present invention is a stage apparatus or stage facility comprising: a stage; illumination capable of irradiating the stage with light; illumination control means for controlling the illumination; a display unit installed on the stage for displaying a video object; and video data generation means for generating video data for displaying the video object on the display unit, wherein the video data generation means adjusts the color of the video object in synchronization with the control of the illumination by the illumination control means.
"Stage" in this specification means a place or space where performances or presentations by real people or objects are given. The type and content of the performance or presentation are arbitrary; examples include dance, musical performance, singing, and plays by people, and exhibitions of goods.
"Illumination" in this specification means a member, piece of equipment, device, or the like that can irradiate the stage with light. Halogen lamps, xenon lamps, LED lights, and the like may be used as the illumination, and the number and arrangement of illuminations are arbitrary. Depending on its role, function, and positional relationship with the stage, illumination comes in types such as spotlights, ceiling lights, footlights, front lights, and moving lights.
"Illumination control means" in this specification is a means for controlling the illumination (the irradiation of the stage with light). Examples of controlling the illumination include turning it on and off, changing the light intensity, changing the illumination color, and changing the position of the illumination or the direction (directivity) of the light. The control may be performed manually or automatically. In the case of manual control, the illumination control means may be constituted by a console (operation panel) operated by an operator; the console may have operation members such as buttons, switches, and volume knobs with which the operator inputs operations. When a computer automatically controls the illumination according to a preset program, that computer may constitute the illumination control means.
"Display unit" in this specification means a portion, member, device, system, or the like on which video is displayed. The display unit may be a CRT, an LCD, a plasma display, an LED display (or the screen of any of these), a screen onto which a projector projects video, or the like.
"Virtual object" in this specification means shape data in a virtual space. A virtual object may be defined by a plurality of polygons, its shape may be three-dimensional, and it may have the shape of a person, an animal, a fictional character, or some other object.
"Video object" in this specification refers to an object, such as a person or a thing, displayed as video. A video object may be generated by rendering a virtual object.
"Color" in this specification includes brightness.
The "stage facility" in this specification may be an indoor facility having outer walls, a ceiling, and so on, or an outdoor facility installed in a space such as a stadium.
In the present invention, the color of the video object is adjusted in synchronization with the control of the illumination by the illumination control means. Effects corresponding to the lighting effects, such as shading, that the illumination produces on real people or objects on the stage can therefore be produced on the video object as well, which makes it possible to heighten the video object's sense of reality and/or to improve its quality as an image.
The adjustment of the color of the video object in the present invention can be performed in ways such as the following:
(1) in synchronization with control that irradiates the stage with illumination light from the right (which produces effects such as making the right side of a person or object on the stage bright and the left side dark), brightening the right side of the video object and darkening its left side;
(2) in synchronization with control that irradiates the stage with strong illumination light (which produces effects such as making people or objects on the stage look bright), brightening the color (brightness) of the video object;
(3) in synchronization with control that irradiates the stage with colored light such as red or blue (which produces effects such as turning the lit portions of people or objects on the stage red or blue), changing the corresponding portions of the video object to a color corresponding to the illumination light.
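The three modes above can be captured in a small shading sketch. This is only an illustration and not part of the patent disclosure: the ambient-plus-Lambert shading model, the function names, and the 0.0 to 1.0 value ranges are all assumptions.

```python
# Minimal sketch of the three color-adjustment modes described above.
# The shading model (ambient floor + Lambert-style diffuse term) and all
# names are assumptions for illustration; the patent specifies no formulas.

AMBIENT = 0.2  # assumed floor so the unlit side is merely dark, not black


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def adjust_color(base_rgb, light_rgb, intensity, to_light, normal):
    """Shade one point of the video object.

    base_rgb / light_rgb : (r, g, b) components in 0.0-1.0
    intensity            : scalar 0.0-1.0; scales overall brightness (mode 2)
    to_light, normal     : unit vectors; their dot product brightens the side
                           facing the light and darkens the far side (mode 1),
                           and multiplying by light_rgb tints the lit part
                           with the illumination color (mode 3)
    """
    facing = AMBIENT + (1.0 - AMBIENT) * max(0.0, dot(to_light, normal))
    return tuple(min(1.0, b * l * intensity * facing)
                 for b, l in zip(base_rgb, light_rgb))
```

For example, a white surface lit head-on by a full-intensity red light comes out red, while a surface facing away from the light keeps only the ambient fraction of its base color.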
In the present invention, it is preferable that the video data generation means has virtual space setting means for arranging a virtual object and a virtual light source in a virtual space, color adjustment means for adjusting the color of the virtual object so as to reflect the influence of the virtual light source, and rendering means for generating the video data by rendering the virtual object. This makes it easy to adjust the color of the video object in real time in synchronization with the control of the illumination by the illumination control means.
In the present invention, it is preferable that the virtual space setting means arranges the virtual object and the virtual light source in the virtual space with a positional relationship corresponding to the positional relationship between the display unit and the illumination, and/or that the number, intensity, color, and/or direction of the virtual light sources correspond to the number, intensity, color, and/or direction of the illuminations. This makes it possible to adjust the color of the video object faithfully to the control of the illumination by the illumination control means.
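One way to read this correspondence is that each real illumination is mirrored into the virtual space one for one. The sketch below illustrates that reading; the `Light` class, the coordinate-mapping callback, and all field names are assumptions for illustration, not part of the disclosure.

```python
# Sketch of mirroring the real stage lights into the virtual space so
# that the number, position, color, intensity, and direction of the
# virtual light sources correspond to those of the real illuminations.
from dataclasses import dataclass


@dataclass
class Light:
    position: tuple    # (x, y, z)
    color: tuple       # (r, g, b), components in 0.0-1.0
    intensity: float   # 0.0-1.0
    direction: tuple   # unit vector


def mirror_lights(real_lights, stage_to_virtual):
    """Create one virtual light source per real illumination.

    stage_to_virtual is a function mapping real stage coordinates into
    virtual-space coordinates, so the positional relationship between the
    display unit and the lights is preserved around the virtual object.
    """
    return [Light(position=stage_to_virtual(light.position),
                  color=light.color,
                  intensity=light.intensity,
                  direction=light.direction)
            for light in real_lights]
```

With, say, two real lights corresponding to 213L and 213R, this yields two virtual light sources (compare 620L and 620R) with matching colors and intensities.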
In the present invention, it is preferable to further provide main motion data generation means for generating main motion data by detecting the movements of a measured person, with the video data generation means generating, based on the main motion data, video data in which the video object moves simultaneously with the movements of the measured person. This makes it possible to move the video object in a manner appropriate to the situation at hand (for example, the progress of the show or the state of the audience).
In the present invention, it is preferable to further provide a controller that the measured person can operate while performing the movements, and sub motion data generation means for generating sub motion data according to operations on the controller, with the video data generation means changing the shape and/or pattern of all or part of the video object based on the sub motion data. In this case, the measured person can input sub motion data for changing the shape and/or pattern of all or part of the video object by operating the controller while performing the movements used to generate the main motion data. Therefore, when, for example, the video object has the shape of a person and the sub motion data changes the shape of the video object's hands (hand gestures and the like) or the shape/pattern of its head (facial expression, hair shape, and the like), it becomes easy to change the video object's hand gestures, facial expression, and so on at the appropriate timing to match the video object's movements.
In the present invention, it is preferable to further provide preset motion data storage means for storing preset motion data that defines movements of the video object, and switching means for switching the video data generation means between a first state and a second state, with the video data generation means generating, in the first state, video data in which the video object moves based on the main motion data and, in the second state, video data in which the video object moves based on the preset motion data.
This makes it possible to switch between a state in which video data in which the video object moves based on the main motion data is generated (the first state) and a state in which video data in which the video object moves based on the preset motion data is generated (the second state). Therefore, even when the main motion data can no longer be generated properly, for example because of an equipment failure or because the measured person has moved to a place where their movements cannot be detected, switching from the first state to the second state can prevent problems such as the video object moving unnaturally or unintentionally. Alternatively, by switching to the second state according to the scene, the video object can be made to perform extreme movements or the like that could not be realized with main motion data based on the measured person's movements.
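A minimal sketch of such a switching means (compare the data switching unit 570 described later) might look as follows. The class and state names, the per-frame structure of the motion data, and the automatic fallback to preset data when a real-time frame is missing are all assumptions for illustration.

```python
# Sketch of first-state / second-state switching between real-time
# motion data and preset motion data. Names and structure are assumed.

REALTIME = "first state (real-time motion data)"
PRESET = "second state (preset motion data)"


class DataSwitcher:
    def __init__(self, preset_frames):
        self.state = REALTIME
        self.preset_frames = preset_frames  # preset motion data, one entry per frame
        self.preset_index = 0

    def switch(self, state):
        """Operator-driven switch between the first and second states."""
        self.state = state

    def motion_for_frame(self, realtime_frame):
        """Return the motion data that drives the video object this frame."""
        if self.state == REALTIME and realtime_frame is not None:
            return realtime_frame
        # Second state (or, as an assumed safeguard, missing capture data):
        # play back the preset motion, looping over the stored frames.
        frame = self.preset_frames[self.preset_index % len(self.preset_frames)]
        self.preset_index += 1
        return frame
```

In use, the renderer would call `motion_for_frame` once per frame, so a switch takes effect from the next rendered frame.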
FIG. 1 is an explanatory diagram showing the schematic arrangement of a stage facility 100 having a stage apparatus 10 according to one embodiment of the present invention.
FIG. 2 is an explanatory diagram showing the main part of the stage apparatus 10 from the audience area 240 side.
FIG. 3 is an explanatory diagram showing the configuration of an exemplary motion capture system 310.
FIG. 4 is an explanatory diagram showing the functional configuration of the video processing device 450.
FIG. 5 is an explanatory diagram showing an exemplary virtual space 600 set by the virtual space setting unit 530.
FIG. 6 is an explanatory diagram showing a video object 230 displayed on the display unit 211 installed on the stage 210.
FIG. 7 is an explanatory diagram showing the video object 230 displayed on the display unit 211 when switching has been performed by the data switching unit 570.
Hereinafter, a stage apparatus 10 and a stage facility 100 according to a preferred embodiment will be described with reference to the drawings.
FIG. 1 is an explanatory diagram showing the schematic arrangement of a stage facility 100 having a stage apparatus 10 according to one embodiment of the present invention. FIG. 2 is an explanatory diagram showing the main part of the stage apparatus 10 from the audience area 240 side.
As illustrated, the stage facility 100 has a venue 200, a motion capture room 300, and an operation room 400.
The venue 200 has a stage 210 and an audience area 240.
The stage 210 is a place where performers 220 can give performances, presentations of goods, and the like. The shape of the stage 210 is arbitrary; it may be flat, curved, or three-dimensional.
A display unit 211 is installed on the stage 210. The display unit 211 is any portion, member, device, or system capable of displaying video such as the video object 230; in this embodiment, a screen that displays video from a projector 212 is used as the display unit 211. The shape of the display unit 211 is arbitrary; it may be flat, curved, or three-dimensional.
The projector 212 is placed at any position from which it can project video onto the display unit 211. When the projector 212 is placed on the back side of the display unit 211 as shown in the figure, the projector's image can be prevented from being blocked by performers 220 performing on the front side of the display unit 211. In this case, a rear-projection screen made of a transparent sheet-like member or the like can be used as the display unit 211.
In the figure, a single display unit 211 is installed at roughly the center of the stage 210, but the arrangement and number of display units 211 on the stage 210 are arbitrary. For example, by arranging a plurality of display units so as to surround the stage 210, such as additionally placing display units consisting of three-dimensional objects onto which video is projection-mapped on both sides of the illustrated display unit 211, and/or a display unit such as an LED screen behind the display unit 211, it is possible to stage a production in which the performance or presentation on the stage 210 and the video on the display units are integrated.
The stage apparatus 10 and stage facility 100 have illuminations 213L and 213R for irradiating the stage 210 with light. In the figure, one illumination 213L, 213R is shown near the ceiling on each of the left and right sides of the stage 210, but the illuminations can be placed at any other positions from which they can irradiate the stage 210 with light, and there may be a single illumination or three or more.
In a preferred embodiment, the stage apparatus 10 and stage facility 100 may further have one or more speakers 214 that output sound such as music, and special-effects equipment such as a smoke machine 215 and a laser device 216 for SFX-style staging.
The audience area 240 is a place for the audience to watch the performance given on the stage 210. The audience area 240 may have one or more seats 241 in which the audience can sit.
The stage apparatus 10 and stage facility 100 may further have imaging means 217 for capturing the audience area 240. The imaging means 217 is placed at an appropriate position, for example near the ceiling at the center of the stage 210, so that the whole of the audience area 240, or a specific part of it, can be captured properly.
In the motion capture room 300, a motion capture (MC) system 310 is placed for generating main motion data by detecting (capturing) movements, such as dancing, performed by a measured person (motion capture actor) 320. The movements of the measured person 320 can be detected by any method, such as optical, mechanical, or magnetic.
FIG. 3 is an explanatory diagram showing the configuration of an exemplary motion capture system 310.
As illustrated, the motion capture system 310 has a plurality of markers 311 attached to appropriate parts of the measured person 320, such as the limbs, head, and waist, and one or more sensors 313 capable of detecting the markers 311, arranged around (for example, above, below, in front of, behind, and to the left and right of) a predetermined motion area 312 (the area in which the measured person 320 moves).
The sensors 313 are connected to a processing device 314 wirelessly or by wire, and the processing device 314 generates the main motion data by calculating the three-dimensional position of each marker 311 based on the data from the sensors 313.
The main motion data is data for determining the position, posture, and so on of the virtual object 610 (see FIG. 5) placed in the virtual space 600 in accordance with the movements of the measured person 320. In this embodiment, the above processing generates data on the position of a reference point of the virtual object (for example, the origin of an object coordinate system fixed to the virtual object 610) and on the positions, relative to that reference point, of the parts constituting the virtual object 610 (the head 611, hands 612, feet 613, and so on).
The processing device 314 can be constituted by a computer having a CPU and a storage device, and the sensors 313 can be constituted by imaging means such as cameras. The number and arrangement of the sensors 313 are set appropriately in consideration of the accuracy of position detection of the markers 311 and other factors.
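The reference-point-plus-relative-parts layout of the main motion data described above can be sketched as follows. The dictionary format, the choice of the waist marker as the reference point, and the function name are assumptions for illustration; the patent does not fix a data format.

```python
# Sketch of converting absolute marker positions into the main-motion-data
# form described above: one reference point (e.g. the object coordinate
# system's origin) plus the parts' positions relative to it.

def to_relative(marker_positions, reference_key="waist"):
    """marker_positions: dict mapping marker name -> (x, y, z).

    Returns the reference point and, for every other marker, its offset
    from that reference point, which can then pose the virtual object 610.
    """
    ref = marker_positions[reference_key]
    relative = {name: tuple(p - r for p, r in zip(pos, ref))
                for name, pos in marker_positions.items()
                if name != reference_key}
    return {"reference": ref, "parts": relative}
```

Moving the whole body then changes only `reference`, while a gesture changes only the affected entries in `parts`.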
The motion capture system 310 may have controllers 315L and 315R operated by the measured person 320 as means for generating sub motion data. The specifications, number, and so on of the controllers 315L and 315R are arbitrary, but in this embodiment left and right handy controllers 315L and 315R that can be held and operated with one hand are used, and the measured person 320 can input sub motion data from the handy controllers 315L and 315R while dancing or performing other movements. The handy controllers 315L and 315R have operation members such as buttons and joysticks, and are preferably connected to the processing device 314 wirelessly.
The sub motion data is data for determining the shape and/or pattern of the whole or part of the virtual object 610. In this embodiment, data input by operating the buttons of the handy controllers 315L and 315R is used as hand-gesture data, and data input by operating the joysticks is used as facial-expression and hair-shape data.
In a preferred embodiment, a display device 330 that displays live video of the audience area 240 captured by the imaging means 217 is placed in front of the motion area 312. In this case, the measured person 320 can move, or operate the handy controllers 315L and 315R, while watching the video on the display device 330. This allows the measured person 320 to input main motion data and/or sub motion data according to the real-time situation of the audience area 240, making it possible to generate video data of a video object 230 that behaves according to the situation of the audience area 240, for example a video object 230 that banters with a specific audience member.
In the following, the main motion data and sub motion data generated by the motion capture system 310 are collectively referred to as "real-time motion data."
In the operation room 400, equipment with which an operator 460 controls the various devices installed in the stage facility 100 is placed, such as a lighting console 410, a video console 420, a sound console 430, an SFX console 440, and a video processing device 450.
The lighting console 410 is for controlling the illuminations 213L and 213R. By performing predetermined operations on the lighting console 410, the operator 460 can turn the illuminations 213L and 213R on and off individually, change their intensity (brightness) and/or color, and so on. In this embodiment, a control signal S1 based on the operator 460's operations is transmitted from the lighting console 410 to the illuminations 213L and 213R, and the illuminations 213L and 213R are controlled based on this control signal S1. When the illuminations 213L and 213R are controlled at the lighting console 410, a notification signal S2 notifying that control has been performed, or the contents of the control, is transmitted to the video processing device 450. The notification signal S2 may be the same signal as the control signal S1 or a separate signal, and a signal branched from the control signal S1 can also be used as the notification signal S2.
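The S1/S2 flow described above, with one control signal going to the lights and the same content forwarded to the video side, can be sketched as follows. The class names and the keyword-argument interface are assumptions; a real lighting console would typically speak a wire protocol such as DMX512 rather than making in-process calls.

```python
# Sketch of the S1 / S2 signal flow: the console applies the operator's
# settings to the lights (control signal S1) and notifies the video
# processing device of the same contents (notification signal S2).

class LightingConsole:
    def __init__(self, lights, video_processor):
        self.lights = lights                  # dict: light id -> settings dict
        self.video_processor = video_processor

    def apply(self, light_id, **settings):
        """Operator action, e.g. apply('213L', intensity=0.8,
        color=(1.0, 0.0, 0.0))."""
        self.lights[light_id].update(settings)                  # S1 to the light
        self.video_processor.notify(light_id, dict(settings))   # S2 to video side


class VideoProcessorStub:
    """Stand-in for the video processing device 450's S2 receiver."""
    def __init__(self):
        self.notifications = []

    def notify(self, light_id, settings):
        self.notifications.append((light_id, settings))
```

Because S2 carries the same contents as S1, the video side can adjust the video object's color from the notification alone, without polling the lights.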
The video console 420 is for performing control relating to the video data generated by the video processing device 450. In this embodiment, by performing predetermined operations on the video console 420, the operator 460 can switch between a first state, in which video data in which the video object 230 moves based on the real-time motion data is generated, and a second state, in which video data in which the video object 230 moves based on the preset motion data is generated. When such control is performed at the video console 420, a notification signal S3 notifying that control has been performed, or the contents of the control, is transmitted to the video processing device 450. The preset motion data will be described later.
The sound console 430 controls the sound output from the speakers 214; sound-source data stored on an appropriate medium is played back from the speakers 214 at the timing, volume, and so on designated at the sound console 430.
SFX-related staging equipment such as the smoke machine 215 and the laser device 216 is controlled by operations on the SFX console 440.
The video processing device 450 is connected to the lighting console 410, the video console 420, and the motion capture system 310, and performs processing to generate the video data to be displayed on the display unit 211 based on the notification signals S2 and S3, data, and so on received from them. The video processing device 450 can be constituted by a computer having a CPU and a storage device storing the various programs and data needed to generate the video data. The storage device can be constituted by known storage devices such as ROM, RAM, a hard disk, and flash memory, alone or in combination.
The stage apparatus 10 of this embodiment is composed of the stage 210, the illuminations 213L and 213R, the lighting console 410, the video processing device 450, and so on.
FIG. 4 is an explanatory diagram showing the functional configuration of the video processing device 450.
As illustrated, the video processing device 450 has a video data generation unit 500 consisting of an object storage unit 510, a texture storage unit 520, a virtual space setting unit 530, a color adjustment unit 540, a rendering unit 550, a preset motion data storage unit 560, a data switching unit 570, and so on.
 The object storage unit 510 stores object data defining one or more virtual objects 610 to be placed in the virtual space; in the present embodiment, object data for a virtual object 610 shaped like a person, character, or the like is stored. The virtual object 610 may be defined in any format; for example, it can be defined by the position coordinates of the polygons constituting the virtual object 610 in a local coordinate system fixed to the virtual object 610. The posture of the virtual object 610 can be changed by changing the arrangement, in the local coordinate system, of the parts constituting the virtual object 610 (the head 611, the hands 612, the feet 613, and so on). To allow the hand gestures, facial expression, hairstyle, and the like of the virtual object 610 to be varied, the object storage unit 510 stores plural shapes for the head 611 and the hands 612.
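The specification deliberately leaves the object format open. As one concrete reading of the local-coordinate definition above, the following sketch (all class and field names are hypothetical, and only translation is applied, omitting rotations for brevity) represents a virtual object 610 as a reference point plus posable parts:

```python
from dataclasses import dataclass

@dataclass
class Part:
    """One posable part of the object (head 611, hand 612, foot 613, ...)."""
    name: str
    vertices: list    # polygon vertices in the part's own local frame
    offset: tuple     # placement relative to the object's reference point

@dataclass
class VirtualObject:
    """Virtual object 610: a reference point in virtual space 600 plus parts."""
    origin: tuple     # reference-point position in virtual space
    parts: dict

    def world_vertices(self, part_name):
        """Map a part's local vertices into virtual-space coordinates."""
        part = self.parts[part_name]
        ox, oy, oz = self.origin
        px, py, pz = part.offset
        return [(x + px + ox, y + py + oy, z + pz + oz)
                for x, y, z in part.vertices]

head = Part("head", vertices=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
            offset=(0.0, 2.0, 0.0))
obj = VirtualObject(origin=(3.0, 0.0, 5.0), parts={"head": head})
print(obj.world_vertices("head"))  # [(3.0, 2.0, 5.0), (4.0, 2.0, 5.0)]
```

Changing a part's `offset` in the local frame changes the posture, exactly as the paragraph above describes.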
 The texture storage unit 520 stores textures to be mapped onto the virtual object 610. A texture is image data, drawn in a format such as a bitmap, that specifies the pattern, color, transparency, and so on of the virtual object 610; mapping a texture onto the surface of the virtual object 610 gives the virtual object 610 details such as a facial expression or clothing. To allow the facial expression and the like of the virtual object 610 to be varied, the texture storage unit 520 stores plural textures for the head 611 and other parts.
 The virtual space setting unit 530 executes processing for placing the virtual object 610, the virtual light sources 620L and 620R, and a virtual viewpoint in the virtual space 600, in the manner illustrated in FIG. 5.
 Here, the virtual object 610 is placed in the virtual space at the position and posture determined by the main motion data streamed (in real time) from the motion capture system 310. That is, the position coordinates of the reference point of the virtual object 610 in the virtual space 600, and the arrangement of parts such as the head 611, the hands 612, and the feet 613 relative to that reference point, are determined by the real-time main motion data. The movements of the measured person 320 are thereby reflected in the movements of the virtual object 610 in real time, so that the virtual object 610 moves simultaneously with the measured person 320, in the same or a corresponding manner.
 In addition, object data and texture data whose shape/pattern corresponds to the sub motion data input from the handy controllers 315L and 315R are applied to the head 611 and the hands 612 of the virtual object 610. As shown in frame 621, the facial expression and/or hairstyle of the head 611 can thus be varied, and as shown in frame 622, the shape of the hands 612 (hand gestures) can be varied.
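The sub motion data thus acts as a selector over the stored variants. A minimal sketch of that selection, with entirely hypothetical variant tables and controller codes (the specification does not fix the data format sent by the handy controllers 315L and 315R):

```python
# Hypothetical variant tables: candidate shapes for the head 611 and the
# hands 612 held in the object storage unit 510 / texture storage unit 520.
HEAD_VARIANTS = {0: "neutral", 1: "smile", 2: "wink"}
HAND_VARIANTS = {0: "open", 1: "peace_sign", 2: "fist"}

def apply_sub_motion(obj, sub):
    """Swap in the head/hand variants named by one packet of sub motion
    data from the handy controllers (codes here are illustrative)."""
    obj["head_shape"] = HEAD_VARIANTS[sub["head"]]
    obj["hand_shape"] = HAND_VARIANTS[sub["hand"]]

obj = {}
apply_sub_motion(obj, {"head": 1, "hand": 2})
print(obj)  # {'head_shape': 'smile', 'hand_shape': 'fist'}
```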
 The virtual viewpoint is set at an appropriate position and in an appropriate direction in the virtual space 600. The position and direction of the virtual viewpoint may be fixed or movable; here, the description assumes that a virtual viewpoint facing into the page is set at a predetermined position on the near side of the page in FIG. 5.
 The placement and/or number of the virtual light sources 620L and 620R can be determined according to the control of the lights 213L and 213R performed at the lighting console 410. For example, if the lighting console 410 is operated to turn on the light 213R on the right side of the stage 210 as seen from the audience area 240 and to turn off the light 213L on the left side, the virtual light source 620R is placed only on the right side of the virtual object 610 as seen from the virtual viewpoint; if both the left and right lights 213L and 213R are turned on, the virtual light sources 620L and 620R are placed on both the left and right sides of the virtual object 610; and so on.
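The side-for-side mapping just described can be sketched as follows. The field names, stage geometry, and coordinate values are all hypothetical; only the rule (one virtual light on the same side as each lit real light) comes from the description above:

```python
def place_virtual_lights(real_lights):
    """Derive virtual light sources 620L/620R from the state of the real
    lights 213L/213R: a virtual light is placed on the same side of the
    virtual object 610, as seen from the virtual viewpoint, as each real
    light that is currently on."""
    virtual = []
    for side in ("left", "right"):
        state = real_lights[side]
        if state["on"]:
            x = -3.0 if side == "left" else 3.0  # mirror the stage layout
            virtual.append({"position": (x, 4.0, 2.0),
                            "color": state.get("color", (1.0, 1.0, 1.0)),
                            "intensity": state.get("intensity", 1.0)})
    return virtual

# Left light 213L off, right light 213R on -> one virtual light on the right.
lights = place_virtual_lights({"left": {"on": False},
                               "right": {"on": True, "color": (1.0, 0.9, 0.8)}})
print(len(lights), lights[0]["position"])
```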
 If operation of the lighting console 410 allows the color, intensity, directivity, position, and so on of the lights 213L and 213R to be controlled, then the corresponding parameters (color, intensity, directivity, position, and so on) of the virtual light sources 620L and 620R can be controlled in accordance with that control.
 The placement, movement, parameter control, and so on of the virtual light sources 620L and 620R described above are performed in synchronization with the control of the lights 213L and 213R at the lighting console 410. Synchronization is performed automatically on the basis of the notification signal S2 output from the lighting console 410.
 The color adjustment unit 540 adjusts the color of the virtual object 610 so as to reflect the influence of the virtual light sources 620L and 620R (the virtual light rays from the virtual light sources 620L and 620R). Specifically, processing is performed such as producing shading 614 on the virtual object 610 due to the virtual light sources 620L and 620R, or changing the color of the virtual object 610 in accordance with the colors of the virtual light sources 620L and 620R. Known techniques such as Phong shading or Gouraud shading can be used for this color adjustment.
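As a minimal illustration of the lighting term that both Phong and Gouraud shading rely on, here is a Lambertian diffuse calculation; this is a sketch of the general technique, not the patented method, and the function name and values are illustrative only:

```python
def diffuse_color(base_rgb, normal, light_dir, light_rgb, intensity=1.0):
    """Lambertian diffuse term: the surface colour is scaled by the light
    colour, the light intensity, and the cosine of the angle between the
    surface normal and the light direction (both unit vectors). Faces
    turned away from the light receive zero contribution, which is what
    produces the shading 614 on the unlit side of the virtual object."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(min(1.0, b * c * intensity * n_dot_l)
                 for b, c in zip(base_rgb, light_rgb))

# A surface facing straight up, lit by a white light from directly above:
print(diffuse_color((0.8, 0.6, 0.4), (0, 1, 0), (0, 1, 0), (1.0, 1.0, 1.0)))
# The same surface facing away from the light stays dark:
print(diffuse_color((0.8, 0.6, 0.4), (0, -1, 0), (0, 1, 0), (1.0, 1.0, 1.0)))
```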
 The rendering unit 550 generates the video data of the video object 230 to be displayed on the display unit 211 by rendering, at every predetermined frame time, an image of the virtual object 610, whose color has been adjusted by the color adjustment unit 540, as seen from the virtual viewpoint. Rendering can be performed using known techniques such as projection transformation, rasterization, and hidden surface removal. The video data generated by the rendering unit 550 is output to the projector 212, whereby the video object 230 is displayed as a moving image on the display unit 211.
 FIG. 6 is an explanatory diagram showing the video object 230 rendered by the rendering unit 550 being displayed on the display unit 211 installed on the stage 210.
 As in the illustrated example, when the lighting console 410 is operated to turn off the left light 213L and turn on only the right light 213R, effects such as the shadow 221 due to the light from the light 213R appear on the performer 220 on the stage 210. In synchronization with this control, the color of the virtual object 610 is adjusted to reflect the influence of the virtual light source 620R placed on the right side of the virtual object 610, so that corresponding effects such as the shadow 231 also appear on the video object 230. Producing on the video object 230 the effect of being illuminated by the real lights 213L and 213R (the effect of the performer 220 and the video object 230 being illuminated by the same lights 213L and 213R) in this way enhances the sense that the video object 230 is really present.
 Returning to FIG. 4, the preset motion data storage unit 560 stores one or more types of preset motion data. Preset motion data specifies the movement of the virtual object 610 over a predetermined time; in the present embodiment, data is used that specifies, in time series, the position and posture of the virtual object 610 in the virtual space 600 and the shapes/patterns of the head 611 and the hands 612 (hand gestures, facial expression, hairstyle, and so on).
 In response to operations performed at the video console 420, the data switching unit 570 switches the video data generation unit 500 between a first state, in which the video data of the video object 230 is generated on the basis of the real-time motion data, and a second state, in which the video data of the video object 230 is generated on the basis of the preset motion data. In the first state, the data switching unit 570 supplies the real-time motion data from the motion capture system 310 to the virtual space setting unit 530; in the second state, it supplies the preset motion data from the preset motion data storage unit 560 to the virtual space setting unit 530. The processing executed by the virtual space setting unit 530, the color adjustment unit 540, and the rendering unit 550 in the first state is as described above. In the second state, the placement of the virtual object 610 in the virtual space 600 by the virtual space setting unit 530 is performed on the basis of the preset motion data in the preset motion data storage unit 560; except for this point, the virtual space setting unit 530, the color adjustment unit 540, and the rendering unit 550 execute the placement of the virtual light sources 620L and 620R, the color adjustment, the rendering, and so on in the same manner as in the first state, whereby video data of a video object 230 that moves on the basis of the preset motion data is output to the projector 212.
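The two-state selection can be sketched as follows. The class and method names are hypothetical; only the behaviour (first state draws frames from the capture stream, second state from preset data, with everything downstream unchanged) comes from the description above:

```python
class DataSwitcher:
    """Sketch of the data switching unit 570: selects which motion data
    reaches the virtual space setting unit 530 on each frame."""
    FIRST, SECOND = "first", "second"

    def __init__(self, realtime_source, preset_frames):
        self.state = self.FIRST
        self.realtime_source = realtime_source   # callable returning a frame
        self.preset_frames = iter(preset_frames)

    def switch(self, state):
        self.state = state

    def next_frame(self):
        """The frame handed downstream for this render tick."""
        if self.state == self.FIRST:
            return self.realtime_source()
        return next(self.preset_frames)

sw = DataSwitcher(lambda: "captured-frame", ["preset-1", "preset-2"])
print(sw.next_frame())          # captured-frame (first state)
sw.switch(DataSwitcher.SECOND)
print(sw.next_frame())          # preset-1 (second state)
sw.switch(DataSwitcher.FIRST)
print(sw.next_frame())          # captured-frame again
```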
 FIG. 7 is an explanatory diagram showing the video objects 230 rendered when switching is performed by the data switching unit 570; the arrow at the bottom of the figure represents the passage of time.
 In this example, the period from time T0 to time T1 is in the first state, and the video objects 2301 and 2302, which move on the basis of the real-time motion data, are rendered. When the data switching unit 570 switches to the second state at time T1, the video objects 2303 to 2305, which move on the basis of the preset motion data, are rendered until time T3. When the data switching unit 570 switches back to the first state at time T3, the video objects 2306 and 2307, which move on the basis of the real-time motion data, are rendered thereafter.
 To switch seamlessly between the first state and the second state, that is, to switch without causing any discontinuity in the movement of the video object 230, data interpolation can be performed between the real-time motion data and the preset motion data before and after the switch. For example, when switching from the first state to the second state at time T1, data interpolation is performed between the real-time motion data used to render the video object 2302 at the switching time T1 and the preset motion data used to render the video object 2303 immediately after the switch; during the period from time T1 until time T2, when the first video object 2303 based on the preset motion data is rendered, a video object 230 moving on the basis of the motion data generated by this interpolation can be rendered.
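One simple way to realise this interpolation over the period T1 to T2 is a linear blend between the last real-time pose and the first preset pose; the specification does not fix a particular interpolation method, and the pose format here is hypothetical:

```python
def blend_pose(pose_a, pose_b, t):
    """Linear interpolation between two poses, each a dict mapping a part
    name to an (x, y, z) placement; t runs from 0 (pose_a, at time T1)
    to 1 (pose_b, at time T2)."""
    return {name: tuple(a + (b - a) * t
                        for a, b in zip(pose_a[name], pose_b[name]))
            for name in pose_a}

last_realtime = {"head": (0.0, 2.0, 0.0)}   # pose at the switching time T1
first_preset  = {"head": (1.0, 3.0, 0.0)}   # first preset pose, shown at T2
# Halfway through the transition period:
print(blend_pose(last_realtime, first_preset, 0.5))  # {'head': (0.5, 2.5, 0.0)}
```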
 The present invention has been described above on the basis of an exemplary embodiment; however, the stage apparatus and stage facility of the above embodiment, the members, devices, systems, functional configurations, and parameters constituting them, and the content of the processing executed in the stage apparatus and stage facility have been described merely as examples, and may be modified arbitrarily within the scope of the claims.
 For example, in the above embodiment the control of the lights 213L and 213R and the switching between the first state and the second state are performed on the basis of operations at the lighting console 410 and the video console 420; however, it is also possible to control the lights 213L and 213R and/or switch between the first state and the second state automatically, according to a preprogrammed sequence, using means constituted by a computer or the like.
 Also, in the above embodiment the lights 213L and 213R are controlled via the control signal S1 output from the lighting control means (the lighting console 410); however, whether or not a control signal is used to control the lights 213L and 213R is arbitrary. For example, the lighting control means may control the lights 213L and 213R without using a control signal, for example by switching the power supply to the lights 213L and 213R on and off, or adjusting the amount of power supplied, through a switch operation, volume adjustment, or the like. In this case as well, by notifying the video generation means (the video processing device 450, the video data generation unit 500) that the lights 213L and 213R have been controlled, or of the content of that control, the color of the video object can be adjusted in synchronization with the lighting control in the same manner as in the above embodiment.
10 ... Stage apparatus
100 ... Stage facility
210 ... Stage
211 ... Display unit
212 ... Projector
213L, 213R ... Lights
214 ... Speaker
217 ... Imaging means
230 ... Video object
310 ... Motion capture system
400 ... Operation room
410 ... Lighting console
420 ... Video console
450 ... Video processing device
500 ... Video data generation unit
510 ... Object storage unit
520 ... Texture storage unit
530 ... Virtual space setting unit
540 ... Color adjustment unit
550 ... Rendering unit
560 ... Preset motion data storage unit
570 ... Data switching unit
600 ... Virtual space
610 ... Virtual object
620L, 620R ... Virtual light sources

Claims (8)

  1.  A stage apparatus comprising:
      a stage;
      a light capable of irradiating the stage with light;
      lighting control means for controlling the light;
      a display unit, installed on the stage, for displaying a video object; and
      video data generation means for generating video data for displaying the video object on the display unit,
      wherein the video data generation means adjusts the color of the video object in synchronization with the control of the light by the lighting control means.
  2.  The stage apparatus according to claim 1, wherein the video data generation means comprises:
      virtual space setting means for placing a virtual object and a virtual light source in a virtual space;
      color adjustment means for adjusting the color of the virtual object so as to reflect the influence of the virtual light source; and
      rendering means for generating the video data by rendering the virtual object.
  3.  The stage apparatus according to claim 2, wherein the virtual space setting means places the virtual object and the virtual light source in the virtual space in a positional relationship corresponding to the positional relationship between the display unit and the light.
  4.  The stage apparatus according to claim 2 or 3, wherein the number, intensity, color, and/or direction of the virtual light source corresponds to the number, intensity, color, and/or direction of the light.
  5.  The stage apparatus according to any one of claims 1 to 4, further comprising main motion data generation means for generating main motion data by detecting the movement of a measured person,
      wherein the video data generation means generates, on the basis of the main motion data, video data in which the video object moves simultaneously with the movement of the measured person.
  6.  The stage apparatus according to claim 5, further comprising:
      a controller operable by the measured person while performing the movement; and
      sub motion data generation means for generating sub motion data according to operations on the controller,
      wherein the video data generation means changes the shape and/or pattern of all or part of the video object on the basis of the sub motion data.
  7.  The stage apparatus according to claim 5 or 6, further comprising:
      preset motion data storage means for storing preset motion data specifying the movement of the video object; and
      switching means for switching the video data generation means between a first state and a second state,
      wherein the video data generation means generates, in the first state, video data in which the video object moves on the basis of the main motion data, and generates, in the second state, video data in which the video object moves on the basis of the preset motion data.
  8.  A stage facility comprising:
      a stage;
      a light capable of irradiating the stage with light;
      lighting control means for controlling the light;
      a display unit, installed on the stage, for displaying a video object; and
      video data generation means for generating video data for displaying the video object on the display unit,
      wherein the video data generation means adjusts the color of the video object in synchronization with the control of the light by the lighting control means.
PCT/JP2013/064293 2012-05-31 2013-05-22 Stage device and stage facility WO2013179992A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012124984A JP2013250773A (en) 2012-05-31 2012-05-31 Stage device and stage facility
JP2012-124984 2012-05-31

Publications (1)

Publication Number Publication Date
WO2013179992A1 true WO2013179992A1 (en) 2013-12-05

Family

ID=49673192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/064293 WO2013179992A1 (en) 2012-05-31 2013-05-22 Stage device and stage facility

Country Status (2)

Country Link
JP (1) JP2013250773A (en)
WO (1) WO2013179992A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015170680A1 (en) * 2014-05-09 2015-11-12 コニカミノルタ株式会社 Projection system
CN110502039A (en) * 2019-08-26 2019-11-26 太仓秦风广告传媒有限公司 A kind of dance & art light automatic method and its system based on limbs identification

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018027215A (en) * 2016-08-18 2018-02-22 株式会社カプコン Game program and game system
JP6514397B1 (en) * 2018-06-29 2019-05-15 株式会社コロプラ SYSTEM, PROGRAM, METHOD, AND INFORMATION PROCESSING APPARATUS
JP7181148B2 (en) * 2019-04-11 2022-11-30 株式会社コロプラ System, program, method, and information processing device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001078230A (en) * 1999-08-31 2001-03-23 Mixed Reality Systems Laboratory Inc Composite reality presenting device, and its method and program storage medium
JP2003264740A (en) * 2002-03-08 2003-09-19 Cad Center:Kk Observation scope
JP2003533235A (en) * 1999-07-26 2003-11-11 ガイ・ジョナサン・ジェームズ・ラッカム Virtual production device and method
JP2008036097A (en) * 2006-08-04 2008-02-21 Heiankaku:Kk Hall for ceremonial occasions, wedding hall, and performance method for wedding
JP2009076060A (en) * 2007-08-29 2009-04-09 Casio Comput Co Ltd Image composition apparatus and image composition processing program



Also Published As

Publication number Publication date
JP2013250773A (en) 2013-12-12


Legal Events

Code  Title / Description
121   Ep: the epo has been informed by wipo that ep was designated in this application
      (Ref document number: 13797482; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase
      (Ref country code: DE)
122   Ep: pct application non-entry in european phase
      (Ref document number: 13797482; Country of ref document: EP; Kind code of ref document: A1)