KR20170024374A - Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same - Google Patents

Info

Publication number
KR20170024374A
Authority
KR
South Korea
Prior art keywords
performance
stage
data
emotion
sound
Prior art date
Application number
KR1020150119548A
Other languages
Korean (ko)
Inventor
권광만
Original Assignee
(주)케이엠정보기술
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)케이엠정보기술 filed Critical (주)케이엠정보기술
Priority to KR1020150119548A priority Critical patent/KR20170024374A/en
Publication of KR20170024374A publication Critical patent/KR20170024374A/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63J DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J5/00 Auxiliaries for producing special effects on stages, or in circuses or arenas
    • A63J5/02 Arrangements for making stage effects; Auxiliary stage appliances
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63J DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J21/00 Conjuring appliances; Auxiliary apparatus for conjurers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63J DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J5/00 Auxiliaries for producing special effects on stages, or in circuses or arenas
    • A63J5/02 Arrangements for making stage effects; Auxiliary stage appliances
    • A63J5/04 Arrangements for making sound-effects
    • H05B37/0227
    • H05B37/029

Abstract

[0001] The present invention relates to a stage image directing apparatus and method that present stage-based visual images to an audience, and more particularly to an apparatus and method that control the stage image and lighting according to the feeling of the music played in a performance hall, the movement of the performers, and the reaction of the audience, so as to realize stage images and lighting suited to the feel of the performance music and the actual atmosphere of the venue.
To this end, the present invention comprises: a performance sensibility analysis unit 10 for analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience; an emotion-based stage image and lighting control unit 20 for generating and adjusting stage images and lighting suited to the feel of the performance music and the scene atmosphere on the basis of the performance sensibility data; an emotion-based stage image/sound reproducing unit 40 for reproducing stage images and sound that communicate with the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit 20; and an emotion-based lighting control unit 41 for providing lighting suited to the stage atmosphere of the performance hall on the basis of those signals, so that the feel of the performance music and the scene atmosphere of the audience are reflected in the stage images and lighting in real time.

Description

TECHNICAL FIELD [0001] The present invention relates to a stage image directing apparatus.

More particularly, the present invention relates to an apparatus and method that analyze performance sensibility, such as the feeling of the music, the atmosphere of the performance venue, the movement of the performers, and the audience response, and control the stage image, sound, and lighting to match that sensibility, thereby realizing stage images and lighting that interact with the on-site atmosphere of the venue and enabling performances in which the audience takes part.

Conventional performances have mostly taken place in large performance halls, but recently there has been an increasing tendency to perform in small venues such as parks, subway stations, and shopping malls.

In addition, conventional performances are often performer-centered and static in atmosphere, in which case the audience passively watches the performance.

Furthermore, since the stage images and lighting conventionally provided with a performance follow predetermined patterns, they may differ from the actual atmosphere in the venue.

Although techniques that vary the intensity of the stage image based on signal processing of sound or video have recently been used, they do not reflect the actual atmosphere of the venue, such as the performers' movements or the audience's reaction.

That is, the conventional stage image directing method has the problem that the atmosphere of the venue, the emotion of the performers, and the sensibility of the audience cannot be realized on stage in real time.

KR 10-2013-0060486 A
KR 10-2010-0121213 A

SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems of the prior art, and an object of the present invention is to provide an apparatus and method that control the stage image and lighting according to the feeling of the music played at the performance site, the movement of the performers, and the reaction of the audience, thereby realizing stage images and lighting appropriate to the actual atmosphere of the venue.

Another object of the present invention is to allow spectators to participate in the performance through their smartphones, thereby enabling performances in which the audience takes part.

Another object of the present invention is to maximize the effect of the performance by inducing the audience to actively participate in the performance.

According to an aspect of the present invention, there is provided a stage image directing apparatus comprising a stage for performing a performance, a screen provided on the stage, an amplifier device, a main speaker, and a lighting device, the apparatus including: a performance sensibility analysis unit for analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience; an emotion-based stage image and lighting control unit for generating and adjusting stage images and lighting suited to the feel of the performance music and the scene atmosphere on the basis of the performance sensibility data obtained by the performance sensibility analysis unit; an emotion-based stage image/sound reproducing unit for reproducing stage images and sound that communicate with the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit; and an emotion-based lighting control unit for providing lighting suited to the stage atmosphere of the performance hall on the basis of those signals, so that the feel of the performance music and the scene atmosphere of the audience are reflected in the stage images and lighting in real time.

In addition, a smartphone app data processing unit is further provided, which interacts with the audience's smartphone applications so that the audience can participate in the performance.

The performance sensibility analysis unit may include a data analysis unit for analyzing the feel of the performed music, the sound, the movement of the performers, and the sound and movement of the audience, and a performance sensibility data generation unit for quantifying performance sensibility using the data obtained by the data analysis unit.

The data analysis unit may include a sound data analysis unit for analyzing the frequency of the concert hall sound, a video/image data analysis unit for analyzing the performance video and images, a music data analysis unit for analyzing the feeling of the music being played, and a motion data analysis unit for analyzing the motion of the performers and the audience.

Further, the performance sensibility data generation section digitizes the sound emotion data, the image / image emotion data, the motion emotion data, and the music emotion data based on the data obtained by the data analysis section.

The music data analysis unit classifies the music performed by the performers into cheerful, happy, relaxed, calm, depressed, crying, scary, and tense music, and digitizes the music data by assigning weights to the tempo, dynamics, brightness, amplitude change, and noise.

The motion data analysis unit may classify the motion of the audience into surprise, sadness, anger, disgust, fear, and joy to derive the motion sensibility data.

In addition, the smartphone app data processing unit may transmit messages sent by audience members through the smartphone application to the emotion-based stage image/sound reproducing unit so that they are displayed on the screen.

Meanwhile, a stage image directing method according to the present invention is a method using a stage, a screen provided on the stage, an amplifier device, a main speaker, and a lighting device, comprising the steps of: (a) generating performance sensibility data by analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience; (b) generating emotion-based stage images and lighting corresponding to the feel of the performance music and the scene atmosphere from the performance sensibility data; (c) reproducing stage images and sound that communicate with the stage atmosphere of the performance hall; and (d) providing lighting suited to the stage atmosphere of the performance hall, so that the feel of the performance music and the scene atmosphere of the audience are reflected in the stage images/sound and lighting in real time.

The method further includes the step of interacting with the smartphone application of the audience so that the audience can participate in the performance.

In the step of generating the performance sensibility data, the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience are analyzed as engineering data; the intensity of the emotion data is then calculated for the terminal nodes of the performance emotion tree, and the emotion index of the upper nodes is quantified.

Also, in the step of generating the emotion-based stage image and lighting, stage images/sound and lighting communicating with the actual performance situation are generated and adjusted based on the performance sensibility data obtained by quantifying the performance sensibility.

According to the present invention, the stage image and lighting are controlled according to the feeling of the music played at the performance site, the movement of the performers, and the reaction of the audience, so that stage images and lighting suited to the feel of the performance music and the atmosphere of the venue can be realized.

In addition, the effect of the performance can be maximized because the performance can be audience-participatory rather than performer-centered.

In addition, when an audience transmits a message through a smartphone application, the contents can be displayed on the stage, thereby making it possible for the audience to participate in the performance.

FIGS. 1 and 2 are conceptual diagrams of a stage image directing apparatus according to the present invention.
FIG. 3 is a system configuration diagram of a stage image directing apparatus according to the present invention.
FIG. 4 is a block diagram of a stage image directing apparatus according to the present invention.
FIG. 5 is a conceptual diagram of performance sensibility data and media engineering data according to the present invention.
FIGS. 6 to 16 illustrate the process of analyzing music emotion in the performance sensibility analysis unit:
FIG. 6 classifies the emotions heard and felt in music.
FIG. 7 shows an emotion tree by which the sensibility of music is analyzed.
FIG. 8 shows an example of classifying the emotion of music.
FIG. 9 shows the result of extracting each element from music.
FIG. 10 illustrates the process of calculating probabilities from musical element data using a normal distribution.
FIG. 11 shows the result of calculating those probabilities.
FIG. 12 shows a two-dimensional emotion coordinate system.
FIG. 13 shows an example of analyzing music using the emotion tree.
FIGS. 14 and 15 show music mapped onto the two-dimensional emotion coordinate system.
FIG. 16 shows a method of controlling the emotion of music in the two-dimensional emotion coordinate system.
FIG. 17 is a conceptual diagram of analyzing the movement of an audience.
FIG. 18 shows an emotion tree for the movement of an audience.
FIG. 19 shows an example of sound visualization.

Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings.

FIGS. 1 and 2 are conceptual views of the stage image directing apparatus according to the present invention.

As shown in FIGS. 1 and 2, the stage image directing apparatus according to the present invention reflects the atmosphere of the performance venue and the reactions of the performers and the audience in the stage image and lighting.

In other words, performance sensibility is quantified by perceiving the sensibility in the venue through its sound and music, and the stage image and lighting are controlled accordingly.

Accordingly, it is possible to realize effect images and lighting that communicate with the actual stage atmosphere of the venue.

In addition, by enabling the audience to participate in the performance through smartphone applications, the effect of the performance can be maximized.

To this end, the present invention provides the modules shown in FIGS. 3 and 4, which analyze the situation on stage through sensors to generate interactive images and allow the audience to participate in the performance through a smartphone application.

As shown in FIGS. 4 and 5, a stage image directing apparatus according to the present invention comprises a stage for performing a performance, a screen provided on the stage, an amplifier device, a main speaker, and a lighting device, and includes: a performance sensibility analysis unit (10) for analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience; an emotion-based stage image and lighting control unit (20) for generating and adjusting stage images and lighting suited to the feel of the performance music and the scene atmosphere on the basis of the performance sensibility data obtained by the performance sensibility analysis unit (10); an emotion-based stage image/sound reproducing unit (40) for reproducing stage images and sound that communicate with the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit (20); and an emotion-based lighting control unit (41) for providing lighting suited to the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit (20).

With the above-described configuration, it is possible to reflect the feeling of the performance music and the scene atmosphere of the audience in real time on the stage image and the illumination.

In addition, it is preferable to further include a smartphone app data processing unit 30 that interacts with the audience's smartphone applications so that the audience can participate in the performance.

The smartphone app data processing unit 30 transmits messages sent by audience members through the smartphone application to the emotion-based stage image/sound reproducing unit 40 so that they are displayed on the screen.

Accordingly, various messages can be displayed on the screen during the performance, and various events can be staged.

As shown in FIG. 4, the performance sensibility analysis unit 10 includes a data analysis unit for analyzing the feel of the performed music, the sound, the movement of the performers, and the sound and movement of the audience, and a performance sensibility data generation unit for quantifying performance sensibility using the data obtained by the data analysis unit.

The data analysis unit includes a sound data analysis unit for analyzing the frequency of the concert hall sound, a video/image data analysis unit for analyzing the performance video and images, a music data analysis unit for analyzing the feeling of the music being played, and a motion data analysis unit for analyzing the motion of the performers and the audience.

The performance sensibility data generation unit digitizes the sound sensibility data, the video/image sensibility data, the motion sensibility data, and the music sensibility data into numerical values using the data obtained by the data analysis unit, and outputs them to the emotion-based stage image and lighting control unit 20.

Hereinafter, the method by which the music data analysis unit of the performance sensibility analysis unit 10 analyzes the performance music using the emotion tree will be described with reference to FIGS. 6 to 16.

The music data analysis unit classifies the feeling of the music performed by the performers into cheerful, happy, comfortable, calm, depressed, crying, scary, and tense music, as shown in FIGS. 6 and 7.

In addition, the musical features are classified into tempo, dynamics, brightness, amplitude change, and noise, and each item is weighted to express the emotion numerically.

The tempo measures the number of beats per minute, the dynamics measure the intensity of the signal, the brightness measures the tonal brightness of the music, the amplitude change measures the variation of the frequency amplitude, and the noise measures the amount of noise.
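As an illustration only, these five elements can be approximated with an audio analysis library. The sketch below uses Python with librosa; the patent does not specify its extraction method, so the chosen proxies (spectral centroid for brightness, spectral flatness for noise, variation of the RMS envelope for amplitude change) are assumptions.

```python
# A minimal sketch of extracting the five musical elements named above.
# The feature proxies are assumptions, not the patent's own method.
import librosa
import numpy as np

def extract_music_elements(path: str) -> dict:
    y, sr = librosa.load(path, mono=True)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)        # beats per minute
    rms = librosa.feature.rms(y=y)[0]                     # frame-wise energy
    return {
        "tempo": float(tempo),
        "dynamics": float(rms.mean()),                    # overall loudness
        "brightness": float(librosa.feature.spectral_centroid(y=y, sr=sr).mean()),
        "amplitude_change": float(rms.std()),             # loudness variation
        "noise": float(librosa.feature.spectral_flatness(y=y).mean()),
    }
```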

FIG. 8 shows an example of classifying the sensibility of music for a real song, and FIG. 9 shows each element extracted from the music and expressed numerically.

However, since each element of the music shown in Fig. 9 has a different unit and range, it is necessary to unify the unit and the range using the normal distribution.

To do this, the mean and standard deviation are calculated and the probability is calculated using the normal distribution.

FIG. 10 shows part of the probability calculation using the normal distribution, and FIG. 11 shows the probability data computed using the normal distribution.

As shown in FIG. 11, it can be seen that the values of emotion, tempo, dynamics, noise, amplitude, and brightness for each song are unified.
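A minimal sketch of this unification step, assuming the value-to-probability mapping shown in FIGS. 10 and 11 is the normal CDF evaluated with the per-element mean and standard deviation:

```python
# Unify units and ranges: map each raw element value to a probability in
# [0, 1] via the normal CDF, using the element's mean and standard deviation
# over the song corpus.
import numpy as np
from scipy.stats import norm

def to_probability(values: np.ndarray) -> np.ndarray:
    mu, sigma = values.mean(), values.std()
    return norm.cdf(values, loc=mu, scale=sigma)   # P(X <= value)

tempos = np.array([86.0, 120.0, 140.0, 95.0, 168.0])   # example BPM values
print(to_probability(tempos))  # now comparable with the other elements
```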

As a result of evaluating emotion accuracy for 70 sample songs, the accuracy was the highest when the weights were as shown in [Table 1].

Therefore, weights for music extraction elements are preferably given as shown in [Table 1] below.

[Table 1] Weights of the music extraction elements

Variable                Tempo   Dynamics   Noise   Amplitude change   Brightness
X (happiness degree)    20%     10%        40%     20%                10%
Y (excitement degree)   40%     10%        10%     20%                20%

In Table 1, the happiness degree X and the excitement degree Y are obtained by multiplying the probability of each extracted element by its weight and summing the results.
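A minimal sketch of this weighted sum using the Table 1 weights; the element probabilities are assumed to be expressed on a 0 to 100 scale, so that the result matches coordinates such as (65.54, 54.65) quoted below:

```python
# Happiness degree (X) and excitement degree (Y) as weighted sums of the
# element probabilities, using the weights of [Table 1].
WEIGHTS = {
    "X": {"tempo": 0.20, "dynamics": 0.10, "noise": 0.40,
          "amplitude_change": 0.20, "brightness": 0.10},
    "Y": {"tempo": 0.40, "dynamics": 0.10, "noise": 0.10,
          "amplitude_change": 0.20, "brightness": 0.20},
}

def emotion_coords(probs: dict) -> tuple:
    """probs: element probabilities on a 0-100 scale."""
    x = sum(w * probs[k] for k, w in WEIGHTS["X"].items())
    y = sum(w * probs[k] for k, w in WEIGHTS["Y"].items())
    return x, y
```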

FIG. 12 shows a music emotion coordinate system in which the happiness degree is the X axis and the excitement degree is the Y axis, and FIGS. 13 to 16 show the process of mapping the happiness and excitement obtained for the example song onto the two-dimensional emotion coordinate system. The process is described below.

First, as shown in FIG. 13, after the tempo, dynamics, noise, amplitude change, and brightness are measured and the numerical values are integrated as probability values, the happiness degree (X) and excitement degree (Y) coordinate values are obtained.

Then, as shown in FIG. 14, the happiness (X) and excitement (Y) values are mapped to a point in the two-dimensional coordinate system.

Then, when an emotion probability distribution is created by drawing a circle centered on that point, the percentage of the circle's area falling in each emotion region becomes the emotion probability.

For example, as shown in FIG. 14, when the coordinates are (65.54, 54.65) and the radius of the circle is 30, the emotion probability values are happy: 32%, comfortable: 24%, cheerful: 23%, calm: 6%, tense: 6%, scary: 4%, depressed: 3%, and crying: 2%.
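For illustration, the circle-area calculation can be sketched with Monte Carlo sampling. The emotion regions are assumed here to be nearest-prototype cells around hypothetical prototype coordinates; the patent defines the actual region layout only in FIG. 12.

```python
# Estimate emotion probabilities as the fraction of the circle's area falling
# in each emotion region. Prototype positions below are placeholders.
import numpy as np

PROTOTYPES = {  # (happiness X, excitement Y) -- hypothetical positions
    "cheerful": (60, 90), "happy": (90, 70), "comfortable": (85, 30),
    "calm": (60, 10), "depressed": (30, 10), "crying": (10, 30),
    "scary": (15, 70), "tense": (40, 90),
}

def emotion_probabilities(x, y, radius, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    r = radius * np.sqrt(rng.random(n))        # uniform sampling in a disc
    theta = rng.random(n) * 2 * np.pi
    pts = np.stack([x + r * np.cos(theta), y + r * np.sin(theta)], axis=1)
    names = list(PROTOTYPES)
    centers = np.array([PROTOTYPES[k] for k in names])
    nearest = np.argmin(((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    return {k: round(float((nearest == i).mean()), 3) for i, k in enumerate(names)}

print(emotion_probabilities(65.54, 54.65, radius=30))
```

Enlarging the radius spreads the samples over more regions, which is why the probabilities flatten in the radius-40 example below.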

At this time, as shown in FIG. 15, the emotional probability can be changed by changing the radius of the circle.

For example, if the radius of the circle is 40, as shown in FIG. 15, the emotion probability values are happy: 23%, comfortable: 18%, cheerful: 16%, calm: 15%, tense: 10%, scary: 7%, depressed: 6%, and crying: 5%.

In addition, in order to obtain a comfortable and calm sensibility from the initial coordinates (65.54, 54.65), the emotion probability circle may be moved toward the lower side of the coordinate system, as shown in FIG. 16.

For example, when the emotion coordinate is moved to (65.54, 24.65), the value of Y becomes smaller by 30, and the emotion probability values at this time are calm: 52%, comfortable: 34%, depressed: 10%, happy: 4%, tense: 0%, scary: 0%, cheerful: 0%, and crying: 0%.

In this way, music with the desired sensibility can be obtained by decreasing the value of each element, with reference to the weights by which the tempo, dynamics, noise, amplitude change, and brightness affect the excitement degree Y, so that the weighted sum becomes 24.65.
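A minimal sketch of this adjustment under the simplest assumed policy: scale every element value proportionally until the weighted Y sum reaches the target. The patent leaves the exact redistribution open, and note that changing the element values also shifts X.

```python
# Lower the excitement degree Y to a target by scaling the element
# probabilities proportionally. Proportional scaling is an assumed policy.
Y_WEIGHTS = {"tempo": 0.40, "dynamics": 0.10, "noise": 0.10,
             "amplitude_change": 0.20, "brightness": 0.20}

def retarget_excitement(probs: dict, y_target: float) -> dict:
    y_now = sum(w * probs[k] for k, w in Y_WEIGHTS.items())
    scale = y_target / y_now          # e.g. 24.65 / 54.65 ~ 0.45
    return {k: v * scale for k, v in probs.items()}
```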

That is, the sensibility probability can be calculated by converting music data into computable data and mapping it to an emotional coordinate system.

In addition, by changing the emotion probability in the emotion probability coordinate system, it is possible to realize music with a desired emotion.

FIGS. 17 and 18 show how the motion data analysis unit of the performance sensibility analysis unit 10 classifies the motion of the audience into surprise, sadness, anger, disgust, fear, and joy to derive the motion sensibility data, and FIG. 19 shows an example of visualizing the sound.

The function and effect of the present invention will be described below.

When the performers start the performance, the performance sensibility analysis unit 10 analyzes the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience to generate performance sensibility data.

At this time, the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience are analyzed as engineering data, and the intensity of the emotion data for the terminal nodes of the performance emotion tree is calculated to quantify the emotion index of the upper nodes.
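A minimal sketch of this bottom-up quantification, assuming each upper node's emotion index is the weighted average of its children. The node names and weights are placeholders; the patent shows the actual tree only in FIGS. 7 and 18.

```python
# Roll terminal-node intensities (measured from sound, video, music, motion)
# up into the emotion index of each upper node of the performance emotion tree.
from dataclasses import dataclass, field

@dataclass
class EmotionNode:
    name: str
    weight: float = 1.0        # contribution to the parent node
    intensity: float = 0.0     # measured value, used on terminal nodes only
    children: list = field(default_factory=list)

    def index(self) -> float:
        if not self.children:              # terminal node
            return self.intensity
        total = sum(c.weight for c in self.children)
        return sum(c.weight * c.index() for c in self.children) / total

tree = EmotionNode("performance", children=[
    EmotionNode("music_feel", 0.4, children=[
        EmotionNode("happiness", 0.5, intensity=65.54),
        EmotionNode("excitement", 0.5, intensity=54.65)]),
    EmotionNode("audience_motion", 0.3, intensity=40.0),
    EmotionNode("venue_sound", 0.3, intensity=55.0),
])
print(tree.index())   # quantified emotion index of the root node
```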

Based on the performance sensibility data, the emotion-based stage image and lighting control unit 20 generates emotion-based stage images and lighting corresponding to the feel of the performance music and the scene atmosphere.

At this time, stage image / sound and illumination communicating with the actual performance situation are generated and adjusted by the performance sensibility data in which the performance sensibility is quantified.

Meanwhile, the smartphone app data processing unit 30 interacts with the audience's smartphone applications so that the audience can actively participate in the performance.

Then, the emotion-based stage image / sound reproducing unit 40 provides the stage image / sound that communicates with the stage atmosphere, and the emotion based illumination control unit 41 provides illumination communicating with the stage atmosphere.

According to the present invention, the stage image and illumination can be changed by extracting human performance sensibility felt in a performance situation based on the emotion tree, and numerically controlling it.

That is, after the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience are analyzed as engineering data, the intensity of the emotion data for the terminal nodes of the performance emotion tree is calculated, and the result can be input to the module controlling the stage devices.

Thus, the feeling of the performance music and the scene atmosphere of the audience can be reflected in the stage video / sound and illumination in real time.

While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes and modifications may be made without departing from the scope of the present invention.

For example, the present invention can be applied not only to a performance stage but also to a karaoke system in embedded form.

10: Performance sensibility analysis unit
20: Emotion-based stage image and lighting control unit
21: Stage image and lighting search unit
22: Performance-sympathetic image and lighting generation unit
23: Performance-sympathetic image and lighting control unit
30: Smartphone app data processing unit
40: Emotion-based stage video/sound reproducing unit
41: Emotion-based lighting control unit

Claims (12)

A stage image producing apparatus comprising a stage for performing a performance, a screen provided on a stage, an amplifier device, a main speaker, and a lighting device,
A performance sensibility analysis unit 10 for analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performer, the sound and movement of the audience,
An emotion-based stage image and lighting control unit 20 for generating and adjusting a stage image and an illumination corresponding to the feel of the performance music and the scene atmosphere on the basis of the performance emotion data obtained by the performance emotion analyzing unit 10,
An emotion-based stage video/sound reproducing unit 40 for reproducing stage video/sound that communicates with the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit 20, and
An emotion-based lighting control unit 41 for providing lighting suited to the stage atmosphere of the performance hall according to signals transmitted from the emotion-based stage image and lighting control unit 20,
Wherein the feeling of the performance music and the scene atmosphere of the audience are reflected in the stage image and lighting in real time.
The apparatus according to claim 1,
And a smartphone app data processing unit (30) for interacting with the smartphone application of the audience so that the audience can participate in the performance.
The apparatus according to claim 1,
Wherein the performance sensibility analysis unit (10) comprises:
A data analyzing unit for analyzing the feeling of the playing music, the sound, the movement of the player, the sound and movement of the audience,
And a performance sensory data generation unit for digitizing performance sensibility by using the data obtained by the data analysis unit.
The apparatus according to claim 3,
The data analysis unit may include:
A sound data analysis unit for analyzing the frequency of the concert hall sound,
An image / image data analyzing unit for analyzing the performance image and the image,
A music data analysis unit for analyzing a feeling of music to be played;
And a motion data analyzer for analyzing the motion of the performer and the audience.
The apparatus according to claim 3,
Wherein the performance sensibility data generation unit comprises:
Wherein the sound emotion data, the image / image emotion data, the motion emotion data, and the music emotion data are digitized by the data obtained by the data analyzing unit.
The apparatus according to claim 4,
Wherein the music data analyzing unit comprises:
Classifies the feeling of the music performed by the performers into cheerful, happy, relaxed, calm, depressed, crying, scary, and tense music,
Wherein the musical data is digitized by assigning weights to the tempo, dynamics, brightness, amplitude variation, and noise.
The apparatus according to claim 4,
Wherein the motion data analyzer comprises:
Wherein the motion sensibility data are derived by classifying the motion of the audience into surprise, sadness, anger, disgust, fear, and joy.
The apparatus according to claim 2,
The smartphone app data processing unit (30)
Transmits the message sent by the audience through the smartphone application to the emotion-based stage image/sound reproducing unit (40) so that it is displayed on the screen.
A stage image producing method using a stage, a screen provided on a stage, an amplifier device, a main speaker, and a lighting device,
(a) generating (S10) performance sensibility data by analyzing the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience,
(b) generating (S20) an emotion-based stage image and illumination suitable for the feel of the performance music and the scene atmosphere by the performance emotion data,
(c) reproducing a stage image / sound communicating with the stage atmosphere of the performance hall (S30)
(d) providing illumination appropriate to the stage atmosphere of the performance hall (S40)
Wherein the feeling of the performance music and the scene atmosphere of the audience are reflected in the stage image/sound and lighting in real time.
The method according to claim 9,
Further comprising the step of interacting with the smartphone application of the audience before step S30, so that the audience can participate in the performance.
The method according to claim 9,
The step (S10) of generating the performance sensibility data comprises:
Wherein the sound of the performance hall, the feeling of the performance music, the movement of the performers, and the sound and movement of the audience are analyzed as engineering data, the intensity of the emotion data for the terminal nodes of the performance emotion tree is calculated, and the emotion index of the upper nodes is quantified.
The method according to claim 9,
The step S20 of generating the emotion-based stage image and illumination comprises:
Wherein stage images/sound and lighting communicating with the actual performance situation are generated and adjusted, respectively, based on the performance sensibility data obtained by quantifying the performance sensibility.


KR1020150119548A 2015-08-25 2015-08-25 Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same KR20170024374A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150119548A KR20170024374A (en) 2015-08-25 2015-08-25 Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150119548A KR20170024374A (en) 2015-08-25 2015-08-25 Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same

Publications (1)

Publication Number Publication Date
KR20170024374A (en) 2017-03-07

Family

ID=58411620

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150119548A KR20170024374A (en) 2015-08-25 2015-08-25 Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same

Country Status (1)

Country Link
KR (1) KR20170024374A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100121213A (en) 2009-05-08 2010-11-17 엘피엠테크놀로지 주식회사 Method and apparatus for stage effect based on the location of actors
KR20130060486A (en) 2011-11-30 2013-06-10 하이네트(주) Automatic control apparatus for smart light using stage

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109668077A (en) * 2017-10-16 2019-04-23 提姆拉博株式会社 Stage lighting system and stage illumination, lighting method, and record have the record media of computer program
KR20190096787A (en) * 2018-02-09 2019-08-20 광주과학기술원 Emotion evaluation system
KR20200023564A (en) * 2018-08-23 2020-03-05 동국대학교 산학협력단 Apparatus for visualizing musical score and operating method thereof
KR20200121568A (en) * 2019-04-16 2020-10-26 박덕선 Performance production control system for displaying performance information to portable light emitting stick by being interworked with rallying cry of participant and Method for controlling the same
CN110968115A (en) * 2019-11-20 2020-04-07 杭州友邦演艺设备有限公司 Stage acoustic reflection cover control method
CN110853459A (en) * 2019-12-09 2020-02-28 唐山师范学院 Music teaching environment simulation system
KR102617842B1 (en) * 2022-12-28 2023-12-22 하동선 Control system for stage
CN116962885A (en) * 2023-09-20 2023-10-27 成都实时技术股份有限公司 Multipath video acquisition, synthesis and processing system based on embedded computer
CN116962885B (en) * 2023-09-20 2023-11-28 成都实时技术股份有限公司 Multipath video acquisition, synthesis and processing system based on embedded computer

Similar Documents

Publication Publication Date Title
KR20170024374A (en) Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same
JP6923245B2 (en) Audience-based haptics
TWI486904B (en) Method for rhythm visualization, system, and computer-readable memory
CN107329980B (en) Real-time linkage display method based on audio and storage device
US11043216B2 (en) Voice feedback for user interface of media playback device
CN111916039B (en) Music file processing method, device, terminal and storage medium
TWI451897B (en) Method and system for providing user visual feedback
Rocchesso Explorations in sonic interaction design
Camurri et al. The MEGA project: Analysis and synthesis of multisensory expressive gesture in performing art applications
US11120633B2 (en) Interactive virtual reality system for experiencing sound
Su et al. Interactive exploration of a hierarchical spider web structure with sound
Selfridge et al. Interactive mixing using wii controller
TW201340694A (en) Situation command system and operating method thereof
Zandt-Escobar et al. Piaf: A tool for augmented piano performance using gesture variation following
Wang et al. Heart fire: A smart watch-based musician-listener interaction system for online live-streamed concerts: A pilot study
Selfridge et al. Augmented live music performance using mixed reality and emotion feedback
WO2022163137A1 (en) Information processing device, information processing method, and program
Liljedahl Musical Pathfinding; or How to Listen to Interactive Music Video
KR20150012012A (en) Music fountain control method and system based on musical emotion
Evreinova et al. An exploration of volumetric data in auditory space
Rönnberg et al. Traces of modal synergy: studying interactive musical sonification of images in general-audience use
Choi et al. Sounds shadowing agents generating audible features from emergent behaviors
CN114302232B (en) Animation playing method and device, computer equipment and storage medium
JP7419768B2 (en) Music generation method and music generation system
Suzuki Tactile Computing Interaction and its Application for Arts