WO2018199115A1 - Production control device, production system, and program - Google Patents

Production control device, production system, and program

Info

Publication number
WO2018199115A1
Authority
WO
WIPO (PCT)
Prior art keywords
effect
sound
event
detection information
detection
Prior art date
Application number
PCT/JP2018/016678
Other languages
French (fr)
Japanese (ja)
Inventor
培仁 田中
浩晃 杉村
安樹絵 伊東
柿下 正尋
貴裕 浅野
旬 臼井
多田 幸生
Original Assignee
Yamaha Corporation
Fujitsu Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Corporation and Fujitsu Limited
Priority to JP2019514549A priority Critical patent/JPWO2018199115A1/en
Publication of WO2018199115A1 publication Critical patent/WO2018199115A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63J: DEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J5/00: Auxiliaries for producing special effects on stages, or in circuses or arenas
    • A63J5/02: Arrangements for making stage effects; Auxiliary stage appliances
    • A63J5/04: Arrangements for making sound-effects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04R: LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R3/00: Circuits for transducers, loudspeakers or microphones

Definitions

  • the present invention relates to a technique for producing an event.
  • Patent Document 1 describes a technique in which a video stream obtained from a camera that captures a live sports event is transmitted to a portable electronic device of a spectator and displayed on the display.
  • with the technique of Patent Document 1, however, the spectator can only watch the players on video, which lacks a sense of presence.
  • in recent years, effects using media such as sound, light, and video have been staged to liven up events.
  • such conventional effects, however, are performed regardless of the players' play and the spectators' reactions at the event, and do nothing to enhance the event's sense of presence.
  • an object of the present invention is to provide a production effect that enhances the sense of presence at an event.
  • the present invention provides a production control device including: an acquisition means that acquires detection information indicating a physical quantity detected by a detection device at a place related to an event; a determination means that determines a production effect whose content is based on the detection information acquired by the acquisition means; and a production control means that controls a production device in accordance with the production effect determined by the determination means and causes the production effect to be provided.
  • the production control device may further include a specifying means that specifies the status of the event based on the detection information acquired by the acquisition means, and the determination means may determine a production effect corresponding to the event status specified by the specifying means.
  • the event situation may include a play performed by a player in the event.
  • the production control device may further include a generation means that generates a sound signal based on the detection information when a production effect using sound is determined by the determination means, and the production control means may supply the sound signal generated by the generation means to a speaker and cause the speaker to output a sound corresponding to that signal.
  • the detection device may include a microphone that collects sound emitted at the event, the detection information may include a sound signal indicating the sound collected by the microphone, the generation means may perform sound processing on the sound signal included in the detection information, and the production control means may supply the processed sound signal to the speaker.
  • the detection device may include a motion detection device that detects a motion of a player at the event, the detection information may indicate the motion detected by the motion detection device, and the generation means may generate a sound signal corresponding to the motion indicated by the detection information.
  • the detection device may include a biological sensor that measures biological information of a player at the event, the detection information may include the biological information measured by the biological sensor, and the generation means may generate a sound signal corresponding to the biological information included in the detection information.
  • the acquisition means may further acquire progress information indicating the progress of the event, and the determination means may determine the production effect based on the detection information and the progress information acquired by the acquisition means.
  • the determination means may determine a plurality of different production effects for a plurality of spectators or a plurality of spectator groups watching the event, and the production control means may control a plurality of production devices corresponding to the plurality of spectators or spectator groups so that each production device provides the production effect determined for the spectator or spectator group to which it corresponds.
  • the effect may be an effect using sound, light, vibration, or an image.
  • the present invention also provides a production system including: a detection device that detects a physical quantity at a place related to an event; a production device that provides a production effect for the event; an acquisition means that acquires detection information indicating the physical quantity detected by the detection device; a determination means that determines a production effect whose content is based on the acquired detection information; and a production control means that controls the production device in accordance with the determined production effect and causes the production effect to be provided.
  • the present invention also provides a program that causes a computer to execute the steps of: acquiring detection information indicating a physical quantity detected by a detection device at a place related to an event; determining a production effect whose content is based on the acquired detection information; and controlling a production device in accordance with the determined production effect to provide the production effect.
  • FIG. 10 is a diagram showing an example of the display device 28 according to a modification.
  • FIG. 1 is a diagram illustrating an example of a configuration of an effect system 1 according to a first embodiment.
  • the production system 1 provides production effects for an event.
  • This event may be, for example, various sports matches, concerts, or plays.
  • the event is a basketball game.
  • the place where the event is held is composed of the field 2 and the spectator seat area 3.
  • Field 2 includes a court 4.
  • On the court 4, the players play the basketball game.
  • a plurality of spectator seats 50 are provided in the spectator seat area 3. In the spectator seat area 3, the spectators watch the game.
  • at the place where the event is held, a plurality of sound collection devices 11, a plurality of vibration detection devices 12, a plurality of imaging devices 13, a plurality of sound emitting devices 21, a plurality of light emitting devices 22, and a projection device 23 are provided. Note that the numbers and positions of these devices shown in FIG. 1 are examples; they are not limited to those illustrated in FIG. 1.
  • the plurality of sound collecting devices 11 are provided at positions where the player's play sounds can be collected.
  • the plurality of sound collection devices 11 may be provided around the court 4 in the field 2.
  • the plurality of sound collection devices 11 may also be embedded in the court 4.
  • Each sound collection device 11 collects sound at a predetermined time interval.
  • Each sound collection device 11 has a microphone, for example.
  • the plurality of vibration detection devices 12 are provided at positions where vibrations generated by the player's play can be detected.
  • the plurality of vibration detection devices 12 may be embedded in the court 4.
  • Each vibration detection device 12 detects vibration at a predetermined interval.
  • Each vibration detection device 12 includes, for example, a vibration sensor.
  • the plurality of imaging devices 13 are provided at positions where the players' play can be captured.
  • Each imaging device 13 captures video.
  • Each imaging device 13 includes, for example, a digital video camera.
  • the plurality of sound emitting devices 21 are provided at positions where the sound output from these sound emitting devices 21 can be heard by the audience.
  • For example, the plurality of sound emitting devices 21 may be provided in the aisles of the spectator seat area 3 or between the spectator seats 50.
  • Each sound emitting device 21 outputs a sound corresponding to the sound signal.
  • Each sound emitting device 21 has a speaker, for example. This speaker may be a directional speaker.
  • the plurality of light emitting devices 22 are provided at positions where the light they emit is visible to the players, the spectators, or both.
  • For example, the plurality of light emitting devices 22 may be provided in the aisles of the spectator seat area 3 or between the spectator seats 50.
  • Each light emitting device 22 emits light according to the light emission control signal.
  • Each light emitting device 22 has, for example, an LED (Light Emitting Diode).
  • The projection device 23 is provided at a position from which video can be projected so as to be visible to the players, the spectators, or both.
  • the projection device 23 projects an image corresponding to the image signal onto the projection surface.
  • the projection device 23 includes, for example, a projector. The projection surface may be, for example, the field 2, the court 4, or the ceiling.
  • FIG. 2 is a diagram illustrating an example of the spectator seat 50.
  • each spectator seat 50 is provided with a sound emitting device 24 similar to the sound emitting device 21 described above, and a vibration device 25.
  • the spectator seat 50 may not necessarily be provided with all of these devices. For example, only one of the sound emitting device 24 and the vibration device 25 may be provided in the spectator seat 50. Further, these devices may not be provided in all the spectator seats 50. For example, these devices may be provided only in some spectator seats 50.
  • the vibration device 25 is provided at a position where vibration can be given to the audience.
  • the vibration device 25 is provided behind the seat surface of the spectator seat 50.
  • the vibration device 25 vibrates according to the vibration control signal.
  • the vibration device 25 has, for example, a vibration motor.
  • FIG. 3 is a diagram illustrating an example of the ball 60 used in the basketball game and a wearable terminal 70 worn by a player.
  • the ball 60 incorporates the motion detection device 14.
  • the motion detection device 14 detects the motion of the ball 60.
  • the motion detection device 14 includes, for example, an acceleration sensor.
  • the wearable terminal 70 is, for example, a wristwatch type terminal worn by a player. However, the wearable terminal 70 may have other shapes as long as the shape is wearable by the player.
  • the wearable terminal 70 includes a sound collection device 15 similar to the sound collection device 11 described above, and a biological sensor 16. Note that the wearable terminal 70 is not necessarily provided with all of these devices. For example, only one of the sound collection device 15 and the biological sensor 16 may be provided in the wearable terminal 70.
  • the biosensor 16 measures the player's biometric information at predetermined time intervals.
  • This biological information may be, for example, heart rate, pulse rate, body temperature, or blood pressure.
  • the biological sensor 16 includes a heart rate meter that measures the heart rate.
  • FIG. 4 is a diagram illustrating an example of a wearable terminal 80 worn by a spectator.
  • the wearable terminal 80 is, for example, a wristwatch type terminal worn by a spectator. However, the wearable terminal 80 may have other shapes as long as it can be worn by the audience.
  • the wearable terminal 80 includes a sound collecting device 17 similar to the sound collecting device 11 described above, a sound emitting device 26 similar to the sound emitting device 21 described above, and a light emitting device 27 similar to the light emitting device 22 described above. Note that the wearable terminal 80 is not necessarily provided with all of these devices. For example, only one or two of the sound collecting device 17, the sound emitting device 26, and the light emitting device 27 may be provided in the wearable terminal 80. Further, the wearable terminal 80 may be provided with a vibration device similar to the vibration device 25 described above.
  • the above-described sound collection devices 11, 15, and 17, the vibration detection device 12, the imaging device 13, the motion detection device 14, and the biological sensor 16 are collectively referred to as “detection device 10”.
  • the sound emitting devices 21, 24, and 26, the light emitting devices 22 and 27, the projection device 23, and the vibration device 25 are collectively referred to as the “production device 20”.
  • Each device included in the detection device 10 and the production device 20 has a communication interface and, as shown in FIG. 1, is connected to the server device 30 via the communication line 40. Each communication interface performs data communication with the server device 30 via the communication line 40.
  • The individual devices included in the detection device 10 and the production device 20 may be connected to the communication line 40 wirelessly or by wire.
  • the communication line 40 transmits data between the individual devices included in the detection device 10 and the server device 30, and between the individual devices included in the production device 20 and the server device 30.
  • the communication line 40 may be constituted by, for example, a wireless communication network and a private network such as a LAN (Local Area Network), and may further include the Internet.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of the server device 30.
  • the server device 30 includes a processor 31, a memory 32, a communication interface 33, and a storage 34. These devices are connected via a bus 35 for transmitting data.
  • the processor 31 executes various processes by reading a program into the memory 32 and executing it.
  • the processor 31 is, for example, a CPU (Central Processing Unit).
  • the memory 32 stores the programs executed by the processor 31 and includes, for example, a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the communication interface 33 performs data communication with the individual devices included in the detection device 10 and the production device 20 via the communication line 40.
  • the storage 34 stores various data and programs.
  • For the storage 34, for example, a hard disk, a flash memory, or a combination thereof may be used.
  • FIG. 6 is a diagram illustrating an example of a functional configuration of the production system 1.
  • the production system 1 functions as an acquisition means 101, a specifying means 102, a determination means 103, a generation means 104, and a production control means 105.
  • in the first embodiment, all of these functions are implemented in the server device 30. Specifically, they are realized by the processor 31 executing a program stored in the memory 32.
  • the server device 30 thus functions as a production control device.
  • the acquisition unit 101 acquires detection information indicating a physical quantity detected by the detection device 10 at a place where an event is performed.
  • This physical quantity relates to at least one of sound, vibration, video, motion, and biological information. Note that the concept of the term “acquisition” includes reception.
  • the specifying means 102 specifies the event status based on the detection information acquired by the acquisition means 101.
  • the event status includes the play performed by the players.
  • the determination means 103 determines a production effect whose content is based on the detection information acquired by the acquisition means 101. For example, the determination means 103 determines a production effect whose content is based on the detection information related to the play specified by the specifying means 102.
  • This production effect uses at least one of sound, light, vibration, and video. That is, it may be an effect perceived through at least one of hearing, sight, and touch.
  • the generation means 104 generates a sound signal based on the detection information acquired by the acquisition means 101 when an effect using sound is determined by the determination means 103.
  • the generation means 104 may generate the sound signal by performing sound processing on a sound signal.
  • This sound processing may be, for example, processing that modifies a sound signal, or processing that synthesizes two or more sound signals. The modification includes amplification.
  • the sound signal used for the sound processing may, for example, be included in the detection information or be stored in the storage 34 in advance.
  • the generation means 104 may also generate a sound signal using a sound source.
  • the production control means 105 controls the production device 20 in accordance with the production effect determined by the determination means 103 and causes the production effect to be provided.
  • the production device 20 thereby provides the production effect for the event. This control is performed, for example, by transmitting control information for controlling the production effect.
  • when a production effect using sound is provided, the control information includes the sound signal generated by the generation means 104.
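As a concrete illustration of how these five means might cooperate, the following Python sketch traces one pass from detection information to control information. It is a minimal sketch under stated assumptions, not the patented implementation: all class names, function names, and stub return values are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical container for detection information: the patent requires only
# that it indicate a detected physical quantity and identify the sending device.
@dataclass
class DetectionInfo:
    device_id: str   # identifies the detection device 10 that sent it
    kind: str        # "sound", "vibration", "video", "motion", or "bio"
    payload: object  # the detected physical quantity (e.g. a sound signal)

@dataclass
class Storage34:
    records: list = field(default_factory=list)  # accumulated detection info

def acquire(info: DetectionInfo, storage: Storage34) -> None:
    """Acquisition means 101: 'acquisition' includes reception, so this
    receives detection information and accumulates it in the storage 34."""
    storage.records.append(info)

def specify(storage: Storage34) -> str:
    """Specifying means 102: identify the event status (here, the play) by
    analyzing the accumulated detection information. A real system would
    combine sound, vibration, video, and motion analysis; this is a stub."""
    return "dribble"

def determine(play: str) -> str | None:
    """Determination means 103: look up the production effect associated
    with the identified play (the effect database 111 of FIG. 8)."""
    effect_database_111 = {"dribble": "amplify and output dribble sound"}
    return effect_database_111.get(play)

def generate(storage: Storage34) -> bytes:
    """Generation means 104: produce the sound signal for a sound effect.
    A placeholder stands in for real sound processing here."""
    return b"amplified dribble sound signal"

def control(effect: str, sound_signal: bytes) -> dict:
    """Production control means 105: build the control information sent to
    a production device 20 so that it provides the effect."""
    return {"device": "sound emitting device 21", "effect": effect,
            "sound_signal": sound_signal}
```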
  • FIG. 7 is a sequence chart showing an example of the operation of the production system 1. This operation starts at a predetermined timing, such as the start of the event.
  • In step S101, the detection device 10 detects a physical quantity at the place where the event is held.
  • For example, the plurality of sound collection devices 11 collect the players' play sounds.
  • Each sound collection device 15 collects the sounds emitted by each player, such as the player's voice and breathing.
  • Each sound collection device 17 collects the sounds emitted by each spectator, such as the spectator's voice. These sounds are all sounds generated at the event.
  • the plurality of vibration detection devices 12 detect vibrations generated by the player's play.
  • the plurality of photographing devices 13 photograph a player's play.
  • the motion detection device 14 detects the motion of the ball 60.
  • Each biological sensor 16 measures biological information of each player. These devices detect these physical quantities at predetermined time intervals or continuously. However, the timing at which these devices detect physical quantities may be different.
  • In step S102, the detection device 10 transmits detection information indicating the physical quantity detected in step S101 to the server device 30.
  • the plurality of sound collection devices 11 transmit detection information including a sound signal indicating the collected sound and identification information for uniquely identifying each sound collection device 11. This identification information is used to identify each sound collection device 11.
  • the plurality of vibration detection devices 12 transmit detection information including a vibration signal indicating the detected vibration and identification information for uniquely identifying each vibration detection device 12. This identification information is used to identify each vibration detection device 12.
  • the plurality of photographing devices 13 transmit detection information including a video signal indicating the photographed video and identification information for uniquely identifying each photographing device 13. This identification information is used to identify each imaging device 13.
  • the motion detection device 14 transmits detection information including a value indicating the detected motion. These devices may transmit detection information every time a physical quantity is detected, or may transmit detection information collectively at a predetermined timing.
  • the detection information transmitted from the detection device 10 is received by the acquisition unit 101 and stored in the storage 34. As a result, the detection information transmitted from the individual devices included in the detection device 10 is accumulated in the storage 34.
  • In step S103, the acquisition means 101 acquires progress information indicating the progress of the game. For example, when a staff member running the game inputs progress information into a terminal device (not shown), the progress information is acquired from that terminal device.
  • the progress of the game includes, for example, the elapsed time of the game, the score, and penalties imposed in the game.
  • the acquisition of the progress information may be performed at a predetermined time interval or may be performed at a predetermined timing.
  • the progress information acquired by the acquisition unit 101 is stored in the storage 34.
  • In step S104, the specifying means 102 identifies the play performed by the player by analyzing the detection information and progress information stored in the storage 34. For example, the specifying means 102 identifies the type of sound indicated by a sound signal by analyzing the sound signal included in the detection information received from each sound collection device 11. It identifies the type of vibration indicated by a vibration signal by analyzing the vibration signal included in the detection information received from each vibration detection device 12. It detects the movements of the ball 60 and the players by performing image recognition processing on the video signal included in the detection information received from each imaging device 13. It also detects the movement of the ball 60 by analyzing the values included in the detection information received from the motion detection device 14.
  • the specifying unit 102 recognizes the progress of the game indicated by the progress information.
  • the specifying means 102 identifies the play performed by the player based on the results of this analysis, that is, the type of sound, the type of vibration, the movement of the players, the movement of the ball 60, the progress of the game, or a combination of at least two of these.
  • In step S105, the determination means 103 determines a production effect corresponding to the play specified in step S104.
  • the determination of the effect may be performed with reference to the effect database 111, for example.
  • the effect database 111 may be generated in advance and stored in the storage 34, for example.
  • FIG. 8 is a diagram showing an example of the effect database 111.
  • the effect database 111 stores each play type in association with a corresponding production effect. Each effect has content based on the detection information related to the corresponding type of play. For example, when a certain play is specified in step S104, the production effect stored in association with that play type is determined.
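The following sketch models the association of FIG. 8 as a simple lookup table. The three entries are the plays discussed in the text; any further entries the patent's database might hold are omitted, and the table structure itself is an assumption.

```python
# Hypothetical rendering of the effect database 111: play type -> production
# effect whose content is based on the related detection information.
EFFECT_DATABASE_111 = {
    "dribble":    "amplify and output dribble sound",
    "shot":       "output sound effect corresponding to movement of shooter and ball 60",
    "free throw": "output a sound corresponding to the heart rate of the thrower",
}

def determine_effect(play_type: str) -> str | None:
    """Step S105: return the production effect stored in association with
    the play specified in step S104, or None when no entry exists."""
    return EFFECT_DATABASE_111.get(play_type)
```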
  • In step S106, when the production effect determined in step S105 includes an effect using sound, the generation means 104 generates a sound signal indicating the sound used for that effect. When the effect determined in step S105 does not include an effect using sound, the processing of step S106 is skipped.
  • In step S107, the production control means 105 transmits, to the production device 20, control information for controlling the production device 20 in accordance with the effect determined in step S105. For example, when a production effect using sound is determined, control information including the sound signal generated in step S106 is transmitted to at least one of the sound emitting devices 21, 24, and 26. When an effect using light is determined, control information including a light emission control signal corresponding to the effect is transmitted to at least one of the light emitting devices 22 and 27. When an effect using vibration is determined, control information including a vibration control signal indicating the vibration corresponding to the effect is transmitted to the vibration device 25. When an effect using video is determined, control information including a video signal indicating the video corresponding to the effect is transmitted to the projection device 23.
  • In step S108, the production device 20 provides the production effect based on the control information received from the server device 30.
  • the sound emitting devices 21, 24, and 26 output a sound corresponding to a sound signal included in the control information.
  • the light emitting devices 22 and 27 emit light according to a light emission control signal included in the control information.
  • the vibration device 25 vibrates according to a vibration control signal included in the control information.
  • the projection device 23 projects an image corresponding to the image signal included in the control information on the projection surface.
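The medium-to-device routing of steps S107 and S108 can be summarized as a small dispatch table; the sketch below restates it, with the device names mirroring the reference numerals in the text. The table structure is illustrative, not prescribed by the patent.

```python
# Hypothetical dispatch table for step S107: the control information for an
# effect is sent to the production devices 20 handling the effect's medium.
ROUTES = {
    "sound":     ["sound emitting device 21", "sound emitting device 24",
                  "sound emitting device 26"],
    "light":     ["light emitting device 22", "light emitting device 27"],
    "vibration": ["vibration device 25"],
    "video":     ["projection device 23"],
}

def destinations(medium: str) -> list[str]:
    """Return the production devices that should receive the control
    information for an effect using the given medium."""
    return ROUTES[medium]
```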
  • As a first example, suppose dribbling is identified by analyzing the detection information in step S104 described above. For instance, dribbling may be identified when a dribble sound is recognized by analyzing the sound signal included in the detection information received from the sound collection device 11, or when motion of the ball 60 and the player indicating a dribble is detected by performing image recognition processing on the video signal included in the detection information received from the imaging device 13. Dribbling may also be identified by combining the analysis results of these and other pieces of detection information.
  • In step S105, the production effect "amplify and output dribble sound", stored in the effect database 111 shown in FIG. 8 in association with the play type "dribble", is determined.
  • In step S106, sound processing that amplifies the sound signal indicating the dribble sound is performed.
  • For this sound processing, the sound signal indicating the dribble sound included in the detection information received from the sound collection device 11 may be used.
  • In step S107, control information including the sound signal generated in step S106 is transmitted, for example, to the sound emitting devices 21, 24, and 26.
  • In step S108, the sound emitting devices 21, 24, and 26 output the sound corresponding to the sound signal, that is, the amplified dribble sound.
  • This dribbling sound is output in real time. That is, a dribble sound is output while the player is dribbling.
  • the period during which the dribble sound is output and the period during which the dribble is performed may not be completely synchronized, and there may be a slight delay.
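As an illustration of the amplification performed in step S106, the sketch below applies a gain to a PCM sound signal and clips it to the valid sample range; the gain value and the 16-bit format are assumptions, not specified by the patent.

```python
import numpy as np

def amplify_dribble_sound(sound_signal: np.ndarray, gain: float = 4.0) -> np.ndarray:
    """Amplify the dribble sound signal received from the sound collection
    device 11 (step S106). The result is supplied to the sound emitting
    devices 21, 24, and 26 as part of the control information."""
    amplified = sound_signal.astype(np.float64) * gain
    # Clip to the range of 16-bit PCM samples so the output stays well formed.
    return np.clip(amplified, -32768, 32767).astype(np.int16)
```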
  • As a second example, suppose a shot is identified by analyzing the detection information in step S104 described above. For instance, the shot may be identified when motion of the ball 60 and the player indicating a shot is detected by performing image recognition processing on the video signal included in the detection information received from the imaging device 13. The shot may also be identified by combining this with the analysis results of other detection information.
  • In step S105, the production effect "output sound effect corresponding to movement of shooter and ball 60", stored in the effect database 111 shown in FIG. 8 in association with the play type "shot", is determined.
  • In step S106, processing that generates a sound signal indicating the sound effect 120 corresponding to the movement of the shooter and the ball 60 is performed.
  • FIG. 9 is a diagram illustrating an example of the sound effect 120 corresponding to the movement of the shooter and the ball 60.
  • the player is ready to shoot with the ball 60 in the period T11.
  • the sound effect 120 includes a sound 121 in the period T11, a sound 122 in the period T12, and a sound 123 at time t13 when the period T12 ends.
  • the sound 121 may be, for example, a sound indicating that a shooting posture has been entered.
  • the sound 122 may be, for example, a whooshing sound ("hyu") representing the ball 60 flying through the air.
  • the sound 122 may have a length corresponding to the period T12, for example.
  • the sound 123 may be, for example, a sound corresponding to the result of the shot. For example, when the shot is made, the sound 123 may be the sound of the ball 60 entering the goal, a "swish".
  • In step S107, control information including the sound signal generated in step S106 is transmitted, for example, to the sound emitting devices 21, 24, and 26.
  • In step S108, the sound effect 120 corresponding to the sound signal is output from the sound emitting devices 21, 24, and 26.
  • the output of the sound effect 120 is performed in real time. That is, the sound effect 120 is output while the player is making the shot.
  • In accordance with the movement of the shooting player, the sound 121 is output in the period T11, the sound 122 in the period T12, and the sound 123 at the time t13.
  • the output of the sound effect 120 may not be completely synchronized with the movement of the player, and there may be a slight delay.
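The timing of the sound effect 120 can be sketched as follows. In a real system the periods T11 and T12 would be driven by real-time analysis of the detection information rather than fixed delays; `emit` is a hypothetical stand-in for supplying a sound signal to the sound emitting devices.

```python
import time

def play_sound_effect_120(t11_seconds: float, t12_seconds: float, emit) -> None:
    """Sketch of FIG. 9: sound 121 while the shooter is in shooting posture
    (period T11), sound 122 while the ball 60 is in the air (period T12),
    and sound 123 at time t13 when the period T12 ends."""
    emit("sound 121")       # shooting posture entered
    time.sleep(t11_seconds)
    emit("sound 122")       # whoosh of the ball in flight; its length is T12
    time.sleep(t12_seconds)
    emit("sound 123")       # result sound at t13, e.g. a swish on a made shot

# Example usage with a trivial emitter:
# play_sound_effect_120(0.8, 1.2, emit=print)
```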
  • As a third example, suppose a free throw is identified by analyzing the detection information in step S104 described above. For instance, the free throw may be identified when the progress information indicates that a free throw is being performed. The free throw may also be identified by additionally considering the analysis results of the detection information.
  • In step S105, the production effect "output a sound corresponding to the heart rate of the thrower", stored in the effect database 111 shown in FIG. 8 in association with the play type "free throw", is determined.
  • In step S106 described above, a sound signal indicating a sound effect corresponding to the heart rate included in the detection information received from the wearable terminal 70 worn by the player performing the free throw is generated.
  • This sound effect may be, for example, a heartbeat-like "doku-doku" sound whose rhythm matches the heartbeat cycle.
  • the rhythm of this sound effect changes according to the heart rate. For example, the higher the heart rate, the faster the tempo of the sound effect may become.
  • In step S107, control information including the sound signal generated in step S106 is transmitted, for example, to the sound emitting devices 21, 24, and 26.
  • In step S108, the sound emitting devices 21, 24, and 26 output the sound effect corresponding to the sound signal, that is, a sound effect corresponding to the heart rate of the player performing the free throw.
  • This sound effect is output in real time. That is, this sound effect is output during the free throw.
  • the period in which the sound effect is output and the period in which the free throw is performed may not be completely synchronized, and there may be a slight delay.
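The heart-rate-to-tempo mapping can be captured in one line: the interval between successive heartbeat sounds is the heartbeat cycle itself, so a higher heart rate yields a faster tempo. The function name is hypothetical.

```python
def heartbeat_interval_seconds(heart_rate_bpm: float) -> float:
    """Interval between successive "doku-doku" heartbeat sounds, matching
    the heartbeat cycle: e.g. 120 bpm gives one sound every 0.5 seconds."""
    return 60.0 / heart_rate_bpm
```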
  • As described above, in the first embodiment, a production effect whose content is based on the detection information acquired from the detection device 10 is provided.
  • the realistic sensation of the event can thereby be improved.
  • this effect has content corresponding to the player's situation, such as the player's play and state, and can be felt through various senses such as hearing, touch, and sight.
  • since the spectators can feel the player's situation more strongly, the connection between the players and the spectators is strengthened.
  • In the second embodiment, the determination means 103 determines a production effect whose content is based on the detection information acquired by the acquisition means 101 and which corresponds to the situation of the spectators. For example, the determination means 103 analyzes the sound signals included in the detection information received from the plurality of sound collection devices 17 and calculates the total volume of the spectators' voices indicated by these signals. The determination means 103 then determines a production effect of emitting light with an emission amount corresponding to the calculated total volume. In this case, the emission amount may, for example, increase as the total volume increases.
  • the production control means 105 controls the production device 20 in accordance with the production effect determined by the determination means 103 and causes the production effect to be provided.
  • For example, the production control means 105 transmits control information including a light emission control signal indicating the emission amount corresponding to the total volume to the light emitting devices 22 and 27.
  • the light emitting devices 22 and 27 emit light according to the light emission control signal included in the control information, that is, with the emission amount indicated by the signal.
  • the emission amount of the light emitting devices 22 and 27 thus increases as the volume of the spectators' voices increases, so the players can sense the level of the spectators' cheering from the amount of light emitted by the light emitting devices 22 and 27.
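A minimal sketch of this second-embodiment mapping, assuming a linear relation between the summed voice volume and the light emission amount; the normalization constants are illustrative.

```python
def light_emission_amount(spectator_volumes: list[float],
                          volume_ceiling: float = 100.0,
                          max_emission: int = 255) -> int:
    """Map the total volume of the spectators' voices (from the sound
    collection devices 17) to an emission amount for the light emitting
    devices 22 and 27: more cheering yields more light."""
    total = sum(spectator_volumes)
    ceiling = volume_ceiling * max(len(spectator_volumes), 1)
    level = min(total / ceiling, 1.0)  # normalize to [0, 1]
    return int(level * max_emission)   # e.g. a PWM duty value for the LEDs
```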
  • In the third embodiment, the spectators are divided into a plurality of groups.
  • For example, the spectators may be divided into supporters and general spectators.
  • Alternatively, the spectators may be divided depending on their location within the spectator seat area 3.
  • the determination means 103 selects, from the plurality of groups, a reference group on which the production effect is based and a target group to which the production effect is directed.
  • For example, the reference group may be the group to which the supporters belong, and the target group may be the group to which the general spectators belong.
  • the reference group and the target group may be selected based on the sum of the audio volume of the audience of each group.
  • For example, the determination means 103 analyzes the sound signals included in the detection information received from the plurality of sound collection devices 17 and calculates the total voice volume of each group's spectators indicated by these signals.
  • the determination means 103 may then select the group with the largest calculated total volume as the reference group and the group with the smallest as the target group.
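The selection rule just described reduces to two dictionary lookups, as in the sketch below (the group names in the example are hypothetical).

```python
def select_reference_and_target(group_volumes: dict[str, float]) -> tuple[str, str]:
    """Select the group with the largest summed voice volume as the
    reference group (source of the effect) and the group with the smallest
    as the target group (destination)."""
    reference = max(group_volumes, key=group_volumes.get)
    target = min(group_volumes, key=group_volumes.get)
    return reference, target

# Example: supporters cheering loudly become the reference group.
# select_reference_and_target({"supporters": 92.0, "general": 31.0})
# -> ("supporters", "general")
```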
  • the determination means 103 determines a production effect whose content is based on the detection information acquired from the detection devices 10 corresponding to the reference group, among the detection information acquired by the acquisition means 101. For example, this effect may be outputting the voices of the reference group's spectators to the spectators of the target group.
  • In this case, the generation means 104 generates a sound signal based on the detection information acquired from the detection devices 10 corresponding to the reference group.
  • For this sound signal, the sound signal included in the detection information received from the sound collection devices 17 corresponding to the reference group may be used as it is, or a sound signal subjected to sound processing may be used.
  • the sound collection devices 17 corresponding to the reference group are those that collect the voices of the reference group's spectators. These sound collection devices 17 are identified by the identification information included in the detection information.
  • the production control means 105 selects the production devices 20 corresponding to the target group. For example, when the voices of the reference group are output toward the target group, the sound emitting devices 21, 24, and 26 corresponding to the target group are selected from among the plurality of sound emitting devices 21, 24, and 26.
  • the sound emitting devices 21, 24, and 26 corresponding to the target group are the sound emitting devices 21, 24, and 26 that output sound toward the audience of the target group.
  • the sound emitting device 24 corresponding to the target group is the sound emitting device 24 provided in the spectator seat 50 where the audience of the target group sits.
  • the sound emitting device 26 corresponding to the target group is the sound emitting device 26 of the wearable terminal 80 attached to the audience of the target group.
  • Such sound emitting devices 21, 24, and 26 are identified by identification information included in the detection information.
  • the production control means 105 controls the selected production devices 20 in accordance with the production effect determined by the determination means 103 and causes them to provide the production effect. For example, the production control means 105 transmits control information including the sound signal generated by the generation means 104 to the selected sound emitting devices 21, 24, and 26. Upon receiving this control information, these sound emitting devices output the sound corresponding to the included sound signal. The voices of the reference group's spectators are thereby output to the spectators of the target group.
  • the present invention is not limited to the above-described embodiments. At least two of the first to third embodiments described above may be implemented in combination. Various modifications may be made to the above-described embodiments. In addition, at least two of the following modifications may be implemented in combination.
  • the effect may be determined in consideration of the progress of the game.
  • the determination unit 103 determines the effect based on the detection information and the progress information acquired by the acquisition unit 101.
  • the determination means 103 may determine different production effects depending on the progress of the game indicated by the progress information. For example, when a shot is specified by the specifying means 102, normally the effect "output sound effect corresponding to the movement of the shooter and the ball 60" described in the first embodiment is determined.
  • However, when the game is in a predetermined state, a special production effect may be determined instead. This predetermined state may be, for example, a state in which the game is at a climax.
  • For example, the predetermined state may be a state in which the lead has changed.
  • The predetermined state may also be a state in which the lead changed within a predetermined time before the end of the game.
  • A predetermined state may also be defined for each team. This predetermined state may be determined based on, for example, a jinx associated with the team.
  • the special production effect differs from the normal production effect.
  • the special production effect may be obtained by processing the normal production effect, or by adding an effect using another medium to the normal production effect.
  • For example, when the normal production effect is one using sound, the special production effect may output that sound after applying sound processing to it, or may add an effect using at least one of light, vibration, and video. According to this modification, even when the same play occurs in the game, different production effects can be provided depending on the progress of the game.
  • the progress of the game described above may be specified based on rule information indicating the rules of the game in addition to the progress information.
  • This rule information may be stored, for example, in the effect database 111, in the storage 34 of the server device 30, or in an external storage device.
  • the progress of the game may be, for example, the remaining time of the game. For example, when the progress information includes the elapsed time and the rule information includes the match time, the remaining time of the game is calculated by subtracting the elapsed time from the match time. This elapsed time may be, for example, the time measured by the game's referee.
  • the staff operating the game may input the elapsed time measured by the referee to a terminal device (not shown), and the progress information including the elapsed time may be acquired from the terminal device.
  • A production effect whose content corresponds to the remaining time of the game may then be provided.
  • For example, the parameters of the sound processing applied to the sound signal may change according to the remaining time of the game.
  • For example, the degree of amplification may change according to the remaining time of the game. A production effect matched to the remaining time of the game can thereby be provided.
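The remaining-time calculation and a time-dependent amplification parameter might look as follows; the subtraction comes from the text, while the linear gain curve and its constants are illustrative assumptions.

```python
def remaining_time_seconds(match_time_s: float, elapsed_s: float) -> float:
    """Remaining time = match time (from the rule information) minus the
    elapsed time (from the progress information), floored at zero."""
    return max(match_time_s - elapsed_s, 0.0)

def time_dependent_gain(remaining_s: float, match_time_s: float,
                        base_gain: float = 2.0, extra_gain: float = 4.0) -> float:
    """A sound-processing parameter that changes with the remaining time:
    here the degree of amplification grows linearly as the game nears its
    end, so late-game plays sound more dramatic."""
    progress = 1.0 - remaining_s / match_time_s  # 0.0 at tip-off, 1.0 at the end
    return base_gain + extra_gain * progress
```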
  • In each of the embodiments described above, different production effects may be provided for each spectator or spectator group.
  • In this case, the determination means 103 determines a plurality of different production effects for the plurality of spectators or spectator groups.
  • the production control means 105 controls the plurality of production devices 20 corresponding to the plurality of spectators or spectator groups and causes each production device 20 to provide its corresponding production effect.
  • the effect device 20 corresponding to the audience or the audience group is an effect device 20 that provides an effect to the audience or the audience group.
  • For example, different production effects may be provided to the spectators supporting one team and the spectators supporting the other team.
  • For example, a first sound effect corresponding to the movement of the shooter and the ball 60 may be output from the corresponding sound emitting devices 21, 24, and 26 to the spectators supporting one team.
  • the first sound effect may be, for example, a sound that excites the spectators.
  • a second sound effect corresponding to the movement of the shooter and the ball 60 may be output from the corresponding sound emitting devices 21, 24, and 26 to the spectator who supports the other team.
  • the second sound effect differs from the first sound effect and may be, for example, a sound that keeps the spectators calm. According to this modification, a production effect suited to the team each spectator supports can be provided.
  • Different production effects may also be provided depending on the location within the spectator seat area 3.
  • For example, assume the spectator seat area 3 includes a passionate zone with spectators who cheer enthusiastically, an analysis zone with spectators who like to analyze the game, a beginner zone with spectators watching a game for the first time, and a sensation zone with spectators who want to experience the players' play more intensely.
  • these zones may be separated by distance from the court 4 or by their arrangement within the spectator seat area 3 (outfield seats, second-floor seats, etc.).
  • presentation effects corresponding to the attributes of the audience may be provided to the audience in each zone.
  • For example, the spectators in the passionate zone may be provided, by the corresponding production devices 20, with a lavish production effect that heightens the cheering.
  • the spectators in the sensation zone may be provided, by the corresponding production devices 20, with an effect that lets them experience the players' play, for example an effect using vibration.
  • According to this modification, since a plurality of production effects are provided within the spectator seat area 3, the ways of enjoying watching the event can be varied. Further, when the spectator seat area 3 is divided according to spectator attributes, a production effect matching those attributes can be provided.
  • In each of the embodiments described above, a specific spectator may be provided with a production effect different from that of the other spectators.
  • For example, a spectator who supports the player who made a shot may be provided with an effect different from that of the other spectators.
  • In this case, each spectator registers in advance the players they support.
  • Registration information indicating the registered content is stored in advance. Based on this registration information, the spectators supporting the player who made the shot are identified.
  • a normal presentation effect may be provided to the audience other than the identified audience by the corresponding presentation device 20.
  • a special effect may be provided to the specified audience by the corresponding effect device 20.
  • Like the special production effect described above, this special production effect differs from the normal production effect. According to this modification, a production effect suited to each spectator can be provided.
  • Spectators in a predetermined state may also be provided with a production effect different from that of the other spectators.
  • This predetermined state may be, for example, a state of standing up from the spectator seat 50.
  • In this case, each spectator seat 50 is provided with a seating sensor.
  • the seating sensor detects whether a spectator is seated in the spectator seat 50 and transmits detection information indicating the result to the server device 30. Based on this detection information, it is determined whether each spectator is seated. For example, a normal production effect may be provided, by the corresponding production devices 20, to spectators seated in their spectator seats 50. On the other hand, a spectator who has stood up from the spectator seat 50 is considered to be in an excited state, so a special production effect may be provided by the corresponding production devices 20.
  • Like the special production effect described above, this special production effect differs from the normal production effect.
  • Note that a spectator who has stood up from the spectator seat 50 cannot be given vibration, so a production effect using a medium other than vibration may be provided. According to this modification, a production effect suited to the spectator's state can be provided.
  • Modification 3: In each of the embodiments described above, different production effects may be provided depending on the player involved in the event status specified by the specifying means 102. For example, when a player substitution occurs in the game, a predetermined effect may be provided for the newly entered player. Different production effects may also be provided depending on the popularity of the player involved in the specified event status. For example, when a highly popular ace player enters mid-game, a production effect different from that for other players may be provided. This effect may be, for example, the special production effect described above.
  • In each of the embodiments described above, the production effect is not limited to one using sound. The production effect may use at least one of sound, light, vibration, and images, where images include both still images and moving images.
  • For example, in the dribble example, the vibration device 25 may vibrate and the light emitting devices 22 and 27 may emit light in accordance with the dribble sound.
  • This vibration and light emission may correspond to the dribble sound.
  • a vibration control signal based on a sound signal indicating a dribble sound may be generated and provided to the vibration device 25.
  • a light emission control signal based on a sound signal indicating a dribble sound may be generated and provided to the light emitting devices 22 and 27.
  • the vibration device 25 may vibrate and the light emitting devices 22 and 27 may emit light according to the sound effect corresponding to the heart rate.
  • This vibration and light emission may correspond to the heart rate.
  • a vibration control signal based on the heart rate may be generated and provided to the vibration device 25.
  • a light emission control signal based on the heart rate may be generated and provided to the light emitting devices 22 and 27.
  • In the free-throw example, information about the player performing the free throw may be projected by the projection device 23.
  • This player information may include, for example, the player's jersey number and face image.
  • Images of different colors may be projected depending on the player's state as indicated by the heart rate. For example, when the player is in an excited state, a predominantly red image may be projected; otherwise, a predominantly blue image may be projected.
  • An image showing the content of the spectators' cheering may also be displayed.
  • For example, this image may be displayed on equipment used in the game.
  • This equipment may be, for example, a goal used in the basketball game.
  • In this case, the production device 20 includes a display device 28.
  • FIG. 10 is a diagram showing an example of the display device 28 according to this modification.
  • the display device 28 is provided integrally with the backboard of the basketball goal.
  • the display device 28 displays an image.
  • the display device 28 may be, for example, a transparent organic EL (Electro-Luminescence) display.
  • In this case, speech recognition processing is performed on the sound signals included in the detection information received from the plurality of sound collection devices 17, and an image signal indicating an image corresponding to the recognition result is generated. For example, when the content of a spectator's voice identified by the speech recognition processing is "Make that shot!", an image signal indicating an image containing that content is generated.
  • the effect control unit 105 transmits control information including the generated image signal to the display device 28.
  • the display device 28 displays the image corresponding to the image signal included in the control information. The players can thereby see the spectators' cheering.
  • The spectators do not necessarily need to be at the place where the event is held.
  • For example, a spectator may watch the event remotely.
  • In this case, the spectator is in a place different from where the event is held, such as a home or a live viewing venue.
  • This other place is also a place related to the event.
  • the video imaged by the imaging device 13 is distributed to another place via the communication line 40.
  • the communication line 40 may be configured including the Internet, for example.
  • the spectator views the distributed video using a terminal device.
  • the remote spectator may wear the wearable terminal 80 in the same manner as spectators at the event venue.
  • the wearable terminal 70 may include a motion detection device that detects the motion of the player.
  • This motion detection device may be, for example, at least one of an acceleration sensor, a gyro sensor, and a magnetic sensor.
  • A player's movement may also be detected using a non-contact radar.
  • That is, the motion detection device that detects the player's motion may be realized by cooperation between the imaging device 13 and the specifying means 102 described above, by various sensors provided on the wearable terminal 70, or by a non-contact radar.
  • Modification 8 In each embodiment described above, an effect of displaying a virtual object using AR (Augmented Reality) may be performed.
  • a display is provided on the wearable terminal 80.
  • a virtual object corresponding to the event situation is displayed using AR.
  • The performer of the event is not limited to an athlete. When the event is something other than a sports game, a person who gives some kind of performance at the event corresponds to the player.
  • an object other than the ball 60 may be used in the event.
  • the motion detection device 14 described above may be provided for an object used in the event.
  • Modification 10: The entity that implements the functions of the production system 1 is not limited to the examples described in the above embodiments.
  • For example, some of the functions implemented by the server device 30 may instead be implemented by the detection device 10, the production device 20, or another device.
  • For example, the specifying means 102 described above may be implemented in the detection device 10. In this case, the play identification described above is performed by the detection device 10.
  • Modification 11: The processing steps performed in the production system 1 are not limited to the examples described in the above embodiments. The steps may be reordered as long as no contradiction arises.
  • The present invention may also be provided as a production method including the processing steps performed in the production system 1.
  • The present invention may also be provided as a program executed in the detection device 10, the production device 20, or the server device 30.
  • This program may be downloaded via a communication line such as the Internet.
  • The program may also be provided recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disc, etc.), a magneto-optical recording medium, or a semiconductor memory.
  • In each of the embodiments described above, a production effect corresponding to a replay may be provided.
  • For example, suppose a large screen is installed at the venue where the game is held, and characteristic scenes of the game are replayed on the screen.
  • For example, immediately after a notable play or a penalty, that scene may be replayed.
  • In this case, the video captured by the imaging device 13 is recorded, and the video corresponding to the period of the scene is displayed on the screen.
  • At this time, a production effect corresponding to the replay video is provided.
  • detection information indicating a physical quantity detected by the detection device 10 is stored.
  • This detection information may be stored in the storage 34 of the server device 30, or may be stored in the detection device 10 or an external storage device.
  • the determination means 103 determines a production effect whose content is based on the detection information corresponding to the period of the scene, from among the stored detection information. This effect is provided in synchronization with the replay video. A more powerful replay can thereby be realized.
  • 1: production system, 10: detection device, 11, 15, 17: sound collection device, 13: imaging device, 16: biological sensor, 20: production device, 21, 24, 26: sound emitting device, 30: server device, 101: acquisition means, 102: specifying means, 103: determination means, 104: generation means, 105: production control means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Details Of Audible-Bandwidth Transducers (AREA)
  • Circuit For Audible Band Transducer (AREA)
  • User Interface Of Digital Computer (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

In the present invention, an acquisition means acquires detection information indicating a physical quantity detected by a detection device at a location related to an event. A determination means determines a production effect whose content is based on the detection information acquired by the acquisition means. A production control means controls a production device in accordance with the production effect determined by the determination means and causes the production effect to be provided.

Description

Production control device, production system, and program
The present invention relates to a technique for staging an event.
Techniques for improving the satisfaction of event spectators are known. For example, Patent Document 1 describes a technique in which a video stream obtained from a camera capturing a live sports event is transmitted to spectators' portable electronic devices and displayed on their screens.
Patent Document 1: Japanese National Publication of International Patent Application No. 2009-503922
With the technique described in Patent Document 1, however, spectators can only watch the players' situation on video, which lacks a sense of presence. Separately, in recent years, effects using media such as sound, light, and video have sometimes been staged to liven up events. Such conventional effects, however, are staged independently of the players' play and the spectators' reactions at the event, and nothing about them is designed to heighten the event's sense of presence.
An object of the present invention is to provide production effects that heighten the sense of presence of an event.
The present invention provides a production control device including: an acquisition unit that acquires detection information indicating a physical quantity detected by a detection device at a place related to an event; a determination unit that determines a production effect whose content is based on the detection information acquired by the acquisition unit; and a production control unit that controls a production device in accordance with the production effect determined by the determination unit and causes the production effect to be provided.
The production control device may further include a specifying unit that specifies the status of the event based on the detection information acquired by the acquisition unit, and the determination unit may determine a production effect corresponding to the event status specified by the specifying unit.
The event status may include a play performed by a player in the event.
The production control device may further include a generation unit that generates a sound signal based on the detection information when the determination unit determines a production effect using sound, and the production control unit may supply the sound signal generated by the generation unit to a speaker and cause the speaker to output a sound corresponding to the sound signal.
The detection device may include a microphone that collects sound emitted at the event, the detection information may include a sound signal indicating the sound collected by the microphone, the generation unit may apply audio processing to the sound signal included in the detection information, and the production control unit may supply the processed sound signal to the speaker.
The detection device may include a motion detection device that detects a motion of a player in the event, the detection information may indicate the motion detected by the motion detection device, and the generation unit may generate a sound signal corresponding to the motion indicated by the detection information.
The detection device may include a biological sensor that measures biological information of a player in the event, the detection information may include the biological information measured by the biological sensor, and the generation unit may generate a sound signal corresponding to the biological information included in the detection information.
The acquisition unit may further acquire progress information indicating the progress of the event, and the determination unit may determine the production effect based on the detection information and the progress information acquired by the acquisition unit.
The determination unit may determine a plurality of mutually different production effects for a plurality of spectators or a plurality of spectator groups watching the event, and the production control unit may control a plurality of production devices corresponding to the plurality of spectators or spectator groups and cause each of the production devices to provide, among the plurality of production effects, the production effect determined for the spectator or spectator group to which that production device corresponds.
The production effect may be an effect using sound, light, vibration, or an image.
The present invention also provides a production system including: a detection device that detects a physical quantity at a place related to an event; a production device that provides production effects for the event; an acquisition unit that acquires detection information indicating the physical quantity detected by the detection device; a determination unit that determines a production effect whose content is based on the detection information acquired by the acquisition unit; and a production control unit that controls the production device in accordance with the production effect determined by the determination unit and causes the production effect to be provided.
The present invention further provides a program for causing a computer to execute: a step of acquiring detection information indicating a physical quantity detected by a detection device at a place related to an event; a step of determining a production effect whose content is based on the acquired detection information; and a step of controlling a production device in accordance with the determined production effect and causing the production effect to be provided.
According to the present invention, it is possible to provide production effects that heighten the sense of presence of an event.
FIG. 1 is a diagram showing an example of the configuration of the production system 1 according to the first embodiment.
FIG. 2 is a diagram showing an example of a spectator seat 50.
FIG. 3 is a diagram showing an example of a ball 60 used in a basketball game and a wearable terminal 70 worn by a player.
FIG. 4 is a diagram showing an example of a wearable terminal 80 worn by a spectator.
FIG. 5 is a diagram showing an example of the hardware configuration of the server device 30.
FIG. 6 is a diagram showing an example of the functional configuration of the production system 1.
FIG. 7 is a sequence chart showing an example of the operation of the production system 1.
FIG. 8 is a diagram showing an example of the production database 111.
FIG. 9 is a diagram showing an example of the sound effect 120 corresponding to the movements of the shooter and the ball 60.
FIG. 10 is a diagram showing an example of a display device 28 according to a modification.
First Embodiment: Configuration
FIG. 1 is a diagram showing an example of the configuration of the production system 1 according to the first embodiment. The production system 1 stages production effects for an event. The event may be, for example, any of various sports games, a concert, or a play. Here, it is assumed that the event is a basketball game.
The place where the event is held consists of a field 2 and a spectator seat area 3. The field 2 includes a court 4, on which the players play a basketball game. A plurality of spectator seats 50 are provided in the spectator seat area 3, where the spectators watch the game.
At the place where the event is held, a plurality of sound collection devices 11, a plurality of vibration detection devices 12, a plurality of imaging devices 13, a plurality of sound emission devices 21, a plurality of light emitting devices 22, and a projection device 23 are provided. The numbers and positions of these devices shown in FIG. 1 are examples, and the devices are not limited to the numbers and positions illustrated in FIG. 1.
The plurality of sound collection devices 11 are provided at positions where the sounds of the players' play can be collected. For example, they may be provided around the court 4 in the field 2, or embedded in the court 4. Each sound collection device 11 collects sound at predetermined time intervals and has, for example, a microphone.
The plurality of vibration detection devices 12 are provided at positions where vibrations generated by the players' play can be detected. For example, they may be embedded in the court 4. Each vibration detection device 12 detects vibration at predetermined intervals and has, for example, a vibration sensor.
The plurality of imaging devices 13 are provided at positions from which the players' play can be captured. Each imaging device 13 captures video and has, for example, a digital video camera.
The plurality of sound emission devices 21 are provided at positions where the sound they output can be heard by the spectators. For example, they may be provided in the aisles of the spectator seat area 3 or between the spectator seats 50. Each sound emission device 21 outputs sound corresponding to a sound signal and has, for example, a speaker, which may be a directional speaker.
The plurality of light emitting devices 22 are provided at positions where the light they emit is visible to the players, the spectators, or both. For example, they may be provided in the aisles of the spectator seat area 3 or between the spectator seats 50. Each light emitting device 22 emits light according to a light emission control signal and has, for example, an LED (Light Emitting Diode).
The projection device 23 is provided at a position from which it can project video where the players, the spectators, or both can see it. The projection device 23 projects video corresponding to a video signal onto a projection surface and has, for example, a projector. The projection surface may be, for example, the field 2, the court 4, or the ceiling.
FIG. 2 is a diagram showing an example of a spectator seat 50. The spectator seat 50 is provided with a sound emission device 24 similar to the sound emission device 21 described above and a vibration device 25. The spectator seat 50 need not be provided with all of these devices; for example, only one of the sound emission device 24 and the vibration device 25 may be provided. These devices also need not be provided in all the spectator seats 50; for example, they may be provided in only some of the seats.
The vibration device 25 is provided at a position where it can transmit vibration to the spectator, for example under the seat surface of the spectator seat 50. The vibration device 25 vibrates according to a vibration control signal and has, for example, a vibration motor.
FIG. 3 is a diagram showing an example of a ball 60 used in a basketball game and a wearable terminal 70 worn by a player. The ball 60 incorporates a motion detection device 14, which detects the motion of the ball 60 and has, for example, an acceleration sensor.
The wearable terminal 70 is, for example, a wristwatch-type terminal worn by a player, although it may have any other shape a player can wear. The wearable terminal 70 has a sound collection device 15 similar to the sound collection device 11 described above and a biological sensor 16. The wearable terminal 70 need not be provided with both of these devices; for example, only one of the sound collection device 15 and the biological sensor 16 may be provided.
The biological sensor 16 measures the player's biological information at predetermined time intervals. This biological information may be, for example, heart rate, pulse rate, body temperature, or blood pressure. When the biological information is a heart rate, for example, the biological sensor 16 has a heart rate monitor.
FIG. 4 is a diagram showing an example of a wearable terminal 80 worn by a spectator. The wearable terminal 80 is, for example, a wristwatch-type terminal worn by a spectator, although it may have any other shape a spectator can wear. The wearable terminal 80 has a sound collection device 17 similar to the sound collection device 11 described above, a sound emission device 26 similar to the sound emission device 21 described above, and a light emitting device 27 similar to the light emitting device 22 described above. The wearable terminal 80 need not be provided with all of these devices; for example, only one or two of the sound collection device 17, the sound emission device 26, and the light emitting device 27 may be provided. The wearable terminal 80 may also be provided with a vibration device similar to the vibration device 25 described above.
In the following description, the sound collection devices 11, 15, and 17, the vibration detection devices 12, the imaging devices 13, the motion detection device 14, and the biological sensors 16 described above are collectively referred to as the "detection device 10". The sound emission devices 21, 24, and 26, the light emitting devices 22 and 27, the projection device 23, and the vibration devices 25 are collectively referred to as the "production device 20".
Each of the individual devices included in the detection device 10 and the production device 20 has a communication interface and, as shown in FIG. 1, is connected to the server device 30 via a communication line 40. Each communication interface performs data communication with the server device 30 via the communication line 40. The individual devices may be connected to the communication line 40 wirelessly or by wire. The communication line 40 transmits data between the individual devices of the detection device 10 and the server device 30, and between the individual devices of the production device 20 and the server device 30. The communication line 40 may consist of, for example, a wireless communication network and a private network such as a LAN (Local Area Network), and may further include the Internet.
FIG. 5 is a diagram showing an example of the hardware configuration of the server device 30. The server device 30 has a processor 31, a memory 32, a communication interface 33, and a storage 34, which are connected via a bus 35 that transmits data.
The processor 31 executes various processes by loading programs into the memory 32 and executing them; for example, a CPU (Central Processing Unit) may be used as the processor 31. The memory 32 stores the programs executed by the processor 31; for example, a ROM (Read Only Memory), a RAM (Random Access Memory), or a combination thereof may be used as the memory 32. The communication interface 33 performs data communication via the communication line 40 with the individual devices included in the detection device 10 and the production device 20. The storage 34 stores various data and programs; for example, a hard disk, a flash memory, or a combination thereof may be used as the storage 34.
FIG. 6 is a diagram showing an example of the functional configuration of the production system 1. The production system 1 functions as an acquisition unit 101, a specifying unit 102, a determination unit 103, a generation unit 104, and a production control unit 105. In this example, all of these functions are implemented in the server device 30; specifically, they are realized by the cooperation of a program stored in the memory 32 and the processor 31 that executes it. In this case, the server device 30 functions as a production control device.
The acquisition unit 101 acquires detection information indicating the physical quantities detected by the detection device 10 at the place where the event is held. These physical quantities relate to at least one of sound, vibration, video, motion, and biological information. The concept of "acquisition" here includes reception.
The specifying unit 102 specifies the status of the event based on the detection information acquired by the acquisition unit 101. The event status includes plays performed by the players.
The determination unit 103 determines a production effect whose content is based on the detection information acquired by the acquisition unit 101. For example, the determination unit 103 determines a production effect whose content is based on the detection information related to the play specified by the specifying unit 102. The production effect uses at least one of sound, light, vibration, and video; that is, it may be an effect perceived through at least one of hearing, sight, and touch.
The generation unit 104 generates a sound signal based on the detection information acquired by the acquisition unit 101 when the determination unit 103 determines a production effect using sound. For example, the generation unit 104 may generate the sound signal by applying audio processing to a sound signal. This audio processing may be, for example, processing that transforms a sound signal, including amplification, or processing that synthesizes two or more sound signals. The sound signal used in this audio processing may be, for example, one included in the detection information or one stored in advance in the storage 34. As another example, the generation unit 104 may generate a sound signal using a sound source.
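The following is a minimal sketch of the two kinds of audio processing mentioned here, assuming a sound signal is represented as a sequence of float samples; the function names `amplify` and `mix` are illustrative, not taken from the embodiment.

```python
from typing import List

def amplify(signal: List[float], gain: float) -> List[float]:
    """Transformation of a single sound signal: scale every sample by a gain."""
    return [sample * gain for sample in signal]

def mix(a: List[float], b: List[float]) -> List[float]:
    """Synthesis of two sound signals: add them sample by sample."""
    length = min(len(a), len(b))
    return [a[i] + b[i] for i in range(length)]

# e.g. boost a collected signal and blend it with a prerecorded one from storage
boosted = amplify([0.1, -0.2, 0.3], gain=3.0)
blended = mix(boosted, [0.05, 0.05, 0.05])
```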
The production control unit 105 controls the production device 20 in accordance with the production effect determined by the determination unit 103 and causes it to provide the production effect. The production device 20 provides production effects for the event. This control is performed, for example, by transmitting control information instructing the device to provide the production effect. When the determination unit 103 determines a production effect using sound, this control information includes the sound signal generated by the generation unit 104.
Operation
FIG. 7 is a sequence chart showing an example of the operation of the production system 1. This operation starts at a predetermined timing, such as the start of the event.
In step S101, the detection device 10 detects physical quantities at the place where the event is held. For example, the plurality of sound collection devices 11 collect the sounds of the players' play. Each sound collection device 15 collects the sounds made by the player wearing it, such as the player's voice and breathing. Each sound collection device 17 collects the sounds made by the spectator wearing it, such as the spectator's voice. All of these are sounds produced at the event. The plurality of vibration detection devices 12 detect the vibrations generated by the players' play. The plurality of imaging devices 13 capture the players' play. The motion detection device 14 detects the motion of the ball 60. Each biological sensor 16 measures the biological information of a player. These devices detect the physical quantities at predetermined time intervals or continuously, although the timings at which they do so may differ.
In step S102, the detection device 10 transmits detection information indicating the physical quantities detected in step S101 to the server device 30. For example, each sound collection device 11 transmits detection information including a sound signal indicating the collected sound and identification information that uniquely identifies that sound collection device 11; the identification information is used to identify each sound collection device 11. The same applies to the sound collection devices 15 and 17. Each vibration detection device 12 transmits detection information including a vibration signal indicating the detected vibration and identification information that uniquely identifies that vibration detection device 12. Each imaging device 13 transmits detection information including a video signal indicating the captured video and identification information that uniquely identifies that imaging device 13. The motion detection device 14 transmits detection information including values indicating the detected motion. These devices may transmit detection information each time they detect a physical quantity, or may transmit it in batches at predetermined timings.
The detection information transmitted from the detection device 10 is received by the acquisition unit 101 and stored in the storage 34. The storage 34 thus accumulates the detection information transmitted from the individual devices included in the detection device 10.
In step S103, the acquisition unit 101 acquires progress information indicating the progress of the game. For example, when a staff member operating the game enters progress information into a terminal device (not shown), the progress information is acquired from that terminal device. The progress of the game includes, for example, the elapsed time of the game, the score, and penalties imposed during the game. The progress information may be acquired at predetermined time intervals or at predetermined timings, and is stored in the storage 34.
In step S104, the specifying unit 102 identifies the play performed by a player by analyzing the detection information and progress information stored in the storage 34. For example, the specifying unit 102 identifies the type of sound indicated by the sound signal included in the detection information received from each sound collection device 11 by analyzing that signal, and identifies the type of vibration indicated by the vibration signal received from each vibration detection device 12 by analyzing that signal. The specifying unit 102 also detects the motion of the ball 60 and of the players by applying image recognition processing to the video signals received from the imaging devices 13, detects the motion of the ball 60 by analyzing the values received from the motion detection device 14, and recognizes the progress of the game indicated by the progress information. The specifying unit 102 then identifies the play performed by the player based on the results of these analyses, that is, the sound type, the vibration type, the players' motion, the motion of the ball 60, the progress of the game, or a combination of at least two of these, as sketched below.
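A minimal sketch of how such analysis results might be fused into a play label follows; the inputs stand in for the outcomes of the sound, vibration, image, and progress analyses, and the rule set shown is an assumption for illustration only.

```python
def identify_play(sound_type: str,
                  vibration_type: str,
                  ball_in_air: bool,
                  free_throw_in_progress: bool) -> str:
    """Combine per-signal analysis results (step S104) into one play label."""
    if free_throw_in_progress:        # progress information takes precedence
        return "free throw"
    if ball_in_air:                   # image recognition saw the ball leave a hand
        return "shoot"
    if "dribble" in (sound_type, vibration_type):
        return "dribble"
    return "unknown"

print(identify_play("dribble", "none", ball_in_air=False,
                    free_throw_in_progress=False))  # dribble
```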
In step S105, a production effect corresponding to the play identified in step S104 is determined. This determination may be made, for example, by referring to the production database 111, which may be generated in advance and stored in the storage 34.
FIG. 8 is a diagram showing an example of the production database 111. The production database 111 stores play types in association with the production effects corresponding to those play types. Each production effect has content based on the detection information related to the corresponding type of play. For example, when a certain play is identified in step S104, the production effect stored in association with that play type is determined (see the sketch below).
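As a rough stand-in for the production database 111, the mapping below pairs the play types discussed later in this embodiment with the effects FIG. 8 associates with them; the dictionary form and the lookup function are assumptions of this sketch, not the patent's data structure.

```python
from typing import Optional

# play type -> production effect determined in step S105
PRODUCTION_DATABASE = {
    "dribble": "amplify and output the dribble sound",
    "shoot": "output sound effects corresponding to the movements of "
             "the shooter and the ball 60",
    "free throw": "output a sound corresponding to the thrower's heart rate",
}

def determine_effect(play_type: str) -> Optional[str]:
    """Step S105: look up the effect stored for the identified play type."""
    return PRODUCTION_DATABASE.get(play_type)

print(determine_effect("shoot"))
```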
In step S106, when the production effect determined in step S105 includes an effect using sound, the generation unit 104 generates a sound signal indicating the sound used in that effect. When the determined production effect includes no effect using sound, step S106 is skipped.
In step S107, the production control unit 105 transmits to the production device 20 control information for controlling it in accordance with the production effect determined in step S105. For example, when a production effect using sound has been determined, control information including the sound signal generated in step S106 is transmitted to at least one of the sound emission devices 21, 24, and 26. When an effect using light has been determined, control information including a light emission control signal corresponding to the effect is transmitted to at least one of the light emitting devices 22 and 27. When an effect using vibration has been determined, control information including a vibration signal indicating the corresponding vibration is transmitted to the vibration device 25. When an effect using video has been determined, control information including a video signal indicating the corresponding video is transmitted to the projection device 23.
In step S108, the production device 20 provides the production effect based on the control information received from the server device 30. For example, the sound emission devices 21, 24, and 26 output sound corresponding to the sound signal included in the control information; the light emitting devices 22 and 27 emit light according to the light emission control signal; the vibration device 25 vibrates according to the vibration control signal; and the projection device 23 projects video corresponding to the video signal onto the projection surface.
When a Player Dribbles
When a player dribbles, the dribble is identified in step S104 described above by analyzing the detection information. For example, the dribble may be identified when analysis of the sound signal included in the detection information received from a sound collection device 11 identifies a dribbling sound, or when image recognition processing applied to the video signal received from an imaging device 13 detects movements of the ball 60 and a player that indicate dribbling. The dribble may also be identified by combining at least one of these analysis results with the analysis results of other detection information.
In step S105 described above, the production effect "amplify and output the dribble sound", stored in the production database 111 shown in FIG. 8 in association with the play type "dribble", is determined. In step S106 described above, audio processing that amplifies a sound signal representing the dribbling sound is performed. This sound signal may be, for example, the sound signal representing the dribbling sound included in the detection information received from the sound collection device 11, or a sound signal representing a dribbling sound generated in advance and stored in the storage 34. In step S107 described above, control information including the sound signal generated in step S106 is transmitted to, for example, the sound emission devices 21, 24, and 26. In step S108 described above, these devices then output the sound corresponding to that signal, that is, the amplified dribbling sound. This output occurs in real time: the dribbling sound is output while the player is dribbling. The period during which the sound is output and the period during which the dribbling occurs need not be perfectly synchronized, however, and some delay is acceptable.
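A minimal sketch of this amplification path follows, under the same float-sample assumption as the earlier sketch; the gain value is arbitrary.

```python
from typing import List

def make_dribble_effect(dribble_signal: List[float],
                        gain: float = 4.0) -> List[float]:
    """Step S106 for a dribble: amplify the collected dribble sound."""
    return [sample * gain for sample in dribble_signal]

# a faint dribble picked up by a sound collection device 11
collected = [0.02, -0.03, 0.05, -0.04]
# the amplified signal would be sent to sound emission devices 21, 24, and 26
print(make_dribble_effect(collected))  # [0.08, -0.12, 0.2, -0.16]
```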
When a Player Shoots
When a player shoots, the shot is identified in step S104 described above by analyzing the detection information. For example, the shot may be identified when image recognition processing applied to the video signal included in the detection information received from an imaging device 13 detects movements of the ball 60 and a player that indicate a shot. The shot may also be identified by combining this analysis result with the analysis results of other detection information.
In step S105 described above, the production effect "output sound effects corresponding to the movements of the shooter and the ball 60", stored in the production database 111 shown in FIG. 8 in association with the play type "shoot", is determined. In step S106 described above, a sound signal representing the sound effect 120 corresponding to the movements of the shooter and the ball 60 is generated.
FIG. 9 is a diagram showing an example of the sound effect 120 corresponding to the movements of the shooter and the ball 60. In this example, during period T11 the player holds the ball 60 and prepares to shoot; during period T12 the player shoots and the ball 60 travels through the air; and at time t13 the ball 60 reaches the goal or some other place and the shot ends. The sound effect 120 accordingly consists of a sound 121 for period T11, a sound 122 for period T12, and a sound 123 at time t13, when period T12 ends. The sound 121 may be, for example, a sound indicating that the player has taken a shooting stance. The sound 122 may be, for example, a "whoosh" representing the ball 60 in flight, and may have a length corresponding to period T12. The sound 123 may be, for example, a sound that depends on the result of the shot; if the shot goes in, the sound 123 may be a "swish" of the ball 60 entering the goal.
In step S107 described above, control information including the sound signal generated in step S106 is transmitted to, for example, the sound emission devices 21, 24, and 26. In step S108 described above, these devices then output the sound effect 120 corresponding to that signal. The sound effect 120 is output in real time, that is, while the player is shooting. In the example shown in FIG. 9, matching the shooter's movement, the sound 121 is output during period T11, the sound 122 during period T12, and the sound 123 at time t13. The output of the sound effect 120 need not be perfectly synchronized with the player's movement, however, and some delay is acceptable. A rough timing sketch follows.
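The sketch below plays the three parts of the sound effect 120 in order. In a real system the length of period T12 is not known in advance, so treating it as a parameter (and using `print` as the emitter) is purely an assumption of this illustration.

```python
import time
from typing import Callable

def play_shot_effect(t12_seconds: float,
                     shot_made: bool,
                     emit: Callable[[str], None]) -> None:
    """Emit sounds 121, 122, and 123 in step with periods T11, T12, and time t13."""
    emit("sound 121: shooting-stance cue")       # period T11
    emit("sound 122: ball-in-flight whoosh")     # period T12 starts
    time.sleep(t12_seconds)                      # sound 122 lasts as long as T12
    emit("sound 123: swish" if shot_made else "sound 123: miss")  # time t13

play_shot_effect(1.2, shot_made=True, emit=print)
```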
When a Player Takes a Free Throw
When a player takes a free throw, the free throw is identified in step S104 described above by analyzing the detection information. For example, the free throw may be identified when the progress information indicates that a free throw is to be taken, and the analysis results of the detection information may also be taken into account.
In step S105 described above, the production effect "output a sound corresponding to the thrower's heart rate", stored in the production database 111 shown in FIG. 8 in association with the play type "free throw", is determined. In step S106 described above, a sound signal is generated representing a sound effect corresponding to the heart rate included in the detection information received from the wearable terminal 70 worn by the player taking the free throw. This sound effect may be, for example, a "thump-thump" sound with a rhythm matching the heartbeat cycle, so that its rhythm changes with the heart rate; for example, the higher the heart rate, the faster the tempo of the sound effect.
In step S107 described above, control information including the sound signal generated in step S106 is transmitted to, for example, the sound emission devices 21, 24, and 26. In step S108 described above, these devices then output the corresponding sound effect, that is, the sound effect matching the heart rate of the player taking the shot. This output occurs in real time, while the free throw is being taken, although the output period and the free throw need not be perfectly synchronized and some delay is acceptable. The heart-rate-to-tempo mapping is sketched below.
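A minimal sketch of the heart-rate-to-rhythm mapping: one "thump" per detected heartbeat, so the interval between thumps shrinks as the heart rate rises. The function name is an assumption of this sketch.

```python
def thump_interval_seconds(heart_rate_bpm: float) -> float:
    """Interval between heartbeat sounds: one thump per beat."""
    return 60.0 / heart_rate_bpm

print(thump_interval_seconds(60))   # 1.0  -> calm thrower, slow rhythm
print(thump_interval_seconds(120))  # 0.5  -> nervous thrower, fast rhythm
```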
In the first embodiment described above, production effects whose content is based on the detection information acquired from the detection device 10 are provided. This makes it possible to heighten the sense of presence of the event. Moreover, these effects have content corresponding to the players' situation, such as their play and condition, and can be perceived through various senses such as hearing, touch, and sight. Because the spectators can thereby feel the players' situation more strongly, communication between the players and the spectators is strengthened.
Second Embodiment
In the first embodiment described above, effects corresponding to the players' situation are provided. In the second embodiment, by contrast, effects corresponding to the spectators' situation are provided. The configuration of the production system 1 according to the second embodiment is the same as that of the first embodiment. Descriptions of matters common to the first embodiment are omitted below.
The determination unit 103 determines a production effect whose content is based on the detection information acquired by the acquisition unit 101 and corresponds to the spectators' situation. For example, the determination unit 103 analyzes the sound signals included in the detection information received from the plurality of sound collection devices 17 and calculates the sum of the spectators' voice levels indicated by these signals. The determination unit 103 then determines a production effect in which light is emitted at an emission amount corresponding to that sum; for example, the greater the sum of the voice levels, the greater the emission amount may be.
The production control unit 105 controls the production device 20 in accordance with the production effect determined by the determination unit 103 and causes it to provide the effect. For example, the production control unit 105 transmits control information including a light emission control signal indicating the emission amount corresponding to the summed voice levels to the light emitting devices 22 and 27. On receiving this control information, the light emitting devices 22 and 27 emit light according to the light emission control signal, for example at the emission amount it indicates. In this case, the louder the spectators, the more light the devices 22 and 27 emit, so the players can sense the volume of the spectators' cheering from the amount of light, as sketched below.
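A minimal sketch of this mapping, assuming each sound collection device 17 reports a scalar voice level and that a venue-specific maximum is known; the linear mapping and the names are assumptions, not the embodiment's method.

```python
from typing import List

def emission_amount(voice_levels: List[float],
                    max_total_level: float,
                    max_brightness: int = 255) -> int:
    """Map the summed spectator voice level onto a light emission amount."""
    total = sum(voice_levels)
    ratio = min(total / max_total_level, 1.0) if max_total_level > 0 else 0.0
    return round(ratio * max_brightness)

print(emission_amount([0.4, 0.7, 0.9], max_total_level=4.0))  # 128
```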
In the second embodiment described above, effects corresponding to the situation of the event's spectators are provided. Because the players can thereby feel the spectators' situation more strongly, communication between the players and the spectators is strengthened.
Third Embodiment
In the first embodiment described above, effects corresponding to the players' situation are provided, and in the second embodiment, effects corresponding to the spectators' situation are provided. In the third embodiment, by contrast, an effect corresponding to the situation of one group of spectators is provided to another group. The configuration of the production system 1 according to the third embodiment is the same as that of the first embodiment. Descriptions of matters common to the first embodiment are omitted below.
In the third embodiment, the spectators are divided into a plurality of groups. For example, when the spectators include supporters and general spectators, they may be divided into supporters and general spectators. As another example, the spectators may be divided according to their location in the spectator seat area 3.
The determination unit 103 selects, from among the groups, a reference group on which the production effect is based and a target group to which the production effect is provided. For example, the reference group may be the group of supporters and the target group the group of general spectators. As another example, the reference group and target group may be selected based on the summed voice level of each group's spectators: the determination unit 103 analyzes the sound signals included in the detection information received from the plurality of sound collection devices 17, calculates the sum of each group's voice levels indicated by these signals, and may then select the group with the largest sum as the reference group and the group with the smallest sum as the target group, as sketched below.
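A minimal sketch of this selection rule, assuming the per-group voice-level sums have already been computed; the group names are illustrative.

```python
from typing import Dict, Tuple

def choose_groups(group_levels: Dict[str, float]) -> Tuple[str, str]:
    """Loudest group becomes the reference, quietest becomes the target."""
    reference = max(group_levels, key=group_levels.get)
    target = min(group_levels, key=group_levels.get)
    return reference, target

levels = {"supporters": 9.2, "neutral stand": 2.4, "away stand": 6.1}
print(choose_groups(levels))  # ('supporters', 'neutral stand')
```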
The determination unit 103 determines a production effect whose content is based on the detection information acquired from the detection devices 10 corresponding to the reference group, among the detection information acquired by the acquisition unit 101. For example, this effect may be to output the voices of the reference group's spectators toward the target group's spectators.
The generation unit 104 generates a sound signal based on the detection information acquired from the detection devices 10 corresponding to the reference group. This sound signal may be, for example, the sound signal included in the detection information received from the sound collection devices 17 corresponding to the reference group, used as is or after audio processing. The sound collection devices 17 corresponding to the reference group are those that collect the voices of the reference group's spectators, and they are identified by the identification information included in the detection information.
The production control unit 105 selects the production devices 20 corresponding to the target group. For example, when the reference group's voices are to be output toward the target group, the sound emission devices 21, 24, and 26 corresponding to the target group are selected from among the plurality of sound emission devices 21, 24, and 26. These are the devices that output sound toward the target group's spectators: for example, the sound emission devices 24 provided in the spectator seats 50 where the target group's spectators sit, and the sound emission devices 26 of the wearable terminals 80 worn by the target group's spectators. These devices are identified by the identification information included in the detection information.
The production control unit 105 controls the selected production devices 20 in accordance with the production effect determined by the determination unit 103 and causes them to provide the effect. For example, the production control unit 105 transmits control information including the sound signal generated by the generation unit 104 to the selected sound emission devices 21, 24, and 26. On receiving this control information, these devices output the voices corresponding to the sound signal it contains. The voices of the reference group's spectators are thus output toward the target group's spectators.
In the third embodiment described above, an effect corresponding to the situation of one group of spectators is provided to another group. Because the latter spectators can thereby feel the former's situation, the sense of unity between these spectators is strengthened.
Modifications
The present invention is not limited to the embodiments described above. At least two of the first to third embodiments may be implemented in combination. Various modifications may also be made to each of the embodiments, and at least two of the following modifications may be implemented in combination.
Modification 1
In each of the embodiments described above, the production effect may be determined taking the progress of the game into account. In this case, the determination unit 103 determines the production effect based on the detection information and the progress information acquired by the acquisition unit 101. For example, the determination unit 103 may determine different production effects depending on the progress of the game indicated by the progress information. For example, when a shot is identified by the specifying unit 102, the effect "output sound effects corresponding to the movements of the shooter and the ball 60" is normally determined, as described in the first embodiment. However, when the progress of the game indicated by the progress information is in a predetermined state, a special production effect may be determined instead. This predetermined state may be, for example, a state in which the game becomes exciting, such as when the lead has been reversed, or when the lead has been reversed within a predetermined time before the end of the game. In another example, a predetermined state may be defined for each team, for example based on a jinx associated with that team.
 A special effect is an effect that differs from the normal effect. For example, a special effect may be a processed version of the normal effect, or the normal effect with an effect using another medium added to it. When the normal effect uses sound, the special effect may output a version of that sound to which audio processing has been applied, or may add an effect using at least one of light, vibration, and video to the sound output of the normal effect. According to this modification, even when the same play occurs in a game, different effects can be provided depending on the progress of the game.
 The progress of the game described above may also be identified on the basis of rule information indicating the rules of the game, in addition to the progress information. This rule information may be stored, for example, in the effect database 111, in the storage 34 of the server device 30, or in an external storage device. The progress of the game may be, for example, the remaining time of the game. When the progress information includes the elapsed time of the game and the rule information includes the total game time, the remaining time is calculated by subtracting the elapsed time from the game time. The elapsed time may be, for example, a time measured by the referee; in that case, a member of staff running the game may enter the referee's measurement into a terminal device (not shown), from which progress information including the elapsed time is acquired. Furthermore, an effect whose content corresponds to the remaining time may be provided. For example, when an effect using sound is provided, a parameter of the audio processing applied to the sound signal may change according to the remaining time; when the sound signal is amplified, the amplification degree may change with the remaining time. This makes it possible to stage the event in accordance with the remaining time of the game, as the sketch below illustrates.
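 As a concrete reading of the remaining-time calculation and the time-dependent amplification, here is a minimal sketch; the particular gain curve is an assumption chosen for illustration.

```python
import numpy as np

def remaining_time_s(game_time_s: float, elapsed_s: float) -> float:
    """Remaining time = total game time (rule information) - elapsed time."""
    return max(game_time_s - elapsed_s, 0.0)

def amplify_by_remaining(signal: np.ndarray, remaining_s: float,
                         game_time_s: float) -> np.ndarray:
    # Assumed gain curve: grow from 1x toward 2x as the game nears its end.
    gain = 1.0 + (1.0 - remaining_s / game_time_s)
    return signal * gain

game_time = 40 * 60.0                       # e.g. four 10-minute quarters
signal = np.zeros(48000, dtype=np.float32)  # placeholder sound signal
out = amplify_by_remaining(signal, remaining_time_s(game_time, 38 * 60.0), game_time)
```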
Modification 2
 In each of the embodiments described above, different effects may be provided to different spectators or spectator groups. In this case, the determination means 103 determines a plurality of mutually different effects for a plurality of spectators or a plurality of spectator groups. The effect control means 105 controls a plurality of effect devices 20 corresponding to the plurality of spectators or spectator groups, and causes each effect device 20 to provide the effect corresponding to it. The effect device 20 corresponding to a spectator or spectator group is the effect device 20 that provides the effect to that spectator or spectator group.
 For example, different effects may be provided to the spectators supporting one team and the spectators supporting the other team. When a shot is made by a player of one team, a first sound effect corresponding to the movements of the shooter and the ball 60 may be output from the sound emitting devices 21, 24, and 26 corresponding to the spectators supporting that team; this first sound effect may, for example, be one that excites those spectators. To the spectators supporting the other team, a second sound effect corresponding to the movements of the shooter and the ball 60 may be output from their corresponding sound emitting devices 21, 24, and 26. The second sound effect differs from the first and may, for example, be one that calms those spectators. According to this modification, an effect suited to the team each spectator supports can be provided, as sketched below.
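 A minimal, self-contained sketch of this team-dependent dispatch; the device registry layout and the two effect names are assumptions for illustration.

```python
# Hypothetical sketch: send a different sound effect to each team's supporters.

TEAM_EFFECTS = {
    True:  "exciting_sound_effect",   # first sound effect, for the shooter's team
    False: "calming_sound_effect",    # second sound effect, for the other team
}

def dispatch_by_team(device_groups, shooting_team):
    """device_groups maps a supported team name to its sound emitting devices."""
    for team, device_ids in device_groups.items():
        effect = TEAM_EFFECTS[team == shooting_team]
        for device_id in device_ids:
            print(f"-> {device_id}: play {effect}")

dispatch_by_team({"A": ["speaker-21-A"], "B": ["seat-24-103", "wearable-26-007"]},
                 shooting_team="A")
```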
 As another example, different effects may be provided depending on the location within the spectator seat area 3. Suppose that the spectator seat area 3 includes an enthusiast zone for spectators who cheer fervently, an analysis zone for spectators who like to analyze the game, a beginner zone for spectators watching a game for the first time, and a sensation zone for spectators who want to experience the players' play more intensely. These zones may be delimited by their distance from the court 4 or by their placement within the spectator seat area 3 (outfield seats, second-floor seats, and so on). In this case, the spectators in each zone may be provided with an effect corresponding to their attributes. For example, the spectators in the enthusiast zone may be provided, by their corresponding effect devices 20, with a flashy effect that builds up the cheering, while the spectators in the sensation zone may be provided with an effect that lets them feel the players' play, for example one using vibration. According to this modification, a plurality of effects are provided within the spectator seat area 3, which increases the variety of ways to enjoy watching the event; and when the spectator seat area 3 is partitioned by spectator attributes, effects matched to those attributes can be provided.
 As another example, a specific spectator may be provided with an effect different from that of the other spectators. For example, when a player makes a shot, the spectators who support that player may be provided with an effect different from the others. In this case, each spectator registers in advance the players he or she supports, and registration information indicating these registrations is stored in advance in the storage 34. On the basis of this registration information, the spectators who support the player who made the shot are identified. For example, spectators other than the identified ones may be provided with the normal effect by their corresponding effect devices 20, while the identified spectators may be provided with a special effect, which, as above, is an effect different from the normal one. According to this modification, an effect suited to each individual spectator can be provided.
 As another example, spectators in a predetermined state may be provided with an effect different from that of the other spectators. This predetermined state may be, for example, standing up from the spectator seat 50. In this case, each spectator seat 50 is provided with a seating sensor, which detects whether a spectator is seated and transmits detection information indicating the result to the server device 30. On the basis of this detection information, it is determined whether each spectator is sitting in his or her seat 50. For example, a spectator sitting in the seat 50 may be provided with the normal effect by the corresponding effect device 20, while a spectator who has stood up, who can be presumed to be excited, may be provided with a special effect, which again differs from the normal one. As another example, since vibration cannot be delivered to a spectator who has stood up from the seat 50, an effect using a medium other than vibration may be provided instead. According to this modification, an effect suited to the state of each spectator can be provided; a sketch follows.
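 The seating-sensor branch might look like the following; the effect labels and the boolean `seated` reading are assumptions for illustration.

```python
# Hypothetical sketch: pick an effect per spectator from seating-sensor data.

def effect_for_spectator(seated: bool) -> list[str]:
    if seated:
        return ["normal_sound", "vibration"]   # full set, the seat can vibrate
    # Standing spectators are presumed excited; vibration cannot reach them.
    return ["special_sound", "light"]

detections = {"seat-50-014": True, "seat-50-015": False}  # seating sensor readings
for seat_id, seated in detections.items():
    print(seat_id, "->", effect_for_spectator(seated))
```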
Modification 3
 In each of the embodiments described above, different effects may be provided depending on the player involved in the event situation identified by the identification means 102. For example, when a substitution is made during a game, an effect predetermined for the incoming player may be provided. Different effects may also be provided depending on the popularity of the player involved in the identified event situation. For example, when a highly popular ace enters in the middle of a game, an effect different from that for other players, such as the special effect described above, may be provided.
Modification 4
 In each of the embodiments described above, the effects are not limited to those using sound. For example, an effect may use at least one of sound, light, vibration, and images, where images include both still and moving images. For example, when a dribble is performed, the vibration device 25 may vibrate and the light emitting devices 22 and 27 may emit light in time with the dribbling sound, with the vibration and light emission corresponding to that sound. In this case, a vibration control signal based on the sound signal representing the dribbling sound may be generated and supplied to the vibration device 25, and similarly a light emission control signal based on that sound signal may be generated and supplied to the light emitting devices 22 and 27. A sketch of deriving such control signals appears below.
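 One plausible way to derive vibration and light control signals from a sound signal is an amplitude-envelope follower; this is an assumed technique chosen for illustration, not one the disclosure specifies.

```python
import numpy as np

def envelope(sound: np.ndarray, frame: int = 480) -> np.ndarray:
    """Frame-wise amplitude envelope of the sound signal (assumed 48 kHz)."""
    n = len(sound) // frame
    return np.abs(sound[: n * frame]).reshape(n, frame).mean(axis=1)

def control_signals(sound: np.ndarray):
    env = envelope(sound)
    env = env / (env.max() or 1.0)           # normalize to 0..1
    vibration = env                          # vibration intensity per frame
    light = (env * 255).astype(np.uint8)     # LED brightness per frame
    return vibration, light

dribble = np.sin(np.linspace(0, 200 * np.pi, 48000)).astype(np.float32)
vib, led = control_signals(dribble)
```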
 Similarly, when a free throw is taken, the vibration device 25 may vibrate and the light emitting devices 22 and 27 may emit light in time with the sound effect corresponding to the heart rate, with the vibration and light emission corresponding to the heart rate. In this case, a vibration control signal based on the heart rate may be generated and supplied to the vibration device 25, and likewise a light emission control signal based on the heart rate may be generated and supplied to the light emitting devices 22 and 27. Furthermore, information about the player taking the free throw, such as the player's uniform number or face image, may be projected by the projection device 23. Images of different colors may also be projected depending on the player's state as indicated by the heart rate: when the player is excited, a predominantly red image may be projected, whereas when the player is calm, a predominantly blue image may be projected, as in the threshold sketch below.
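 The heart-rate-dependent color choice could be as simple as the following threshold sketch; the 100 bpm boundary and the RGB values are assumptions.

```python
# Hypothetical sketch: choose a projection base color from the heart rate.

def projection_color(heart_rate_bpm: int) -> tuple[int, int, int]:
    # Assumed boundary: above 100 bpm the player is treated as excited.
    if heart_rate_bpm > 100:
        return (255, 64, 64)   # predominantly red: excited
    return (64, 64, 255)       # predominantly blue: calm

print(projection_color(118))   # -> (255, 64, 64)
```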
Modification 5
 In each of the embodiments described above, an image showing the content of the spectators' cheers may be displayed. For example, this image may be displayed on equipment used in the game, such as the goal used in a basketball game. In this case, the effect devices 20 include a display device 28.
 FIG. 10 shows an example of the display device 28 according to this modification. The display device 28 is provided integrally with the backboard of the basketball goal and displays images; it may be, for example, a transparent organic EL (electroluminescence) display. In this case, speech recognition processing is applied to the sound signals included in the detection information received from the plurality of sound collecting devices 17, and an image signal representing an image corresponding to the recognition result is generated. For example, when the recognized content of a spectator's voice is "Shoot! Make it!", an image signal representing an image containing that content is generated. The effect control means 105 transmits control information including the generated image signal to the display device 28, which, upon receiving it, displays the image corresponding to the image signal included in the control information. The players can thereby see the spectators' cheers, as the sketch below illustrates.
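 A sketch of the recognition-to-display pipeline; `recognize_speech` stands in for an actual speech recognizer and `render_text_image` for an image renderer, both hypothetical names.

```python
# Hypothetical sketch: turn a spectator's cheer into an image for the backboard.

def recognize_speech(sound_signal) -> str:
    """Stand-in for a real speech recognizer applied to the detection info."""
    return "Shoot! Make it!"

def render_text_image(text: str) -> dict:
    """Stand-in for rendering: return a minimal 'image signal' description."""
    return {"kind": "text_image", "text": text, "font_px": 120}

def cheer_to_display(sound_signal, send):
    image_signal = render_text_image(recognize_speech(sound_signal))
    send({"target": "display-28", "image_signal": image_signal})

cheer_to_display(sound_signal=[0.0] * 16000, send=print)
```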
Modification 6
 In each of the embodiments described above, the spectators do not necessarily have to be at the place where the event is held. For example, a spectator may watch the event remotely, from home, from a live-viewing venue, or from some other place away from the event. In this case, that other place, in addition to the place where the event is held, is a place related to the event. The video captured by the imaging device 13 is distributed to the other place via the communication line 40, which may, for example, include the Internet. The spectator views the distributed video on a terminal device and may wear the wearable terminal 80 in the same way as a spectator at the place where the event is held.
Modification 7
 In each of the embodiments described above, the players' movements are detected using the video captured by the imaging device 13, but the method of detecting them is not limited to this. For example, the wearable terminal 70 may include a motion detection device that detects the player's movements, such as at least one of an acceleration sensor, a gyro sensor, and a magnetic sensor. In another example, the players' movements may be detected using a contactless radar. Thus the motion detection device that detects the players' movements may be realized by the imaging device 13 working together with the identification means 102 as described above, by various sensors provided on the wearable terminal 70, or by a contactless radar. The sketch below shows the sensor-based variant.
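 As one assumed realization of the sensor-based variant, a dribble could be flagged from accelerometer magnitude peaks; the threshold and the peak test are illustrative choices.

```python
import numpy as np

def detect_impacts(accel: np.ndarray, threshold_g: float = 2.5) -> np.ndarray:
    """Return sample indices where acceleration magnitude peaks above threshold.

    accel: (n, 3) array of accelerometer readings in g from the wearable terminal.
    """
    mag = np.linalg.norm(accel, axis=1)
    above = mag > threshold_g
    # Keep only local maxima so one bounce yields one detection.
    peaks = (mag[1:-1] >= mag[:-2]) & (mag[1:-1] >= mag[2:]) & above[1:-1]
    return np.flatnonzero(peaks) + 1

accel = np.random.default_rng(0).normal(0, 0.3, size=(1000, 3))
accel[[200, 450, 700]] = [0.0, 0.0, 3.5]     # synthetic dribble impacts
print(detect_impacts(accel))                  # -> indices near 200, 450, 700
```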
Modification 8
 In each of the embodiments described above, an effect that displays a virtual object using AR (augmented reality) may be provided. In this case, the wearable terminal 80 may, for example, be provided with a display on which a virtual object corresponding to the event situation is shown using AR. By looking at the display, the spectator can enjoy an effect different from the real world.
Modification 9
 In the embodiments described above, the players of the event are athletes, but the players of an event are not limited to athletes. For example, when the event is something other than a sports game, anyone who performs at the event is a player. When the event is not a basketball game, an object other than the ball 60 may be used at the event; in that case, the motion detection device 14 described above may be provided on the object used at the event.
Modification 10
 The devices on which the functions of the effect system 1 are implemented are not limited to the examples described in the embodiments above. For example, some of the functions implemented by the server device 30 may instead be implemented by the detection device 10, the effect device 20, or another device. For example, the identification means 102 described above may be implemented in the detection device 10, in which case the identification of plays described above is performed by the detection device 10.
Modification 11
 The steps of the processing performed in the effect system 1 are not limited to the examples described in the embodiments above and may be reordered as long as no contradiction arises. The present invention may also be provided as an effect method comprising the steps of the processing performed in the effect system 1.
Modification 12
 The present invention may be provided as a program executed in the detection device 10, the effect device 20, or the server device 30. This program may be downloaded via a communication line such as the Internet, or may be provided recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disk, etc.), a magneto-optical recording medium, or a semiconductor memory.
Modification 13
 In the embodiments described above, when a replay of the game is shown, an effect corresponding to the replay may be provided. For example, a large screen may be installed at the venue and characteristic scenes of the game replayed on it. When a predetermined play such as a goal is made, or a predetermined penalty such as a foul is imposed, the scene of that play or penalty may be replayed immediately afterwards. Specifically, the video captured by the imaging device 13 is recorded, and the portion of the recording corresponding to the period of the scene is shown on the screen. At that moment, an effect corresponding to the replay video is provided. Specifically, detection information indicating the physical quantities detected by the detection device 10 is stored; it may be stored in the storage 34 of the server device 30, in the detection device 10, or in an external storage device. The determination means 103 determines an effect whose content is based on the portion of the stored detection information corresponding to the period of the scene, and the effect is provided in time with the replay video. A powerful replay can thereby be realized; a sketch of the time-slicing step follows.
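 Selecting the stored detection information for the replayed scene amounts to slicing a time-stamped buffer; the record format here is an assumption for illustration.

```python
# Hypothetical sketch: keep time-stamped detection info and slice out the
# records that fall within the replayed scene's period.

from bisect import bisect_left, bisect_right

class DetectionLog:
    def __init__(self):
        self.times: list[float] = []   # seconds since game start, ascending
        self.records: list[dict] = []

    def store(self, t: float, record: dict):
        self.times.append(t)
        self.records.append(record)

    def slice(self, start: float, end: float) -> list[dict]:
        lo = bisect_left(self.times, start)
        hi = bisect_right(self.times, end)
        return self.records[lo:hi]

log = DetectionLog()
log.store(601.2, {"play": "dribble"})
log.store(603.8, {"play": "shot"})
log.store(610.0, {"play": "rebound"})
print(log.slice(600.0, 605.0))   # detection info for the replayed scene
```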
 This application is based on Japanese Patent Application No. 2017-085454 filed on April 24, 2017, the contents of which are incorporated herein by reference.
1: effect system
10: detection device
11, 15, 17: sound collecting device
13: imaging device
16: biometric sensor
20: effect device
21, 24, 26: sound emitting device
30: server device
101: acquisition means
102: identification means
103: determination means
104: generation means
105: effect control means

Claims (13)

  1.  An effect control device comprising:
      acquisition means for acquiring detection information indicating a physical quantity detected by a detection device at a place related to an event;
      determination means for determining an effect whose content is based on the detection information acquired by the acquisition means; and
      effect control means for controlling an effect device in accordance with the effect determined by the determination means so as to cause the effect to be provided.
  2.  The effect control device according to claim 1, further comprising identification means for identifying a situation of the event on the basis of the detection information acquired by the acquisition means,
      wherein the determination means determines an effect corresponding to the situation of the event identified by the identification means.
  3.  The effect control device according to claim 2, wherein the situation of the event includes a play performed by a player in the event.
  4.  The effect control device according to any one of claims 1 to 3, further comprising generation means for generating a sound signal based on the detection information when an effect using sound is determined by the determination means,
      wherein the effect control means supplies the sound signal generated by the generation means to a speaker and causes the speaker to output a sound corresponding to the sound signal.
  5.  The effect control device according to claim 4, wherein:
      the detection device includes a microphone that collects sound emitted at the event;
      the detection information includes a sound signal indicating the sound collected by the microphone;
      the generation means applies audio processing to the sound signal included in the detection information; and
      the effect control means supplies the sound signal to which the audio processing has been applied to the speaker.
  6.  The effect control device according to claim 4 or 5, wherein:
      the detection device includes a motion detection device that detects a movement of a player of the event;
      the detection information indicates the movement detected by the motion detection device; and
      the generation means generates a sound signal corresponding to the movement indicated by the detection information.
  7.  The effect control device according to any one of claims 4 to 6, wherein:
      the detection device includes a biometric sensor that measures biological information of a player of the event;
      the detection information includes the biological information measured by the biometric sensor; and
      the generation means generates a sound signal corresponding to the biological information included in the detection information.
  8.  The effect control device according to any one of claims 1 to 7, wherein:
      the acquisition means further acquires progress information indicating the progress of the event; and
      the determination means determines the effect on the basis of the detection information and the progress information acquired by the acquisition means.
  9.  The effect control device according to any one of claims 1 to 8, wherein:
      the determination means determines a plurality of mutually different effects for a plurality of spectators or a plurality of spectator groups watching the event; and
      the effect control means controls a plurality of effect devices corresponding to the plurality of spectators or the plurality of spectator groups, and causes each of the plurality of effect devices to provide, among the plurality of effects, the effect determined for the spectator or spectator group to which that effect device corresponds.
  10.  The effect control device according to any one of claims 1 to 9, wherein the effect is an effect using sound, light, vibration, or an image.
  11.  An effect system comprising:
      a detection device that detects a physical quantity at a place related to an event;
      an effect device that provides an effect for the event;
      acquisition means for acquiring detection information indicating the physical quantity detected by the detection device;
      determination means for determining an effect whose content is based on the detection information acquired by the acquisition means; and
      effect control means for controlling the effect device in accordance with the effect determined by the determination means so as to cause the effect to be provided.
  12.  An effect control method executed by a computer system, the method comprising:
      acquiring detection information indicating a physical quantity detected by a detection device at a place related to an event;
      determining an effect whose content is based on the acquired detection information; and
      controlling an effect device in accordance with the determined effect so as to cause the effect to be provided.
  13.  A program for causing a computer to execute:
      a step of acquiring detection information indicating a physical quantity detected by a detection device at a place related to an event;
      a step of determining an effect whose content is based on the acquired detection information; and
      a step of controlling an effect device in accordance with the determined effect so as to cause the effect to be provided.
PCT/JP2018/016678 2017-04-24 2018-04-24 Production control device, direction system, and program WO2018199115A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019514549A JPWO2018199115A1 (en) 2017-04-24 2018-04-24 Effect control device, effect system, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017085454 2017-04-24
JP2017-085454 2017-04-24

Publications (1)

Publication Number Publication Date
WO2018199115A1 true WO2018199115A1 (en) 2018-11-01

Family

ID=63920430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/016678 WO2018199115A1 (en) 2017-04-24 2018-04-24 Production control device, direction system, and program

Country Status (2)

Country Link
JP (1) JPWO2018199115A1 (en)
WO (1) WO2018199115A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4391091B2 (en) * 2003-01-17 2009-12-24 ソニー株式会社 Information transmission method, information transmission device, information recording method, information recording device, information reproducing method, information reproducing device, and recording medium
JP2016051675A (en) * 2014-09-02 2016-04-11 カシオ計算機株式会社 Performance control system, communication terminal, and performance control device
JP2016167245A (en) * 2015-03-18 2016-09-15 株式会社東芝 Service providing device
JP2017016442A (en) * 2015-07-01 2017-01-19 チームラボ株式会社 User participation-type event rendering system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020093107A (en) * 2018-12-14 2020-06-18 株式会社ポケモン Costumed character performance assistance device, costumed character performance assistance system, and costumed character performance assistance method
JP7329457B2 (en) 2018-12-14 2023-08-18 株式会社ポケモン Costume production support device, costume production support system, and costume production support method
US11998066B2 (en) 2018-12-14 2024-06-04 The Pokemon Company Kigurumi staging support apparatus, kigurumi staging support system, and kigurumi staging support method
CN113767643A (en) * 2019-03-13 2021-12-07 巴鲁斯株式会社 Live broadcast transmission system and live broadcast transmission method
CN113767643B (en) * 2019-03-13 2024-04-05 巴鲁斯株式会社 Live broadcast transmission system and live broadcast transmission method
CN111079096A (en) * 2019-12-30 2020-04-28 李文红 Copyright protection method and device for performance place and computer readable storage medium
CN111079096B (en) * 2019-12-30 2023-07-07 李文红 Performance place copyright protection method and device and computer readable storage medium
US11683646B2 (en) * 2020-03-17 2023-06-20 Sony Corporation Device, computer program and method
WO2022230066A1 (en) * 2021-04-27 2022-11-03 株式会社I’mbesideyou Video analysis system
JP7203161B1 (en) 2021-07-26 2023-01-12 株式会社ポケモン Program, information processing device and information processing method
JP2023017140A (en) * 2021-07-26 2023-02-07 株式会社ポケモン Program, information processing unit, and information processing method

Also Published As

Publication number Publication date
JPWO2018199115A1 (en) 2020-02-27

Similar Documents

Publication Publication Date Title
WO2018199115A1 (en) Production control device, direction system, and program
JP7239596B2 (en) Scaled VR Engagement and Views at Esports Events
US8241118B2 (en) System for promoting physical activity employing virtual interactive arena
JP2017033536A (en) Crowd-based haptics
US9652949B1 (en) Sensor experience garment
JP5323413B2 (en) Additional data generation system
JP2021514748A (en) Creating a winner tournament under the influence of fandom
US20120269360A1 (en) Large Scale Participatory Entertainment Systems For Generating Music Or Other Ordered, Discernible Sounds And/Or Displays Sequentially Responsive To Movement Detected At Venue Seating
JPWO2019021375A1 (en) Video generation program, video generation method and video generation device
US11977671B2 (en) Augmented audio conditioning system
WO2019124069A1 (en) Information processing device, information processing method, and program
TWI793633B (en) Image transfer system, recording medium storing computer program used therein, and control method
JP2019141162A (en) Computer system
WO2017002642A1 (en) Information device and display processing method
JP2023133397A (en) Image processing device, image processing method, and image processing system
US11941177B2 (en) Information processing device and information processing terminal
JP7074800B2 (en) Information provision system and information provision method
JP2021124958A (en) Spectator analyzer, spectator analysis method, and computer program
JP2019144882A (en) Production providing system
US20240048934A1 (en) Interactive mixed reality audio technology
US20230009322A1 (en) Information processing device, information processing terminal, and program
CN114945893A (en) Information processing apparatus and information processing terminal
WO2023127044A1 (en) Image processing device, image processing method, and non-transitory computer-readable medium
JP6523038B2 (en) Sensory presentation device
WO2022102550A1 (en) Information processing device and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18790373

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019514549

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18790373

Country of ref document: EP

Kind code of ref document: A1