WO2010087155A1 - Data transmission device, data transmission method, viewing environment control device, viewing environment control method, and viewing environment control system - Google Patents
Data transmission device, data transmission method, viewing environment control device, viewing environment control method, and viewing environment control system
- Publication number
- WO2010087155A1 (application PCT/JP2010/000438)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- viewing environment
- feature amount
- video data
- calculation
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
Definitions
- The present invention relates to a data transmission device, a data transmission method, a viewing environment control device, a viewing environment control method, and a viewing environment control system, and in particular to a data transmission device, a data transmission method, a viewing environment control device, a viewing environment control method, and a viewing environment control system for controlling peripheral devices in a user's viewing environment space.
- Patent Document 1 discloses a technique for linking an image displayed on such a display with illumination light of an illumination device.
- the illumination system disclosed in Patent Document 1 is intended to provide a high sense of presence, and is an illumination system that controls a plurality of illumination devices in conjunction with an image to be displayed.
- Illumination control data for a plurality of illumination devices is generated from feature amounts of the video data, such as representative colors and average luminance. Specifically, the feature amount of the video data in a predetermined screen area is detected according to the installation position of each illumination device, and illumination control data for each illumination device is generated based on the detected feature amount.
- Patent Document 1 also describes that the illumination control data is not limited to data calculated from feature amounts of the video data; it may be distributed alone or together with the video data via the Internet or the like, or by carrier wave.
- Patent Document 2 describes that, in a system for controlling ambient light, the ambient light is not simply set to reflect the average color of the entire screen of the video data, but can be set to react to the color at a specific screen position.
- Patent Document 1: JP 2001-343900 A
- Patent Document 2: JP 2005-531909 T (published October 20, 2005)
- In these techniques, the feature quantity (color signal and luminance signal) at a specific screen position is detected for each frame (screen) of the video signal to be displayed, and the illumination light is controlled accordingly. Depending on the content of the displayed video, it is therefore difficult to generate illumination light that matches the mood (atmosphere) of the video. For example, if the surroundings are bathed in inappropriately colored illumination light because of the clothes of a person in the video or artifacts in the background behind the subject, the atmosphere of the scene is not reproduced and its sense of realism cannot be maintained. Viewing environment lighting that deviates significantly from the lighting conditions at the time a video scene was shot instead impairs the sense of reality.
- Moreover, the state of the illumination light changes with the luminance and hue of the video signal frame by frame. In particular, when the luminance and hue change greatly between frames, the illumination light changes in a complicated manner and viewers perceive unpleasant flicker. Furthermore, when a single scene whose lighting conditions did not change during shooting is displayed, illumination light that fluctuates greatly with per-frame changes in luminance and hue hinders the reproduction of the scene's atmosphere, which is undesirable.
- Video scenes are usually shot as single segments, each under lighting conditions appropriate to a set of scene settings decided by the video producers (screenwriters, directors, and so on). To enhance the sense of presence and the atmosphere when viewing video, it is therefore desirable to irradiate the viewing space with illumination light that matches the lighting conditions at the time each displayed scene was shot.
- The present invention has been made in view of the above problems, and its purpose is to provide a viewing environment control device and a data transmission device capable of realizing an optimal viewing environment by appropriately controlling a viewing environment device that adjusts the viewing environment.
- To solve the above problems, a viewing environment control device according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames.
- The viewing environment device is a device for adjusting the surrounding environment when the user views video; examples include a lighting device and an air conditioner.
- The viewing environment control device receives reference information indicating the calculation area to be used when calculating the feature amount of the video data or audio data, and calculates the feature amount of the video data or audio data within the calculation area indicated by the received reference information.
- the viewing environment control device controls the viewing environment device based on the calculated feature amount.
- With this configuration, the viewing environment device is controlled based on the feature amount in the specific area of the video data or audio data specified by the reference information received from the outside. For example, when calculating the feature amount of the video data, the influence of a subject person's clothes or of artifacts in the background behind the subject can be eliminated, so the viewing environment device can be controlled in accordance with the atmosphere or scene setting of the shooting scene. As a result, a highly realistic viewing environment can be provided to the user.
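As a purely illustrative sketch (the patent specifies no code), the following Python snippet averages the pixels of a frame inside a calculation area of the kind the reference information designates; the frame layout, region format, and function name are assumptions:

```python
def average_color(frame, region):
    """Average the (R, G, B) pixels of `frame` inside `region`.

    frame: 2-D list of (R, G, B) tuples; region: (x, y, width, height),
    a hypothetical encoding of the received reference information.
    """
    x0, y0, w, h = region
    totals = [0, 0, 0]
    for row in frame[y0:y0 + h]:
        for r, g, b in row[x0:x0 + w]:
            totals[0] += r
            totals[1] += g
            totals[2] += b
    n = w * h
    return tuple(t // n for t in totals)

# A 2x2 frame: only the top-left pixel falls inside the 1x1 region.
frame = [[(200, 0, 0), (0, 200, 0)],
         [(0, 0, 200), (50, 50, 50)]]
print(average_color(frame, (0, 0, 1, 1)))  # (200, 0, 0)
```

Restricting the average to a region away from the foreground subject is one way the clothing and background artifacts discussed above could be excluded.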
- To solve the above problems, a viewing environment control device according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames.
- It includes calculation means for calculating the feature amount of the video data or audio data by the calculation method indicated by calculation designation information.
- The viewing environment control device receives calculation designation information indicating the calculation method to be used when calculating the feature amount of the video data or audio data, and calculates the feature amount of the video data or audio data by the calculation method indicated by the received calculation designation information.
- the viewing environment control device controls the viewing environment device based on the calculated feature amount.
- With this configuration, the viewing environment device can be controlled based on the feature amount of the video data or audio data obtained by the specific calculation method designated by the calculation designation information received from the outside. For example, when calculating the feature amount of the video data, an appropriate feature amount can be obtained according to the video content, and the viewing environment device can be controlled in accordance with the shooting-scene atmosphere or scene setting intended by the content creator. As a result, a highly realistic viewing environment can be provided to the user.
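A minimal sketch of dispatching on the calculation designation information, assuming illustrative method names (`average`, `peak`, `median`) that are not from the patent:

```python
# Hypothetical table of feature-calculation methods; the received
# calculation designation information selects which one to apply
# to the same set of per-pixel luminance samples.
CALC_METHODS = {
    "average": lambda samples: sum(samples) / len(samples),
    "peak":    lambda samples: max(samples),
    "median":  lambda samples: sorted(samples)[len(samples) // 2],
}

def calc_feature(samples, designation):
    """Apply the designated calculation method to the samples."""
    return CALC_METHODS[designation](samples)

luma = [10, 20, 30, 200]              # luminance samples from one frame
print(calc_feature(luma, "average"))  # 65.0
print(calc_feature(luma, "peak"))     # 200
```

The same frame yields different feature amounts under different designations, which is how the sender can pick the calculation best suited to the content.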
- To solve the above problems, a viewing environment control device according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames.
- The viewing environment control device receives correction information indicating the allowable range to be used when calculating the feature amount of the video data or audio data, and calculates the feature amount of the video data or audio data within the allowable range indicated by the received correction information.
- the viewing environment control device controls the viewing environment device based on the calculated feature amount.
- With this configuration, the calculated feature amount falls within the allowable range specified by the correction information received from the outside, so the feature amount used to control the viewing environment device is prevented from increasing and decreasing repeatedly. This keeps the drive intensity of the viewing environment device from fluctuating sharply and prevents the viewing environment from changing drastically.
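A minimal sketch of keeping the calculated feature amount inside the allowable range given by the correction information (function and parameter names are assumptions):

```python
def clamp_feature(value, allowed_min, allowed_max):
    """Force a calculated feature amount into the allowable range
    specified by the received correction information."""
    return max(allowed_min, min(allowed_max, value))

# Per-frame luminance swings outside the range are suppressed, so a
# lamp driven from these values will not flicker between extremes.
per_frame_luma = [120, 250, 40, 130]
print([clamp_feature(v, 100, 180) for v in per_frame_luma])
# [120, 180, 100, 130]
```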
- To solve the above problems, a viewing environment control device according to the present invention controls a viewing environment device based on control reference data received from the outside, and comprises calculating means for calculating a feature amount of video data or audio data and correcting means for correcting the control reference data using that feature amount.
- the viewing environment control device calculates the feature amount of the video data or audio data received from the outside, and corrects the control reference data received from the outside using the calculated feature amount.
- the viewing environment control device controls the viewing environment device based on the corrected control reference data.
- With this configuration, the control reference data for the viewing environment device received from the outside can be corrected using the feature amount calculated from the video data or audio data, and the viewing environment device can be controlled based on the corrected data. This makes it possible to provide a viewing environment that follows changes in the video and audio content without deviating greatly from the atmosphere or scene setting of the shooting scene.
- A data transmission device according to the present invention transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes transmission means for transmitting reference information indicating the calculation area to be used when calculating the feature amount of the video data or audio data.
- the data transmission device transmits the reference information to the viewing environment control device.
- the reference information indicates a calculation area when the viewing environment control apparatus calculates a feature amount of video data or audio data.
- the calculated feature amount is used when the viewing environment control apparatus controls the viewing environment device.
- the viewing environment control device calculates the feature amount of the video data or audio data in the calculation area indicated by the reference information.
- the viewing environment control device controls the viewing environment device based on the calculated feature amount. Therefore, it is possible to control the viewing environment device based on the feature amount in a specific area of video data and audio data designated in advance. Accordingly, the viewing environment device can be controlled in accordance with the atmosphere or scene setting of the shooting scene intended by the content creator. Therefore, it is possible to provide the user with a viewing environment that can provide a high sense of realism.
- A data transmission device according to the present invention transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes transmission means for transmitting calculation designation information indicating the calculation method for calculating the feature amount of the video data or audio data.
- the data transmission device transmits the calculation designation information to the viewing environment control device.
- the calculation designation information indicates a calculation method used when the viewing environment control apparatus calculates the feature amount of the video data or audio data.
- the calculated feature amount is used when the viewing environment control apparatus controls the viewing environment device.
- the viewing environment control device calculates the feature amount of the video data or the audio data based on the calculation method indicated by the calculation designation information.
- the viewing environment control device controls the viewing environment device based on the calculated feature amount. Therefore, it is possible to control the viewing environment device based on the feature amount of video data or audio data obtained by a specific calculation method specified in advance. Accordingly, the viewing environment device can be controlled in accordance with the atmosphere or scene setting of the shooting scene intended by the content creator. Therefore, it is possible to provide the user with a viewing environment that can provide a high sense of realism.
- A data transmission device according to the present invention transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes transmission means for transmitting correction information indicating the allowable range to be used when calculating the feature amount of the video data or audio data.
- the data transmission device transmits the correction information to the viewing environment control device.
- the correction information indicates an allowable range when the viewing environment control apparatus calculates the feature amount of the video data or audio data.
- the calculated feature amount is used when the viewing environment control apparatus controls the viewing environment device.
- the viewing environment control device calculates the feature amount of the video data or the audio data within the allowable range indicated by the correction information. For this reason, the calculated feature amount falls within an allowable range designated in advance.
- the viewing environment control device controls the viewing environment device based on the feature amount falling within the allowable range. Thereby, it is possible to prevent the feature amount used for controlling the viewing environment device from being repeatedly increased and decreased. Therefore, it is possible to prevent the drive strength of the viewing environment device from fluctuating drastically and prevent the viewing environment from changing greatly.
- A data transmission device according to the present invention transmits control reference data for controlling a viewing environment device to a viewing environment control device, the control reference data being set for each segment, which is a set of frames constituting video data or audio data.
- the data transmission device transmits the control reference data set for each segment, which is a set of frames constituting video data or audio data.
- the viewing environment control device controls the viewing environment device based on the control reference data. At this time, control can be performed for each segment unit which is a set of frames. By dividing the segments in correspondence with scenes, shots, frames, etc., the viewing environment device can be controlled in accordance with the scene setting. As a result, it is possible to provide the user with a viewing environment that provides a high sense of realism.
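As a purely illustrative sketch (the patent defines no concrete data format), per-segment control reference data could be held as a lookup table; the field names `lux`, `color_temp_k`, and `wind` are hypothetical:

```python
# Hypothetical control reference data set per segment number; each
# entry carries the drive conditions the devices hold for the whole
# segment, so the lighting changes at segment boundaries, not per frame.
SEGMENT_CONTROL = {
    1: {"lux": 300, "color_temp_k": 5000, "wind": 0},  # modern indoor scene
    2: {"lux": 40,  "color_temp_k": 2000, "wind": 1},  # candle-lit night scene
}

def control_for(segment_no):
    """Look up the drive conditions for a segment."""
    return SEGMENT_CONTROL[segment_no]

print(control_for(2)["color_temp_k"])  # 2000
```

Because one entry covers an entire segment, all frames of a scene share the same viewing environment, matching the segment-based control described above.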
- To solve the above problems, a viewing environment control method according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes a receiving step of receiving reference information indicating the calculation area to be used when calculating the feature amount of the video data or audio data, and a calculation step of calculating the feature amount of the video data or audio data within the calculation area indicated by the reference information.
- To solve the above problems, a viewing environment control method according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes a receiving step of receiving calculation designation information indicating the calculation method for calculating the feature amount of the video data or audio data, and a calculation step of calculating the feature amount of the video data or audio data by the calculation method indicated by the calculation designation information.
- To solve the above problems, a viewing environment control method according to the present invention controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and includes a receiving step of receiving correction information indicating the allowable range to be used when calculating the feature amount of the video data or audio data, and a calculation step of calculating the feature amount within the allowable range indicated by the correction information.
- To solve the above problems, a viewing environment control method according to the present invention controls a viewing environment device based on control reference data received from the outside, and may include a calculation step of calculating a feature amount of video data or audio data and a correction step of correcting the control reference data using the feature amount of the video data or audio data.
- the viewing environment control method according to the present invention has the same effects as the viewing environment control apparatus according to the present invention.
- According to the present invention, viewing environment devices can be controlled appropriately according to the video data or audio data, so the user can be provided with a viewing environment suited to the video or audio.
- In the present embodiment, a lighting device, an air conditioner, a blower, and a vibration device are described as examples of peripheral devices arranged in the viewing environment space.
- However, the devices that control the viewing environment are not limited to these.
- the present invention can be applied to a scent generator.
- FIG. 2 is a block diagram showing the schematic configuration of an embodiment of the data transmission device according to the present invention.
- the data transmission device is a device that transmits video data, audio data, and metadata to a viewing environment control system described later.
- the data transmission device 1 includes a data multiplexing unit 11, a transmission unit (transmission means) 12, and a data encoding unit 13.
- the data encoding unit 13 compresses and encodes the input video data and outputs it to the data multiplexing unit 11.
- For video encoding, various compression methods such as ISO/IEC 13818-2 (MPEG-2 Video), ISO/IEC 14496-2 (MPEG-4 Visual), and ISO/IEC 14496-10 (MPEG-4 AVC) can be used.
- the data encoding unit 13 compresses and encodes the input audio data and outputs the compressed audio data to the data multiplexing unit 11.
- Various compression methods such as ISO / IEC 13818-7 (MPEG-2 AAC) and ISO / IEC 14496-3 (MPEG-4 Audio) can be used for audio encoding.
- the data encoding unit 13 compresses and encodes the input metadata and outputs it to the data multiplexing unit 11. Details of the metadata will be described later.
- As the metadata description method, for example, an XML (Extensible Markup Language) format is used.
- For metadata compression, the BiM (Binary format for MPEG-7) method of ISO/IEC 15938-1 (MPEG-7 Systems) can be used. The XML format may also be output without compression.
- the data multiplexing unit 11 multiplexes the encoded video data, audio data, and metadata.
- As the multiplexing method, for example, MPEG-2 transport stream packets (TSP) of ISO/IEC 13818-1 (MPEG-2 Systems), IP packets, RTP packets, or the like can be used.
- For example, the metadata can be described in the extension header portion that follows the header carrying the information stipulated by MPEG-2, while video data and audio data are transmitted in the payload that follows. Alternatively, the metadata may be transmitted in a payload in the same manner as video data and audio data, or the video data, audio data, and metadata may be multiplexed and transmitted as separate data streams.
- the transmission unit 12 transmits or stores the multiplexed video data, audio data, and metadata as broadcast data.
- the three types of data, video data, audio data, and metadata are multiplexed and transmitted as broadcast data.
- multiplexing is not an essential requirement.
- an appropriate transmission method may be selected.
- the respective data may be transmitted separately without being multiplexed, or only the video data and audio data may be multiplexed and the metadata may be transmitted independently.
- Alternatively, the metadata may be stored in an external server device accessible via the Internet, and a URL (Uniform Resource Locator) identifying the stored metadata may be multiplexed with the video data and transmitted.
- The information (identification data) associating the metadata with the multiplexed video/audio data is not limited to a URL; it may be a CRID (Content Reference ID) in the TV-Anytime standard, a content name, or any other specific information that can establish the correspondence between the metadata and the multiplexed video/audio data.
- Metadata may be recorded on a separate recording medium and distributed.
- video / audio data may be recorded and distributed on a large-capacity recording medium such as Blu-ray Disc or DVD, and metadata may be recorded and distributed on a small semiconductor recording medium.
- In this case, specific information that clarifies the correspondence between the video/audio data and the metadata is included together with the video/audio data.
- In the present embodiment, the metadata is handled as data separate from the video and audio data, but the two may of course be described in a single data format containing the contents of both.
- FIG. 3 is a diagram showing an example of the data structure of metadata.
- video data is divided into segments each composed of an arbitrary number of one or more frames, and various additional information related to each segment is described in segment units.
- Various additional information includes segment delimiter information Ds, control information Dr, automatic control device control information Da, and manual control device control information Dm.
- Automatic control determines and controls the drive strength of the viewing environment devices using the output video data or audio data according to a predetermined processing method (for example, a processing method specified by a region calculation designation or an acoustic calculation designation).
- Manual control controls the viewing environment device by directly using control data designated in advance (manually) on the transmission side.
- FIG. 4 shows the configuration of the segment delimiter information Ds.
- the segment delimiter information is information for specifying a segment.
- the segment delimiter information Ds is composed of a segment number, a start time (start point), and section information.
- the segment start time is specified by a time code added to indicate reproduction time information of each of video data and audio data.
- The start time is expressed as information indicating hour (h):minute (m):second (s):frame (f).
- The segment section is expressed as information indicating, in hour (h):minute (m):second (s):frame (f), the period from the segment's start time code to the start time code of the next segment.
- Here, both the start time (start frame) and the section (number of frames) of a segment are described, but it suffices to describe at least one of them: even with only one, the segment boundaries can be determined uniquely as long as each segment is contiguous with the next and there is no overlap.
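As an illustrative sketch (the patent fixes neither a frame rate nor a data layout), the h:m:s:frame time code and the segment sections can be resolved to frame indices as follows; `FPS`, the tuple layout, and both function names are assumptions:

```python
FPS = 30  # assumed frame rate

def timecode_to_frames(h, m, s, f, fps=FPS):
    """Turn an h:m:s:frame time code into an absolute frame index."""
    return ((h * 60 + m) * 60 + s) * fps + f

def find_segment(frame_index, segments):
    """segments: list of (segment_no, start_frame, length_in_frames),
    contiguous and non-overlapping as described above."""
    for number, start, length in segments:
        if start <= frame_index < start + length:
            return number
    return None

segments = [(1, 0, 1800), (2, 1800, 900)]
idx = timecode_to_frames(0, 1, 0, 15)   # time code 0:01:00:15
print(idx, find_segment(idx, segments))  # 1815 2
```

Because the segments are contiguous and non-overlapping, each frame index maps to exactly one segment, as the paragraph above requires.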
- For example, a frame in which certain lighting control information is set may be taken as the start frame of a segment, and the frames up to the next frame in which lighting control information is set may be taken as that segment.
- the segment is defined as a configuration of video content divided into arbitrary lengths in the time direction, and is divided in accordance with, for example, a scene, a shot, or a frame.
- When a segment corresponds to a scene, information that exists in shot units is described collectively in the segment.
- When a segment corresponds to a shot, information that exists in scene units is described repeatedly in all corresponding segments.
- When a segment corresponds to a frame, information that exists in scene units and shot units is described repeatedly in all corresponding segments.
- The control information Dr defines which of the manual control device control information Dm and the automatic control device control information Da is used to control the viewing environment devices in each segment. That is, it indicates the control method for each viewing environment device when the frames included in each segment are output to the video display device. As shown in FIG. 3, two types of control methods, an automatic control mode and a manual control mode, are defined in the present embodiment, and one of the two modes is described and designated in the control information Dr area for each segment.
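The per-segment mode switch carried by the control information Dr can be sketched as follows; the keys `Dr` and `Dm` mirror the labels above, but the code itself is an illustrative assumption, not the patent's implementation:

```python
def drive_data(segment, frame_luma):
    """Pick the device drive data according to the segment's Dr mode:
    manual mode returns the creator-specified data Dm as-is; automatic
    mode derives it from a feature of the current frame."""
    if segment["Dr"] == "manual":
        return segment["Dm"]
    # automatic mode: hypothetical rule mapping average luminance to lux
    return {"lux": round(frame_luma * 2)}

manual_seg = {"Dr": "manual", "Dm": {"lux": 40}}
auto_seg = {"Dr": "auto"}
print(drive_data(manual_seg, 90))  # {'lux': 40}
print(drive_data(auto_seg, 90))    # {'lux': 180}
```

In manual mode the frame content is ignored entirely, which is exactly the distinction between Dm and Da drawn above.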
- the manual control device control information Dm is control data designated by the content creator for each viewing environment device in order to reproduce the environmental situation at the time of shooting the video data included in each segment. Specifically, illumination data representing photographing illumination used for photographing video data, wind power data representing wind power, temperature data representing temperature, vibration data representing vibration, and the like are described.
- FIG. 5 is a diagram showing an example of the description contents of each data item in the manual control device control information Dm. As shown in FIG. 5, the type of effect control to be added to the content and its condition (a value indicating the effect) are entered in the manual control device control information Dm.
- the brightness of lighting (unit: lx) and the color temperature of lighting (unit: K) in the viewing environment recommended by the content creator or content provider are specified as lighting conditions.
- Candela (cd) or lumen (lm) may also be used to express brightness.
- For color expression, an XYZ color system, an RGB color system, a YCbCr color system, or the like may be used instead of the color temperature.
- With illumination light, the atmosphere and presence of each scene of the video data can be reproduced, so the illumination conditions for the illumination device are useful information. For example, by controlling the lighting in the viewing environment, the realistic sensation of each scene of a contemporary or historical drama can be improved. The color temperature of the lighting devices generally used in contemporary dramas is about 5000 K for a day-white fluorescent lamp, about 6700 K for a daylight fluorescent lamp, and about 2800 K for an incandescent bulb. On the other hand, the color temperature of the candle flame often used as a night light source in historical dramas is about 1800-2500 K. In addition, lighting intensity tends to be high in contemporary dramas and low in historical dramas.
- It is therefore desirable that the content creator or content provider specify drive conditions for appropriately controlling the peripheral viewing environment devices when the video data and audio data are viewed. Such conditions can be created, for example, by the content creator or content provider actually viewing the video data and audio data and manually specifying the lighting conditions (brightness, color temperature) for each scene. Alternatively, the video data or encoded video data may be analyzed to automatically extract the average brightness and dominant color of the entire screen or of the screen periphery, and illumination conditions such as the brightness and color temperature of the lighting may be determined from the extracted brightness and color information.
- In this way, by describing the environmental conditions at the time of shooting in units of segments, the viewing environment devices can be controlled so as to create an appropriate viewing space regardless of the feature amounts of the displayed video.
- the manual control device control information Dm often changes in units of frames or shots, so that the shooting environment information can be described in units of segments by dividing the segments in accordance with the frames or shots.
- Each segment only needs to include at least one of the automatic control device control information Da and the manual control device control information Dm; from the viewpoint of suppressing an increase in the amount of data, only one of them may be included. For example, when the control information Dr indicates the automatic control mode, only the automatic control device control information Da may be described, and when the control information Dr indicates the manual control mode, only the manual control device control information Dm may be described instead of the automatic control device control information Da.
- The automatic control device control information Da indicates correction information representing restrictions to be applied when the data receiving device described later generates control data for each viewing environment device from the video data and audio data included in each segment in the automatic control mode; reference information used to extract a reference feature amount when the data receiving device applies the restrictions; calculation designation information indicating the calculation method for the reference feature amount; or control reference information (control reference data), which is control data designated by the content creator.
- The reference information indicates, for example, area information designating the target area from which a video feature amount (also referred to as an image feature amount), which is a feature amount of the video data, is extracted (calculated) in one or more representative frames of each segment.
- the correction information includes color correction information representing a color range allowed as a video feature amount and acoustic correction information representing a frequency region of a sound level allowed as an acoustic feature amount in the generated control data.
- luminance correction information representing a luminance range allowed as a video feature amount may be included.
- In the calculation designation information, area calculation designation information indicating the designated area calculation method, acoustic calculation designation information indicating the designated acoustic calculation method, and the like are described.
- The representative frame of each segment is determined either by selecting it in advance for each segment or by automatically selecting a certain frame of each segment.
- the control reference information indicates color reference information and acoustic reference information predetermined by the content creator, which is information of the same type as the video feature quantity and the acoustic feature quantity extracted from each frame.
- Using the automatic control device control information Da, the video feature amount or the acoustic feature amount is appropriately calculated from the video data to be displayed or the audio data to be reproduced, and based on the calculated feature amount, control data defining the driving strength of each viewing environment device is generated. The viewing environment devices are then controlled using this control data. Therefore, the driving strength of the viewing environment devices can be appropriately determined according to the content of the displayed video or audio (the scene setting in the story) and output to the viewing environment devices.
- the area information relates to information obtained by dividing the display screen of the video display device as shown in FIGS. 7 and 8 into one or more predetermined blocks. For example, as shown in FIG. 8, information such as block coordinates (x1, y1) and (x2, y2) may be stored.
- the area information is not limited to the information related to the blocks divided in units of pixels as described above. Even in the case of division in units of pixels, it is not necessary to make the number of pixels the same in the x direction or the y direction, and the size of the block may be changed freely.
- Region information such as the RegionLocator descriptor or the SpatioTemporalLocator descriptor of ISO/IEC 15938-3 (MPEG-7 Visual) may also be referenced.
- the block may change within a segment.
- Various linear or nonlinear interpolations may be performed on the size and shape of the block in sections where it changes.
- The image feature amount in the screen area designated by the area information is extracted from the representative frame and from each output frame, and control data for controlling the corresponding viewing environment device is generated using the extracted feature amounts. That is, as described later, the data receiving device calculates the video feature amount in the screen area designated by the area information in the representative frame and the video feature amount in the screen area designated by the same area information in each frame, calculates the final feature amount for each frame, and generates the control data.
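The per-area feature extraction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it computes the average color of the block designated by the coordinates (x1, y1)-(x2, y2) from FIG. 8, with the frame modeled as a nested list of RGB tuples; all names are assumptions.

```python
def block_average_color(frame, x1, y1, x2, y2):
    """Average (R, G, B) of the pixels inside the designated block.

    frame: list of rows, each row a list of (r, g, b) tuples.
    The block spans columns x1..x2-1 and rows y1..y2-1.
    """
    total = [0, 0, 0]
    count = 0
    for y in range(y1, y2):
        for x in range(x1, x2):
            r, g, b = frame[y][x]
            total[0] += r
            total[1] += g
            total[2] += b
            count += 1
    return tuple(t / count for t in total)

# The representative frame yields the reference feature amount; each output
# frame yields the per-frame feature amount used for the final feature amount.
frame = [[(100, 50, 0) for _ in range(8)] for _ in range(8)]
print(block_average_color(frame, 2, 2, 6, 6))  # (100.0, 50.0, 0.0)
```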
- Alternatively, the content creator may designate a desired color, and the above-described color information may be transmitted in the metadata as the automatic control device control information Da.
- For example, the brightness of lighting (unit: lx), the color temperature of lighting (unit: K), and the like are designated as illumination conditions. Candela (cd) and lumen (lm) may also be used to express brightness.
- Alternatively, the color temperature ranges may be categorized in advance and expressed using terms such as "hot", "warm", "moderate", and "cool".
- the color expression may use an XYZ color system, an RGB color system, a YCbCr color system, or the like instead of the color temperature. Note that both the color information and the area information may be transmitted, or only one of them may be transmitted.
- The color correction information represents the color range allowed as a final feature amount. Therefore, in the threshold processing of the viewing environment control method described later, instead of using the threshold set and held by the user in the threshold setting unit, a final feature amount within the allowable range can be obtained from the color correction information specified by the content creator and the color feature amount extracted in each frame. Control data for the viewing environment device is then generated from the final feature amount within the allowable range. That is, within the same segment, the change of the illumination color range can be limited to a desired allowable range.
- the acoustic information is specified as a frequency (unit: Hz), for example.
- the acoustic information relates to information specifying one or more sound frequencies as shown in FIG.
- information such as the wind power application frequency Fw and the vibration application frequency Fv may be stored.
- the acoustic information is not limited to the frequency designation. Even if the frequency is specified, the frequency range may be freely changed. For example, information such as Fw_s to Fw_e for the range of the applied wind frequency and Fv_s to Fv_e for the range of the vibration applied frequency may be stored.
- This range is the allowable range of the final feature amount. Therefore, in the threshold processing of the viewing environment control method described later, instead of using the threshold set and held by the user in the threshold setting unit, a final feature amount within the allowable range can be obtained from the acoustic correction information specified by the content creator and the sound pressure level (signal intensity), which is the feature amount of the audio data corresponding to the frame to be displayed. Control data for the viewing environment devices is then generated from the final feature amount within the allowable range. That is, within the same segment, changes in the wind force of the wind device and the vibration of the vibration device can be limited to a desired allowable range.
- the calculation designation information indicates a method for calculating the video feature quantity or the acoustic feature quantity.
- the video feature amount in the screen area is calculated according to the specified area calculation method.
- Examples of the area calculation method include calculation of the average color in a predetermined screen area, calculation of the color of a high-luminance area, calculation of a color corresponding to an illumination color temperature (correlated color temperature), and calculation of a color corresponding to a memory color (for example, blue sky, sunset, or fresh green), but the method is not limited to these.
- the sound pressure level (signal intensity) within the frequency range is calculated according to the specified acoustic calculation method.
- acoustic calculation methods include, but are not limited to, calculation of an average sound pressure level and a maximum sound pressure level within a frequency range.
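As a rough illustration of such an acoustic calculation, the sketch below estimates an average level (in dB) within a designated frequency band using a plain DFT. The patent does not prescribe any particular spectral method, and all names and the reference level are illustrative assumptions.

```python
import cmath
import math

def band_level_db(samples, sample_rate, f_lo, f_hi, ref=1.0):
    """Average spectral magnitude within [f_lo, f_hi] Hz, expressed in dB.

    A real implementation would use a proper PSD estimate; a direct DFT is
    used here only to keep the sketch self-contained.
    """
    n = len(samples)
    mags = []
    for k in range(n // 2 + 1):
        freq = k * sample_rate / n
        if f_lo <= freq <= f_hi:
            x = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            mags.append(abs(x) / n)
    if not mags:
        return float("-inf")
    avg = sum(mags) / len(mags)
    if avg <= 0.0:
        return float("-inf")
    return 20 * math.log10(avg / ref)

# A 5 Hz tone sampled at 100 Hz: the 0-10 Hz wind-application band sees far
# more energy than a band that excludes the tone.
tone = [math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]
print(band_level_db(tone, 100, 0, 10), band_level_db(tone, 100, 20, 30))
```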
- The automatic control device control information Da is used to generate control data for driving and controlling the viewing environment devices in the effect control data generation unit of the data receiving device described later.
- In the above description, each item of the automatic control device control information Da is set only for the representative frame in the segment, but it goes without saying that the automatic control device control information Da may be set for each frame.
- Viewing environment control system: the viewing environment control system according to the present embodiment will be described below with reference to FIG. 1.
- FIG. 1 is a schematic configuration diagram showing an embodiment of a viewing environment control system 100 according to the present invention.
- The viewing environment control system 100 includes a data receiving device (viewing environment control device) 2, a video display device (display device) 3, an audio playback device 4, a lighting device (viewing environment device) 5, a wind device (viewing environment device) 6, an air conditioner (viewing environment device) 7, and a vibration device (viewing environment device) 8.
- the viewing environment control system 100 is used together with the data transmission device 1.
- The data receiving device 2 receives broadcast data provided from the data transmission device 1 provided outside the viewing environment control system 100, and outputs the video data and audio data included in the broadcast data to the video display device 3 and the audio playback device 4. Further, the data receiving device 2 converts the metadata into drive control data for driving and controlling each viewing environment device, and outputs the drive control data to each viewing environment device such as the lighting device 5 in synchronization with the corresponding video data. A viewing environment suitable for the video displayed on the video display device 3 is thus formed.
- The data receiving device 2 includes a receiving unit 21, a data separation unit 22, delay generation units 23 and 24, an effect control data generation unit (calculation unit, correction unit) 25 serving as a drive control data generation unit, and a threshold setting unit 26.
- the receiving unit 21 receives broadcast data in which video data, audio data, and metadata are multiplexed transmitted from the transmission device 1.
- the receiving unit 21 demodulates broadcast data input from the transmission path and performs error correction.
- the data separator 22 separates and extracts video data, audio data, and metadata from the broadcast data received by the receiver 21.
- the separated video data and audio data are transmitted to the delay generators 23 and 24 together with a TC (time code) indicating the start time (start point) of the video data and audio data.
- the video data is sent to the video display device 3 via the delay generator 23.
- the audio data is sent to the audio reproduction device 4 via the delay generator 24.
- the separated video data and audio data are sent to the effect control data generation unit 25 together with the separated metadata.
- Based on the video data, audio data, and metadata separated by the data separation unit 22, the effect control data generation unit 25 generates illumination dimming data (RGB data), wind data (wind speed data), temperature data, and vibration data as drive control data for appropriately driving and controlling the lighting device 5, the wind device 6, the air conditioner 7, and the vibration device 8, respectively, which are installed in the viewer's actual viewing environment space.
- the lighting dimming data, wind power data, temperature data, and vibration data are output to the lighting device 5, the wind power device 6, the air conditioner 7, and the vibration device 8, respectively.
- the video feature amount and the acoustic feature amount used when generating the drive control data are extracted by the effect control data generation unit 25.
- the threshold setting unit 26 holds the threshold used by the effect control data generation unit 25 so that the effect control data generation unit 25 can use it.
- the effect control data generation unit 25 uses the threshold value stored in the threshold setting unit 26 to generate data for driving and controlling the viewing environment device installed in the viewing environment space of the user.
- The threshold held by the threshold setting unit 26 is set in advance; however, the setting may be changed according to a user operation, or may be variably set according to the content type (genre) of the video to be displayed.
- The delay generation units 23 and 24 delay the video data and audio data separated by the data separation unit 22 to synchronize each drive control data with the video data and audio data. For example, the video data and audio data separated by the data separation unit 22 are delayed by the time required for conversion into drive control data in the effect control data generation unit 25. As a result, the output timing of the illumination dimming data sent to the lighting device 5, the wind data sent to the wind device 6, the temperature data sent to the air conditioner 7, and the vibration data sent to the vibration device 8 can be matched with that of the video data and audio data.
- the data receiving device 2 of the viewing environment control system 100 may be provided integrally with the video display device 3 and the audio playback device 4, or may be provided separately from each other.
- The lighting device 5 can be configured, for example, by arranging LED light sources of R (red), G (green), and B (blue) colors, each of which can be independently controlled to emit light, at certain intervals.
- the illumination device 5 emits illumination light having a desired color and brightness using the LED light sources of these three primary colors.
- The lighting device 5 is not limited to a combination of LED light sources emitting light of predetermined colors, as long as the illumination color and brightness of the surrounding environment of the video display device 3 can be controlled.
- For example, it may be configured with white LEDs and color filters, and a combination of a white light bulb or fluorescent tube and a color filter, a color lamp, or the like can also be applied.
- the expression is not limited to R (red), G (green), and B (blue) colors, but may be expressed using, for example, illumination color temperature (unit: K).
- As the wind device 6, an electric fan or a blower can be used.
- As the air conditioner 7, an air-conditioning unit can be used.
- a vibration chair or a vibration mat can be used as the vibration device 8.
- FIG. 6 shows an example in which the lighting devices 5a, 5b, 5c, the wind power devices 6a, 6b, the air conditioner 7, and the vibration device 8 are driven in conjunction with each other.
- the effect control data generation unit 25 determines whether the control information Dr is the automatic control mode or the manual control mode, and derives effect control data suitable for each mode.
- In the manual control mode, the effect control data generation unit 25 converts the illumination data, wind data, temperature data, and vibration data constituting the manual control device control information Dm into data for driving and controlling the lighting device 5, the wind device 6, the air conditioner 7, and the vibration device 8 without correction, and outputs the data to each device.
- In the automatic control mode, the effect control data generation unit 25 calculates the video feature amount of the frame to be displayed and the acoustic feature amount of the audio data corresponding to that frame, using the reference information, calculation designation information, color correction information, acoustic correction information, and the like constituting the automatic control device control information Da.
- The effect control data generation unit 25 then generates data for driving and controlling the lighting device 5, the wind device 6, the air conditioner 7, and the vibration device 8 using the video feature amount and acoustic feature amount calculated with the automatic control device control information Da, and outputs the generated data to each device.
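The mode dispatch described here can be sketched as follows. This is a hedged illustration, not the patent's implementation: the control information Dr selects whether Dm is passed through unchanged (manual mode) or control data is derived from calculated feature amounts (automatic mode); all identifiers are assumptions.

```python
AUTO, MANUAL = "automatic", "manual"

def generate_effect_control_data(dr_mode, dm=None, derive_from_features=None):
    """Dispatch on the control information Dr of the current segment."""
    if dr_mode == MANUAL:
        # Dm (illumination / wind / temperature / vibration data) is
        # converted to drive control data without correction.
        return dict(dm)
    if dr_mode == AUTO:
        # Da-driven path: feature amounts are computed from the video and
        # audio data, then converted to drive control data.
        return derive_from_features()
    raise ValueError("unknown control mode in Dr: %r" % dr_mode)

print(generate_effect_control_data(MANUAL, dm={"illumination": (255, 200, 150)}))
```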
- Specifically, the effect control data generation unit 25 calculates the block average value of the video data of the predetermined pixels contained in the area designated by the area information in the representative frame of the segment including the frame output from the data receiving device 2 to the video display device 3, and uses this as the reference illumination data (when no reference illumination data is specified by the content creator).
- The reference illumination data is then converted into data indicating lightness, saturation (chroma), and hue (hereinafter, the lightness, saturation, and hue information is referred to as "color signal data").
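One way to derive such color signal data from an RGB reference value is sketched below. The patent does not name a specific color model; Python's standard HLS conversion is used here purely as an illustration.

```python
import colorsys

def rgb_to_color_signal(r, g, b):
    """(r, g, b) in 0..255 -> (lightness, saturation, hue), each in 0..1.

    colorsys.rgb_to_hls returns (hue, lightness, saturation); the tuple is
    reordered to match the lightness / saturation / hue wording of the text.
    """
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return l, s, h

# Pure red: mid lightness, maximal saturation, hue 0.
print(rgb_to_color_signal(255, 0, 0))  # (0.5, 1.0, 0.0)
```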
- Next, an embodiment of a method for generating illumination data for each frame in the corresponding segment will be described.
- FIG. 11 is a view for explaining threshold setting of illumination data in the automatic setting mode.
- The effect control data generation unit 25 calculates the color signal data in the area of each frame designated by the area information and performs threshold processing; that is, it calculates a video feature amount (or acoustic feature amount) that falls within the allowable range indicated by the threshold set by the user or by the correction information set by the content creator, thereby obtaining the final color signal data of each frame.
- the threshold value obtained based on the color correction information is set and held in the threshold value setting unit 26 in the same manner as the threshold value specified by the user.
- the illumination data corresponding to each frame for controlling the illumination color is set based on the color signal data of the reference illumination data.
- FIG. 12 is a flowchart illustrating an example of a process flow for determining illumination data from color signal data.
- the effect control data generation unit 25 acquires the metadata and video data output from the data separation unit 22, and acquires the color signal data from the metadata and video data by the above method (step S1).
- Next, the acquired color signal data is compared with Thresh_max (step S2). Since color signal data is generated for each video frame, the acquisition in step S1 and the comparison in step S2 are performed for each frame. If the acquired color signal data is larger than Thresh_max, Thresh_max is set as the final color signal data for the frame (step S5). If not, it is determined whether the acquired color signal data is smaller than Thresh_min (step S3). If it is smaller than Thresh_min, Thresh_min is set as the final color signal data for the frame (step S6). Otherwise, the acquired color signal data is used as it is as the final color signal data for the frame (step S4).
- the final color signal data determined by this threshold processing is the illumination data for that frame.
- the effect control data generation unit 25 determines whether there is a next frame in the same segment (step S7). If there is a next frame, the process returns to step S1. The color signal data of the next frame is acquired, and the above process is repeated.
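Treating the color signal data of each frame as a scalar, the flow above amounts to clamping each frame's value to the threshold range, as in this illustrative sketch (names are assumptions, not from the patent):

```python
def final_color_signal(per_frame_values, thresh_min, thresh_max):
    """Clamp each frame's color signal data to [thresh_min, thresh_max]."""
    out = []
    for v in per_frame_values:      # step S1: acquire per frame
        if v > thresh_max:          # step S2
            v = thresh_max          # step S5
        elif v < thresh_min:        # step S3
            v = thresh_min          # step S6
        out.append(v)               # step S4: otherwise use as-is
    return out                      # step S7: loop until no next frame

print(final_color_signal([0.2, 0.9, 0.5], 0.3, 0.7))  # [0.3, 0.7, 0.5]
```

Even if the raw per-frame values swing widely, the illumination data stays inside the allowable range defined for the segment.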
- As described above, when determining the illumination data, the illumination data is determined within the threshold range by referring to the thresholds as the color signal data representing the video feature amount of each frame increases or decreases. The illumination data is thereby kept within the allowable range defined for each segment. Therefore, even when the color signal data of each frame repeatedly increases and decreases sharply, the illumination data can be prevented from fluctuating violently in response, and stable switching control of the lighting device 5 can be realized.
- In the above description, the reference illumination data is calculated from the video data and audio data in the data receiving device 2; however, when reference data preset by the content creator can be received, reference illumination data received from the outside can be used. In this case as well, the final color signal data can be acquired by the processing shown in FIG. 12. Note that the reference illumination data specified by the content creator is described as color reference information in the automatic control device control information Da as shown in FIG.
- In this case, the final color signal data within the allowable range is acquired using the segment-unit color signal data received as reference illumination data from the outside, the color signal data that is the feature amount of each frame calculated from the video data, and the threshold. This processing can be rephrased as follows: the color signal data acquired from the outside as reference illumination data is compared with the color signal data of each frame calculated from the video data, and correction processing is performed so that the color signal data acquired as reference illumination data approaches the color signal data of each frame calculated from the video data.
- That is, when the color signal data of the reference illumination data exceeds Thresh_max, the final color signal data is set to Thresh_max, and when it falls below Thresh_min, the final color signal data is set to Thresh_min.
- Otherwise, the color signal data after the correction processing is used as the final color signal data. In this case as well, even if the color signal data of each frame repeatedly increases and decreases, the illumination data can be prevented from fluctuating greatly in response. In addition, the sense of realism can be improved more effectively by varying the reference control data, which reflects the intention of the content creator for each segment, in units of frames following the content of the displayed video data.
- In this way, the illumination by the lighting device 5 installed around the video display device 3 can be kept substantially constant based on the reference data within the same segment section.
- substantially constant means that the variation in illumination light within the same segment is within a range that does not impair the presence of the viewer.
- FIGS. 13 and 14 are diagrams illustrating another example for explaining threshold setting of illumination data according to reference illumination data in the automatic setting mode.
- In this example, the allowable change range is designated as the color difference ΔE with respect to the reference illumination data.
- FIG. 14 shows the level division of the color difference ⁇ E and the general degree of vision.
- To keep the illumination light substantially constant, the change should be within a range that can be treated as indistinguishable among the system colors in FIG. 14, that is, within the level range where the color difference ΔE is less than 13. More preferably, it should be within a range that can be treated as roughly the same color, that is, within the level range where the color difference ΔE is 6.5 or less.
- the allowable change range may directly specify threshold values for lightness, saturation, and hue, or may use the color difference ⁇ E.
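A check against such a ΔE limit can be sketched as follows. The CIE76 color-difference formula in CIELAB coordinates is assumed here; the patent only quotes ΔE level ranges, so the specific formula and function names are illustrative.

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def within_allowed_range(lab, lab_ref, max_delta_e=6.5):
    """True if the illumination color stays within the allowed ΔE of the
    reference illumination data (6.5 = 'roughly the same color' level)."""
    return delta_e(lab, lab_ref) <= max_delta_e

ref = (50.0, 10.0, 10.0)
print(within_allowed_range((52.0, 11.0, 9.0), ref))    # small shift -> True
print(within_allowed_range((70.0, 30.0, -20.0), ref))  # large shift -> False
```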
- When the color correction information is used, the allowable range is already defined as a color range. Therefore, when the illumination data is acquired using the color correction information, the color correction information can be used as it is to indicate the upper and lower limits with respect to the reference illumination data, without using the threshold set and held in the threshold setting unit 26.
- In the automatic control mode, the effect control data generation unit 25 also calculates the sound pressure levels (dB) at the predetermined wind application frequency (Hz) and vibration application frequency (Hz) stored in the acoustic information, and uses them as reference wind data (dB) and reference vibration data (dB), respectively.
- Alternatively, reference wind data and reference vibration data specified in advance by the content creator for each segment may be received and used.
- the reference wind power data and the reference vibration data are described as acoustic reference information in the automatic control device control information Da.
- the wind power application frequency 10 Hz is used when the frequency range is not specified, and 0 to 10 Hz is specified when the frequency range is specified.
- the vibration application frequency it is 50 Hz when the frequency range is not specified, and 0 to 125 Hz when the frequency range is specified.
- the reference wind power data is converted into wind power data indicating the wind speed (m / s).
- As a conversion method to wind data, for example, "50 dB → 0.5 m/s" and "80 dB → 2 m/s" may be converted with reference to a lookup table, but the conversion method is not limited to this.
- The converted value may be specified not only as an absolute wind speed but also as "strong wind", "weak wind", or the like.
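The lookup-table conversion can be sketched as below, using only the two table points quoted in the text ("50 dB → 0.5 m/s", "80 dB → 2 m/s"). Linear interpolation between the points and clamping at the table edges are assumptions for illustration.

```python
WIND_TABLE = [(50.0, 0.5), (80.0, 2.0)]  # (sound pressure dB, wind speed m/s)

def wind_speed_from_db(level_db):
    """Convert reference wind data (dB) to wind speed (m/s) via the table."""
    table = sorted(WIND_TABLE)
    if level_db <= table[0][0]:
        return table[0][1]          # clamp below the table
    if level_db >= table[-1][0]:
        return table[-1][1]         # clamp above the table
    for (d0, s0), (d1, s1) in zip(table, table[1:]):
        if d0 <= level_db <= d1:
            t = (level_db - d0) / (d1 - d0)
            return s0 + t * (s1 - s0)

print(wind_speed_from_db(50))  # 0.5
print(wind_speed_from_db(80))  # 2.0
print(wind_speed_from_db(65))  # midpoint of the table -> 1.25
```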
- In the automatic setting mode, thresholds are set for the reference wind data and the reference vibration data in the same manner as for the reference illumination data described above.
- The effect control data generation unit 25 calculates the sound pressure levels at the wind application frequency and the vibration application frequency of each frame, that is, the wind data and vibration data corresponding to each frame, acquires the final wind data and vibration data corresponding to each frame by threshold processing, and generates control data for controlling the viewing environment devices.
- the threshold used for the threshold processing is set and held in the threshold setting unit 26.
- the wind power data corresponding to each frame used for controlling the wind power and the vibration data corresponding to each frame used for controlling the vibration are set according to the reference wind data and the reference vibration data, respectively.
- That is, an upper limit and a lower limit of the change from each reference data, in other words an allowable range, are defined. For example, the upper limit of the sound pressure level is (reference wind data or reference vibration data) + 15, and the lower limit is (reference wind data or reference vibration data) − 10; the finally obtained wind data and vibration data can vary within this range.
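This asymmetric allowable range can be sketched directly, using the example limits of +15 and −10 given in the text (function names are illustrative):

```python
def clamp_to_reference(level_db, ref_db, upper=15.0, lower=10.0):
    """Limit a per-frame sound pressure level to [ref - lower, ref + upper]."""
    return min(max(level_db, ref_db - lower), ref_db + upper)

ref = 60.0
print(clamp_to_reference(90.0, ref))  # 75.0 (ref + 15)
print(clamp_to_reference(40.0, ref))  # 50.0 (ref - 10)
print(clamp_to_reference(62.0, ref))  # 62.0 (within range)
```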
- In the automatic control mode, the effect control data generation unit 25 also calculates the block average value of the video data of the predetermined pixels contained in the area designated by the area information in the representative frame of the segment displayed by the video display device 3, and uses this as the reference illumination data. Next, the color temperature (correlated color temperature) is estimated from the reference illumination data. This color temperature estimation process estimates the illumination state of the scene where the video was shot based on the feature amount of the video data.
- The processing method is not limited. For example, the method of the non-patent literature "Scene Illumination Estimation Process", Shoji Tominaga, Satoru Sakurai, B. A. Wandell, IEICE Technical Report, PRMU 99-184, 1999, can be applied.
- the color gamut occupied by the sensor output for each color temperature in the sensor space is obtained in advance, and the color temperature is estimated by examining the correlation between the color gamut and the acquired image pixel distribution.
- Specifically, (1) the color gamut occupied by the sensor output for each color temperature is obtained in advance, (2) the pixel values of the acquired image are normalized, (3) the normalized (R, B) coordinate values are plotted on the RB plane, and (4) the color gamut having the highest correlation with the (R, B) coordinate values of the target image is estimated as the color temperature of the target image.
- the color gamut is obtained every 500K, for example.
- a color gamut that can be occupied by the sensor output for each color temperature is defined in the color space in order to classify the scene illumination.
- RGB values of sensor outputs for various object surfaces are obtained under the spectral distribution of each color temperature.
- a two-dimensional illumination color gamut obtained by projecting these RGB convex hulls onto the RB plane is used. This illumination color gamut can be formed by the color gamut for every 500K occupied by the sensor output as described above.
- in the sensor correlation method, scaling processing of the image data is necessary to adjust the overall luminance difference between images.
- let the luminance of the i-th target pixel be I_i, and let I_max be its maximum value over the image. The sensor output is then normalized with respect to the RGB values and this maximum value.
- needless to say, the color signal and the luminance signal of a predetermined screen area included in the video data to be displayed may also be used as they are, as in the conventional example described above.
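The text does not reproduce the normalization formula for the scaling step, so the sketch below assumes a simple global scaling in which every pixel is multiplied by a common factor so that the brightest pixel reaches a fixed reference luminance; the luminance measure (R + G + B) and the reference value are assumptions:

```python
def scale_image(pixels, reference_max=255.0):
    """Scale all sensor outputs by one common factor so that the
    brightest pixel reaches `reference_max`, removing the overall
    luminance difference between images (one plausible reading of
    the scaling step; the exact formula is not given in the text)."""
    luminances = [r + g + b for r, g, b in pixels]  # assumed I_i = R + G + B
    i_max = max(luminances)                         # I_max over the image
    k = reference_max / i_max
    return [(r * k, g * k, b * k) for r, g, b in pixels]
```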
- the calculated color temperature is then converted into temperature data indicating temperature (°C).
- as the conversion method to temperature data, referring to the temperature conversion table shown in FIG. 15, conversion is performed such as "1667 ≤ color temperature < 2251 → Hot", "2251 ≤ color temperature < 4171 → Warm", "4171 ≤ color temperature < 8061 → Moderate", and "8061 ≤ color temperature < 25000 → Cool"; however, the conversion method is not limited to this.
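The table lookup of FIG. 15 can be illustrated as follows; the half-open interval boundaries are an assumption, since the exact inequality signs are not legible in the source text:

```python
def temperature_label(color_temperature):
    """Convert an estimated correlated color temperature (K) into the
    temperature data of FIG. 15 (boundary handling assumed half-open)."""
    table = [
        (1667, 2251, "Hot"),
        (2251, 4171, "Warm"),
        (4171, 8061, "Moderate"),
        (8061, 25000, "Cool"),
    ]
    for low, high, label in table:
        if low <= color_temperature < high:
            return label
    return None  # outside the 1667-25000 K range covered by the table

print(temperature_label(3000))  # Warm
```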
- the data generation method for driving and controlling the data receiving device 2 in the automatic control mode is not limited to this.
- data for driving and controlling the data receiving device 2 may also be created by other methods.
- metadata is added to video data and audio data and transmitted as part of broadcast data.
- the broadcast data does not include metadata itself.
- the data receiving device can realize an optimal viewing environment when reproducing the video data or audio data.
- the data receiving apparatus receives identification data for identifying metadata in advance.
- FIG. 16 is a block diagram showing a configuration of the external server device according to the present embodiment.
- the external server device 9 bears a part of the function of the data transmission device 1.
- the external server device 9 includes a request reception unit 91, a data storage unit 92, and a metadata transmission unit 93.
- the request receiving unit 91 receives a transmission request for metadata related to specific video data or audio data (content) from the data receiving device (viewing environment control device) 2 'side described later.
- the data storage unit 92 stores metadata for each content.
- the metadata transmission unit 93 transmits the metadata for which the transmission request has been received to the requesting data receiving device.
- the metadata stored in the data storage unit 92 describes a start time code of a segment (for example, a scene or a shot) of an arbitrary control unit intended by the content creator or the like.
- the metadata is transmitted from the metadata transmitting unit 93 to the requesting data receiving apparatus 2 ′ together with the TC (time code) indicating the start time of the segment.
- alternatively, an ID may be added to each segment (for example, a scene or a shot) of an arbitrary control unit intended by the content creator, and the metadata of the content for which the transmission request has been received may be transmitted from the metadata transmission unit 93, together with the segment ID, to the requesting data receiving device 2′. In this case, the metadata can be transmitted to the receiving device 2′ in units of segments.
- the external server device 9 can transmit metadata in units of segments when receiving a request from the outside.
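As an illustration, the per-segment metadata lookup performed by the external server device 9 might be sketched as follows; the store layout, field names, and content ID are hypothetical, not a format defined in this document:

```python
# Hypothetical per-segment metadata store; all field names are assumptions.
METADATA = {
    "content-001": [
        {"segment_id": 1, "start_tc": "00:00:00:00", "control": "auto"},
        {"segment_id": 2, "start_tc": "00:01:30:00", "control": "fixed",
         "illumination": {"color_temperature": 3000, "brightness": 0.4}},
    ],
}

def handle_request(content_id, segment_id=None):
    """Return the metadata for a whole content item, or for a single
    segment when the transmission request names a segment ID."""
    segments = METADATA.get(content_id, [])
    if segment_id is None:
        return segments
    return [s for s in segments if s["segment_id"] == segment_id]
```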
- the system configuration of the viewing environment control system 100 ′ including the data receiving device 2 ′ that receives the metadata transmitted from the external server device 9 and controls the viewing environment device will be described.
- the viewing environment control system 100 ′ includes a data receiving device 2 ′, a video display device 3, an audio reproduction device 4, an illumination device 5, a wind power device 6, an air conditioner 7, and a vibration device 8.
- the data receiving device 2 ′ has substantially the same configuration as the data receiving device 2 described above but, unlike the data receiving device 2, further includes a CPU (Central Processing Unit) 27, a request transmitting unit 28, and a metadata receiving unit 29.
- the request transmission unit 28 transmits a transmission request for metadata corresponding to video data (content) to be displayed to the external server device 9 via the communication network based on an instruction from the effect control data generation unit 25.
- An instruction from the effect control data generation unit 25 to the request transmission unit 28 is performed via the CPU 27.
- the metadata receiving unit 29 receives metadata transmitted from the external server device 9 via the communication network.
- the received metadata is transmitted to the effect control data generation unit 25 via the CPU 27.
- the effect control data generation unit 25 in the data reception device 2 ′ is the same as the effect control data generation unit 25 in the data reception device 2 except that it makes a metadata transmission request based on the received identification data.
- the viewing environment control system 100 ′ obtains metadata corresponding to the content from the external server device 9 even when the broadcast data does not include metadata, and controls the viewing environment devices based on this metadata. Therefore, similarly to the above-described embodiment, switching control of the viewing environment devices is performed at an arbitrary timing according to the intention of the video producer while suppressing an increase in the data amount, making it possible to realize device control in an optimal viewing environment.
- although the lighting device 5, the wind power device 6, the air conditioner 7, and the vibration device 8 have been described as examples of the peripheral devices arranged in the virtual viewing environment space, the peripheral devices are not limited to these viewing environment devices.
- the present invention can be applied to peripheral devices that affect the viewing environment, such as a scent generator.
- presentation effect information, such as how strongly a fragrance is generated, may be defined by the metadata.
- the content, that is, video data or audio data, is not limited to content related to a television program transmitted by television broadcasting; it may be content related to a work stored on a recording medium such as a Blu-ray Disc or DVD. That is, the input video data is not limited to data obtained by receiving a television broadcast, and the present invention can also be applied when video data reproduced by an external reproduction device is input.
- the present invention is not limited to the above-described embodiment, and various modifications can be made within the scope indicated in the claims.
- for example, the data receiving devices 2 and 2 ′ may be provided in the video display device 3, in which case, needless to say, the external viewing environment devices can be controlled based on various information included in the input video data. That is, embodiments obtained by appropriately combining technical means modified within the scope of the claims are also included in the technical scope of the present invention.
- the viewing environment control device according to the present invention is a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises receiving means for receiving reference information indicating a calculation area used when calculating the feature amount of the video data or audio data, and calculating means for calculating the feature amount of the video data or audio data in the calculation area indicated by the reference information.
- the reference information indicates a screen area when the feature amount of the video data is calculated.
- the reference information indicates a frequency region when the feature amount of the audio data is calculated.
- the reference information may be set in units of frames constituting video data or audio data, or in units of segments that are sets of the frames.
- the viewing environment control device according to the present invention is a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises receiving means for receiving calculation designation information indicating a calculation method used when calculating the feature amount of the video data or audio data, and calculating means for calculating the feature amount by the calculation method indicated by the calculation designation information.
- calculation designation information preferably specifies calculation of an average color within a predetermined screen area as a feature amount of the video data.
- the calculation designation information designates calculation of a high luminance area color within a predetermined screen area as a feature amount of the video data.
- the calculation designation information preferably designates calculation of a color corresponding to a correlated color temperature in a predetermined screen area as a feature amount of the video data.
- the calculation designation information designates calculation of a color corresponding to a memory color in a predetermined screen area as a feature amount of the video data.
- the calculation designation information may be set in units of frames constituting video data or audio data, or in units of segments that are sets of the frames.
- the viewing environment control device according to the present invention is a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises calculating means for calculating the feature amount of the video data or audio data, and correcting means for correcting the calculated feature amount so as to fall within the allowable range indicated by received correction information.
- the correction information indicates an allowable range related to at least one of brightness and color, which is calculated as a feature amount of the video data.
- the correction information indicates an allowable range related to at least one of a frequency and a sound pressure level, which is calculated as a feature amount of the audio data.
- the correction information may be set in units of frames constituting video data or audio data, or in units of segments that are sets of the frames.
- the viewing environment control device according to the present invention is a viewing environment control device that controls viewing environment equipment based on control reference data received from outside, and comprises calculating means for calculating a feature amount of video data or audio data received from outside, and correcting means for correcting the control reference data using the feature amount of the video data or audio data.
- control reference data is set in a segment unit which is a set of frames constituting the video data or audio data.
- a segment is defined as video data or audio data composed of one or more frames divided into arbitrary lengths in the time direction.
- the viewing environment device can be controlled for each segment unit.
- the viewing environment device can be controlled in accordance with the scene setting.
- the control reference data is preferably color reference information regarding at least one of brightness and color.
- control reference data is preferably acoustic reference information related to at least one of frequency and sound pressure level.
- the data transmission device according to the present invention is a data transmission device that transmits data to the viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises transmission means for transmitting reference information indicating a calculation area used when calculating the feature amount of the video data or audio data.
- the reference information indicates a screen area when calculating the feature amount of the video data.
- the reference information indicates a frequency region when the feature amount of the audio data is calculated.
- the data transmission device according to the present invention is a data transmission device that transmits data to the viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises transmission means for transmitting calculation designation information indicating a calculation method used when calculating the feature amount of the video data or audio data.
- the calculation designation information is to designate calculation of an average color in a predetermined screen area as the feature amount of the video data.
- the calculation designation information designates calculation of a high luminance area color within a predetermined screen area as a feature amount of the video data.
- the calculation designation information preferably designates calculation of a color corresponding to a correlated color temperature in a predetermined screen area as a feature amount of the video data.
- the calculation designation information designates calculation of a color corresponding to a memory color in a predetermined screen area as a feature amount of the video data.
- the data transmission device according to the present invention is a data transmission device that transmits data to the viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, and comprises transmission means for transmitting correction information indicating an allowable range used when calculating the feature amount of the video data or audio data.
- the correction information indicates an allowable range related to at least one of brightness and color, which is calculated as a feature amount of the video data.
- the correction information indicates an allowable range related to at least one of a frequency and a sound pressure level, which is calculated as a feature amount of the audio data.
- the data transmission device according to the present invention is a data transmission device that transmits control reference data for controlling a viewing environment device to the viewing environment control device, the control reference data being corrected on the viewing environment control device side using the feature amount of the video data or audio data.
- control reference data is preferably color reference information related to at least one of brightness and color.
- control reference data is preferably acoustic reference information related to at least one of frequency and sound pressure level.
- a viewing environment control system including the viewing environment control apparatus according to the present invention is also included in the category of the present invention.
- specifically, the viewing environment control system comprises the viewing environment control device according to the present invention, a display device that displays the video data, and a viewing environment device installed in the same viewing environment as the video display device.
- the viewing environment control device may be realized by a computer.
- a viewing environment control program that realizes the viewing environment control device on a computer by causing the computer to operate as each of the above means, and a computer-readable recording medium on which the viewing environment control program is recorded, also fall within the scope of the present invention.
- each unit included in the data transmitting device 1, the data receiving device 2, 2 ' may be configured by hardware logic. Alternatively, it may be realized by software using a CPU (Central Processing Unit) as follows.
- that is, the data transmission device 1 and the data reception devices 2 and 2 ′ include a CPU that executes the instructions of a control program (viewing environment control program) realizing each function, a ROM (Read Only Memory) that stores the control program, a RAM (Random Access Memory) into which the control program is expanded, and a storage device (recording medium) such as a memory that stores the control program and various data. The object of the present invention can also be achieved with a predetermined recording medium.
- the recording medium need only record, in a computer-readable manner, the program code (executable format program, intermediate code program, or source program) of the control programs of the data transmission device 1 and the data reception devices 2 and 2 ′, which are the software realizing the functions described above. This recording medium is supplied to the data transmission device 1 and the data reception devices 2 and 2 ′, whereby the data transmission device 1 and the data reception devices 2 and 2 ′ (or a CPU or MPU) as a computer read and execute the program code recorded on the supplied recording medium.
- the recording medium for supplying the program code to the data transmission device 1 and the data reception devices 2 and 2 ′ is not limited to a specific structure or type. For example, the recording medium may be a tape system such as a magnetic tape or cassette tape; a disk system including magnetic disks such as a floppy (registered trademark) disk or hard disk and optical disks such as a CD-ROM, MO, MD, DVD, or CD-R; a card system such as an IC card (including a memory card) or optical card; or a semiconductor memory system such as a mask ROM, EPROM, EEPROM, or flash ROM.
- the object of the present invention can be achieved.
- alternatively, the program code may be supplied to the data transmission device 1 and the data reception devices 2 and 2 ′ via a communication network.
- the communication network is not limited to a specific type or form as long as it can supply program codes to the data transmission device 1 and the data reception devices 2 and 2 ′.
- the Internet, intranet, extranet, LAN, ISDN, VAN, CATV communication network, virtual private network, telephone line network, mobile communication network, satellite communication network, etc. may be used.
- the transmission medium constituting the communication network may be any medium that can transmit the program code, and is not limited to a specific configuration or type.
- wired connections such as IEEE 1394, USB (Universal Serial Bus), power line carrier, cable TV lines, telephone lines, and ADSL (Asymmetric Digital Subscriber Line) lines, and wireless connections such as infrared (IrDA or remote control), Bluetooth (registered trademark), 802.11 wireless, HDR, mobile phone networks, satellite lines, and terrestrial digital networks can be used.
- the present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
- a data transmitting apparatus for transmitting video data composed of one or more frames, wherein the video data is divided into segments each composed of an arbitrary number of one or more frames, and segment delimiter information indicating the start time and/or section of each segment, control information indicating the control method of the viewing environment device when displaying each frame of the video data, and control reference information indicating the reference of the control method of the viewing environment device when displaying each frame of the video data are added to the video data in units of segments and transmitted.
- a data transmission device wherein the control information includes information indicating an automatic control mode for controlling the viewing environment device based on the control reference information when displaying all the frames in the segment.
- control reference information includes a feature amount of video data to be displayed when all the frames in the segment are displayed.
- control reference information includes a feature amount of acoustic data to be reproduced when displaying all the frames in the segment.
- a viewing environment control apparatus comprising: receiving means for receiving video data to be displayed on a display device, segment delimiter information indicating the start time and/or section of each segment obtained by dividing the video data into an arbitrary number of one or more frames, control information indicating, in association with each segment, the control method of the viewing environment device when displaying each frame constituting the video data, and control reference information indicating the reference of the control method of the viewing environment device when displaying each frame of the video data; and control means for controlling the driving of viewing environment equipment installed around the display device based on the control information.
- a viewing environment control system comprising the viewing environment control device according to any one of the seventh to eleventh configurations, a video/audio reproduction device for reproducing the video data and/or audio data, and a peripheral device arranged around the video/audio reproduction device.
- a viewing environment control method for controlling the driving of a viewing environment device according to a feature amount of video data or audio data to be displayed, comprising generating control information for the viewing environment device by performing a correction process on the feature amount of the video data or audio data.
- a data transmission method for transmitting video data composed of one or more frames, wherein the video data is divided into segments each composed of an arbitrary number of one or more frames, and segment delimiter information indicating the start time and/or section of each segment, control information indicating the control method of the viewing environment device when displaying each frame of the video data, and control reference information indicating the reference of the control method of the viewing environment device when displaying each frame of the video data are added to the video data in units of segments and transmitted.
- a viewing environment control method comprising: controlling the driving of viewing environment equipment installed around a display device.
- a viewing environment control apparatus comprising receiving means for receiving video data composed of a plurality of segments, and control means (delay generating section, effect control data generating section) for controlling viewing environment equipment, wherein, when outputting each frame of the video data to a display device, the control means determines the control content for the viewing environment device based on the frame feature amount, which is the feature amount of that frame, and the segment feature amount, which is the feature amount of the segment to which the frame belongs. The receiving means further receives descriptive data (metadata) including at least one of the segment feature amounts of the plurality of segments and reference information that the control means refers to in order to extract a segment feature amount from each of the plurality of segments.
- the frame feature amount includes, but is not limited to, frame color information, luminance information, and acoustic information corresponding to the frame, and can vary from frame to frame.
- the segment feature amount includes, but is not limited to, color information, luminance information, and acoustic information, and is predetermined for each segment.
- in the above configuration, the viewing environment control device receives video data composed of a plurality of segments, and also receives at least one of the segment feature amounts of the plurality of segments and reference information that the control means refers to in order to extract the segment feature amount from each of the plurality of segments.
- the segment feature is associated with each segment and is information used to control the viewing environment device.
- the reference information is information that is referenced when the viewing environment control device extracts the segment feature amount in the corresponding segment.
- the viewing environment control device controls the viewing environment device based on the received segment feature amount, or on the segment feature amount extracted from the frame with reference to the received reference information, together with the frame feature amount corresponding to the frame output to the display device.
- accordingly, the viewing environment device can be controlled in accordance with the contents of the segment, compared to controlling the viewing environment device using only the feature amount extracted from the frame output to the display device. Therefore, the viewing environment device can be appropriately controlled for each segment of the video, and can be controlled in accordance with the atmosphere or scene setting of the shooting scene intended by the video producer. As a result, it is possible to provide the user with a viewing environment with a high sense of realism.
- preferably, the control means performs a correction process on the frame feature amount of each frame belonging to the corresponding segment using each segment feature amount, and determines the control content based on the corrected feature amount, which is the frame feature amount after the correction process.
- with this configuration, when the viewing environment device is controlled based on the frame feature amount corresponding to the frame output to the display device, a predetermined restriction can be applied to the frame feature amount by the correction process using the segment feature amount. Accordingly, even when the frame feature amount in the output frames repeatedly increases and decreases, the driving intensity of the viewing environment device can be prevented from fluctuating greatly in response. This makes it possible to control the viewing environment without disturbing the atmosphere when displaying images in the same segment.
- preferably, the segment feature amount indicates a permissible range for the frame feature amount of each frame belonging to the corresponding segment, and the control means performs the correction process so that the frame feature amount of each frame belonging to the segment falls within the permissible range.
- in the above configuration, the viewing environment control apparatus performs the correction process on the frame feature amount of each frame and generates a corrected feature amount that falls within the allowable range indicated by the segment feature amount. The viewing environment device is then controlled using the corrected feature amount. Since all the corrected feature amounts used for controlling the viewing environment device fall within the allowable range set for each corresponding segment, the viewing environment can be prevented from changing greatly when images in the same segment are displayed.
- more preferably, the segment feature amount is a representative value of the frame feature amounts of the frames belonging to the corresponding segment (the image feature amount in a representative frame), the control means sets an allowable range for the frame feature amount of each frame belonging to the corresponding segment using the representative value and predetermined threshold data, and the control means performs the correction process so that the frame feature amount of each frame belonging to the segment falls within the allowable range.
- in the above configuration, the viewing environment control device sets the allowable range, that is, the range permitted for the variation of each frame feature amount, based on the segment feature amount, which is a representative value of the frame feature amounts, and the predetermined threshold data.
- the viewing environment control apparatus then performs the correction process on the frame feature amount of each frame, generates a corrected feature amount that falls within the set allowable range, and controls the viewing environment device using the corrected feature amount. Since all the corrected feature amounts (final feature amounts) used for controlling the viewing environment device fall within the allowable range set for each corresponding segment, the driving intensity of the viewing environment device can be prevented from fluctuating while images in the same segment are displayed. This makes it possible to control the viewing environment without disturbing the atmosphere when displaying images in the same segment.
- furthermore, since the allowable range can be set to a desired range by changing the threshold data, the allowable range can be changed on the user side.
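A minimal sketch of this correction process, assuming a scalar feature amount (for example, average luminance) and a symmetric allowable range of representative value ± threshold:

```python
def correct_frame_feature(frame_value, representative, threshold):
    """Clamp a per-frame feature value into the allowable range
    [representative - threshold, representative + threshold] defined
    by the segment feature amount and the threshold data."""
    low, high = representative - threshold, representative + threshold
    return min(max(frame_value, low), high)

# Frame luminances fluctuating around a segment representative of 100:
frames = [60, 95, 180, 110]
print([correct_frame_feature(v, 100, 20) for v in frames])  # [80, 95, 120, 110]
```

Changing the threshold widens or narrows the range, which is how the allowable range can be adjusted on the user side.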
- the descriptive data further includes designation data (control information) for designating a segment to be corrected by the control means, and the control means performs correction processing in the segment designated by the designation data.
- in the above configuration, the viewing environment control device detects the designation data and executes the correction process on the frame feature amount only within the segment designated by the designation data. For this reason, the viewing environment device can be controlled without performing the correction process in segments for which per-frame correction is unnecessary or for which the content provider does not desire per-frame correction.
- the viewing environment control device wherein the reference information is region information that defines a region in the frame from which the control means extracts the segment feature value.
- an area in the frame can be defined, and a segment feature amount can be extracted from video information recorded in the area. Therefore, the amount of data to be handled is smaller than when video information of the entire frame is used, and the control process can be executed more quickly.
- the viewing environment control device wherein the frame feature amount is color information related to at least one of brightness and color of the frame.
- the viewing environment control apparatus can control the viewing environment device using at least one of the brightness and color information in the frame. Accordingly, it is possible to appropriately control the brightness, hue, and temperature of the viewing environment, and it is possible to provide the user with a viewing environment that is more suitable for viewing video.
- the viewing environment control apparatus wherein the receiving means further receives audio data, and the frame feature amount is acoustic information of the audio data corresponding to an output frame.
- the viewing environment control apparatus can control the viewing environment device using the acoustic information. Accordingly, the viewing environment control apparatus can appropriately control the airflow and vibration of the viewing environment, and can provide the user with a viewing environment that is more suitable for viewing images.
- a data transmission device that transmits video data composed of a plurality of segments to a viewing environment control device that controls a viewing environment device, the data transmission device being configured to transmit, to the viewing environment control device, descriptive data including at least one of a segment feature amount representing, for each of the plurality of segments, the frame feature amount of each frame belonging to that segment, and reference information used to extract a segment feature amount from each of the plurality of segments, or identification data for identifying the descriptive data.
- in the above configuration, the data transmission device transmits the video data composed of a plurality of segments to the viewing environment control device, and also transmits the descriptive data including at least one of the segment feature amount and the reference information, or the identification data for identifying the descriptive data.
- the segment feature amount represents, for each of the plurality of segments, the frame feature amounts of the frames belonging to that segment.
- the reference information is information used to extract a segment feature amount from each of a plurality of segments.
- the viewing environment control device can control the viewing environment device using the segment feature amount defined for each segment.
- the viewing environment device can be appropriately controlled for each segment of the video, and the viewing environment device can be controlled in accordance with the atmosphere or scene setting of the shooting scene intended by the video producer.
- it is possible to provide the user with a viewing environment that offers a strong sense of realism.
- (26th configuration) A data transmission device including adding means (data multiplexing unit) for adding the description data or the identification data to the video data in units of segments, and transmitting the data generated by the adding means to the viewing environment control device.
- the data transmission device can add the description data to the video data and transmit the data generated by the addition to the viewing environment control device.
- in controlling viewing environment devices, control is mostly performed in units of scenes, in which the situation of the scene is constant, but control in units of shots, in which the shooting conditions are constant, is also common. In some cases, fine control on a per-frame basis is desirable.
- by delimiting segments in units in which the viewing environment device should usually be controlled consistently, depending on the content of the video data, description data or identification data can be described efficiently while suppressing an increase in the amount of data to be added.
- when data is transmitted to the viewing environment control device by the data transmission device, the viewing environment control device can generate, from the frame feature amount of each frame, data that falls within the allowable range indicated by the segment feature amount.
- the viewing environment control device can control the viewing environment device using the generated data. Since all the data used for controlling the viewing environment device falls within the allowable range set for each corresponding segment, large changes in the viewing environment can be prevented while video in the same segment is displayed.
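The range-limiting correction described above amounts to clamping each frame's feature values into the per-segment allowable range. The following is an illustrative sketch only, not part of the patent disclosure; it assumes RGB frame features and per-channel minimum/maximum bounds as the segment feature amount.

```python
def clamp_frame_features(frame_rgb, seg_min, seg_max):
    """Clamp each channel of a frame feature into the segment's allowable range."""
    return tuple(min(max(v, lo), hi)
                 for v, lo, hi in zip(frame_rgb, seg_min, seg_max))

# Frames whose features stray outside the segment range are pulled back inside,
# so the lighting never jumps far from the segment's overall tone.
corrected = clamp_frame_features((250, 40, 90),
                                 seg_min=(60, 60, 60),
                                 seg_max=(200, 200, 200))
# corrected == (200, 60, 90)
```

Because every corrected value stays inside the segment's range, sudden large swings of the illumination within one segment cannot occur.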
- the data transmission device wherein the segment feature value is a representative value of the frame feature values of the frames belonging to the corresponding segment.
- when data is transmitted from the data transmission device to the viewing environment control device, the viewing environment control device can set an allowable range of the frame feature amount for each frame using the representative value and predetermined threshold data.
- the viewing environment control device can generate data that falls within the set allowable range from the frame feature amount of each frame.
- the viewing environment control device can control the viewing environment device using the generated data. Since all the data used for controlling the viewing environment device falls within the allowable range set for each corresponding segment, large changes in the viewing environment can be prevented while video in the same segment is displayed.
- the viewing environment control apparatus can set the allowable range and prevent the driving intensity of the viewing environment device from fluctuating drastically. This makes it possible to control the viewing environment without disturbing the atmosphere when video in the same segment is displayed.
- since the allowable range can be set to a desired range by changing the threshold data, the allowable range can be adjusted on the user side.
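Deriving the allowable range from a representative value plus threshold data can be sketched as follows. This is an illustrative sketch, not the patented implementation; the symmetric per-channel range and the 0–255 clipping are assumptions.

```python
def allowable_range(representative, threshold):
    """Derive a per-channel allowable range from a segment's representative
    feature value and user-adjustable threshold data."""
    lo = tuple(max(0, r - t) for r, t in zip(representative, threshold))
    hi = tuple(min(255, r + t) for r, t in zip(representative, threshold))
    return lo, hi

# A wider threshold gives the per-frame control more freedom;
# a narrower one keeps the viewing environment closer to the segment's tone.
lo, hi = allowable_range(representative=(128, 128, 128), threshold=(30, 30, 30))
# lo == (98, 98, 98), hi == (158, 158, 158)
```

Changing the threshold data on the receiver side widens or narrows this range, which is how the user-side adjustment described above would take effect.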
- the data transmission device wherein the description data further includes designation data for designating the segment to be subjected, in the viewing environment control device, to processing of the frame feature amount using the segment feature amount.
- when data is transmitted to the viewing environment control apparatus by the data transmission apparatus, the viewing environment control apparatus can execute the correction process on the frame feature amount only within the segment designated by the designation data. Therefore, the viewing environment device can be controlled without performing the correction process on segments that do not require per-frame correction, or for which the content provider does not want per-frame correction.
- the data transmission device wherein the reference information is region information that defines the region in the frame from which the viewing environment control device extracts the segment feature amount.
- when data is transmitted to the viewing environment control device by the data transmission device, the viewing environment control device can define a region in the frame and extract the segment feature amount from the video information recorded in that region. Therefore, the amount of data to be handled is smaller than when the video information of the entire frame is used, and the control process can be executed more quickly.
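Region-based extraction as described above can be sketched like this. This is an illustrative sketch, not part of the disclosure; the frame is assumed to be a nested list of RGB tuples and the region an (x, y, w, h) rectangle.

```python
def region_average(frame, region):
    """Average the RGB values inside the rectangular region (x, y, w, h)
    designated by the reference information."""
    x, y, w, h = region
    acc = [0, 0, 0]
    n = 0
    for row in frame[y:y + h]:          # only the designated rows ...
        for px in row[x:x + w]:         # ... and columns are visited
            for c in range(3):
                acc[c] += px[c]
            n += 1
    return tuple(a / n for a in acc)
```

Only the pixels inside the designated region are read, so the work scales with the region size rather than the full frame, which matches the speed advantage claimed above.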
- the data transmission device wherein the frame feature amount is color information relating to at least one of the brightness and color of the frame.
- when data is transmitted to the viewing environment control device by the data transmission device, the viewing environment control device can control the viewing environment device using at least one of the brightness and color information of the frame. Accordingly, the brightness, hue, and color temperature of the viewing environment can be controlled appropriately, providing the user with a viewing environment better suited to viewing the video.
- the data transmission device wherein the frame feature amount is sound information related to sound.
- the viewing environment control device when data is transmitted to the viewing environment control device by the data transmission device, the viewing environment control device can control the viewing environment device using the acoustic information. Accordingly, the viewing environment control apparatus can appropriately control the airflow and vibration of the viewing environment, and can provide the user with a viewing environment that is more suitable for viewing images.
- a viewing environment control system comprising: the viewing environment control device according to the present invention; a display device that displays the video data; and a viewing environment device installed in the same viewing environment as the video display device.
- (34th configuration) A viewing environment control system comprising: the viewing environment control device; a display device that displays the video data; an audio playback device that plays back the audio data; and a viewing environment device installed in the same viewing environment as the video display device and the audio playback device.
- a viewing environment control method including: a receiving step in which a viewing environment control device receives video data composed of a plurality of segments; and a control step in which the viewing environment control device controls a viewing environment device, determining the control content for the viewing environment device when each frame of the video data is output to a display device based on a frame feature amount that is the feature amount of the frame and a segment feature amount that is the feature amount of the segment to which the frame belongs.
- the viewing environment control method is configured to further receive description data including at least one of the segment feature amounts of the plurality of segments and reference information that the viewing environment control device refers to in order to extract the segment feature amounts from each of the plurality of segments.
- a data transmission method comprising: an adding step of adding, to the video data for each segment, description data including at least one of a segment feature amount representing the frame feature amounts and reference information used to extract the segment feature amount from each of the plurality of segments, or identification data for identifying the description data; and a transmitting step of transmitting the data generated by the adding step.
- a data transmission method configured to transmit the description data in units of segments in response to an external request.
- the present invention can be suitably used for a viewing environment control system that controls the viewing environment of a user.
- 1 Data transmission device
- 2, 2' Data reception device (viewing environment control device)
- 3 Video display device (display device)
- 4 Audio playback device
- 5, 5a, 5b, 5c Lighting device (viewing environment device)
- 6, 6a, 6b Wind device (viewing environment device)
- 7 Air conditioner (viewing environment device)
- 8 Vibration device (viewing environment device)
- 9 External server device
- 11 Data multiplexing unit
- 12 Transmission unit (transmission means)
- 13 Data encoding unit
- 21 Reception unit (receiving means)
- 22 Data separation unit
- 23 Delay generation unit
- 24 Delay generation unit
- 25 Effect control data generation unit (calculation means, correction means)
- 26 Threshold setting unit
- 27 CPU
- 28 Request transmission unit
- 29 Metadata reception unit
- 91 Request reception unit
- 92 Data storage unit
- 93 Metadata transmission unit
- 100, 100' Viewing environment control system
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Automation & Control Theory (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Studio Devices (AREA)
- Circuit Arrangement For Electric Light Sources In General (AREA)
Abstract
Description
One embodiment of the present invention is described below with reference to Figs. 1 to 15. In this embodiment, a lighting device, an air conditioner, a fan, and a vibration device are described as examples of the peripheral devices arranged in the viewing environment space, but the invention is not limited to these devices as long as the device controls the viewing environment. For example, the present invention is also applicable to a scent generating device.
Fig. 2 is a block diagram showing a schematic configuration of an embodiment of the data transmission device according to the present invention. The data transmission device transmits video data, audio data, and metadata to a viewing environment control system described later. As shown in Fig. 2, the data transmission device 1 includes a data multiplexing unit 11, a transmission unit (transmission means) 12, and a data encoding unit 13.
Next, the metadata is described below with reference to Figs. 3 to 5 and 7 to 10.
Fig. 4 shows the structure of the segment delimiter information Ds. The segment delimiter information is information for identifying a segment. In this embodiment, as shown in Fig. 4, the segment delimiter information Ds consists of a segment number, a start time (start point), and interval information. The segment start time is designated by a time code added to indicate the playback time of the video and audio data; for example, as shown in Fig. 4, it consists of information indicating hours (h):minutes (m):seconds (s):frames (f) of the video data. The segment interval consists of information indicating, in hours (h):minutes (m):seconds (s):frames (f), the period from the start time code of the segment to the start time code of the next segment.
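The h:m:s:f time codes used in the segment delimiter information Ds can be converted to frame counts as in the following sketch. This is illustrative only and not part of the disclosure; a frame rate of 30 fps is assumed.

```python
def timecode_to_frames(tc, fps=30):
    """Convert an h:m:s:f time code, as used in the segment delimiter
    information Ds, into an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def segment_interval_frames(start_tc, next_start_tc, fps=30):
    # The segment interval runs from this segment's start time code
    # up to the start time code of the next segment.
    return timecode_to_frames(next_start_tc, fps) - timecode_to_frames(start_tc, fps)

# A one-second segment at 30 fps spans 30 frames.
n = segment_interval_frames("00:00:01:00", "00:00:02:00")
# n == 30
```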
The control information Dr specifies, for each segment, whether the viewing environment devices are controlled using the manual-control device control information Dm or the automatic-control device control information Da. That is, it indicates the control method applied to each viewing environment device when the frames contained in the segment are output to the video display device. As shown in Fig. 3, in this embodiment two control methods, an automatic control mode and a manual control mode, are defined, and one of the modes is described and designated in the control method designation field as the control information Dr for each segment.
The manual-control device control information Dm is control data that the content producer designates for each viewing environment device in order to reproduce the environmental conditions at the time the video data contained in each segment was shot. Specifically, it describes illumination data representing the shooting illumination used when the video data was shot, wind data representing wind force, temperature data representing temperature, vibration data representing vibration, and so on.
The automatic-control device control information Da is any of the following, used in the automatic control mode: correction information indicating the restrictions applied when the data reception device described later generates control data for each viewing environment device from the video and audio data contained in each segment; reference information used to extract the feature amount that serves as the basis when the data reception device applies the restrictions; calculation designation information indicating the calculation method for computing that reference feature amount; or control reference information (control reference data), which is control data designated by the content producer.
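The per-segment metadata described above could be modeled as in the following sketch. All field names are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SegmentMetadata:
    """Sketch of the per-segment metadata; field names are hypothetical."""
    segment_number: int
    start_timecode: str        # h:m:s:f, from the segment delimiter information Ds
    interval_timecode: str     # h:m:s:f length up to the next segment's start
    control_mode: str          # "auto" or "manual" (control information Dr)
    manual_control: Optional[dict] = None  # lighting/wind/temperature/vibration data (Dm)
    auto_control: Optional[dict] = None    # correction, reference, calculation
                                           # designation, or control reference data (Da)

meta = SegmentMetadata(1, "00:00:00:00", "00:00:05:00", "auto",
                       auto_control={"correction": {"max_delta": 30}})
```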
The viewing environment control system according to this embodiment is described below with reference to Fig. 1.
The data reception device 2 includes a reception unit (receiving means) 21, a data separation unit 22, delay generation units 23 and 24, an effect control data generation unit (calculation means, correction means) 25 serving as drive control data generation means, and a threshold setting unit 26.
As the lighting device 5, for example, a device in which R (red), G (green), and B (blue) LED light sources, each of which can be controlled to emit light independently, are arranged at regular intervals can be used. The lighting device 5 emits illumination light of a desired color and luminance using these three primary-color LED light sources. However, the lighting device 5 only needs to be configured so that the illumination color and brightness of the surroundings of the video display device 3 can be controlled, and is not limited to a combination of LED light sources emitting predetermined colors. For example, it may be composed of white LEDs and color filters, or a combination of a white light bulb or fluorescent tube and color filters, color lamps, and the like may be applied. Furthermore, the expression is not limited to the R (red), G (green), and B (blue) components; for example, the illumination color temperature (unit: K) may be used.
Next, a method of controlling each viewing environment device using the metadata is described below with reference to Figs. 11 to 15.
Lmax = max( (Ri^2 + Gi^2 + Bi^2)^(1/2) )
Then, the normalized (R, B) coordinate values are plotted on the RB plane onto which the illumination color gamut has been projected. This illumination color gamut is used as a reference gamut and compared with the plotted coordinate values of the target image. The reference gamut with the highest correlation to the coordinate values of the target image is selected, and the color temperature is determined from the selected reference gamut.
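The normalization by Lmax and the color-temperature selection on the RB plane can be sketched as follows. This is illustrative only: the nearest-centroid comparison is a simple stand-in for the correlation-based selection described above, and the reference centroids are made-up sample values, not actual illumination gamut data.

```python
import math

def normalized_rb(pixels):
    """Normalize pixels by Lmax = max(sqrt(R^2 + G^2 + B^2)), then average
    the normalized (R, B) coordinates on the RB plane."""
    lmax = max(math.sqrt(r * r + g * g + b * b) for r, g, b in pixels)
    rs = [r / lmax for r, _, _ in pixels]
    bs = [b / lmax for _, _, b in pixels]
    return sum(rs) / len(rs), sum(bs) / len(bs)

def nearest_color_temperature(rb, reference_gamuts):
    """Pick the reference gamut (color temperature in K) whose (R, B)
    centroid is closest to the target image's coordinates."""
    return min(reference_gamuts,
               key=lambda kv: (kv[1][0] - rb[0]) ** 2 + (kv[1][1] - rb[1]) ** 2)[0]

# Hypothetical reference centroids; warm light is R-heavy, cool light B-heavy.
refs = [(3000, (0.8, 0.2)), (5000, (0.6, 0.5)), (6500, (0.5, 0.7))]
temp = nearest_color_temperature((0.78, 0.25), refs)
# temp == 3000
```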
Another embodiment of the present invention is described below with reference to Figs. 16 and 17. For convenience of description, members having the same functions as those used in the foregoing embodiment are given the same reference numerals, and their description is omitted.
The embodiments described above can also be expressed as follows.
A data transmission device for transmitting video data composed of one or more frames, wherein the video data is divided into segments each composed of an arbitrary number (one or more) of frames, and segment delimiter information indicating the start time and/or interval of each segment, control information indicating the control method of the viewing environment devices when each frame of the video data is displayed, and control reference information indicating the basis of the control method of the viewing environment devices when each frame of the video data is displayed are added to the video data in segment units and transmitted.
The data transmission device according to the first configuration, wherein the control information includes information designating an automatic control mode in which the viewing environment devices are controlled based on the control reference information when all frames in the segment are displayed.
The data transmission device according to the first or second configuration, wherein the control reference information includes a feature amount of the video data to be displayed when all frames in the segment are displayed.
The data transmission device according to the third configuration, wherein the feature amount of the video data includes information on a screen region.
The data transmission device according to the third configuration, wherein the feature amount of the video data includes information on the brightness or color of the video data.
The data transmission device according to the first or second configuration, wherein the control reference information includes a feature amount of the audio data to be reproduced when all frames in the segment are displayed.
A viewing environment control device comprising: receiving means for receiving video data to be displayed on a display device, segment delimiter information indicating the start time and/or interval of segments obtained by dividing the video data into an arbitrary number (one or more) of frames, control information associated with the video data in segment units and indicating the control method of the viewing environment devices when each frame constituting the video data is displayed, and control reference information indicating the basis of the control method of the viewing environment devices when each frame of the video data is displayed; and control means for controlling, based on the control information, the driving of viewing environment devices installed around the display device.
The viewing environment control device according to the seventh configuration, wherein the control means controls the driving of the viewing environment devices using a feature amount of the video data.
The viewing environment control device according to the eighth configuration, wherein the feature amount of the video data includes information on a screen region.
The viewing environment control device according to the eighth configuration, wherein the feature amount of the video data includes information on the brightness or color of the video data.
The viewing environment control device according to the seventh configuration, wherein the control means controls the driving of the viewing environment devices using a feature amount of audio data.
A viewing environment control system comprising: the viewing environment control device according to any of the seventh to eleventh configurations; a video/audio reproduction device for reproducing the video data and/or audio data; and peripheral devices installed around the video/audio reproduction device.
A viewing environment control method for controlling the driving of viewing environment devices according to a feature amount of video data or acoustic data to be displayed, wherein control information for the viewing environment devices is generated by applying correction processing to the feature amount of the video data or acoustic data.
A data transmission method for transmitting video data composed of one or more frames, wherein the video data is divided into segments each composed of an arbitrary number (one or more) of frames, and segment delimiter information indicating the start time and/or interval of each segment, control information indicating the control method of the viewing environment devices when each frame of the video data is displayed, and control reference information indicating the basis of that control method are added to the video data in segment units and transmitted.
A data transmission method for transmitting, in response to an external request, control information indicating the control method of the viewing environment devices when each frame constituting video data is displayed, wherein segment delimiter information indicating the start time and/or interval of segments obtained by dividing the video data into an arbitrary number (one or more) of frames, the control information, and control reference information are transmitted in segment units.
A viewing environment control method comprising: receiving video data to be displayed on a display device, segment delimiter information indicating the start time and/or interval of segments obtained by dividing the video data into an arbitrary number (one or more) of frames, control information indicating the control method of the viewing environment devices when each frame of the video data is displayed, and control reference information indicating the basis of that control method; and controlling, based on the control information, the driving of viewing environment devices installed around the display device.
A viewing environment control device comprising: receiving means for receiving video data composed of a plurality of segments; and control means (delay generation unit, effect control data generation unit) for controlling a viewing environment device, the control means determining the control content for the viewing environment device when each frame of the video data is output to a display device based on a frame feature amount, which is the feature amount of the frame, and a segment feature amount, which is the feature amount of the segment to which the frame belongs, wherein the receiving means further receives description data (metadata) including at least one of the segment feature amount of each of the plurality of segments and reference information that the control means refers to in order to extract the segment feature amount from each of the plurality of segments.
A viewing environment control device wherein the control means applies correction processing to the frame feature amount of each frame belonging to the corresponding segment using each segment feature amount, and determines the control content based on the corrected feature amount (corrected data), which is the frame feature amount after the correction processing.
(19th configuration)
Preferably, the segment feature amount indicates an allowable range of the frame feature amount of each frame belonging to the corresponding segment, and the control means performs the correction processing so that the frame feature amount of each frame belonging to the segment falls within the allowable range.
(20th configuration)
A viewing environment control device wherein the segment feature amount is a representative value of the frame feature amounts of the frames belonging to the corresponding segment (the image feature amount of a representative frame), the control means sets an allowable range of the frame feature amount of each frame belonging to the corresponding segment using the representative value and predetermined threshold data, and the control means performs the correction processing so that the frame feature amount of each frame belonging to the segment falls within the allowable range.
A viewing environment control device wherein the description data further includes designation data (control information) designating the segment to be subjected to the correction processing by the control means, and the control means performs the correction processing in the segment designated by the designation data.
A viewing environment control device wherein the reference information is region information defining the region within the frame from which the control means extracts the segment feature amount.
A viewing environment control device wherein the frame feature amount is color information relating to at least one of the brightness and color of the frame.
A viewing environment control device wherein the receiving means further receives audio data, and the frame feature amount is acoustic information of the audio data corresponding to the frame to be output.
A data transmission device that transmits video data composed of a plurality of segments to a viewing environment control device that controls a viewing environment device, and transmits to the viewing environment control device, for each of the plurality of segments, description data including at least one of a segment feature amount representing the frame feature amounts, which are the feature amounts of the frames belonging to the segment, and reference information used to extract the segment feature amount from each of the plurality of segments, or identification data for identifying the description data.
A data transmission device comprising adding means (data multiplexing unit) for adding the description data or the identification data to the video data in segment units, and transmitting the data generated by the adding means to the viewing environment control device.
In controlling viewing environment devices, control is mostly performed in units of scenes, in which the situation of the scene is constant, but control is also often performed in units of shots, in which the shooting conditions are constant. There are also cases in which fine-grained control on a per-frame basis is desirable. By delimiting segments in units in which the viewing environment devices should usually be controlled consistently, depending on the content of the video data, description data or identification data can be described efficiently while suppressing an increase in the amount of added data.
A data transmission device wherein the segment feature amount indicates an allowable range of the frame feature amount of each frame belonging to the corresponding segment.
(28th configuration)
A data transmission device wherein the segment feature amount is a representative value of the frame feature amounts of the frames belonging to the corresponding segment.
A data transmission device wherein the description data further includes designation data designating the segment to be subjected, in the viewing environment control device, to processing of the frame feature amount using the segment feature amount.
A data transmission device wherein the reference information is region information defining the region within the frame from which the viewing environment control device extracts the segment feature amount.
A data transmission device wherein the frame feature amount is color information relating to at least one of the brightness and color of the frame.
A data transmission device wherein the frame feature amount is acoustic information relating to sound.
A viewing environment control system comprising the viewing environment control device according to the present invention, a display device that displays the video data, and a viewing environment device installed in the same viewing environment as the video display device.
A viewing environment control system comprising the viewing environment control device according to the present invention, a display device that displays the video data, an audio reproduction device that reproduces the audio data, and a viewing environment device installed in the same viewing environment as the video display device and the audio reproduction device.
A viewing environment control method including: a receiving step in which a viewing environment control device receives video data composed of a plurality of segments; and a control step in which the viewing environment control device controls a viewing environment device, determining the control content for the viewing environment device when each frame of the video data is output to a display device based on a frame feature amount, which is the feature amount of the frame, and a segment feature amount, which is the feature amount of the segment to which the frame belongs, wherein in the receiving step, description data including at least one of the segment feature amount of each of the plurality of segments and reference information that the viewing environment control device refers to in order to extract the segment feature amount from each of the plurality of segments is further received.
A data transmission method for transmitting video data composed of a plurality of segments to a viewing environment control device that controls a viewing environment device, wherein, for each of the plurality of segments, description data including at least one of a segment feature amount representing the frame feature amounts of the frames belonging to the segment and reference information used to extract the segment feature amount from each of the plurality of segments, or identification data for identifying the description data, is transmitted to the viewing environment control device.
A data transmission method for transmitting video data composed of a plurality of segments to a viewing environment control device that controls a viewing environment device, the method including: an adding step of adding, to the video data for each segment, description data including at least one of a segment feature amount representing the frame feature amounts of the frames belonging to the segment and reference information used to extract the segment feature amount from each of the plurality of segments, or identification data for identifying the description data; and a transmitting step of transmitting the data generated by the adding step.
A data transmission method for transmitting, to a viewing environment control device that controls a viewing environment device when video data composed of a plurality of segments is displayed, description data including at least one of a segment feature amount representing the frame feature amounts of the frames belonging to each segment and reference information used to extract the segment feature amount from each of the plurality of segments, the description data being transmitted in segment units in response to an external request.
Claims (36)
- A viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving means for receiving reference information indicating a calculation region used when calculating the feature amount of the video data or audio data; and
a calculation means for calculating the feature amount of the video data or audio data in the calculation region indicated by the reference information. - The viewing environment control device according to claim 1, wherein the reference information indicates a screen region used when calculating the feature amount of the video data.
- The viewing environment control device according to claim 1, wherein the reference information indicates a frequency region used when calculating the feature amount of the audio data.
- A viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving means for receiving calculation designation information indicating a calculation method used when calculating the feature amount of the video data or audio data; and
a calculation means for calculating the feature amount of the video data or audio data by the calculation method indicated by the calculation designation information. - The viewing environment control device according to claim 4, wherein the calculation designation information designates calculation of an average color within a predetermined screen region as the feature amount of the video data.
- The viewing environment control device according to claim 4, wherein the calculation designation information designates calculation of a high-luminance region color within a predetermined screen region as the feature amount of the video data.
- The viewing environment control device according to claim 4, wherein the calculation designation information designates calculation of a color corresponding to a correlated color temperature within a predetermined screen region as the feature amount of the video data.
- The viewing environment control device according to claim 4, wherein the calculation designation information designates calculation of a color corresponding to a memory color within a predetermined screen region as the feature amount of the video data.
- A viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving means for receiving correction information indicating an allowable range used when calculating the feature amount of the video data or audio data; and
a calculation means for calculating the feature amount of the video data or audio data within the allowable range indicated by the correction information. - The viewing environment control device according to claim 9, wherein the correction information indicates an allowable range relating to at least one of brightness and color, calculated as the feature amount of the video data.
- The viewing environment control device according to claim 9, wherein the correction information indicates an allowable range relating to at least one of frequency and sound pressure level, calculated as the feature amount of the audio data.
- A viewing environment control device that controls a viewing environment device based on control reference data received from outside, comprising:
a calculation means for calculating a feature amount of video data or audio data received from outside; and
a correction means for correcting the control reference data using the feature amount of the video data or audio data. - The viewing environment control device according to claim 12, wherein the control reference data is set for each segment, a segment being a set of frames constituting the video data or audio data.
- The viewing environment control device according to claim 12 or 13, wherein the control reference data is color reference information relating to at least one of brightness and color.
- The viewing environment control device according to claim 12 or 13, wherein the control reference data is acoustic reference information relating to at least one of frequency and sound pressure level.
- A data transmission device that transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a transmission means for transmitting reference information indicating a calculation region used when calculating the feature amount of the video data or audio data. - The data transmission device according to claim 16, wherein the reference information indicates a screen region used when calculating the feature amount of the video data.
- The data transmission device according to claim 16, wherein the reference information indicates a frequency region used when calculating the feature amount of the audio data.
- A data transmission device that transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a transmission means for transmitting calculation designation information indicating a calculation method used when calculating the feature amount of the video data or audio data. - The data transmission device according to claim 19, wherein the calculation designation information designates calculation of an average color within a predetermined screen region as the feature amount of the video data.
- The data transmission device according to claim 19, wherein the calculation designation information designates calculation of a high-luminance region color within a predetermined screen region as the feature amount of the video data.
- The data transmission device according to claim 19, wherein the calculation designation information designates calculation of a color corresponding to a correlated color temperature within a predetermined screen region as the feature amount of the video data.
- The data transmission device according to claim 19, wherein the calculation designation information designates calculation of a color corresponding to a memory color within a predetermined screen region as the feature amount of the video data.
- A data transmission device that transmits data to a viewing environment control device that controls a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a transmission means for transmitting correction information indicating an allowable range used when calculating the feature amount of the video data or audio data. - The data transmission device according to claim 24, wherein the correction information indicates an allowable range relating to at least one of brightness and color, calculated as the feature amount of the video data.
- The data transmission device according to claim 24, wherein the correction information indicates an allowable range relating to at least one of frequency and sound pressure level, calculated as the feature amount of the audio data.
- A data transmission device that transmits, to a viewing environment control device, control reference data for controlling a viewing environment device,
wherein the control reference data is set for each segment, a segment being a set of frames constituting video data or audio data. - The data transmission device according to claim 27, wherein the control reference data is color reference information relating to at least one of brightness and color.
- The data transmission device according to claim 28, wherein the control reference data is acoustic reference information relating to at least one of frequency and sound pressure level.
- A viewing environment control system comprising: the viewing environment control device according to any one of claims 1 to 15;
a display device that displays the video data; and
a viewing environment device installed in the same viewing environment as the display device. - A viewing environment control method for controlling a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving step of receiving reference information indicating a calculation region used when calculating the feature amount of the video data or audio data; and
a calculation step of calculating the feature amount of the video data or audio data in the calculation region indicated by the reference information. - A viewing environment control method for controlling a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving step of receiving calculation designation information indicating a calculation method used when calculating the feature amount of the video data or audio data; and
a calculation step of calculating the feature amount of the video data or audio data by the calculation method indicated by the calculation designation information. - A viewing environment control method for controlling a viewing environment device based on a feature amount of video data or audio data composed of one or more frames, comprising:
a receiving step of receiving correction information indicating an allowable range used when calculating the feature amount of the video data or audio data; and
a calculation step of calculating the feature amount of the video data or audio data within the allowable range indicated by the correction information. - A viewing environment control method for controlling a viewing environment device based on control reference data received from outside, comprising:
a calculation step of calculating a feature amount of video data or audio data received from outside; and
a correction step of correcting the control reference data using the feature amount of the video data or audio data. - A viewing environment control program for operating the viewing environment control device according to any one of claims 1 to 15, the program causing a computer to function as each of the above means.
- A computer-readable recording medium on which the control program according to claim 35 is recorded.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010548414A JP5442643B2 (ja) | 2009-01-27 | 2010-01-26 | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御方法および視聴環境制御システム |
CN2010800048532A CN102282849A (zh) | 2009-01-27 | 2010-01-26 | 数据发送装置、数据发送方法、视听环境控制装置、视听环境控制方法以及视听环境控制系统 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009015775 | 2009-01-27 | ||
JP2009-015775 | 2009-01-27 | ||
JP2009098396 | 2009-04-14 | ||
JP2009-098396 | 2009-04-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010087155A1 true WO2010087155A1 (ja) | 2010-08-05 |
Family
ID=42395420
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/000438 WO2010087155A1 (ja) | 2009-01-27 | 2010-01-26 | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御方法および視聴環境制御システム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP5442643B2 (ja) |
CN (1) | CN102282849A (ja) |
WO (1) | WO2010087155A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013236241A (ja) * | 2012-05-09 | 2013-11-21 | Keio Gijuku | 香り発生デバイスコントローラー及びそれを含むシステム |
JP2014204316A (ja) * | 2013-04-05 | 2014-10-27 | 日本放送協会 | 音響信号再生装置、音響信号作成装置 |
JPWO2015083299A1 (ja) * | 2013-12-02 | 2017-03-16 | パナソニックIpマネジメント株式会社 | 中継装置、連動システム、配信装置、中継装置の処理方法およびプログラム |
EP3163891A4 (en) * | 2014-06-24 | 2017-10-25 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN111243615A (zh) * | 2020-01-08 | 2020-06-05 | 环鸿电子(昆山)有限公司 | 麦克风阵列信号处理方法及手持式装置 |
US20220116684A1 (en) * | 2020-10-12 | 2022-04-14 | Arris Enterprises Llc | Set-top box ambiance and notification controller |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104613595A (zh) * | 2014-12-30 | 2015-05-13 | 广东美的制冷设备有限公司 | 空调器的控制方法、系统、服务器和用户客户端 |
CN106792168A (zh) * | 2016-12-09 | 2017-05-31 | 北京小米移动软件有限公司 | 智能设备的控制方法和装置 |
CN107178870A (zh) * | 2017-05-04 | 2017-09-19 | 珠海格力电器股份有限公司 | 多媒体数据播放设备、空调控制方法及装置 |
CN113360682B (zh) * | 2021-05-21 | 2023-03-21 | 青岛海尔空调器有限总公司 | 信息处理方法和装置 |
CN114608167A (zh) * | 2022-02-28 | 2022-06-10 | 青岛海尔空调器有限总公司 | 室内环境的智能调节方法与智能调节系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001343900A (ja) * | 2000-05-31 | 2001-12-14 | Matsushita Electric Ind Co Ltd | 照明システムおよび照明制御データ作成方法 |
JP2005531909A (ja) * | 2002-07-04 | 2005-10-20 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 周辺光および照明ユニットを制御する方法およびシステム |
WO2006112170A1 (ja) * | 2005-03-30 | 2006-10-26 | Sharp Kabushiki Kaisha | 液晶表示装置 |
WO2007119277A1 (ja) * | 2006-03-20 | 2007-10-25 | Sharp Kabushiki Kaisha | 視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法 |
WO2007123008A1 (ja) * | 2006-04-21 | 2007-11-01 | Sharp Kabushiki Kaisha | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法 |
WO2007145064A1 (ja) * | 2006-06-13 | 2007-12-21 | Sharp Kabushiki Kaisha | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法 |
JP2008252667A (ja) * | 2007-03-30 | 2008-10-16 | Matsushita Electric Ind Co Ltd | 動画イベント検出装置 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008252267A (ja) * | 2007-03-29 | 2008-10-16 | Japan Radio Co Ltd | 高周波電力増幅器 |
- 2010-01-26 CN CN2010800048532A patent/CN102282849A/zh active Pending
- 2010-01-26 WO PCT/JP2010/000438 patent/WO2010087155A1/ja active Application Filing
- 2010-01-26 JP JP2010548414A patent/JP5442643B2/ja not_active Expired - Fee Related
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013236241A (ja) * | 2012-05-09 | 2013-11-21 | Keio Gijuku | 香り発生デバイスコントローラー及びそれを含むシステム |
JP2014204316A (ja) * | 2013-04-05 | 2014-10-27 | 日本放送協会 | 音響信号再生装置、音響信号作成装置 |
JPWO2015083299A1 (ja) * | 2013-12-02 | 2017-03-16 | パナソニックIpマネジメント株式会社 | 中継装置、連動システム、配信装置、中継装置の処理方法およびプログラム |
EP3163891A4 (en) * | 2014-06-24 | 2017-10-25 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN111243615A (zh) * | 2020-01-08 | 2020-06-05 | 环鸿电子(昆山)有限公司 | 麦克风阵列信号处理方法及手持式装置 |
CN111243615B (zh) * | 2020-01-08 | 2023-02-10 | 环鸿电子(昆山)有限公司 | 麦克风阵列信号处理方法及手持式装置 |
US20220116684A1 (en) * | 2020-10-12 | 2022-04-14 | Arris Enterprises Llc | Set-top box ambiance and notification controller |
US11627377B2 (en) * | 2020-10-12 | 2023-04-11 | Arris Enterprises Llc | Set-top box ambiance and notification controller |
Also Published As
Publication number | Publication date |
---|---|
JP5442643B2 (ja) | 2014-03-12 |
CN102282849A (zh) | 2011-12-14 |
JPWO2010087155A1 (ja) | 2012-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5442643B2 (ja) | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御方法および視聴環境制御システム | |
US11856200B2 (en) | Data output apparatus, data output method, and data generation method | |
US10402681B2 (en) | Image processing apparatus and image processing method | |
US11140354B2 (en) | Method for generating control information based on characteristic data included in metadata | |
JP5092015B2 (ja) | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法 | |
JP6566271B2 (ja) | 伝送方法及び再生装置 | |
CN110213459B (zh) | 显示方法和显示装置 | |
CN110460745B (zh) | 显示装置 | |
US20110188832A1 (en) | Method and device for realising sensory effects | |
WO2010007987A1 (ja) | データ送信装置、データ受信装置、データ送信方法、データ受信方法および視聴環境制御方法 | |
JP2011259354A (ja) | 視聴環境制御システム、送信装置、受信装置 | |
JP5074864B2 (ja) | データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法 | |
JP6751893B2 (ja) | 再生方法、再生装置、表示方法及び表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080004853.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10735627 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2010548414 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10735627 Country of ref document: EP Kind code of ref document: A1 |