JP4399087B2 - Lighting system, video display device, and lighting control method

Lighting system, video display device, and lighting control method

Info

Publication number
JP4399087B2
Authority
JP
Japan
Prior art keywords
lighting
image data
control data
image
frame
Prior art date
Legal status
Active
Application number
JP2000163828A
Other languages
Japanese (ja)
Other versions
JP2001343900A (en)
JP2001343900A5 (en)
Inventor
健次郎 橋本
正 矢野
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date
Filing date
Publication date
Application filed by Panasonic Corporation
Priority to JP2000163828A
Publication of JP2001343900A
Publication of JP2001343900A5
Application granted
Publication of JP4399087B2
Application status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHTING NOT OTHERWISE PROVIDED FOR
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of the light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to an image display device for displaying image data, a lighting system used with such an image display device, and a lighting control method.
[0002]
[Prior art]
In the multimedia era, progress in technological development in the image and sound fields has been remarkable. In particular, larger displays have made it possible to enjoy realistic images. However, increasing the size of a display is limited by cost and installation space. There is therefore a need for a technique that can enhance the sense of reality when viewing images without using a very large display.
[0003]
As one such technique, there is the technique of controlling illumination in conjunction with an image. Since the resources and costs required to manufacture lighting devices are much smaller than those of large displays, technology that controls lighting in conjunction with images is also effective from the perspectives of reducing costs, saving energy, and preventing global warming.
[0004]
As a technique for controlling illumination in conjunction with an image, for example, “light color variable illumination device” disclosed in JP-A-3-184203 is known.
[0005]
In this prior art, the pixels of skin-color portions such as a human face are removed from the image displayed on the screen of the image display device, and the remaining portion is regarded as the background; the average chromaticity and average luminance are obtained from the RGB signals and luminance signals of the pixels of the background portion. A method is disclosed of controlling the illumination so that the chromaticity and luminance of the wall surface behind the image display device become the same as the average chromaticity and average luminance of the entire screen or of the background portion excluding human skin color.
[0006]
[Problems to be solved by the invention]
Since the prior art disclosed in Japanese Patent Laid-Open No. 3-184203 is premised on the use of a single light source, it cannot provide a high sense of realism. To obtain a high sense of presence, a plurality of light sources is necessary. However, a technology for obtaining a high sense of presence using a plurality of light sources had not been realized, because it was not known how a plurality of light sources should be controlled to that end. The prior art disclosed in Japanese Patent Laid-Open No. 3-184203 does not mention the use of a plurality of light sources.
[0007]
The present invention has been made in consideration of the above problems, and an object of the present invention is to provide a lighting system, an image display device, and a lighting control method capable of providing a high sense of realism.
[0008]
[Means for Solving the Problems]
An illumination system of the present invention is an illumination system used with an image display device that displays image data. It comprises a calculation unit that creates illumination control data based on the image data, and a plurality of lighting fixtures controlled based on the illumination control data. The calculation unit cuts out predetermined regions from an image frame of the image data and, from the image information of each cut-out region, creates the illumination control data for controlling the lighting fixture corresponding to that region. When the hue of a predetermined region changes due to a change of image frame in the image data, the calculation unit creates the illumination control data such that the hue is changed after the saturation or lightness of the lighting fixture to be controlled has been lowered. The above object is thereby achieved.
The illumination control method of the present invention is a method for controlling lighting fixtures in conjunction with image data displayed on an image display device. It comprises a step of cutting out predetermined areas from an image frame of the image data; a step of creating illumination control data such that, when the hue of a predetermined area cut out from the image frame changes due to a change in the image frame of the image data, the hue is changed after the saturation or lightness of the lighting fixture corresponding to that area has been lowered; and a step of controlling the plurality of lighting fixtures by the illumination control data. The above object is thereby achieved.
[0021]
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 shows a configuration of a lighting system 1 according to the present invention. The illumination system 1 is used with the image display device 100.
[0022]
The illumination system 1 includes the video signal input unit 18, which receives the video signal 401 and converts it into the image data 411; the calculation unit 12, which creates the illumination control data 420 based on the image data 411 and outputs the image data 412 and the illumination control data 420; the illumination control data input unit 5, which receives the illumination control data 420 and converts it into the control voltage 430 input to the plurality of lighting fixtures 201 to 206; and the plurality of lighting fixtures 201 to 206.
[0023]
The video signal input unit 18 receives the video signal 401 from the video playback device 110, which is provided outside the illumination system 1, and converts the video signal 401 into the image data 411. The image data 411 is output to the calculation unit 12.
[0024]
The video signal input unit 18 can be, for example, a video capture board. The video playback device 110 can be, for example, a DVD-ROM playback device or an S-VHS video playback device, and the video signal 401 can be, for example, an S-VHS signal. The video playback device 110 may also be a television broadcast receiver or a satellite broadcast receiver. The image data 411 and the image data 412 are represented, for example, in the AVI file format, but the format of the image data is not limited to this.
[0025]
FIG. 2 shows the configuration of the calculation unit 12. The calculation unit 12 includes a CPU 10, a main memory 16, and an auxiliary storage device 15. The CPU 10 controls and monitors the entire calculation unit 12 and executes an illumination control data creation program 17 stored in the auxiliary storage device 15. The main memory 16, which is accessed by the CPU 10, temporarily stores data such as the image data 411, the image data 412, and the illumination control data 420, as well as data necessary for executing the illumination control data creation program 17.
[0026]
The auxiliary storage device 15 stores the illumination control data creation program 17 and may also be used to temporarily store data such as the image data 411, the image data 412, and the illumination control data 420. Any recording medium can be used in the auxiliary storage device 15 to store the illumination control data creation program 17; for example, a hard disk, MO, floppy disk, MD, DVD, IC card, or optical card can preferably be used.
[0027]
The calculation unit 12 receives the image data 411 and creates the illumination control data 420 based on it. The created illumination control data 420 is output in synchronization with the image data 412. For example, the calculation unit 12 may temporarily store the received image data 411 in the auxiliary storage device 15, create the illumination control data 420 based on the image data stored in the auxiliary storage device 15, and output the illumination control data 420 and the image data 412 at the same timing.
[0028]
Alternatively, the image data 411 may be delayed by the time necessary for creating the illumination control data 420 and output as the image data 412, so that the illumination control data 420 and the image data 412 are output at the same timing even while the image data 411 is still being received. Outputting the illumination control data 420 and the image data 412 at the same timing while receiving the image data 411 in this manner is referred to as near real-time processing. Note that outputting at the same timing means outputting the image data 412 and the corresponding illumination control data 420 simultaneously.
[0029]
FIG. 3 shows the structure of the lighting fixture 201. The lighting fixture 201 includes fluorescent tubes 3R, 3G, and 3B and lighting circuits 7R, 7G, and 7B. The fluorescent tubes 3R, 3G, and 3B are assigned to the R (red), G (green), and B (blue) color elements, respectively. Each of the lighting circuits 7R, 7G, and 7B is, for example, an inverter lighting circuit. When the control voltage 430 is input to the lighting circuits 7R, 7G, and 7B, the fluorescent tubes 3R, 3G, and 3B emit light at luminance values corresponding to the control voltage 430. The luminance values of the fluorescent tubes 3R, 3G, and 3B can be controlled independently. For example, if each of the fluorescent tubes 3R, 3G, and 3B is controlled with 256 gradations by a control voltage of 0 to 5 V, the lighting fixture 201 can emit about 16.7 million colors (= 256 × 256 × 256). Note that the number of fluorescent tubes included in the lighting fixture 201 is not limited to three; for example, a plurality of fluorescent tubes may be assigned to a color element with a low luminance value to supplement it.
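As a minimal sketch of this dimming scheme (the linear mapping from gradation level to control voltage is an assumption; the text only states that 0 to 5 V yields 256 gradations per tube):

```python
def gradation_to_voltage(level: int, v_max: float = 5.0) -> float:
    """Map an 8-bit gradation level (0-255) to a 0-v_max control voltage."""
    if not 0 <= level <= 255:
        raise ValueError("gradation level must be in 0..255")
    return v_max * level / 255.0

# One voltage per tube (R, G, B): 256**3 combinations, about 16.7 million colors.
control_voltages = tuple(gradation_to_voltage(v) for v in (128, 200, 64))
```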
[0030]
Each of the lighting fixtures 202 to 206 used in the lighting system 1 may have the same structure as the lighting fixture 201.
[0031]
FIG. 4 shows an example in which lighting fixtures included in the lighting system 1 of the present invention are laid out in a room.
[0032]
The lighting fixture 201 is disposed near the ceiling above the image display device 100. The lighting fixture 202 is disposed near the floor behind the image display device 100. The luminaires 204 and 203 are disposed near the ceiling and the floor, respectively, on the right side of the image display device 100. The luminaires 206 and 205 are disposed near the ceiling and the floor, respectively, on the left side of the image display device 100. Here, the right and left sides are defined from the viewpoint of the viewer 150 who watches the image data displayed on the image display device 100.
[0033]
The lighting fixtures 203′ to 206′ and the lighting fixtures 2055 and 2601 will be described later; they may have the same structure as the lighting fixture 201.
[0034]
How the plurality of lighting fixtures 201 to 206 are controlled is described below.
[0035]
FIG. 5 shows the structure of the image data 300 for one frame. The image data 300 for one frame includes a plurality of peripheral areas 301 to 306. Image data is represented as a time series of frames; the number of frames included in one second of image data is, for example, 30.
[0036]
The basic method of controlling the plurality of lighting fixtures 201 to 206 in conjunction with the image data is to control them so that the plurality of peripheral regions shown in FIG. 5 are perceived as extending outward from the center 309 of the image data 300. Specifically, the lighting fixture 201 is controlled so that the color it emits is substantially the same as the representative color of the peripheral region 301 of the image data 300. Similarly, the lighting fixtures 202 to 206 are controlled so that the colors they emit are substantially the same as the representative colors of the peripheral regions 302, 303, 304, 305, and 306, respectively.
[0037]
An area 351 (the hatched area in FIG. 5) obtained by extending the peripheral area 301 shown in FIG. 5 outward from the center 309 of the image display device 100 is referred to as the extended area of the peripheral area 301. The luminaire associated with the peripheral region 301 is disposed in the extended region 351; alternatively, it may be disposed outside the extended region 351 and arranged so as to illuminate the extended region 351. In this way, the luminaire associated with each peripheral area illuminates the corresponding extended area. By controlling the plurality of lighting fixtures in this way, the viewer 150 perceives the image as extending outward and obtains a high sense of realism.
[0038]
FIG. 6 shows the data flow when the illumination control data 420 is created from the image data 300 for one frame. The illumination control data 420 is created based on the image information 455 of the peripheral areas 301 to 306 of the image data 300 for one frame; this processing is performed by executing the illumination control data creation program 17 in the calculation unit 12. The illumination control data 420 includes the dimming signals 421 to 426, which control the lighting fixtures 201 to 206, respectively. Each dimming signal includes three input levels V_RM′, V_GM′, and V_BM′ for controlling the fluorescent tubes 3R, 3G, and 3B assigned to the R, G, and B color elements of the luminaire. Each of the input levels V_RM′, V_GM′, and V_BM′ is represented by, for example, 0 to 5 V.
[0039]
A dimming signal 421 for controlling the lighting fixture 201 is created based on the image information of the peripheral region 301 of the image data 300. Similarly, dimming signals 422 to 426 for controlling the lighting fixtures 202 to 206 are created based on the image information of the peripheral areas 302 to 306 of the image data 300.
[0040]
FIG. 7 shows the processing procedure of the illumination control data creation program 17. The illumination control data creation program 17 is executed for each frame of image data, and one set of illumination control data 420 is created per frame. The processing procedure of the illumination control data creation program 17 is described below step by step.
[0041]
Step S1: One frame's worth of image data is read into the main memory 16 (FIG. 2). The image data may be read directly from the video signal input unit 18 into the main memory 16, or image data previously stored in the auxiliary storage device may be read into the main memory 16. The read image data includes a plurality of peripheral areas, whose arrangement is predetermined. In the description of the following steps, one of the peripheral areas included in the image data is referred to as the peripheral area D.
[0042]
Step S2: A representative color representing the peripheral area D is defined. The peripheral region D includes a plurality of pixels, and each pixel includes three subpixels assigned to the three color elements R (red), G (green), and B (blue). The image data includes information (image information) for causing these subpixels to emit light at desired luminance levels. The luminance levels of the three subpixels of a pixel are denoted V_R, V_G, and V_B; each is represented by, for example, a value from 0 to 255, but is not limited thereto. Let V_RM be the value obtained by averaging V_R (simple average) over all pixels included in the peripheral region D, V_GM the value obtained by averaging V_G over all pixels in D, and V_BM the value obtained by averaging V_B over all pixels in D. The representative color of the peripheral region D is then (V_RM, V_GM, V_BM).
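A minimal sketch of step S2, assuming the peripheral region is held as an H × W × 3 array of 0-255 levels (the data layout and the function name `representative_color` are illustrative, not from the patent):

```python
import numpy as np

def representative_color(region: np.ndarray) -> tuple:
    """Simple average of R, G, B over all pixels of peripheral region D."""
    v_rm, v_gm, v_bm = region.reshape(-1, 3).mean(axis=0)
    return (float(v_rm), float(v_gm), float(v_bm))
```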
[0043]
Step S3: Input/output correction of the image display device is performed. That is, the representative color (V_RM, V_GM, V_BM) is converted into the luminance values (L_RM, L_GM, L_BM). This input/output correction of the image display device will be described later with reference to FIG. 8.
[0044]
Step S4: The luminance values (L_RM, L_GM, L_BM) obtained by the input/output correction of the image display device are converted into the color amounts (X_C, Y_C, Z_C). This conversion is performed by the relational expression shown in the following expression (1).
[0045]
[Expression 1]
(X_C, Y_C, Z_C)^T = M_DIS (L_RM, L_GM, L_BM)^T   (1)
[0046]
Here, M_DIS is a 3 × 3 matrix that depends on the characteristics of the image display device. The color amounts (X_C, Y_C, Z_C) do not depend on the characteristics of the image display device.
[0047]
Step S5: The color amounts (X_C, Y_C, Z_C) are converted into the color perception amounts (H, L, C), where H, L, and C represent hue, lightness, and saturation, respectively. This conversion is performed by the relational expressions shown in the following expressions (2) to (6).
[0048]
a = 500 [ (X_C / X_n)^(1/3) − (Y_C / Y_n)^(1/3) ]   (2)
b = 200 [ (Y_C / Y_n)^(1/3) − (Z_C / Z_n)^(1/3) ]   (3)
H = atan(b / a)   (4)
L = 116 (Y_C / Y_n)^(1/3) − 16   (5)
C = (a^2 + b^2)^(1/2)   (6)
X_n, Y_n, and Z_n are constants.
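A sketch of steps S4 and S5 under assumed values: `M_DIS` is a placeholder identity matrix and the constants X_n, Y_n, Z_n are set to the D65 white point, neither of which the patent specifies. `arctan2` is used for (4) so that the hue angle lands in the correct quadrant.

```python
import numpy as np

M_DIS = np.eye(3)                    # placeholder for the 3x3 display matrix
XN, YN, ZN = 95.047, 100.0, 108.883  # assumed D65 white-point constants

def to_color_perception(l_rgb):
    """(L_RM, L_GM, L_BM) -> (H, L, C) via expressions (1)-(6)."""
    xc, yc, zc = M_DIS @ np.asarray(l_rgb, dtype=float)        # expression (1)
    a = 500.0 * ((xc / XN) ** (1 / 3) - (yc / YN) ** (1 / 3))  # (2)
    b = 200.0 * ((yc / YN) ** (1 / 3) - (zc / ZN) ** (1 / 3))  # (3)
    h = np.arctan2(b, a)                                       # (4) hue
    l = 116.0 * (yc / YN) ** (1 / 3) - 16.0                    # (5) lightness
    c = np.hypot(a, b)                                         # (6) saturation
    return h, l, c
```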
[0049]
Step S6: Color correction is performed, and the corrected color perception amounts (H′, L′, C′) are obtained. Color correction is performed by the relational expressions shown in the following expressions (7) to (9).
[0050]
H′ = H + P_H   (7)
L′ = L × P_L   (8)
C′ = C × P_C   (9)
Here, P_H, P_L, and P_C are parameters for color correction. By performing color correction on the color perception amounts (H, L, C) in this way, the hue, lightness, and saturation can each be corrected independently. For example, performing color correction with P_H = 0, P_L = 1, and P_C = 1.1 in expressions (7) to (9) emphasizes only the saturation, that is, the vividness of the colors. Color correction may be performed arbitrarily according to the taste of the viewer.
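The correction itself is a three-line computation; a sketch using the example parameter values from the text (P_H = 0, P_L = 1, P_C = 1.1):

```python
def color_correct(h, l, c, p_h=0.0, p_l=1.0, p_c=1.1):
    """Expressions (7)-(9); p_c = 1.1 emphasizes saturation only."""
    return h + p_h, l * p_l, c * p_c
```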
[0051]
Step S7: The corrected color perception amounts (H′, L′, C′) are converted back into the color amounts (X_C′, Y_C′, Z_C′). This conversion is performed by the relational expressions shown in the following expressions (10) to (14).
[0052]
a = C′ × cos H′   (10)
b = C′ × sin H′   (11)
Y_C′ = [ (L′ + 16) / 116 ]^3 × Y_n   (12)
X_C′ = [ a / 500 + (Y_C′ / Y_n)^(1/3) ]^3 × X_n   (13)
Z_C′ = [ −b / 200 + (Y_C′ / Y_n)^(1/3) ]^3 × Z_n   (14)
Step S8: The color amounts (X_C′, Y_C′, Z_C′) are converted into the luminance values (L_RM′, L_GM′, L_BM′) of the luminaire associated with the peripheral region D. This conversion is performed by the relational expression shown in the following expression (15).
[0053]
[Expression 2]
(L_RM′, L_GM′, L_BM′)^T = M_LAMP (X_C′, Y_C′, Z_C′)^T   (15)
[0054]
Here, M_LAMP is a 3 × 3 matrix that depends on the characteristics of the luminaire associated with the peripheral region D. L_RM′, L_GM′, and L_BM′ are the luminance values emitted by the fluorescent tubes 3R, 3G, and 3B assigned to the R, G, and B color elements of the lighting fixture. The unit of the luminance values is, for example, [cd/cm²].
[0055]
Step S9: Input/output correction of the lighting fixture is performed. That is, the luminance values (L_RM′, L_GM′, L_BM′) are converted into the dimming signal (V_RM′, V_GM′, V_BM′). V_RM′, V_GM′, and V_BM′ are the input levels required for the fluorescent tubes 3R, 3G, and 3B of the lighting fixture to emit light at the luminance values L_RM′, L_GM′, and L_BM′, respectively. The input/output correction of the lighting fixture will be described later with reference to FIG. 9.
[0056]
Step S10: It is determined whether the processing of steps S2 to S9 has been completed for all peripheral areas included in the image data for one frame. If the determination is Yes, the process of creating the illumination control data for one frame of image data ends; the illumination control data is the set of dimming signals for all peripheral areas included in the image data for one frame. If the determination is No, the process returns to step S2 and is performed for another peripheral area.
[0057]
Note that when P_H = 0, P_L = 1, and P_C = 1 in step S6 of the processing procedure shown in FIG. 7, the color amounts (X_C, Y_C, Z_C) and (X_C′, Y_C′, Z_C′) are identical. In this case, steps S5 to S7 can be omitted.
[0058]
FIG. 8 shows the input/output characteristics of the image display device 100. In FIG. 8, the horizontal axis V_R represents the luminance level of the color element R (red) input to the image display device 100, and the vertical axis L_R represents the output level of the image display device 100, that is, the luminance value of the color element R (red). The luminance level is represented by, for example, a value of 0 to 255, and the luminance value is expressed in, for example, cd/cm². The relationship between V_R and L_R is expressed by the following equation (16).
[0059]
L_R = f_RDIS(V_R)   (16)
Here, f_RDIS is a function representing the input/output characteristics of the color element R (red) of the image display device 100. Such a function is defined by, for example, a polynomial and incorporated in the illumination control data creation program 17.
[0060]
Although FIG. 8 shows the input/output characteristics only for the color element R (red), the input/output characteristics for the color elements G (green) and B (blue) are similarly defined by functions f_GDIS and f_BDIS. The process of converting the representative color (V_RM, V_GM, V_BM) into the luminance values (L_RM, L_GM, L_BM) in step S3 of FIG. 7 is performed by the following equations (17) to (19).
[0061]
L_RM = f_RDIS(V_RM)   (17)
L_GM = f_GDIS(V_GM)   (18)
L_BM = f_BDIS(V_BM)   (19)
The dotted line in FIG. 8 shows how L_RM is obtained from V_RM.
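A sketch of expression (17), assuming a cubic polynomial for f_RDIS; the coefficients below are purely illustrative placeholders, since the patent only says that the function is defined by a polynomial:

```python
import numpy as np

# Hypothetical cubic polynomial for f_RDIS (coefficients are placeholders).
F_RDIS = np.poly1d([1.2e-6, 0.0, 0.05, 0.0])

def display_luminance(v_rm: float) -> float:
    """L_RM = f_RDIS(V_RM), expression (17); f_GDIS and f_BDIS are analogous."""
    return float(F_RDIS(v_rm))
```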
[0062]
FIG. 9 shows the input/output characteristics of the lighting fixture 201. The horizontal axis V_R′ represents the input level of the color element R (red) input to the illumination control data input unit 5 as a dimming signal for the luminaire 201, and the vertical axis L_R′ represents the output level of the luminaire 201, that is, the luminance value of the color element R (red). The input level is represented by, for example, a value of 0 to 5 V, and the luminance value is expressed in, for example, cd/cm². The relationship between V_R′ and L_R′ is expressed by the following equation (20).
[0063]
L_R′ = f_RLAMP(V_R′)   (20)
Here, f_RLAMP is a function representing the input/output characteristics for the color element R (red) of the luminaire 201. Such a function is defined by, for example, a polynomial and incorporated in the illumination control data creation program 17.
[0064]
Although FIG. 9 shows the input/output characteristics only for the color element R (red) of the lighting fixture 201, the input/output characteristics for the color elements G (green) and B (blue) are similarly defined by functions f_GLAMP and f_BLAMP. The process of converting the luminance values (L_RM′, L_GM′, L_BM′) into the dimming signal (V_RM′, V_GM′, V_BM′) in step S9 of FIG. 7 is performed by the following equations (21) to (23).
[0065]
V_RM′ = f_RLAMP^(-1)(L_RM′)   (21)
V_GM′ = f_GLAMP^(-1)(L_GM′)   (22)
V_BM′ = f_BLAMP^(-1)(L_BM′)   (23)
The dotted line in FIG. 9 shows how V_RM′ is obtained from L_RM′.
[0066]
The input/output characteristics of the lighting fixtures 202 to 206 are also defined as functions, like those of the lighting fixture 201.
[0067]
The illumination control data 420 is generated by the processing procedure described above. The illumination control data 420 may further be subjected to smoothing and filtering.
[0068]
FIG. 10 shows the data flow when the illumination control data 420 is subjected to smoothing and filtering. These processes are performed to eliminate a feeling of strangeness when the illumination light changes. When a viewer watches video in an environment where the illumination light changes in conjunction with the video, the illumination light affects the viewer's peripheral visual field, so the viewer feels extreme discomfort if the illumination light changes rapidly or flickers. The smoothing process is performed to prevent such discomfort. It is performed, for example, by taking a moving average of the illumination control data created from the preceding frames and the illumination control data created from the current frame. The number of frames used for the moving average is, for example, 4. As the number of frames used for the moving average increases, illumination control data with gentler changes is created; that is, the amount of change per unit time in the luminance value of the illumination light is suppressed.
[0069]
The filtering process cuts dimming signals below a predetermined level in the illumination control data. Flickering at the boundary between lighting and non-lighting is prevented by cutting such dimming signals. The predetermined level is, for example, 25% of the dimming signal that lights the lighting fixture at its maximum luminance value.
[0070]
As described above, the illumination control data 420 is subjected to the smoothing process 703 and the filtering process 704 to become the smoothed and filtered illumination control data 420′. The smoothed and filtered illumination control data 420′ is input to the illumination control data input unit 5.
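A sketch combining both post-processing steps, using the example values from the text (a 4-frame moving average and a cut at 25% of the maximum dimming signal); the class name and the array layout of the dimming signals are assumptions:

```python
from collections import deque
import numpy as np

class DimmingPostprocessor:
    """Applies the moving-average smoothing and the low-level cut (filtering)."""

    def __init__(self, window: int = 4, cut_ratio: float = 0.25, v_max: float = 5.0):
        self.history = deque(maxlen=window)   # dimming signals of recent frames
        self.threshold = cut_ratio * v_max    # e.g. 25% of the maximum signal

    def process(self, dimming) -> np.ndarray:
        self.history.append(np.asarray(dimming, dtype=float))
        smoothed = np.stack(tuple(self.history)).mean(axis=0)  # moving average
        smoothed[smoothed < self.threshold] = 0.0              # cut low levels
        return smoothed
```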
[0071]
In step S2 of the processing procedure shown in FIG. 7, the representative color is defined by taking a simple average of the R, G, and B luminance levels of the pixels in the peripheral region. The representative color may instead be defined from the maximum of a histogram of the R, G, and B luminance levels. For example, when the peripheral region contains a red region and a green region, the representative color defined by a simple average may become a color that is neither red nor green (for example, white). In this case, illuminating the extended region of the peripheral region with light of substantially the same color as this representative color gives a sense of incongruity, which is not preferable. When the representative color is defined from the histogram maximum, the color of whichever of the red and green regions has the larger area becomes the representative color. With a representative color obtained in this way, illumination light of substantially the same color can preferably be irradiated onto the extended region of the peripheral region without a sense of incongruity.
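A sketch of the histogram-based alternative: the representative color is the most frequent color in the region. Quantizing to coarse bins is an added assumption to make the histogram maximum meaningful for natural images; the patent does not specify a bin width.

```python
import numpy as np

def representative_color_mode(region: np.ndarray, bins: int = 32) -> tuple:
    """Representative color as the most frequent quantized color in the region."""
    pixels = region.reshape(-1, 3).astype(np.int64)
    quantized = pixels * bins // 256                    # 0..bins-1 per channel
    codes = (quantized[:, 0] * bins + quantized[:, 1]) * bins + quantized[:, 2]
    mode = int(np.bincount(codes).argmax())             # most frequent color code
    r, rem = divmod(mode, bins * bins)
    g, b = divmod(rem, bins)
    scale = 256 // bins
    return (r * scale, g * scale, b * scale)            # lower edge of the bin
```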
[0072]
Further, the number of lighting fixtures included in the lighting system 1 is not limited to six. For example, as shown in FIG. 4, the lighting system 1 may control ten lighting fixtures: the lighting fixtures 203′, 204′, 205′, and 206′ in addition to the lighting fixtures 201 to 206. The luminaires 203′, 204′, 205′, and 206′ are arranged outside the luminaires 203, 204, 205, and 206, respectively, as seen by the viewer 150. In addition, the peripheral areas and the lighting fixtures need not correspond one-to-one. For example, both of the lighting fixtures 205′ and 205 shown in FIG. 4 may be associated with the peripheral region 305 shown in FIG. 5. Alternatively, the lighting system 1 may omit the lighting fixture 205 and associate the lighting fixture 205′ with the peripheral region 305 shown in FIG. 5.
[0073]
Further, the control method of the lighting fixtures may be changed according to the viewing angle of the viewer 150. The viewing angle differs between the lighting fixture 205′ and the lighting fixture 205; that is, the luminaire 205 is closer to the center of the visual field of the viewer 150 than the luminaire 205′. Even if the lighting fixtures 205 and 205′ emit the same color, they give the viewer 150 different impressions because they lie at different viewing angles. In general, because light is readily perceived in the peripheral part of the visual field, the lighting fixture 205′ may be controlled so that its luminance value is lower than that of the lighting fixture 205. Such control is executed, for example, by appropriately selecting the color correction parameters in step S6 of the processing procedure shown in FIG. 7 according to the arrangement of the lighting fixtures.
[0074]
The relationship between the luminaire 206′ and the luminaire 206, between the luminaire 203′ and the luminaire 203, and between the luminaire 204′ and the luminaire 204 is the same as the relationship between the luminaire 205′ and the luminaire 205 described above.
[0075]
The method of defining the peripheral regions in the peripheral part of the image data is not limited to the definition shown in FIG. 5.
[0076]
FIGS. 11A to 11G show other examples of the structure of the image data 300 for one frame. The peripheral areas included in the image data are defined in advance, as shown in FIGS. 11A to 11G, according to the number and arrangement of the lighting fixtures of the lighting system 1.
[0077]
The peripheral portion of the image data 300 for one frame shown in FIG. 11A includes peripheral regions 1102, 1103, 1104, 1105, and 1106. Such a method of defining the peripheral regions can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 201, 203, 203′, 204, 204′, 205, 205′, 206, and 206′ shown in FIG. 4. The luminaire 201 is associated with the peripheral area 1103, the luminaire 203 and/or 203′ with the peripheral area 1106, the luminaire 205 and/or 205′ with the peripheral area 1105, the luminaire 206 and/or 206′ with the peripheral area 1102, and the luminaire 204 and/or 204′ with the peripheral area 1104. The lighting fixtures 205 and 203 may be omitted, or the lighting fixtures 205′ and 203′ may be omitted. Likewise, the lighting fixtures 206 and 204 may be omitted, or the lighting fixtures 206′ and 204′ may be omitted.
[0078]
Image data 300 for one frame shown in FIG. 11B includes a peripheral area 1108. Such a peripheral region defining method can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 201, 204, and 206 shown in FIG. The lighting fixtures 201, 204, and 206 are associated with the peripheral region 1108. The lighting fixtures 204 and 206 may be omitted.
[0079]
Image data 300 for one frame shown in FIG. 11C includes peripheral areas 1109 and 1110. Such a peripheral region defining method can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 201, 206, 204, 205, 202, and 203 shown in FIG. The lighting fixtures 201, 206, and 204 are associated with the peripheral region 1109. The lighting fixtures 205, 202, and 203 are associated with the peripheral region 1110. The lighting fixtures 206 and 204 may be omitted. Also, the lighting fixtures 205 and 203 may be omitted.
[0080]
Image data 300 for one frame shown in FIG. 11D includes peripheral regions 1111 and 1112. Such a method of defining the peripheral region can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 203, 204, 205, and 206 shown in FIG. The lighting fixtures 205 and 206 are associated with the peripheral area 1111. The lighting fixtures 204 and 203 are associated with the peripheral region 1112. The lighting fixtures 206 and 204 may be omitted, or the lighting fixtures 205 and 203 may be omitted.
[0081]
Image data 300 for one frame shown in FIG. 11E includes peripheral areas 1113, 1114, 1115, and 1116. Such a method of defining the peripheral regions can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 203, 203′, 204, 204′, 205, 205′, 206, and 206′ shown in FIG. 4. The luminaire 203 and/or 203′ is associated with the peripheral area 1116, the luminaire 205 and/or 205′ with the peripheral area 1114, the luminaire 206 and/or 206′ with the peripheral area 1113, and the luminaire 204 and/or 204′ with the peripheral area 1115. The lighting fixtures 205 and 203 may be omitted, or the lighting fixtures 205′ and 203′ may be omitted. Likewise, the lighting fixtures 206 and 204 may be omitted, or the lighting fixtures 206′ and 204′ may be omitted.
[0082]
Image data 300 for one frame shown in FIG. 11F includes peripheral areas 1117, 1118, and 1119. Such a peripheral region defining method can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 201, 206, 204, 205, and 203 shown in FIG. The lighting fixtures 205 and 206 are associated with the peripheral region 1117. The lighting fixtures 204 and 203 are associated with the peripheral area 1119. The lighting fixture 201 is associated with the peripheral area 1118. The lighting fixtures 203 and 205 may be omitted, or the lighting fixtures 206 and 204 may be omitted.
[0083]
The image data 300 for one frame shown in FIG. 11G includes peripheral areas 1120, 1121, 1122, 1123, and 1124. Such a method of defining the peripheral regions can be used, for example, when the lighting fixtures included in the lighting system 1 are the lighting fixtures 202, 203, 203′, 204, 204′, 205, 205′, 206, and 206′ shown in FIG. 4. The lighting fixture 202 is associated with the peripheral region 1123, the luminaire 203 and/or 203′ with the peripheral area 1124, the luminaire 205 and/or 205′ with the peripheral area 1122, the luminaire 206 and/or 206′ with the peripheral area 1120, and the luminaire 204 and/or 204′ with the peripheral area 1121. The lighting fixtures 205 and 203 may be omitted, or the lighting fixtures 205′ and 203′ may be omitted. Likewise, the lighting fixtures 206 and 204 may be omitted, or the lighting fixtures 206′ and 204′ may be omitted.
[0084]
Variations of the configuration of the lighting system 1 and of the method of defining peripheral regions in the peripheral part of the image data have been described above with reference to FIGS. 11A to 11G, but they are not limited to FIGS. 11A to 11G. The lighting system 1 may include a lighting fixture that is not associated with any peripheral region. If at least one of the plurality of lighting fixtures included in the lighting system 1 is controlled in conjunction with one corresponding peripheral region among the plurality of peripheral regions, a high sense of realism is obtained and the effect of the present invention is achieved.
[0085]
As the number of peripheral regions and the number of lighting fixtures in the lighting system 1 increase, the cost increases and so does the amount of calculation for creating the illumination control data. The number of peripheral regions is therefore determined appropriately by weighing cost against the improvement in realism. For example, the number of peripheral regions shown in FIG. 11D and in FIG. 11C is two in both cases, but positioning the peripheral regions in the horizontal (left-right) direction improves the sense of realism more than positioning them in the vertical direction. Therefore, the definition of the peripheral regions shown in FIG. 11D is preferable to that shown in FIG. 11C. Considering cost and the improvement in realism, the number of peripheral regions is generally preferably about 2 to 4 (for example, FIGS. 11D and 11E), but the present invention is not limited to this.
[0086]
In general, more lighting fixtures allow the illumination to be controlled in response to finer changes in the image; on the other hand, the lighting system becomes larger, the creation of the illumination control data becomes more complicated, and the time required for it increases, raising both the introduction cost and the calculation cost. It is therefore necessary to enhance the sense of presence using a limited number of lighting fixtures. A method for doing so is described below.
[0087]
FIG. 12 shows the field of view of the viewer 150. The center line (reference line) of the visual field of the viewer 150 is assumed to pass through the center O of the image data displayed on the image display device. The viewer 150 has a central visual field 502 corresponding to the region less than 30° to the left and right of the reference line, and a peripheral visual field 503 from 30° to 100° to the left and right (the hatched area in FIG. 12). The sensitivity of the human eye varies with location within the field of view: the vicinity of the center of the visual field is sensitive to differences in brightness and color, while the peripheral part is sensitive to movement. The present inventors evaluated the effect of improving the sense of presence for the following three fixture arrangements.
[0088]
(A) When a lighting fixture is arranged only in the central visual field.
[0089]
(B) When a lighting fixture is arranged only in the peripheral visual field.
[0090]
(C) When a lighting fixture is arranged in the central visual field and the peripheral visual field.
[0091]
The above (a) was realized by using the lighting fixtures in the background portion of the image display device 100 (the lighting fixtures 201 and 202 shown in FIG. 4). The above (b) was realized by using the lighting fixtures 203, 204, 205, and 206 shown in FIG. 4. The above (c) was realized by using the lighting fixtures 201, 202, 203, 204, 205, and 206 shown in FIG. 4.
[0092]
Among the three cases, the effect of improving the sense of presence was in the order (c) > (b) > (a), and a sufficiently high effect was obtained even in case (b), in which lighting fixtures are arranged only in the peripheral visual field. This is because, as already described, the peripheral visual field is more sensitive to movement, that is, to changes, and can fully sense the changes in the light emitted by the luminaires. The positional relationship between the viewer's field of view and the lighting fixtures varies with the size of the viewing room and the position of the viewer, but in general the central visual field corresponds to the front of the room and the peripheral visual field corresponds to the sides of the viewing room.
[0093]
In case (b), the luminance value irradiated from the lighting fixtures in the peripheral visual field is higher than that irradiated from the lighting fixtures in the central visual field. Conversely, in case (a), the luminance value irradiated from the lighting fixtures in the central visual field is higher than that irradiated from the lighting fixtures in the peripheral visual field. Thus, to improve the sense of realism using a limited number of lighting fixtures, it is a basic condition that the luminance value irradiated from the lighting fixtures in the peripheral visual field be higher than that irradiated from the lighting fixtures in the central visual field. Therefore, for example, the method of defining the peripheral regions shown in FIG. 11E is preferably used.
[0094]
The above description concerned the three cases (a), (b), and (c), but the present invention is not limited to these fixture arrangements. As long as the luminance value emitted from the luminaires in (or illuminating) the peripheral visual field is higher than that emitted from the luminaires in (or illuminating) the central visual field, the sense of realism can be improved by the illumination.
[0095]
The size of each peripheral area may be determined arbitrarily. In general, the smaller the peripheral region, the better the agreement between the average color of the region as perceived by humans and the representative color obtained by the simple RGB average described above; that is, the color reproducibility improves. However, if the peripheral region is small and happens to contain the image of a relatively small object (for example, a bird flying through a landscape shot), the color of that small object may become the representative color of the region; that is, the noise resistance degrades. The size of the peripheral regions is determined appropriately from the viewpoints of such color reproducibility and noise resistance.
[0096]
In the above description, the lighting fixtures use fluorescent tubes assigned to the R (red), G (green), and B (blue) color elements, but the color elements are not limited to R, G, and B. For example, C (cyan), Y (yellow), and M (magenta) may be used as color elements. Light sources other than fluorescent tubes may also be used; for example, light may be emitted by light-emitting diodes (LEDs). Three light-emitting diodes of R (red), G (green), and B (blue) may constitute one illumination element, and each illumination element may be controlled in association with one pixel included in the peripheral area of the image data. A single lighting fixture may include a plurality of such illumination elements. In this case, by associating and controlling the pixels of the peripheral region and the illumination elements of the luminaire associated with that region, the color perceived from the peripheral region as a whole and the color perceived from the lighting fixture as a whole can be made substantially the same without defining a representative color for the peripheral region.
[0097]
The lighting fixtures may provide direct or indirect illumination. For example, the lighting fixture 202 shown in FIG. 4 is arranged behind the image display device 100 and illuminates the back wall surface by indirect lighting, but the lighting fixture 202 may instead be arranged on the front surface of the image display device 100. A spot lighting fixture may also be used to illuminate the wall surface of the viewing room. When a lighting fixture illuminates a wall surface, the color of the wall surface is preferably white. Further, the size of the viewing room in which the lighting system 1 is arranged is not limited to the example shown in FIG. 4. For example, the lighting system 1 may be used in a large space such as a movie theater or a public space in a nursing home.
[0098]
Further, the illumination control data may be distributed via the Internet or the like, or by a carrier wave. When the illumination control data is distributed in this way, the lighting system 1 (FIG. 1) need not have the video signal input unit 18 or the calculation unit 12; that is, the lighting system 1 need only include the illumination control data input unit 5 and the plurality of lighting fixtures. The illumination control data may also be distributed together with the video signal rather than alone.
[0099]
When the illumination system 1 (FIG. 1) has the calculation unit 12 and creates the illumination control data by near real-time processing, the computation time required to create the illumination control data from one frame of image data must be shortened. For example, a video signal based on the NTSC standard includes 30 frames per second; in this case, the image data also includes 30 frames per second, and illumination control data must be created for each of these frames. The computation time can be shortened by using a lookup table, pixel sampling, or frame interpolation, alone or in combination. These methods are described below.
[0100]
FIG. 13 shows the data flow when creating illumination control data using a lookup table. First, the representative color 802 of each peripheral area is obtained from the image data 411; this is the same as steps S1 and S2 of the processing procedure shown in FIG. 7. Next, a lookup table created in advance is searched using the representative color 802 as a key (processing 803), and the illumination control data 420 is created. Steps S3 to S9 of the processing procedure shown in FIG. 7 are thereby replaced by the lookup table processing 803, greatly shortening the time required for creating the illumination control data.
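A sketch of the lookup-table shortcut, assuming representative colors are quantized to serve as keys; `pipeline` stands for the per-color computation of steps S3 to S9 and is a hypothetical callable:

```python
def build_lut(pipeline, bins: int = 16) -> dict:
    """Precompute dimming signals for every quantized representative color.

    `pipeline` is a hypothetical callable performing steps S3-S9 for one color.
    """
    step = 256 // bins
    return {(r, g, b): pipeline((r, g, b))
            for r in range(0, 256, step)
            for g in range(0, 256, step)
            for b in range(0, 256, step)}

def lookup(lut: dict, color, bins: int = 16):
    """Replace steps S3-S9 by a table search keyed on the quantized color."""
    step = 256 // bins
    key = tuple(int(c) // step * step for c in color)
    return lut[key]
```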
[0101]
Next, a method for performing pixel sampling will be described.
[0102]
FIG. 14 shows the pixels sampled in the peripheral area. Hatched rectangles indicate sampled pixels, and non-hatched rectangles indicate pixels that are not sampled. If the representative color is defined by averaging the color elements R, G, and B over only the sampled pixels, the calculation time required to define the representative color, that is, the time required for step S2 of the processing procedure shown in FIG. 7, is shortened.
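A sketch of pixel sampling, assuming a sampling number k means every (k + 1)-th pixel in each direction is kept (so k = 1 samples every other pixel, as in the text):

```python
import numpy as np

def representative_color_sampled(region: np.ndarray, k: int = 1) -> tuple:
    """Average R, G, B over every (k+1)-th pixel only (k = sampling number)."""
    sampled = region[::k + 1, ::k + 1]
    return tuple(float(v) for v in sampled.reshape(-1, 3).mean(axis=0))
```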
[0103]
FIG. 15 shows the relationship between the pixel sampling number and the time required for creating the illumination control data, normalized to 100 for a sampling number of 0 (no pixel sampling). A pixel sampling number of 1 means sampling every other pixel in the peripheral area. As FIG. 15 shows, pixel sampling shortens the time required for creating the illumination control data to about 1/3 of that without sampling. Increasing the sampling number to 5 or more yields almost no further shortening. Accordingly, a sampling number of about 1 to 5 is preferable, but the invention is not limited thereto. Note that when the sampling number is on the order of a few, the illumination control data created using pixel sampling does not differ significantly from that created without it.
[0104]
FIG. 16 shows the data flow when creating illumination control data using frame interpolation. Here n is an integer of 3 or more, and frames 1 to n are temporally adjacent frames. The peripheral areas D_1 to D_n are the peripheral regions included in frames 1 to n, respectively, all at the same position within the frame. For the regions D_1 and D_n included in frame 1 and frame n, the color perception amounts 1601 are obtained by steps S2 to S6 of the processing procedure shown in FIG. 7. For the regions D_2 to D_(n-1) included in frames 2 to (n-1), the color perception amounts 1650 are obtained by interpolating between the color perception amounts 1601 of frame 1 and frame n. Because this shortens the time needed to process the image data of all frames, the time for creating the illumination control data is shortened.
[0105]
For frame 1 and frame n, the dimming signals 1611 are obtained from the color perception amounts 1601 by steps S7 to S9 of the processing procedure shown in FIG. 7, and the illumination control data 1621 is created.
[0106]
Similarly, for frames 2 to (n-1), the dimming signals are obtained from the interpolated color perception amounts 1650 by steps S7 to S9 of the processing procedure shown in FIG. 7, and the illumination control data 1621 is created.
[0107]
The color perception amount interpolation method used in the present invention is described below. For image data within a continuous scene, the color perception amounts can simply be interpolated. However, when the image data of frame 1 and frame n represent completely different scenes, simple interpolation of the color perception amounts causes a problem. For example, when the hue H_1′ of the color perception amount (H_1′, L_1′, C_1′) of the peripheral area D_1 of frame 1 represents cyan and the hue H_n′ of the color perception amount (H_n′, L_n′, C_n′) of the peripheral area D_n of frame n represents yellow, simply interpolating the hue, lightness, and saturation between (H_1′, L_1′, C_1′) and (H_n′, L_n′, C_n′) causes a hue representing green to appear. When interpolating the color perception amounts, it is undesirable for a hue completely different from the representative colors of the original image data to be perceived, because a completely different hue makes the viewer feel uncomfortable. The interpolation method used in the present invention prevents such intermediate hues from being perceived during interpolation.
[0108]
FIG. 17A shows an example of the changes in hue, lightness, and saturation in the color perception amount interpolation method used in the present invention. A curve 1701 represents the relationship between the frame number and the hue; since the frame number corresponds to time, the curve 1701 represents the temporal change in hue. Similarly, a curve 1702 represents the temporal change in lightness, and a curve 1703 the temporal change in saturation. In the example shown in FIG. 17A, when interpolating the hue from H_1′ to H_n′, the lightness from L_1′ to L_n′, and the saturation from C_1′ to C_n′, the lightness and saturation are first changed to predetermined values smaller than L_1′ and C_1′, respectively; the hue is then interpolated so as to change from H_1′ to H_n′; and after the hue has changed to H_n′, the lightness and saturation are interpolated so as to change to L_n′ and C_n′, respectively. The predetermined values are values of lightness or saturation at which a change in hue is difficult to perceive. The input levels V_RM′, V_GM′, and V_BM′ for the color elements of the dimming signal included in the illumination control data for a given frame are obtained from the hue H′, the lightness L′, and the saturation C′ by the conversions shown in equations (7) to (23) above. That is, the input levels V_RM′, V_GM′, and V_BM′ of the dimming signal correspond to the hue H′, lightness L′, and saturation C′, which can be regarded respectively as the hue, lightness, and saturation components of the light emitted from the lighting fixture controlled by the dimming signal.
[0109]
When the hue changes between frame 1 and frame n, changing the hue after lowering the lightness and saturation reduces the discomfort caused by the change in hue of the light emitted from the luminaire, because differences in hue are difficult to perceive when saturation and lightness are low.
[0110]
Note that, as an interpolation method for reducing the discomfort caused by a change in hue, the hue may be changed after lowering either the saturation or the lightness alone; a difference in hue is hard to perceive even when only one of them is low.
[0111]
FIG. 17B shows another example of the changes in hue, lightness, and saturation in the color perception amount interpolation method used in the present invention. Curves 1711, 1712, and 1713 represent the temporal changes in hue, lightness, and saturation, respectively. In the example shown in FIG. 17B, when interpolating the hue from H_1′ to H_n′, the lightness from L_1′ to L_n′, and the saturation from C_1′ to C_n′, the saturation is first changed to a value smaller than C_1′; the hue is then interpolated so as to change from H_1′ to H_n′; and after the hue has changed to H_n′, the saturation is interpolated so as to change to C_n′. The lightness is simply interpolated from L_1′ to L_n′.
[0112]
FIG. 18 shows the change in the color perception amount from frame 1 to frame n on a color correlation diagram. The color perception amounts of frame 1 and frame n are shown as the points P_1 and P_n. A curve 1801 shows the path of the color perception amount when the hue is changed after the saturation is lowered, and a curve 1802 shows the path when the hue is changed without lowering the saturation. When the hue is changed after lowering the saturation, the color perception amount first approaches white (that is, an achromatic color) from the point P_1 and then changes to the point P_n, so the change in hue is difficult to perceive. When the hue is changed without lowering the saturation, the color perception amount passes near blue on the way from the point P_1 to the point P_n, so the change in hue is easily perceived.
[0113]
The interpolation method described with reference to FIGS. 17A and 17B, in which the hue is changed after either the saturation or the lightness is lowered, is preferably used when the hue changes between frame 1 and frame n. Whether this interpolation method should be used instead of simple interpolation is determined, for example, by checking whether an incongruous hue lies between the hues H_1′ and H_n′ obtained from frame 1 and frame n. If color categories are defined as ranges of hue angle, it can be determined by calculation whether another color category lies between H_1′ and H_n′. When it is determined that another color category lies between H_1′ and H_n′, that is, when the hue change crosses a predetermined color category, the interpolation method described with reference to FIGS. 17A and 17B, in which the hue is changed after either the saturation or the lightness is lowered, is used.
[0114]
For example, when n = 6, an example of the processing procedure for obtaining the color perception amounts of frame 2 (H2′, L2′, C2′) through frame 5 (H5′, L5′, C5′) from the color perception amount of frame 1 (H1′, L1′, C1′) and the color perception amount of frame 6 (H6′, L6′, C6′) is shown in (a) to (d) below.
[0115]
(a) H2′ = H1′, and L2′ and C2′ are set to sufficiently small values. A sufficiently small value is a predetermined value of lightness or saturation at which no incongruity due to a change in hue is perceived.
[0116]
(b) H3′ is set to a value intermediate between H1′ and H6′, and L3′ and C3′ are set equal to L2′ and C2′.
[0117]
(c) H4′ = H6′, and L4′ and C4′ are set equal to L3′ and C3′.
[0118]
(d) H5′ = H6′, and L5′ and C5′ are set to values intermediate between L4′ and L6′ and between C4′ and C6′, respectively.
[0119]
If no incongruous hue exists between the hues H1′ and Hn′ obtained from frame 1 and frame n, the color perception amount may be obtained by simply interpolating the hue, lightness, and saturation. When the value of n is small, for example n = 3, the time difference between frame 1 and frame n is small, and the difference in the image information contained in the image data of frame 1 and frame n is also considered small, so it is not always necessary to use an interpolation method that changes the hue after lowering either the saturation or the lightness. In such a case, the dimming signal for the intermediate frame may be obtained by simply interpolating between the dimming signal for frame 1 and the dimming signal for frame n.
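The two interpolation strategies can be sketched together as follows (a minimal sketch reusing crosses_category from the sketch above). The "sufficiently small" lightness and saturation values low_l and low_c, and the division of the intermediate frames into three equal phases, are assumptions made for illustration; the specification prescribes only the ordering of the changes.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_color_perception(h1, l1, c1, hn, ln, cn, n,
                                 low_l=10.0, low_c=5.0):
    """Return n (hue, lightness, saturation) triples from frame 1 to frame n.
    When the hue crosses another color category and n is large enough, the
    lightness and saturation are lowered first, the hue is rotated while the
    light is dim, and the lightness and saturation are then restored."""
    frames = [(h1, l1, c1)]
    if n <= 3 or not crosses_category(h1, hn):
        # Simple interpolation of all three components.
        for i in range(1, n - 1):
            t = i / (n - 1)
            frames.append((lerp(h1, hn, t), lerp(l1, ln, t), lerp(c1, cn, t)))
    else:
        third = max(1, (n - 2) // 3)
        for i in range(1, n - 1):
            if i <= third:                       # fade down, hue unchanged
                t = i / third
                frames.append((h1, lerp(l1, low_l, t), lerp(c1, low_c, t)))
            elif i <= 2 * third:                 # rotate hue while dim
                t = (i - third) / third
                frames.append((lerp(h1, hn, t), low_l, low_c))
            else:                                # fade back up, hue unchanged
                t = (i - 2 * third) / (n - 1 - 2 * third)
                frames.append((hn, lerp(low_l, ln, t), lerp(low_c, cn, t)))
    frames.append((hn, ln, cn))
    return frames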
[0120]
All of the illumination control data creation methods described above control the illumination based on image information in a peripheral region of the image data. Such control enhances the sense of reality by making the viewer perceive that the image extends horizontally, vertically, and diagonally within the viewer's field of view. However, the illumination system 1 of the present invention is not limited to such control. For example, the illumination may be controlled so as to enhance the sense of reality by surprising the viewer, or so as to provide a feeling of peace. Methods by which the illumination system 1 of the present invention creates illumination control data that provides the viewer with a sense of surprise or a feeling of peace are described below.
[0121]
A sense of surprise can be provided, for example, by lighting the lighting fixture at the same time that an explosion scene is displayed on the image display device. The viewer is startled by the illusion that an explosion has occurred nearby. The explosion scene is detected using image information of an area including the central portion of the image data.
[0122]
FIG. 19 shows a central area 1900 defined as an area including the central portion of the image data 300 for one frame. In a scene intended to surprise the viewer, such as an explosion scene, the image information in the central area 1900 changes rapidly. Therefore, by calculating the amount of change in the image information in the central area 1900, it can be determined whether or not the target frame is an explosion scene. For example, the average value (RGB average value) of the output levels of all R, G, and B pixels included in the central area 1900 is used as the image information of the central area 1900.
[0123]
FIG. 20 shows the RGB average value and the difference between RGB average values in each frame. In the example shown in FIG. 20, the central area 1900 coincides with the entire area of the image data. In FIG. 20, the horizontal axis indicates the frame number. Frames 1 to 51 are temporally continuous in this order and include an explosion scene. A polygonal line 2001 indicates the RGB average value of each frame, and a polygonal line 2002 indicates the difference between RGB average values; the RGB average value is the simple average of the luminance levels of the image expressed in 256 levels from 0 to 255. The difference between RGB average values is the value obtained by subtracting the RGB average value of the immediately preceding frame from the RGB average value of each frame. Generally, in an explosion scene the image suddenly becomes bright, that is, the RGB average value increases sharply. Therefore, a frame in which both the RGB average value and the difference between RGB average values exceed certain thresholds can be determined to be an explosion scene. In the example shown in FIG. 20, the frames having an RGB average value of 100 or more and a difference between RGB average values of 50 or more coincide with the explosion scene.
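The threshold test illustrated in FIG. 20 can be sketched as follows, assuming 8-bit RGB frames already cropped to the region being examined; the thresholds of 100 and 50 are those of the example above.

import numpy as np

def is_explosion_frame(curr_rgb, prev_rgb, level_thresh=100, diff_thresh=50):
    """Return True if the frame is judged to be an explosion scene: both the
    RGB average value and its increase over the preceding frame must exceed
    the thresholds. curr_rgb and prev_rgb are uint8 arrays of shape (H, W, 3)
    covering the area being examined (e.g. the central area 1900)."""
    curr_mean = float(np.mean(curr_rgb))   # simple average over R, G, and B
    prev_mean = float(np.mean(prev_rgb))
    diff = curr_mean - prev_mean           # difference between RGB average values
    return curr_mean >= level_thresh and diff >= diff_thresh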
[0124]
Although the determination of an explosion scene has been illustrated using RGB levels, the determination may also be performed after conversion to another color system.
[0125]
In the example shown in FIG. 20, the central area 1900 coincides with the entire area of the image. In typical images, however, there are many scenes in which an explosion occurs at a position far from the viewer, and an image with a large amount of change in the image information of its peripheral portion is often one intended to heighten a sense of speed rather than an explosion scene. Therefore, determination accuracy is improved by judging whether a scene is an explosion scene based on the amount of change in the image information of only the central portion, excluding the peripheral portion of the image. A scene that surprises the viewer, such as an explosion scene, may also be determined based on the amount of change in audio information, not only the amount of change in image information: in such scenes, the sound volume changes rapidly as the image changes rapidly. Accordingly, a method for determining an explosion scene based on the amount of change in audio information can be described in the same manner as the image-based method described with reference to FIG. 20; for example, a graph analogous to FIG. 20 can be drawn with the frame number on the horizontal axis and the audio volume and the difference between audio volumes on the vertical axis. Determination accuracy is further improved by determining a scene that surprises the viewer based on both the amount of change in image information and the amount of change in audio information.
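Combining the two cues might then look like the following sketch, which reuses is_explosion_frame from the sketch above. The audio volume scale and its thresholds are hypothetical, since the specification gives no numerical values for the audio test.

def is_surprise_scene(curr_rgb, prev_rgb, curr_volume, prev_volume,
                      volume_thresh=0.5, volume_diff_thresh=0.3):
    """Judge a surprise scene (e.g. an explosion) from both image and audio.
    Volumes are assumed normalized to [0, 1]; the thresholds are illustrative."""
    image_hit = is_explosion_frame(curr_rgb, prev_rgb)
    audio_hit = (curr_volume >= volume_thresh and
                 curr_volume - prev_volume >= volume_diff_thresh)
    # Requiring both cues raises the determination accuracy, as noted above.
    return image_hit and audio_hit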
[0126]
Explosion scene lighting control data is created for a frame that is determined to be a scene that surprises the viewer, such as an explosion scene. The explosion scene illumination control data is generated, for example, by executing a processing procedure similar to steps S2 to S9 of the processing procedure shown in FIG. 7. The lighting fixture controlled by the explosion scene lighting control data may be any of the lighting fixtures 201-206 and 203'-206' shown in FIG. 4, and a plurality of these lighting fixtures may be controlled by the explosion scene lighting control data. Alternatively, a lighting fixture 2055 (FIG. 4) dedicated to explosion scenes may be arranged near the viewer, for example, and controlled by the explosion scene lighting control data; a plurality of lighting fixtures 2055 may be provided. Since the lighting fixture 2055 is dedicated to explosion scenes, the color of its emitted light may be a single color (for example, red) or two colors (for example, red and yellow).
[0127]
A method by which the illumination system 1 of the present invention provides the viewer with a feeling of peace will be described below. It is known that relaxation, that is, a feeling of peace, can be obtained by varying the light output of illumination at a frequency in the range of 0.05 Hz to 0.6 Hz. This technique can also be applied when the lighting is controlled in conjunction with image data by the lighting system 1 of the present invention.
[0128]
Illumination control data providing such a feeling of peace can be created, for example, by converting the dimming signal created by the processing procedure shown in FIG. 7 into a color signal and a luminance signal, modulating the luminance signal at a predetermined frequency, and then converting the result back into a dimming signal. Such coordinate conversion of the dimming signal can be performed using known relational expressions.
[0129]
FIG. 21 shows an example in which the luminance value emitted by the lighting fixture is changed at a constant frequency. The frequency for changing the luminance value is preferably in the range of 0.05 Hz to 0.6 Hz. Among these, the range of 0.1 Hz to 0.4 Hz is particularly preferable for obtaining a feeling of peace.
[0130]
FIG. 22 shows an example in which the luminance value is changed at a constant frequency such that the temporal rate of change of the luminance value when the luminance value of the luminaire increases is larger than the temporal rate of change when it decreases. Changing the luminance value in this way makes it easy to synchronize the breathing rhythm with the change of the illumination light, improving the feeling of peace, because such a change in luminance value is reminiscent of the rhythm of breathing in and out.
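As an illustration, the following sketch generates a luminance waveform with a fast rise and a slow fall at a fixed frequency. The sawtooth-like shape and the rise fraction are assumptions, since the specification fixes only the frequency range (0.05 Hz to 0.6 Hz, preferably 0.1 Hz to 0.4 Hz) and the rise/fall asymmetry.

def breathing_luminance(t, base, depth, freq=0.25, rise_fraction=0.35):
    """Luminance at time t (seconds): within each period the value rises
    quickly over rise_fraction of the period and falls slowly over the rest,
    oscillating between base and base + depth at the given frequency (Hz)."""
    phase = (t * freq) % 1.0                   # position within one period
    if phase < rise_fraction:
        level = phase / rise_fraction          # fast linear rise from 0 to 1
    else:
        level = 1.0 - (phase - rise_fraction) / (1.0 - rise_fraction)  # slow fall
    return base + depth * level

# Example: sample a 0.25 Hz waveform for 10 seconds at 30 frames per second.
samples = [breathing_luminance(i / 30.0, base=100.0, depth=50.0)
           for i in range(300)]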
[0131]
It is not essential to periodically change the luminance values of all the luminaires included in the lighting system 1 in this way. For example, only the luminance value of the lighting fixture 201 shown in FIG. 4 may be changed periodically; alternatively, only that of the lighting fixture 202, only those of the lighting fixtures 201 and 202, or only those of the lighting fixtures 205 and 203 may be changed periodically. The luminance values of all the lighting fixtures 201-206 and 203'-206' may also be changed periodically. Alternatively, a lighting fixture 2601 (FIG. 4) that is not linked to the image data may be arranged and its luminance value changed periodically; there may be a plurality of lighting fixtures 2601.
[0132]
Such a method of providing a feeling of peace can be suitably used, for example, when viewing a landscape image or the like.
[0133]
[The invention's effect]
According to the present invention, the viewer can perceive a high sense of realism by changing the illumination light in conjunction with the image.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a configuration of a lighting system 1 according to the present invention.
FIG. 2 is a block diagram showing a configuration of a calculation unit 12
FIG. 3 is a diagram showing the structure of a lighting fixture 201
FIG. 4 is a diagram showing an example in which lighting fixtures of the lighting system 1 of the present invention are laid out in a room.
FIG. 5 is a diagram showing the structure of image data 300 for one frame.
FIG. 6 is a diagram showing a data flow when illumination control data 420 is created from image data 300 for one frame.
FIG. 7 is a diagram showing a processing procedure of an illumination control data creation program 17
FIG. 8 is a diagram showing input/output characteristics of the image display device 100
FIG. 9 is a diagram showing input / output characteristics of the lighting fixture 201
FIG. 10 is a diagram showing a data flow when the lighting control data 420 is subjected to smoothing processing and filtering processing;
FIG. 11A is a diagram showing another example of the structure of image data 300 for one frame.
FIG. 11B is a diagram showing another example of the structure of image data 300 for one frame.
FIG. 11C is a diagram showing another example of the structure of image data 300 for one frame.
FIG. 11D is a diagram showing another example of the structure of the image data 300 for one frame.
FIG. 11E is a diagram showing another example of the structure of the image data 300 for one frame.
FIG. 11F is a diagram showing another example of the structure of image data 300 for one frame.
FIG. 11G is a diagram showing another example of the structure of the image data 300 for one frame.
FIG. 12 is a view showing the visual field of the viewer 150
FIG. 13 is a diagram showing a data flow when creating illumination control data using a lookup table.
FIG. 14 is a diagram showing pixels to be sampled on the peripheral region
FIG. 15 is a diagram showing the relationship between the number of pixel samplings and the time required to create illumination control data
FIG. 16 is a diagram showing a data flow when creating illumination control data using frame interpolation;
FIG. 17A is a diagram showing an example of changes in hue, brightness, and saturation in the color perception amount interpolation method used in the present invention;
FIG. 17B is a diagram showing another example of changes in hue, brightness, and saturation in the color perception amount interpolation method used in the present invention;
FIG. 18 is a diagram showing a change in color perception amount from frame 1 to frame n on a color correlation diagram.
FIG. 19 is a diagram showing a central area 1900 defined as an area including the central portion of the image data 300 for one frame.
FIG. 20 is a diagram showing an RGB average value and a difference between RGB average values in each frame;
FIG. 21 is a diagram showing an example in which the luminance value emitted by the lighting fixture is changed at a constant frequency.
FIG. 22 is a diagram showing an example in which the luminance value is changed at a constant frequency such that the temporal rate of change of the luminance value when the luminance value of the luminaire increases is larger than that when the luminance value decreases
[Explanation of symbols]
1 Lighting system
5 Lighting control data input section
12 Calculation unit
18 Video signal input section
100 Image display device
201-206 Lighting equipment
412 Image data
420 Lighting control data

Claims (8)

  1. An illumination system for use with an image display device that displays image data, comprising:
    a calculation unit for creating illumination control data based on the image data; and
    a plurality of lighting fixtures controlled based on the illumination control data,
    wherein the calculation unit
    cuts out a predetermined area from an image frame of the image data and creates, from the image information of the cut-out predetermined area, the illumination control data for controlling the lighting fixture corresponding to the cut-out predetermined area, and
    when the hue of the predetermined area changes due to a change of the image frame in the image data, creates the illumination control data so that the hue is changed after the saturation or lightness of the lighting fixture to be controlled is lowered.
  2.     The illumination system according to claim 1, wherein the predetermined area is a peripheral area of the image frame.
  3.     The illumination system according to claim 2, wherein the predetermined area is a peripheral area on the left and right of the image frame.
  4. The illumination system according to claim 1, wherein, when the hue of the predetermined area changes beyond a color category due to a change of the image frame in the image data, the calculation unit creates, from the information of the categorized color categories, the illumination control data so as to lower the saturation or lightness of the lighting fixture to be controlled before changing the hue.
  5. The illumination system according to any one of claims 1 to 4, wherein the lighting fixture comprises fluorescent lamps or LEDs assigned to R (red), G (green), and B (blue).
  6. The lighting system according to any one of claims 1 to 5, wherein the lighting fixture that illuminates the back direction of the video display device is a lighting fixture that illuminates the back direction by indirect illumination.
  7. An image display device comprising the illumination system according to claim 6.
  8. A lighting control method for controlling lighting fixtures in conjunction with image data displayed on an image display device, comprising:
    cutting out a predetermined area from an image frame of the image data;
    creating lighting control data such that, when the hue of the predetermined area cut out from the image frame changes due to a change in the image frame of the image data, the hue is changed after the saturation or lightness of the lighting fixture corresponding to the cut-out predetermined area is lowered; and
    controlling the plurality of lighting fixtures according to the lighting control data.
JP2000163828A 2000-05-31 2000-05-31 Lighting system, video display device, and lighting control method Active JP4399087B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000163828A JP4399087B2 (en) 2000-05-31 2000-05-31 Lighting system, video display device, and lighting control method

Publications (3)

Publication Number Publication Date
JP2001343900A JP2001343900A (en) 2001-12-14
JP2001343900A5 JP2001343900A5 (en) 2006-04-13
JP4399087B2 true JP4399087B2 (en) 2010-01-13

Family

ID=18667530

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000163828A Active JP4399087B2 (en) 2000-05-31 2000-05-31 Lighting system, video display device, and lighting control method

Country Status (1)

Country Link
JP (1) JP4399087B2 (en)

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0211898D0 (en) * 2002-05-23 2002-07-03 Koninkl Philips Electronics Nv Controlling ambient light
BR0305349A (en) * 2002-07-04 2004-10-05 Koninkl Philips Electronics Nv Method and system for controlling ambient light and lighting unit
DE10252698B3 (en) * 2002-11-13 2004-08-12 Loewe Opta Gmbh Electronic entertainment device with reception and display of electronic program guide data listed according to different program themes
JP2007519995A (en) * 2004-01-05 2007-07-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Ambient light derived from video content by mapping transformation via unrendered color space
WO2005069638A1 (en) * 2004-01-05 2005-07-28 Koninklijke Philips Electronics, N.V. Flicker-free adaptive thresholding for ambient light derived from video content mapped through unrendered color space
JP4556022B2 (en) * 2004-04-09 2010-10-06 クロイ電機株式会社 Illumination system for preventing desaturation and illumination method for preventing desaturation
JP4730813B2 (en) * 2005-03-29 2011-07-20 Kddi株式会社 Moving image data classification device
JP2007025111A (en) * 2005-07-14 2007-02-01 Sharp Corp System, method and program for presenting video image, and recording medium thereof
JPWO2007052395A1 (en) * 2005-10-31 2009-04-30 シャープ株式会社 Viewing environment control device, viewing environment control system, viewing environment control method, data transmission device, and data transmission method
CN101331802B (en) * 2005-12-15 2016-10-12 皇家飞利浦电子股份有限公司 For creating the system and method for artificial atmosphere
CN101438579B (en) * 2006-03-31 2012-05-30 皇家飞利浦电子股份有限公司 Adaptive rendering of video content based on additional frames of content
CN101427577A (en) * 2006-04-19 2009-05-06 夏普株式会社 Data transmitting device, data transmitting method, audiovisual environment control device, audiovisual environment control system and audiovisual environment control method
EP2040472A4 (en) * 2006-06-13 2009-11-11 Sharp Kk Data transmitting device, data transmitting method, audio-visual environment control device, audio-visual environment control system, and audio-visual environment control method
CN101480106B (en) * 2006-06-26 2012-07-04 皇家飞利浦电子股份有限公司 Device for generating light
US8111004B2 (en) * 2006-06-27 2012-02-07 Koninklijke Philips Electronics N.V. Color navigation system
JP2010511986A (en) * 2006-12-08 2010-04-15 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Ambient lighting
JP4948548B2 (en) * 2006-12-28 2012-06-06 シャープ株式会社 Transmission device, viewing environment control device, and viewing environment control system
US20110316426A1 (en) * 2006-12-28 2011-12-29 Sharp Kabushiki Kaisha Audio-visual environment control device, audio-visual environment control system and audio-visual environment control method
JP2008193605A (en) * 2007-02-07 2008-08-21 Pioneer Electronic Corp Irradiation state control method
WO2008110973A1 (en) * 2007-03-13 2008-09-18 Philips Intellectual Property & Standards Gmbh Method of controlling the lighting of a room in accordance with an image projected onto a projection surface
JP4922853B2 (en) * 2007-07-12 2012-04-25 シャープ株式会社 Viewing environment control device, viewing environment control system, and viewing environment control method
AT507705T (en) * 2007-08-17 2011-05-15 Koninkl Philips Electronics Nv Device and method for dynamic color changing
JP2009060541A (en) * 2007-09-03 2009-03-19 Sharp Corp Data transmission device and method, and viewing environment control device and method
JP5074864B2 (en) * 2007-09-03 2012-11-14 シャープ株式会社 Data transmission device, data transmission method, viewing environment control device, viewing environment control system, and viewing environment control method
JP2009122367A (en) * 2007-11-14 2009-06-04 Sharp Corp Image display device and image display method
BRPI0916465A2 (en) * 2008-07-15 2018-02-06 Sharp Kk data transmission apparatus, data transmission method, audiovisual environment control apparatus, audiovisual environment control system and audiovisual environment control method
KR20110042067A (en) * 2008-07-15 2011-04-22 샤프 가부시키가이샤 Data transmission device, data reception device, method for transmitting data, method for receiving data, and method for controlling audio-visual environment
JP2010041249A (en) * 2008-08-01 2010-02-18 Sharp Corp Dimming method, video processor, display device and dimming calculator
WO2010087153A1 (en) * 2009-01-27 2010-08-05 シャープ株式会社 Data structure for perception effect information, device for outputting perception effect information, method of outputting perception effect information, perception effect information generating device, control device, system, program and recording medium
WO2010087155A1 (en) * 2009-01-27 2010-08-05 シャープ株式会社 Data transmission device, data transmission mthod, audio-visual environment control devcice, audio-visual environment control method, and audio-visual environment control system
JP5426943B2 (en) * 2009-06-29 2014-02-26 パナソニック株式会社 Dimming control device for LED and lighting apparatus using the same
JP5622372B2 (en) * 2009-09-10 2014-11-12 任天堂株式会社 Image display system and lighting device
JP2011086437A (en) * 2009-10-14 2011-04-28 Nintendo Co Ltd Image display system, lighting system, information processing device, and control program
US9220158B2 (en) 2009-12-17 2015-12-22 Koninklijke Philips N.V. Ambience cinema lighting system
JP2013008453A (en) * 2011-06-22 2013-01-10 Panasonic Corp Lighting system
CN103249214B (en) * 2012-02-13 2017-07-04 飞利浦灯具控股公司 The remote control of light source
US9641725B2 (en) * 2012-11-27 2017-05-02 Philips Lighting Holding B.V. Use of ambience light for copy protection of video content displayed on a screen
CN105210454B (en) * 2013-05-16 2018-06-08 飞利浦灯具控股公司 The calibration based on camera of ambient lighting system
JP2015101189A (en) * 2013-11-25 2015-06-04 パイオニア株式会社 Onboard display device, head up display, control method, program, and memory medium
JP5965434B2 (en) * 2014-06-17 2016-08-03 任天堂株式会社 Image display system, lighting system, information processing apparatus, and control program
CN105245793B (en) * 2015-09-21 2018-08-14 广东小明网络技术有限公司 A kind of generation method and device of lamp control file

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104206020A (en) * 2012-03-01 2014-12-10 皇家飞利浦有限公司 Methods and apparatus for interpolating low frame rate transmissions in lighting systems
CN104206020B (en) * 2012-03-01 2016-10-26 皇家飞利浦有限公司 Method and apparatus for the transmission of interpolation low frame rate rate in the illumination system
KR101544069B1 (en) * 2012-08-07 2015-08-12 엘지디스플레이 주식회사 A light emitting diode display and method for driving the same
US9390677B2 (en) 2012-08-07 2016-07-12 Lg Display Co., Ltd. Light emitting diode display device with image data dependent compensation and method for driving the same

Also Published As

Publication number Publication date
JP2001343900A (en) 2001-12-14

Similar Documents

Publication Publication Date Title
JP6561098B2 (en) Scalable system to control color management including various levels of metadata
US10327021B2 (en) Display management server
JP5792369B2 (en) method and apparatus for image data conversion
US8767006B2 (en) Method for producing a color image and imaging device employing same
US9635377B2 (en) High dynamic range image processing device and method
EP2539884B1 (en) Display management methods and apparatus
US8130235B2 (en) Apparatus and method of automatically adjusting a display experiencing varying lighting conditions
RU2471214C2 (en) Apparatus for controlling liquid crystal display, liquid crystal display, method of controlling liquid crystal display, program and data medium
RU2454023C2 (en) Adaptive play of television content based on additional content frames
TWI338271B (en)
EP1532607B1 (en) Method and apparatus for processing video pictures improving dynamic false contour effect compensation
KR100591386B1 (en) Illuminator, projection display device, and method for driving the same
KR101044709B1 (en) Method for extracting and processing video content encoded in a rendered color space to be emulated by an ambient light source
JP3878030B2 (en) Image display device and image display method
DE60126554T2 (en) Image display system, image processing method and information storage medium
EP1407445B1 (en) System and method of data conversion for wide gamut displays
JP5301161B2 (en) Field sequential display of color images
US7091941B2 (en) Color OLED display with improved power efficiency
KR100943274B1 (en) Color signal correction device and a method, and image processing system using the same and a method thereof
KR100843090B1 (en) Apparatus and method for improving a flicker for images
US8441498B2 (en) Device and method for processing color image data
ES2700874T3 (en) Apparatus and methods for color display devices
RU2609760C2 (en) Improved image encoding apparatus and methods
CN1232103C (en) Signal processing unit and liquid crystal display device
US7894000B2 (en) Dominant color extraction using perceptual rules to produce ambient light derived from video content

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060228

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060228

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20090202

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090226

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20090417

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20091002

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20091026

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121030

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131030

Year of fee payment: 4