WO2005002240A1 - Display characteristic correction data calculation method, display characteristic correction data calculation program, and display characteristic correction data calculation device - Google Patents
Display characteristic correction data calculation method, display characteristic correction data calculation program, and display characteristic correction data calculation device
- Publication number
- WO2005002240A1 (PCT/JP2004/008919)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- correction data
- image
- area
- test pattern
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/04—Diagnosis, testing or measuring for television systems or their details for receivers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- Display characteristic correction data calculation method, display characteristic correction data calculation program, and display characteristic correction data calculation device
- the present invention relates to a display characteristic correction data calculation method for calculating correction data for the display characteristics of an image display device based on photographing data of a test pattern, a display characteristic correction data calculation program, and a display characteristic correction data calculation device.
- the image display device may be configured with only one video device, but a device configured by combining a plurality of video devices in order to achieve a larger screen is also known.
- for example, a projection system is known in which a plurality of projectors is used, each projecting a partial image, so that one large image is formed on one screen.
- image distortion, image position shift, color unevenness, brightness unevenness, white balance shift, non-optimality of gamma characteristics, etc. may occur in each video device.
- a geometric deviation, a color deviation, a luminance deviation, a difference in gamma characteristics, and the like may occur.
- non-uniformity in display characteristics such as geometric characteristics, color characteristics, luminance characteristics, white balance characteristics, and gamma characteristics can be visually perceived and impairs the overall image quality.
- a test pattern is displayed on an image display device, the image is photographed by a photographing device such as a digital camera, and correction data for correcting display characteristics based on the photographed image is obtained.
- a photographing device such as a digital camera
- Japanese Patent Application Laid-Open No. Hei 9-326981 describes a technique in which a camera is installed in front of a screen, a test pattern is displayed on the screen, the displayed test pattern is photographed by the camera, correction data is calculated based on the captured image, and an input image is corrected based on the correction data and then output and displayed, thereby performing geometric correction and the like.
- International Publication No. WO 99/31877 describes a multi-projection device in which one image is formed on a screen using a plurality of projectors each projecting a partial image, with a screen condition monitoring camera installed, and describes a technique for performing geometric deformation and local color correction of the partial images based on image information input from the camera.
- the imaging device may not be arranged at a distance such that the entire display area of the image display device can be accommodated in the screen.
- even if the entire image display area can be photographed, at least a part of the photographed image data may not have correct display characteristics.
- for example, when the image display device is of a projection type and a frame is provided in the screen, a subtle shadow or the like is generated in the vicinity of the frame, and because of this influence the display characteristics cannot be captured accurately.
- the present invention has been made in view of the above circumstances, and it is an object of the present invention to provide a display characteristic correction data calculation method, a display characteristic correction data calculation program, and a display characteristic correction data calculation device capable of calculating correction data for the display characteristics of the entire image and displaying a high-quality image, even when the photographing data obtained by photographing the test pattern does not normally include all of the images related to the test pattern data.
Disclosure of the Invention
- the calculation method of the display characteristic correction data according to the first invention is a method of displaying a test pattern on an image display device based on test pattern data, photographing the test pattern to acquire photographing data, and calculating, based on the acquired photographing data, correction data for correcting the display characteristics of the image display device, wherein, when the acquired photographing data does not normally include all of the images related to the test pattern data, a complement target area is set so as to include the area that does not normally include the image, and the complement target area is complemented based on an area other than the complement target area, whereby the correction data for all of the images related to the test pattern data is calculated.
- the method for calculating display characteristic correction data according to the second invention is the method for calculating display characteristic correction data according to the first invention, wherein the test pattern data is generated prior to displaying the test pattern on the image display device.
- the method for calculating display characteristic correction data according to a third aspect of the present invention is the method for calculating display characteristic correction data according to the first or second aspect, wherein the display characteristic includes at least one of a geometric characteristic, a color characteristic, a luminance characteristic, a white balance characteristic, and a gamma characteristic.
- the method for calculating display characteristic correction data according to a fourth aspect of the present invention is the method according to any of the first to third aspects, wherein the calculation of the correction data of the complement target area is performed by complementing the photographing data of the complement target area based on the photographing data of the area other than the complement target area, and calculating, based on the complemented photographing data, correction data for all of the images related to the test pattern data.
- the method for calculating display characteristic correction data according to a fifth aspect of the present invention is the method for calculating display characteristic correction data according to any of the first to third aspects, wherein the calculation of the correction data of the complement target area is performed by calculating the correction data of the area other than the complement target area based on the photographing data of that area, and complementing the correction data of the complement target area based on the calculated correction data of the area other than the complement target area.
- the method for calculating display characteristic correction data according to a sixth aspect of the present invention is a method wherein the calculation of the correction data of the complement target area is performed by calculating, based on the photographing data, correction data for the entire region of the image related to the photographing data, and complementing the correction data of the complement target area based on the correction data of the region other than the complement target area among the calculated correction data.
- a method for calculating display characteristic correction data according to a seventh aspect of the present invention is the method for calculating display characteristic correction data according to any of the first to sixth aspects, wherein the image related to the photographing data is displayed prior to setting the complement target area, and the complement target area is set in response to a manual operation on the displayed image.
- the method for calculating display characteristic correction data according to an eighth aspect of the present invention is the method for calculating display characteristic correction data according to any of the first to sixth aspects, wherein the complement target area is automatically set, based on the result of analyzing the photographing data and recognizing an area that does not normally include an image related to the test pattern data, so as to include that area.
- a method for calculating display characteristic correction data according to a ninth aspect of the present invention is the method for calculating display characteristic correction data according to the eighth aspect, wherein the analysis of the photographing data is performed by comparing a plurality of photographing data corresponding to a plurality of test pattern data.
- the method for calculating display characteristic correction data according to a tenth aspect of the present invention is the method for calculating display characteristic correction data according to the sixth aspect, wherein the complement target area is automatically set, based on the result of analyzing the correction data for the entire area of the image related to the photographing data and recognizing an area that does not normally include an image related to the test pattern data, so as to include that area.
- the method for calculating display characteristic correction data according to the eleventh aspect of the present invention is the method for calculating display characteristic correction data according to the tenth aspect, wherein the analysis is performed by comparing the correction data of the entire regions of the images related to a plurality of photographing data corresponding to a plurality of test pattern data.
- the method for calculating display characteristic correction data according to the twelfth aspect of the present invention is the method for calculating display characteristic correction data according to any of the first to sixth aspects, wherein an obstacle is detected using an obstacle detection device prior to setting the complement target area, and the complement target area is automatically set, based on a result of recognizing the image area corresponding to the detected obstacle as an area that does not normally include an image related to the test pattern data, so as to include that area.
- a method for calculating display characteristic correction data according to a thirteenth aspect of the present invention is the method for calculating display characteristic correction data according to the first aspect, wherein the complementation of the data of the complement target area is performed by copying data of an area other than the complement target area.
- a method for calculating display characteristic correction data according to a fourteenth aspect of the present invention is the method for calculating display characteristic correction data according to the first aspect, wherein the complementation of the data of the complement target area is performed by calculation based on a predetermined correlation using data of an area other than the complement target area.
- a method for calculating display characteristic correction data according to a fifteenth aspect of the present invention is the method for calculating display characteristic correction data according to the fourteenth aspect, wherein the predetermined correlation is based on the distance between a point of interest within the complement target area and the data of the area other than the complement target area.
- a method for calculating display characteristic correction data according to a sixteenth aspect of the present invention is the method for calculating display characteristic correction data according to the first aspect, wherein the image display device is a projection device comprising a projector for projecting an image and a screen for displaying the image projected by the projector.
- a method for calculating display characteristic correction data according to a seventeenth aspect of the present invention is the method for calculating display characteristic correction data according to the first aspect, wherein the image display device comprises a plurality of projectors each projecting a partial image and a screen for displaying the images projected by the projectors, the partial images projected on the screen by the projectors being arranged so that adjacent partial images overlap each other at their edges with a superimposed region.
- a program for calculating display characteristic correction data according to an eighteenth aspect of the present invention is a program for causing a computer to display a test pattern on an image display device based on the test pattern data, photograph the test pattern, and acquire photographing data.
- an apparatus for calculating display characteristic correction data comprises a photographing device for photographing a test pattern displayed on an image display device based on test pattern data to acquire photographing data, and a calculation device for calculating correction data for correcting the display characteristics of the image display device, wherein, when the acquired photographing data does not normally include all of the images related to the test pattern data, the calculation device sets a complement target area so as to include the area that is not normally included, complements the complement target area based on an area other than the complement target area, and thereby calculates the correction data, including that of the complement target area, for all of the images related to the test pattern data.
- FIG. 1 is a diagram showing an outline of a configuration for detecting a display characteristic of a multi-projection system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a functional configuration of a display characteristic correction system of the multi-projection system in the embodiment.
- FIG. 3 is a flowchart showing an outline of the operation of the display characteristic correction system of the multi-projection system in the embodiment.
- FIG. 4 is a diagram showing an example in which an obstacle is located in a shooting area in the embodiment.
- FIG. 5 is a diagram showing photographing data when a high-luminance test pattern is photographed in the presence of an obstacle in the embodiment.
- FIG. 6 is a view showing photographing data when a test pattern with low luminance is photographed in the presence of an obstacle in the embodiment.
- FIG. 7 is a diagram showing a state in which a complement target area in photographing data is set while providing a margin in the embodiment.
- FIG. 8 is a diagram illustrating an example of a state of a complementing target area in photographing data in the embodiment.
- FIG. 9 is a diagram showing a state in which the imaging data of the complement target area is complemented by using the photograph data of an area other than the complement target area in the embodiment.
- FIG. 10 is a diagram showing an example of a state of a complement target area in display characteristic correction data in the embodiment.
- FIG. 11 is a diagram showing how the display characteristic correction data of the complement target area is complemented by using the display characteristic correction data of an area other than the complement target area in the embodiment.
- FIG. 12 is a diagram for explaining a complementing method by weighting in the embodiment.
- FIG. 13 is a diagram showing a state where an obstacle is included in photographing data of a test pattern for correcting a geometric characteristic in the embodiment.
- FIG. 14 is a diagram showing how markers in a test pattern are complemented in the above embodiment.
- FIG. 15 is a diagram showing how markers are complemented in a test pattern when geometric distortion occurs in the above embodiment.
- FIG. 16 is a diagram showing photographing data of a test pattern for correcting color characteristics in the embodiment.
- FIG. 17 is a diagram showing a state in which an obstacle is included in photographing data of a test pattern for correcting color characteristics in the embodiment.
- FIG. 18 is a diagram showing a state in which photographing data of a test pattern is divided into small blocks in the embodiment.
- FIG. 19 is a diagram showing a state in which the imaging data divided into small blocks includes an obstacle in the embodiment.
- FIG. 20 is a diagram illustrating a state in which the shooting range of the shooting device is smaller than the display area of the image display device in the embodiment.
- FIG. 21 is a diagram showing a state in which a structural frame exists in the display screen of the image display device in the embodiment.
- FIG. 1 to 21 show an embodiment of the present invention
- FIG. 1 is a diagram showing an outline of a configuration for correcting display characteristics of a multi-projection system.
- the multi-projection system, which is an image display device, includes a plurality of projectors 1 such as LCD projectors or DLP projectors, an image processing device 2 for generating and outputting the partial images to be projected by each of these projectors 1 based on still image data and moving image data supplied from a recording medium or a communication line, and a screen 5 on which the partial images from the projectors 1 are projected.
- the partial images projected on the screen 5 by the projectors 1 are arranged so that each overlaps adjacent partial images with a superimposed area at its periphery, so as to constitute one image as a whole.
- such a multi-projection system may not have uniform display characteristics in each of the projectors 1 or in a relative relationship between the projectors 1.
- the display characteristic correction data calculation device in the display characteristic correction system is composed of a photographing device (the digital camera 4) and a calculation device (the computer 3).
- the digital camera 4 is for photographing an image displayed in the display area of the screen 5 and generating electronic photographed image data.
- the digital camera 4 is used as the photographing device, but the present invention is not limited to this.
- for example, photographing may be performed using a silver halide camera, and the captured image data may be generated by scanning the developed film directly, or a print created from the film, with a scanner or the like.
- the imaging device may be a system that directly scans the display area of the screen 5 using, for example, a line sensor.
- an obstacle 6 such as a tree is present between the digital camera 4 and the screen 5.
- the presence of the obstacle 6 between the digital camera 4, which is the photographing device, and the screen 5, which forms the display area of the image display device, means that at least a part of the test pattern information is missing from the captured image when the test pattern displayed on the screen 5 is photographed by the digital camera 4.
- the obstacle sensor 7 is used as one of the means for detecting such an obstacle 6.
- the obstacle sensor 7 may have various configurations, and some examples include an ultrasonic sensor, a laser beam, and the like.
- FIG. 1 shows an example in which the obstacle sensor 7 is arranged; however, as described later, it is also possible to detect the obstacle 6 from, for example, the photographing data without using the obstacle sensor 7.
- the computer 3 controls the entire system and performs various operations and processes as described below.
- the computer 3 generates a test pattern, has it displayed on the screen 5 by the projectors 1 via the image processing device 2, has the displayed test pattern photographed by the digital camera 4, and, based on the photographing data acquired from the digital camera 4 and the obstacle position information from the obstacle sensor 7, sets the complement target area so as to include the area that does not normally include the image; by complementing the complement target area based on the area other than the complement target area, it calculates correction data for all of the images related to the test pattern data.
- the computer 3 includes, like a general personal computer (PC), a main body 3a containing a CPU, a memory, a hard disk, and the like, a monitor 3b connected to the main body 3a, a mouse 3c used for pointing operations on the displayed screen, and a keyboard 3d connected to the main body 3a and used for inputting character data and the like.
- once the correction data has been calculated, the digital camera 4, the obstacle sensor 7, and the computer 3 can subsequently be removed from the multi-projection system; even after they are removed, the system can display the corrected high-quality image.
- FIG. 2 is a block diagram showing a functional configuration of a display characteristic correction system of the multi-projection system.
- this display characteristic correction system has a control unit 11 for controlling the entire system, a test pattern generation unit 12 for generating test pattern data for obtaining display characteristic correction data (image correction data), an image display unit 13 that performs display based on the test pattern data generated by the test pattern generation unit 12, a test pattern photographing unit 14 for photographing the test pattern displayed on the image display unit 13 and outputting photographing data, a complement target area manual setting unit 15 for manually setting, as a complement target area, an area in which the test pattern in the photographing data is not normally displayed, a complement target area automatic detection unit 16 for automatically detecting, as a complement target area, an area in which the test pattern in the photographing data is not normally displayed, a photographing data complement unit 17 that complements the photographing data of the complement target area set by the complement target area manual setting unit 15 or the complement target area automatic detection unit 16 based on the photographing data of the area other than the complement target area, a section that calculates the image correction data, an image correction data complementing unit 19 that complements the image correction data of the complement target area, and an image correction unit 20 for correcting the image data to be displayed on the image display unit 13 based on the generated correction data.
- the complement target area manual setting unit 15 includes a test pattern photographed image display unit 22 for displaying the photographing data of the test pattern output from the test pattern photographing unit 14 on the monitor 3b, and a complement target area designation unit 23 for setting the complement target area in accordance with a manual operation performed on the monitor 3b. The complement target area automatic detection unit 16 includes an obstacle detection unit 25 that detects an obstacle based on the test pattern photographing data output from the test pattern photographing unit 14, and a complement target area determination unit 26 that determines the complement target area based on the detection result of the obstacle detection unit 25.
- test pattern photographing section 14 corresponds to the digital camera 4
- image display section 13 corresponds to the projector 1 and the screen 5
- image correction section 20 corresponds to the image processing device 2.
- the control unit 11, the test pattern generation unit 12, the complement target area manual setting unit 15, the complement target area automatic detection unit 16, the photographing data complement unit 17, the section that calculates the image correction data, and the image correction data complementing unit 19 correspond to the computer 3.
- the test pattern photographed image display unit 22 corresponds to the monitor 3b
- the complement target area designation unit 23 corresponds to the mouse 3c (or the keyboard 3d).
- the obstacle detection unit 25 corresponds to the obstacle sensor 7 when detection is performed using the sensor.
- FIG. 3 is a flowchart showing the outline of the operation of the display characteristic correction system of the multi-projection system as described above.
- based on the control of the control unit 11, the test pattern generation unit 12 generates test pattern data for correcting display characteristics (step S1).
- this test pattern data is designed to clarify the display characteristics of the image display device, such as geometric characteristics, color characteristics, luminance characteristics, white balance characteristics, gamma characteristics, and so on.
- the test pattern data may be configured so that correction data for a plurality of display characteristics can be calculated with only one test pattern, or different test pattern data may be generated for each display characteristic to be corrected.
- the test pattern data generated by the test pattern generation unit 12 is sent to the image display unit 13 and displayed as a test pattern (step S2). Specifically, the test pattern data is converted into partial image data corresponding to each projector 1 by the image processing device 2, and each projector 1 projects its partial image data, so that the test pattern is displayed on the screen 5.
- the test pattern displayed on the screen 5 is photographed by the test pattern photographing unit 14, that is, the digital camera 4, and is output as photographed data (step S3).
- the photographing data is stored in an internal buffer memory or on the hard disk via an interface provided in the main body 3a of the computer 3.
- the display characteristic correction data calculation program executed in the computer 3 determines whether the complement target area is to be set manually or automatically (step S4). This determination is made, for example, according to whether the check mark on the operation screen displayed by the display characteristic correction data calculation program is attached to the check box for automatically setting the complement target area or to the check box for manual setting. When it is determined that manual setting is selected, the acquired photographing data is displayed on the test pattern photographed image display unit 22 of the complement target area manual setting unit 15, that is, on the monitor 3b, so that the photographed test pattern can be confirmed on the monitor 3b (step S5).
- the CPU sets the complementing target area according to the operation (step S6).
- the shape used to specify the area to be complemented may be any shape such as a rectangle, a triangle, a circle, a free polygon, or a free curve, but it is necessary to use a shape with which a two-dimensional area can be specified. In addition, since there may be a plurality of obstacles 6, a plurality of areas can be set.
- if it is determined in step S4 that automatic setting is selected, the obstacle detection unit 25 of the complement target area automatic detection unit 16 performs obstacle detection; the image area corresponding to the detected obstacle is recognized by the complement target area determination unit 26 as an area that does not normally include the image related to the test pattern data, and the complement target area is automatically set so as to include that area (step S7).
- the detection of the obstacle by the obstacle detection unit 25 may be performed based on the detection result by the obstacle sensor 7 as shown in FIG. 1 or by analyzing the photographing data. It may be performed, or both may be used.
- the complement target area determination unit 26 does not always designate only the area where an obstacle exists, or its vicinity, as the complement target: there are also areas that are unsuitable for use as image data for correcting display characteristics due to other factors, and areas set so that the arithmetic processing can be performed easily.
- even when the projectors 1 project the test pattern so that all of the images related to the test pattern data are displayed on the screen 5, the photographing data obtained by photographing the display area of the screen 5 may not include all of the images related to the test pattern. If display characteristic correction data were obtained based on such photographing data alone, display characteristic correction data for all of the images related to the test pattern could not be obtained. However, since the display characteristics, including the projection angle of view, are adjusted later so that projection is performed at the appropriate angle of view, display characteristic correction data for all of the images is required. Therefore, the complement target area is complemented so that correction data for the entire image can be obtained.
- as an example of a setting area for easily performing the arithmetic processing, when the contour of an obstacle has a complicated shape, the complement target area may be set to, for example, a rectangle so as to include the obstacle.
- the complement target area determination unit 26 determines that the above-described area is also a complement target area as necessary.
- the display characteristic correction data is calculated by performing the complementing process on the complement target area set in step S6 or step S7 above. There are two ways of doing this: one is to obtain the photographing data of the complement target area by complementation, calculate the photographing data of the entire area, and then calculate the display characteristic correction data of the entire area; the other is to calculate the display characteristic correction data of the area other than the complement target area based on the photographing data of that area, and then complement the display characteristic correction data of the complement target area from the display characteristic correction data of the area other than the complement target area.
- accordingly, it is first determined whether the complement target is the captured image (that is, the photographing data) or the display characteristic correction data (step S8).
- this determination is made, for example, according to whether the check mark on the operation screen displayed by the display characteristic correction data calculation program is attached to the check box for setting the complement target to the captured image or to the check box for setting it to the display characteristic correction data.
- the photographing data of the complementing target area is obtained by complementing the photographing data of the area other than the complementing target area (step S9).
- then, display characteristic correction data for the entire region is calculated based on the complemented photographing data of the entire region (step S10).
- when it is determined in step S8 that the complement target is the display characteristic correction data, it is determined whether or not the display characteristic correction data is to be calculated based on all of the photographing data including the complement target area (step S11).
- when not all of the photographing data is used, display characteristic correction data is calculated based on the portion of the photographing data other than the complement target area (step S12).
- when it is determined in step S11 that all of the photographing data is used, the display characteristic correction data is calculated based on all of the photographing data (step S13).
- the calculation of the display characteristic correction data can be performed not only in the region excluding the complement target region but also in all the photographing data without considering the complement target region.
- the display characteristic correction data of the complement target area is then complemented based on the display characteristic correction data of the area other than the complement target area (step S14), the display characteristic correction data for the entire region is obtained, and this processing ends.
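The flow of steps S4 to S14 described above can be summarized as follows. This is a minimal sketch only; the callable arguments stand in for the units of FIG. 2 and are hypothetical names, not part of the patent.

```python
def calculate_correction_data(test_pattern, photograph, set_area_manually,
                              detect_area_automatically, complement, calc_correction,
                              manual=False, complement_captured_image=True,
                              use_all_data=False):
    """Sketch of the flow of FIG. 3; the callables are hypothetical stand-ins."""
    shot = photograph(test_pattern)                       # steps S1-S3: display and photograph

    if manual:                                            # step S4: manual or automatic setting?
        area = set_area_manually(shot)                    # steps S5-S6
    else:
        area = detect_area_automatically(shot)            # step S7

    if complement_captured_image:                         # step S8: complement the photographing data?
        shot = complement(shot, area)                     # step S9
        return calc_correction(shot)                      # step S10
    if use_all_data:                                      # step S11: use all photographing data?
        correction = calc_correction(shot)                # step S13
    else:
        correction = calc_correction(shot, exclude=area)  # step S12
    return complement(correction, area)                   # step S14
```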
- in the above description, if the setting to automatically detect the complement target area is made in step S4, the actual detection processing is performed based on the photographing data in step S7.
- however, the present invention is not limited to this; it is also possible to first calculate the display characteristic correction data for all of the photographing data in step S13 without setting the complement target area (that is, without taking the obstacle 6 into account), and then automatically detect the complement target area based on the calculated display characteristic correction data.
- when the display characteristic correction data is created using such photographing data, correction data that greatly increases the brightness compared with other portions is generated for the low-brightness portion corresponding to the obstacle. Therefore, the correction data may be analyzed, a data portion exceeding an appropriate threshold value may be regarded as a portion where an obstacle exists, and the complement target area may be set so as to include that portion.
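As an illustration of this threshold-based detection, the following sketch marks blocks whose brightness-increase coefficient exceeds a threshold as obstacle blocks and returns a rectangular complement target area with a one-block margin; the threshold value and the array layout are assumptions for illustration only.

```python
import numpy as np

def complement_area_from_correction_data(gain_map, threshold=2.0):
    """gain_map: 2D array of per-block brightness correction coefficients.
    Blocks whose coefficient exceeds the (illustrative) threshold are treated
    as the obstacle portion; the bounding box with a one-block margin is
    returned as the complement target area, or None if nothing exceeds it."""
    mask = gain_map > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    top = max(rows.min() - 1, 0)
    bottom = min(rows.max() + 1, gain_map.shape[0] - 1)
    left = max(cols.min() - 1, 0)
    right = min(cols.max() + 1, gain_map.shape[1] - 1)
    return top, bottom, left, right
```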
- the complement target area is not limited to being set based on the photographing data in this way; the operator can set the complement target area arbitrarily, and it is also possible to automatically perform the series of steps from the determination of the obstacle to the calculation of the final display characteristic correction data without the operator having to set the complement target area.
- of the display characteristic correction data obtained in this way, the data portion used for electronic image correction is transmitted to the image processing device 2, and the image to be displayed is corrected by the image processing device 2.
- on the other hand, a portion that requires manual adjustment is displayed as an adjustment item or the like on the monitor 3b of the computer 3, and adjustment is performed by an operator or a user. Examples of portions that require manual adjustment include adjustment of the projection angle of view when the projection optical system of the projector 1 does not have an electric zoom function, and adjustment of the projection direction of each projector. Of course, these may be configured so that they can be adjusted electrically; however, considering the cost and weight of the entire system, it can be more efficient to perform some adjustments manually.
- it is possible to directly output the test pattern data generated by the test pattern generation unit 12 to the image display unit 13, but it is also possible to output the test pattern data via the image correction unit 20.
- that is, the first test pattern data is output to the image display unit 13 without any correction by the image correction unit 20, and the display characteristic correction data calculated from this test pattern is set in the image correction unit. Then, the second test pattern data (which may be, for example, the same as the first test pattern data) is corrected by the image correction unit 20, displayed on the image display unit 13, and photographed, and if the display characteristic correction data is calculated again from the photographing data, the accuracy can be further improved. In this way, the adjustment may be performed recursively to increase the accuracy of the display characteristic correction.
- FIG. 4 to FIG. 7 are diagrams for explaining an example of means for automatically detecting the complement target area.
- FIG. 4 shows that the obstacle 6 is located in the photographing area 31.
- to detect the complement target area in the situation of FIG. 4, for example, the following is performed.
- test pattern data corresponding to a high-brightness image is generated by the test pattern generation unit 12, displayed on the image display unit 13, and an image as shown in FIG. 5 is captured.
- FIG. 5 is a diagram showing photographing data when a test pattern having a high luminance is photographed in the presence of an obstacle.
- the photographing data 32 includes an image portion 6 a relating to the obstacle 6.
- next, test pattern data corresponding to a low-luminance image is generated by the test pattern generation unit 12 and displayed by the image display unit 13, and an image as shown in FIG. 6 is captured.
- FIG. 6 is a diagram showing photographing data when a test pattern having a low luminance is photographed in the presence of an obstacle.
- the photographing data 33 includes an image portion 6 a related to the obstacle 6.
- when the two photographing data are compared, the portion related to the test pattern should show a large change in luminance, and conversely, a portion with a small change in luminance can be estimated to be the image portion 6a related to the obstacle 6. Therefore, as shown in FIG. 7, the complement target area 35 in the photographing data 34 is set with a small margin so as to include the image portion 6a related to the obstacle 6.
- FIG. 7 is a diagram showing how the complement target area in the photographing data is set while providing a margin. Note that this margin is provided to exclude the influence that the image portion 6a relating to the obstacle 6 may have on other image portions, and the blurring caused by the obstacle 6 being outside the depth of field of the digital camera 4 photographing the screen 5; it can be set to any width.
- the complementing target area 35 at this time can be set along the contour of the image portion 6a of the obstacle 6, but the image of the obstacle 6 is taken into consideration in view of ease of processing and the like. Any shape (for example, a rectangular shape) including the portion 6a may be adopted.
- in the above, test patterns of high and low luminance are displayed, and the photographing data of these test patterns are compared (for example, by taking the difference between the two photographing data) to obtain the image portion 6a of the obstacle 6. Alternatively, only a high-luminance test pattern may be displayed, and a portion of the photographing data having a predetermined luminance or lower may be estimated to be the obstacle 6.
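A minimal sketch of the comparison just described, assuming the two photographs are grayscale arrays of equal size; the change threshold is an illustrative value, not one given in the patent.

```python
import numpy as np

def estimate_obstacle_mask(shot_bright, shot_dark, change_threshold=30):
    """Pixels whose luminance changes little between the high-luminance and
    low-luminance test pattern photographs are presumed to belong to the
    obstacle image portion 6a."""
    diff = np.abs(shot_bright.astype(np.int32) - shot_dark.astype(np.int32))
    return diff < change_threshold   # True where the obstacle is presumed to be
```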
- the present invention is not limited to differences in the brightness of the displayed test pattern; the obstacle 6 can also be detected by changing the color of the displayed test pattern or by displaying a test pattern with a specific texture.
- further, since the obstacle 6 is located closer to the digital camera 4 than the screen 5, the obstacle 6 may be automatically detected by distance detection using AF technology, and it is also possible to automatically detect the obstacle 6 using image recognition technology.
- in particular, when the shape of the obstacle in the captured image is roughly known in advance, the obstacle can be detected by a generally known shape recognition technique for images, such as pattern matching; a technique suitable for the target should be selected and used to detect the area occupied by the obstacle.
- FIG. 8 is a diagram showing an example of a state of a complementing target area in photographing data
- FIG. 9 is a diagram showing how the photographing data of the complement target area is complemented using the photographing data of an area other than the complement target area. This complementing processing of the photographing data is performed in step S9 shown in FIG. 3.
- FIG. 9 shows an enlarged view of the complement target area 6b in FIG.
- the photographing data at each point of interest in the complement target area 6b is complemented based on the photographing data of the vicinity just outside the complement target area 6b, while weighting according to the distance between the point of interest and the surrounding neighborhood.
- for example, for the center point 38, the contribution from the vicinity above the center point 38 (indicated by a downward arrow 39d) is equal to the contribution from the vicinity below it (indicated by an upward arrow 39u), and similarly, the contribution from the vicinity to the left of the center point 38 (indicated by a rightward arrow 39r) is equal to the contribution from the vicinity to its right (indicated by a leftward arrow 39l).
- if the point of interest is close to a neighboring point, the weight of that point is increased, and if it is far, the weight is reduced. A specific example of this weighting will be described later with reference to FIG. 12.
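One possible form of this distance-based weighting is sketched below: each point of interest inside the complement target area is filled from nearby points outside the area, with weights that fall off with distance. The inverse-distance weights and the search radius are assumptions for illustration, not the patent's prescribed formula.

```python
import numpy as np

def complement_point(data, target_mask, y, x, search_radius=8):
    """Fill data[y, x] (inside the complement target area, target_mask True)
    from nearby points outside the area, weighting each contribution by the
    inverse of its distance to the point of interest."""
    h, w = data.shape[:2]
    values, weights = [], []
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            ny, nx = y + dy, x + dx
            if (dy or dx) and 0 <= ny < h and 0 <= nx < w and not target_mask[ny, nx]:
                values.append(data[ny, nx])
                weights.append(1.0 / np.hypot(dy, dx))
    if not weights:
        return data[y, x]            # no usable neighbourhood within the radius
    return np.average(values, axis=0, weights=weights)
```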
- the complementing process is performed on the complementing target region 6c in the same manner as the complementing target region 6b.
- here, the correlation between the photographing data is estimated based on the distance between the point of interest and the vicinity, and the complementation is performed accordingly; however, in the case of test pattern data in which a specific pattern is repeated, it is preferable to estimate the correlation in consideration of that repetition and to complement accordingly.
- the complementing process of the photographing data may be performed in units of pixels constituting the image, but the complementing process performed here is a process for calculating the display characteristic correction data. It is not necessary to perform processing with detailed precision such as units. Therefore, for example, a plurality of pixels may be combined into one block, and the complementing process may be performed in units of this block. In this case, there is an advantage that the processing time can be significantly reduced.
- FIG. 10 is a diagram showing an example of a state of the complement target area in the display characteristic correction data.
- FIG. 11 is a diagram showing how the display characteristic correction data of the complement target area is complemented using the display characteristic correction data of the area other than the complement target area. This complementation processing of the display characteristic correction data is performed in step S14 shown in FIG. 3.
- when the display characteristic correction data is calculated based on the photographing data of the area other than the complement target area, display characteristic correction data for the area other than the complement target area is obtained; suppose that this is as shown in FIG. 10, for example.
- a predetermined number of pixels constituting an image are collectively treated as a block, and display characteristic correction data is calculated using this block as a unit.
- at the left end of the display characteristic correction data 41, there is a vertically long complement target area 6e, and at the middle right of the display characteristic correction data 41, a substantially L-shaped complement target area 6d exists.
- the actual display characteristic correction data is a set of display characteristic correction data corresponding to each position on the image (each block position, or each pixel position when one block is defined as one pixel).
- FIGS. 10 and 11 show a visual illustration corresponding to the same position of the image data.
- the complementation processing is performed on the display characteristic correction data.
- the processing for complementing the display characteristic correction data of the complement target area 6d will be described with reference to FIG. 11.
- the display characteristic correction data of the block adjacent to the target block is copied, for example.
- that is, the display characteristic correction data of the block 42, which is completely included in the complement target area 6d, is complemented by copying the display characteristic correction data of the block 43, the block closest to the block 42 in the area other than the complement target area 6d.
- similarly, the block 45 is complemented by copying the display characteristic correction data of the block 46, the block closest to the block 45 in the area other than the complement target area 6d.
- for the block 44, the block adjacent to its right side is the closest block in the area other than the complement target area 6d, so the display characteristic correction data of that block may be used; however, in order to simplify the processing, when copying the display characteristic correction data of the block 46 to the block 45 as described above, it is also possible to copy it to the block 44 at the same time.
- how to perform the processing at this time may be selected so as to be optimal (that is, to require a short processing time and obtain accurate results), depending on the type of display characteristic to be corrected, the generated test pattern, or the size and shape of the complement target area.
- the display characteristic correction data of the area other than the complementing target area 6 d that is close to the complementing target area 6 d is copied to supplement the display characteristic correction data of the complementing target area 6 d.
- the present invention is not limited to this; of course, the display characteristic correction data of a block to be complemented may also be complemented from multiple blocks using weighting according to the correlation, such as distance or pattern, between the block to be complemented and the blocks used for complementation.
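A sketch of the copy-based complementation described above, assuming the correction data is held as a per-block array; a brute-force nearest-block search is used here purely for illustration.

```python
import numpy as np

def complement_by_copy(correction, target_mask):
    """For every block inside the complement target area (target_mask True),
    copy the correction data of the nearest block outside the area."""
    filled = correction.copy()
    outside = np.argwhere(~target_mask)
    for y, x in np.argwhere(target_mask):
        distances = np.hypot(outside[:, 0] - y, outside[:, 1] - x)
        ny, nx = outside[distances.argmin()]
        filled[y, x] = correction[ny, nx]
    return filled
```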
- the above-described process of complementing the display characteristic correction data by copying is a process suitable for use when, for example, the display characteristics are color characteristics.
- FIG. 12 is a diagram for explaining a complementing method by weighting.
- the complementing method by weighting as shown in FIG. 12 may be applied to the complementing process of the photographing data, or may be applied to the complementing process of the display characteristic correction data.
- in this method, the data of the block of interest (or pixel of interest) is calculated by weighting the data of the blocks (or pixels) in its vicinity, using the distance as the correlation.
- in the data 51, the area surrounded by the thick line in the figure is the complement target area 6f.
- a case of complementing, for example, a block 52 at the upper left corner in the complement target area 6f will be described.
- the blocks in the area adjacent to the block 52 and outside the complement target area 6f are the block 53 above the block 52, the block 54 to its left, and the block 55 to its upper left. Therefore, the data of the block 52 is complemented based on the data of the blocks 53, 54, and 55. At this time, in consideration of the distance, the data of the blocks 53 and 54 are each given a weight of 2, and the data of the block 55 a weight of 1. That is, assuming that the data of the blocks 53, 54, and 55 are X, Y, and Z, respectively, the data of the block 52 is calculated as (2X + 2Y + Z) / 5.
- the data of block 52 is calculated by taking a weighted average, but if the increase or decrease of the data is not linear, it is needless to say that a calculation method suitable for that is used. .
- calculation method described here can be applied to any of the case where the data is photographing data and the case where the data is display characteristic correction data.
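The weighted average for block 52 in FIG. 12 can be written directly, taking X, Y, and Z to be the data of blocks 53, 54, and 55; this covers only the linear case mentioned above.

```python
def complement_block_52(x_block53, y_block54, z_block55):
    """Distance-weighted average with ratios 2:2:1, i.e. (2X + 2Y + Z) / 5."""
    return (2 * x_block53 + 2 * y_block54 + z_block55) / 5
```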
- FIG. 13 is a diagram showing how the obstacles are included in the image data of the test pattern for correcting the geometric characteristics
- Fig. 14 is a diagram showing how markers in the test pattern are complemented
- Fig. 15 is a diagram illustrating how a marker in a test pattern is complemented when geometric distortion occurs.
- when correcting the geometric characteristics, the test pattern generation unit 12 generates test pattern data in which, for example, cross-shaped markers are two-dimensionally arranged at appropriate intervals.
- when such a test pattern is displayed on the image display unit 13 and the obstacle 6 is present, some of the markers are not photographed by the test pattern photographing unit 14.
- the shaded portion is the complement target area 6g including the portion where the obstacle 6 exists.
- the markers 62 that should originally be included in the complement target area 6g are missing.
- the photographing data complement unit 17 calculates the coordinate information of each missing marker 62 (more specifically, the coordinate information of the point where the two lines of the cross intersect) based on the portion of the photographing data 61 other than the complement target area 6g. First, coordinate detection of the markers 62 included in the photographing data 61 other than the complement target area 6g is performed as follows.
- a marker detection area 63, which is considered to include a marker 62, is set for each marker 62, and automatic detection of the marker 62 is performed within each marker detection area 63.
- the reason why the marker detection area 63 is set is that the processing time can be reduced as compared with the case where all areas are set as detection targets. Thereby, the coordinates of the marker 62 other than the complement target area 6 g are detected.
- each marker can be designated using a row index and a column index.
- the missing one of such indices is the index of the missing marker 62 in the complementation target area 6g. For example, in the example shown in Fig. 13, two markers 62 are missing, but if the index is represented as (row, column), one of them will become an index (4, 4). And the other is an index (5, 4).
- the missing marker 62 (or one of the missing markers, if there are several) is specified by its row index and column index as described above; the markers 62 having the same row index as the missing marker 62 of interest are selected as a first marker group, and the markers 62 having the same column index as the missing marker 62 of interest are selected as a second marker group.
- based on the coordinates of the markers 62 included in the first marker group, a straight line 63h in the horizontal direction is estimated, and similarly, based on the coordinates of the markers 62 included in the second marker group, a straight line 63V in the vertical direction is estimated. The coordinates 62a of the point where the two estimated straight lines 63h and 63V intersect are taken as the estimated coordinates of the missing marker 62 of interest.
- when geometric distortion occurs as shown in FIG. 15, a curve 63h is estimated from the coordinates of the markers arranged in the horizontal direction by, for example, spline interpolation. Then, the coordinates 62b of the point where this curve 63h intersects the vertical straight line 63V, estimated based on the coordinates of the markers 62 included in the second marker group, can be taken as the estimated coordinates of the missing marker 62 of interest.
- spline interpolation has been described as an example of the algorithm for estimating a curve, but the present invention is not limited to this, and various algorithms such as Lagrange interpolation may be employed.
- it is desirable to use an optimum algorithm for such complementation according to the characteristics of the display surface of the image display unit 13 (for example, a concave or convex surface) and the photographing characteristics of the test pattern photographing unit 14 (for example, distortion of the photographing optical system).
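The estimation of a missing marker from the intersection of a row fit and a column fit might look as follows for the straight-line case of FIG. 14; with geometric distortion, a spline or Lagrange fit would replace the lines. The (x, y) coordinate convention and the least-squares line fit are assumptions for illustration.

```python
import numpy as np

def estimate_missing_marker(row_markers, col_markers):
    """row_markers: (x, y) coordinates of the markers sharing the missing
    marker's row index; col_markers: those sharing its column index.
    Fits a nearly horizontal line y = a*x + b and a nearly vertical line
    x = c*y + d, and returns their intersection as the estimated position."""
    rx, ry = np.asarray(row_markers, dtype=float).T
    a, b = np.polyfit(rx, ry, 1)
    cx, cy = np.asarray(col_markers, dtype=float).T
    c, d = np.polyfit(cy, cx, 1)
    y = (a * d + b) / (1.0 - a * c)   # solve the two line equations
    x = c * y + d
    return x, y
```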
- the position of the marker 62 in the entire image is determined.
- the geometric correction data is calculated so that these markers 62 become the original display positions.
- a technique as described in the above-mentioned Japanese Patent Application Laid-open No. Hei 9-326981 may be used.
- Fig. 16 is a diagram showing photographing data of a test pattern for correcting color characteristics, and Fig. 17 is a diagram showing photographing data of the test pattern for correcting color characteristics in which an obstacle is included.
- FIG. 18 is a diagram showing a state in which photographing data of a test pattern is divided into small blocks
- FIG. 19 is a diagram showing a state in which photographing data divided into small blocks includes an obstacle.
- when correcting color characteristics, color information measurement areas 72 to 76 set in the photographing data 71 as shown in Fig. 16 are used, which differ from those used when performing geometric correction and luminance correction. If any one of these color information measurement areas 72 to 76 is at least partially blocked by an obstacle, that is, if any part of the color information measurement areas 72 to 76 is lost as shown in FIG. 17, the color correction will be affected.
- the photographing data 71 may be affected by the flare of the optical system of the photographing device.
- the obstacle portion may affect the color information measurement areas 72 to 76, and it may not be possible to accurately obtain color correction data.
- the photographing data is complemented as in step S9 in FIG. 3
- when the photographing data is as shown in Fig. 17, that is, when the color information measurement area 72 located at the center of the color information measurement areas 72 to 76 is blocked by the obstacle 6, the complement target area 77 is set with a margin so as to include the obstacle image 6h.
- the setting of the complement target area 77 in the manual case of step S6 is performed, for example, by moving the pointer on the screen using the mouse 3c or the like as shown in the figure and clicking, for example, the four points 77a, 77b, 77c, and 77d to set the area.
- a triangular or polygonal area can be set by increasing or decreasing the number of points.
- the width of the margin at this time includes, for example, a region that may be affected by the above-described flare or the like.
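A small sketch of turning the clicked points 77a to 77d into a rectangular complement target area with such a margin; the margin value and coordinate convention are illustrative assumptions.

```python
def area_from_clicked_points(points, margin=5):
    """points: list of clicked (x, y) positions, e.g. the four corners 77a-77d.
    Returns the bounding rectangle expanded by a margin that is meant to cover
    flare and out-of-focus blur around the obstacle."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```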
- the complement target area is set manually, but it is also possible to perform the automatic setting as described above.
- the photographing data is complemented, for example, by the method described with reference to FIG.
- the color information measurement area is circular.
- the color information measurement area is not limited to this.
- the color information measurement area may be rectangular, and other shapes may be employed.
- the size of one color information measurement area can be changed.
- the number of color information measurement areas is five in the example shown in FIG. 16 above, but it can be increased or reduced; it can be set arbitrarily, up to a number of areas that covers the entire display area.
- FIG. 18 shows an example in which the number of color information measurement areas indicated by the cells is increased (for example, maximized) to cover the entire area 78.
- even when the obstacle image 6h exists as shown in FIG. 19, the complement target area 77 may be set in the same manner and the complementing process performed in the same way.
- when correcting the luminance characteristic as a display characteristic, the test pattern for correcting the geometric characteristic or the test pattern for correcting the color characteristic as described above can be used, or a test pattern consisting only of white (or a test pattern incorporating grid points for determining the coordinate position) may be used.
- in this case as well, the complement target area may be set manually or automatically as described above, and the set complement target area may be complemented by copying or by complementation that takes the correlation into consideration, so that the luminance becomes uniform.
- Note that the luminance correction amount is handled as a correction coefficient applied to the luminance value. When the distance described above is used as the correlation and the block 52 in FIG. is to be complemented, the calculation formula described above, in which each of the two nearer blocks is counted twice, the farther block once, and the sum is divided by five, can be used; a sketch of this weighted average follows.
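A minimal sketch of this weighted average is shown below, assuming that two of the reference blocks are the nearer ones and one is the farther one; the numeric values are hypothetical per-block correction coefficients.

```python
def complement_block(near_a, near_b, far_c):
    """Estimate the correction value of a block to be complemented from
    three reference blocks: the two nearer blocks are weighted twice as
    heavily as the more distant one, i.e. (2a + 2b + c) / 5."""
    return (2.0 * near_a + 2.0 * near_b + far_c) / 5.0

# Hypothetical luminance correction coefficients of neighbouring blocks.
print(complement_block(1.02, 0.98, 1.10))  # -> 1.02
```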
- FIG. 20 is a diagram showing a state where the shooting range of the shooting device is smaller than the display area of the image display device.
- The case where the imaging device 4 cannot capture all of the image displayed in the display area 5a of the image display device is not limited to the case where a tangible obstacle is located between the imaging device 4 and the display area 5a. As shown in FIG. 20, for example, when the installation position of the imaging device 4 is restricted (that is, when the imaging device 4 cannot be moved further away from the image display device in a room or the like), the entire display area 5a of the image display device may not fit within the shooting range 4a.
- Further, when the image display device is of the projection type, the luminous flux to be projected may be wider than the screen, so that the image related to the test pattern data cannot be displayed entirely on the screen in some cases.
- FIG. 21 is a diagram showing a state in which a structural frame exists in the display screen of the image display device.
- In this case, the test pattern is not displayed at the portion of the frame 81 arranged on the screen 5, and the test pattern displayed near the frame 81 may also be affected by the shadow of the frame 81.
- Examples of the case corresponding to FIG. 21 include a case where a frame structure is formed using frame members to maintain the flatness of the screen 5, and a case where a large screen is formed by arranging a plurality of monitors or the like.
- Even in the cases shown in FIGS. 20 and 21, the part of the image related to the test pattern data that was not photographed by the photographing device (or that part plus a margin) is complemented, and by performing the complementing processing as described above, the display characteristic correction data can be calculated for the entire range of the image related to the test pattern data.
- The cases where the acquired photographing data does not normally include the entire image related to the test pattern data are not limited to these.
- For example, a portion where indoor illumination light, sunlight, natural light from the sky, reflected light from waves, or the like is reflected is also included.
- As a complementing method in such cases, data may, for example, be radially copied from the center of the image, where normal photographing data is obtained, toward the periphery of the screen, where the photographing data is not normal; a sketch of such radial copying follows.
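One possible reading of this radial copying is sketched below in Python: for every pixel whose photographing data is not normal, the value of the nearest normal pixel found while stepping back toward the image centre is copied outward. The validity mask and the stepping scheme are assumptions made for the illustration.

```python
import numpy as np

def radial_copy(image, valid):
    """Fill pixels whose photographing data is not normal (valid == False)
    by copying the nearest valid value found while stepping from the
    pixel back toward the image centre."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = image.copy()
    for y in range(h):
        for x in range(w):
            if valid[y, x]:
                continue
            # Step from this pixel toward the centre until valid data is met.
            dy, dx = cy - y, cx - x
            dist = max(abs(dy), abs(dx), 1.0)
            sy, sx = dy / dist, dx / dist
            py, px = float(y), float(x)
            for _ in range(int(dist)):
                py += sy
                px += sx
                iy, ix = int(round(py)), int(round(px))
                if valid[iy, ix]:
                    out[y, x] = image[iy, ix]
                    break
    return out

# Hypothetical use: the outer border of the shot is assumed to be abnormal.
shot = np.random.rand(240, 320, 3)
ok = np.zeros((240, 320), dtype=bool)
ok[20:-20, 20:-20] = True
filled = radial_copy(shot, ok)
```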
- The case where the display area of the image display device has a curved shape includes, for example, a case where the screen is formed in a concave shape and a case where the display region of an image display device using a CRT is convex; specific examples include an arch-shaped screen arranged along a cylindrical wall surface and a hemispherical dome-shaped screen used, for example, in a planetarium.
- However, the present invention is not limited to this; the setting may first be performed automatically and then be adjusted manually. In this way, more accurate area setting that makes use of human vision can be performed while reducing the burden on the operator.
- The setting for the obstacle 6 only needs to be performed once unless the setup is changed. However, if necessary, it can be performed separately for each type of display characteristic to be corrected, that is, for the case where the luminance characteristic is to be corrected, the case where the geometric characteristic is to be corrected, the case where the color characteristic is to be corrected, and so on.
- In the embodiment described above, the projector 1 and the screen 5, which constitute a projection-type image display device, are used as the image display unit, but the present invention is not limited to this; it can be applied in the same way when a plasma display, a liquid crystal display, or the like is used as the image display unit.
- Further, the above-described display characteristic correction system 1 is not applied only to the projection system; it may be applied to an image display device such as a CRT monitor or a liquid crystal monitor, or to a multi-type image display device configured by combining a plurality of such devices.
- As described above, even if the acquired photographing data does not normally include the entire image related to the test pattern data, the display characteristic correction data for the entire image can be calculated.
- Even if the display area of the image display device protrudes from the imaging range of the imaging device, or a structure such as a frame exists in the display area of the image display device, the display characteristic correction data for the entire image can be calculated without removing these factors, and a high-quality image can be displayed.
- In addition, since the amount of data to be processed is reduced, the processing time can be shortened.
- Furthermore, since the processing up to the calculation of the display characteristic correction data can be performed as is using conventional software and equipment, accurate display characteristic correction data can be calculated simply by adding the complementing part as an add-on processing module. Therefore, already developed software and equipment can be used effectively, and development costs can be reduced.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Geometry (AREA)
- Controls And Circuits For Display Device (AREA)
- Projection Apparatus (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04746388A EP1638345A1 (en) | 2003-06-25 | 2004-06-18 | Method for calculating display characteristic correction data, program for calculating display characteristic correction data, and device for calculating display characteristic correction data |
US10/560,730 US20070091334A1 (en) | 2003-06-25 | 2004-06-18 | Method of calculating correction data for correcting display characteristic, program for calculating correction data for correcting display characteristic and apparatus for calculating correction data for correcting display characteristic |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-181793 | 2003-06-25 | ||
JP2003181793A JP2005020314A (ja) | 2003-06-25 | 2003-06-25 | 表示特性補正データの算出方法、表示特性補正データの算出プログラム、表示特性補正データの算出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005002240A1 true WO2005002240A1 (ja) | 2005-01-06 |
Family
ID=33549536
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/008919 WO2005002240A1 (ja) | 2003-06-25 | 2004-06-18 | 表示特性補正データの算出方法、表示特性補正データの算出プログラム、表示特性補正データの算出装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20070091334A1 (ja) |
EP (1) | EP1638345A1 (ja) |
JP (1) | JP2005020314A (ja) |
WO (1) | WO2005002240A1 (ja) |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005189542A (ja) * | 2003-12-25 | 2005-07-14 | National Institute Of Information & Communication Technology | 表示システム、表示プログラム、表示方法 |
JP4454373B2 (ja) * | 2004-04-08 | 2010-04-21 | オリンパス株式会社 | キャリブレーション用カメラ装置 |
EP1628492A1 (en) * | 2004-08-17 | 2006-02-22 | Dialog Semiconductor GmbH | A camera test system |
EP1648181A1 (en) | 2004-10-12 | 2006-04-19 | Dialog Semiconductor GmbH | A multiple frame grabber |
JP2007256506A (ja) * | 2006-03-22 | 2007-10-04 | Victor Co Of Japan Ltd | 画像投影装置 |
JP4716026B2 (ja) * | 2006-05-24 | 2011-07-06 | セイコーエプソン株式会社 | 投写装置、画像表示システム、プログラム、情報記憶媒体および投写方法 |
US20070286514A1 (en) * | 2006-06-08 | 2007-12-13 | Michael Scott Brown | Minimizing image blur in an image projected onto a display surface by a projector |
JP4340923B2 (ja) | 2007-02-23 | 2009-10-07 | セイコーエプソン株式会社 | プロジェクタ、プログラムおよび情報記憶媒体 |
CN101765877B (zh) * | 2007-06-13 | 2013-05-01 | 日本电气株式会社 | 图像显示设备、图像显示方法及图像显示程序 |
JP2009021771A (ja) * | 2007-07-11 | 2009-01-29 | Hitachi Ltd | 映像調整方法 |
US7986356B2 (en) * | 2007-07-25 | 2011-07-26 | Hewlett-Packard Development Company, L.P. | System and method for determining a gamma curve of a display device |
US20090027504A1 (en) * | 2007-07-25 | 2009-01-29 | Suk Hwan Lim | System and method for calibrating a camera |
US20090153749A1 (en) * | 2007-12-14 | 2009-06-18 | Stephen Randall Mixon | Portable projector background color correction scheme |
US20090167782A1 (en) * | 2008-01-02 | 2009-07-02 | Panavision International, L.P. | Correction of color differences in multi-screen displays |
JP2009244379A (ja) * | 2008-03-28 | 2009-10-22 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
US8125494B2 (en) * | 2008-04-03 | 2012-02-28 | American Panel Corporation | Method for mapping optical properties for a display device |
JP5321011B2 (ja) * | 2008-11-25 | 2013-10-23 | ソニー株式会社 | 画像信号処理装置、画像信号処理方法および画像投射装置 |
JP2011155412A (ja) * | 2010-01-26 | 2011-08-11 | Panasonic Electric Works Co Ltd | 投影システムおよび投影システムにおける歪み修正方法 |
JP5397632B2 (ja) * | 2010-06-23 | 2014-01-22 | セイコーエプソン株式会社 | 投写装置、画像表示システム、プログラム、情報記憶媒体および投写方法 |
JP5453352B2 (ja) * | 2011-06-30 | 2014-03-26 | 株式会社東芝 | 映像表示装置、映像表示方法およびプログラム |
CN102821285A (zh) * | 2012-07-06 | 2012-12-12 | 中影数字巨幕(北京)有限公司 | 一种数字电影放映方法、优化装置和放映系统 |
US9648086B2 (en) | 2013-06-28 | 2017-05-09 | Sonic Ip, Inc. | System, method, and computer program product for providing test logic to user devices |
US20150035993A1 (en) * | 2013-08-05 | 2015-02-05 | Sonic Ip, Inc. | Systems, methods, and media for calibrating a display device |
JP2015080190A (ja) * | 2013-09-11 | 2015-04-23 | 株式会社リコー | 抽出方法、プログラム、抽出装置および画像投影装置 |
JP5687747B2 (ja) * | 2013-10-10 | 2015-03-18 | オリンパスイメージング株式会社 | 携帯機器 |
JP6307843B2 (ja) * | 2013-11-12 | 2018-04-11 | 株式会社リコー | 補間方法、プログラムおよび補間装置 |
US9332290B2 (en) | 2013-12-31 | 2016-05-03 | Sonic Ip, Inc. | Methods, systems, and media for certifying a playback device |
JP2015139006A (ja) * | 2014-01-20 | 2015-07-30 | 株式会社リコー | 情報処理装置、プログラム |
JP2015171039A (ja) * | 2014-03-07 | 2015-09-28 | キヤノン株式会社 | 色処理装置およびその方法 |
JP2016014712A (ja) * | 2014-07-01 | 2016-01-28 | キヤノン株式会社 | シェーディング補正値算出装置およびシェーディング補正値算出方法 |
US9380297B1 (en) * | 2014-12-04 | 2016-06-28 | Spirent Communications, Inc. | Video streaming and video telephony uplink performance analysis system |
KR102545813B1 (ko) * | 2016-12-30 | 2023-06-21 | 삼성전자주식회사 | 디스플레이 장치 및 디스플레이 방법 |
EP3668077B1 (en) * | 2017-08-09 | 2023-08-02 | FUJIFILM Corporation | Image processing system, server device, image processing method, and image processing program |
JP2019047311A (ja) | 2017-09-01 | 2019-03-22 | セイコーエプソン株式会社 | 画像投写システム及びその制御方法 |
KR102506919B1 (ko) * | 2018-03-14 | 2023-03-07 | 주식회사 엘엑스세미콘 | 테스트 기능을 갖는 디스플레이 구동 장치 및 이를 포함하는 디스플레이 장치 |
US11405695B2 (en) | 2019-04-08 | 2022-08-02 | Spirent Communications, Inc. | Training an encrypted video stream network scoring system with non-reference video scores |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4835602A (en) * | 1987-08-27 | 1989-05-30 | North American Philips Corporation | Color projection television system with apparatus for correcting misconvergence and misgeometry which calculates coefficients of equations representing deflection correction waveforms |
US5796425A (en) * | 1995-05-16 | 1998-08-18 | Mitsubishi Denki Kabushiki Kaisha | Elimination of the effect of difference in vertical scanning frequency between a display and a camera imaging the display |
JP2004180142A (ja) * | 2002-11-28 | 2004-06-24 | Canon Inc | 画像処理装置、階調変換特性設定方法及びプログラム |
- 2003
  - 2003-06-25 JP JP2003181793A patent/JP2005020314A/ja active Pending
- 2004
  - 2004-06-18 US US10/560,730 patent/US20070091334A1/en not_active Abandoned
  - 2004-06-18 EP EP04746388A patent/EP1638345A1/en not_active Withdrawn
  - 2004-06-18 WO PCT/JP2004/008919 patent/WO2005002240A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0767125A (ja) * | 1993-08-24 | 1995-03-10 | Nec Corp | 投写型ディスプレイの幾何学歪調整装置 |
JPH10215467A (ja) * | 1997-01-30 | 1998-08-11 | Hitachi Ltd | 位置測定方法及び位置測定システム |
JP2001008240A (ja) * | 1999-06-24 | 2001-01-12 | Minolta Co Ltd | Crtのルミナンス特性測定装置 |
Also Published As
Publication number | Publication date |
---|---|
JP2005020314A (ja) | 2005-01-20 |
US20070091334A1 (en) | 2007-04-26 |
EP1638345A1 (en) | 2006-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005002240A1 (ja) | 表示特性補正データの算出方法、表示特性補正データの算出プログラム、表示特性補正データの算出装置 | |
JP6882835B2 (ja) | 画像を表示するためのシステム及び方法 | |
JP3925521B2 (ja) | スクリーンの一部の辺を用いたキーストーン補正 | |
TWI242373B (en) | Image processing system, projector and image processing method | |
JP5266954B2 (ja) | 投写型表示装置および表示方法 | |
TWI253006B (en) | Image processing system, projector, information storage medium, and image processing method | |
JP5266953B2 (ja) | 投写型表示装置および表示方法 | |
KR100571175B1 (ko) | 화상 처리 시스템, 프로젝터, 정보 기억 매체 및 화상처리 방법 | |
US7226173B2 (en) | Projector with a plurality of cameras | |
JP4165540B2 (ja) | 投写画像の位置調整方法 | |
JP7289653B2 (ja) | 制御装置、内視鏡撮像装置、制御方法、プログラムおよび内視鏡システム | |
JP3996610B2 (ja) | プロジェクタ装置とその画像歪補正方法 | |
KR20160034847A (ko) | 단초점 카메라를 이용하여 디스플레이 시스템을 캘리브레이팅하기 위한 시스템 및 방법 | |
JP2004260785A (ja) | 画像歪み補正機能を備えたプロジェクタ装置 | |
WO2006025191A1 (ja) | マルチプロジェクションシステムにおける幾何補正方法 | |
JP2017156581A (ja) | 投影装置及びその制御方法 | |
JP2006246502A (ja) | 画像歪み補正機能を備えたプロジェクタ装置 | |
JP2005099150A (ja) | 画像表示装置の画像補正データ算出方法 | |
CN112189337A (zh) | 图像处理装置和图像处理方法 | |
JP4661576B2 (ja) | 投写画像の位置調整方法 | |
JP2011199717A (ja) | 投写型表示装置および画像表示方法 | |
JP2002365718A (ja) | マルチディスプレイ装置およびその調整方法 | |
JP4661651B2 (ja) | 投写画像の位置調整方法 | |
JP2021061510A (ja) | 投影制御装置及び方法、プログラム、記憶媒体 | |
WO2023189456A1 (ja) | 情報処理装置、情報処理方法、および記録媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2007091334 Country of ref document: US Ref document number: 10560730 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004746388 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2004746388 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10560730 Country of ref document: US |