WO2019045010A1 - Information processing apparatus, information processing system, and information processing method - Google Patents
Information processing apparatus, information processing system, and information processing method
- Publication number
- WO2019045010A1 (PCT/JP2018/032246)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- luminance
- image
- projector
- information processing
- unit
- Prior art date
Classifications
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G09G3/002—Control arrangements for visual indicators, to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
- G09G3/006—Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
- H04N9/3147—Multi-projection systems
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
- H04N9/3191—Testing of projection devices
- H04N9/3194—Testing of projection devices including sensor feedback
- H04N9/68—Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to an information processing apparatus, an information processing system, and an information processing method.
- Uniformity correction, which corrects the brightness on the screen so that it is substantially uniform over the entire screen, is provided for the medical and printing industries and the like. Uniformity correction is realized by multiplying each RGB (Red, Green, Blue) signal of each pixel constituting an input image by predetermined uniformity correction data.
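As a hedged illustration (the function and array names are ours, not from the patent), uniformity correction of this kind can be sketched in NumPy as a pixel-wise multiplication of each RGB channel by precomputed correction data:

```python
import numpy as np

def apply_uniformity_correction(image, correction):
    """Multiply each RGB signal of each pixel by its correction gain.

    image:      (H, W, 3) uint8 input image
    correction: (H, W, 3) float per-pixel uniformity correction data
    """
    corrected = image.astype(np.float64) * correction
    return np.clip(corrected, 0, 255).astype(np.uint8)

# 2x2 example: the gain map halves the brightness of the right column.
img = np.full((2, 2, 3), 200, dtype=np.uint8)
gain = np.ones((2, 2, 3))
gain[:, 1, :] = 0.5
out = apply_uniformity_correction(img, gain)
```

The clipping and round-trip through float avoid uint8 overflow; the gain map itself would come from a calibration measurement.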
- Patent Document 1 proposes an image processing apparatus that prevents a grid-like luminance distribution from arising by shifting the pixel positions of the R, G, and B signals relative to one another by several tens of pixels when multiplying by the uniformity correction data.
- However, the apparatus of Patent Document 1 only adjusts relative brightness within the screen and cannot realize display according to the absolute value of the brightness.
- An object of the present invention is to provide an information processing apparatus and the like that can realize display according to the absolute value of luminance.
- The information processing apparatus includes a first acquisition unit that acquires luminance correction information for correcting, in accordance with the luminance information included in an input signal, the luminance measured from a predetermined measurement position on a display unit that displays an image based on the input signal;
- a second acquisition unit that acquires image data to be displayed on the display unit; a luminance correction unit that corrects the image data acquired by the second acquisition unit based on the luminance correction information acquired by the first acquisition unit;
- and an output unit that outputs the image data corrected by the luminance correction unit to the display unit.
- FIG. 18 is an explanatory view showing a configuration of an information processing system at a deformation acquisition stage of the third embodiment.
- FIG. 18 is an explanatory drawing showing the configuration of the information processing system at the luminance distribution acquisition stage of the third embodiment.
- Explanatory views illustrate the arrangement of the projector and the screen: the projector and the screen viewed from above, the projector and the screen viewed from the right side, and the projection state of the projector.
- FIG. 16 is a flow chart showing a flow of processing of a program at a preparation stage of Embodiment 3.
- A flowchart shows the flow of processing of the deformation acquisition subroutine.
- FIG. 18 is an explanatory diagram for explaining a projection state of the projector of the fourth embodiment.
- FIG. 18 is an explanatory diagram for explaining an arrangement of a projector and a screen according to a fifth embodiment.
- FIG. 21 is an explanatory view showing a configuration of an information processing system at a use stage of the sixth embodiment.
- FIG. 21 is a functional block diagram showing the operation of the information processing system of the seventh embodiment.
- FIG. 21 is an explanatory view showing a configuration of an information processing system according to an eighth embodiment.
- FIG. 33A shows a state in which the use range is superimposed.
- FIG. 33 is an explanatory drawing for explaining a second modification of the ninth embodiment.
- FIG. 1 is an explanatory view for explaining the outline of the information processing system 10.
- the information processing system 10 of the present embodiment is used, for example, for evaluating a test target camera 15 such as an on-vehicle camera.
- the information processing system 10 includes an information processing device 20 (see FIG. 3) and a display device 30 connected to the information processing device 20.
- the display device 30 includes a projector 31 and a screen 33 for rear projection.
- the test target camera 15 is disposed on the opposite side of the projector 31 via the screen 33.
- actual luminance image data including luminance information corresponding to actual luminance is input to the information processing device 20.
- Actual luminance means an absolute physical quantity, such as luminance or tristimulus values, that has a specific spectral sensitivity curve and whose value is uniquely determined from the spectral radiance, or the spectral radiance itself.
- Displaying with actual brightness means displaying the image by reproducing the above-mentioned absolute physical quantity.
- the actual luminance image data is, for example, a photographed image photographed by a high resolution two-dimensional color luminance meter 36.
- the actual luminance image data may be an actual photographed image photographed by a digital camera that is capable of performing luminance calibration and photographing of actual luminance.
- the actual luminance image data may be a simulation image created by simulation based on physical theory.
- the actual luminance image data may be a spectral image captured by a hyperspectral camera or the like.
- For example, tristimulus value data X, Y, Z of the CIE color system defined by the CIE (Commission Internationale de l'Éclairage: International Commission on Illumination) is used for each pixel.
- Each pixel may instead be expressed by a physical quantity, such as a CIELAB (CIE L*a*b*) value, a CIERGB (CIE Red Green Blue) value, or a CIELMS (CIE Long Medium Short) value, that has a specific spectral sensitivity curve and whose value is uniquely determined from the spectral radiance. The physical quantity is not limited to three dimensions; it may have one, two, or four or more dimensions.
- the actual luminance image data may be image data in which the spectral radiance of each pixel is recorded.
- The actual luminance image data may be a set consisting of image data or moving image data in a general format such as JPEG (Joint Photographic Experts Group) or PNG (Portable Network Graphics), together with reference information that associates the RGB gradation values recorded in the data with luminance.
- The actual luminance image data may also be a set consisting of image data or moving image data in a general format such as JPEG or PNG, together with the RGB gradation values recorded in the data, luminance, the gamma value of the photographing equipment, and color gamut information.
- the actual luminance image data is luminance-corrected based on luminance correction information described later.
- the image data after the luminance correction is input to the projector 31.
- the projector 31 projects an image on the screen 33 based on the input image data.
- the projector 31 performs so-called rear projection in which the left and right of the input image are inverted and projected.
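The left-right inversion used for rear projection can be sketched as follows; this NumPy snippet is an illustration of the mirroring, not code from the patent:

```python
import numpy as np

# Mirror a frame horizontally before sending it to the projector, so the
# image reads correctly when observed through the screen from the far side.
frame = np.array([[1, 2, 3],
                  [4, 5, 6]])
mirrored = np.fliplr(frame)  # columns reversed
```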
- The rear projection image projected on the screen 33 is observed from a position substantially facing the projector 31 across the screen 33.
- The rear projection image appears high in luminance in the central portion and low in the peripheral portion. For example, when the observation position moves to the right, the high-luminance part also appears to move to the right.
- the brightness correction information is information for correcting the brightness distribution and the absolute value of the brightness which change depending on the position to be observed as described above.
- the stage up to creating luminance correction information is referred to as the preparation stage.
- the position at which the brightness is measured when creating the brightness correction information is referred to as a measurement position.
- At the use stage, the information processing system 10 of the present embodiment is used as follows.
- the actual luminance image data corrected using the luminance correction information corresponding to the measurement position is input to the projector 31 and projected on the screen 33. From the measurement position, it is possible to view an image of actual luminance that is faithful to the actual luminance image data.
- By arranging the test target camera 15 at the measurement position, an image of actual luminance can be captured by the test target camera 15.
- By testing the test target camera 15 using the system described above, it is possible to evaluate the influence that, for example, lens flare and ghosts caused by the headlights of oncoming vehicles, or changes in brightness before and after a tunnel, have on the captured image. Since it is easy to evaluate a plurality of types of test target cameras 15 under the same conditions, useful information can be obtained, for example, for model selection of an on-vehicle camera.
- FIG. 2 is an explanatory view for explaining an outline of the luminance distribution measurement.
- The projector 31 projects a gray, white, or black image onto the screen 33. In the following description, "gray" includes white and black.
- the luminance at each position of the projected image is measured using a luminance meter 36 located at the measurement position.
- The brightness of the gray projected from the projector 31 onto the screen 33 is then changed, and the luminance is measured again. In this way, the luminance at each position of the image is measured while gray images of various lightness are projected from the projector 31.
- a high resolution two-dimensional color luminance meter 36 is used as the luminance meter 36.
- a two-dimensional luminance meter may be used as the luminance meter 36.
- the screen 33 may be mechanically scanned to measure the luminance at each position.
- FIG. 2B shows an example of the measurement result of luminance.
- The luminance is high in the central part of the projection range and lower at the edges.
- the distribution state of the luminance changes in accordance with the individual difference of the projector 31 and the positional relationship between the projector 31 and the screen 33 in addition to the measurement position.
- the distribution of luminance also changes due to deterioration with time of the lamp that is the light source of the projector 31.
- FIG. 3 is an explanatory view illustrating the configuration of the information processing system 10 in the preparation stage.
- the information processing system 10 in the preparation stage includes an information processing device 20, a display device 30, and a luminance meter 36.
- the information processing device 20 includes a central processing unit (CPU) 21, a main storage 22, an auxiliary storage 23, a communication unit 24, an output I / F (Interface) 25, an input I / F 26, and a bus.
- the information processing apparatus 20 of the present embodiment is an information processing apparatus such as a general-purpose personal computer or a tablet.
- the CPU 21 is an arithmetic and control unit that executes a program according to the present embodiment.
- One or more CPUs or multi-core CPUs are used as the CPU 21.
- Instead of, or together with, one or more CPUs or multi-core CPUs, a field programmable gate array (FPGA), a complex programmable logic device (CPLD), or an application specific integrated circuit (ASIC) may be used as the CPU 21.
- A GPU (Graphics Processing Unit) may also be used.
- the CPU 21 is connected to hardware units constituting the information processing apparatus 20 via a bus.
- the main storage device 22 is a storage device such as a static random access memory (SRAM), a dynamic random access memory (DRAM), or a flash memory.
- The main storage device 22 temporarily stores information needed during processing performed by the information processing device 20 and programs being executed.
- the auxiliary storage device 23 is a storage device such as an SRAM, a flash memory, a hard disk or a magnetic tape.
- the auxiliary storage device 23 stores a program to be executed by the CPU 21, a brightness measurement DB (Database) 51, a brightness correction DB 52, and various information necessary for executing the program.
- the luminance measurement DB 51 and the luminance correction DB 52 may be stored in another storage device connected to the information processing device 20 via a network or the like. Details of each DB will be described later.
- the communication unit 24 is an interface that communicates with the network.
- the output I / F 25 is an interface that outputs image data to be displayed on the display device 30.
- the input I / F 26 is an interface for acquiring the luminance measurement result from the luminance meter 36.
- the input I / F 26 may be an interface that reads data measured in advance using the luminance meter 36 from a portable recording medium such as an SD (Secure Digital) memory card.
- the display device 30 includes a screen 33 and a projector 31.
- the screen 33 is for rear projection.
- the screen 33 is an example of the display unit of the present embodiment.
- the display device 30 may include a projector 31 capable of front projection and a screen 33 for front projection.
- the display device 30 may include any display panel such as a liquid crystal display panel or an organic EL (Electro Luminescence) display panel.
- FIG. 4 is a graph showing the relationship between the input gradation value to the projector 31 and the luminance.
- the horizontal axis in FIG. 4 indicates the gradation value of the image in which the entire surface is gray input to the projector 31 via the output I / F 25.
- The input to the projector 31 is 8 bits, so 256 gradations from 0 to 255 can be input. An input tone value of 0 means black, and an input tone value of 255 means white.
- The input to the projector 31 may have a bit depth larger than 8 bits.
- the vertical axis in FIG. 4 indicates the ratio of the luminance measured by the luminance meter 36 to the maximum luminance, that is, the luminance actual value normalized by the maximum luminance in the display area.
- the solid line shows the measurement result of the central portion of the screen 33, and the broken line shows an example of the measurement result of the end portion of the screen 33.
- The larger the input tone value, the larger the measured luminance. Even for the same input tone value, the luminance measured at the edge is lower than that at the center.
- FIG. 5 is an explanatory diagram for explaining the record layout of the luminance measurement DB 51.
- the luminance measurement DB 51 is a DB that records the position on the screen 33 and the measured value of the luminance by the luminance meter 36 in association with each other.
- the luminance measurement DB 51 has a position field and a luminance measurement value field.
- The luminance measurement value field has an arbitrary number of subfields, one per input tone value, such as a field for an input tone value of 10, a field for 20, and so on up to a field for 255.
- the position on the screen 33 is recorded by the X coordinate and the Y coordinate.
- the X coordinate and the Y coordinate are expressed by the position of the measurement pixel of the two-dimensional color luminance meter 36.
- FIG. 6 is an explanatory diagram for explaining the record layout of the luminance correction DB 52.
- the luminance correction DB 52 is a DB that records the position on the screen 33 and the input gradation value input to the projector 31 from the output I / F 25 in order to obtain a predetermined display luminance value.
- the information recorded in the luminance correction DB 52 is an example of the luminance correction information according to the present embodiment.
- the luminance correction DB 52 has a position field and an input tone value field.
- The input gradation value field has an arbitrary number of subfields, one per display luminance value, such as fields for display luminance values of 100, 200, 5000, and 10000.
- the position on the screen 33 is recorded by the X coordinate and the Y coordinate.
- In each display luminance value field (100, 200, 5000 candelas per square meter, and so on), the input gradation value that must be input to the projector 31 from the output I/F 25 in order for the display luminance value measured by the luminance meter 36 arranged at the measurement position to reach that value is recorded.
- In FIG. 5, for an input tone value of 10 at position (1, 1), the measured luminance is 100 candelas per square meter. Therefore, as shown in FIG. 6, the input gradation value required to obtain a display luminance of 100 candelas per square meter at position (1, 1) is 10.
- "-" indicates that the luminance cannot be obtained. For example, at position (1, 1), a display luminance of 10000 candelas per square meter cannot be obtained even at the maximum input tone value.
- For display luminance values that were not directly measured, the input gradation value obtained by an arbitrary interpolation method, such as linear interpolation, is recorded in the display luminance value field.
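The linear-interpolation step described above can be sketched as follows. This is an illustrative NumPy implementation under our own naming; it inverts one position's measured (input tone value, luminance) pairs into the tone values needed for given display luminance targets, and marks unreachable targets with None, matching the "-" entries:

```python
import numpy as np

def tones_for_targets(tones, luminances, targets):
    """Return the input tone value needed for each target display luminance.

    tones/luminances: measured pairs at one screen position, with luminance
    increasing monotonically with tone value. Targets above the maximum
    measurable luminance yield None (the "-" entries in the DB).
    """
    tones = np.asarray(tones, dtype=float)
    lums = np.asarray(luminances, dtype=float)
    result = []
    for target in targets:
        if target > lums.max():
            result.append(None)  # this luminance cannot be obtained
        else:
            # np.interp interpolates linearly over increasing x values.
            result.append(float(np.interp(target, lums, tones)))
    return result

needed = tones_for_targets([0, 10, 20, 255],
                           [0.0, 100.0, 210.0, 5000.0],
                           [100.0, 200.0, 5000.0, 10000.0])
```

With these sample measurements, 100 candelas maps back to tone 10 (as in the FIG. 5/FIG. 6 example), 200 candelas falls between two measured points, and 10000 candelas is unreachable.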
- FIG. 7 is a flowchart showing the flow of processing of the program in the preparation stage.
- The program shown in FIG. 7 is executed after installation and focusing of the screen 33 and the projector 31 are completed and the luminance meter 36 is placed at the measurement position.
- The CPU 21 determines an input tone value (step S501). The input tone value can be stepped by an arbitrary amount, such as every 10 tones.
- the CPU 21 displays the luminance distribution evaluation image on the display device 30 (step S502). Specifically, the CPU 21 outputs the image data of the luminance distribution evaluation image whose entire surface corresponds to the gradation value determined in step S501 to the projector 31 through the output I / F 25.
- the projector 31 projects an image on the screen 33 based on the input image data. Thereby, the luminance distribution evaluation image is displayed on the screen 33.
- the luminance distribution evaluation image may be, for example, an image in which portions having different gradations are arranged in a checkerboard shape.
- the CPU 21 obtains a measured value of the luminance distribution via the luminance meter 36 and the input I / F 26 (step S503).
- the CPU 21 records the measurement value in the field corresponding to the input tone value determined in step S501 of the record corresponding to each coordinate position of the luminance measurement DB 51 (step S504).
- the CPU 21 determines whether or not measurement of a predetermined input tone value is completed (step S505). If it is determined that the process has not ended (NO in step S505), the CPU 21 returns to step S501. If it is determined that the process has ended (YES in step S505), the CPU 21 starts a subroutine for calculating a correction value (step S506).
- the subroutine for calculating the correction value is a subroutine for creating the brightness correction DB 52 based on the brightness measurement DB 51. The flow of processing of the correction value calculation subroutine will be described later.
- the CPU 21 interpolates the luminance correction DB 52 in accordance with the resolution of the input data input to the projector 31 (step S507). Specifically, the CPU 21 adds a record to the luminance correction DB 52 such that the number of display pixels of the projector 31 matches the number of records of the luminance correction DB 52. The CPU 21 records the input tone value for each field of the added record based on an arbitrary interpolation method. Further, the CPU 21 corrects the data of the position field in accordance with the pixel position of the projector 31. Thereafter, the CPU 21 ends the process.
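The resampling of step S507 to the projector's resolution can be sketched in one dimension as follows (illustrative names and a 1-D simplification; a real table would be interpolated in both X and Y):

```python
import numpy as np

# Coarse measurement grid: the tone value needed for one target display
# luminance was determined at three screen columns only.
measured_x = np.array([0.0, 100.0, 200.0])
measured_tone = np.array([40.0, 50.0, 44.0])

# Projector pixel columns at which records must exist after step S507.
pixel_x = np.arange(0.0, 201.0, 50.0)

upsampled = np.interp(pixel_x, measured_x, measured_tone)
```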
- FIG. 8 is a flowchart showing a flow of processing of a subroutine of correction value calculation.
- the CPU 21 initializes the luminance correction DB 52 (step S511). Specifically, the CPU 21 deletes the record of the existing luminance correction DB 52, and creates the same number of records as the luminance measurement DB 51. The CPU 21 records the same data as the position field of the luminance measurement DB 51 in the position field of each record.
- the CPU 21 acquires, from the luminance measurement DB 51, one record, that is, a measurement result indicating the relationship between the input gradation value and the luminance value at one position (step S512).
- the CPU 21 calculates an input tone value corresponding to the luminance value of each display luminance value field of the luminance correction DB 52 (step S513).
- the CPU 21 calculates, for example, an input tone value for a predetermined display luminance value by performing linear interpolation on the data acquired in step S512.
- Alternatively, the CPU 21 may calculate a function indicating the relationship between the input tone value and the display luminance value by, for example, the least-squares method, and calculate the input gradation value for the predetermined display luminance value based on the calculated function.
- the CPU 21 records the input tone value for each display luminance value calculated in step S513 in the record of the luminance correction DB 52 corresponding to the position acquired in step S512 (step S514).
- the CPU 21 determines whether the processing of all the records of the luminance measurement DB 51 has been completed (step S515). If it is determined that the process has not ended (NO in step S515), the CPU 21 returns to step S512. If it is determined that the process has ended (YES in step S515), the CPU 21 ends the process.
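The least-squares alternative mentioned for the subroutine can be sketched as follows. We assume, purely for illustration, a gamma model L = a * t**g, fit it in log-log space, and invert it; the patent does not prescribe this particular function:

```python
import numpy as np

# Synthetic measurements following L = a * t**g exactly (a = 0.05, g = 2.2).
tones = np.array([10.0, 50.0, 100.0, 200.0, 255.0])
lums = 0.05 * tones ** 2.2

# Least-squares fit of the gamma model in log-log space.
g, log_a = np.polyfit(np.log(tones), np.log(lums), 1)
a = np.exp(log_a)

def tone_for(target_luminance):
    """Invert the fitted model: tone value that yields the target luminance."""
    return (target_luminance / a) ** (1.0 / g)
```

Fitting a smooth model instead of piecewise interpolation can suppress measurement noise, at the cost of assuming the model's shape.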
- FIG. 9 is a flowchart showing the flow of processing of the program at the use stage.
- The CPU 21 acquires original image data from the auxiliary storage device 23 or another server connected via a network (step S521).
- the CPU 21 may acquire the original image data via an interface such as HDMI (High Definition Multimedia Interface).
- Original image data may be generated by simulation software.
- the CPU 21 may store original image data acquired from the outside in the auxiliary storage device 23 and acquire it again.
- the original image data is actual luminance image data including information of actual luminance.
- the CPU 21 implements the function of the second acquisition unit of the present embodiment.
- the CPU 21 acquires the luminance value of one pixel in the image acquired in step S521 (step S522).
- the CPU 21 extracts a record corresponding to the position of the pixel acquired in step S522 from the luminance correction DB 52.
- the CPU 21 acquires the input gradation value of the field corresponding to the luminance value acquired in step S522 (step S523).
- the CPU 21 realizes the function of the first acquisition unit of the present embodiment.
- the CPU 21 calculates an input gradation value by interpolation.
- the CPU 21 records the input tone value acquired in step S523 in association with the position of the pixel acquired in step S522 (step S524).
- The CPU 21 implements the function of the luminance correction unit of the present embodiment.
- the CPU 21 determines whether the processing of all the pixels of the original image data has been completed (step S525). If it is determined that the process has not ended (NO in step S525), the CPU 21 returns to step S522.
- If it is determined that the process has ended (YES in step S525), the CPU 21 outputs image data to the projector 31 through the output I/F 25 based on the input tone value of each pixel recorded in step S524 (step S526).
- In step S526, the CPU 21 realizes the function of the output unit of the present embodiment.
- the projector 31 projects an image on the screen 33 based on the input image data. Thereafter, the CPU 21 ends the process.
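Steps S522 to S524 can be sketched in vectorized form as follows (illustrative shapes and names; here every pixel shares one measured curve, whereas the patent records one curve per position):

```python
import numpy as np

# Actual-luminance image: target luminance (candelas per square meter)
# for each of 2x2 pixels.
target_lums = np.array([[100.0, 250.0],
                        [500.0, 900.0]])

# One measured curve: luminance produced at each input tone value.
tone_axis = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
lum_axis = np.array([0.0, 120.0, 400.0, 800.0, 1000.0])

# Per-pixel inverse lookup: tone values to send to the projector.
corrected = np.interp(target_lums, lum_axis, tone_axis)
```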
- an image of actual luminance is displayed on the screen 33 when viewed from the measurement position.
- According to the present embodiment, it is possible to provide an information processing apparatus 20 or the like capable of realizing display according to the absolute value of luminance.
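The use-stage loop above (steps S521 to S526) effectively inverts a measured tone-to-luminance relationship for each pixel. The following sketch illustrates that inversion; the table values are hypothetical stand-ins for one record of the luminance correction DB 52, and the linear interpolation stands in for the interpolation mentioned for step S523.

```python
import bisect

# Hypothetical tone -> measured luminance (cd/m^2) table for one pixel
# position; a real table would come from the luminance correction DB 52.
tone_steps = [0, 64, 128, 192, 255]
measured_luminance = [0.5, 12.0, 55.0, 140.0, 250.0]

def tone_for_luminance(target):
    """Invert the tone->luminance table by linear interpolation (step S523)."""
    if target <= measured_luminance[0]:
        return tone_steps[0]
    if target >= measured_luminance[-1]:
        return tone_steps[-1]
    i = bisect.bisect_left(measured_luminance, target)
    lo_l, hi_l = measured_luminance[i - 1], measured_luminance[i]
    lo_t, hi_t = tone_steps[i - 1], tone_steps[i]
    frac = (target - lo_l) / (hi_l - lo_l)
    return lo_t + frac * (hi_t - lo_t)
```

A real implementation would hold one such table per pixel position, matching the per-position records of the DB.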
- For example, by arranging the test target camera 15 at the measurement position and photographing the screen 33, it is possible to evaluate the test target camera 15 using the actual luminance image.
- For example, the impact of lens flare and ghosting caused by headlights of oncoming vehicles, or of changes in brightness before and after a tunnel, on images captured by the test target camera 15 can be evaluated.
- the actual luminance image may be a moving image.
- verification of an automatic driving operation based on an image captured by the on-vehicle camera can be performed.
- driving simulation and the like using an actual luminance image can also be provided.
- the present embodiment relates to an information processing apparatus 20 that creates luminance correction information for a plurality of measurement positions and displays an image corrected based on the measurement position closest to the position where the test target camera 15 and the like are installed.
- the description of the parts common to the first embodiment will be omitted.
- the process of the preparation stage described using FIG. 7 is performed for a plurality of measurement positions.
- the luminance correction DB 52 corresponding to each measurement position is stored in the auxiliary storage device 23.
- FIG. 10 is a flowchart showing the process flow of the program at the use stage of the second embodiment.
- the CPU 21 acquires the position of the test target camera 15 or the like from, for example, a position acquisition unit such as a position sensor (step S531).
- the CPU 21 calculates the distance between the position acquired in step S531 and each of the plurality of measurement positions for which the luminance correction information has been created in advance (step S532).
- the CPU 21 selects a measurement position to be used for luminance correction (step S533).
- the subsequent processing is performed using the luminance correction DB 52 corresponding to the selected measurement position.
- In step S533, the measurement position closest to the position acquired in step S531 can be selected.
- Alternatively, in step S533, a plurality of measurement positions close to the position acquired in step S531 may be selected, and the values at the position acquired in step S531 may be estimated by data interpolation.
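Steps S531 to S533 select a measurement position by distance, optionally interpolating between nearby positions. A minimal sketch under assumed 3-D coordinate tuples; the helper names are illustrative, not from the source, and inverse-distance weighting is just one possible realization of the interpolation:

```python
import math

def nearest_measurement_position(camera_pos, measurement_positions):
    """Step S533: pick the measurement position closest to the position
    acquired in step S531 (Euclidean distance)."""
    return min(measurement_positions, key=lambda p: math.dist(camera_pos, p))

def interpolate_value(camera_pos, samples):
    """Inverse-distance weighting over (position, value) pairs, one possible
    realization of the data interpolation mentioned above."""
    weights = []
    for pos, value in samples:
        d = math.dist(camera_pos, pos)
        if d == 0.0:
            return value  # exactly at a measured position
        weights.append((1.0 / d, value))
    total = sum(w for w, _ in weights)
    return sum(w * v for w, v in weights) / total
```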
- the CPU 21 acquires original image data from the auxiliary storage device 23 or another server connected via a network (step S521).
- the subsequent processing is the same as the processing of the program according to the first embodiment described with reference to FIG.
- According to the present embodiment, it is possible to provide an information processing system 10 that performs luminance correction by selecting the closest measurement position from among a plurality of measurement positions. For example, even when the position of the test target camera 15 is changed, the information processing system 10 can still display an actual luminance image.
- In step S502 of the program described with reference to FIG. 7, the luminance display image may be displayed individually for each of the three primary colors R (Red), G (Green), and B (Blue), and the luminance measurement DB 51 and the luminance correction DB 52 may be created for each of the three primary colors. This makes it possible to realize an information processing system 10 that prevents color unevenness caused by chromatic aberration and the like.
- the present embodiment relates to an information processing system 10 that superimposes images to be projected on the screen 33 from a plurality of projectors 31.
- the description of the parts common to the first embodiment will be omitted.
- In the present embodiment, the preparation stage has two stages: a deformation acquisition stage and a luminance distribution acquisition stage.
- FIG. 11 is an explanatory view showing the configuration of the information processing system 10 at the deformation acquisition stage of the third embodiment.
- the information processing system 10 at the deformation acquisition stage includes an information processing device 20, a display device 30, and a luminance meter 36.
- the information processing apparatus 20 includes a CPU 21, a main storage 22, an auxiliary storage 23, a communication unit 24, an output I / F 25, an input I / F 26, a monitor 27, and a bus.
- the monitor 27 is, for example, a liquid crystal display device or the like provided in the information processing device 20.
- The information processing apparatus 20 according to the present embodiment is an information device such as a general-purpose personal computer or a tablet.
- the display device 30 includes a screen 33 and a plurality of projectors 31 such as a first projector 311 and a second projector 312.
- the camera 37 is connected to the input I / F 26.
- the camera 37 is disposed at a position facing the projector 31 across the screen 33.
- the camera 37 may be disposed on the same side as the first projector 311 or the like at a position not blocking the light projection path of the projector 31.
- the camera 37 is a high resolution digital camera.
- FIG. 12 is an explanatory view showing the configuration of the information processing system 10 in the luminance distribution acquisition stage of the third embodiment.
- the camera 37 is changed to the luminance meter 36.
- FIGS. 13 and 14 are explanatory diagrams for explaining the arrangement of the projector 31 and the screen 33.
- FIG. 13 is a view of the projector 31 and the screen 33 viewed from the rear side of the projector 31.
- FIG. 14A is a view of the projector 31 and the screen 33 viewed from the upper side.
- FIG. 14B is a view of the projector 31 and the screen 33 viewed from the right side.
- the projection state from each projector 31 to the screen 33 is schematically shown.
- a total of six projectors 31 are used, three in the lateral direction and two in the vertical direction.
- the projectors 31 at both ends in the left-right direction are arranged substantially in a fan shape so as to face the optical axis of the projector 31 in the middle.
- a plurality of projectors 31 may be accommodated in one case and provided in a form that looks like one integrated projector in appearance.
- all or part of the plurality of projectors 31 may share optical components, such as, for example, projection lenses, relay optics or spatial light modulators. All or some of the plurality of projectors 31 may share an optical path. All or part of the plurality of projectors 31 may share a power supply circuit, a control circuit, and the like.
- the projector 31 is adjusted to project an image on substantially the same area on the screen 33 using the lens shift function, and focusing is performed.
- the arrangement of the projectors 31 shown in FIG. 13 and FIG. 14 is an example, and any number of projectors 31 can be arranged and used at arbitrary positions.
- FIG. 15 is an explanatory view for explaining a projection state of the projector 31. In FIG. 15, the description uses two projectors 31: the first projector 311 and the second projector 312.
- the CPU 21 operates the projectors 31 one by one, and acquires the projection range of each projector 31 via the camera 37.
- the CPU 21 superimposes the projection range of each projector 31 and displays it on the monitor 27 as shown in FIG. 15B.
- the user can input the use range by an operation such as dragging with a mouse, for example.
- the CPU 21 may automatically determine the use range by calculating a rectangle having a predetermined aspect ratio included in the projection range of each projector 31.
- the coordinates on the use range are used to indicate the position on the screen 33.
- the use range may be defined as a projection range by an arbitrary number of projectors 31 such as three or more.
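The automatic determination mentioned above (a rectangle of predetermined aspect ratio inside the common projection range) can be sketched for the simplified case where the overlap of the projection ranges is itself an axis-aligned rectangle; the 16:9 default is an assumption, not taken from the source:

```python
def use_range_rect(overlap_width, overlap_height, aspect=16 / 9):
    """Largest rectangle of the given aspect ratio that fits inside an
    axis-aligned overlap area of the projection ranges (simplified sketch
    of the automatic use-range determination)."""
    if overlap_width / overlap_height > aspect:
        height = overlap_height          # height-limited: use full height
        width = height * aspect
    else:
        width = overlap_width            # width-limited: use full width
        height = width / aspect
    return width, height
```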
- FIG. 16 is an explanatory view for explaining a projection state of the projector 31. Also in FIG. 16, the description uses the two projectors 31: the first projector 311 and the second projector 312.
- the CPU 21 outputs, to each of the projectors 31, image data obtained by deforming the original image so as to project a predetermined image in the use range.
- the projector 31 projects the input image on the screen 33 as shown in FIG. 16A.
- As shown in FIG. 16B, by superimposing the respective images on the screen 33, a high-brightness image is displayed in the use range of the screen 33.
- FIG. 17 is an explanatory diagram for explaining an example of the measurement result of luminance in the third embodiment.
- FIG. 17 shows measurement results measured using the luminance meter 36 when images of uniform gray are simultaneously projected from all the projectors 31 to the use range.
- a luminance meter 36 is arranged to measure the luminance in the use range. As shown in FIG. 17, high brightness portions corresponding to the number of projectors 31 are formed.
- an image of actual luminance can be displayed on the screen 33 by inputting the image data with the luminance distribution corrected to the respective projectors 31.
- FIG. 18 is a flow chart showing the flow of processing of a program in the preparation stage of the third embodiment.
- the CPU 21 starts a subroutine for deformation acquisition (step S551).
- The deformation acquisition subroutine acquires the use range based on the projection range of each projector 31 as described using FIG. 15, and records shape correction information for deforming the image input to each projector 31 as described using FIG. 16A.
- the flow of processing of the deformation acquisition subroutine will be described later.
- the CPU 21 starts a subroutine of luminance distribution acquisition (step S552).
- the subroutine for acquiring the luminance distribution is a subroutine for measuring the luminance distribution described using FIG. 17 and creating the luminance correction DB 52. A flow of processing of a subroutine of luminance distribution acquisition will be described later.
- the CPU 21 then ends the process.
- FIG. 19 is a flowchart showing a flow of processing of a deformation acquisition subroutine.
- the CPU 21 selects one projector 31 (step S561).
- the CPU 21 displays an image for acquiring deformation on the display device 30 (step S562).
- the CPU 21 projects, from the projector 31 via the output I / F 25, an image for deformation acquisition whose luminance value on the entire surface is the maximum value. Thereby, a white image is displayed on the screen 33.
- the image for acquiring deformation may be any image such as a so-called checkerboard-like image in which white squares and black squares are alternately arranged.
- In the following, a case where a white image, that is, an image whose entire surface is white, is used as the image for deformation acquisition will be described as an example.
- the CPU 21 acquires the projection range of the white image through the camera 37 and records the projection range in the auxiliary storage device 23 (step S563).
- the CPU 21 determines whether the processing of all the projectors 31 has been completed (step S564). If it is determined that the process has not ended (NO in step S564), the CPU 21 returns to step S561.
- the CPU 21 determines the use range described using FIG. 15B (step S565).
- CPU 21 determines the range of use, for example, by receiving an input from the user.
- the CPU 21 may automatically determine the use range by calculating a rectangle having a predetermined aspect ratio included in the projection range of each projector 31.
- the CPU 21 acquires the projection range recorded in step S563 for one projector 31 (step S566).
- Based on the acquired projection range and the use range determined in step S565, the CPU 21 calculates shape correction information for deforming the original image as described using FIG. 16A, and stores the information in the auxiliary storage device 23 (step S567).
- The shape correction information can be expressed, for example, by a matrix that deforms the image by coordinate conversion.
- Since such image deformation is a conventionally used technique, detailed description thereof is omitted.
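A matrix-based coordinate conversion of the kind mentioned above can be sketched as a 3x3 homogeneous (projective) transform applied per pixel position. The matrix values below are illustrative; a real matrix would be derived from the projection range measured via the camera 37:

```python
def apply_shape_correction(matrix, x, y):
    """Apply a 3x3 homogeneous coordinate-conversion matrix to a pixel
    position and return the transformed (x, y)."""
    xh = matrix[0][0] * x + matrix[0][1] * y + matrix[0][2]
    yh = matrix[1][0] * x + matrix[1][1] * y + matrix[1][2]
    w = matrix[2][0] * x + matrix[2][1] * y + matrix[2][2]
    return xh / w, yh / w

# The identity matrix leaves coordinates unchanged.
IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```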
- the CPU 21 determines whether the processing of all the projectors 31 has been completed (step S568). If it is determined that the process has not ended (NO in step S568), the CPU 21 returns to step S566. If it is determined that the process has ended (YES in step S568), the CPU 21 ends the process.
- FIG. 20 is a flowchart showing a flow of processing of a subroutine of luminance distribution acquisition.
- the subroutine for acquiring the luminance distribution is a subroutine for measuring the luminance distribution described using FIG. 17 and creating the luminance correction DB 52.
- the CPU 21 determines an input tone value (step S571). For the input tone value, an arbitrary value such as every 10 tones can be set.
- the CPU 21 creates a luminance distribution evaluation image based on the shape correction information stored in the auxiliary storage device 23 (step S572). Specifically, the CPU 21 creates image data for projecting an image of the input tone value determined in step S571 in the use range described using FIG. 15B, and stores the image data in the auxiliary storage device 23.
- the CPU 21 determines whether the processing of all the projectors 31 has been completed (step S573). If it is determined that the process has not ended (NO in step S573), the CPU 21 returns to step S572.
- the CPU 21 displays a luminance distribution evaluation image (step S574). Specifically, the CPU 21 outputs the image data of the luminance distribution evaluation image created in step S572 to the respective projectors 31 via the output I / F 25.
- the projector 31 projects an image on the screen 33 based on the input image data. The images projected from the respective projectors 31 are projected so as to be superimposed on the use range described using FIG. Thus, the luminance distribution evaluation image is displayed on the screen 33.
- the CPU 21 obtains a measured value of the luminance distribution via the luminance meter 36 and the input I / F 26 (step S575).
- the CPU 21 records the measurement value in the field corresponding to the input gradation value determined in step S571 of the record corresponding to each coordinate position of the luminance measurement DB 51 (step S576).
- When the relationship between the input tone value and the luminance is the same at any position on the screen 33, the relationship between the input gradation value of each projector 31 and the luminance on the screen 33 may be acquired, and the luminance measurement DB 51 created, by displaying a single luminance distribution evaluation image on the screen 33 and measuring the luminance. By using the relationship data between the input gradation value of each projector 31 and the luminance on the screen 33, actual luminance display can be performed with high accuracy.
- the CPU 21 determines whether or not measurement of a predetermined input tone value is completed (step S577). If it is determined that the process has not ended (NO in step S577), the CPU 21 returns to step S571. If it is determined that the process has ended (YES in step S577), the CPU 21 starts a subroutine for calculating a correction value (step S578).
- the subroutine for calculating the correction value is the same subroutine as the subroutine described with reference to FIG. Thereafter, the CPU 21 ends the process.
- FIG. 21 is a flow chart showing the flow of processing of a program at the use stage according to the third embodiment.
- the CPU 21 obtains original image data from the auxiliary storage device 23 or another server connected via a network (step S581).
- the original image data is actual luminance image data including information of actual luminance.
- the CPU 21 acquires the luminance value of one pixel in the image acquired in step S581 (step S582).
- the CPU 21 calculates the position in the use range described using FIG. 15B for the pixel for which the luminance has been acquired (step S583).
- the CPU 21 refers to the luminance correction DB 52 to acquire an input tone value corresponding to the luminance calculated in step S583 (step S584).
- At this time, the CPU 21 performs interpolation based on the luminance correction DB 52, and calculates the input tone value corresponding to the position calculated in step S583 and the display luminance value.
- the CPU 21 records the input tone value acquired in step S584 in association with the position calculated in step S583 (step S585). The CPU 21 determines whether the processing of all the pixels of the original image data has been completed (step S586). If it is determined that the process has not ended (NO in step S586), the CPU 21 returns to step S582.
- If it is determined that the process has ended (YES in step S586), the CPU 21 acquires, from the auxiliary storage device 23, shape correction information for deforming the image input to one projector 31 (step S591). By step S591, the CPU 21 realizes the function of the third acquisition unit of the present embodiment.
- the CPU 21 deforms the image data formed by the input tone value of each pixel recorded in step S585 based on the shape correction information (step S592). By step S592, the CPU 21 realizes the function of the shape correction unit of the present embodiment.
- The CPU 21 outputs the image data deformed in step S592 to the projector 31 via the output I/F 25 (step S593).
- the projector 31 projects an image on the screen 33 based on the input image data.
- the CPU 21 determines whether the processing of all the projectors 31 has been completed (step S594). If it is determined that the process has not ended (NO in step S594), the CPU 21 returns to step S591. If it is determined that the process has ended (YES in step S594), the CPU 21 ends the process.
- According to the present embodiment, it is possible to provide an information processing apparatus 20 that projects, with actual luminance, high-luminance portions that cannot be projected by a single projector 31.
- A luminance correction DB 52 for the case of using only one or several projectors 31 may also be created.
- a relatively small number of projectors 31 can be used for relatively dark images, and all projectors 31 can be used for images including high-brightness portions.
- all the projectors 31 may be used for an area including a high brightness part, and one or more projectors 31 may be used for the other parts. Since the superimposed projection is not performed for the low luminance portion, it is possible to provide the information processing system 10 that displays with high resolution.
- the present embodiment relates to an information processing system 10 that uses an auxiliary projector 32 that projects an image to a part of a use range.
- the description of the parts common to the third embodiment will be omitted.
- FIG. 22 is an explanatory diagram for explaining the arrangement of the projector 31 and the screen 33 according to the fourth embodiment.
- FIG. 22A is a view of the projector 31, the auxiliary projector 32, and the screen 33 viewed from the upper side.
- FIG. 22B is a view of the projector 31, the auxiliary projector 32, and the screen 33 as viewed from the rear side of the projector 31.
- two auxiliary projectors 32 are arranged in a substantially fan shape on the left and right of the six projectors 31 arranged in the same manner as in the third embodiment.
- FIG. 23 is an explanatory diagram for explaining a projection state of the projector 31 of the fourth embodiment.
- the projection range of the projector 31 of the present embodiment will be described using FIGS. 22 and 23.
- the six projectors 31 from the first projector 311 to the sixth projector 316 can project an image to an area including a use range.
- the first auxiliary projector 321 and the second auxiliary projector 322 disposed on the right project an image on the substantially right half of the use range.
- the right half of the projectable area of the first auxiliary projector 321 and the second auxiliary projector 322 is not used, as indicated by the broken lines in FIGS. 22A and 23.
- the third auxiliary projector 323 and the fourth auxiliary projector 324 disposed on the left project an image on the substantially left half of the use range.
- the left half of the projectable area of the third auxiliary projector 323 and the fourth auxiliary projector 324 is not used, as indicated by broken lines in FIGS. 22A and 23.
- the information processing system 10 capable of displaying a portion of high luminance near the edge of the use range with actual luminance.
- the information processing system 10 having a wide area in which a high brightness image can be displayed with real brightness.
- the number of auxiliary projectors 32 may be three or less or five or more.
- the auxiliary projector 32 can be arranged at any place.
- the size of the projectable area of the auxiliary projector 32 may be different from the size of the projectable area of the projector 31.
- the present embodiment relates to an information processing system 10 having a plurality of screens 33.
- the description of the parts common to the third embodiment will be omitted.
- FIG. 24 is an explanatory diagram for explaining the arrangement of the projector 31 and the screen 33 according to the fifth embodiment.
- The display device 30 according to the present embodiment includes a first screen 331, a second screen 332 arranged continuously on one side of the first screen 331, and a third screen 333 arranged continuously on the opposite side of the first screen 331.
- the first screen 331 to the third screen 333 will be referred to as the screen 33 unless it is necessary to distinguish them.
- each projector 31 is arranged on the back of each screen 33.
- the optical axis of each projector 31 is arranged to be directed to the measurement position.
- a so-called panoramic real brightness image is projected from a total of 18 projectors 31 so as to be continuous with the three screens 33.
- The information processing system 10 capable of evaluating the wide-angle test target camera 15 can be provided. Since the optical axes of the respective projectors 31 are directed to the measurement position, it is possible to provide the information processing system 10 capable of displaying a high-luminance image with real luminance.
- The number of screens 33 may be four or more.
- the screen 33 may be connected vertically.
- The screen 33 may be a curved surface. It is possible to provide the information processing system 10 in which the influence of seams between the screens 33 is small.
- the present embodiment relates to an information processing system 10 in which a user visually observes an actual luminance image.
- the description of the parts common to the third embodiment will be omitted.
- FIG. 25 is an explanatory view showing the configuration of the information processing system 10 at the use stage of the sixth embodiment.
- the seat 18 of the car is disposed such that the user's eyes are located near the measurement position.
- the windshield 17, the handle 19, the instrument panel and the like are arranged in accordance with the position of the seat 18.
- An actual luminance image is displayed on the screen 33.
- The user can evaluate the visibility of the instrument panel when it is illuminated by, for example, the headlights of an oncoming vehicle, the morning sun, or the setting sun.
- the user can also evaluate the visibility of a so-called HUD (Head-Up Display) that projects various types of information on the windshield 17.
- According to the present embodiment, it is possible to provide an information processing system 10 that performs actual luminance display for the drive simulator 16, in which phenomena such as glare from headlights of oncoming vehicles can be experienced.
- FIG. 26 is an explanatory drawing showing the operation of the information processing device 20 of the seventh embodiment.
- the information processing apparatus 20 operates as follows based on control by the CPU 21.
- the information processing system 10 includes a display device 30 and an information processing device 20.
- the display device 30 has a display unit 33 that displays an image.
- the information processing apparatus 20 includes a first acquisition unit 61, a second acquisition unit 62, a luminance correction unit 63, and an output unit 64.
- The first acquisition unit 61 acquires luminance correction information for correcting, in accordance with the luminance information included in an input signal, the luminance obtained by measuring, from a predetermined measurement position, the display unit displaying an image based on that input signal.
- the second acquisition unit 62 acquires an image to be displayed on the display unit 33.
- the luminance correction unit 63 corrects the image acquired by the second acquisition unit 62 based on the correction information acquired by the first acquisition unit 61.
- The output unit 64 outputs the image corrected by the luminance correction unit 63 to the display unit 33.
- FIG. 27 is an explanatory view of the configuration of the information processing system 10 according to the eighth embodiment. The description of the parts common to the first embodiment will be omitted.
- An information processing system 10 includes a computer 90, a display device 30, and a luminance meter 36.
- the computer 90 includes a CPU 21, a main storage 22, an auxiliary storage 23, a communication unit 24, an output I / F 25, an input I / F 26, a reading unit 28, and a bus.
- the computer 90 is an information device such as a general-purpose personal computer or a tablet.
- the program 97 is recorded on a portable recording medium 96.
- the CPU 21 reads the program 97 via the reading unit 28 and stores the program 97 in the auxiliary storage device 23.
- The CPU 21 may also read the program 97 stored in a semiconductor memory 98, such as a flash memory, mounted in the computer 90.
- the CPU 21 may download the program 97 from another server (not shown) or the like connected via the communication unit 24 and a network (not shown) and store the program 97 in the auxiliary storage device 23.
- the program 97 is installed as a control program of the computer 90, loaded to the main storage 22, and executed.
- the computer 90 functions as the information processing apparatus 20 described above.
- the present embodiment relates to a form in which coordinates of an image projected from the projector 31, coordinates in a use range described using FIG. 15, and original image data are sequentially converted using a conversion DB.
- the description of the parts common to the third embodiment will be omitted.
- FIG. 28 is an explanatory view for explaining conversion between coordinates of an image projected from the projector 31 and coordinates of a range of use.
- FIG. 28A shows coordinates of an image input to the first projector 311, that is, projector coordinates. With the top left corner of the image as the origin (0, 0), the x-axis is defined rightward and the y-axis downward. For example, when using the first projector 311 with square pixels at 1080p resolution, x is an integer from 0 to 1919 and y is an integer from 0 to 1079.
- FIG. 28B shows coordinates of the use range, that is, use range coordinates.
- With the upper left corner of the use range as the origin (0, 0), the x-axis is defined rightward and the y-axis downward; x is an integer from 0 to 2047 and y is an integer from 0 to 1079.
- FIG. 29 is an explanatory view for explaining conversion between use range coordinates and coordinates of original image data.
- FIG. 29A shows use range coordinates. Similar to FIG. 28B, with the upper left corner of the use range as the origin (0, 0), the x-axis is defined rightward and the y-axis is defined downward.
- FIG. 29B shows coordinates of original image data, that is, original image coordinates.
- With the upper left corner of the original image as the origin (0, 0), the x-axis is defined rightward and the y-axis downward.
- Since the original image data has square pixels at 1080p resolution, x is an integer from 0 to 1919 and y is an integer from 0 to 1079.
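As a quick illustration of these coordinate conventions, a bounds check with the origin at the top-left corner; the defaults match the 1080p projector coordinates given above, and the use range coordinates can be checked by passing its width:

```python
def in_coords(x, y, width=1920, height=1080):
    """True when integer pixel coordinates fall inside a top-left-origin
    coordinate system of the given size. Defaults match 1080p projector
    coordinates; pass width=2048 for the use range coordinates."""
    return 0 <= x < width and 0 <= y < height
```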
- FIG. 30 is an explanatory diagram for explaining the record layout of the first conversion DB.
- the first conversion DB is a DB that records the projector coordinates of the image projected from the projector 31, the use range coordinates, and the distribution of the luminance to each projector 31 in association with each other.
- The first conversion DB has a projector number field, a projector coordinate field, a use range coordinate field, and a distribution field.
- In the projector number field, serial numbers assigned to the projectors 31 are recorded.
- the coordinates of the image projected from the projector 31 described using FIG. 28A are recorded in the projector coordinate field.
- the range-of-use coordinates described using FIG. 28B are recorded in the range-of-use coordinates field.
- the vicinity of the origin of the projector coordinates is not included in the use range. For such coordinates, "-" is recorded in the use range coordinate field.
- For example, the record indicates that the point where the projector coordinates are "100, 100" in the first projector 311 is projected to the point where the use range coordinates are "200.45, 300.32".
- In the distribution field, the distribution of luminance to the projector 31 is recorded.
- For example, a value of "0.25" recorded in the distribution field for the first projector 311 means that 25% of the total luminance is allocated to that projector.
- "-" is recorded in the distribution field.
- the values of the distribution fields are defined such that the sum is 1 for each position within the use range.
- The characteristics of each projector 31 can be used effectively by increasing the value of the distribution field for a projector 31 with high luminance.
- the value of the distribution field may be determined to be proportional to the maximum luminance to which each projector 31 can contribute for each position within the use range. By determining in this manner, it is possible to realize the information processing system 10 that can reduce the number of times of measurement of the luminance distribution and perform actual luminance display with a small amount of calculation.
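The rule just described (distribution values proportional to each projector's maximum contributable luminance, summing to 1 per position) can be sketched directly; the input list is a hypothetical per-position set of maxima, not data from the source:

```python
def distribution_for_position(max_luminances):
    """Distribution values proportional to each projector's maximum
    contributable luminance at one use-range position, normalized so
    that they sum to 1 as the distribution field requires."""
    total = sum(max_luminances)
    if total == 0:
        return [0.0] * len(max_luminances)  # position outside every range
    return [m / total for m in max_luminances]
```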
- In the following, a case where the distribution of luminance is recorded in the distribution field will be described as an example.
- FIG. 31 is an explanatory diagram for explaining the record layout of the second conversion DB.
- the second conversion DB is a DB that records usage range coordinates and original image coordinates in association with each other.
- The second conversion DB has a use range coordinate field and an original image coordinate field.
- the use range coordinates described in FIG. 29A are recorded in the use range coordinate field.
- The original image coordinates described using FIG. 29B are recorded in the original image coordinate field.
- In FIG. 31, it is shown that the point whose original image coordinates are "340.24, 234.58" is projected to the point whose use range coordinates are "100, 100".
- When the aspect ratio of the use range differs from that of the original image, the original image is not projected at the edges of the use range.
- "-" is recorded in the original image coordinate field corresponding to the non-projected use range coordinate.
- FIG. 32 is a flowchart showing a flow of processing of a program according to the ninth embodiment.
- the CPU 21 acquires original image data from the auxiliary storage device 23 or another server connected via a network (step S601).
- the CPU 21 sets the projector coordinates to the initial value "0, 0" (step S602).
- The CPU 21 searches the first conversion DB using the projector coordinates as a key, extracts a record, and acquires the use-range coordinates from the use-range coordinate field of the extracted record (step S603).
- The CPU 21 determines whether the projector coordinates are within the use range (step S604). When the projector coordinates are outside the use range, "-" has been recorded in the use-range coordinates acquired in step S603.
- The CPU 21 calculates the original-image coordinates corresponding to the use-range coordinates (step S605). Specifically, the CPU 21 searches the second conversion DB using a plurality of coordinates near the use-range coordinates acquired in step S603 as keys, extracts the records, and calculates the original-image coordinates by interpolating the original-image coordinates of the extracted records. Interpolation can be performed by any method, such as the nearest-neighbor, bilinear, or bicubic method.
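As an illustration of the bilinear case, assume the second conversion DB for one projector is held as a dense grid of original-image coordinate pairs indexed by integer use-range coordinates, with `None` standing in for the "-" records. This representation and all names are assumptions for the sketch, not the patented data layout:

```python
def bilinear_lookup(grid, x, y):
    """Interpolate original-image coordinates at fractional use-range
    coordinates (x, y).

    grid[j][i] holds the (ox, oy) original-image coordinates recorded for
    the use-range grid point (i, j); None marks a "-" record, i.e. a point
    onto which no original-image pixel is projected.
    """
    i0, j0 = int(x), int(y)
    fx, fy = x - i0, y - j0
    corners = [grid[j0][i0], grid[j0][i0 + 1],
               grid[j0 + 1][i0], grid[j0 + 1][i0 + 1]]
    if any(c is None for c in corners):
        return None  # interpolation cannot be performed normally (step S606)
    (a, b), (c, d), (e, f), (g, h) = corners
    # Blend along x first, then along y, separately for each coordinate.
    ox = (a * (1 - fx) + c * fx) * (1 - fy) + (e * (1 - fx) + g * fx) * fy
    oy = (b * (1 - fx) + d * fx) * (1 - fy) + (f * (1 - fx) + h * fx) * fy
    return ox, oy
```

Returning `None` when any neighboring record is "-" mirrors the out-of-range determination described for step S606.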
- The CPU 21 determines whether the calculated original-image coordinates are within the range of the original image (step S606). For example, when "-" is recorded in the original-image coordinate field of a record extracted from the second conversion DB and the interpolation cannot be performed normally, the CPU 21 determines that the coordinates are outside the range of the original image.
- the CPU 21 acquires the luminance of the pixel based on the original image data acquired in step S601 (step S607).
- For the luminance of the pixel, for example, the luminance of the original image data closest to the coordinates calculated in step S605 is used. Alternatively, pixels near the coordinates calculated in step S605 may be extracted from the original image data and the luminance calculated by an arbitrary interpolation method.
- The CPU 21 multiplies the luminance calculated in step S607 by the distribution recorded in the distribution field of the record extracted from the first conversion DB in step S603 to calculate the luminance allocated to the projector 31 being processed (step S608).
- When the coordinates are outside the range of the original image, the CPU 21 determines that the pixel is black, that is, that the luminance of the pixel is 0 (step S609).
- The CPU 21 acquires the input tone value corresponding to the luminance of the pixel (step S610). In doing so, the CPU 21 performs interpolation based on the luminance correction DB 52 described with reference to FIG. 6 and calculates the input tone value corresponding to the position calculated in step S603 and the luminance value acquired in step S608 or step S609.
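The lookup of step S610 is an inverse one: given a target luminance, find the input tone value that reproduces it. It can be sketched as linear interpolation over one position's tone-to-luminance measurements. The flat `(tone, luminance)` list is a simplification of the luminance correction DB 52, which in the patent also interpolates over position; the names here are assumptions:

```python
import bisect

def tone_for_luminance(curve, target):
    """Invert a measured tone-to-luminance curve by linear interpolation.

    curve: list of (input_tone, measured_luminance) pairs, sorted and
    monotonically increasing in luminance, for a single screen position.
    Targets outside the measured range are clamped to the end tones.
    """
    lums = [l for _, l in curve]
    if target <= lums[0]:
        return curve[0][0]
    if target >= lums[-1]:
        return curve[-1][0]
    k = bisect.bisect_left(lums, target)
    (t0, l0), (t1, l1) = curve[k - 1], curve[k]
    # Linear interpolation between the two bracketing measurements.
    return t0 + (t1 - t0) * (target - l0) / (l1 - l0)
```

Clamping at the ends reflects that a projector cannot emit more than its maximum (or less than its black-level) luminance at a given position.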
- The luminance correction DB 52 is created for each projector 31 based on the projection luminance measured when only that projector 31 is used.
- The CPU 21 records the input tone value acquired in step S610 in association with the projector coordinates (step S611). The CPU 21 then determines whether the processing of all projector coordinates has been completed (step S612). If it determines that the processing has not been completed (NO in step S612), the CPU 21 selects the projector coordinates to be processed next (step S613) and returns to step S603.
- If it determines that the processing of all projector coordinates has been completed (YES in step S612), the CPU 21 determines whether the processing of all the projectors 31 has been completed (step S614). If it determines that the processing of all the projectors 31 has not been completed (NO in step S614), the CPU 21 selects the projector 31 to be processed next (step S615) and returns to step S602.
- If it determines that the processing of all the projectors 31 has been completed (YES in step S614), the CPU 21 outputs the images to all the projectors 31 (step S616). The images are projected from the respective projectors 31 onto the screen 33. As a result, actual-luminance display is realized, in which an image whose luminance is faithful to the original image data is projected on the screen 33. The CPU 21 then ends the process.
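Taken together, steps S602 through S611 amount to a per-pixel loop run once for each projector. The following is a schematic sketch only, not code from the patent: every lookup is abstracted as a callable and all names are hypothetical.

```python
def build_projector_image(width, height, use_coords, distribution,
                          sample_luminance, tone_lookup):
    """Schematic of the per-pixel loop (steps S602-S611) for one projector.

    use_coords(x, y)       -> use-range coordinates, or None outside the
                              use range (first conversion DB, step S603)
    sample_luminance(u)    -> original-image luminance at use-range
                              coordinates u, or None when no original-image
                              coordinates exist there (steps S605-S607)
    distribution(x, y)     -> this projector's share of the luminance at
                              (x, y) (distribution field, step S608)
    tone_lookup(x, y, lm)  -> input tone value reproducing luminance lm at
                              (x, y) (luminance correction DB 52, step S610)
    """
    image = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            u = use_coords(x, y)
            lum = sample_luminance(u) if u is not None else None
            if lum is None:
                lum = 0.0                         # step S609: black pixel
            else:
                lum *= distribution(x, y)         # step S608: allocated share
            image[y][x] = tone_lookup(x, y, lum)  # steps S610-S611
    return image
```

Running this once per projector and sending each result to its projector corresponds to steps S612 through S616.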
- FIG. 33 is an explanatory drawing for explaining a first modification of the ninth embodiment.
- FIG. 33A shows a state in which the first projector 311 to the fourth projector 314 project onto the screen 33.
- The edges of the projection ranges of the four projectors 31 overlap slightly, and at the central portion the projection ranges of all four projectors 31 overlap.
- FIG. 33B shows a state in which the use range is superimposed on FIG. 33A.
- The first conversion DB described with reference to FIG. 30 can be created for each projector 31.
- By providing the distribution field in the first conversion DB, the luminance can be allocated appropriately to each projector 31 even when the number of projectors 31 whose projections are superimposed differs from place to place.
- FIG. 34 is an explanatory drawing for explaining a second modification of the ninth embodiment.
- In this modification, not a rectangular coordinate system but a coordinate system deformed into a barrel shape is used as the use range.
- By creating the second conversion DB based on such a barrel-shaped coordinate system, the original image data can be deformed into a barrel shape and displayed as shown in FIGS. 34A and 34B.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Geometry (AREA)
- Controls And Circuits For Display Device (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Projection Apparatus (AREA)
Abstract
Description
FIG. 1 is an explanatory diagram illustrating an overview of the information processing system 10. The information processing system 10 of the present embodiment is used, for example, to evaluate a camera under test 15 such as a vehicle-mounted camera.
The present embodiment relates to an information processing device 20 that creates luminance correction information for a plurality of measurement positions and displays an image corrected based on the measurement position closest to the position at which the camera under test 15 or the like is installed. Description of the parts common to Embodiment 1 is omitted.
The present embodiment relates to an information processing system 10 that superimposes images projected onto the screen 33 from a plurality of projectors 31. Description of the parts common to Embodiment 1 is omitted.
The present embodiment relates to an information processing system 10 that uses auxiliary projectors 32 that project an image onto part of the use range. Description of the parts common to Embodiment 3 is omitted.
The present embodiment relates to an information processing system 10 having a plurality of screens 33. Description of the parts common to Embodiment 3 is omitted.
The present embodiment relates to an information processing system 10 in which the user visually observes an actual-luminance image. Description of the parts common to Embodiment 3 is omitted.
FIG. 26 is an explanatory diagram showing the operation of the information processing device 20 of Embodiment 7. The information processing device 20 operates as follows under the control of the CPU 21.
The present embodiment relates to a form in which the information processing system 10 of the present embodiment is realized by operating a general-purpose computer 90 in combination with a program 97. FIG. 27 is an explanatory diagram showing the configuration of the information processing system 10 of Embodiment 8. Description of the parts common to Embodiment 1 is omitted.
The present embodiment relates to a form in which the coordinates of the image projected from the projector 31, the coordinates in the use range described with reference to FIG. 15, and the original image data are converted sequentially using conversion DBs. Description of the parts common to Embodiment 3 is omitted.
FIG. 33 is an explanatory diagram illustrating the first modification of Embodiment 9. FIG. 33A shows a state in which the first projector 311 to the fourth projector 314 project onto the screen 33. The edges of the projection ranges of the four projectors 31 overlap slightly, and at the central portion the projection ranges of all four projectors 31 overlap.
FIG. 34 is an explanatory diagram illustrating the second modification of Embodiment 9. In this modification, not a rectangular coordinate system but a coordinate system deformed into a barrel shape is used as the use range. By creating the second conversion DB described with reference to FIG. 31 based on such a barrel-shaped coordinate system, the original image data can be deformed into a barrel shape and displayed as shown in FIGS. 34A and 34B.
The embodiments disclosed herein are to be considered in all respects illustrative and not restrictive. The scope of the present invention is indicated not by the foregoing description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
15 camera under test
16 drive simulator
17 windshield
18 seat
19 steering wheel
20 information processing device
21 CPU
22 main storage device
23 auxiliary storage device
24 communication unit
25 output I/F
26 input I/F
27 monitor
28 reading unit
30 display device
31 projector
311 first projector
312 second projector
313 third projector
314 fourth projector
315 fifth projector
316 sixth projector
321 first auxiliary projector
322 second auxiliary projector
323 third auxiliary projector
324 fourth auxiliary projector
33 screen (display unit)
331 first screen
332 second screen
333 third screen
36 luminance meter (two-dimensional color luminance meter)
37 camera
51 measured luminance DB
52 luminance correction DB
61 first acquisition unit
62 second acquisition unit
63 luminance correction unit
64 output unit
96 portable recording medium
97 program
98 semiconductor memory
Claims (8)
- An information processing device comprising:
a first acquisition unit that acquires luminance correction information for correcting luminance, measured from a predetermined measurement position, of a display unit displaying an image based on an input signal, in accordance with luminance information included in the input signal;
a second acquisition unit that acquires image data to be displayed on the display unit;
a luminance correction unit that corrects the image data acquired by the second acquisition unit based on the luminance correction information acquired by the first acquisition unit; and
an output unit that outputs the image data corrected by the luminance correction unit to the display unit.
- The information processing device according to claim 1, further comprising:
a third acquisition unit that acquires shape correction information for correcting the shape of an image; and
a shape correction unit that corrects the image data corrected by the luminance correction unit based on the shape correction information acquired by the third acquisition unit,
wherein the output unit outputs the image data corrected by the shape correction unit.
- The information processing device according to claim 1 or 2, wherein
the image data includes tristimulus-value data associated with actual luminance, and
the luminance correction unit corrects the image data so that an image of the actual luminance corresponding to the image data is displayed on the display unit.
- An information processing system comprising a display device and an information processing device, wherein
the display device has a display unit that displays an image, and
the information processing device has:
a first acquisition unit that acquires luminance correction information for correcting luminance, measured from a predetermined measurement position, of the display unit displaying an image based on an input signal, in accordance with luminance information included in the input signal;
a second acquisition unit that acquires an image to be displayed on the display unit;
a luminance correction unit that corrects the image acquired by the second acquisition unit based on the correction information acquired by the first acquisition unit; and
an output unit that outputs the image corrected by the luminance correction unit to the display unit.
- The information processing system according to claim 4, wherein
the information processing device has a position acquisition unit that acquires the measurement position.
- The information processing system according to claim 4 or 5, wherein
the display unit is a screen for rear projection,
the display device includes a plurality of projectors capable of projecting images onto the screen,
the projectors are arranged such that their projection ranges mutually overlap,
the information processing device comprises a shape correction unit that performs correction so that the plurality of projectors project the image acquired by the second acquisition unit onto the screen in a superimposed state,
the luminance correction unit corrects the image corrected by the shape correction unit, and
the output unit outputs the image corrected by the luminance correction unit to each of the plurality of projectors.
- The information processing system according to claim 6, wherein some of the projectors are arranged such that part of their projection range overlaps the projection range of another of the projectors.
- An information processing method that causes a computer to execute a process of:
acquiring luminance correction information for correcting luminance, measured from a predetermined measurement position, of a display unit displaying an image based on an input signal, in accordance with luminance information included in the input signal;
acquiring an image to be displayed on the display unit;
correcting the acquired image based on the acquired luminance correction information; and
outputting the corrected image to the display unit.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18849975.0A EP3723363A4 (fr) | 2017-08-30 | 2018-08-30 | Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations |
US16/643,421 US20210407046A1 (en) | 2017-08-30 | 2018-08-30 | Information processing device, information processing system, and information processing method |
JP2019539636A JPWO2019045010A1 (ja) | 2017-08-30 | 2018-08-30 | 情報処理装置、情報処理システムおよび情報処理方法 |
IL272975A IL272975A (en) | 2017-08-30 | 2020-02-27 | Information processing device, information processing system and information processing method |
JP2022027115A JP7260937B2 (ja) | 2017-08-30 | 2022-02-24 | カメラテストシステムおよびカメラテスト方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017166119 | 2017-08-30 | ||
JP2017-166119 | 2017-08-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019045010A1 true WO2019045010A1 (ja) | 2019-03-07 |
Family
ID=65525777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/032246 WO2019045010A1 (ja) | 2017-08-30 | 2018-08-30 | 情報処理装置、情報処理システムおよび情報処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20210407046A1 (ja) |
EP (1) | EP3723363A4 (ja) |
JP (2) | JPWO2019045010A1 (ja) |
IL (1) | IL272975A (ja) |
WO (1) | WO2019045010A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113986417A (zh) * | 2021-10-11 | 2022-01-28 | 深圳康佳电子科技有限公司 | 一种应用程序投屏控制方法、装置、终端设备及存储介质 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019212601A1 (de) * | 2019-08-22 | 2021-02-25 | Volkswagen Aktiengesellschaft | Generieren einer Anzeige eines Augmented-Reality-Head-up-Displays für ein Kraftfahrzeug |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0756549A (ja) * | 1993-08-12 | 1995-03-03 | Hitachi Ltd | 画像機器及び色補正システム |
JP2004036225A (ja) * | 2002-07-03 | 2004-02-05 | Taiyo Kiso Kk | 結束リング |
JP2008051849A (ja) * | 2006-08-22 | 2008-03-06 | Seiko Epson Corp | プロジェクションシステム、画像情報処理装置、プログラム、及び記録媒体 |
JP2008191257A (ja) * | 2007-02-01 | 2008-08-21 | Canon Inc | 画像表示装置及びその制御方法、プログラム並びにコンピュータ読み取り可能な記憶媒体 |
JP2012142669A (ja) * | 2010-12-28 | 2012-07-26 | Seiko Epson Corp | 投写制御装置、投写システム、テストチャート、投写領域判定方法 |
JP2015177484A (ja) * | 2014-03-18 | 2015-10-05 | 株式会社Jvcケンウッド | モニタ及び映像信号表示方法 |
US20160046751A1 (en) | 2013-03-20 | 2016-02-18 | Basf Se | Polyurethane-based polymer composition |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001209358A (ja) * | 2000-01-26 | 2001-08-03 | Seiko Epson Corp | 表示画像のムラ補正 |
JP2003046751A (ja) * | 2001-07-27 | 2003-02-14 | Olympus Optical Co Ltd | マルチプロジェクションシステム |
US7129456B2 (en) | 2002-02-19 | 2006-10-31 | Olympus Corporation | Method and apparatus for calculating image correction data and projection system |
JP2004158941A (ja) * | 2002-11-05 | 2004-06-03 | Seiko Epson Corp | 色ムラ補正装置、プロジェクタ、色ムラ補正方法、プログラムおよび記録媒体 |
JP3620537B2 (ja) * | 2003-05-02 | 2005-02-16 | セイコーエプソン株式会社 | 画像処理システム、プロジェクタ、プログラム、情報記憶媒体および画像処理方法 |
US6817721B1 (en) * | 2003-07-02 | 2004-11-16 | Hewlett-Packard Development Company, L.P. | System and method for correcting projector non-uniformity |
JP2005189542A (ja) | 2003-12-25 | 2005-07-14 | National Institute Of Information & Communication Technology | 表示システム、表示プログラム、表示方法 |
-
2018
- 2018-08-30 EP EP18849975.0A patent/EP3723363A4/fr not_active Withdrawn
- 2018-08-30 US US16/643,421 patent/US20210407046A1/en not_active Abandoned
- 2018-08-30 WO PCT/JP2018/032246 patent/WO2019045010A1/ja active Application Filing
- 2018-08-30 JP JP2019539636A patent/JPWO2019045010A1/ja active Pending
-
2020
- 2020-02-27 IL IL272975A patent/IL272975A/en unknown
-
2022
- 2022-02-24 JP JP2022027115A patent/JP7260937B2/ja active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0756549A (ja) * | 1993-08-12 | 1995-03-03 | Hitachi Ltd | 画像機器及び色補正システム |
JP2004036225A (ja) * | 2002-07-03 | 2004-02-05 | Taiyo Kiso Kk | 結束リング |
JP2008051849A (ja) * | 2006-08-22 | 2008-03-06 | Seiko Epson Corp | プロジェクションシステム、画像情報処理装置、プログラム、及び記録媒体 |
JP2008191257A (ja) * | 2007-02-01 | 2008-08-21 | Canon Inc | 画像表示装置及びその制御方法、プログラム並びにコンピュータ読み取り可能な記憶媒体 |
JP2012142669A (ja) * | 2010-12-28 | 2012-07-26 | Seiko Epson Corp | 投写制御装置、投写システム、テストチャート、投写領域判定方法 |
US20160046751A1 (en) | 2013-03-20 | 2016-02-18 | Basf Se | Polyurethane-based polymer composition |
JP2015177484A (ja) * | 2014-03-18 | 2015-10-05 | 株式会社Jvcケンウッド | モニタ及び映像信号表示方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113986417A (zh) * | 2021-10-11 | 2022-01-28 | 深圳康佳电子科技有限公司 | 一种应用程序投屏控制方法、装置、终端设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
IL272975A (en) | 2020-04-30 |
JP7260937B2 (ja) | 2023-04-19 |
EP3723363A4 (fr) | 2021-11-03 |
US20210407046A1 (en) | 2021-12-30 |
JPWO2019045010A1 (ja) | 2020-10-01 |
EP3723363A1 (fr) | 2020-10-14 |
JP2022066278A (ja) | 2022-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100596206C (zh) | 投影仪颜色校正方法 | |
US9661257B2 (en) | Projection system, image processing device, and projection method | |
JP4165540B2 (ja) | 投写画像の位置調整方法 | |
TWI511122B (zh) | 校正相機之影像失真的校準方法與系統 | |
JP3497805B2 (ja) | 画像投影表示装置 | |
JP5603014B2 (ja) | 超解像表示の補正 | |
JP7260937B2 (ja) | カメラテストシステムおよびカメラテスト方法 | |
JP4114683B2 (ja) | 投写画像の位置調整方法 | |
CN105103541B (zh) | 图案位置检测方法、图案位置检测系统以及应用了这些的画质调整技术 | |
US20050002586A1 (en) | Method and system for providing formatted data to image processing means in accordance with a standard format | |
US20190313070A1 (en) | Automatic calibration projection system and method | |
US20120032973A1 (en) | Image processing apparatus, image processing method, and storage medium | |
CN103200409B (zh) | 多投影仪显示系统的颜色校正方法 | |
JPWO2003071794A1 (ja) | 画像補正データ算出方法、画像補正データ算出装置、及びプロジェクションシステム | |
JP5411786B2 (ja) | 撮影装置および画像統合プログラム | |
JP6028527B2 (ja) | 表示処理装置、表示処理方法、及びプログラム | |
CN108737806A (zh) | 一种投影仪色彩校正方法及装置、计算机存储介质 | |
JP2005189542A (ja) | 表示システム、表示プログラム、表示方法 | |
JP2009157219A (ja) | ディスプレイの評価方法およびそれに用いる評価装置 | |
KR20110095556A (ko) | 영상투사장치 및 그 영상보정방법 | |
JP4396107B2 (ja) | 映像表示システム | |
JP2009217174A (ja) | 映像表示システムの調整装置、調整方法、幾何情報取得方法、及びプログラム | |
WO2018230364A1 (ja) | 画像処理装置、画像処理方法、プログラム、およびプロジェクタ装置 | |
JP4696669B2 (ja) | 画像調整方法及び画像調整装置 | |
WO2019230438A1 (ja) | 情報処理装置、情報処理方法、およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18849975 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2019539636 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 272975 Country of ref document: IL |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2018849975 Country of ref document: EP Effective date: 20200330 |