WO2011061802A1 - Information operation device for display units - Google Patents

Information operation device for display units

Info

Publication number
WO2011061802A1
WO2011061802A1 (PCT application PCT/JP2009/006234)
Authority
WO
WIPO (PCT)
Prior art keywords
display unit
display
unit
information
image
Prior art date
Application number
PCT/JP2009/006234
Other languages
French (fr)
Japanese (ja)
Inventor
三木洋平
宮原浩二
藤本仁志
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2009/006234 priority Critical patent/WO2011061802A1/en
Publication of WO2011061802A1 publication Critical patent/WO2011061802A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30121 CRT, LCD or plasma display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3185 Geometric adjustment, e.g. keystone or convergence
    • H04N9/3191 Testing thereof
    • H04N9/3194 Testing thereof including sensor feedback

Definitions

  • The present invention relates to an information computation device for display units that, in a display device composed of a plurality of display units, detects the position of each display unit and, based on the detected position, measures the luminance of the display unit or superimposes display unit information.
  • In an LED large-screen display device configured by combining display units each composed of a plurality of LEDs, the luminance of the individual display units must be uniform in order to display a high-quality image.
  • However, because of the characteristics of the LED elements used in the display units, the luminance may not be uniform depending on the observation angle.
  • A measurement technique that detects the luminance of each display unit is therefore required to reduce the operator's load.
  • With the technique of Patent Document 1 (JP 2007-71723 A), measuring the luminance of each display unit region with imaging means requires first displaying a position measurement pattern for determining the position of each display unit and then displaying a separate pattern for luminance measurement. In addition, because only information on the problematic portions is output from the luminance calculation result, separately from the photographed image, it is difficult for the operator to grasp the situation intuitively.
  • The present invention has been made to solve this problem, and its object is to provide an information computation device for display units that can easily measure the position of each display unit without requiring a special display pattern for position measurement.
  • The information computation device for display units according to the present invention obtains, from a photographed image, a projective transformation matrix representing the relationship between the photographed-image coordinate system and the display-area coordinate system, and calculates the position of each display unit of the display device based on this matrix. As a result, no special display pattern for position measurement needs to be prepared, and the positions of the display units can be measured easily.
  • FIG. 1 shows the relationship between the display device 1 and the photographing device 2 used with the information computation device for display units according to this embodiment.
  • FIG. 2 is a configuration diagram of the information computation device for display units according to this embodiment.
  • As shown in FIG. 1, the information computation device for display units according to this embodiment is realized by a computation device 100 to which a photographing device (photographing means) 2 for photographing the display device 1 is connected.
  • The computation device 100 includes a video acquisition unit 101, a four-corner detection unit 102, a projective transformation matrix calculation unit 103, a display unit position calculation unit 104, and a luminance calculation unit 105.
  • The computation device 100 is realized by a computer; the video acquisition unit 101 through the luminance calculation unit 105 consist of software corresponding to each function together with hardware, such as a CPU and memory, that executes that software.
  • Alternatively, any of the functional units from the video acquisition unit 101 to the luminance calculation unit 105 may be implemented as dedicated hardware.
  • The video acquisition unit 101 acquires an image of the display device 1 photographed by the photographing device 2.
  • The four-corner detection unit 102 detects the positions of the four corners of the display device in the image held by the video acquisition unit 101.
  • The projective transformation matrix calculation unit 103 calculates a projective transformation matrix representing the correspondence between the photographed-image coordinate system and the display-area coordinate system, which is needed to identify the display unit positions in the photographed image.
  • The display unit position calculation unit 104 obtains the display unit positions in the image using the projective transformation matrix obtained by the projective transformation matrix calculation unit 103.
  • The four-corner detection unit 102 through the display unit position calculation unit 104 together constitute the display unit position calculating means.
  • The luminance calculation unit 105 obtains the luminance of each display unit from the photographed image, based on the display unit position information identified by the display unit position calculation unit 104.
  • The photographing device 2 is set at an arbitrary angle with respect to the display device 1, the entire surface of the display device 1 is lit (step ST1), a photograph is taken in this state (step ST2), and the photographed image shown in FIG. 4 is stored in the video acquisition unit 101.
  • In step ST3, the four-corner detection unit 102 detects the four corners of the display screen in the photographed image of FIG. 4, as shown in FIG. 5.
  • A generally known corner detection method is used. If something other than the display screen in the photographed image would adversely affect corner detection, the detection area may be restricted, for example by manually setting a mask region.
  • In step ST4, when the display device 1 consists of display units arranged in m rows and n columns, display-screen coordinates are defined as shown in FIG. 6 with the upper left at (0, 0) and the lower right at (m, n), and a projective transformation matrix is obtained for converting these display-area coordinates into the photographed-image coordinates that represent pixel positions in the photographed image (A in FIG. 6). Specifically, with the photographed-image coordinate position (Cx, Cy) and the display-screen coordinates (Sx, Sy), the projective transformation is expressed as in Equation (1); since the coordinates are homogeneous, the scale factor λ can be eliminated.
  • Since the four corner positions found in step ST3 and the four corresponding display-area coordinates are known, each coefficient of the projective transformation matrix can be obtained using Equation (2).
  • In step ST5, the display unit position calculation unit 104 uses the projective transformation matrix obtained in step ST4 to find, for the vertex coordinates of each display unit in display-screen coordinates, the corresponding coordinates in the photographed image, as shown at B in FIG. 6. That is, the boundary positions of the display units are obtained using the projective transformation matrix.
  • In step ST6, the luminance calculation unit 105 calculates the luminance of each display unit, using the individual display unit positions in the photographed-image coordinate system obtained in step ST5 as a guide. If something in the photographing environment affects the luminance measurement, its influence can be removed by separately acquiring a photographed image with the entire display screen turned off, creating a difference image, and performing the measurement on that image.
  • As described above, the information computation device for display units according to Embodiment 1 includes photographing means for photographing a display image of a display device composed of a plurality of display units; display unit position calculating means for obtaining, from the photographed image, a projective transformation matrix representing the relationship between the photographed-image coordinate system and the display-area coordinate system, and for calculating the position of each of the plurality of display units based on that matrix; and a luminance calculation unit for calculating the luminance of each display unit based on the calculated positions and the photographed image. Consequently, the display unit positions in the photographed image can be identified and the luminance of each unit can be measured using only the display pattern intended for luminance measurement and its photographed image; no dedicated display pattern for identifying display unit positions and no additional photographed image are needed.
  • Furthermore, since the display unit position calculating means obtains the projective transformation matrix from the correspondence between the four corner positions in the photographed image and those in the display image, and obtains the boundary position of each display unit based on that matrix, the positions of the display units can be obtained easily.
  • Embodiment 2.
  • In Embodiment 1, the positions of the display units in the photographed image and the luminance of each unit were measured using only the display pattern intended for luminance measurement and its photographed image. In Embodiment 2, the information on the boundaries between the display units and on the luminance of each display unit obtained in Embodiment 1 is superimposed on the photographed image.
  • FIG. 7 is a configuration diagram of the information computation device for display units according to Embodiment 2.
  • The display device 1, the photographing device 2, and the video acquisition unit 101 through the luminance calculation unit 105 in the computation device 100a are the same as in Embodiment 1 shown in FIG. 2, so their description is omitted.
  • The computation device 100a additionally includes a superimposed image generation unit 106, which superimposes on the photographed image the display unit boundaries obtained by the display unit position calculation unit 104 and the luminance difference information between adjacent display units obtained from the results of the luminance calculation unit 105.
  • In step ST7, the superimposed image generation unit 106 creates a superimposed image based on the information obtained in steps ST5 and ST6. Specifically, as shown in FIG. 9, from the position information of each display unit in the photographed image obtained in step ST5, points are superimposed on the vertex positions of the display units and lines are superimposed on their boundary positions. When addresses are defined for the display units, numbers or characters may also be superimposed so that the viewer can easily identify each address. Then, as shown in FIG. 10, based on the luminance information of the display units obtained in step ST6, boundaries where the luminance difference between adjacent display units is large are emphasized with lines whose density corresponds to the magnitude of the difference.
  • Alternatively, the emphasis may be based on line thickness: a thick line where the luminance difference is large and a thin line where it is small. By emphasizing the portions where the luminance difference between adjacent display units is large in this way, the operator can easily see where the problematic units requiring attention are.
  • In step ST8, the superimposed image 3 created in step ST7 is displayed.
  • As the display method, it may be shown on the display of a personal computer (not shown) or printed on paper.
  • In the above description, the luminance was measured with the whole display lit, and a superimposed image was created from the result.
  • Since a display unit's tendency toward brightness or darkness can differ among its R, G, and B display elements, it is also possible to photograph the display with R, G, and B lit individually and to superimpose the brightness conditions of each color from the respective images.
  • As described above, according to Embodiment 2, the superimposed image generation unit superimposes the boundary position of each display unit and the luminance difference information between the display units on the photographed image. This makes it easier for the operator to recognize the positions of the display units and the locations with large luminance differences.
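The grading of boundary emphasis in Embodiment 2 can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the function name, the three emphasis levels, and the threshold values `small` and `large` are assumptions chosen for the example; the embodiment only requires that line density or thickness grow with the luminance difference between adjacent units.

```python
import numpy as np

def boundary_weights(lum, small=5.0, large=20.0):
    """For each boundary between horizontally or vertically adjacent
    display units, grade the luminance difference: 0 = plain line,
    1 = thin emphasis, 2 = thick emphasis. `lum` is the m x n array of
    per-unit luminances from step ST6. Returns weights for the vertical
    boundaries (between columns j and j+1, shape (m, n-1)) and for the
    horizontal boundaries (between rows i and i+1, shape (m-1, n))."""
    def grade(diff):
        w = np.zeros_like(diff, dtype=int)
        w[diff >= small] = 1
        w[diff >= large] = 2
        return w
    vert = grade(np.abs(np.diff(lum, axis=1)))
    horiz = grade(np.abs(np.diff(lum, axis=0)))
    return vert, horiz
```

A renderer would then draw each boundary segment with a thickness (or density) chosen by its grade, so that problematic units stand out to the operator.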
  • In Embodiment 2, the information superimposed on the photographed image related to the luminance of the display units. In Embodiment 3, the display unit position information in the photographed image obtained in Embodiment 1 is used, and display unit information is superimposed on the photographed image instead.
  • FIG. 12 is a configuration diagram of the information computation device for display units according to Embodiment 3.
  • The computation device 100b includes a display unit information superimposed image generation unit 107 in place of the luminance calculation unit 105 of Embodiment 1.
  • The display unit information superimposed image generation unit 107 acquires display unit information from the display unit information storage device 4, which stores this information, and generates an image on which the display unit information is superimposed.
  • Steps ST1 to ST5 are the same as in Embodiments 1 and 2.
  • In step ST9, the display unit information is superimposed on the photographed image.
  • The display unit information is information unique to each display unit, for example its manufacturing number and its operating time.
  • In step ST10, the created superimposed image 3a is displayed.
  • As in step ST8 of Embodiment 2, it may be shown on the display of a personal computer or printed on paper.
  • FIG. 14 is an explanatory diagram of the case where the serial number is displayed for each display unit.
  • In this way, the operator can easily see, visually, display unit information such as the manufacturing number of each display unit in the display device. In the case of an assembly-type large display device (one that is assembled for use and disassembled afterward), the set of display units used changes each time, so displaying the manufacturing number of each unit lets the operator easily confirm which display units the display device has been assembled from.
  • In such assembly-type devices there are also display units that are used frequently and others that are not, so the operating time differs from unit to unit; displaying it lets the operator check the operating time of each unit easily. Even for stationary large display devices, display units may need partial replacement over long-term use, and knowing the operating time of each unit is useful for maintenance in such cases as well.
  • The operator may also be allowed to select whether or not to superimpose the display unit information, as necessary. The operator can then view the display unit information when needed, and by selecting no display can easily check the photographed image itself.
  • As described above, the information computation device for display units according to Embodiment 3 includes photographing means for photographing a display image of a display device composed of a plurality of display units; display unit position calculating means for obtaining, from the photographed image, a projective transformation matrix representing the relationship between the photographed-image coordinate system and the display-area coordinate system, and for calculating the position of each of the plurality of display units based on that matrix; and a display unit information superimposed image generation unit for superimposing display unit information, which is information unique to each display unit, at the position of each unit in the photographed image based on the calculated positions. Since the unit information is displayed at the corresponding positions in the photographed image, the operator can easily grasp the state of the entire display device.
  • As described above, the information computation device for display units according to the present invention detects the position of each display unit in a display device composed of a plurality of display units, and is suitable for detecting display unit positions in an LED large-screen display device configured by combining display units each composed of a plurality of LEDs.
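The superimposition of unit-specific information in Embodiment 3, including the operator's display on/off choice, might be organized as in the sketch below. All names are illustrative assumptions, not from the patent: `unit_info` stands in for the display unit information storage device 4 (mapping (row, column) to a serial number or operating-time string), and unit centres are located with the projective transformation matrix H of Embodiment 1.

```python
import numpy as np

def unit_info_overlay(H, unit_info, show=True):
    """Compute where each display unit's own information (e.g. serial
    number, operating time) would be drawn in the photographed image.
    The centre of unit (row i, column j) is display coordinate
    (j + 0.5, i + 0.5), pushed through the 3x3 projective matrix H and
    de-homogenized. With show=False (the operator's "no display"
    choice) an empty overlay is returned, so the photographed image
    itself can be checked."""
    if not show:
        return []
    overlay = []
    for (i, j), text in sorted(unit_info.items()):
        x, y, w = H @ np.array([j + 0.5, i + 0.5, 1.0])
        overlay.append(((x / w, y / w), text))
    return overlay
```

A drawing routine would then render each `text` at its returned image position, on top of the photographed image.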

Abstract

A projective transform matrix calculation unit (103) finds a projective transform matrix indicating the relationship between a captured image coordinate system and a display region coordinate system on the basis of a captured image acquired by a video acquisition unit (101). A display unit position calculation unit (104) calculates the positions of respective display units in a display device (1) on the basis of the projective transform matrix. A luminance calculation unit (105) calculates the luminances of the respective display units on the basis of the positions of the respective display units and the captured image.

Description

Information computation device for display units
The present invention relates to an information computation device for display units that, in a display device composed of a plurality of display units, detects the position of each display unit and, based on the detected position, measures the luminance of the display unit or superimposes display unit information.
In an LED large-screen display device configured by combining display units each composed of a plurality of LEDs, the luminance of the individual display units must be uniform in order to display a high-quality image. However, because of the characteristics of the LED elements used in the display units, the luminance may not be uniform depending on the observation angle. If an operator investigates display units with luminance problems by visual inspection alone, the work is long and demanding, and a uniform evaluation is difficult. A measurement technique that detects the luminance of each display unit is therefore needed to reduce the operator's load.
Conventionally, as a luminance measurement method for such an image display device, there is a technique in which a display screen showing a position measurement pattern is photographed from a plurality of directions using imaging means, mutually corresponding pixels between the photographed images are found using the position measurement pattern as a clue, and abnormal pixels are detected from the difference values between the photographed images (see, for example, Patent Document 1).
Patent Document 1: JP 2007-71723 A
However, with the technique described in Patent Document 1, measuring the luminance of each display unit region with the imaging means requires first displaying a position measurement pattern for determining the position of each display unit and then displaying a separate pattern for luminance measurement. In addition, because only information on the problematic portions is output from the luminance calculation result, separately from the photographed image, it is difficult for the operator to grasp the situation intuitively.
The present invention has been made to solve this problem, and its object is to provide an information computation device for display units that can easily measure the position of each display unit without requiring a special display pattern for position measurement.
The information computation device for display units according to the present invention obtains, from a photographed image, a projective transformation matrix representing the relationship between the photographed-image coordinate system and the display-area coordinate system, and calculates the position of each display unit of the display device based on this matrix. As a result, no special display pattern for position measurement needs to be prepared, and the positions of the display units can be measured easily.
FIG. 1 is an explanatory diagram showing the display device and the photographing device used with the information computation device for display units according to the present invention.
FIG. 2 is a configuration diagram of the information computation device for display units according to Embodiment 1 of the invention.
FIG. 3 is a flowchart showing the operation of the information computation device for display units according to Embodiment 1.
FIG. 4 is an explanatory diagram of a photographed image in the information computation device for display units according to Embodiment 1.
FIG. 5 is an explanatory diagram showing the four corners of the photographed image in Embodiment 1.
FIG. 6 is an explanatory diagram showing the relationship between photographed-image coordinates and display-screen coordinates in Embodiment 1.
FIG. 7 is a configuration diagram of the information computation device for display units according to Embodiment 2.
FIG. 8 is a flowchart showing the operation of the information computation device for display units according to Embodiment 2.
FIG. 9 is an explanatory diagram showing the superimposed state of the vertex positions and boundary positions of the display units in Embodiment 2.
FIG. 10 is an explanatory diagram showing a display example of the luminance difference in Embodiment 2.
FIG. 11 is an explanatory diagram showing another display example of the luminance difference in Embodiment 2.
FIG. 12 is a configuration diagram of the information computation device for display units according to Embodiment 3.
FIG. 13 is a flowchart showing the operation of the information computation device for display units according to Embodiment 3.
FIG. 14 is an explanatory diagram showing a superimposition example of display unit information in Embodiment 3.
Hereinafter, in order to describe the present invention in more detail, modes for carrying out the invention will be described with reference to the accompanying drawings.

Embodiment 1.

FIG. 1 shows the relationship between the display device 1 and the photographing device 2 used with the information computation device for display units according to this embodiment. FIG. 2 is a configuration diagram of the information computation device for display units according to this embodiment.

As shown in FIG. 1, the information computation device for display units according to this embodiment is realized by a computation device 100 to which a photographing device (photographing means) 2 for photographing the display device 1 is connected. The computation device 100 includes a video acquisition unit 101, a four-corner detection unit 102, a projective transformation matrix calculation unit 103, a display unit position calculation unit 104, and a luminance calculation unit 105. The computation device 100 is realized by a computer; the video acquisition unit 101 through the luminance calculation unit 105 consist of software corresponding to each function together with hardware, such as a CPU and memory, that executes that software. Alternatively, any of the functional units from the video acquisition unit 101 to the luminance calculation unit 105 may be implemented as dedicated hardware.
The video acquisition unit 101 acquires an image of the display device 1 photographed by the photographing device 2. The four-corner detection unit 102 detects the positions of the four corners of the display device in the image held by the video acquisition unit 101. The projective transformation matrix calculation unit 103 calculates a projective transformation matrix representing the correspondence between the photographed-image coordinate system and the display-area coordinate system, which is needed to identify the display unit positions in the photographed image. The display unit position calculation unit 104 obtains the display unit positions in the image using the projective transformation matrix obtained by the projective transformation matrix calculation unit 103. The four-corner detection unit 102 through the display unit position calculation unit 104 together constitute the display unit position calculating means. The luminance calculation unit 105 obtains the luminance of each display unit from the photographed image, based on the display unit position information identified by the display unit position calculation unit 104.
Next, the operation of the information computation device for display units according to Embodiment 1 will be described with reference to the flowchart of FIG. 3.

As shown in FIG. 1, the photographing device 2 is set at an arbitrary angle with respect to the display device 1, the entire surface of the display device 1 is lit (step ST1), and a photograph is taken in this state (step ST2); the photographed image shown in FIG. 4 is stored in the video acquisition unit 101.

Next, in step ST3, the four-corner detection unit 102 detects the four corners of the display screen in the photographed image of FIG. 4, as shown in FIG. 5, using a generally known corner detection method. If something other than the display screen in the photographed image would adversely affect corner detection, the detection area may be restricted, for example by manually setting a mask region.
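The corner detection of step ST3 can be sketched as follows. The text only requires a "generally known corner detection method"; this minimal example (the function and its parameters are assumptions made for illustration) instead exploits the fact that the whole screen is lit in step ST1: it thresholds the image, optionally applies a manually set mask region, and takes the lit pixels that are extreme along the two diagonal directions as the four corners.

```python
import numpy as np

def detect_four_corners(img, thresh=128, mask=None):
    """Find the four corners of a bright (fully lit) display screen in
    a grayscale image. The lit screen is segmented by thresholding;
    the corners are the lit pixels extreme along x+y and x-y, returned
    as (x, y) in the order top-left, top-right, bottom-right,
    bottom-left for a roughly axis-aligned screen."""
    if mask is not None:
        img = np.where(mask, img, 0)  # limit detection to the mask region
    ys, xs = np.nonzero(img >= thresh)
    if xs.size == 0:
        raise ValueError("no lit region found")
    s, d = xs + ys, xs - ys
    idx = [np.argmin(s), np.argmax(d), np.argmax(s), np.argmin(d)]
    return [(int(xs[i]), int(ys[i])) for i in idx]
```

In practice a dedicated corner detector (e.g. a Harris-type method) would be more robust to perspective and clutter; the diagonal-extreme trick is just the simplest grounded stand-in for this step.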
 次に、ステップST4において、表示装置1が、m行n列の表示ユニットで構成された装置であるとき、図6中の表示画面座標に示すように、左上を(0,0)、右下を(m,n)と定義した2次元座標を表示領域座標から撮影画像の画素位置を表す撮影画像座標へ変換する際に使用する射影変換行列を求める(図6中のAで示す)。具体的には、撮影画像座標位置を(C,C)、表示画面座標を(S,S)とすると、射影変換行列は式(1)のように表され、斉次座標の性質からλは消去できる。

Figure JPOXMLDOC01-appb-I000001
 ステップST3で求めた撮影画像上の表示装置1の四隅位置(4点の座標をそれぞれ(C1x,C1y)(C2x,C2y)(C3x,C3y)(C4x,C4y)とする)と、それに対応する表示領域座標(4点の座標をそれぞれ(S1x,S1y)(S2x,S2y)(S3x,S3y)(S4x,S4y)が4組分かっているため、式(2)を用いて射影変換行列の各係数が求まる。
Next, in step ST4, when the display device 1 is a device composed of display units of m rows and n columns, the upper left is (0, 0) and the lower right is shown in the display screen coordinates in FIG. A projection transformation matrix used when converting the two-dimensional coordinates defined as (m, n) from the display area coordinates to the photographed image coordinates representing the pixel position of the photographed image is obtained (indicated by A in FIG. 6). Specifically, assuming that the captured image coordinate position is (C x , C y ) and the display screen coordinates are (S x , S y ), the projective transformation matrix is expressed as in Equation (1), From the nature, λ can be eliminated.

Figure JPOXMLDOC01-appb-I000001
Four corner positions of the display device 1 on the captured image obtained in step ST3 (the coordinates of the four points are ( C1x , C1y ) ( C2x , C2y ) ( C3x , C3y ) ( C4x , C4y ), respectively) And four display area coordinates (S 1x , S 1y ), (S 2x , S 2y ), (S 3x , S 3y ), (S 4x , S 4y ) corresponding to the display area coordinates corresponding thereto. Therefore, each coefficient of the projective transformation matrix can be obtained using Equation (2).

Equation (2): eliminating λ from Equation (1) for each correspondence i = 1, ..., 4 gives the eight linear equations
  Cix = h11·Six + h12·Siy + h13 − h31·Six·Cix − h32·Siy·Cix
  Ciy = h21·Six + h22·Siy + h23 − h31·Six·Ciy − h32·Siy·Ciy
from which the eight coefficients of the projective transformation matrix are obtained.
Next, in step ST5, the display unit position calculation unit 104 uses the projective transformation matrix obtained in step ST4 to find, as indicated by B in FIG. 6, the coordinates on the captured image that correspond to the vertex coordinates of each display unit in display screen coordinates. That is, the boundary positions of the display units are obtained using the projective transformation matrix.
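The homography estimation of step ST4 and the vertex mapping of step ST5 can be sketched as follows. This is a minimal illustration, not the patented implementation: the corner coordinates are made up, and a pure-Python solver stands in for whatever numerical routine the device actually uses.

```python
def solve_homography(src, dst):
    """Estimate the 3x3 projective transformation H mapping each
    display-area coordinate (Sx, Sy) in `src` to the captured-image
    coordinate (Cx, Cy) in `dst`, with h33 fixed to 1 (cf. Equation (2))."""
    A, b = [], []
    for (sx, sy), (cx, cy) in zip(src, dst):
        # Two linear equations per correspondence after eliminating lambda.
        A.append([sx, sy, 1, 0, 0, 0, -sx * cx, -sy * cx]); b.append(cx)
        A.append([0, 0, 0, sx, sy, 1, -sx * cy, -sy * cy]); b.append(cy)
    # Solve the 8x8 system by Gaussian elimination with partial pivoting.
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def project(H, sx, sy):
    """Map a display-area coordinate to a captured-image coordinate (Eq. (1))."""
    lam = H[2][0] * sx + H[2][1] * sy + H[2][2]
    return ((H[0][0] * sx + H[0][1] * sy + H[0][2]) / lam,
            (H[1][0] * sx + H[1][1] * sy + H[1][2]) / lam)

# Four detected corners of an m-row x n-column display (example values only):
m, n = 2, 3
corners_display = [(0, 0), (n, 0), (n, m), (0, m)]           # (x, y) in unit grid
corners_image = [(102.0, 55.0), (618.0, 71.0), (600.0, 400.0), (95.0, 380.0)]
H = solve_homography(corners_display, corners_image)

# Step ST5: every display unit vertex mapped into the captured image.
unit_vertices = {(i, j): project(H, j, i) for i in range(m + 1) for j in range(n + 1)}
```

Because the matrix is fitted exactly to the four correspondences, projecting any display corner reproduces its detected image position, and the interior grid vertices fall on the unit boundaries in the photograph.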
Next, in step ST6, the luminance calculation unit 105 calculates the luminance of each display unit, using as a guide the individual display unit positions in the captured-image coordinate system obtained in step ST5. If something in the shooting environment affects the luminance measurement, a captured image with the entire display screen turned off may be acquired separately and a difference image created; using the difference image removes that influence.
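The per-unit luminance of step ST6 can be sketched as below. This is a simplified illustration: the captured image is treated as a 2-D list of gray levels and each unit's region is approximated by an axis-aligned box, whereas the actual unit regions are the projected quadrilaterals from step ST5; the pixel values are made up.

```python
def mean_luminance(image, box, background=None):
    """Average gray level inside box = (x0, y0, x1, y1), optionally
    subtracting a same-size image captured with the display turned off
    (the difference image described in step ST6)."""
    x0, y0, x1, y1 = box
    total, count = 0.0, 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            v = image[y][x]
            if background is not None:
                v -= background[y][x]  # remove ambient-light influence
            total += v
            count += 1
    return total / count

# Made-up 4x6 "captured image" covering a 1-row x 2-column arrangement of units:
captured = [[10, 10, 10, 30, 30, 30],
            [10, 10, 10, 30, 30, 30],
            [10, 10, 10, 30, 30, 30],
            [10, 10, 10, 30, 30, 30]]
ambient = [[2] * 6 for _ in range(4)]   # display-off capture
left = mean_luminance(captured, (0, 0, 3, 4), ambient)
right = mean_luminance(captured, (3, 0, 6, 4), ambient)
```

With the ambient capture subtracted, the left unit averages 8.0 and the right unit 28.0, so a fixed background glow does not distort the comparison between units.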
As described above, the information operation device for display units of Embodiment 1 comprises: photographing means for photographing a display image of a display device composed of a plurality of display units; display unit position calculation means for obtaining, from the captured image, a projective transformation matrix representing the relationship between the captured-image coordinate system and the display-area coordinate system, and calculating the position of each of the plurality of display units based on that matrix; and a luminance calculation unit for calculating the luminance of each display unit based on the calculated positions and the captured image. Consequently, the display unit positions on the captured image can be identified and the luminance of each unit measured using only the display pattern intended for luminance measurement and its captured image, without preparing a separate display pattern for identifying display unit positions or acquiring an additional captured image.
Further, according to the information operation device for display units of Embodiment 1, the display unit position calculation means obtains the projective transformation matrix from the correspondence between the four corner positions in the captured image and in the display image, and obtains the boundary position of each display unit based on that matrix, so the display unit positions can be determined easily.
Embodiment 2.
In Embodiment 1 above, the display unit positions on the captured image were identified and the luminance of each unit measured using only the display pattern intended for luminance measurement and its captured image. In Embodiment 2, the display unit boundaries and per-unit luminance information obtained in Embodiment 1 are superimposed on the captured image.
FIG. 7 is a configuration diagram of the information operation device for display units according to Embodiment 2.
In the figure, the configurations of the display device 1, the imaging device 2, and the image acquisition unit 101 through the luminance calculation unit 105 in the arithmetic device 100a are the same as in Embodiment 1 shown in FIG. 2, so their description is omitted here. In Embodiment 2, the arithmetic device 100a further includes a superimposed image generation unit 106 that superimposes, on the captured image, luminance difference information between adjacent display units derived from the display unit boundaries determined by the display unit position calculation unit 104 and the results of the luminance calculation unit 105.
Next, the operation of the information operation device for display units according to Embodiment 2 is described with reference to the flowchart of FIG. 8.
First, the operations from step ST1 to step ST6 are the same as in Embodiment 1. Next, in step ST7, the superimposed image generation unit 106 creates a superimposed image from the information obtained in steps ST5 and ST6. Specifically, as shown in FIG. 9, based on the positions of the display units on the captured image obtained in step ST5, points are superimposed on the vertex positions of the display units and lines on their boundary positions. If addresses are defined for the display units, numbers or characters may also be superimposed so that the viewer can easily identify each address. Then, as shown in FIG. 10, based on the luminance information obtained in step ST6, boundaries where the luminance difference between adjacent display units is large are emphasized with lines whose density varies with the magnitude of the difference. Alternatively, as shown in FIG. 11, the emphasis may be expressed by line thickness (thick lines for large luminance differences, thin lines for small ones). Emphasizing the portions with large luminance differences between adjacent units lets the operator easily see where the problematic units are.
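The emphasis rule of step ST7 can be sketched as follows. The linear mapping from luminance difference to line width is an illustrative choice, not taken from the patent, and the per-unit luminance values are made up.

```python
def boundary_weights(unit_luma, max_width=5):
    """For each pair of horizontally or vertically adjacent display units,
    map the luminance difference to a line width in 1..max_width: small
    differences get thin boundary lines, large differences thick ones
    (the thickness-based emphasis of FIG. 11)."""
    rows, cols = len(unit_luma), len(unit_luma[0])
    diffs = {}
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):      # right and down neighbours
                r2, c2 = r + dr, c + dc
                if r2 < rows and c2 < cols:
                    diffs[((r, c), (r2, c2))] = abs(unit_luma[r][c] - unit_luma[r2][c2])
    peak = max(diffs.values()) or 1              # avoid division by zero
    return {edge: 1 + round((max_width - 1) * d / peak) for edge, d in diffs.items()}

luma = [[100, 104],
        [100, 140]]   # unit (1, 1) is noticeably brighter than its neighbours
widths = boundary_weights(luma)
```

The boundaries around the bright unit (differences of 36 and 40) receive the maximum width, while the near-uniform boundaries stay thin, which is exactly the visual cue the operator needs.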
In step ST8, the superimposed image 3 created in step ST7 is displayed. It may be shown on the display of a personal computer (not shown) or printed on paper.
In the above operation example, luminance was measured and a superimposed image was created from the result. However, if the light/dark tendency of a display unit differs among the R, G, and B display elements, images may be captured with R, G, and B lit individually, and the per-channel light/dark conditions superimposed from the respective images.
As described above, the information operation device for display units of Embodiment 2 includes a superimposed image generation unit that superimposes the boundary position of each display unit and the luminance difference information between display units on the captured image, so the operator can easily recognize the positions of the display units.
Embodiment 3.
In Embodiment 2 above, the information superimposed on the captured image concerned the luminance of the display units. In Embodiment 3, by contrast, the display unit positions on the captured image obtained as in Embodiment 1 are used, and display unit information is superimposed on the captured image instead.
FIG. 12 is a configuration diagram of the information operation device for display units according to Embodiment 3.
In the information operation device for display units of Embodiment 3, the arithmetic device 100b includes a display unit information superimposed image generation unit 107 in place of the luminance calculation unit 105 of Embodiment 1. This unit 107 acquires display unit information from the display unit information storage device 4, which stores that information, and generates an image in which the display unit information is superimposed.
Next, the operation of the information operation device for display units according to Embodiment 3 is described with reference to the flowchart of FIG. 13.
Steps ST1 to ST5 are the same as in Embodiments 1 and 2. In Embodiment 3, after the display unit positions on the captured image are defined in step ST5, display unit information is superimposed on the captured image in step ST9. Display unit information is information unique to each display unit, such as its serial number or operating time. Next, in step ST10, the created superimposed image 3a is displayed. As in step ST8 of Embodiment 2, it may be shown on the display of a personal computer or printed on paper.
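The label construction of step ST9, together with the optional toggle described later in this embodiment, can be sketched as below. The record fields, serial numbers, and pixel anchors are all hypothetical example values; in the device, the anchors would come from the unit positions located in step ST5 and the records from the storage device 4.

```python
# Hypothetical per-unit records as might come from the storage device 4.
unit_info = {
    (0, 0): {"serial": "SN-0231", "hours": 1200},
    (0, 1): {"serial": "SN-0417", "hours": 300},
}
# Pixel anchor for each unit's label, e.g. the unit center on the captured
# image (made-up values standing in for the step ST5 result).
unit_anchor = {(0, 0): (160, 120), (0, 1): (480, 120)}

def overlay_labels(show_info=True):
    """Build the (position, text) labels for step ST9. The operator may
    turn the overlay off entirely, in which case no labels are drawn and
    the plain captured image remains visible."""
    if not show_info:
        return []
    return [(unit_anchor[addr], f"{info['serial']} / {info['hours']}h")
            for addr, info in sorted(unit_info.items())]

labels = overlay_labels()
```

Keeping the overlay as data (rather than drawing immediately) makes the on/off selection a one-line decision at render time.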
FIG. 14 is an explanatory diagram of the case where the serial number of each display unit is displayed.
By displaying the serial number for each display unit in this way, the operator can easily read display unit information such as each unit's serial number. In particular, for an assembly-type large display device (one assembled for use and disassembled afterwards), the display units used change each time, so displaying per-unit serial numbers lets the operator easily confirm which display units the device has been assembled from.
Further, with such an assembly-type large display device, some display units are used frequently and others are not, so operating times differ from unit to unit. Displaying the operating time as display unit information therefore lets the operator easily check each unit's operating time. Even for a stationary large display device, some display units may need replacement over a long service life; in such cases too, knowing the operating time of each unit is useful for maintaining the display device.
Note that in Embodiment 3 the superimposed image 3a need not be output unconditionally; the operator may choose whether to superimpose the display unit information as needed. The operator can then view the display unit information when necessary and, by choosing no overlay, can also easily inspect the captured image itself.
As described above, the information operation device for display units of Embodiment 3 comprises: photographing means for photographing a display image of a display device composed of a plurality of display units; display unit position calculation means for obtaining, from the captured image, a projective transformation matrix representing the relationship between the captured-image coordinate system and the display-area coordinate system, and calculating the position of each of the plurality of display units based on that matrix; and a display unit information superimposed image generation unit that superimposes display unit information, which is information unique to each display unit, on the position of each display unit in the captured image. Because the display unit information is displayed at the corresponding positions on the captured image, the operator can easily grasp the display device as a whole.
As described above, the information operation device for display units according to the present invention relates to a configuration for detecting the position of each display unit in a display device composed of a plurality of display units, and is suitable for detecting the display unit positions of a large LED screen built by combining display units each composed of a plurality of LEDs.

Claims (4)

  1.  An information operation device for display units, comprising:
     photographing means for photographing a display image of a display device composed of a plurality of display units;
     display unit position calculation means for obtaining, from the image captured by the photographing means, a projective transformation matrix representing the relationship between the captured-image coordinate system and the display-area coordinate system, and for calculating the position of each of the plurality of display units in the display device based on the projective transformation matrix; and
     a luminance calculation unit for calculating the luminance of each display unit based on the positions calculated by the display unit position calculation means and the captured image.
  2.  The information operation device for display units according to claim 1, wherein the display unit position calculation means obtains the projective transformation matrix from the correspondence between the four corner positions in the captured image and in the display image, and obtains the boundary position of each display unit based on the projective transformation matrix.
  3.  The information operation device for display units according to claim 2, further comprising a superimposed image generation unit that superimposes the boundary position of each display unit and luminance difference information between the display units on the captured image.
  4.  An information operation device for display units, comprising:
     photographing means for photographing a display image of a display device composed of a plurality of display units;
     display unit position calculation means for obtaining, from the image captured by the photographing means, a projective transformation matrix representing the relationship between the captured-image coordinate system and the display-area coordinate system, and for calculating the position of each of the plurality of display units in the display device based on the projective transformation matrix; and
     a display unit information superimposed image generation unit that superimposes display unit information, which is information unique to each display unit, on the position of each display unit in the captured image, based on the positions calculated by the display unit position calculation means.
PCT/JP2009/006234 2009-11-19 2009-11-19 Information operation device for display units WO2011061802A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/006234 WO2011061802A1 (en) 2009-11-19 2009-11-19 Information operation device for display units


Publications (1)

Publication Number Publication Date
WO2011061802A1 true WO2011061802A1 (en) 2011-05-26

Family

ID=44059304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/006234 WO2011061802A1 (en) 2009-11-19 2009-11-19 Information operation device for display units

Country Status (1)

Country Link
WO (1) WO2011061802A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06332439A (en) * 1993-05-26 1994-12-02 Hitachi Ltd Display device for multi screen
JPH0764522A (en) * 1993-08-30 1995-03-10 Hitachi Ltd Automatic adjusting system for multi-display device
JPH07333760A (en) * 1994-06-15 1995-12-22 Hitachi Ltd Automatic adjusting system
JP2003524915A (en) * 1998-09-23 2003-08-19 ハネウェル・インコーポレーテッド Method and apparatus for calibrating a tiled display
JP2004219869A (en) * 2003-01-17 2004-08-05 Toshiba Lighting & Technology Corp Regulating device for multi-display, multi-display system, and multi-display video system
JP2005266042A (en) * 2004-03-17 2005-09-29 Seiko Epson Corp Geometric correction method of image and correction apparatus
JP2008232837A (en) * 2007-03-20 2008-10-02 Seiko Epson Corp Method and system of defective enhancement, and detection of defect, and program
WO2008149449A1 (en) * 2007-06-07 2008-12-11 Telesystems Co., Ltd. Multi-display device
JP2009267966A (en) * 2008-04-28 2009-11-12 Canon Inc Image correction processing device, method for controlling the same, and program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09851424

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09851424

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP