JP2006127238A - Method for detecting center position of pixel of imaging device - Google Patents


Info

Publication number
JP2006127238A
Authority
JP
Japan
Prior art keywords
element chip
sides
center
image
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2004316131A
Other languages
Japanese (ja)
Inventor
Takeshi Noda
武司 野田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to JP2004316131A priority Critical patent/JP2006127238A/en
Publication of JP2006127238A publication Critical patent/JP2006127238A/en
Withdrawn legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Input (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a method for detecting the pixel center position of an imaging device that can accurately detect the center position of its effective pixels.

SOLUTION: The method comprises: a step (S1) of image-recognizing the four chip sides of an element chip; a step (S2) of interpolating between pixels of the grayscale image near the four chip sides; a step (S3) of determining a plurality of edge positions on each of the four chip sides in subpixel units; a step (S4) of determining the equations of the straight lines of the four chip sides in image coordinates passing through the edge positions; a step (S5) of obtaining the coordinates of the intersection of two orthogonal sides and of the intersection of the opposite two sides; a step (S6) of recognizing the center between the two intersections as the center of the element chip; and a step (S7) of offsetting the chip center by the amount obtained by multiplying the difference between the chip center position and the effective-pixel-area center position on the chip by the pre-measured camera magnification, and recognizing the offset position as the pixel center position of the imaging device.

COPYRIGHT: (C)2006, JPO&NCIPI

Description

The present invention relates to a method for detecting the pixel center position of an image sensor that can precisely locate the pixel center of the sensor. In particular, it relates to a detection method that makes it possible to assemble an imaging module in which the pixel center of an increasingly miniaturized image sensor is precisely aligned with the optical-axis center of an optical member.

Conventionally, the pixel center of an image sensor is detected either by marking a reference point on the sensor package and recognizing its position, or by recognizing the outline of the package in an image. In recent imaging modules, however, which are small, high-precision parts that demand accurate positioning, there is no space for a reference mark, and with outline-based image recognition the positioning accuracy depends on the positional accuracy between the outline and the effective pixel area, so high-precision positioning cannot be achieved.

To solve this problem, a method has been proposed that detects the edges of the effective pixels of the image sensor and aligns the optical axis by bringing the center of the mounting holder into agreement with the center of the effective pixel frame (see, for example, Patent Document 1). In the optical-axis alignment method of Patent Document 1, as shown in FIG. 12, a camera (not shown) installed in the apparatus scans the image sensor 100 a plurality of times in the y direction 113 to detect the edge of the effective pixel area in the x direction 112, thereby obtaining the line edge 101 of the effective pixel area.

Next, in the same manner, the edge of the effective pixel area in the y direction 113 is detected by scanning a plurality of times in the x direction 112 to obtain the line edge 103 of the effective pixel area. When detecting the line edges 101 and 103, image processing such as edge enhancement is applied beforehand to widen the luminance difference at the gray-level edges and improve the detection accuracy.
The intersection 105 of the line edges 101 and 103 is then obtained as a coordinate value in image processing; the line edges 102 and 104 are obtained by the same operation, and their intersections B 106 and C 107 are obtained likewise. The intersection 110, that is, the effective pixel center position, is then determined from the center line 114 through the midpoint 108 between the intersections 105 and 106 and the center line 115 through the midpoint 109 between the intersections 106 and 107.
[Patent Document 1] JP-A-5-316400

However, the edge of the effective pixel area of an image sensor often shows no clear contrast against the surrounding non-effective area. For such image sensors, even the optical-axis alignment method of Patent Document 1 cannot detect the line edges accurately, so the effective pixel center position cannot be obtained with high accuracy.

The present invention has been made to solve the above problem, and its object is to provide a method for detecting the pixel center position of an image sensor that can precisely detect the effective pixel center position even for an image sensor whose effective pixel edges are not clearly visible.

To achieve the above object, the present invention provides the following means.
The method of the present invention for detecting the pixel center position of an image sensor comprises: a step of image-recognizing the four chip sides of a rectangular element chip provided on a positioned image sensor; a step of interpolating between pixels of the grayscale image in the vicinity of the four chip sides; a step of determining, from the interpolated image data, a plurality of edge positions on each of the four chip sides in subpixel units; a step of determining the equations of the straight lines of the four chip sides in image coordinates passing through the plurality of edge positions; a step of obtaining, from the four line equations, the coordinates of the intersection of two orthogonal sides and of the intersection of the opposite two sides; a step of recognizing the center between these two intersections as the center of the element chip; and a step of offsetting the chip center by the amount obtained by multiplying the difference between the chip center position and the effective-pixel-area center position on the chip by the pre-measured camera magnification, and recognizing the offset position as the pixel center position of the image sensor.
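The sequence of steps above can be sketched in Python for illustration. Everything in this sketch is an assumption rather than part of the claim: the function and parameter names are invented, the roughly horizontal sides are modeled as y = a·x + b and the roughly vertical sides as x = c·y + d, and the subpixel edge points are taken as given.

```python
def fit_y_on_x(points):
    # Least-squares fit y = a*x + b through the subpixel edge points
    # of a roughly horizontal chip side.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def fit_x_on_y(points):
    # Same fit with x and y swapped, for a roughly vertical side: x = c*y + d.
    return fit_y_on_x([(y, x) for x, y in points])

def intersect(h, v):
    # Corner of a horizontal side (y = a*x + b) and a vertical side (x = c*y + d).
    a, b = h
    c, d = v
    x = (c * b + d) / (1.0 - a * c)
    return x, a * x + b

def detect_pixel_center(top, right, bottom, left, offset_mm, mag_px_per_mm):
    # top/bottom: edge points of two opposite sides; left/right: the other pair.
    k1 = intersect(fit_y_on_x(top), fit_x_on_y(right))      # one corner
    k3 = intersect(fit_y_on_x(bottom), fit_x_on_y(left))    # opposite corner
    cx, cy = (k1[0] + k3[0]) / 2.0, (k1[1] + k3[1]) / 2.0   # chip center
    zx = offset_mm[0] * mag_px_per_mm[0]                    # Zx [pixel]
    zy = offset_mm[1] * mag_px_per_mm[1]                    # Zy [pixel]
    return cx + zx, cy + zy                                 # (Cx+Zx, Cy+Zy)
```

Modeling opposite side pairs with swapped axes keeps the fits well-conditioned even when a side is nearly vertical in image coordinates.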

According to this method, the chip sides are first image-recognized; the coordinates of the intersection of two orthogonal chip sides and of the intersection of the opposite two sides are obtained, and their midpoint is determined as the chip center. The chip center is then offset by the amount obtained by multiplying the difference between the chip center position and the effective-pixel-area center position by the pre-measured camera magnification, and the offset position is recognized as the effective pixel center. Since the effective pixel center is thus obtained without recognizing the edges of the effective pixel area, as the conventional method does, the center position of the effective pixel area can be determined accurately even for an image sensor whose effective pixel edges are not clearly visible.

Another method of the present invention for detecting the pixel center position of an image sensor comprises: a step of image-recognizing two or three chip sides of a rectangular element chip provided on a positioned image sensor; a step of interpolating between pixels of the grayscale image in the vicinity of the two or three chip sides; a step of determining, from the interpolated image data, a plurality of edge positions on each of the two or three chip sides in subpixel units; a step of determining the equations of the straight lines of the two or three chip sides in image coordinates passing through the plurality of edge positions; a step of obtaining, from these line equations, the single intersection in image coordinates of two orthogonal sides; a step of determining, starting from this intersection, the image-coordinate position of one vertex of the element chip on the line equation of one of the two orthogonal sides, from the side length and the pre-measured camera magnification; a step of determining, on the line equation of the other of the two orthogonal sides, the image-coordinate position of the vertex opposite that vertex, from the side length and the pre-measured camera magnification; a step of recognizing the midpoint of the image coordinates of these two vertices as the center of the element chip; and a step of offsetting the chip center by the amount obtained by multiplying the difference between the chip center position and the effective-pixel-area center position on the chip by the pre-measured camera magnification, and recognizing the offset position as the pixel center position of the image sensor.

According to this method, when one or two of the chip sides are not clearly visible, two or three clearly visible chip sides are image-recognized, and on the two recognized sides the image-coordinate positions of two opposite vertices of the element chip are determined from the side lengths and the pre-measured camera magnification. The midpoint of these two vertices is taken as the chip center; the chip center is then offset by the amount obtained by multiplying the difference between the chip center position and the effective-pixel-area center position by the pre-measured camera magnification, and the offset position is recognized as the effective pixel center. The center position of the effective pixel area can therefore be determined accurately even when one or two chip sides are not clearly visible.

The present invention has the following effects.
According to the method of the present invention, the chip sides of the image sensor are image-recognized to determine the chip center; the chip center is then offset, and the offset position is recognized as the effective pixel center. Consequently, even for an image sensor whose effective pixel edges are not clearly visible, the effective pixel center can be detected precisely by image-recognizing the chip sides, enabling high-precision assembly of imaging modules.

A first embodiment of the present invention will now be described with reference to FIGS. 1 to 7.
The method of detecting the pixel center position of the image sensor 1 according to this embodiment is explained using the flowchart of FIG. 1. The image sensor 1 comprises a chip frame 3 in which a recess 3a is formed and an element chip 4 disposed in the recess 3a; the recess 3a and the element chip 4 are approximately the same size.
First, the positioned image sensor 1 and element chip 4 are image-recognized by a position-recognition camera (not shown) installed in the apparatus, giving the image of the image sensor 1 and element chip 4 shown in FIG. 2. The recognized image of the image sensor 1 is captured on the image coordinates 2 of the position-recognition camera as grayscale pixel data, for example a VGA screen from (0, 0) to (640, 480) (step S1).

Next, the grayscale image data near the edge portions (edge positions, described later) of the four chip sides 10a, 10b, 10c and 10d, which form the inside of the chip frame 3 of the image sensor 1, is divided into sixteen regions by three dividing lines X1, X2, X3 and three dividing lines Y1, Y2, Y3. On each dividing line X1, X2, X3, Y1, Y2, Y3, the data between pixels is interpolated in subpixel units by spline interpolation or quadratic-curve approximation. Here the chip sides 10a, 10b, 10c and 10d are the edges of the element chip 4; the area they enclose contains the effective pixel area 4a on the element chip 4 as well as the scanning area 4b, which is not part of the effective pixel area 4a. The chip frame 3 surrounds the scanning area 4b.

The interpolation of the grayscale data near the edge portion 12a at the X1 position of side 10a is now described in detail as an example, with reference to FIGS. 3, 4 and 5. An edge portion here means an intersection of a chip side 10a, 10b, 10c or 10d with a dividing line X1, X2, X3, Y1, Y2 or Y3.
First, as shown in FIG. 3, the grayscale image data in the Y direction near the edge portion 12a is captured. Plotting the Y-direction address on the horizontal axis and the gray level on the vertical axis gives the scatter diagram of FIG. 4. At this stage, with an edge-detection threshold of about 40, the Y-direction address of the edge is 14. Next, the gray levels between pixels are interpolated from the per-pixel data in 1/10-pixel units, that is, in subpixel units, by quadratic-curve or spline interpolation, turning the scatter diagram of FIG. 4 into the smooth curve of FIG. 5.
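The 1/10-pixel interpolation described here can be sketched as follows. This is an illustrative stand-in, not the patent's implementation: it uses a Lagrange quadratic through the three nearest samples (one of the two schemes the text mentions), and the function names are invented.

```python
def quadratic_interp(samples, t):
    # Evaluate the gray level at fractional address t by fitting a
    # quadratic (Lagrange) through the three nearest integer samples.
    i = min(max(int(round(t)), 1), len(samples) - 2)
    y0, y1, y2 = samples[i - 1], samples[i], samples[i + 1]
    u = t - i
    # Lagrange quadratic through (-1, y0), (0, y1), (1, y2)
    return (y0 * u * (u - 1) / 2.0
            + y1 * (1 - u * u)
            + y2 * u * (u + 1) / 2.0)

def densify(samples, step=0.1):
    # Resample the whole gray-level profile at 1/10-pixel spacing,
    # turning the scatter of FIG. 4 into a smooth curve as in FIG. 5.
    n = round((len(samples) - 1) / step)
    return [(k * step, quadratic_interp(samples, k * step)) for k in range(n + 1)]
```

A cubic spline would give a smoother curve across sample boundaries; the quadratic is used here only to keep the sketch self-contained.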

The same approximation and interpolation are then applied to the edge portion 12b at X2 and the edge portion 12c at X3 of side 10a; the edge portions 12d, 12e and 12f at Y1, Y2 and Y3 of side 10b; the edge portions 12g, 12h and 12i at X1, X2 and X3 of side 10c; and the edge portions 12j, 12k and 12l at Y1, Y2 and Y3 of side 10d (step S2).
Although each chip side 10a, 10b, 10c, 10d is divided into three here, it may instead be divided into two, or into four or more.

Next, based on the approximated and interpolated grayscale data, the edge coordinates at which the data crosses a predetermined threshold are obtained for the edge portions 12a, 12b, 12c, 12d, 12e, 12f, 12g, 12h, 12i, 12j, 12k and 12l of the chip sides 10a, 10b, 10c and 10d. For example, with a threshold of 40 near the edge portion 12a, the Y coordinate of the edge portion 12a on side 10a is obtained with high subpixel accuracy, for example 13.2. Similarly, the Y coordinates of the edge portions 12b, 12c, 12g, 12h and 12i and the X coordinates of the edge portions 12d, 12e, 12f, 12j, 12k and 12l are obtained in subpixel units (step S3).
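A minimal sketch of this threshold-crossing step (illustrative, not the patent's code): given the densified profile as (address, gray) pairs, linear interpolation between the two samples that bracket the threshold yields the subpixel edge coordinate.

```python
def subpixel_edge(profile, threshold):
    # profile: list of (address, gray) pairs at sub-pixel spacing.
    # Returns the address where the interpolated gray profile crosses
    # the threshold, or None if it never does.
    for (t0, g0), (t1, g1) in zip(profile, profile[1:]):
        if (g0 - threshold) * (g1 - threshold) <= 0 and g0 != g1:
            return t0 + (threshold - g0) * (t1 - t0) / (g1 - g0)
    return None
```

With a profile rising through 40 between addresses 13.1 and 13.2, this returns approximately 13.2, matching the example in the text.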

Next, from the subpixel edge coordinates, the equations of the straight lines of the chip sides 10a, 10b, 10c and 10d on the image coordinates 2 are obtained. As shown in FIG. 6, the Y coordinate of the detected edge portion 12a at X1 on side 10a is 13.2, that of the edge portion 12b at X2 is 13.5, and that of the edge portion 12c at X3 is 13.8; with X1 = 100, X2 = 320 and X3 = 540, the line equation of side 10a on the image coordinates 2 is y = 0.00136x + 13.064. This is an approximation of the straight line through the edge portions 12a, 12b and 12c. Similarly, the line equations of the chip sides 10b, 10c and 10d on the image coordinates 2 are calculated from the coordinates of the detected edge portions 12d, 12e, 12f, 12g, 12h, 12i, 12j, 12k and 12l (step S4).
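The numbers in this paragraph can be checked with an ordinary least-squares fit; since the three edge points happen to be collinear here, the fit reproduces the stated line exactly. A minimal sketch (the fitting method is an assumption; the patent only states the resulting equation):

```python
def fit_side_line(points):
    # Least-squares fit of y = a*x + b through the subpixel edge
    # positions of one chip side.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Edge points of side 10a from the text: (X1, X2, X3) = (100, 320, 540)
a, b = fit_side_line([(100, 13.2), (320, 13.5), (540, 13.8)])
# a ≈ 0.00136, b ≈ 13.064, i.e. y = 0.00136x + 13.064
```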

Next, the intersection 13a of side 10a with the orthogonal side 10d is calculated arithmetically from the two line equations, and likewise the intersection 13c of side 10b with the orthogonal side 10c (step S5). In this embodiment the intersection 13a of sides 10a and 10d and the intersection 13c of sides 10b and 10c are obtained, but the intersection 13b of sides 10a and 10b and the intersection 13d of sides 10c and 10d may be obtained instead.

The midpoint coordinates 5 (Cx, Cy) of the intersections 13a and 13c, or of the intersections 13b and 13d, are then calculated. With the coordinates of the intersection 13a as (Kx1, Ky1), those of 13b as (Kx2, Ky2), those of 13c as (Kx3, Ky3) and those of 13d as (Kx4, Ky4): Cx = (Kx1 + Kx3)/2 or Cx = (Kx2 + Kx4)/2, and Cy = (Ky1 + Ky3)/2 or Cy = (Ky2 + Ky4)/2 (step S6).
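The intersection of step S5 and the midpoint of step S6 amount to solving a 2×2 linear system and averaging. A minimal sketch follows; representing each side in the general form a·x + b·y = c is an illustrative choice (not stated in the patent) so that near-vertical sides cause no numerical trouble.

```python
def line_intersection(l1, l2):
    # Each line is (a, b, c) with a*x + b*y = c; solve by Cramer's rule.
    # Returns None for (near-)parallel lines.
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def chip_center(corner1, corner2):
    # Midpoint of two opposite corners = chip center (Cx, Cy).
    return ((corner1[0] + corner2[0]) / 2.0,
            (corner1[1] + corner2[1]) / 2.0)

# Example: side 10a (y = 0.00136*x + 13.064) meets a vertical side x = 600
corner = line_intersection((-0.00136, 1.0, 13.064), (1.0, 0.0, 600.0))
# corner ≈ (600.0, 13.88)
```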

After the midpoint coordinates 5 (Cx, Cy) are calculated, the X-direction deviation 7 and Y-direction deviation 8 between the midpoint (center) 5 of the element chip 4 and the effective-pixel-area center position (effective pixel center) 6 on the drawing are obtained, as shown in FIG. 7. From these deviations and the pre-measured camera magnification, the offset between the actual center position (pixel center position) of the image sensor 1 and the effective-pixel-area center position 6 is calculated. With Cbx [pixel/mm] the magnification of the camera image in the X direction, Cby [pixel/mm] the magnification in the Y direction, Zx [pixel] the X-direction deviation 7 and Zy [pixel] the Y-direction deviation 8 between the center position of the image sensor 1 and the effective-pixel-area center position 6:
Zx [pixel] = (X-direction deviation 7 on the drawing) [mm] × Cbx [pixel/mm]
Zy [pixel] = (Y-direction deviation 8 on the drawing) [mm] × Cby [pixel/mm]
Next, the coordinates (Cx + Zx, Cy + Zy), obtained by offsetting the detected center coordinates 5 (Cx, Cy) of the element chip 4 by the deviation (Zx, Zy) between the center position of the image sensor 1 and the effective-pixel-area center position 6, are obtained as the actual effective pixel center position (step S7).
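The step-S7 arithmetic is just the two unit conversions above plus the shift. A minimal numeric sketch (the example offsets and magnifications are made-up values, not from the patent):

```python
def effective_pixel_center(chip_center_px, design_offset_mm, mag_px_per_mm):
    # chip_center_px:   detected chip center (Cx, Cy) [pixel]
    # design_offset_mm: drawing offset between chip center and
    #                   effective-pixel center (deviations 7 and 8) [mm]
    # mag_px_per_mm:    pre-measured camera magnification (Cbx, Cby)
    cx, cy = chip_center_px
    dx_mm, dy_mm = design_offset_mm
    cbx, cby = mag_px_per_mm
    zx = dx_mm * cbx   # Zx [pixel] = deviation_x [mm] * Cbx [pixel/mm]
    zy = dy_mm * cby   # Zy [pixel] = deviation_y [mm] * Cby [pixel/mm]
    return cx + zx, cy + zy
```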

According to the detection method of this embodiment, the chip sides 10a, 10b, 10c and 10d are image-recognized; the image coordinates of the opposite intersections 13a and 13c (or 13b and 13d) of the recognized element chip 4 are obtained, and their midpoint is determined as the center of the element chip 4. The difference on the drawing between the center coordinates 5 of the element chip 4 and the effective-pixel-area center position 6 is then multiplied by the pre-measured camera magnification, and the position offset from the center coordinates by the resulting amount is recognized as the effective pixel center. Therefore, even when the effective pixel image is unclear, the actual effective pixel center position (Cx + Zx, Cy + Zy) of the image sensor 1 can be determined with high accuracy.

A second embodiment of the present invention will now be described with reference to FIGS. 8 to 10. In the embodiments described below, parts in common with the image sensor 1 and the detection method of the first embodiment are not described again.
The method of detecting the pixel center position of the image sensor according to this embodiment is explained using the flowchart of FIG. 8.
This embodiment differs from the first in the following steps: a step of image-recognizing two orthogonal chip sides 10a and 10b of the element chip 4 (step S11); a step of obtaining the coordinates of two opposite vertices of the element chip from the line equations of the two chip sides 10a and 10b, the lengths of the two sides on the drawing, and the pre-measured camera magnification (step S15); and a step of recognizing the midpoint of the coordinates of the two opposite vertices 13a and 13c as the center coordinates (Cx, Cy) of the element chip 4 (step S16).

In step S11, only one of the chip sides 10a, 10b, 10c, 10d of the element chip 4 (side 10a in the example of FIG. 9(a)) and one side orthogonal to it (side 10b in FIG. 9(a)) are image-recognized. On the two recognized sides, the grayscale image is interpolated between pixels by spline interpolation near three points on each of the sides 10a and 10b (near the intersections 13a, 13b and 13c in the example of FIG. 9(a)) (step S12). Next, from the interpolated data, the edge portions of the sides 10a and 10b are detected in subpixel units. Then, from the subpixel edge information of the sides 10a and 10b, the line equations of the two chip sides 10a and 10b on the image coordinates 2 are calculated (step S14).

In step S15, two points are obtained that satisfy the equations, found in step S14, of the straight lines of the element chip sides 10a and 10b in the image coordinates 2 and that, as shown in FIG. 9(a), lie at distances from the intersection 13b of sides 10a and 10b equal to the drawing lengths of sides 10a and 10b multiplied by the pre-measured camera magnification. That is, the coordinates in the image coordinates 2 of the two opposing vertices 13a and 13c of the element chip 4 are obtained.
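Step S15 amounts to walking from the detected corner along each side's direction by the physical side length converted to pixels. A minimal sketch under assumptions (`vertex_from_corner` is a hypothetical name; the magnification is taken as a pixels-per-drawing-unit scale):

```python
import math

def vertex_from_corner(corner, direction, side_len_mm, magnification):
    """Step S15 (sketch): starting from the detected intersection
    (e.g. 13b), walk along one side's direction vector by the drawing
    length of that side multiplied by the camera magnification,
    yielding the side's far vertex in image coordinates."""
    dx, dy = direction
    norm = math.hypot(dx, dy)              # normalise the direction
    dist_px = side_len_mm * magnification  # drawing length -> pixels
    return (corner[0] + dist_px * dx / norm,
            corner[1] + dist_px * dy / norm)
```

With a corner at (100, 100), a horizontal side direction (1, 0), a 5 mm side, and a magnification of 10 px/mm, this gives the vertex (150.0, 100.0).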

Then, in step S16, the midpoint of the coordinates of the two vertices 13a and 13c obtained in step S15 is recognized as the center coordinates (center) 5 of the element chip 4. Thereafter, as in the first embodiment, the coordinates (Cx+Zx, Cy+Zy), offset by the amount of deviation (Zx, Zy) between the actual pixel center position of the image sensor 1 and the effective pixel portion center position 6, are obtained as the actual effective pixel center position (step S17).
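Steps S16 and S17 reduce to a midpoint followed by a scaled offset. A minimal sketch under the same assumptions (hypothetical name; the drawing offset (Zx, Zy) is given in drawing units and scaled by the camera magnification, as the text describes):

```python
def effective_pixel_center(v1, v2, drawing_offset, magnification):
    """Steps S16-S17 (sketch): the midpoint of the two opposing
    vertices is the chip center (Cx, Cy); shifting it by the drawing
    offset between chip center and effective-pixel center, scaled by
    the camera magnification, gives (Cx + Zx, Cy + Zy)."""
    cx = (v1[0] + v2[0]) / 2.0
    cy = (v1[1] + v2[1]) / 2.0
    zx = drawing_offset[0] * magnification
    zy = drawing_offset[1] * magnification
    return cx + zx, cy + zy
```

For vertices (0, 0) and (10, 10), a drawing offset of (0.1, -0.2), and a magnification of 10, the result is (6.0, 3.0).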

According to the method of this embodiment for detecting the pixel center position of an image sensor, when one or two of the element chip sides are not clearly visible, the two clearly visible sides 10a and 10b are recognized in the image, and the image coordinate positions of the two opposing vertices 13a and 13c of the element chip 4 are determined on the two recognized sides from the length of each side and the pre-measured camera magnification. The midpoint coordinates 5 (Cx, Cy) are obtained from these two vertices; the center coordinates 5 are then offset by the amount obtained by multiplying the drawing-position difference between the chip center 5 and the effective pixel portion center position 6 by the pre-measured camera magnification, and this position is recognized as the effective pixel center position (Cx+Zx, Cy+Zy). Therefore, even when one or two of the element chip sides are not clearly visible, the center position of the effective pixel portion 4a can be determined accurately.

In this embodiment, the two sides 10a and 10b are used as shown in FIG. 9(a), but the invention is not limited to this. Besides sides 10a and 10b, three other combinations are possible, as shown in FIGS. 9(b) and 9(c): sides 10b and 10c, sides 10c and 10d, and sides 10d and 10a. Of these, the pair with no occlusion of the element chip sides and the best image quality should be selected.

In the above, the coordinates of the vertices 13a and 13c of the element chip 4 were obtained from the two orthogonal element chip sides 10a and 10b; however, as shown in FIG. 10, three element chip sides 10a, 10b, and 10c may be recognized in the image to obtain the vertex 13b of the element chip 4. Then, a point that satisfies the equations, found in step S14, of the straight lines of the element chip sides 10a, 10b, and 10c in the image coordinates 2 and that lies at a distance from the obtained vertex equal to the drawing length of the corresponding element chip side multiplied by the pre-measured camera magnification is recognized as the remaining vertex 13d of the element chip 4. By then performing step S16 onward, obtaining the center of the effective pixels from the three element chip sides 10a, 10b, and 10c gives higher accuracy than obtaining it from the two orthogonal element chip sides 10a and 10b alone.
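Both the two-side and three-side variants rely on intersecting two fitted side lines in image coordinates. A minimal sketch under assumptions (the patent does not specify a line representation; the general form a*x + b*y = c is used here because, unlike slope-intercept form, it also handles vertical sides):

```python
def intersect(line1, line2):
    """Intersection of two fitted chip-side lines, each given in the
    general form a*x + b*y = c, solved by Cramer's rule."""
    a1, b1, c1 = line1
    a2, b2, c2 = line2
    det = a1 * b2 - a2 * b1
    if det == 0.0:
        raise ValueError("sides are parallel; no single intersection")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

For example, the vertical line x = 2 (given as (1, 0, 2)) and the horizontal line y = 3 (given as (0, 1, 3)) intersect at (2.0, 3.0).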

The technical scope of the present invention is not limited to the above embodiments, and various modifications can be made without departing from the spirit of the invention.
For example, in each of the above embodiments, as shown in FIG. 11, the imaging element 20 may have a transparent cover glass 21 provided on the end surface 3b of the chip frame 3 so as to cover the upper part of the element chip 4. Providing the transparent cover glass 21 in this way makes it possible to protect the element chip 4 arranged in the recess 3a.

FIG. 1 is a flowchart showing a method for detecting the pixel center position of an image sensor according to the first embodiment of the present invention.
FIG. 2 is a plan view showing the image sensor according to the first embodiment of the present invention.
FIG. 3 shows the relationship between the image coordinates and the grayscale data detected by the detection method of the first embodiment.
FIG. 4 is a graph showing the relationship between the image coordinates and the grayscale data detected by the detection method of the first embodiment.
FIG. 5 is a graph showing the relationship between the image coordinates and the grayscale data obtained by interpolating FIG. 3 in sub-pixel units with quadratic-curve or spline interpolation.
FIG. 6 is a plan view showing the detection method of the pixel center position according to the first embodiment.
FIG. 7 is a plan view showing the detection method of the pixel center position according to the first embodiment.
FIG. 8 is a flowchart showing a method for detecting the pixel center position of an image sensor according to the second embodiment of the present invention.
FIG. 9 is a schematic view showing the detection method of the pixel center position according to the second embodiment.
FIG. 10 is a schematic view showing another modification of the detection method according to the second embodiment.
FIG. 11 is a cross-sectional view of a relevant part showing a modification of the image sensor according to the first and second embodiments.
FIG. 12 is a plan view showing the pixel center position of an image sensor detected by a conventional detection method.

Explanation of Symbols

1 Image sensor
2 Image coordinates
4a Effective pixel portion
5 Center of the element chip
6 Effective pixel portion center position (effective pixel center)
12a, 12b, 12c, 12d, 12e, 12f, 12g, 12h, 12i, 12j, 12k, 12l Edge portions
13a, 13b, 13c, 13d Intersections







Claims (2)

A method for detecting the pixel center position of an image sensor, comprising:
a step of recognizing, in an image, the four element chip sides of a rectangular element chip provided on a positioned image sensor;
a step of interpolating between the pixels of the grayscale image near the four element chip sides;
a step of determining, from the interpolated image data, a plurality of edge positions on the element chip sides for all four sides in sub-pixel units;
a step of deriving the equations of the straight lines, in image coordinates, of the four element chip sides passing through the plurality of edge positions;
a step of obtaining, from the linear equations of the four sides, the coordinates of the intersection of two orthogonal sides and of the intersection of the other two opposing sides;
a step of recognizing the center of the coordinates of the two intersections as the center of the element chip; and
a step of offsetting the center of the element chip by the amount obtained by multiplying the positional difference between the center of the element chip and the center of the effective pixel portion on the element chip by a pre-measured camera magnification, and recognizing the offset position as the pixel center position of the image sensor.
A method for detecting the pixel center position of an image sensor, comprising:
a step of recognizing, in an image, two or three element chip sides of a rectangular element chip provided on a positioned image sensor;
a step of interpolating between the pixels of the grayscale image near the two or three element chip sides;
a step of determining, from the interpolated image data, a plurality of edge positions on the element chip sides for the two or three sides in sub-pixel units;
a step of deriving the equations of the straight lines, in image coordinates, of the two or three element chip sides passing through the plurality of edge positions;
a step of obtaining, from the linear equations of the two or three element chip sides, the single intersection in image coordinates of two orthogonal sides;
a step of determining, starting from the intersection, the image coordinate position of one vertex of the element chip on the linear equation of one of the two orthogonal sides, from the length of that side and a pre-measured camera magnification;
a step of determining the image coordinate position of the other vertex opposing that vertex on the linear equation of the other of the two orthogonal sides, from the length of that side and the pre-measured camera magnification;
a step of recognizing the midpoint coordinates of the image coordinate positions of the two vertices as the center of the element chip; and
a step of offsetting the center of the element chip by the amount obtained by multiplying the positional difference between the center of the element chip and the center of the effective pixel portion on the element chip by a pre-measured camera magnification, and recognizing the offset position as the pixel center position of the image sensor.
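The four-side procedure of the first claim can be sketched end to end as follows. This is an illustrative summary under assumptions, not the patented implementation: the side lines are taken as already fitted (in the general form a*x + b*y = c, an assumed representation), and the function name is hypothetical.

```python
def chip_center_from_four_sides(side_lines, drawing_offset, magnification):
    """Claim 1 (sketch): intersect one orthogonal pair of fitted chip
    sides and the opposing pair, take the center of the two
    intersections as the chip center, then offset it by the drawing
    difference to the effective-pixel center times the magnification.
    side_lines = (top, right, bottom, left), each (a, b, c)."""
    def intersect(l1, l2):
        a1, b1, c1 = l1
        a2, b2, c2 = l2
        det = a1 * b2 - a2 * b1
        if det == 0.0:
            raise ValueError("parallel sides")
        return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

    top, right, bottom, left = side_lines
    p1 = intersect(top, right)    # corner of one orthogonal pair
    p2 = intersect(bottom, left)  # corner of the opposing pair
    cx = (p1[0] + p2[0]) / 2.0
    cy = (p1[1] + p2[1]) / 2.0
    return (cx + drawing_offset[0] * magnification,
            cy + drawing_offset[1] * magnification)
```

For an axis-aligned 10x10 chip (y = 0, x = 10, y = 10, x = 0), the two corners are (10, 0) and (0, 10), whose center (5, 5) is then offset as in the claim.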
JP2004316131A 2004-10-29 2004-10-29 Method for detecting center position of pixel of imaging device Withdrawn JP2006127238A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004316131A JP2006127238A (en) 2004-10-29 2004-10-29 Method for detecting center position of pixel of imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004316131A JP2006127238A (en) 2004-10-29 2004-10-29 Method for detecting center position of pixel of imaging device

Publications (1)

Publication Number Publication Date
JP2006127238A true JP2006127238A (en) 2006-05-18

Family

ID=36721924

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004316131A Withdrawn JP2006127238A (en) 2004-10-29 2004-10-29 Method for detecting center position of pixel of imaging device

Country Status (1)

Country Link
JP (1) JP2006127238A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114897773A (en) * 2022-03-31 2022-08-12 海门王巢家具制造有限公司 Distorted wood detection method and system based on image processing
CN114897773B (en) * 2022-03-31 2024-01-05 上海途巽通讯科技有限公司 Method and system for detecting distorted wood based on image processing

Similar Documents

Publication Publication Date Title
EP1555507B1 (en) Three-dimensional visual sensor
KR101801217B1 (en) Two-dimensional code
JP3425366B2 (en) Image correction device
CN111263142B (en) Method, device, equipment and medium for testing optical anti-shake of camera module
CN109100363B (en) Method and system for distinguishing defects of attached foreign matters from dust
EP1343332A2 (en) Stereoscopic image characteristics examination system
JP2005072888A (en) Image projection method and image projection device
JP2004260785A (en) Projector with distortion correction function
US20130075585A1 (en) Solid imaging device
JP3996617B2 (en) Projector device with image distortion correction function
US11182918B2 (en) Distance measurement device based on phase difference
JP4906128B2 (en) Camera unit inspection apparatus and camera unit inspection method
JP4623657B2 (en) Captured image processing apparatus and captured image processing method for electronic component mounter
JP2014155063A (en) Chart for resolution measurement, resolution measurement method, positional adjustment method for camera module, and camera module manufacturing method
JP5493900B2 (en) Imaging device
JP5240517B2 (en) Car camera calibration system
JP2006127238A (en) Method for detecting center position of pixel of imaging device
CN111369480A (en) Method and device for processing periodic texture
JP4548228B2 (en) Image data creation method
EP3564747A1 (en) Imaging device and imaging method
JP2019113434A (en) Calibrator, calibration method, and calibration program
JPH0979946A (en) Inspection device for display device
WO2014084056A1 (en) Testing device, testing method, testing program, and recording medium
JP3555407B2 (en) Edge detection method and apparatus therefor
JPH10283478A (en) Method for extracting feature and and device for recognizing object using the same method

Legal Events

Date Code Title Description
A300 Application deemed to be withdrawn because no request for examination was validly filed

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20080108