WO2011158344A1 - Image processing method, program, image processing device, and imaging device - Google Patents

Image processing method, program, image processing device, and imaging device Download PDF

Info

Publication number
WO2011158344A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual projection
projection plane
coordinate system
image processing
image
Prior art date
Application number
PCT/JP2010/060185
Other languages
French (fr)
Japanese (ja)
Inventor
Shigeyuki Ueda
Hiroki Tsuboi
Original Assignee
Konica Minolta Opto, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Opto, Inc.
Priority to JP2012520203A (JPWO2011158344A1)
Priority to PCT/JP2010/060185 (WO2011158344A1)
Publication of WO2011158344A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/12: Panospheric to cylindrical image transformations

Definitions

  • the present invention relates to an image processing method, a program, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an imaging element via an optical system including a condenser lens.
  • Patent Document 1 discloses a prior-art correction method that uses lens correction parameters to correct the distortion arising in an image captured through a short-focal-length lens.
  • In image processing for displaying data photographed with a wide-angle lens, correction is performed so that the rate of change of the image height increases when the incident angle is at or above a predetermined value, and decreases when it is below that value.
  • Zoom, pan, tilt, and similar operations are performed in response to a user's instruction to change the shooting area; each time such an operation is added, additional image processing is required.
  • The object of the present invention is to provide an image processing method, a program, an image processing apparatus, and an imaging apparatus capable of performing specific distortion correction processing with a relatively small circuit while reducing processing time.
  • An image processing method for obtaining image data that has been subjected to distortion correction processing, using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system including a lens, the image processing method comprising:
  • The position on the image-sensor surface corresponding to each position on the virtual projection plane is calculated from the incident angle θ with respect to the optical axis of the optical system, and the image data at each position is obtained from the pixel data.
  • An image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, comprising: a setting unit that sets a virtual projection plane in the world coordinate system, the plane consisting of a curved surface or a surface connecting a plurality of planes; and an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
  • the shape of the virtual projection surface is a cylindrical shape having a circular arc cross section or a polygonal mirror shape including a plurality of continuous panels.
  • The image processing apparatus according to any one of items 6 to 8, wherein the image processing unit calculates the position of the corresponding pixel on the image-sensor surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data at each position from the pixel data.
  • A program for an image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing a computer to function as: a setting unit that sets the position and size of a virtual projection plane in the world coordinate system, the plane consisting of a curved surface or a surface connecting a plurality of planes; and an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
  • A program for an image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing the computer to function as: a setting unit capable of setting the shape, size, and arrangement position of a virtual projection plane in the world coordinate system, the plane consisting of a curved surface or a surface connecting a plurality of planes; and an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
  • An image sensor having a plurality of pixels;
  • a setting unit capable of setting a shape, a size, and an arrangement position of a virtual projection plane, which is a virtual projection plane of a world coordinate system, which is a curved surface or a plane connecting a plurality of planes;
  • An image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the converted camera-coordinate-system coordinates and the plurality of pixel data obtained by the image sensor receiving light;
  • An imaging device comprising:
  • the imaging apparatus according to 12, wherein the shape of the virtual projection surface is a cylindrical shape having a circular arc cross section or a polygonal mirror shape including a plurality of continuous panels.
  • The imaging apparatus according to item 12 or 13, further having an operation unit operated by a user and a display unit, wherein the setting unit sets the shape, size, and position of the virtual projection plane in the world coordinate system based on an operation of the operation unit, and the display unit displays image data of the set virtual projection plane.
  • The imaging apparatus according to any one of items 12 to 15, wherein the image processing unit calculates the position of the corresponding pixel on the image-sensor surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data at each position from the pixel data.
  • The virtual projection plane set in the world coordinate system is converted into the camera coordinate system using the distortion correction coefficient, and the image data of the virtual projection plane is calculated based on the converted camera-coordinate-system coordinates and the pixel data of the image sensor.
  • the virtual projection plane is composed of a curved surface or a plane connecting a plurality of planes, so that cylindrical distortion correction, spherical distortion correction, and trihedral mirror distortion correction can be handled in a single process.
  • FIG. 7 is a diagram illustrating examples of the shape of the virtual projection plane
  • FIG. 7A is an example of a cylindrical shape
  • FIG. 7B is an example of a trihedral mirror shape.
  • An example in which two virtual projection planes VP are set is shown.
  • An example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the rotation center is shown.
  • An example in which the virtual projection plane VP0 is rotated by roll is shown.
  • An example in which the virtual projection plane VP0 is pitch rotated is shown.
  • An example in which the virtual projection plane VP0 is rotated by yaw is shown.
  • This is an example in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the rotation center.
  • An example in which the virtual projection plane VP0 is rotated by a virtual pitch is shown.
  • An example in which the virtual projection plane VP0 is rotated by a virtual yaw is shown.
  • FIG. 25 is an example in which image data calculated using a virtual projection plane corresponding to FIG. 24 is displayed on the display unit 120.
  • FIG. 1 is a schematic diagram for explaining distortion correction according to the present embodiment.
  • X, Y, and Z are the axes of the world coordinate system, and the origin O is the lens center.
  • The Z axis coincides with the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O.
  • Point P is an object point of the object in the world coordinate system XYZ.
  • θ is the incident angle with respect to the optical axis (which coincides with the Z axis).
  • x and y are the axes of the camera coordinate system, and the xy plane corresponds to the image sensor surface IA.
  • o is the center of the image and is the intersection of the optical axis Z and the image sensor surface.
  • The point p is the point on the image-sensor surface in the camera coordinate system obtained by converting the object point P into the camera coordinate system using a distortion correction coefficient based on parameters derived from the lens characteristics (hereinafter referred to as "lens parameters").
  • VP is the virtual projection plane, consisting of a curved surface or a surface connecting a plurality of planes.
  • the virtual projection plane VP is set on the opposite side of the imaging element (and imaging element surface IA) with respect to the lens position (lens center plane LC) of the optical system.
  • the virtual projection plane VP can be changed in shape, size, and position in the world coordinate system based on an instruction from the user to the operation unit 130 (see FIG. 3).
  • Position change is a concept that includes not only translation of the virtual projection plane VP in the XY plane but also a change of its angle with respect to the XY plane (also referred to as an attitude change).
  • “Cylinder distortion correction (or cylinder correction)” is a distortion correction method performed using the following virtual projection plane VP.
  • The shape of the virtual projection plane VP, projected onto the XY plane perpendicular to the optical-axis Z direction (the X axis being horizontal and the Y axis vertical), is a rectangle or square; in the optical-axis Z direction the shape is not constant but changes arbitrarily.
  • “Spherical distortion correction (or spherical correction)” is a distortion correction method performed using the following virtual projection plane VP.
  • The virtual projection plane VP has the shape of a rectangular cutout of the inner surface of a sphere: its projection onto the XY plane perpendicular to the optical axis Z is a rectangle or square, and the shape is not constant in the X-axis, Y-axis, or Z-axis directions but changes arbitrarily. The coordinate unit change in the X-axis and Y-axis directions follows a constant curvature.
  • Trihedral mirror correction (or trihedral mirror distortion correction) is a distortion correction method performed using the following virtual projection plane VP.
  • The shape of the virtual projection plane VP is arbitrary in the X-axis and Y-axis directions (a rectangle or square being particularly preferable); in the optical-axis Z direction, part of the surface is constant (unchanging) and the remainder changes arbitrarily.
  • An example is shown in which a cylindrical shape, corresponding to part of the inner peripheral surface of a cylinder, is used as the virtual projection surface.
  • The shape of the virtual projection plane is not limited to this; it may be the inner peripheral surface of a hemisphere, a spherical crown cut from a sphere by an intersecting plane, or a rectangular cutout of the inner peripheral surface of a sphere.
  • A shape connecting a plurality of surfaces may also be used.
  • a three-sided mirror shape (described later) composed of three continuous panels is preferable.
  • the virtual projection plane VP has a predetermined shape and size, and the center ov of the virtual projection plane VP is located on the Z axis.
  • Gv is the point at which the object point P is projected onto the virtual projection plane VP: the intersection of the straight line passing through the object point P and the lens center O with the virtual projection plane VP.
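The geometry above can be sketched in a few lines of code. This is an illustrative simplification, not the patent's full method: it assumes a planar virtual projection plane perpendicular to the optical axis at Z = z_vp, with the lens center O at the world origin, and the function name and signature are hypothetical.

```python
import numpy as np

def project_to_plane(P, z_vp):
    """Gv: intersection of the ray from the lens center O (origin)
    through object point P with a plane perpendicular to the optical
    axis at Z = z_vp (planar-VP simplification)."""
    P = np.asarray(P, dtype=float)
    t = z_vp / P[2]        # scale factor along the ray O -> P
    return t * P           # Gv = (t*X, t*Y, z_vp)

# Object point at (2, 1, 10) projected onto the plane Z = 5:
Gv = project_to_plane([2.0, 1.0, 10.0], 5.0)
print(Gv)  # [1.  0.5 5. ]
```

For a curved or multi-panel VP the same ray is intersected with the curved surface or with the relevant panel instead of a single plane.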
  • a virtual projection plane VP1 in FIG. 2 shows a state in which the virtual projection plane VP0 is rotated on the XZ plane based on the input of the operation unit 130.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the imaging apparatus.
  • the imaging apparatus includes an imaging unit 110, a control device 100, a display unit 120, and an operation unit 130.
  • the imaging unit 110 includes a short-focus lens, an imaging element, and the like.
  • examples of the lens include a wide-angle lens and a fisheye lens.
  • the control device 100 includes an image processing unit 101, a setting unit 102, and a storage unit 103.
  • the setting unit 102 sets the shape, position, and size of the virtual projection plane VP based on an input instruction to the operation unit 130.
  • The image processing unit 101 creates a conversion table mapping each coordinate on the virtual projection plane to the camera coordinate system based on the set shape, position, and size of the virtual projection plane VP, and uses the conversion table to process the pixel data captured by the imaging unit 110, generating the image data to be displayed on the display unit 120.
  • The storage unit 103 stores the distortion correction coefficient calculated from the lens parameters, as well as the position and size of the virtual projection plane VP and the created conversion table.
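The conversion-table remap can be sketched as follows. This is a hypothetical minimal implementation: the table layout (one (x, y) sensor coordinate per VP pixel) and the nearest-neighbour lookup are assumptions for illustration; the actual flow described later interpolates surrounding sensor pixels instead.

```python
import numpy as np

def apply_conversion_table(table, sensor):
    """Remap sensor pixel data through a precomputed conversion table.
    table[v, u] holds the (x, y) sensor coordinate corresponding to
    virtual-projection-plane pixel (u, v). Nearest-neighbour lookup
    for brevity; out-of-range references are left at zero."""
    h, w = table.shape[:2]
    out = np.zeros((h, w), dtype=sensor.dtype)
    for v in range(h):
        for u in range(w):
            x, y = table[v, u]
            xi, yi = int(round(x)), int(round(y))
            if 0 <= yi < sensor.shape[0] and 0 <= xi < sensor.shape[1]:
                out[v, u] = sensor[yi, xi]
    return out

sensor = np.arange(16.0).reshape(4, 4)
table = np.array([[[0, 0], [3, 0]],
                  [[0, 3], [3, 3]]], dtype=float)
out = apply_conversion_table(table, sensor)  # picks the 4 corner pixels
```

Precomputing the table once per VP setting is what lets pan, tilt, zoom, and distortion correction collapse into a single per-frame remap.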
  • the display unit 120 includes a display screen such as a liquid crystal display, and sequentially displays the image data created by the image processing unit 101 based on the pixel data captured by the imaging unit 110 on the display screen.
  • the operation unit 130 includes a keyboard, a mouse, or a touch panel arranged so as to be superimposed on the liquid crystal display of the display unit, and receives a user's input operation.
  • FIGS. 5 and 6 are diagrams showing subroutines of the flow in FIG. 4.
  • In step S10, the virtual projection plane VP is set.
  • The virtual projection plane setting in step S10 will be described with reference to FIG. 5.
  • In step S11, the type of virtual projection plane shape is selected by the user; as described above, the types are a cylindrical shape and a three-sided mirror shape.
  • In step S12, the shape and position of the virtual projection plane are set.
  • FIG. 7 shows examples of the shape of the virtual projection plane: FIG. 7A is an example in which a cylindrical shape is selected as the shape of the virtual projection plane, and FIG. 7B is an example in which a trihedral mirror shape is selected. In the former, cylindrical distortion correction is performed; in the latter, trihedral mirror distortion correction is performed.
  • Both virtual projection planes VP have a shape symmetrical about the central axis (the ov axis); this symmetry reduces the memory capacity required to store the virtual projection plane VP.
  • The inclination of the end portions is not limited to inward (a shape convex toward the object-point side); the ends may also be inclined outward (a shape convex toward the image side) (see FIG. 24).
  • the virtual projection plane VP shown in FIG. 7 (a) has a cylindrical shape with both end portions inclined inward, and has a shape in which the inward inclination angle increases as the distance from the central axis v-axis increases.
  • The shape of the virtual projection plane VP is set by setting the lengths l X, l Y, and l Z, and its position can be set by changing the position of ov.
  • the sides corresponding to the lengths l X , l Y , and l Z are parallel to the X axis, the Y axis, and the Z axis, respectively.
  • the camera viewpoint displayed on the display unit 120 is changed by changing the position as shown in FIG.
  • the virtual projection plane VP shown in FIG. 7B has a three-sided mirror shape in which both ends are inclined inward, and three panels of the main mirror part m and the side mirror parts S1 and S2 are continuous.
  • The plane of the main mirror part m is parallel to the XY plane, and the sides corresponding to the lengths l X1, l X2, l Y1, and l Z1 are parallel to the X axis, Y axis, and Z axis, respectively.
  • the shape of the side mirror parts S1 and S2 and the angle of the plane with respect to the plane of the main mirror part m are the same, and the virtual projection plane VP has a bilaterally symmetric shape about the central axis v-axis.
  • the shape of the virtual projection plane VP is set, and the position can be set by changing the position of ov.
  • An example of a three-sided mirror in which two side mirror parts are connected to both sides of the main mirror part is shown, but the shape may also be a polygonal mirror in which side mirror parts are connected not only on both sides of the main mirror part (the X-axis direction) but also in the vertical direction (the Y-axis direction).
  • the virtual projection plane VP set up to step S12 is divided into n pixels.
  • the number of pixels is preferably equal to or greater than the total number of display pixels (screen resolution) of the display unit 120.
  • In the following, both the number of pixels n and the total number of display pixels of the display unit 120 are described under a fixed condition of 640 × 480 pixels (a total of 307,200 pixels).
  • the interval between adjacent pixels on the virtual projection plane VP is set to be equal.
  • The virtual projection plane VP has a shape formed by a curved surface or by a surface connecting a plurality of planes, with the end-side surfaces inclined. On the inclined end sides (the side mirror parts S1 and S2 in the example of FIG. 7B), pixels are therefore arranged at higher density with respect to the change in the incident angle θ than around the center ov, which faces the lens center plane LC more directly (the main mirror part m in the example of FIG. 7B).
  • This is the control related to the subroutine of step S10 shown in FIG. 4.
  • In step S20 of FIG. 4 (corresponding to the "second and third steps"), distortion correction processing is performed, mainly by the image processing unit 101, based on the virtual projection plane VP set in step S10.
  • The distortion correction process in step S20 will be described with reference to FIG. 6.
  • In step S21, the coordinates Gv(X, Y, Z) in the world coordinate system are acquired for each pixel Gv on the virtual projection plane VP.
  • FIG. 8 is a schematic diagram for explaining the coordinate systems. As shown in FIG. 8, the virtual projection plane VP, whose four corners are point A(0, 0, Za), point B(0, 479, Zb), point C(639, 479, Zc), and point D(639, 0, Zd), is divided into 640 × 480 pixels Gv (a total of 307,200 pixels) at equal intervals, and the coordinates of all the pixels Gv in the world coordinate system are obtained.
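The equal-interval division can be sketched as follows, assuming (for illustration only) that the Z coordinate is blended bilinearly between the four corner values Za, Zb, Zc, Zd; a curved virtual projection plane would substitute its own Z profile, as the next point notes.

```python
import numpy as np

def vp_grid(corner_z, w=640, h=480):
    """Divide the virtual projection plane into w x h pixels Gv.
    corner_z = (Za, Zb, Zc, Zd): Z values at corners A(0,0),
    B(0,h-1), C(w-1,h-1), D(w-1,0); Z is blended bilinearly."""
    Za, Zb, Zc, Zd = corner_z
    U, V = np.meshgrid(np.linspace(0, 1, w), np.linspace(0, 1, h))
    X = U * (w - 1)                        # 0 .. 639
    Y = V * (h - 1)                        # 0 .. 479
    Z = (1-U)*(1-V)*Za + (1-U)*V*Zb + U*V*Zc + U*(1-V)*Zd
    return np.stack([X, Y, Z], axis=-1)    # (h, w, 3): Gv(X, Y, Z)

g = vp_grid((5.0, 5.0, 5.0, 5.0))
print(g.shape)  # (480, 640, 3)
```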
  • If the virtual projection plane VP is curved, the coordinates are calculated based on its curvature; if a plurality of panels are continuous, as in the three-sided mirror shape, they are calculated based on the inclination angle of each panel.
  • In step S22, the coordinates Gi(x′, y′) in the corresponding camera coordinate system on the image-sensor surface IA are calculated from the coordinates of the pixel Gv in the world coordinate system and the distortion correction coefficient of the imaging unit 110 stored in the storage unit 103. Specifically, the distortion correction coefficient calculated from the lens parameters of the optical system is stored in the storage unit 103, and the coordinates are calculated from that coefficient and the incident angle θ with respect to the optical axis Z obtained from the coordinates of each pixel Gv (reference: International Publication No. 2010/032720).
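A sketch of this world-to-camera conversion, with the lens-specific distortion correction coefficient stood in for by polynomial coefficients of an image-height function r = f(θ). The names, the polynomial form, and the example focal length and pixel pitch are illustrative assumptions, not values from the patent.

```python
import numpy as np

def world_to_sensor(Gv, height_poly, pixel_pitch):
    """Convert world coordinates Gv on the virtual projection plane to
    camera coordinates Gi(x', y') on the sensor surface IA.
    theta: incident angle from the optical axis Z; the real image
    height r = f(theta) is given as polynomial coefficients (highest
    order first); the azimuth around the axis is preserved."""
    X, Y, Z = Gv
    theta = np.arctan2(np.hypot(X, Y), Z)   # incident angle
    r = np.polyval(height_poly, theta)      # image height on sensor
    phi = np.arctan2(Y, X)                  # azimuth angle
    return (r * np.cos(phi) / pixel_pitch,  # x' in pixel units
            r * np.sin(phi) / pixel_pitch)  # y' in pixel units

# Ideal equidistant model r = f*theta with f = 1.8 mm, 3 um pitch:
x, y = world_to_sensor((1.0, 0.0, 1.0), [1.8, 0.0], 0.003)
```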
  • FIG. 9 is a diagram showing a correspondence relationship between the camera coordinate system xy and the imaging element surface IA.
  • points a to d are obtained by converting the points A to D in FIG. 8 into the camera coordinate system.
  • The virtual projection plane VP surrounded by the points A to D is a rectangular plane, but in FIG. 9 the area surrounded by the points a to d after the coordinate transformation into the camera coordinate system takes a distorted shape (corresponding to the position and shape of the virtual projection plane VP).
  • The figure shows an example of barrel-shaped distortion, but the distortion may also be pincushion-shaped, or of the "jingasa" type (barrel-shaped at the center and straight or pincushion-shaped at the ends).
  • In step S23, the pixel of the image sensor to be referred to is determined from the coordinates Gi(x′, y′).
  • While x and y in the coordinates (x, y) of each pixel of the image sensor are integers, x′ and y′ in the coordinates Gi(x′, y′) calculated in step S22 are not necessarily integers and can take real values with fractional parts.
  • When x′ and y′ are integers and the coordinates Gi(x′, y′) coincide with a pixel position of the image sensor, the pixel data of that sensor pixel is used as the pixel data of the pixel Gv on the virtual projection plane. Otherwise, the pixel data of the pixel Gv is calculated from the pixel data around the calculated coordinates Gi(x′, y′).
  • For example, the four pixels closest to the position of the coordinates Gi(x′, y′) may be used, taking either their simple average or pixel data calculated by weighting the four pixels according to their distance from the coordinates Gi(x′, y′).
  • The number of surrounding pixels referred to is not limited to four; it may be one, or sixteen or more.
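Weighting the four nearest pixels by proximity to Gi(x′, y′) is, in its simplest form, bilinear interpolation; a minimal sketch (function name hypothetical):

```python
import numpy as np

def sample_bilinear(sensor, x, y):
    """Pixel data at non-integer Gi(x', y'): the four surrounding
    sensor pixels weighted by how close each is to (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * sensor[y0, x0]
            + fx * (1 - fy) * sensor[y0, x0 + 1]
            + (1 - fx) * fy * sensor[y0 + 1, x0]
            + fx * fy * sensor[y0 + 1, x0 + 1])

sensor = np.array([[0.0, 10.0],
                   [20.0, 30.0]])
print(sample_bilinear(sensor, 0.5, 0.5))  # 15.0
```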
  • Steps S21 to S24 are repeated, moving one pixel at a time from the starting point A(0, 0, Za) in FIG. 8 until every pixel has been processed, so that image data in which distortion correction has been performed for all pixels can be acquired. This is the control related to the subroutine of step S20 shown in FIG. 4.
  • In step S40 (corresponding to the "fourth step"), the image data acquired in step S20 is displayed on the display screen of the display unit 120. Steps S20 and S40 are performed sequentially, and the image data after distortion correction based on the captured pixel data is displayed on the display unit 120 in real time.
  • As described above, by setting the shape, position, and size of the virtual projection plane VP, all processes, including panning, tilting, and zooming as well as distortion correction, can be handled in a single batch process. The processing therefore becomes light, and the processing time can be shortened with a relatively small circuit.
  • Further, by forming the virtual projection plane VP from a curved surface or a surface connecting a plurality of planes, cylindrical distortion correction, spherical distortion correction, and trihedral mirror distortion correction can be handled in a single process.
  • FIG. 10 is a schematic diagram illustrating distortion correction according to the second embodiment.
  • In the embodiment described above a single virtual projection plane is used, but the number of virtual projection planes is not limited to one and may be two or more.
  • In that case, the image data obtained for the respective virtual projection planes is either switched and displayed by time division or by user instruction, or displayed simultaneously on a divided display screen.
  • The second embodiment shown in FIG. 10 is an example in which two virtual projection planes are set; except for the configuration shown in FIG. 10, it is the same as the embodiment described above.
  • FIG. 10 shows an example in which two virtual projection planes VPh and VPj are set. Both can set the shape, position, and size independently.
  • the ranges corresponding to the virtual projection planes VPh and VPj on the image sensor surface IA are the areas h and j
  • the points corresponding to the object points P1 and P2 are the points p1 and p2.
  • Image data is calculated for each of the set virtual projection planes VPh and VPj by the flow shown in FIGS. 4 to 6, and each is individually displayed on the display unit 120 by the process of step S40 of FIG.
  • FIGS. 11 to 14 are examples in which the position of the virtual projection plane VP is changed with the image center o in the camera coordinate system as the rotation center (or the movement center).
  • Rotation about the x axis with the image center o as the rotation center is pitch (also referred to as tilt), rotation about the y axis is yaw (also referred to as pan), and rotation about the Z axis is roll.
  • FIGS. 12, 13, and 14 show examples in which the virtual projection plane VP0 is rotated by roll, pitch, and yaw, respectively, based on the input rotation amount setting value.
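The three rotations can be sketched with standard rotation matrices (pitch about x, yaw about y, roll about Z, matching the definitions above). The composition order Rz·Ry·Rx is a choice made for illustration; the text does not fix one.

```python
import numpy as np

def rotate_vp(points, roll=0.0, pitch=0.0, yaw=0.0):
    """Rotate VP points (N x 3) about the origin (the image center /
    lens center): pitch about the x axis, yaw about the y axis,
    roll about the Z axis. Angles in radians."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return np.asarray(points, dtype=float) @ (Rz @ Ry @ Rx).T

# A 90-degree yaw sends the optical axis +Z to +X:
p = rotate_vp([[0.0, 0.0, 1.0]], yaw=np.pi / 2)
```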
  • Ca0 and Ca1 are virtual cameras
  • cav is the camera viewpoint of the virtual camera Ca1 after the position change. Note that the camera viewpoint cav of the virtual camera Ca0 before the position change coincides with the Z axis.
  • the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating it.
  • the camera viewpoint cav coincides with before and after the position change.
  • the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the pitch.
  • the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the yaw, and the camera viewpoint cav is rotating clockwise in FIG.
  • FIGS. 15 to 17 are examples in which the position of the virtual projection plane VP is changed, based on the input rotation amount setting value, with the center ov of the virtual projection plane VP0 as the rotation center.
  • viewpoint conversion corresponding to rotating or changing the position of the virtual camera Ca0 is performed.
  • one of two axes that are orthogonal to each other on the virtual projection plane VP0 is set as Yaw-axis, and the other is set as P-axis. Both are axes passing through the center ov, and rotation around the Yaw-axis with the center ov as the rotation center is called virtual yaw rotation, and rotation around the P-axis is called virtual pitch rotation.
  • the virtual projection plane VP0 is parallel to the xy plane of the camera coordinate system, and the center ov exists on the Z axis.
  • P-axis is parallel to the x axis
  • Yaw-axis is parallel to the y axis (note that Yaw-axis and v-axis in FIG. 7 are the same).
  • the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the virtual pitch, and after the position change, the virtual camera Ca1 is positioned above, and the camera viewpoint cav is in a direction to look down.
  • the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the virtual yaw, and the virtual camera Ca1 and the camera viewpoint cav rotate counterclockwise after the position change.
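Rotating about the plane's own center ov (virtual pitch/yaw) rather than the image center is the usual shift-rotate-shift-back construction; a minimal sketch (function name hypothetical):

```python
import numpy as np

def rotate_about_ov(points, ov, R):
    """Virtual pitch/yaw: shift the center ov to the origin, apply
    rotation matrix R, then shift back. ov itself stays fixed."""
    points = np.asarray(points, dtype=float)
    ov = np.asarray(ov, dtype=float)
    return (points - ov) @ R.T + ov

# 90-degree rotation about the axis through ov parallel to y:
R = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0]])
ov = np.array([0.0, 0.0, 5.0])
print(rotate_about_ov([ov], ov, R)[0])  # [0. 0. 5.] -- fixed point
```

Because ov is the fixed point, this changes the viewing direction of the virtual camera without translating the plane, matching the viewpoint-conversion behavior described above.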
  • FIG. 18 is an example in which the position of the virtual projection plane VP0 is changed with the image center o in the camera coordinate system as the movement center. In these examples, viewpoint conversion corresponding to parallel translation of the virtual camera Ca0 together with the virtual projection plane VP0 is performed.
  • The virtual projection plane VP0 is offset-shifted (translated) in the X, Y, and Z directions based on the input offset movement amount setting value.
  • the offset movement in the Z direction is the same movement as zooming in and zooming out.
  • the offset movement in each direction is effective when moving the dark part outside the imaging area of the optical system outside the image area.
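The offset movement is a plain translation, sketched below. The zoom equivalence follows because moving the plane along Z changes the incident angle θ subtended by each off-axis VP pixel (names are illustrative):

```python
import numpy as np

def offset_vp(points, dx=0.0, dy=0.0, dz=0.0):
    """Offset-shift (translate) VP points in the X, Y, Z directions."""
    return np.asarray(points, dtype=float) + np.array([dx, dy, dz])

# Halving the plane's distance along Z enlarges the incident angle
# of an off-axis VP pixel -- the same effect as zooming in:
before = np.arctan2(1.0, 10.0)
after = np.arctan2(1.0, offset_vp([[1.0, 0.0, 10.0]], dz=-5.0)[0][2])
print(before < after)  # True
```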
  • FIGS. 22 and 23 are examples of display images displayed on the display unit 120. FIG. 22 is an example of a display image showing image data calculated based on a cylindrical virtual projection plane VP (FIG. 1, FIG. 7A, etc.), and FIG. 23 is an example of a display image for a trihedral-mirror virtual projection plane VP (FIG. 7B).
  • An image close to the center can be accurately recognized with little distortion, and an image close to the left and right ends is enlarged and displayed. Since cylindrical distortion correction or trihedral mirror distortion correction is performed, the shape is unlikely to be deformed as the image is enlarged on the end side.
  • FIG. 24 shows a modification of the shape of the virtual projection plane VP.
  • the virtual projection plane VP has a reverse cylindrical shape, and both end sides are inclined outward (back side).
  • FIG. 25 is an example of a display image showing image data calculated based on the reverse-cylindrical virtual projection plane VP shown in FIG. 24. An image close to the center has little distortion, while images close to the left and right ends cover a wide area although distortion remains. Even on a virtual projection plane of this shape, pixels are arranged at higher density with respect to the change in the incident angle θ on the end sides than on the center side, so the phenomenon of the image becoming rough at the ends can be avoided.
  • In the embodiments above, an optical system consisting of a single lens was used as the example of an optical system including a condensing lens.
  • The condensing lens may instead be composed of a plurality of lenses.
  • Optical elements other than the condensing lens may also be provided; the present invention is applicable in that case as well.
  • In such a configuration, the distortion correction coefficient may be a value defined for the optical system as a whole.
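The offset movements in the bullets above can be sketched numerically. The snippet below is an illustration under assumed values (the grid size, distances, and the helper names `offset_vp` and `corner_angle` are our own, not from the application); it shows that translating the virtual projection plane along Z shrinks the incident angle subtended by its corners, which is the zoom-in effect described above.

```python
import numpy as np

def offset_vp(vp_points, dx=0.0, dy=0.0, dz=0.0):
    """Translate every world-coordinate point of the virtual projection
    plane by (dx, dy, dz), as in the X/Y/Z offset movements above."""
    return vp_points + np.array([dx, dy, dz])

# Toy VP: a flat 3x3 grid of world points 100 units in front of the lens O.
xs, ys = np.meshgrid(np.linspace(-10, 10, 3), np.linspace(-10, 10, 3))
vp = np.stack([xs, ys, np.full_like(xs, 100.0)], axis=-1).reshape(-1, 3)

shifted = offset_vp(vp, dz=50.0)  # Z offset: move the VP further from O

def corner_angle(points):
    # incident angle theta of the first corner point, in degrees
    x, y, z = points[0]
    return np.degrees(np.arctan2(np.hypot(x, y), z))

before = corner_angle(vp)
after = corner_angle(shifted)
# after < before: the corner rays subtend smaller angles, i.e. a zoom-in
```

An X or Y offset would instead shift which part of the scene the same footprint covers, corresponding to the pan-like movements of FIGS. 19 and 20.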

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

Disclosed is an image processing method for applying a specific distortion correction in less time with a comparatively small circuit. In this method, a virtual projection plane set in a world coordinate system, formed of a curved surface or of a plurality of flat surfaces joined together, is converted into a camera coordinate system by means of distortion-correction coefficients, and the data of the image projected onto the virtual projection plane is computed on the basis of the converted camera-coordinate-system coordinates and the pixel data from the imaging element.

Description

Image processing method, program, image processing apparatus, and imaging apparatus
 The present invention relates to an image processing method, a program, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an image sensor via an optical system including a condensing lens.
 In general, an image captured through an optical system having a short-focal-length lens or a large-angle-of-view lens, such as a wide-angle lens or a fisheye lens, is distorted, so image processing that corrects the distortion is performed. Patent Document 1 discloses, as a conventional correction method, a method of correcting the distortion arising in an image captured with a short-focal-length lens by using lens correction parameters.
 In the vehicle-periphery display device of Patent Document 2, the image processing performed when displaying image data captured with a wide-angle lens corrects the image so that the rate of change of the image height increases when the incident angle is equal to or greater than a predetermined value, and decreases when it is less than the predetermined value.
JP 2009-140066 A
JP 2010-3014 A
 As in Patent Documents 1 and 2, when a captured image obtained through such a lens is corrected linearly, the image is enlarged more in the peripheral portion than in the central portion, so shapes collapse and the image becomes hard to view. Cylindrical distortion correction and spherical distortion correction exist as corrections to prevent this, but when these corrections are implemented in hardware as an image processing apparatus, the processing time becomes long, the circuit scale grows, and the cost rises.
 In particular, when a captured image is displayed on a monitor in real time, as with a surveillance camera or the in-vehicle camera disclosed in Patent Document 2, new image processing is required every time zoom, pan, tilt, or similar processing is added in response to a user instruction to change the imaging area.
 For this reason, a hardware implementation of such an image processing apparatus suffers from long processing times, an increased circuit scale, and higher cost.
 In view of these problems, an object of the present invention is to provide an image processing method, a program, an image processing apparatus, and an imaging apparatus capable of performing specific distortion correction with a relatively small circuit while shortening the processing time.
 The above object is achieved by the inventions described below.
 1. An image processing method for obtaining distortion-corrected image data from a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system including a lens, the method comprising:
 a first step of setting, based on a user instruction, the surface shape and size of a virtual projection plane formed of a curved surface or of a plurality of connected flat surfaces, and its position in a world coordinate system;
 a second step of converting the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set in the first step into a camera coordinate system using a distortion correction coefficient of the optical system; and
 a third step of calculating image data of the virtual projection plane set in the first step based on the plurality of pixel data and the camera-coordinate-system coordinates converted in the second step.
 2. The image processing method according to 1, wherein the surface shape of the virtual projection plane set in the first step is a cylindrical shape whose cross section is an arc, or a polyhedral-mirror shape formed of a plurality of continuous panels.
 3. The image processing method according to 1 or 2, further comprising a fourth step of displaying the image data calculated in the third step on a display unit.
 4. The image processing method according to any one of 1 to 3, wherein the virtual projection plane set in the first step comprises a plurality of virtual projection planes.
 5. The image processing method according to any one of 1 to 4, wherein, in the third step, the position on the image sensor surface corresponding to each position on the virtual projection plane is calculated from the incident angle θ with respect to the optical axis of the optical system at that position, and the image data at each position is obtained from the pixel data of the pixel at the corresponding position.
 6. An image processing apparatus for obtaining distortion-corrected image data from a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the apparatus comprising:
 an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of a virtual projection plane set in the world coordinate system, the virtual projection plane being formed of a curved surface or of a plurality of connected flat surfaces, into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
 7. The image processing apparatus according to 6, wherein the shape of the virtual projection plane is a cylindrical shape whose cross section is an arc, or a polyhedral-mirror shape formed of a plurality of continuous panels.
 8. The image processing apparatus according to 6 or 7, wherein the virtual projection plane comprises a plurality of virtual projection planes.
 9. The image processing apparatus according to any one of 6 to 8, wherein the image processing unit calculates the position on the image sensor surface corresponding to each position on the virtual projection plane from the incident angle θ with respect to the optical axis of the optical system at that position, and obtains the image data at each position from the pixel data of the pixel at the corresponding position.
 10. A program for an image processing apparatus that obtains distortion-corrected image data from a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing a computer to function as:
 an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of a virtual projection plane in the world coordinate system whose position and size have been set, the virtual projection plane being formed of a curved surface or of a plurality of connected flat surfaces, into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
 11. A program for an image processing apparatus that obtains distortion-corrected image data from a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing a computer to function as:
 a setting unit capable of setting the shape, size, and arrangement position of a virtual projection plane in the world coordinate system, the virtual projection plane being formed of a curved surface or of a plurality of connected flat surfaces; and
 an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the converted camera-coordinate-system coordinates and the plurality of pixel data.
 12. An imaging apparatus comprising:
 an optical system;
 an image sensor having a plurality of pixels;
 a setting unit capable of setting the shape, size, and arrangement position of a virtual projection plane in the world coordinate system, the virtual projection plane being formed of a curved surface or of a plurality of connected flat surfaces; and
 an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the converted camera-coordinate-system coordinates and a plurality of pixel data obtained by receiving light on the image sensor.
 13. The imaging apparatus according to 12, wherein the shape of the virtual projection plane is a cylindrical shape whose cross section is an arc, or a polyhedral-mirror shape formed of a plurality of continuous panels.
 14. The imaging apparatus according to 12 or 13, further comprising:
 an operation unit operated by a user; and
 a display unit,
 wherein the setting unit sets the shape, size, and position of the virtual projection plane in the world coordinate system based on operations on the operation unit, and
 the display unit displays the image data of the set virtual projection plane.
 15. The imaging apparatus according to any one of 12 to 14, wherein the virtual projection plane set by the setting unit comprises a plurality of virtual projection planes.
 16. The imaging apparatus according to any one of 12 to 15, wherein the image processing unit calculates the position on the image sensor surface corresponding to each position on the virtual projection plane from the incident angle θ with respect to the optical axis of the optical system at that position, and obtains the image data at each position from the pixel data of the pixel at the corresponding position.
 According to the present invention, a virtual projection plane set in the world coordinate system is converted into the camera coordinate system using a distortion correction coefficient, and the image data of the virtual projection plane is calculated based on the converted camera-coordinate-system coordinates and the pixel data of the image sensor. This makes it possible to shorten the processing time with a relatively small circuit.
 In particular, by forming the virtual projection plane as a curved surface or as a plurality of connected flat surfaces, cylindrical distortion correction, spherical distortion correction, and trihedral-mirror distortion correction can each be handled in a single process.
FIG. 1 is a schematic diagram explaining the distortion correction according to this embodiment.
FIG. 2 shows an example in which the position of the virtual projection plane VP is moved.
FIG. 3 is a block diagram showing the schematic configuration of the imaging apparatus.
FIG. 4 shows the main control flow.
FIG. 5 shows the subroutine of step S10.
FIG. 6 shows the subroutine of step S20.
FIG. 7 shows examples of virtual projection plane shapes; FIG. 7(a) is a cylindrical example and FIG. 7(b) is a trihedral-mirror example.
FIG. 8 explains the coordinates of the virtual projection plane VP.
FIG. 9 shows the correspondence between the camera coordinate system xy and the image sensor surface IA.
FIG. 10 shows an example in which two virtual projection planes VP are set.
FIG. 11 shows an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the rotation center.
FIG. 12 shows an example in which the virtual projection plane VP0 is roll-rotated.
FIG. 13 shows an example in which the virtual projection plane VP0 is pitch-rotated.
FIG. 14 shows an example in which the virtual projection plane VP0 is yaw-rotated.
FIG. 15 shows an example in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the rotation center.
FIG. 16 shows an example in which the virtual projection plane VP0 is virtually pitch-rotated.
FIG. 17 shows an example in which the virtual projection plane VP0 is virtually yaw-rotated.
FIG. 18 shows an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the movement center.
FIG. 19 shows an example in which the virtual projection plane VP0 is offset in the X direction.
FIG. 20 shows an example in which the virtual projection plane VP0 is offset in the Y direction.
FIG. 21 shows an example in which the virtual projection plane VP0 is offset in the Z direction.
FIG. 22 is an example of a display image shown on the display unit 120.
FIG. 23 is an example of a display image shown on the display unit 120.
FIG. 24 shows an example in which the shape of the virtual projection plane VP is a reverse cylindrical shape.
FIG. 25 is an example in which image data calculated using the virtual projection plane corresponding to FIG. 24 is displayed on the display unit 120.
 The present invention will be described based on an embodiment, but the present invention is not limited to this embodiment.
 FIG. 1 is a schematic diagram explaining the distortion correction according to this embodiment. In FIG. 1, X, Y, and Z are the axes of the world coordinate system, and the origin O is the lens center. The Z axis coincides with the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O. The point P is an object point in the world coordinate system XYZ, and θ is the incident angle with respect to the optical axis (which coincides with the Z axis).
 x and y are the axes of the camera coordinate system, and the xy plane corresponds to the image sensor surface IA. o is the image center, the intersection of the optical axis Z with the image sensor surface. The point p is a point on the image sensor surface in the camera coordinate system, obtained by converting the object point P into the camera coordinate system using a distortion correction coefficient based on parameters derived from the lens characteristics (hereinafter, "lens parameters").
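The conversion from a world-coordinate object point P to a camera-coordinate point p can be sketched as follows. This is only an illustration: the odd polynomial in the incident angle θ stands in for the lens-parameter-based distortion correction coefficients, and the coefficient values in `k` are made-up numbers, not data from this application.

```python
import numpy as np

def world_to_camera(P, k=(1.0, -0.05, 0.002)):
    """Map an object point P = (X, Y, Z) in world coordinates to a point
    p = (x, y) on the image sensor surface in camera coordinates.
    The image height is modeled as an odd polynomial in the incident
    angle theta; k holds assumed distortion-correction coefficients."""
    X, Y, Z = P
    theta = np.arctan2(np.hypot(X, Y), Z)   # incident angle vs. optical axis Z
    r = sum(ki * theta ** (2 * i + 1) for i, ki in enumerate(k))
    phi = np.arctan2(Y, X)                  # azimuth is preserved by the lens
    return r * np.cos(phi), r * np.sin(phi)

x, y = world_to_camera((0.5, 0.0, 1.0))
# A point on the +X side of the optical axis lands on the +x side of the
# sensor, with y = 0 by symmetry.
```

The same mapping applied to every pixel of the virtual projection plane is what the second step of the method performs.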
 VP is the virtual projection plane, formed of a curved surface or of a plurality of connected flat surfaces. The virtual projection plane VP is set on the side of the lens position of the optical system (lens center plane LC) opposite to the image sensor (and the image sensor surface IA). Its shape, size, and position in the world coordinate system can be changed based on user instructions given through the operation unit 130 (see FIG. 3).
 In this application, "position change" is a concept that includes not only translating the virtual projection plane VP parallel to the XY plane but also changing its angle with respect to the XY plane (also called an attitude change).
 "Cylindrical distortion correction" (or "cylindrical correction") is a distortion correction method performed using the following virtual projection plane VP: the shape projected onto the XY plane perpendicular to the optical-axis direction Z (X horizontal, Y vertical) is a rectangle or square, while the coordinate in the optical-axis direction Z is not constant and varies arbitrarily.
 "Spherical distortion correction" (or "spherical correction") is a distortion correction method performed using the following virtual projection plane VP: the shape is the inner surface of a sphere cut out as a rectangle; the shape projected onto the XY plane perpendicular to the optical axis Z is a rectangle or square; the coordinates vary in the X, Y, and Z directions, and in particular the coordinate change per unit in the X and Y directions follows a constant curvature.
 "Trihedral mirror correction" (or "trihedral-mirror distortion correction") is a distortion correction method performed using the following virtual projection plane VP: the shape is arbitrary in the X and Y directions (a rectangle or square is particularly preferable), while in the optical-axis direction Z part of the surface is constant (no depth variation) and the remaining parts vary arbitrarily.
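The three corrections differ only in how the depth Z of the virtual projection plane varies across the surface. A minimal numerical sketch of the X-direction sections follows; the radius of curvature and the side-panel angle are arbitrary illustration values, not values from this application.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 201)   # positions across the VP in the X direction
R = 2.0                           # assumed radius of curvature

# Cylindrical: a circular-arc section in X, constant along Y.
z_cyl = np.sqrt(R**2 - x**2) - R

# Spherical: the same circular section, but applied along Y as well,
# so the surface curves with constant curvature in both directions.
z_sph_section = z_cyl

# Trihedral mirror: flat in the middle (no Z variation), with flat
# side panels tilted by an assumed 20 degrees beyond |x| = 0.5.
tilt = np.tan(np.radians(20.0))
z_tri = np.where(np.abs(x) <= 0.5, 0.0, -(np.abs(x) - 0.5) * tilt)
```

In all three cases the footprint projected onto the XY plane remains a rectangle; only the depth profile changes, which is what distinguishes the corrections.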
 The example shown in FIG. 1 uses a cylindrical virtual projection plane, shaped like a portion cut from the inner circumferential surface of a cylinder. The shape of the virtual projection plane is not limited to this; it may be the inner surface of a hemisphere or spherical cap formed by cutting a sphere with an intersecting plane, or the inner surface of a sphere cut out as a rectangle. It may also be a shape formed by connecting a plurality of flat surfaces; among such shapes, a trihedral-mirror shape composed of three continuous panels (described later) is preferable.
 In the initial state (the initial position setting; likewise hereinafter), the virtual projection plane VP has a predetermined shape and size, and its center ov lies on the Z axis. Gv is the point at which the object point P is projected onto the virtual projection plane VP, that is, the intersection of the straight line through the object point P and the lens center O with the virtual projection plane VP. The virtual projection plane VP1 in FIG. 2 shows the state in which the virtual projection plane VP0 has been rotated in the XZ plane based on input from the operation unit 130.
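For a cylindrical VP, the projection point Gv can be computed by intersecting the ray from the lens center O through the object point P with the cylinder. The sketch below assumes a cylinder whose axis is parallel to the Y axis with an XZ section of our own choosing; the application does not fix a particular parameterization.

```python
import numpy as np

def project_to_vp(P, R, zc):
    """Return Gv, the intersection of the ray from the lens center O
    through the object point P with a cylindrical VP whose section in
    the XZ plane is the circle x**2 + (z - zc)**2 = R**2."""
    X, Y, Z = P
    # Substitute the ray (t*X, t*Y, t*Z) into the section equation and
    # solve the quadratic a*t**2 + b*t + c = 0 for t.
    a = X**2 + Z**2
    b = -2.0 * zc * Z
    c = zc**2 - R**2
    t = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)  # the root in front of O
    return (t * X, t * Y, t * Z)

# An object point on the optical axis (X = 0) projects onto the apex of
# the arc, at depth zc + R from the lens center.
Gv = project_to_vp((0.0, 0.5, 1.0), R=2.0, zc=8.0)
```

Because O, P, and Gv are collinear by construction, Gv can equally be found by scaling P, which is what the quadratic solves for.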
 [Block diagram]
 FIG. 3 is a block diagram showing the schematic configuration of the imaging apparatus. The imaging apparatus includes an imaging unit 110, a control device 100, a display unit 120, and an operation unit 130.
 The imaging unit 110 includes a short-focal-length lens, an image sensor, and the like. In this embodiment, the lens is, for example, a wide-angle lens or a fisheye lens.
 The control device 100 includes an image processing unit 101, a setting unit 102, and a storage unit 103.
 The setting unit 102 sets the shape, position, and size of the virtual projection plane VP based on input instructions given through the operation unit 130.
 The image processing unit 101 creates, based on the set shape, position, and size of the virtual projection plane VP, a conversion table that maps each coordinate on the virtual projection plane to the camera coordinate system, and uses this table to process the pixel data captured by the imaging unit 110 into image data to be displayed on the display unit 120. The storage unit 103 stores the distortion correction coefficient calculated from the lens parameters of the lens; it also stores the position and size of the virtual projection plane VP and the created conversion table.
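The table-driven processing of the image processing unit 101 can be sketched as a one-time precomputation plus a cheap per-frame gather. The mapping function and the nearest-neighbour sampling below are simplifying assumptions for illustration; a real device could interpolate between sensor pixels instead.

```python
import numpy as np

def build_table(vp_points, world_to_camera, sensor_shape):
    """Precompute, once per VP setting, which sensor pixel each VP pixel
    samples. world_to_camera is the distortion-coefficient-based mapping
    to continuous camera coordinates (x, y), rounded here to the nearest
    sensor pixel and clamped to the sensor bounds."""
    h, w = sensor_shape
    table = np.empty((len(vp_points), 2), dtype=np.intp)
    for i, P in enumerate(vp_points):
        x, y = world_to_camera(P)
        table[i] = (min(max(int(round(y)), 0), h - 1),
                    min(max(int(round(x)), 0), w - 1))
    return table

def remap(frame, table, out_shape):
    """Per-frame work is only a gather through the precomputed table,
    which is why the method suits a small real-time circuit."""
    return frame[table[:, 0], table[:, 1]].reshape(out_shape)

# Toy illustration: a 4x4 sensor and a pinhole-like stand-in mapping.
sensor = np.arange(16, dtype=np.uint8).reshape(4, 4)
vp = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
w2c = lambda P: (P[0] / P[2], P[1] / P[2])
out = remap(sensor, build_table(vp, w2c, sensor.shape), (1, 2))
```

Only `build_table` must rerun when the user changes the VP shape, size, or position; every video frame then needs just the gather in `remap`.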
 The display unit 120 has a display screen such as a liquid crystal display and sequentially displays on it the image data created by the image processing unit 101 from the pixel data captured by the imaging unit 110.
 The operation unit 130 includes a keyboard, a mouse, or a touch panel overlaid on the liquid crystal display of the display unit, and accepts the user's input operations.
 [Control flow]
 FIGS. 4 to 6 show the control flow of this embodiment. FIG. 4 shows the main control flow, and FIGS. 5 and 6 show its subroutines.
 In step S10 (corresponding to the "first step"), the virtual projection plane VP is set (converted). The virtual projection plane setting of step S10 is described with reference to FIG. 5. In step S11, the user selects the type of virtual projection plane shape; as described above, the types include the cylindrical shape and the trihedral-mirror shape.
 Next, in step S12, the shape and position of the virtual projection plane are set. FIG. 7 shows examples of virtual projection plane shapes: FIG. 7(a) is an example in which a cylindrical shape is selected, and FIG. 7(b) is an example in which a trihedral-mirror shape is selected. In the former case cylindrical distortion correction is performed, and in the latter case trihedral-mirror distortion correction is performed.
 As shown in FIGS. 7(a) and 7(b), in the initial state the longitudinal direction is parallel to the X axis of the world coordinate system, the transverse direction is parallel to the Y axis, and the surfaces near the ends in the X direction are inclined inward. The center ov of the virtual projection plane VP lies on the Z axis, and the central axis v-axis passing through ov lies in the XY plane. Both virtual projection planes VP are symmetrical about the central axis v-axis; this symmetry reduces the memory capacity required to store the virtual projection plane VP. The inclination of the ends is not limited to inward (a shape convex toward the object points); the ends may instead incline outward (a shape convex toward the image side) (see FIG. 24).
 The virtual projection plane VP shown in FIG. 7(a) is a cylindrical shape whose two ends are inclined inward, with the inward inclination angle increasing with distance from the central axis v-axis. In the figure, the shape of the virtual projection plane VP is set by specifying the lengths lX, lY, and lZ, and its position is set by changing the position of ov. In the initial state, the sides corresponding to lX, lY, and lZ are parallel to the X, Y, and Z axes, respectively. Changing the position, as shown in FIG. 2, changes the camera viewpoint displayed on the display unit 120 (corresponding to pan and tilt). If moving the virtual projection plane VP changes its distance from the lens center O, the result is a zoom-in or zoom-out. Changing the width and height lX and lY changes the angle of view, which produces the same effect as zooming in or out. Changing the depth lZ changes the curvature of the virtual projection plane VP and, with it, the strength of the cylindrical distortion correction. Concrete examples of changing the position of the virtual projection plane VP are described later.
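A grid of world-coordinate pixels for the cylindrical VP of FIG. 7(a) might be generated as follows. Only the roles of lX, lY, lZ, and ov come from the text; the circular-arc parameterization derived from lX and lZ, and the function name, are our assumptions.

```python
import numpy as np

def cylindrical_vp(lx, ly, lz, z0, nx=640, ny=480):
    """Sample an nx-by-ny pixel grid on a cylindrical virtual projection
    plane: footprint lx wide and ly tall, ends curved inward (toward the
    lens center O) by depth lz, center ov at distance z0 on the Z axis."""
    # Circle through the apex (0, z0) and the ends (+-lx/2, z0 - lz):
    # chord geometry gives the radius R.
    R = ((lx / 2) ** 2 + lz ** 2) / (2 * lz)
    xs = np.linspace(-lx / 2, lx / 2, nx)
    zs = z0 - (R - np.sqrt(R ** 2 - xs ** 2))   # depth along the arc
    ys = np.linspace(-ly / 2, ly / 2, ny)
    X, Y = np.meshgrid(xs, ys)
    Z = np.broadcast_to(zs, X.shape)
    return np.stack([X, Y, Z], axis=-1)          # shape (ny, nx, 3)

grid = cylindrical_vp(2.0, 1.5, 0.2, 10.0, nx=5, ny=3)
# the center column sits at depth z0; the end columns at depth z0 - lz
```

Increasing `lz` with `lx` fixed gives a smaller radius R, that is, a more strongly curved surface, matching the statement that the depth lZ controls the strength of the cylindrical distortion correction.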
 The virtual projection plane VP shown in FIG. 7(b) is a trihedral-mirror shape whose two ends are inclined inward, consisting of three continuous panels: the main mirror portion m and the side mirror portions S1 and S2. In the initial state, the plane of the main mirror portion m is parallel to the XY plane, and the sides corresponding to the lengths lX1 and lX2, lY1, and lZ1 are parallel to the X, Y, and Z axes, respectively. The side mirror portions S1 and S2 have identical shapes and identical angles with respect to the plane of the main mirror portion m, so the virtual projection plane VP is symmetrical about the central axis v-axis. The shape of the virtual projection plane VP is set by specifying the lengths lX1, lX2, lY1, and lZ1, and its position is set by changing the position of ov. Although the figure shows a trihedral mirror in which two side mirror portions are joined to the two sides of the main mirror portion, a polyhedral-mirror shape in which side mirrors are also joined above and below the main mirror portion (in the Y-axis direction), not only on its two sides (in the X-axis direction), may be used.
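Similarly, the trihedral-mirror VP of FIG. 7(b) can be sketched as a flat main panel flanked by two tilted side panels. The linear parameterization of the side panels and the argument names are assumptions; the panel layout and the roles of lX1, lX2, lY1, and lZ1 follow the text.

```python
import numpy as np

def trihedral_vp(lx1, lx2, ly1, lz1, z0, nx=640, ny=480):
    """Sample an nx-by-ny pixel grid on a three-panel virtual projection
    plane: a flat main mirror m of width lx1 parallel to the XY plane at
    depth z0, plus side mirrors S1 and S2 of footprint width lx2 whose
    outer edges are pulled inward (toward O) by lz1."""
    half = lx1 / 2 + lx2
    xs = np.linspace(-half, half, nx)
    over = np.clip(np.abs(xs) - lx1 / 2, 0.0, None)  # 0 on m, >0 on S1/S2
    zs = z0 - lz1 * (over / lx2)                     # linear tilt of the sides
    ys = np.linspace(-ly1 / 2, ly1 / 2, ny)
    X, Y = np.meshgrid(xs, ys)
    Z = np.broadcast_to(zs, X.shape)
    return np.stack([X, Y, Z], axis=-1)

grid = trihedral_vp(1.0, 0.5, 1.0, 0.2, 10.0, nx=5, ny=3)
# the middle column lies on the flat main mirror (depth z0); the outer
# columns lie on the inclined side mirrors (down to depth z0 - lz1)
```

Because the main panel has no variation along Z while the side panels do, this grid is a direct instance of the trihedral-mirror profile defined earlier.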
 Returning to the description of FIG. 5. In step S13, the virtual projection plane VP set up through step S12 is divided into n pixels. The number of pixels is preferably equal to or greater than the total number of display pixels (screen resolution) of the display unit 120. In the following, as an example, both the number of pixels n and the total number of display pixels of the display unit 120 are fixed at 640 × 480 pixels (about 307,000 pixels in total).
 In the present embodiment, adjacent pixels on the virtual projection plane VP are spaced at equal intervals. Further, the virtual projection plane VP is given a shape consisting of a curved surface or of a plurality of connected planes, with the surfaces on the end sides inclined. As a result, compared with the neighborhood of the center ov, which faces the lens center plane LC more squarely (the main mirror portion m in the example of FIG. 7(b)), the inclined end sides (the side mirror portions S1 and S2 in the example of FIG. 7(b)) have pixels arranged at a higher density per change in the incident angle θ. Consequently, the stretching performed during distortion correction has little effect on shape and color, and an easily recognizable image can be obtained. This concludes the control of the subroutine of step S10 shown in FIG. 5.
 In step S20 of FIG. 4 (corresponding to the “second and third steps”), distortion correction processing is performed, mainly by the image processing unit 101, based on the virtual projection plane VP set in step S10. The distortion correction processing of step S20 will be described with reference to FIG. 6.
 In step S21, the world coordinates Gv(X, Y, Z) are acquired for each pixel Gv on the virtual projection plane VP. FIG. 8 is a schematic diagram illustrating the coordinate systems. As shown in FIG. 8, the plane bounded by the four corner points A(0, 0, Za), B(0, 479, Zb), C(639, 479, Zc), and D(639, 0, Zd) of the virtual projection plane VP is divided at equal intervals into 640 × 480 pixels Gv (about 307,000 pixels in total), and the world coordinates of every pixel Gv are acquired.
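The equal-interval division of step S21 can be pictured as bilinear interpolation of the four corner points of FIG. 8. The following is a hedged sketch; the function name and corner argument order are assumptions, not patent code:

```python
import numpy as np

def vp_world_grid(A, B, C, D, nx=640, ny=480):
    """World coordinates Gv(X, Y, Z) for every pixel of the virtual
    projection plane, obtained by bilinearly interpolating the four
    corners A (top-left), B (bottom-left), C (bottom-right) and
    D (top-right), matching the corner labelling of Fig. 8."""
    A, B, C, D = (np.asarray(p, float) for p in (A, B, C, D))
    u = np.linspace(0.0, 1.0, nx)[None, :, None]   # left -> right
    v = np.linspace(0.0, 1.0, ny)[:, None, None]   # top  -> bottom
    top = A * (1 - u) + D * u
    bottom = B * (1 - u) + C * u
    return top * (1 - v) + bottom * v              # shape (ny, nx, 3)
```

For a curved or multi-panel plane the Z values would additionally be adjusted per column, as the next paragraph notes; this sketch covers only the flat bounding case of FIG. 8.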
 When acquiring the world coordinates, they are calculated based on the curvature if the virtual projection plane VP has a curved surface such as a cylinder, or based on the inclination angle of each panel if the plane has a shape, such as the three-sided shape, in which a plurality of panels are connected.
 In step S22, the coordinates Gi(x′, y′) in the corresponding camera coordinate system on the imaging element surface IA are calculated from the world coordinates of the pixel Gv and the distortion correction coefficient of the imaging unit 110 stored in the storage unit 103. Specifically, a distortion correction coefficient calculated from the lens parameters of the optical system is stored in the storage unit 103, and the calculation uses this coefficient together with the incident angle θ with respect to the optical axis Z obtained from the coordinates of each pixel Gv (reference: International Publication No. WO 2010/032720).
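One common way to express such a mapping — and only an assumption here, since the patent defers the exact formula to WO 2010/032720 — is to model the image height as an odd polynomial in the incident angle θ while keeping the azimuth unchanged:

```python
import numpy as np

def world_to_camera(Gv, dist_coeffs, pixel_pitch=1.0, center=(320.0, 240.0)):
    """Map a world point Gv = (X, Y, Z) (lens centre at the origin, optical
    axis = +Z) to camera coordinates Gi = (x', y').  The image height is
    modelled as an odd polynomial in the incident angle theta,
    r = k1*theta + k3*theta**3 + ..., a common form for a lens's distortion
    characteristic; the actual coefficients of the patent's optical system
    are not given, so dist_coeffs is an assumed stand-in."""
    X, Y, Z = Gv
    rad = np.hypot(X, Y)
    theta = np.arctan2(rad, Z)                 # incident angle w.r.t. optical axis
    r = sum(k * theta ** (2 * i + 1) for i, k in enumerate(dist_coeffs))
    phi = np.arctan2(Y, X)                     # azimuth is preserved by the lens
    x = center[0] + (r / pixel_pitch) * np.cos(phi)
    y = center[1] + (r / pixel_pitch) * np.sin(phi)
    return x, y
```

With a single coefficient this reduces to the equidistant model r = k1·θ; a point on the optical axis maps to the image center, as expected.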
 FIG. 9 is a diagram showing the correspondence between the camera coordinate system xy and the imaging element surface IA. In FIG. 9, points a to d are the points A to D of FIG. 8 converted into the camera coordinate system. Although the virtual projection plane VP bounded by points A to D in FIG. 8 is a rectangular surface, the region bounded by points a to d after the coordinate conversion into the camera coordinate system in FIG. 9 has a distorted shape (corresponding to the position and shape of the virtual projection plane VP). The figure shows an example of barrel-shaped distortion, but depending on the characteristics of the optical system the distortion may instead be pincushion-shaped or so-called Jinkasa (moustache) shaped, that is, barrel-shaped at the center and changing to straight or pincushion-shaped at the edges.
 In step S23, the pixel of the imaging element to be referenced is determined from the coordinates Gi(x′, y′). Note that while x and y in the coordinates (x, y) of each pixel of the imaging element are integers, x′ and y′ in the coordinates Gi(x′, y′) calculated in step S22 are not necessarily integers and may take real values with fractional parts. In the former case, where x′ and y′ are integers and the coordinates Gi(x′, y′) coincide with the position of a pixel of the imaging element, the pixel data of that pixel of the imaging element is used as the pixel data of the pixel Gv(X, Y, Z) on the virtual projection plane VP. In the latter case, where x′ and y′ are not integers and do not coincide with x and y, the pixel data of the pixel Gv may be derived from the pixels around the calculated coordinates Gi(x′, y′), for example the four pixels nearest to the position of the coordinates Gi(x′, y′), either as their simple average or as pixel data calculated by weighting those four nearest pixels according to their distance from the coordinates Gi(x′, y′). The number of surrounding pixels used is not limited to four; it may be one, sixteen, or more.
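The two cases of step S23 — a direct hit on a sensor pixel versus a distance-weighted blend of the four nearest pixels — can be sketched as follows. This is illustrative code: the patent leaves the exact weighting open, and standard bilinear weights are assumed here.

```python
import numpy as np

def sample_pixel(img, x, y):
    """Pixel value at fractional sensor coordinates (x', y').  Integer
    coordinates read the sensor pixel directly; otherwise the four
    neighbouring pixels are blended with bilinear (distance-based)
    weights, one of the weighting schemes the text allows."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    if fx == 0 and fy == 0:                    # exact hit on a sensor pixel
        return float(img[y0, x0])
    x1 = min(x0 + 1, img.shape[1] - 1)         # clamp at the sensor border
    y1 = min(y0 + 1, img.shape[0] - 1)
    return float(img[y0, x0] * (1 - fx) * (1 - fy)
                 + img[y0, x1] * fx * (1 - fy)
                 + img[y1, x0] * (1 - fx) * fy
                 + img[y1, x1] * fx * fy)
```

A 16-pixel neighborhood, also mentioned in the text, would correspond to bicubic rather than bilinear weighting.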
 By executing steps S21 to S24 for each pixel, moving one pixel at a time from the starting point A(0, 0, Za) of FIG. 8 to the lower-right end point C(639, 479, Zc), image data in which distortion correction has been performed for all pixels can be acquired. This concludes the control of the subroutine of step S20 shown in FIG. 6.
 Returning to the description of the control flow of FIG. 4. In step S40 (corresponding to the “fourth step”), the image data acquired in step S20 is displayed on the display screen of the display unit 120. Steps S20 and S40 are executed successively, so that the distortion-corrected image data based on the captured pixel data is displayed on the display unit 120 in real time.
 According to the present embodiment, by setting the shape, position, and size of the virtual projection plane VP, all processing, including pan, tilt, and zoom processing as well as distortion correction processing, can be handled in a single unified operation, so the processing load is reduced and the processing time can be shortened with a relatively small-scale circuit. In particular, by giving the virtual projection plane VP a shape consisting of a curved surface or of a plurality of connected planes, cylindrical distortion correction, spherical distortion correction, and three-sided mirror distortion correction can likewise be handled in one pass. More specifically, whereas conventionally the processed image data is obtained through a series of separate stages — calculating the image data, then pan, tilt, and zoom processing, then cylindrical distortion correction — in the present embodiment this entire series of processes can be performed as a single unified operation. As a result, real-time processing is possible even with a relatively small-scale circuit configuration such as an ASIC (Application Specific Integrated Circuit).
[Second Embodiment]
 FIG. 10 is a schematic diagram illustrating distortion correction according to the second embodiment. In the description of FIGS. 1 to 6 there was a single virtual projection plane, but the number is not limited to one and may be two or more. The image data obtained at the position of each virtual projection plane can be displayed by switching, by time division or by a user instruction, or by dividing the display screen and showing them side by side simultaneously. The second embodiment shown in FIG. 10 is an example in which two virtual projection planes are set. Except for the configuration shown in FIG. 10, it is identical to the embodiment described with reference to FIGS. 3 to 9, and its description is omitted.
 FIG. 10 shows an example in which two virtual projection planes, VPh and VPj, are set. The shape, position, and size of each can be set independently. In the figure, the ranges on the imaging element surface IA corresponding to the virtual projection planes VPh and VPj are the regions h and j, and the points corresponding to the object points P1 and P2 are the points p1 and p2.
 Image data is calculated for each of the set virtual projection planes VPh and VPj by the flow shown in FIGS. 4 to 6, and each is individually displayed, the display unit 120 showing the two screens through the processing of step S40 of FIG. 4.
 [Specific Examples of Changing the Position of the Virtual Projection Plane VP]
 FIGS. 11 to 14 show examples in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the center of rotation (or of movement). As shown in FIG. 11, with the image center o as the center of rotation, rotation about the x axis is pitch (also called tilt), rotation about the y axis is yaw (also called pan), and rotation about the Z axis is roll.
 FIGS. 12, 13, and 14 show examples in which the virtual projection plane VP0 is given a roll, pitch, and yaw rotation, respectively, based on an input rotation amount setting. In these figures (and hereafter), Ca0 and Ca1 are virtual cameras, and cav is the camera viewpoint of the virtual camera Ca1 after the position change. The camera viewpoint cav of the virtual camera Ca0 before the position change coincides with the Z axis.
 In FIG. 12, the virtual projection plane VP1 is the virtual projection plane VP0 after its position has been changed by a roll rotation. The camera viewpoint cav is the same before and after the position change.
 In FIG. 13, the virtual projection plane VP1 is the virtual projection plane VP0 after its position has been changed by a pitch rotation, and the camera viewpoint cav moves to an upward-looking direction in the figure.
 In FIG. 14, the virtual projection plane VP1 is the virtual projection plane VP0 after its position has been changed by a yaw rotation, and the camera viewpoint cav rotates clockwise in the figure.
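The three rotations of FIGS. 12 to 14 are ordinary rotations of the plane's points about the camera-coordinate axes through the image center o. A minimal sketch follows; the composition order roll → pitch → yaw is an assumption, since the patent applies each rotation individually.

```python
import numpy as np

def rotate_vp(points, roll=0.0, pitch=0.0, yaw=0.0):
    """Rotate virtual-projection-plane points (N, 3) about the
    camera-coordinate origin: pitch about the x axis, yaw about the
    y axis, and roll about the z axis, as in Fig. 11 (angles in
    radians, applied in the order roll -> pitch -> yaw)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])    # roll
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])    # pitch
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])    # yaw
    return points @ (Ry @ Rx @ Rz).T
```

Applying this to the grid of plane points before the world-to-camera mapping gives the pan/tilt-style viewpoint changes the figures describe.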
 FIGS. 15 to 17 show examples in which the position of the virtual projection plane VP is changed, based on an input rotation amount setting, with the center ov of the virtual projection plane VP0 as the center of rotation. In the following, a viewpoint conversion equivalent to rotating or repositioning the virtual camera Ca0 is performed.
 As shown in FIG. 15, one of two mutually orthogonal axes on the virtual projection plane VP0 is set as the Yaw-axis and the other as the P-axis. Both are axes passing through the center ov; with the center ov as the center of rotation, rotation about the Yaw-axis is called virtual yaw rotation, and rotation about the P-axis is called virtual pitch rotation. In the initial state, the virtual projection plane VP0 is parallel to the xy plane of the camera coordinate system, and the center ov lies on the Z axis. In this initial state, the P-axis is parallel to the x axis and the Yaw-axis is parallel to the y axis (the Yaw-axis is identical to the v-axis of FIG. 7).
 In FIG. 16, the virtual projection plane VP1 is the virtual projection plane VP0 after its position has been changed by a virtual pitch rotation; after the position change, the virtual camera Ca1 is positioned above and the camera viewpoint cav points downward.
 In FIG. 17, the virtual projection plane VP1 is the virtual projection plane VP0 after its position has been changed by a virtual yaw rotation; after the position change, the virtual camera Ca1 and the camera viewpoint cav rotate counterclockwise.
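Virtual pitch and virtual yaw differ from the rotations of FIGS. 12 to 14 only in the center of rotation: the axis passes through the plane center ov rather than the image center o. A hedged sketch, implemented as translate-to-origin, rotate, translate back via Rodrigues' rotation formula (the function name and arguments are illustrative):

```python
import numpy as np

def rotate_about_center(points, ov, axis, angle):
    """Virtual pitch / virtual yaw: rotate plane points (N, 3) by
    `angle` radians about an axis through the plane centre ov
    (the P-axis or the Yaw-axis).  The points are shifted so ov is
    at the origin, rotated with Rodrigues' formula, then shifted back."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])       # cross-product matrix
    R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
    ov = np.asarray(ov, float)
    return (points - ov) @ R.T + ov
```

Passing the Yaw-axis direction gives the virtual yaw of FIG. 17 and the P-axis direction gives the virtual pitch of FIG. 16; in both cases ov itself stays fixed, which is what distinguishes these from the rotations about the image center.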
 FIG. 18 shows an example in which the position of the virtual projection plane VP0 is changed with the image center o of the camera coordinate system as the center of movement. In these examples, a viewpoint conversion equivalent to translating the virtual camera Ca0 together with the virtual projection plane VP0 is performed.
 FIGS. 19, 20, and 21 show examples in which the virtual projection plane VP0 is offset-moved (translated) in the X, Y, and Z directions, respectively, based on an input offset movement amount setting. In the initial state, an offset movement in the Z direction produces the same motion as zooming in or out. Outside the initial state, an offset movement in each direction is effective for moving a dark region outside the imaging area of the optical system out of the image area.
 FIGS. 22 and 23 are examples of display images shown on the display unit 120. FIG. 22 is an example of a display image of image data calculated based on a cylindrical virtual projection plane VP (FIG. 1, FIG. 7(a), etc.), and FIG. 23 is an example of a display image with a three-sided mirror-shaped virtual projection plane VP (FIG. 7(b)). The image near the center has little distortion, allowing the subject to be recognized accurately, while the images near the left and right edges are displayed enlarged. Because cylindrical or three-sided mirror distortion correction is performed, the shapes are unlikely to collapse even as the image is enlarged toward the edges.
 FIG. 24 shows a modification of the shape of the virtual projection plane VP. In the example shown in the figure, the virtual projection plane VP has an inverted cylindrical shape, with both end sides inclined outward (toward the back).
 FIG. 25 is an example of a display image of image data calculated based on the inverted cylindrical virtual projection plane VP shown in FIG. 24. The image near the center has little distortion; in the images near the left and right edges some distortion remains, but a wide area can be displayed. Even with a virtual projection plane of this shape, pixels are arranged at a higher density per change in the incident angle θ on the sides near the left and right edges than at the center, so the phenomenon of the image becoming coarse at the left and right edges can be avoided.
 In the embodiments described above, an optical system consisting of a single lens has been described as an example of an optical system including a condenser lens, but the condenser lens may of course be composed of a plurality of lenses, and the optical system may include optical elements other than the condenser lens; the invention of the present application is applicable to such cases as well. In that case, a distortion correction coefficient for the optical system as a whole may be used. However, for reducing the size and cost of the imaging device, using a single lens, as exemplified in the above embodiments, is most preferable.
 DESCRIPTION OF SYMBOLS
 100 Control device
 101 Image processing unit
 102 Setting unit
 103 Storage unit
 110 Imaging unit
 120 Display unit
 130 Operation unit
 VP Virtual projection plane
 LC Lens center plane
 IA Imaging element surface
 O Lens center
 o Image center
 ov Center of virtual projection plane

Claims (16)

  1.  An image processing method for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging element having a plurality of pixels via an optical system including a lens, the method comprising:
     a first step of setting, based on a user instruction, the surface shape and size of a virtual projection plane consisting of a curved surface or of a plurality of connected planes, and its position in a world coordinate system;
     a second step of converting the coordinates in the world coordinate system of each pixel of the virtual projection plane set in the first step into a camera coordinate system using a distortion correction coefficient of the optical system; and
     a third step of calculating image data of the virtual projection plane set in the first step based on the plurality of pixel data and the coordinates in the camera coordinate system converted in the second step.
  2.  The image processing method according to claim 1, wherein the surface shape of the virtual projection plane set in the first step is a cylindrical shape whose cross section is an arc, or a polyhedral mirror shape consisting of a plurality of continuous panels.
  3.  The image processing method according to claim 1 or 2, further comprising a fourth step of displaying the image data calculated in the third step on a display unit.
  4.  The image processing method according to any one of claims 1 to 3, wherein the virtual projection plane set in the first step is a plurality of virtual projection planes.
  5.  The image processing method according to any one of claims 1 to 4, wherein, in the third step, the image data is calculated by determining, from the incident angle θ of each position on the virtual projection plane with respect to the optical axis of the optical system, the corresponding position on the imaging element surface, and obtaining the image data at each position from the pixel data of the pixel at the corresponding position.
  6.  An image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging element having a plurality of pixels via an optical system, comprising:
     an image processing unit that converts the coordinates in a world coordinate system of each pixel of a virtual projection plane in the world coordinate system, the virtual projection plane consisting of a curved surface or of a plurality of connected planes, into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  7.  The image processing apparatus according to claim 6, wherein the shape of the virtual projection plane is a cylindrical shape whose cross section is an arc, or a polyhedral mirror shape consisting of a plurality of continuous panels.
  8.  The image processing apparatus according to claim 6 or 7, wherein the virtual projection plane is a plurality of virtual projection planes.
  9.  The image processing apparatus according to any one of claims 6 to 8, wherein the image processing unit calculates the image data by determining, from the incident angle θ of each position on the virtual projection plane with respect to the optical axis of the optical system, the corresponding position on the imaging element surface, and obtaining the image data at each position from the pixel data of the pixel at the corresponding position.
  10.  A program for an image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging element having a plurality of pixels via an optical system, the program causing a computer to function as:
     an image processing unit that converts the coordinates in a world coordinate system of each pixel of a virtual projection plane of the world coordinate system whose position and size have been set, the virtual projection plane consisting of a curved surface or of a plurality of connected planes, into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  11.  A program for an image processing apparatus for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging element having a plurality of pixels via an optical system, the program causing a computer to function as:
     a setting unit capable of setting the shape, size, and arrangement position of a virtual projection plane in a world coordinate system, the virtual projection plane consisting of a curved surface or of a plurality of connected planes; and
     an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  12.  An imaging device comprising:
     an optical system;
     an imaging element having a plurality of pixels;
     a setting unit capable of setting the shape, size, and arrangement position of a virtual projection plane of a world coordinate system, the virtual projection plane consisting of a curved surface or of a plurality of connected planes; and
     an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data of the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and a plurality of pixel data obtained by the imaging element receiving light.
  13.  The imaging device according to claim 12, wherein the shape of the virtual projection plane is a cylindrical shape whose cross section is an arc, or a polyhedral mirror shape consisting of a plurality of continuous panels.
  14.  The imaging device according to claim 12 or 13, further comprising:
     an operation unit operated by a user; and
     a display unit,
     wherein the setting unit sets the shape, size, and position of the virtual projection plane in the world coordinate system based on an operation on the operation unit, and
     the display unit displays the image data on the set virtual projection plane.
  15.  The imaging device according to any one of claims 12 to 14, wherein the virtual projection plane set by the setting unit is a plurality of virtual projection planes.
  16.  The imaging device according to any one of claims 12 to 15, wherein the image processing unit calculates the image data by determining, from the incident angle θ of each position on the virtual projection plane with respect to the optical axis of the optical system, the corresponding position on the imaging element surface, and obtaining the image data at each position from the pixel data of the pixel at the corresponding position.
PCT/JP2010/060185 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device WO2011158344A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2012520203A JPWO2011158344A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing apparatus, and imaging apparatus
PCT/JP2010/060185 WO2011158344A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/060185 WO2011158344A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Publications (1)

Publication Number Publication Date
WO2011158344A1 true WO2011158344A1 (en) 2011-12-22

Family

ID=45347766

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/060185 WO2011158344A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Country Status (2)

Country Link
JP (1) JPWO2011158344A1 (en)
WO (1) WO2011158344A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018509799A (en) * 2015-02-17 2018-04-05 コンティ テミック マイクロエレクトロニック ゲゼルシャフト ミット ベシュレンクテル ハフツングConti Temic microelectronic GmbH Method and apparatus for distortion-free display of the vehicle periphery of a vehicle
CN112947885A (en) * 2021-05-14 2021-06-11 深圳精智达技术股份有限公司 Method and device for generating curved surface screen flattening image

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11261868A (en) * 1998-03-13 1999-09-24 Fujitsu Ltd Fisheye lens camera device and image distortion correction method and image extraction method thereof
JP2000083242A (en) * 1993-02-08 2000-03-21 Interactive Pictures Corp Full field still camera and surveillance system
JP2000242773A (en) * 1999-02-19 2000-09-08 Fitto:Kk Image data converting device
JP2006148767A (en) * 2004-11-24 2006-06-08 Canon Inc Video image distribution system, video image distributing apparatus, video image receiving apparatus, communication method for video image distributing apparatus, display method of video image receiving apparatus, program, and storage medium
JP2007228531A (en) * 2006-02-27 2007-09-06 Sony Corp Camera apparatus and monitor system
JP2008052589A (en) * 2006-08-25 2008-03-06 Konica Minolta Holdings Inc Method for correcting distortion of wide angle image
JP2008311890A (en) * 2007-06-14 2008-12-25 Fujitsu General Ltd Image data converter, and camera device provided therewith
WO2009014075A1 (en) * 2007-07-20 2009-01-29 Techwell Japan K.K. Image processing device and camera system
JP2009043060A (en) * 2007-08-09 2009-02-26 Canon Inc Image processing method for performing distortion correction to image data, program, and recording medium


Also Published As

Publication number Publication date
JPWO2011158344A1 (en) 2013-08-15

Similar Documents

Publication Publication Date Title
US9196022B2 (en) Image transformation and multi-view output systems and methods
US8134608B2 (en) Imaging apparatus
EP3438919B1 (en) Image displaying method and head-mounted display apparatus
WO2012060269A1 (en) Image processing method, image processing device, and imaging device
EP3016065B1 (en) Coordinate computation device and method, and image processing device and method
KR101521008B1 (en) Correction method of distortion image obtained by using fisheye lens and image display system implementing thereof
US20130265468A1 (en) Camera, distortion correction device and distortion correction method
JP6253280B2 (en) Imaging apparatus and control method thereof
WO2020207108A1 (en) Image processing method, device and system, and robot
TWI443604B (en) Image correction method and image correction apparatus
JP2004228619A (en) Method of adjusting distortion in video image of projector
JP7012058B2 (en) Image correction method and its device
JP5187480B2 (en) Projector, program, information storage medium, and image generation method
WO2011158344A1 (en) Image processing method, program, image processing device, and imaging device
JP3594225B2 (en) Wide-field camera device
WO2012056982A1 (en) Image processing method, image processing device, and imaging device
JP2009123131A (en) Imaging apparatus
WO2011161746A1 (en) Image processing method, program, image processing device and image capturing device
JP2013005393A (en) Image processing method having wide-angle distortion correction processing, image processing apparatus and imaging apparatus
JP5682473B2 (en) Image processing method having wide-angle distortion correction processing, image processing apparatus, and imaging apparatus
WO2011158343A1 (en) Image processing method, program, image processing device, and imaging device
KR101011704B1 (en) Apparatus and method for processing video signal to generate wide viewing image
WO2009069997A2 (en) Method for image geometric transformation
WO2012060271A1 (en) Image processing method, image processing device, and imaging device
JP4211653B2 (en) Video generation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 10853226
Country of ref document: EP
Kind code of ref document: A1
WWE Wipo information: entry into national phase
Ref document number: 2012520203
Country of ref document: JP
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 10853226
Country of ref document: EP
Kind code of ref document: A1