WO2011158343A1 - Image processing method, program, image processing device and imaging device - Google Patents


Info

Publication number
WO2011158343A1
WO2011158343A1 (PCT/JP2010/060184; JP 2010060184 W)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual projection
projection plane
coordinate system
image processing
pixel
Prior art date
Application number
PCT/JP2010/060184
Other languages
English (en)
Japanese (ja)
Inventor
上田 滋之
央樹 坪井
Original Assignee
コニカミノルタオプト株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタオプト株式会社 filed Critical コニカミノルタオプト株式会社
Priority to JP2012520202A priority Critical patent/JPWO2011158343A1/ja
Priority to PCT/JP2010/060184 priority patent/WO2011158343A1/fr
Publication of WO2011158343A1 publication Critical patent/WO2011158343A1/fr

Classifications

    • G06T3/12

Definitions

  • The present invention relates to an image processing method, a program, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an imaging element via an optical system including a condensing lens.
  • Patent Document 1 discloses a prior-art correction method that uses lens correction parameters to correct distortion arising in an image captured with a short-focal-length lens.
  • In Patent Document 2, in image processing for displaying photographic data captured with a wide-angle lens, correction is performed so that the rate of change of the image height increases when the incident angle is greater than or equal to a predetermined value, and decreases when the incident angle is less than the predetermined value.
  • Image processing for captured images obtained with lenses such as those disclosed in Patent Documents 1 and 2 requires many correction steps, such as shading correction and distortion correction; when the image processing apparatus is implemented in hardware, this leads to longer processing times, a larger circuit scale, and higher cost.
  • The present invention has an object to provide an image processing method, a program, an image processing apparatus, and an imaging apparatus capable of reducing processing time with a relatively small circuit.
  • An image processing method for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the method comprising:
  • a first step of setting the position and size of the virtual projection plane in the world coordinate system based on a user instruction;
  • An image processing method comprising:
  • In the calculation of the image data, a corresponding position on the imaging element surface is calculated from the incident angle θ with respect to the optical axis of the optical system at each pixel position on the virtual projection plane, and the image data of the pixel on the virtual projection plane is obtained from the pixel data of the pixel at the calculated position (the image processing method according to claim 1).
  • An image processing apparatus that obtains image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging device having a plurality of pixels via an optical system, the apparatus comprising an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane, whose position and size are set, into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates the image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • An image processing apparatus that obtains image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging device having a plurality of pixels via an optical system,
  • a setting unit capable of setting the position and size of the virtual projection plane in the world coordinate system;
  • an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates the image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data;
  • An image processing apparatus comprising:
  • The image processing unit calculates a corresponding position on the imaging element surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data of the pixel on the virtual projection plane from the pixel data of the pixel at the calculated position (the image processing apparatus according to item 6 or 7).
  • An image processing apparatus program for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing a computer to function as an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane, whose position and size are set, into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates the image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • An image processing apparatus program for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an image sensor having a plurality of pixels via an optical system, the program causing a computer to function as: a setting unit capable of setting the position and size of the virtual projection plane in the world coordinate system; and an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates the image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • An imaging apparatus comprising: an optical system; an imaging device having a plurality of pixels; a setting unit for setting the position and size of the virtual projection plane in the world coordinate system; and an image processing unit that converts the coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using the distortion correction coefficient of the optical system, and calculates the image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • The imaging apparatus further comprises an operation unit operated by a user and a display unit, and the setting unit sets the position and size of the virtual projection plane in the world coordinate system based on an operation on the operation unit.
  • The image processing unit calculates a corresponding position on the imaging element surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data of the pixel on the virtual projection plane from the pixel data of the pixel at the calculated position (the imaging apparatus according to any one of items 11 to 13).
  • The virtual projection plane set in the world coordinate system is converted into the camera coordinate system using the distortion correction coefficient, and the image data of the virtual projection plane is calculated based on the converted coordinates in the camera coordinate system and the pixel data of the image sensor.
  • FIG. 4A shows a main control flow
  • FIG. 4B shows a subroutine of step S20.
  • This is a diagram explaining the coordinates of the virtual projection plane VP.
  • This is a diagram showing the correspondence between the camera coordinate system xy and the image sensor surface IA.
  • An example in which two virtual projection planes VP are set is shown. This is an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the rotation center.
  • An example in which the virtual projection plane VP0 is rotated by roll is shown.
  • An example in which the virtual projection plane VP0 is pitch-rotated is shown.
  • An example in which the virtual projection plane VP0 is rotated by yaw is shown. This is an example in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the rotation center.
  • An example in which the virtual projection plane VP0 is rotated by a virtual pitch is shown.
  • An example in which the virtual projection plane VP0 is rotated by a virtual yaw is shown.
  • This is an example in which the position of the virtual projection plane VP is changed with the image center o in the camera coordinate system as the movement center.
  • An example in which the virtual projection plane VP0 is offset in the x direction is shown.
  • An example in which the virtual projection plane VP0 is offset in the y direction is shown.
  • FIG. 1 is a schematic diagram for explaining distortion correction according to the present embodiment.
  • X, Y, and Z are the axes of the world coordinate system, and the origin O is the lens center.
  • The Z axis coincides with the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O.
  • Point P is an object point in the world coordinate system XYZ.
  • θ is the incident angle with respect to the optical axis (which coincides with the Z axis).
  • x and y are the axes of the camera coordinate system, and the xy plane corresponds to the image sensor surface IA.
  • o is the image center, the intersection of the optical axis Z and the image sensor surface.
  • The point p is a point on the image sensor surface in the camera coordinate system, obtained by converting the object point P into the camera coordinate system using a distortion correction coefficient based on parameters derived from the lens characteristics (hereinafter referred to as "lens parameters").
  • VP is a virtual projection plane.
  • the virtual projection plane VP is set on the opposite side of the imaging element (and imaging element surface IA) with respect to the lens position (lens center plane LC) of the optical system.
  • the virtual projection plane VP can be changed in position and size based on an instruction from the user to the operation unit 130 (see FIG. 3).
  • The position change is a concept that includes not only the case where the virtual projection plane VP is translated on the XY plane, but also an angle change with respect to the XY plane (also referred to as an attitude change).
  • In the initial state, the virtual projection plane VP is arranged with a predetermined size at a predetermined position (in the Z direction) parallel to the lens center plane LC (the XY directions), and the center ov of the virtual projection plane VP is located on the Z axis.
  • Gv is the point where the object point P is projected onto the virtual projection plane VP; it is the intersection of the straight line passing through the object point P and the lens center O with the virtual projection plane VP.
  • The virtual projection plane VP1 in FIG. 2 shows a state in which the virtual projection plane VP0 has been rotated in the XZ plane based on an input to the operation unit 130.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the imaging apparatus.
  • the imaging apparatus includes an imaging unit 110, a control device 100, a display unit 120, and an operation unit 130.
  • the imaging unit 110 includes a lens, an imaging element, and the like.
  • examples of the lens include a wide-angle lens and a fisheye lens.
  • the control device 100 includes an image processing unit 101, a setting unit 102, and a storage unit 103.
  • the setting unit 102 sets the position and size of the virtual projection plane VP based on an input instruction to the operation unit 130.
  • The image processing unit 101 creates a conversion table from each coordinate on the virtual projection plane to the camera coordinate system based on the set position and size of the virtual projection plane VP, and processes the pixel data captured by the imaging unit 110 using the conversion table to create the image data to be displayed on the display unit 120.
  • The storage unit 103 stores a distortion correction coefficient calculated based on the lens parameters of the lens. It also stores the position and size of the virtual projection plane VP and the created conversion table.
  • the display unit 120 includes a display screen such as a liquid crystal display, and sequentially displays the image data created by the image processing unit 101 based on the pixel data captured by the imaging unit 110 on the display screen.
  • the operation unit 130 includes a keyboard, a mouse, or a touch panel arranged so as to be superimposed on the liquid crystal display of the display unit, and receives a user's input operation.
  • FIG. 4 is a diagram showing a control flow of the present embodiment.
  • FIG. 4A shows a main control flow
  • FIG. 4B shows a subroutine of step S20.
  • In step S10, the virtual projection plane VP is set.
  • As described above, the setting unit 102 sets the position and size of the virtual projection plane VP in the world coordinate system in response to an input instruction to the operation unit 130.
  • By changing the position of the virtual projection plane VP as shown in FIG. 2, the camera viewpoint displayed on the display unit 120 is changed (corresponding to pan and tilt). Further, if the distance from the lens center O is changed along with the position of the virtual projection plane VP, zoom-in/zoom-out is performed. Zooming in and out can also be performed by changing the size of the virtual projection plane VP. A specific example of changing the position of the virtual projection plane VP will be described later.
  • the virtual projection plane VP is divided into the number of pixels n based on the input size setting (or the default size value).
  • the number of pixels is preferably equal to or greater than the total number of display pixels (screen resolution) of the display unit 120.
  • In the following, both the number of pixels n and the total number of display pixels of the display unit 120 are described under a fixed condition of 640 × 480 pixels (307,200 pixels in total).
  • the size of the virtual projection plane VP is fixed when the number of pixels n is fixed.
  • In step S20, distortion correction processing is performed, mainly by the image processing unit 101, based on the state of the virtual projection plane VP set in step S10.
  • the distortion correction process will be described with reference to FIG.
  • In step S21, the coordinates Gv (X, Y, Z) in the world coordinate system are acquired for each pixel Gv on the virtual projection plane VP.
  • FIG. 5 is a schematic diagram for explaining the coordinate system. As shown in FIG. 5, the virtual projection plane VP, whose four corners are point A (0, 0, Za), point B (0, 479, Zb), point C (639, 479, Zc), and point D (639, 0, Zd), is divided at equal intervals into 640 × 480 pixels Gv (307,200 pixels in total), and the coordinates of all the pixels Gv in the world coordinate system are obtained.
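The equal-interval division of the virtual projection plane into pixels Gv can be sketched as bilinear subdivision between the four corners. This is a minimal illustration, not the patent's implementation; the function name, argument order, and the 640 × 480 defaults follow the example of FIG. 5.

```python
def vp_grid(corner_a, corner_b, corner_c, corner_d, width=640, height=480):
    """Divide the virtual projection plane VP into width x height pixels Gv
    at equal intervals, given its four corners A, B, C, D in world
    coordinates (labeled as in FIG. 5: A and B share one edge,
    D and C the opposite edge)."""
    grid = []
    for v in range(height):
        t = v / (height - 1)              # 0 on the A-D edge, 1 on the B-C edge
        row = []
        for u in range(width):
            s = u / (width - 1)           # 0 on the A-B edge, 1 on the D-C edge
            row.append(tuple(
                (1 - s) * (1 - t) * a + (1 - s) * t * b
                + s * t * c + s * (1 - t) * d
                for a, b, c, d in zip(corner_a, corner_b, corner_c, corner_d)))
        grid.append(row)
    return grid
```

For a plane parallel to the lens center plane all four corner Z values are equal and the grid is uniform; tilting the plane (different Za to Zd) changes the world coordinates of each Gv accordingly.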
  • In step S22, the coordinates Gi (x, y) in the camera coordinate system on the image sensor surface IA corresponding to each pixel Gv are calculated from the coordinates of the pixel Gv in the world coordinate system and the distortion correction coefficient of the imaging unit 110 stored in the storage unit 103. Specifically, the distortion correction coefficient calculated from the lens parameters of the optical system is stored in the storage unit 103, and Gi is calculated from this coefficient and the incident angle θ with respect to the optical axis Z obtained from the coordinates of each pixel Gv (reference: International Publication No. 2010/032720).
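The mapping from a world-coordinate pixel Gv to its sensor position Gi can be sketched as below. The equidistant projection model r = f·θ and the values f, cx, cy are illustrative assumptions; the patent instead uses a distortion correction coefficient computed from the actual lens parameters.

```python
import math

def world_to_sensor(gv, f=100.0, cx=320.0, cy=240.0):
    """Map a world-coordinate pixel Gv = (X, Y, Z) on the virtual projection
    plane to camera coordinates Gi = (x', y') on the image sensor surface IA.

    theta is the incident angle with respect to the optical axis Z; the
    image height r = f * theta assumes an equidistant fisheye model, and
    f, cx, cy are illustrative values (not taken from the patent)."""
    X, Y, Z = gv
    rho = math.hypot(X, Y)          # distance of the ray from the optical axis
    if rho == 0.0:
        return (cx, cy)             # an on-axis point maps to the image center o
    theta = math.atan2(rho, Z)      # incident angle theta
    r = f * theta                   # image height on the sensor
    return (cx + r * X / rho, cy + r * Y / rho)
```

Gi is generally non-integer, which is why step S23 must choose, or interpolate among, the sensor pixels to reference.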
  • FIG. 6 is a diagram showing a correspondence relationship between the camera coordinate system xy and the imaging element surface IA.
  • points a to d are obtained by converting the points A to D in FIG. 5 into the camera coordinate system.
  • the virtual projection plane VP surrounded by the points A to D is a rectangular plane.
  • The area surrounded by the points a to d after the coordinate conversion to the camera coordinate system has a distorted shape (corresponding to the position of the virtual projection plane VP).
  • The figure shows an example of barrel-shaped distortion, but depending on the characteristics of the optical system the distortion may instead be of pincushion type or jingasa (mustache) type (a shape that is barrel-like at the center and straight or pincushion-like toward the edges).
  • In step S23, the pixel of the image sensor to be referenced is determined from the coordinates Gi (x′, y′) in the camera coordinate system.
  • The x and y in the coordinates (x, y) of each pixel of the image sensor are integers, whereas x′ and y′ in the coordinates Gi (x′, y′) calculated in step S22 are not necessarily integers and can take real values with a fractional part.
  • When x′ and y′ are integers and the coordinates Gi (x′, y′) coincide with a pixel position of the image sensor, the pixel data of that image sensor pixel is used directly as the pixel data of the pixel Gv on the virtual projection plane. When the calculated coordinates Gi (x′, y′) have a fractional part, the pixel data of the pixel Gv is obtained by interpolation from the pixel data of the peripheral pixels.
  • The peripheral locations are not limited to four; one location, or sixteen or more locations, may also be used.
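Interpolation from four peripheral pixels can be sketched as standard bilinear interpolation, one common choice consistent with the four-location case; `sample_bilinear` is an illustrative helper, not the patent's implementation.

```python
def sample_bilinear(sensor, x, y):
    """Interpolate the pixel data at a non-integer camera coordinate
    (x', y') from the 4 surrounding sensor pixels.

    `sensor` is a 2-D list indexed as sensor[row][col], i.e. sensor[y][x];
    x and y are assumed to lie inside the sensor area."""
    x0, y0 = int(x), int(y)                  # top-left of the 2 x 2 neighborhood
    x1 = min(x0 + 1, len(sensor[0]) - 1)     # clamp at the right/bottom edges
    y1 = min(y0 + 1, len(sensor) - 1)
    fx, fy = x - x0, y - y0                  # fractional parts of x', y'
    top = (1 - fx) * sensor[y0][x0] + fx * sensor[y0][x1]
    bottom = (1 - fx) * sensor[y1][x0] + fx * sensor[y1][x1]
    return (1 - fy) * top + fy * bottom
```

Using one location reduces this to nearest-neighbor sampling; sixteen locations would correspond to a 4 × 4 kernel such as bicubic interpolation.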
  • Steps S21 to S23 are repeated while moving one pixel at a time from the start point A (0, 0, Za) in FIG. 5 through each pixel, up to the lower-right end point C (639, 479, Zc).
  • In this way, image data in which distortion correction has been performed can be acquired for all pixels. This is the control of the subroutine of step S20 shown in FIG. 4B.
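Taken together, steps S21 to S23 amount to the following loop, given here as a self-contained sketch in which an assumed equidistant model r = f·θ and nearest-neighbor sampling stand in for the stored distortion correction coefficients and the interpolation described above; f, cx, cy are illustrative values.

```python
import math

def correct_distortion(sensor, vp_world, f=100.0, cx=2.0, cy=2.0):
    """Distortion-corrected image for a virtual projection plane.

    vp_world[row][col] holds the world coordinates (X, Y, Z) of each pixel
    Gv (step S21); the sensor position Gi is computed from the incident
    angle theta under an assumed equidistant model r = f * theta (step S22);
    the referenced sensor pixel is chosen by nearest neighbor with edge
    clamping (step S23)."""
    h, w = len(sensor), len(sensor[0])
    out = []
    for row in vp_world:
        out_row = []
        for X, Y, Z in row:
            rho = math.hypot(X, Y)
            if rho == 0.0:
                x, y = cx, cy                      # on-axis point
            else:
                theta = math.atan2(rho, Z)         # incident angle
                r = f * theta
                x, y = cx + r * X / rho, cy + r * Y / rho
            xi = min(max(int(round(x)), 0), w - 1)
            yi = min(max(int(round(y)), 0), h - 1)
            out_row.append(sensor[yi][xi])
        out.append(out_row)
    return out
```

In a hardware implementation this per-pixel mapping would typically be precomputed once into the conversion table held in the storage unit 103, so that only table lookups remain in the per-frame path.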
  • In step S40 (corresponding to the "fourth step"), the image data acquired in step S20 is displayed on the display screen of the display unit 120. Steps S20 and S40 are performed sequentially, and the image data after distortion correction based on the captured pixel data is displayed on the display unit 120 in real time.
  • In the present embodiment, by setting the position and size of the virtual projection plane VP, all processes including panning, tilting, and zooming, together with the distortion correction process, can be handled in a single batch process.
  • As a result, the processing becomes lighter, and the processing time can be shortened with a relatively small circuit.
  • ASIC: Application Specific Integrated Circuit
  • FIG. 7 is a schematic diagram for explaining distortion correction according to the second embodiment.
  • In the embodiment described above, a single virtual projection plane is used.
  • However, the number of virtual projection planes is not limited to one and may be two or more.
  • In that case, the image data obtained for the respective virtual projection planes are switched and displayed by time division or by a user instruction, or the display screen is divided so that they are displayed simultaneously.
  • the second embodiment shown in FIG. 7 is an example in which two virtual projection planes are set. Except for the configuration shown in FIG. 7, it is the same as the embodiment described in FIGS.
  • FIG. 7 shows an example in which two virtual projection planes VPh and VPj are set. Both can set the position and size independently.
  • the ranges corresponding to the virtual projection planes VPh and VPj on the image sensor surface IA are the areas h and j
  • the points corresponding to the object points P1 and P2 are the points p1 and p2.
  • Image data is calculated by the flow shown in FIG. 4B for each of the set virtual projection planes VPh and VPj, and each of them is individually displayed on the display unit 120 by the process of step S40 of FIG. 4A. Displayed on the screen.
  • [Specific Example of Changing the Position of the Virtual Projection Plane VP] FIGS. 8 to 11 are examples in which the position of the virtual projection plane VP0 is changed with the image center o of the camera coordinate system as the rotation center (or the movement center). As shown in FIG. 8, rotation about the x axis with the image center o as the rotation center is pitch (also referred to as tilt), rotation about the y axis is yaw (also referred to as pan), and rotation about the Z axis is roll.
  • FIGS. 9, 10, and 11 show examples in which the virtual projection plane VP0 is rotated by roll, pitch, and yaw, respectively, based on the input set value of the rotation amount.
  • Ca0 and Ca1 are virtual cameras
  • cav is the camera viewpoint of the virtual camera Ca1 after the position change. Note that the camera viewpoint cav of the virtual camera Ca0 before the position change coincides with the Z axis.
  • In FIG. 9, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by roll rotation.
  • In this case, the camera viewpoint cav coincides before and after the position change.
  • In FIG. 10, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by pitch rotation, and the camera viewpoint cav moves in the direction of looking up in FIG. 10.
  • In FIG. 11, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by yaw rotation, and the camera viewpoint cav rotates clockwise in FIG. 11.
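These roll, pitch, and yaw position changes can be sketched as standard rotation matrices applied to the world coordinates of each point of VP0. This is an illustrative sketch, not the patent's implementation; the rotation center is taken at the coordinate origin for simplicity, and the axis conventions follow FIG. 8.

```python
import math

def rotate_point(p, axis, angle):
    """Rotate a world point p = (X, Y, Z) by `angle` radians about the
    x axis (pitch / tilt), the y axis (yaw / pan), or the Z axis (roll),
    taking the rotation center at the coordinate origin for simplicity.
    Applying this to every point of VP0 yields the rotated plane VP1."""
    X, Y, Z = p
    c, s = math.cos(angle), math.sin(angle)
    if axis == "x":                       # pitch (tilt)
        return (X, c * Y - s * Z, s * Y + c * Z)
    if axis == "y":                       # yaw (pan)
        return (c * X + s * Z, Y, -s * X + c * Z)
    if axis == "z":                       # roll
        return (c * X - s * Y, s * X + c * Y, Z)
    raise ValueError("axis must be 'x', 'y', or 'z'")
```

Rotating the plane's corners this way and then re-running the grid subdivision and coordinate conversion reproduces the viewpoint changes of FIGS. 9 to 11.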
  • FIGS. 12 to 14 are examples in which the position of the virtual projection plane VP is changed, based on the input rotation-amount setting value, with the center ov of the virtual projection plane VP0 as the rotation center.
  • viewpoint conversion corresponding to rotating or changing the position of the virtual camera Ca0 is performed.
  • One of the two mutually orthogonal axes on the virtual projection plane VP0 is defined as the Yaw-axis and the other as the P-axis. Both axes pass through the center ov; rotation around the Yaw-axis with the center ov as the rotation center is called virtual yaw rotation, and rotation around the P-axis is called virtual pitch rotation.
  • the virtual projection plane VP0 is parallel to the xy plane of the camera coordinate system, and the center ov exists on the Z axis.
  • P-axis is parallel to the x-axis and Yaw-axis is parallel to the y-axis.
  • The virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by virtual pitch rotation; after the position change, the virtual camera Ca1 is positioned upward and the camera viewpoint cav looks down.
  • The virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by virtual yaw rotation; the virtual camera Ca1 and the camera viewpoint cav rotate counterclockwise after the position change.
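Virtual pitch and virtual yaw rotation about the plane's own center ov can be sketched as translate, rotate, translate back. This is an illustrative sketch; the P-axis is taken parallel to the x axis and the Yaw-axis parallel to the y axis, as in the initial state described above.

```python
import math

def rotate_about_center(p, ov, axis, angle):
    """Rotate a world point p about an axis through the VP center ov:
    axis 'p' (parallel to the x axis) gives virtual pitch rotation,
    axis 'yaw' (parallel to the y axis) gives virtual yaw rotation.
    The point is translated so that ov sits at the origin, rotated,
    and translated back."""
    X, Y, Z = (pc - oc for pc, oc in zip(p, ov))
    c, s = math.cos(angle), math.sin(angle)
    if axis == "p":                       # virtual pitch rotation
        X, Y, Z = X, c * Y - s * Z, s * Y + c * Z
    elif axis == "yaw":                   # virtual yaw rotation
        X, Y, Z = c * X + s * Z, Y, -s * X + c * Z
    else:
        raise ValueError("axis must be 'p' or 'yaw'")
    return (X + ov[0], Y + ov[1], Z + ov[2])
```

Because the center ov is on the rotation axis, it stays fixed: the plane pivots in place, which corresponds to rotating the virtual camera Ca0 rather than sweeping the plane around the image center o.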
  • FIGS. 15 to 18 are examples in which the position of the virtual projection plane VP is changed with the image center o in the camera coordinate system as the movement center.
  • viewpoint conversion corresponding to the parallel movement of the virtual camera Ca0 together with the virtual projection plane VP is performed.
  • FIGS. 16, 17, and 18 show examples in which the virtual projection plane VP0 is offset (translated) in the X, Y, and Z directions, respectively, based on the input setting value of the offset movement amount.
  • the offset movement in the Z direction is the same movement as zooming in and zooming out.
  • The offset movement in each direction is effective for moving a dark part (see the example described later), which lies outside the imaging area of the optical system, out of the image area.
  • FIGS. 19 to 27 are examples of display images displayed on the display unit 120.
  • FIG. 19 shows an example of a distorted image when the distortion correction processing is not performed.
  • FIGS. 20 to 27 are examples of display images that have been subjected to distortion correction processing.
  • 26 and 27 are examples in which a plurality of virtual projection planes VP are set and images corresponding to the respective virtual projection planes VP are displayed on the display unit 120.
  • FIG. 20 shows an example in which the virtual projection plane VP is set parallel to the lens center plane LC, and the center of the virtual projection plane VP is substantially coincident with the optical axis.
  • the display image corresponds to the virtual projection plane VP0 in the initial state.
  • FIG. 21 shows an example in which the virtual projection plane VP0 is offset in the Z direction.
  • FIG. 22 shows the position of the virtual projection plane VP0 rotated by yaw (corresponding to FIG. 11).
  • the right end is outside the imaging area and is therefore a dark part.
  • FIG. 23 shows the position of the virtual projection plane VP0 rotated by pitch (corresponding to FIG. 10). Also in the figure, a dark portion is generated at the lower end.
  • FIG. 24 shows the virtual projection plane VP0 rotated by roll (corresponding to FIG. 9).
  • FIG. 25 shows the position of the virtual projection plane VP0 rotated by a virtual yaw (see FIG. 14; however, the direction of rotation is opposite to that in FIG. 14). Also in the figure, a dark part is generated at the right end.
  • FIG. 26 is an example in which two virtual projection planes VP are set
  • FIG. 27 is an example in which three virtual projection planes VP are set; the display images corresponding to the respective virtual projection planes VP are displayed on a divided screen of the display unit 120.
  • each of the plurality of virtual projection planes VP can be rotated and moved independently.
  • The center image and the two images on both sides differ in the offset amount in the Z direction, and the center image is enlarged.
  • the center image and the images on both sides overlap a part of the shooting area.
  • In the above description, an optical system including a single lens has been exemplified as the optical system including the condensing lens.
  • However, the condensing lens may be composed of a plurality of lenses.
  • An optical element other than the condensing lens may also be provided; the present invention is applicable in such cases as well.
  • In such cases, the distortion correction coefficient may be a value determined for the optical system as a whole.

Abstract

In order to reduce distortion-correction processing time with a relatively small circuit, the disclosed image processing method converts a virtual projection plane set in a world coordinate system into a camera coordinate system using distortion correction coefficients, and calculates the image data of the virtual projection plane on the basis of the coordinates in the converted camera coordinate system and the pixel data of an imaging element.
PCT/JP2010/060184 2010-06-16 2010-06-16 Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif d'imagerie WO2011158343A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2012520202A JPWO2011158343A1 (ja) 2010-06-16 2010-06-16 画像処理方法、プログラム、画像処理装置及び撮像装置
PCT/JP2010/060184 WO2011158343A1 (fr) 2010-06-16 2010-06-16 Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif d'imagerie

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/060184 WO2011158343A1 (fr) 2010-06-16 2010-06-16 Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif d'imagerie

Publications (1)

Publication Number Publication Date
WO2011158343A1 true WO2011158343A1 (fr) 2011-12-22

Family

ID=45347765

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/060184 WO2011158343A1 (fr) 2010-06-16 2010-06-16 Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif d'imagerie

Country Status (2)

Country Link
JP (1) JPWO2011158343A1 (fr)
WO (1) WO2011158343A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013094588A1 (fr) * 2011-12-19 2013-06-27 大日本印刷株式会社 Dispositif de traitement d'images, procédé de traitement d'images, programme pour dispositif de traitement d'images, support à mémoire et dispositif d'affichage d'images

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11261868A (ja) * 1998-03-13 1999-09-24 Fujitsu Ltd 魚眼レンズカメラ装置及びその画像歪み補正方法及び画像抽出方法
JP2000083242A (ja) * 1993-02-08 2000-03-21 Interactive Pictures Corp 全視野静止カメラとサ―ベイランスシステム
JP2000242773A (ja) * 1999-02-19 2000-09-08 Fitto:Kk 画像データ変換装置
JP2006148767A (ja) * 2004-11-24 2006-06-08 Canon Inc 映像配信システム、映像配信装置、映像受信装置、映像配信装置の通信方法、映像受信装置の表示方法、プログラム、及び記憶媒体
JP2007228531A (ja) * 2006-02-27 2007-09-06 Sony Corp カメラ装置及び監視システム
JP2008052589A (ja) * 2006-08-25 2008-03-06 Konica Minolta Holdings Inc 広角画像の歪み補正方法
JP2008311890A (ja) * 2007-06-14 2008-12-25 Fujitsu General Ltd 画像データ変換装置、及びこれを備えたカメラ装置
WO2009014075A1 (fr) * 2007-07-20 2009-01-29 Techwell Japan K.K. Dispositif de traitement d'image et système de caméra
JP2009043060A (ja) * 2007-08-09 2009-02-26 Canon Inc 画像データに歪曲収差補正を施す画像処理方法、プログラム、および、記録媒体

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000083242A (ja) * 1993-02-08 2000-03-21 Interactive Pictures Corp 全視野静止カメラとサ―ベイランスシステム
JPH11261868A (ja) * 1998-03-13 1999-09-24 Fujitsu Ltd 魚眼レンズカメラ装置及びその画像歪み補正方法及び画像抽出方法
JP2000242773A (ja) * 1999-02-19 2000-09-08 Fitto:Kk 画像データ変換装置
JP2006148767A (ja) * 2004-11-24 2006-06-08 Canon Inc 映像配信システム、映像配信装置、映像受信装置、映像配信装置の通信方法、映像受信装置の表示方法、プログラム、及び記憶媒体
JP2007228531A (ja) * 2006-02-27 2007-09-06 Sony Corp カメラ装置及び監視システム
JP2008052589A (ja) * 2006-08-25 2008-03-06 Konica Minolta Holdings Inc 広角画像の歪み補正方法
JP2008311890A (ja) * 2007-06-14 2008-12-25 Fujitsu General Ltd 画像データ変換装置、及びこれを備えたカメラ装置
WO2009014075A1 (fr) * 2007-07-20 2009-01-29 Techwell Japan K.K. Dispositif de traitement d'image et système de caméra
JP2009043060A (ja) * 2007-08-09 2009-02-26 Canon Inc 画像データに歪曲収差補正を施す画像処理方法、プログラム、および、記録媒体

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013094588A1 (fr) * 2011-12-19 2013-06-27 大日本印刷株式会社 Dispositif de traitement d'images, procédé de traitement d'images, programme pour dispositif de traitement d'images, support à mémoire et dispositif d'affichage d'images
US9269124B2 (en) 2011-12-19 2016-02-23 Dai Nippon Printing Co., Ltd. Image processing device, image processing method, program for image processing device, recording medium, and image display device

Also Published As

Publication number Publication date
JPWO2011158343A1 (ja) 2013-08-15

Similar Documents

Publication Publication Date Title
US9196022B2 (en) Image transformation and multi-view output systems and methods
JP6960238B2 (ja) 像ブレ補正装置及びその制御方法、プログラム、記憶媒体
CN107770433B (zh) 影像获取装置及其影像平顺缩放方法
US8134608B2 (en) Imaging apparatus
EP3438919B1 (fr) Procédé d'affichage d'image et appareil d'affichage monté sur la tête
TWI393072B (zh) 廣角感測器陣列模組及其影像校正方法、運作方法與應用
US20130300875A1 (en) Correction of image distortion in ir imaging
JP6253280B2 (ja) 撮像装置およびその制御方法
JP6071545B2 (ja) 撮像装置、画像処理装置及びその制御方法、プログラム、記憶媒体
US8913162B2 (en) Image processing method, image processing apparatus and image capturing apparatus
WO2020207108A1 (fr) Procédé de traitement d'image, dispositif et système, et robot
WO2021085246A1 (fr) Dispositif de support de formation d'image, système de support de formation d'image, système de formation d'image, procédé de support de formation d'image et programme
JP6236908B2 (ja) 撮像装置、撮像システムおよび撮像方法
WO2011161746A1 (fr) Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif de capture d'image
WO2012056982A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et dispositif d'imagerie
JP2013005393A (ja) 広角歪補正処理を有する画像処理方法、画像処理装置及び撮像装置
JP5393877B2 (ja) 撮像装置および集積回路
WO2011158344A1 (fr) Procédé de traitement d'images, programme, dispositif de traitement d'images et dispositif d'imagerie
JP2009123131A (ja) 撮像装置
JP5682473B2 (ja) 広角歪補正処理を有する画像処理方法、画像処理装置及び撮像装置
WO2011158343A1 (fr) Procédé de traitement d'image, programme, dispositif de traitement d'image et dispositif d'imagerie
JP2002112094A (ja) パノラマプロジェクタ装置およびパノラマ撮影装置
JP2013005392A (ja) 広角歪補正処理を有する画像処理方法、画像処理装置及び撮像装置
WO2012060271A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et dispositif de formation d'image
JP4175832B2 (ja) 広角画像生成装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10853225

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012520202

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10853225

Country of ref document: EP

Kind code of ref document: A1