WO2012081400A1 - Image processing method, image processing device, and image capture device - Google Patents


Info

Publication number
WO2012081400A1
Authority
WO
WIPO (PCT)
Prior art keywords
distortion
distortion coefficient
image processing
image
image data
Prior art date
Application number
PCT/JP2011/077618
Other languages
English (en)
Japanese (ja)
Inventor
恭 河邊
Original Assignee
Konica Minolta Opto, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Opto, Inc.
Priority to JP2012548720A (published as JPWO2012081400A1)
Publication of WO2012081400A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387 Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/3877 Image rotation
    • H04N1/3878 Skew detection or correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof

Definitions

  • The present invention relates to an image processing method, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an imaging element via an optical system including a condenser lens.
  • Patent Document 1 discloses a prior-art correction method that uses lens correction parameters to correct the distortion arising in an image captured with a short-focal-length lens.
  • In Patent Document 2, an external information processing device must be used to calculate, by interpolation, the optical distortion correction parameter for each lens position from the wide end to the tele end of the optical zoom mechanism.
  • Because optical distortion correction parameters are held only for discrete lens positions within the optical zoom range, the lens position during zooming is limited to the positions that have a parameter.
  • The optical zoom between those positions is bridged by electronic zoom.
  • A video processing apparatus is also disclosed that switches distortion correction according to the angle of view selected by an angle-of-view switching means: distortion correction is performed for the wide-angle view and is not performed for other angles of view.
  • In Patent Document 1, when the correction of an image captured through the lens is implemented in hardware as an image processing device, the processing time becomes long, the circuit scale grows, and the cost increases.
  • In Patent Document 2, the lens position during zooming is limited to positions corresponding to the discrete distortion correction parameters, interpolation of the parameters is avoided by connecting those positions with electronic zoom, and the zoom operation is thereby realized by the imaging device alone.
  • However, this approach applies only to one-dimensional lens movement such as a zoom operation and is difficult to apply to other movements such as panning and tilting.
  • An image after distortion correction processing has a narrow viewing angle, so it is difficult to take in a wide area at once.
  • An image without distortion correction processing has a wide viewing angle, but the subject is distorted, making distance and size difficult to judge.
  • The present invention therefore aims to provide an image processing method, an image processing apparatus, and an imaging apparatus that enable a subject to be recognized accurately while shortening the processing time with a relatively small circuit.
  • A first step of setting the position and size of a virtual projection plane; a second step of converting the world-coordinate-system coordinates of each pixel of the virtual projection plane set in the first step into the camera coordinate system using a distortion coefficient, and calculating image data of the virtual projection plane based on the converted camera-coordinate positions and the plurality of pixel data; and a third step of displaying a display screen based on the image data calculated in the second step.
  • In the second step, one or more distortion coefficients are set, with distortion correction rates within the range of 100% to 0%, between a first distortion coefficient that corrects the distortion caused by the optical system and a second distortion coefficient that applies no correction, and a plurality of image data are calculated using those distortion coefficients together with the second distortion coefficient and displayed in respective regions obtained by dividing the display screen.
  • The number of divisions of the display screen is two; the image data of one of the two areas is calculated using the second distortion coefficient, and the image data of the other area is calculated using a distortion coefficient obtained by interpolation from the second distortion coefficient to the first distortion coefficient and changed continuously from the second to the first.
  • The number of divisions of the display screen is two; the screen ratio of one of the two areas is initially set higher than that of the other, and the screen ratio of the two regions obtained by dividing the display screen is then changed continuously by enlarging the other region (the image processing method according to claim 1 or 2 above).
  • The world-coordinate-system coordinates of each pixel of each of a plurality of virtual projection planes set in the first step are converted into the camera coordinate system using a distortion coefficient, and image data of each virtual projection plane is calculated based on the converted camera-coordinate positions and the plurality of pixel data (second step); the plurality of image data calculated in the second step are displayed in respective regions obtained by dividing the display screen (third step); and, based on a user's selection of any one of the regions displayed in the third step, a fourth step continuously increases the screen ratio of the selected area relative to the whole display screen, or continuously changes the position and size of the virtual projection plane corresponding to the selected area.
  • An image processing apparatus that obtains image data processed using a plurality of pixel data obtained by light reception of an image sensor having a plurality of pixels via an optical system, comprising: a storage unit for storing distortion coefficients; an image processing unit that converts the world-coordinate-system coordinates of each pixel of a virtual projection plane, whose position and size are set, into the camera coordinate system using a distortion coefficient stored in the storage unit, and calculates image data of the virtual projection plane based on the converted camera-coordinate positions and the plurality of pixel data; and an image signal output unit that outputs an image signal for a display screen of the image data calculated by the image processing unit.
  • In the image processing unit, one or more distortion coefficients are set, with distortion correction rates within the range of 100% to 0%, between a first distortion coefficient that corrects the distortion caused by the optical system and a second distortion coefficient that applies no correction, and a plurality of image data calculated using those distortion coefficients together with the second distortion coefficient are displayed in respective regions obtained by dividing the display screen.
  • The number of divisions of the display screen is two; the image data of one of the two areas is calculated using the second distortion coefficient, and the image data of the other area is calculated using a distortion coefficient obtained by interpolation from the second distortion coefficient to the first distortion coefficient and changed continuously from the second to the first.
  • The number of divisions of the display screen is two, and the screen ratio of one of the two areas is initially set higher than the screen ratio of the other area (the image processing apparatus according to either of the above).
  • There are a plurality of virtual projection planes, and the image processing unit calculates a plurality of image data by applying a distortion coefficient with a different distortion correction rate to each of the plurality of virtual projection planes (the image processing apparatus according to any one of the above items).
  • An image processing apparatus that obtains image data processed using a plurality of pixel data obtained by light reception of an image sensor having a plurality of pixels via an optical system, comprising: a storage unit for storing distortion coefficients; an image processing unit that converts the world-coordinate-system coordinates of each pixel of a plurality of virtual projection planes, whose positions and sizes are set, into the camera coordinate system using the distortion coefficients stored in the storage unit, and calculates image data of each virtual projection plane based on the converted camera-coordinate positions and the plurality of pixel data; and an image signal output unit that divides a display region into a plurality of regions and outputs an image signal for a display screen in which the plurality of image data calculated by the image processing unit are arranged in the divided regions.
  • Based on a user's selection of any one of the plurality of regions, the image processing unit continuously increases the screen ratio of the selected area relative to the whole display screen, or continuously changes the position and size of the virtual projection plane corresponding to the selected area.
  • An imaging apparatus comprising: an optical system; an imaging device having a plurality of pixels; a storage unit for storing a distortion coefficient of the optical system; an image processing unit that converts the world-coordinate-system coordinates of each pixel of a virtual projection plane, whose position and size are set, into the camera coordinate system using the distortion coefficient of the optical system, and calculates image data of the virtual projection plane based on the converted camera-coordinate positions and the plurality of pixel data obtained by light reception of the imaging device; an image signal output unit that outputs an image signal for a display screen of the image data calculated by the image processing unit; and a display unit for displaying the image signal.
  • In the image processing unit, one or more distortion coefficients are set, with distortion correction rates within the range of 100% to 0%, between a first distortion coefficient that corrects the distortion caused by the optical system and a second distortion coefficient that applies no correction, and a plurality of image data calculated using those distortion coefficients together with the second distortion coefficient are displayed in respective regions obtained by dividing the display screen.
  • The number of divisions of the display screen is two; the image data of one of the two areas is calculated using the second distortion coefficient, and the image data of the other area is calculated using a distortion coefficient obtained by interpolation from the second distortion coefficient to the first distortion coefficient and changed continuously from the second to the first.
  • The number of divisions of the display screen is two, and the screen ratio of one of the two areas is initially set higher than the screen ratio of the other area (the imaging apparatus according to either of the above).
  • There are a plurality of virtual projection planes, and the image processing unit calculates a plurality of image data by applying a distortion coefficient with a different distortion correction rate to each of the plurality of virtual projection planes (the imaging device according to any one of the above items).
  • An imaging apparatus comprising: an optical system; an imaging device having a plurality of pixels; a storage unit for storing a distortion coefficient of the optical system; an image processing unit that converts the world-coordinate-system coordinates of each pixel of a plurality of virtual projection planes, whose positions and sizes are set, into the camera coordinate system using the distortion coefficient of the optical system, and calculates image data of each virtual projection plane based on the converted camera-coordinate positions and the plurality of pixel data obtained by light reception of the imaging device; an image signal output unit that divides a display region into a plurality of regions and outputs an image signal for a display screen in which the plurality of image data calculated by the image processing unit are arranged in the divided regions; and a display unit for displaying the image signal.
  • Based on a user's selection of any one of the plurality of regions displayed on the display unit, the image processing unit continuously increases the screen ratio of the selected region relative to the whole display screen, or continuously changes the position and size of the virtual projection plane corresponding to the selected region.
  • By displaying, in respective regions obtained by dividing the display screen, a plurality of image data calculated using one or more distortion coefficients set within the range from the first distortion coefficient to the second distortion coefficient (which applies no correction) together with the second distortion coefficient, the subject can be recognized accurately; and by calculating the distortion-corrected image data using a virtual projection plane, the processing time can be shortened with a relatively small circuit.
  • FIG. 1 is a schematic diagram explaining the distortion correction according to this embodiment.
  • FIG. 2 shows an example in which the position of the virtual projection plane VP is moved.
  • (a) shows the setting of the screen ratio of screen 2 with respect to the whole display screen,
  • and (b) shows the screen for setting the slide-in direction of screen 2.
  • FIG. 8 is a schematic diagram showing the changes (a), (b), (c), and (d) of the display screen displayed on the display unit 120.
  • FIGS. 9 and 10 are time charts explaining the switching of the RAM 109 referred to when the screen ratio is changed from 0% to 50%, and when the screen ratio is fixed at 50%, respectively.
  • One figure is a display example of the initial screen in which the screen ratio of screen 2 is 0%; another shows an example in which the screen ratio of screen 2 is changed in 10% steps from 10% to 100%.
  • (a) and (b) show the control flow in the second embodiment.
  • (a) and (b) are examples of the input screen of the user operation unit 130.
  • One example increases the screen ratio of screen A from 25% to 100% (b1 to b5).
  • (c1) to (c3) show an example in which the upper-left image (screen A) is moved to the right, and (d1) to (d3) an example in which it is rotated to the right.
  • FIG. 1 is a schematic diagram for explaining distortion correction according to the present embodiment.
  • X, Y, and Z are the axes of the world coordinate system, and the origin O is the lens center.
  • The Z axis coincides with the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O.
  • Point P is an object point of the subject in the world coordinate system XYZ.
  • θ is the incident angle with respect to the optical axis (which coincides with the Z axis).
  • x and y are the axes of the camera coordinate system, and the xy plane corresponds to the image sensor surface IA.
  • o is the optical center, the intersection of the optical axis Z with the image sensor surface.
  • Point p is the point on the image sensor surface, in the camera coordinate system, to which the object point P is converted using an LUT (also referred to as a "distortion coefficient") generated from parameters based on the physical characteristics of the lens (hereinafter, "lens parameters").
  • VP denotes a virtual projection plane.
  • The virtual projection plane VP is set on the opposite side of the imaging element (imaging element surface IA) with respect to the lens position (lens center plane LC) of the optical system.
  • The virtual projection plane VP can be moved and resized based on instructions given by the user via the operation unit 130 (see FIG. 3).
  • A position change includes not only translation of the virtual projection plane VP in the XY plane but also a change of angle (also referred to as an attitude change) with respect to the XY plane.
  • Initially, the virtual projection plane VP is arranged with a predetermined size at a predetermined position (in the Z direction) parallel to the lens center plane LC (XY directions), with its center ov located on the Z axis.
  • Gv is the point at which the object point P is projected onto the virtual projection plane VP, i.e., the intersection of the straight line passing through the object point P and the lens center O with the virtual projection plane VP.
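The projection of an object point onto the virtual projection plane can be sketched as a ray-plane intersection, with the ray running from the lens center O (the world origin) through P. The following is a minimal illustration, not the patent's implementation; the helper name and the plane representation (a point on the plane plus its normal, both in world coordinates) are assumptions.

```python
import numpy as np

def project_to_virtual_plane(P, plane_point, plane_normal):
    """Intersect the ray from the lens center O (origin) through object
    point P with the virtual projection plane VP, returning Gv."""
    P = np.asarray(P, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = P @ n
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the virtual projection plane")
    t = (np.asarray(plane_point, dtype=float) @ n) / denom
    return t * P  # Gv: the intersection point on VP

# Example matching the initial configuration: VP parallel to the XY
# plane (lens center plane LC), here placed at Z = 2.
Gv = project_to_virtual_plane(P=(1.0, 0.5, 4.0),
                              plane_point=(0.0, 0.0, 2.0),
                              plane_normal=(0.0, 0.0, 1.0))
```

Because the ray is parameterized as t·P, tilting the plane (an attitude change of VP) only changes `plane_point` and `plane_normal`; the same intersection formula covers translated and rotated planes.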
  • The virtual projection plane VP1 in FIG. 2 shows a state in which the virtual projection plane VP0 has been rotated in the XZ plane based on input from the operation unit 130.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the imaging apparatus.
  • the imaging apparatus includes an imaging unit 110, an image processing unit 100, a display unit 120, a user operation unit 130, and a synchronization signal generation circuit 150.
  • the imaging unit 110 includes a short-focus lens, an imaging element, and the like.
  • examples of the lens include a wide-angle lens and a fisheye lens.
  • Based on input instructions to the user operation unit 130, the image processing unit 100 sets the position and size of the virtual projection plane VP, the distortion correction strength, the change speed and upper and lower limits of the distortion correction, and the display screen settings.
  • the image processing unit also functions as an image signal output unit that outputs an image signal for display.
  • The synchronization signal generation circuit 150 generates a horizontal synchronization signal (Hsy), a vertical synchronization signal (Vsy), and a reference clock (clk); these signals are supplied to the imaging unit 110 and to the distortion correction processing unit 101, distortion correction parameter generation unit 102, LUT generation unit 103, and video memory 104 of the image processing unit 100.
  • The viewpoint of the camera can be changed by panning, tilting, zooming in, and zooming out.
  • These settings are converted by the distortion correction parameter generation unit 102 and sent to the distortion correction processing unit 101 as distortion correction parameters.
  • The setting information for the distortion correction change speed and change width is sent to the LUT generation unit 103, and the LUT is generated from this information and the lens parameter information stored in the ROM 108.
  • The ROM 108 stores at least two lens parameters: one based on the physical characteristics of the actual lens of the imaging unit 110, and one for the case where distortion correction is not performed (corresponding to a pinhole lens).
  • The LUTs generated from these lens parameters are the first LUT and the second LUT (also referred to as the "first distortion coefficient" and "second distortion coefficient", respectively).
  • the LUT generated by the LUT generation unit 103 is written into the RAM 109a or the RAM 109b (hereinafter collectively referred to as the RAM 109). Writing is performed at a timing synchronized with the vertical synchronization signal (Vsy) sent from the synchronization signal generation circuit 150.
  • Vsy vertical synchronization signal
  • The distortion correction processing unit 101 performs distortion correction processing on the input image from the imaging unit, using the distortion correction parameters sent from the distortion correction parameter generation unit 102 and the LUT stored in the RAM 109, and generates an output image.
  • the output image is written in the video memory 104 and sent to the display unit 120 frame by frame in accordance with the synchronization signal from the synchronization signal generation circuit 150.
  • the display unit 120 includes a display screen such as a liquid crystal display, and sequentially displays the output image sent on the display screen.
  • the user operation unit 130 includes a keyboard, a mouse, or a touch panel arranged on a liquid crystal display for display, and accepts a user input operation.
  • FIG. 4 is a diagram showing the control flow of the present embodiment; the distortion correction processing is described below with reference to the figures.
  • In step S10, the distortion correction parameters are set.
  • The distortion correction parameters include the lens parameter setting and the position setting of the virtual projection plane VP; these settings are performed by the distortion correction parameter generation unit 102 in response to input from the user operation unit 130.
  • FIG. 5 is an example of an input screen of the user operation unit 130.
  • the initial and final distortion coefficients are set for each of the two screens.
  • The screen appears to change continuously from the initial state to the final state (except when the initial and final settings are identical); the final state ends the continuous interval. Immediately after the final state, the display returns to the initial state, so the transition from final back to initial appears discontinuous. The change from initial to final is performed over the cycle set on the setting screen described later, and this is repeated.
  • The initial and final distortion coefficients are set by specifying one of the areas a51, whereupon a numerical input screen (not shown) is displayed.
  • A distortion correction rate of 0% corresponds to the second distortion coefficient, with which no distortion correction is performed.
  • A correction rate of 100% corresponds to the first distortion coefficient, with which distortion correction is performed.
  • A distortion coefficient for a correction rate between 100% and 0% is calculated between the first and second distortion coefficients by interpolation such as linear interpolation.
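As an illustration of the interpolation just described, a distortion coefficient for an intermediate correction rate can be formed as a linear blend of the first and second coefficients. This is a sketch only: the function name, the representation of each coefficient as an image-height LUT sampled over the same incident angles, and the sample values are all assumptions, not the patent's data.

```python
import numpy as np

def interpolated_lut(first_lut, second_lut, correction_rate):
    """Linearly interpolate between the first distortion coefficient
    (full correction, rate = 100%) and the second distortion
    coefficient (no correction, rate = 0%)."""
    a = correction_rate / 100.0
    return a * np.asarray(first_lut) + (1.0 - a) * np.asarray(second_lut)

first = np.array([0.0, 0.9, 1.7])    # corrected image heights (invented values)
second = np.array([0.0, 1.0, 2.0])   # uncorrected (pinhole-like) image heights
mid = interpolated_lut(first, second, 50)   # LUT for a 50% correction rate
```

Sweeping `correction_rate` from 0 toward 100 frame by frame yields the continuous change from the second distortion coefficient to the first described in the claims.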
  • The time (cycle) from the initial state to the final state can be set in area a52 shown in FIG. 5; in the example in the figure it is set to 5 sec, so the change from initial to final is repeated with a period of 5 sec.
  • Another example of the input screen is shown in FIG. 6.
  • In this example, the distortion coefficient can be set in area a51 and the viewpoint conversion in area a54.
  • "Front" places the virtual projection plane VP at the initial position shown in FIG. 1; "downward 45 deg" is a pitch, rotating the position of the virtual projection plane VP 45 degrees downward about the x axis, with the image center o (see FIG. 1) as the rotation center (or movement center).
  • The viewpoint thus changes from the initial front view to a final overhead viewpoint looking down.
  • Two virtual projection planes VP are set; only in the initial state do they have the same position and size, after which the position of the virtual projection plane VP corresponding to screen 2 gradually moves.
  • In step S11, the display screen is set.
  • FIGS. 7A and 7B show a screen for setting the screen ratio of screen 2 with respect to the whole display screen and a screen for setting the slide-in direction of screen 2.
  • In FIG. 7A, the third item from the top is selected, indicating that the screen ratio of screen 2 is changed from an initial 0% to a final 50% over the cycle set in step S10.
  • In FIG. 7B, the upward arrow is selected; with this setting, screen 2 slides in upward from the lower end of the display screen.
  • When screen 2 occupies a fixed screen ratio of 25% or 50% from the initial stage, the setting screen in FIG. 7B functions as a screen for setting the initial arrangement position of screen 2.
  • FIG. 8 is a schematic diagram showing changes in the display screen displayed on the display unit 120 in the settings of FIGS. 7A and 7B.
  • FIG. 8A is the initial display screen, and FIG. 8D shows the display screen at the end.
  • FIGS. 8B and 8C show the display screen when the screen ratio of screen 2 is 10% and 30%, respectively, during the change from the initial state to the final state.
  • step S21 an image signal is input from the imaging unit 110, and an input image is obtained at a frame rate of 30 fps, for example.
  • step S22 the coordinates of each pixel on the virtual projection plane VP are converted from the world coordinate system to the camera coordinate system (w2c conversion).
  • The conversion determines, based on the distortion coefficient, which camera-coordinate position each world-coordinate position corresponds to.
  • Together with this position calculation, distortion correction and viewpoint transformations of the virtual projection plane VP such as rotation, translation, enlargement, and reduction are performed.
  • By the setting in step S10, two distortion coefficients are set: the second distortion coefficient and a distortion coefficient interpolated between the second distortion coefficient and the first distortion coefficient.
  • In step S22, coordinate conversion is performed in accordance with the set distortion coefficients.
  • The two or more distortion coefficients used in step S22 are obtained by referring to the distortion coefficients stored in the RAMs 109a and 109b.
  • the distortion coefficient stored in the RAM 109b is written and updated for each frame.
  • The vertical synchronization signal Vsy is the synchronization signal from the synchronization signal generation circuit 150, and each frame of the image signal VD is indicated by a frame number (f1, f2, etc.).
  • the period of the vertical synchronization signal Vsy is, for example, 30 Hz or 60 Hz. In the former case, image data is processed at a frame rate of 30 fps.
  • FIG. 9 shows an example in which the screen ratio of screen 2 is changed from 0% to 50% at a rate of 10% per frame; f0 to f5 correspond to screen ratios of 0% to 50%.
  • FIG. 10 shows an example in which the screen ratio is fixed at 50%; all of f0 to f5 have a screen ratio of 50%.
  • The example shown in the figure is deliberately extreme for ease of illustration; in practice, the ratio is changed at a speed of about 1% per 3 frames (reaching 50% in 5 sec).
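The frame-by-frame change of the screen ratio can be sketched as a simple linear schedule; at 30 fps, reaching 50% in 5 sec corresponds to the roughly 1% per 3 frames mentioned above. The function name and default values are illustrative assumptions.

```python
def screen_ratio_schedule(target_pct=50, seconds=5, fps=30):
    """Yield the screen ratio of screen 2 for each frame, rising
    linearly from 0% to target_pct over the given period (sketch)."""
    total_frames = seconds * fps
    for f in range(total_frames + 1):
        yield target_pct * f / total_frames

ratios = list(screen_ratio_schedule())
# 150 frames from 0% to 50%, i.e. 1% every 3 frames at 30 fps
```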
  • LutSel is a switch for selecting the RAM to be read.
  • Depending on the state of LutSel, either the RAM 109a or the RAM 109b is selected for reading.
  • The RAM 109a holds a fixed distortion coefficient (for example, the second distortion coefficient), while the distortion coefficient in the RAM 109b (for example, from the second distortion coefficient toward the first) is changed for each frame.
  • The dark shaded areas wr indicate writing periods,
  • and the light shaded areas rd indicate reading periods.
  • a delay is added to the synchronization signal from the synchronization signal generation circuit 150 using the reference clock (clk).
  • The LUT in the RAM 109b is rewritten during the V-blanking period (Vblank), when the vertical synchronization signal Vsy is at L level, so the LUT is updated every frame. The image processing unit 100 then generates the image data of screen 1 (SC1) using the distortion coefficient read from the RAM 109a, and the image data of screen 2 (SC2) using the distortion coefficient read from the RAM 109b.
  • a configuration may be provided that includes a number of RAMs corresponding to the upper limit number of distortion coefficients.
  • the RAM may be divided into a plurality of fixed memory areas so as to correspond to a plurality of distortion coefficients.
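The RAM 109a/109b arrangement behaves like a double buffer: a fixed LUT is read for screen 1 while the LUT for screen 2 is rewritten during vertical blanking and read in the following frame. The class below is a software sketch of that behavior only; the names are assumptions and no hardware timing is modeled.

```python
class LutDoubleBuffer:
    """Sketch of the RAM 109a/109b scheme: screen 1 reads a fixed LUT,
    screen 2 reads an LUT rewritten each frame during V-blanking."""

    def __init__(self, fixed_lut, initial_lut):
        self.fixed = fixed_lut        # RAM 109a: e.g. the second distortion coefficient
        self.updating = initial_lut   # RAM 109b: changes frame by frame

    def on_vblank(self, new_lut):
        # Write the next LUT while no pixels are being read.
        self.updating = new_lut

    def lut_for(self, screen):
        # LutSel: choose which RAM the read comes from.
        return self.fixed if screen == 1 else self.updating

buf = LutDoubleBuffer(fixed_lut="second", initial_lut="second")
buf.on_vblank("interpolated-10pct")   # rewritten during V-blanking
lut1 = buf.lut_for(1)   # screen 1 keeps the fixed coefficient
lut2 = buf.lut_for(2)   # screen 2 sees the updated coefficient
```

With three or more coefficients, the same idea generalizes to one buffer (or one fixed memory area) per coefficient, as the text notes.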
  • FIG. 11 is a schematic diagram illustrating a coordinate system.
  • In the world coordinate system, the plane G of the virtual projection plane VP bounded by point A (0, 0, Za), point B (0, 479, Zb), point C (639, 479, Zc), and point D (639, 0, Zd) is divided at equal intervals into 640 × 480 pixels Gv (total number of pixels: 307,200), and the world-coordinate-system coordinates of all the pixels Gv are acquired. Note that the X and Y coordinate values in the figure are examples, and the X and Y coordinates of point A are shown as zero for ease of understanding.
  • For each pixel Gv, the corresponding coordinate Gi (x′, y′) in the camera coordinate system on the image sensor surface IA is calculated; specifically, it is calculated from the incident angle θ with respect to the optical axis Z, obtained from the distortion coefficient, and the coordinates of each pixel Gv (reference: International Publication No. 2010/032720).
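The w2c conversion for one pixel Gv can be sketched as: compute the incident angle θ of the ray from the lens center through Gv, obtain the image height h for that angle from the distortion coefficient, and preserve the azimuth direction in the xy plane. This is a simplified model, not the referenced publication's exact method; `image_height_of` stands in for the distortion coefficient and is an assumption.

```python
import math

def w2c(Gv, image_height_of):
    """Convert a world-coordinate point Gv on the virtual projection
    plane into camera coordinates (x', y') on the sensor surface IA."""
    X, Y, Z = Gv
    r = math.hypot(X, Y)
    theta = math.atan2(r, Z)          # incident angle w.r.t. the optical axis Z
    h = image_height_of(theta)        # image height from the distortion coefficient
    if r == 0.0:
        return (0.0, 0.0)             # point on the optical axis
    return (h * X / r, h * Y / r)     # keep the azimuth direction in the xy plane

# Example: pinhole model (no correction), focal length 1 -> h = tan(theta)
pinhole = lambda theta: math.tan(theta)
x_, y_ = w2c((1.0, 0.0, 2.0), pinhole)
```

Swapping `pinhole` for an LUT built from the actual lens parameters turns the same routine into the corrected (first distortion coefficient) mapping.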
  • FIG. 12 is a diagram illustrating a correspondence relationship between the camera coordinate system xy and the imaging element surface IA.
  • points a to d are obtained by converting the points A to D in FIG. 11 into the camera coordinate system based on the distortion coefficient (first distortion coefficient) corresponding to the lens parameter.
  • In FIG. 11, the virtual projection plane VP bounded by points A to D is a rectangular plane, but in FIG. 12 the region ra bounded by points a to d after conversion into the camera coordinate system (corresponding to the position of the virtual projection plane VP) is distorted.
  • The figure shows an example of barrel-type distortion, but depending on the characteristics of the optical system the shape may be pincushion type or "jingasa" type (barrel-shaped at the center, changing to straight lines or pincushion toward the edges).
  • FIG. 13 is a schematic diagram showing the relationship between the image height h and the incident angle ⁇ .
  • The image height (distance from the optical axis Z) of the subject (object point P) on the image sensor surface IA is determined by the incident angle θ and the distortion coefficient.
  • pi is the imaging position of the object point P on the image sensor surface IA determined by lens parameters without distortion correction; its distance from the optical axis (image height) is determined by the incident angle θ, as in a pinhole camera.
  • Gi is the image position after distortion correction, calculated using the distortion coefficient determined by the lens parameters based on the physical characteristics of the actual lens; the image height in this case is h.
  • In this way, the coordinates in the camera coordinate system on the image sensor surface IA are determined by the distortion coefficient based on the lens parameters and by the position of the virtual projection plane VP in the world coordinate system.
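A minimal sketch of such a world-to-camera (w2c) conversion is shown below. The odd-polynomial distortion model and the names `image_height`, `world_to_camera`, `f`, and `k1`–`k3` are assumptions for illustration; the patent only specifies that Gi is computed from the incident angle θ and the distortion coefficient (see the referenced WO 2010/032720 for the actual method):

```python
import math

def image_height(theta, f, k1, k2, k3):
    """Assumed distortion model: image height h as an odd polynomial of
    the incident angle theta. k1..k3 stand in for the patent's
    'distortion coefficient'; an ideal pinhole camera would instead
    give h = f * tan(theta)."""
    return f * (theta + k1 * theta**3 + k2 * theta**5 + k3 * theta**7)

def world_to_camera(Xw, Yw, Zw, f, k1, k2, k3, cx, cy):
    """Project a world point Gv on the virtual projection plane VP to
    camera (sensor) coordinates Gi(x', y')."""
    r = math.hypot(Xw, Yw)                  # distance from the optical axis
    theta = math.atan2(r, Zw)               # incident angle w.r.t. axis Z
    h = image_height(theta, f, k1, k2, k3)  # distorted image height
    if r == 0:
        return cx, cy                       # on-axis point maps to the center
    # keep the azimuth of the world point, scale its radius to h
    return cx + h * Xw / r, cy + h * Yw / r
```

Because h depends on the distortion coefficient, swapping in a different coefficient (e.g. the second, non-correcting one) changes the computed Gi and hence the sampled sensor pixels.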
  • In step S23, image processing is performed on the image signal input in step S21, using the pixel data at the sensor coordinates Gi obtained by the processing in step S22.
  • One image generation method is four-point interpolation, described below.
  • First, the pixels of the image sensor to be referenced are determined from the coordinates Gi (x′, y′) obtained by the w2c conversion.
  • Next, the coordinates (x, y) of each pixel of the image sensor are determined.
  • Here, x and y are integers, but x′ and y′ of the coordinates Gi (x′, y′) after w2c conversion are not limited to integers and can take real values with a fractional part.
  • When Gi coincides with a sensor pixel, the pixel data of that sensor pixel can be used directly as the pixel data of the output image corresponding to the pixel Gv (X, Y, Z) on the virtual projection plane VP.
  • Otherwise, the pixel data of the output image corresponding to the pixel Gv is obtained by four-point interpolation from the sensor pixels surrounding the calculated coordinates Gi.
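Four-point interpolation as described above is commonly realized as bilinear interpolation. The sketch below assumes a single-channel image stored as a 2-D list; the name `four_point_interpolate` is illustrative:

```python
def four_point_interpolate(img, x, y):
    """Bilinear ('four-point') interpolation: compute the output pixel
    value for non-integer sensor coordinates Gi(x', y') from the four
    surrounding integer-coordinate pixels. img is indexed [row][col];
    assumes 0 <= x < width - 1 and 0 <= y < height - 1."""
    x0, y0 = int(x), int(y)      # top-left neighbour
    x1, y1 = x0 + 1, y0 + 1      # bottom-right neighbour
    fx, fy = x - x0, y - y0      # fractional parts of Gi
    return (img[y0][x0] * (1 - fx) * (1 - fy)
            + img[y0][x1] * fx * (1 - fy)
            + img[y1][x0] * (1 - fx) * fy
            + img[y1][x1] * fx * fy)
```

When the fractional parts are zero, the result reduces to the pixel at (x, y), matching the direct-use case above.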
  • In step S24, output processing is performed based on the generated image.
  • As an output process, for example, a demosaic process obtains the output image by calculating the BGR data of each pixel from the signals of surrounding pixels.
  • In an image sensor composed of pixels in a Bayer array, each pixel has color information for only one color; the demosaic process performs interpolation from the information of neighboring pixels to calculate color information for all three colors.
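As a toy illustration of the idea (not the patent's demosaic algorithm), the sketch below converts an RGGB Bayer mosaic into color triples by treating each 2×2 cell as one output pixel; real demosaicing interpolates every pixel from its neighbors, as described above:

```python
def demosaic_2x2(raw):
    """Toy demosaic sketch for an RGGB Bayer mosaic: each 2x2 cell
    (R top-left, G top-right and bottom-left, B bottom-right) yields
    one (R, G, B) output pixel. raw is a 2-D list with even dimensions."""
    h, w = len(raw), len(raw[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2  # average the two greens
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out
```

This halves the resolution; per-pixel interpolation, by contrast, preserves the sensor resolution at the cost of more neighborhood arithmetic.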
  • In step S25, the image processing unit 100, functioning as an "image output unit", sends the output image obtained in step S24 to the video memory 104 and causes the display unit 120 to display it in synchronization with the synchronization signal. By performing this continuously, a 30 fps moving image is displayed on the display unit 120.
  • FIGS. 14 to 16 are examples of display screens displayed on the display unit 120 in the present embodiment.
  • FIG. 14 shows an example in which an image processed with the second distortion coefficient is displayed full screen as the initial image (see FIG. 8A). At this time, the screen ratio of screen 2 is 0%.
  • FIG. 15 shows an example in which screen 2 is displayed with a screen ratio of 50% as the final image.
  • At this time, screen 2 displays an image processed with the first distortion coefficient.
  • FIG. 16 likewise shows an example in which screen 2 is displayed with a screen ratio of 50% as the final image.
  • Here, screen 2 displays an image processed with the first distortion coefficient, and in addition a viewpoint conversion to "−45 deg (downward)" is performed.
  • FIGS. 17 and 18 are examples in which the screen ratio of screen 2 is changed from 0% to 100% on the display screen of the display unit 120.
  • The distortion correction rate for screen 1 is 0% and that for screen 2 is 100%, both fixed settings.
  • The virtual projection plane VP corresponding to screen 1 is located at the front, and the position of the virtual projection plane VP corresponding to screen 2 corresponds to the position of the red frame in the figure.
  • FIG. 17 shows the initial screen.
  • FIG. 18 shows screens a1 (10%) to a10 (100%) displayed in time series, in which the screen ratio and the distortion correction rate are both changed from 10% to 100%.
  • As described above, a plurality of distortion coefficients are used: one or more distortion coefficients set in the range from the first distortion coefficient to the second distortion coefficient, which does not correct distortion, together with the second distortion coefficient itself.
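One plausible way to obtain intermediate coefficients between the two extremes is per-term linear interpolation by the distortion correction rate. This blending rule and the name `blend_coefficients` are assumptions; the patent only states that intermediate coefficients lie within the 100%–0% correction-rate range:

```python
def blend_coefficients(first, second, rate):
    """Interpolate between the first distortion coefficient (full
    correction, rate = 1.0) and the second distortion coefficient
    (no correction, rate = 0.0), term by term. Both coefficients are
    given as equal-length lists of polynomial terms."""
    return [s + rate * (f - s) for f, s in zip(first, second)]
```

Feeding each frame's blended coefficient into the w2c conversion yields a display in which the distortion correction rate changes continuously, as in FIG. 18.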
  • FIG. 19A and FIG. 19B are diagrams showing a control flow in the second embodiment.
  • The flows in these figures correspond to steps S10 and S11 in FIG. 4. These settings are made using the input screen of the user operation unit 130 shown in FIG. 20.
  • In step S101, the number of virtual projection planes VP (the number of planes) and the position and size of each virtual projection plane VP are set as distortion correction parameter settings.
  • Setting is performed on the input screen of FIG. 20.
  • the number of screen divisions (the number of virtual projection planes) can be set by moving the cursor CS to the selection button a91.
  • a state in which four virtual projection planes VP are set is shown.
  • the position and size of the virtual projection plane VP are set by changing the positions and sizes of the square frames A to D shown in the setting area a80.
  • The positions of the square frames A to D can be moved and their sizes changed by operating the cursor CS.
  • The area a90 shows the arrangement of the display screen; a moving image is displayed on the display unit 120 with the arrangement shown in the figure. The image data obtained from the virtual projection planes VP corresponding to the square frames A to D are arranged as screens A to D.
  • When four virtual projection planes VP are selected, the default positions of the virtual projection planes VP are: entire view (square frame A), left side (square frame B), front (square frame C), and right side (square frame D).
  • FIG. 20B shows a state in which the positions and sizes of the four virtual projection planes VP are changed.
  • In step S102, screen layout and screen selection are performed as display screen settings.
  • The size of each screen can be changed by moving the position of the intersection point a92 of the areas shown in area a90. By operating the cursor CS, any screen displayed in area a90 can be selected.
  • Here, screen A is selected; as a result, the virtual projection plane VP corresponding to screen A is selected for control.
  • In step S103, the screen ratio of the selection screen (screen A) selected in step S102 with respect to the entire display screen area is continuously increased. This is executed by selecting the "enlarge screen area" button a93 on the input screen of FIG. 20.
  • In step S103, the control flow from step S21 onward in FIG. 4 is then executed. By executing this for each frame of the moving image, a moving image in which the screen ratio changes continuously is displayed on the display unit 120.
  • FIG. 21 is an example of the display screen of the display unit 120 when the screen ratio of the selection screen (screen A) with respect to the entire display area is set to increase continuously.
  • Based on the setting example of FIG. 20, FIG. 21 shows the screen ratio of screen A (shown at the upper left) with respect to the entire display area being increased continuously from 25% (b1) to 100% (b5), in time-series order from (b1) to (b5).
  • In the flow of FIG. 19B, step S104 is executed instead of step S103 in FIG. 19A.
  • The button group a94 shown in FIG. 20, described above, is used for inputting viewpoint conversion instructions and enables zooming, horizontal movement, and horizontal rotation.
  • Here, left/right movement translates the virtual projection plane VP corresponding to the selected screen in the direction of the plane's extent at its position in the world coordinate system.
  • The viewpoint conversions shown in the figure are examples; as other examples, pitch and yaw movements may be selectable.
  • In step S104, the control flow from step S21 onward in FIG. 4 is executed. By executing this for each frame of the moving image, a moving image on which viewpoint conversion is continuously performed is displayed on the display unit 120.
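As an illustration of one such viewpoint conversion, the sketch below rotates the corner points of the virtual projection plane about the vertical (Y) axis of the world coordinate system, one plausible realization of the "left/right rotation" above; the axis choice and the name `rotate_plane_y` are assumptions:

```python
import math

def rotate_plane_y(points, angle_deg):
    """Rotate world-coordinate points (e.g. the corners of the virtual
    projection plane VP) about the vertical Y axis by angle_deg degrees.
    Pitch would instead rotate about the X axis, and left/right movement
    would simply add an offset to the in-plane coordinates."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    # standard rotation about Y: x' = c*x + s*z, z' = -s*x + c*z
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in points]
```

Applying a slightly larger angle each frame, then re-running the w2c conversion, produces the continuously rotating views of FIG. 22 (d1) to (d3).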
  • FIG. 22 is an example of an image obtained by continuously performing viewpoint conversion.
  • (c1) to (c3) are examples in which the upper-left image (screen A) is "moved right", and (d1) to (d3) are examples in which "right rotation" is performed.
  • the recognizability of each image is improved.
  • The recognizability is improved, and by gradually and continuously changing to the enlarged display, the user can accurately recognize the subject without becoming confused.
  • DESCRIPTION OF SYMBOLS: 100 Image processing unit; 101 Distortion correction processing unit; 102 Distortion correction parameter generation unit; 103 LUT generation unit; 104 Video memory; 108 ROM; 109a, 109b RAM; 110 Imaging unit; 120 Display unit; 130 User operation unit; 150 Synchronization signal generation circuit; VP Virtual projection plane; LC Lens center plane; IA Image sensor surface; O Lens center; o Optical center


Abstract

The invention relates to an image processing method, an image processing device, and an image capture device with which a subject can be recognized accurately and processing time can be reduced with a comparatively small-scale circuit. An image processing device comprises: an image processing unit that uses a distortion coefficient stored in a storage unit to convert the world-coordinate-system coordinates of each pixel of a virtual projection plane, whose position and size are set, into a camera coordinate system, and that calculates image data of the virtual projection plane on the basis of the coordinates converted into the camera coordinate system and the plurality of pixel data; and an image signal output unit that outputs an image signal for displaying, on a display screen, the image data calculated by the image processing unit. In a plurality of regions into which the display screen is divided, the image processing unit respectively displays a plurality of image data calculated using a plurality of distortion coefficients: one or more distortion coefficients set within a distortion correction rate range of 100%–0%, from a first distortion coefficient that corrects distortion arising from the optical system to a second distortion coefficient at which distortion is not corrected; and the second distortion coefficient.
PCT/JP2011/077618 2010-12-14 2011-11-30 Image processing method, image processing device, and image capture device WO2012081400A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012548720A JPWO2012081400A1 (ja) 2010-12-14 2011-11-30 Image processing method, image processing device, and imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010277843 2010-12-14
JP2010-277843 2010-12-14

Publications (1)

Publication Number Publication Date
WO2012081400A1 true WO2012081400A1 (fr) 2012-06-21

Family

ID=46244509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/077618 WO2012081400A1 (fr) 2010-12-14 2011-11-30 Image processing method, image processing device, and image capture device

Country Status (2)

Country Link
JP (1) JPWO2012081400A1 (fr)
WO (1) WO2012081400A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007055336A1 (fr) * 2005-11-11 2007-05-18 Sony Corporation Dispositif, procede et programme de traitement d'images, support d'enregistrement contenant le programme et dispositif d'imagerie
JP2007295335A (ja) * 2006-04-26 2007-11-08 Opt Kk カメラ装置および画像記録再生方法
JP2011061511A (ja) * 2009-09-10 2011-03-24 Dainippon Printing Co Ltd 魚眼監視システム


Also Published As

Publication number Publication date
JPWO2012081400A1 (ja) 2014-05-22

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11848435; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2012548720; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 11848435; Country of ref document: EP; Kind code of ref document: A1)