WO2014136392A1 - Image processing device, image processing method and imaging device - Google Patents

Image processing device, image processing method and imaging device Download PDF

Info

Publication number
WO2014136392A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging
optical system
circle
boundary
Prior art date
Application number
PCT/JP2014/000889
Other languages
French (fr)
Japanese (ja)
Inventor
壮功 北田
高山 淳
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2015504151A priority Critical patent/JPWO2014136392A1/en
Publication of WO2014136392A1 publication Critical patent/WO2014136392A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"

Definitions

  • The present invention relates to an image processing apparatus and an image processing method for processing an image.
  • The present invention also relates to an imaging apparatus that uses the image processing method.
  • imaging devices have been used as information acquisition means in portable devices, in-vehicle devices, medical devices, industrial devices, and the like.
  • Many imaging devices employ lenses made of resin materials such as plastic, because such lenses are relatively inexpensive and can be mass-produced.
  • Although a resin lens has these advantages, it is easily affected by temperature changes, and its optical characteristics (focal length, image center, distortion coefficients, and so on) vary with temperature.
  • Image generation is generally performed on the assumption that these optical characteristics do not change, so a change in them leads directly to degraded image quality. If the change in optical characteristics can be detected and parameters suited to the changed characteristics can be used, the degradation can be reduced.
  • Patent Document 1 discloses a technique for correcting such a change in optical characteristics due to a temperature change.
  • The imaging device disclosed in Patent Document 1 comprises a lens array, disposed at a position facing the subject, in which a plurality of lenses are arranged in an array; an imaging element, provided on the image-plane side of the lens array, that captures a compound-eye image, i.e., a set of reduced images (single-eye images) of the subject formed by the plurality of lenses; and an arithmetic unit that processes the compound-eye image captured by the imaging element.
  • The device further includes a light-shielding wall that prevents crosstalk of light rays between adjacent lenses of the lens array. Based on luminance information incident on the imaging element, the arithmetic unit determines internal parameters relating to the center position, focal length, and lens distortion of each lens, and calculates and corrects the amount of distortion caused by the ambient temperature via the light-shielding wall.
  • As a method of calculating the internal parameters, the method of Zhang disclosed in Non-Patent Document 1 has been proposed.
  • The imaging apparatus disclosed in Patent Document 1 removes the influence of temperature change by calculating the position of each lens from the luminance information of the light-shielding wall imaged by the imaging element. The light-shielding wall is therefore indispensable for removing that influence, and it increases the number of manufacturing steps and the cost of the device.
  • The present invention has been made in view of the above circumstances, and its purpose is to provide an image processing apparatus, an image processing method, and an imaging apparatus capable of detecting a change in the optical characteristics of a lens without providing a special optical member and of correcting an image in accordance with that change.
  • In the image processing apparatus, image processing method, and imaging apparatus of the invention, the boundary position of the image circle of the imaging optical system is detected based on the electrical signal output from the imaging element, which receives the optical image of the subject formed on its light-receiving surface by the imaging optical system.
  • A predetermined parameter relating to a predetermined image correction process is then determined based on the detected boundary position of the image circle, and the image is corrected using the determined parameter. Since these apparatuses and this method use the image circle, which is inherent to the imaging optical system, for determining the parameters of the image correction process, they can detect a change in the optical characteristics of the lens and correct the image accordingly without any special optical member.
  • FIG. 5 is a flowchart illustrating an operation related to image correction in the imaging apparatus. FIG. 6 is a diagram for explaining the luminance distribution of the image circle in the imaging apparatus. FIG. 7 is a diagram for explaining the method of determining the boundary position of the image circle. FIG. 8 is a diagram for explaining the data used to determine the boundary position of the image circle. FIG. 9 is a diagram for explaining the change in the boundary position and luminance distribution of the image circle caused by a temperature change.
  • FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus according to an embodiment.
  • FIG. 2 is a diagram for explaining a light receiving surface (imaging surface) of the image sensor in the imaging apparatus of the embodiment.
  • FIG. 3 is a diagram illustrating a configuration of a parameter table stored in the storage unit in the imaging apparatus according to the embodiment.
  • the imaging apparatus is an apparatus that captures an image of a subject using an imaging optical system and generates an image of the subject.
  • Such an imaging apparatus CA includes, for example, an imaging optical system 1 and an image processing unit 2 as shown in FIG.
  • the imaging apparatus CA may further include at least one of the input unit 6, the display unit 7, and the interface unit (IF unit) 8 as necessary.
  • the imaging optical system 1 is an optical system that forms an optical image of a subject on a predetermined surface, and includes one or more optical elements.
  • The imaging optical system 1 includes, for example, an optical aperture, one or more lenses, and, as necessary, a light-shielding plate between the optical elements to prevent so-called stray light.
  • The imaging optical system 1 may be monocular, forming one optical image (one image circle) of the subject; in this embodiment, however, it is a compound-eye system comprising a plurality of lenses (single eyes) that forms a plurality of optical images (a plurality of image circles) of the subject.
  • The plurality of lenses are arranged in an array so that their optical axes are parallel to each other.
  • The lens array may be a one-dimensional lens array in which a plurality of lenses are arranged in a line; in this embodiment, however, it is a two-dimensional lens array in which a plurality of lenses are arranged along two linearly independent directions, more specifically along two mutually orthogonal directions.
  • The two-dimensional lens array may instead arrange the plurality of lenses in a honeycomb structure.
  • Each single-eye lens may consist of a single lens or of multiple lenses.
  • the image processing unit 2 is a device that generates an image of a subject.
  • the image processing unit 2 includes an imaging device 3, a control calculation unit 4, and a storage unit 5.
  • The imaging element 3 includes a plurality of photoelectric conversion elements (a plurality of pixels) and captures, on the light-receiving surface formed by these elements, the optical image of the subject formed by the imaging optical system 1.
  • The imaging element 3 is connected to the control calculation unit 4; it photoelectrically converts the optical image of the subject into image signals of the R (red), G (green), and B (blue) components according to the amount of light, and outputs them to the control calculation unit 4.
  • The optical image of the subject is guided along the optical axis to the light-receiving surface of the imaging element 3 by the imaging optical system 1 at a predetermined magnification, and is captured there by the imaging element 3.
  • Since the imaging optical system 1 includes a two-dimensional lens array, optical images (single-eye images) of the subject, equal in number to the lenses of the array, are formed on the light-receiving surface of the imaging element 3, arranged in a two-dimensional array corresponding to the arrangement of the lenses.
  • the light receiving surface (effective pixel region) of the image pickup device 3 is divided into a plurality of image pickup regions corresponding to the positions of the image circles formed on the light receiving surface by the lenses in the two-dimensional lens array.
  • the imaging area is an imaging area for each lens (each eye).
  • In this embodiment, the two-dimensional lens array is composed of 16 lenses arranged in a 4 × 4 two-dimensional array along the two mutually orthogonal X and Y directions.
  • Accordingly, the light-receiving surface (effective pixel region) 30 is divided into 16 imaging regions (imaging regions for each eye) 31 (31-11 to 31-44) corresponding to the positions of the 16 image circles IC formed on the light-receiving surface 30 by the 16 lenses, and an optical image of the subject is formed on each of these 16 imaging regions 31.
  • In FIG. 2, only the image circle IC for the imaging region 31-11 in the first row and first column is shown; the others are omitted.
  • These 16 imaging regions 31 are arranged in a 4 × 4 two-dimensional array along the mutually orthogonal X and Y directions, in accordance with the arrangement of the 16 lenses described above.
  • Each of these 16 imaging regions 31 is set as a rectangle inscribed in the image circle IC formed under a predetermined design condition (for example, a room temperature of 23 °C); that is, each imaging region 31 is set so that its diagonal equals the diameter of the image circle IC.
  • One imaging region 31 therefore includes a number of pixels (photoelectric conversion elements) that is at most 1/16 of the total number of pixels of the imaging element 3.
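The inscribed-rectangle geometry described above can be sketched numerically. This is only an illustrative calculation; the 4:3 aspect ratio and the pixel values below are examples, not figures from the patent:

```python
import math

def inscribed_region(diameter, aspect=1.0):
    """Width and height of a rectangular imaging region inscribed in an
    image circle: the rectangle's diagonal equals the circle's diameter,
    so width**2 + height**2 == diameter**2."""
    height = diameter / math.sqrt(aspect * aspect + 1.0)
    return aspect * height, height

# A 4:3 region inscribed in a 10 px image circle measures 8 x 6 px, and a
# square region inscribed in a 1000 px circle is about 707 px on a side.
```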
  • The imaging element 3 also provides a boundary position detection unit for detecting the boundary position of the image circle IC of the imaging optical system 1. More specifically, a boundary position detection region 32, sized to contain the boundary position of the image circle IC, is set on the light-receiving surface (effective pixel region) 30 in an area adjacent to an imaging region 31. In the example shown in FIG. 2, one boundary position detection region 32 is set adjacent to the imaging region 31-11 in the first row and first column. The boundary position detection region 32 thus uses photoelectric conversion elements on the light-receiving surface 30 that are not used as imaging regions 31.
  • A plurality of boundary position detection regions 32 may be set: for example, one or more for every imaging region 31, for the imaging regions 31 in the first row, for the imaging regions 31 in the first column, for the imaging regions 31 on the diagonal, for the imaging regions 31 in every other row, or for the imaging regions 31 in every other column.
  • The lens may also expand only in the vertical direction or only in the horizontal direction; in such a case, the amount of distortion can be measured accurately by arranging a plurality of detection regions along one row or one column.
  • In this embodiment, the light-receiving surface of one imaging element 3 is divided according to the number of lenses of the two-dimensional lens array, but an imaging element may instead be provided for each lens of the array; in that case, 16 imaging elements corresponding to the 16 lenses are provided.
  • The storage unit 5 is connected to the control calculation unit 4 and is an element that stores various programs, such as a control program for controlling the entire imaging apparatus CA, an image forming program for forming the image of the subject, and an image correction program for correcting that image, together with the data needed to execute these programs and the data generated during their execution.
  • The storage unit 5 includes, for example, a ROM (Read Only Memory), a non-volatile storage element that stores the various programs; an EEPROM (Electrically Erasable Programmable Read Only Memory), a rewritable non-volatile storage element that stores the various data; and a RAM.
  • The RAM of the storage unit 5 also functions as the so-called working memory of the CPU (described later) of the control calculation unit 4.
  • The storage unit 5 functionally comprises an image storage unit 51 that stores image data and a parameter storage unit 52 that stores, in advance, the parameter values corresponding to each temperature (each boundary position of the image circle) of the imaging optical system 1.
  • the predetermined parameter relating to the predetermined image correction process is a value used when correcting the image of the subject.
  • Examples of the predetermined parameter include internal parameters, external parameters, and a blur coefficient.
  • the internal parameters are, for example, the focal position, the center position, the distortion coefficient, and the like of the imaging optical system 1, and the external parameters are the translation coefficient, the rotation coefficient, and the like of the imaging optical system 1.
  • the blur coefficient is a coefficient used in so-called super-resolution processing.
  • Each value of the predetermined parameter is obtained in advance for each temperature of the imaging optical system 1. In the present embodiment, the temperature of the imaging optical system 1 is represented by the boundary position of the image circle IC, so each value of the predetermined parameter is obtained in advance for each boundary position of the image circle IC in the imaging optical system 1.
  • For example, the boundary position of the image circle IC in the imaging optical system 1 and each value of the predetermined parameter are determined in advance for each of the temperatures 0 °C, 20 °C, 40 °C, 60 °C, and 80 °C.
  • The internal parameters and the external parameters may be obtained in advance by, for example, the method of Zhang disclosed in Non-Patent Document 1 described above. As a result, the correspondence between each boundary position of the image circle IC in the imaging optical system 1 and each value of the predetermined parameter is obtained in advance.
  • A predetermined function expressing the correspondence between each boundary position of the image circle and each value of the predetermined parameter may be derived from the correspondence created in advance and stored in the parameter storage unit 52, and the predetermined parameter relating to the predetermined image correction process may then be determined from the boundary position of the image circle by using this function.
  • In the present embodiment, however, a lookup table (parameter table) expressing the correspondence is created and stored in the parameter storage unit 52, and the predetermined parameter relating to the predetermined image correction process is determined from the boundary position of the image circle by using this previously created lookup table.
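A minimal sketch of the lookup-table approach. The table rows and parameter values below are hypothetical placeholders, and linear interpolation between rows is an assumption; the text only states that the table maps boundary positions to parameter values:

```python
from bisect import bisect_left

# Hypothetical parameter table: detected boundary position (px) mapped to
# internal parameter values, precomputed per temperature as described above.
PARAM_TABLE = [
    (100.0, {"fx": 500.0, "cx": 320.0}),
    (102.0, {"fx": 504.0, "cx": 321.0}),
    (104.0, {"fx": 508.0, "cx": 322.0}),
]

def params_for_boundary(x):
    """Look up (and linearly interpolate) parameter values for a detected
    image-circle boundary position, clamping at the table ends."""
    xs = [pos for pos, _ in PARAM_TABLE]
    if x <= xs[0]:
        return dict(PARAM_TABLE[0][1])
    if x >= xs[-1]:
        return dict(PARAM_TABLE[-1][1])
    i = bisect_left(xs, x)
    x0, p0 = PARAM_TABLE[i - 1]
    x1, p1 = PARAM_TABLE[i]
    t = (x - x0) / (x1 - x0)
    return {k: p0[k] + t * (p1[k] - p0[k]) for k in p0}
```

Interpolation keeps the correction smooth when the detected boundary falls between the temperatures sampled at calibration time.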
  • As shown in FIG. 3, such a parameter table PT includes, for example: a boundary position x field 5211 for registering the boundary position x in the x direction; a boundary position y field 5212 for registering the boundary position y in the y direction; a temperature field 522 for registering the temperature; a focal length fx field 5231 for registering the focal length fx in the x direction; a focal length fy field 5232 for registering the focal length fy in the y direction; fields for registering the image center in the x and y directions; a field for registering the first distortion coefficient k1; a distortion coefficient k2 field 5252 for registering the second distortion coefficient k2; a distortion coefficient p1 field 5253 for registering the third distortion coefficient p1; and a distortion coefficient p2 field 5254 for registering the fourth distortion coefficient p2. The table holds one record for each temperature.
  • Similarly, each boundary position of the image circle and each value of the external parameters may be registered in the parameter table PT, and each boundary position of the image circle and each value of the blur coefficient may be registered in the parameter table PT.
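The internal parameters registered in the table (fx, fy, the image center, and the distortion coefficients k1, k2, p1, p2) match the usual Zhang-style camera model. As an illustration only, and assuming the standard radial/tangential (Brown-Conrady) distortion model, which the patent does not spell out, these parameters enter a correction roughly as follows:

```python
def distort(xn, yn, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to
    normalized image coordinates (Brown-Conrady model, assumed here)."""
    r2 = xn * xn + yn * yn
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = xn * radial + 2.0 * p1 * xn * yn + p2 * (r2 + 2.0 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2.0 * yn * yn) + 2.0 * p2 * xn * yn
    return xd, yd

def to_pixel(xn, yn, fx, fy, cx, cy):
    """Project normalized coordinates to pixel coordinates using the
    focal lengths (fx, fy) and image center (cx, cy) from the table."""
    return fx * xn + cx, fy * yn + cy
```

Given the temperature-dependent parameter record, the image correction unit would warp each pixel through this model with the looked-up coefficients.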
  • The parameter table PT may be applied to all lenses in the lens array (common to all eyes), or a parameter table PT may be prepared for each lens (single eye).
  • By preparing a parameter table PT for each lens (single eye), the parameter values can be set appropriately according to the optical characteristics of each lens with respect to temperature change.
  • The control calculation unit 4 is a device that controls the operation of the entire imaging apparatus CA by controlling each unit according to its function, and that forms the image of the subject while correcting it by using the predetermined parameter relating to the predetermined image correction process.
  • the control calculation unit 4 includes, for example, a CPU (Central Processing Unit) and peripheral circuits thereof.
  • the control calculation unit 4 functionally includes a control unit 41, an image generation unit 42, a boundary detection unit 43, a parameter determination unit 44, and an image correction unit 45 by executing various programs.
  • the control unit 41 controls the entire operation of the imaging apparatus CA by controlling each unit according to the function.
  • the image generation unit 42 generates an image of a subject based on the electrical signal output from the image sensor 3.
  • the image data of the generated image is stored in the image storage unit 51 of the storage unit 5.
  • the boundary detection unit 43 detects the boundary position of the image circle of the imaging optical system 1 based on the electrical signal output from the imaging device 3.
  • the parameter determination unit 44 determines a predetermined parameter related to a predetermined image correction process based on the boundary position of the image circle detected by the boundary detection unit 43.
  • the image correction unit 45 corrects the image of the subject generated by the image generation unit 42 and stored in the image storage unit 51 using the predetermined parameter determined by the parameter determination unit 44.
  • the image data of the corrected image is stored in the image storage unit 51 of the storage unit 5.
  • the input unit 6 is a device for inputting various instructions such as an instruction for imaging conditions and an instruction to start imaging to the imaging apparatus CA from the outside, and is, for example, a rotary switch or a button switch.
  • The display unit 7 is a device for outputting the instruction contents input from the input unit 6 and the processing results of the control calculation unit 4, such as the image of the subject, and is, for example, an LCD (liquid crystal display) or an organic EL display.
  • the display unit 7 may function as a so-called live view finder.
  • The input unit 6 may include a position input device that detects and inputs an operated position by, for example, a resistive-film or capacitive method; the position input device and the display unit 7 may together constitute a so-called touch panel.
  • In this case, the position input device is provided on the display surface of the display unit 7, one or more candidates for input content are displayed on the display unit 7, and when the user touches the display position of the content to be input, the position is detected by the position input device and the content displayed at that position is input to the imaging apparatus CA as the user's operation input.
  • Such a touch panel makes the imaging apparatus CA easy for the user to handle, and by changing the content displayed on the display unit 7, various inputs can be made and various operations can be performed on the imaging apparatus CA.
  • The IF unit 8 is a circuit for inputting and outputting data to and from an external device, and is, for example, an interface circuit conforming to the IEEE 802.11 series of standards known as Wi-Fi (Wireless Fidelity), an interface circuit conforming to the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA (Infrared Data Association) standard, or an interface circuit conforming to the USB (Universal Serial Bus) standard.
  • FIG. 4 is a diagram for explaining a change in the image circle due to a temperature change in the imaging apparatus according to the embodiment.
  • FIG. 4A is a diagram for explaining a change in the size of the image circle due to a temperature change, and FIG. 4B is a diagram for explaining a change in the center position of the image circle due to a temperature change.
  • The temperature of the imaging optical system 1 changes with the ambient temperature in which it is used, and this may cause expansion, contraction, or distortion of its optical elements.
  • When an optical element of the imaging optical system 1 is a lens made of a resin material such as plastic, such expansion, contraction, and distortion occur more readily because of the material's temperature characteristics.
  • When expansion or contraction occurs in an optical element of the imaging optical system 1, the size (area) of the image circle IC expands or contracts, as shown in FIG. 4A, and the boundary position of the image circle IC changes accordingly.
  • For example, when the temperature of the imaging optical system 1 rises from 20 °C to 60 °C, the image circle IC enlarges from the image circle IC(20) to the image circle IC(60), where IC(t) denotes the image circle IC at a temperature of t °C.
  • When distortion occurs in an optical element of the imaging optical system 1, the center position of the image circle IC (the image center position) shifts, as shown in FIG. 4B, and the boundary position of the image circle IC shifts accordingly.
  • For example, the center position (cx, cy) of the image circle IC shifts from the center position (cx2, cy2) of the image circle IC(20) to the center position (cx4, cy4) of the image circle IC(60).
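The patent states only the qualitative relationship between temperature and boundary position. As a toy sketch, a linear expansion model shows how a boundary shift would track temperature; the coefficient alpha below is purely illustrative, not a value from the text:

```python
def boundary_at_temperature(radius_ref, t, t_ref=20.0, alpha=1e-4):
    """Estimate the image-circle radius at temperature t from a reference
    radius at t_ref, under a linear expansion model with an illustrative
    (hypothetical) expansion coefficient alpha per degree C."""
    return radius_ref * (1.0 + alpha * (t - t_ref))

# With the illustrative alpha, heating from 20 C to 60 C grows a 500 px
# image-circle radius to 502 px: the boundary moves outward by 2 px.
```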
  • Thus, the boundary position of the image circle IC correlates with the temperature of the imaging optical system 1, and therefore with the predetermined parameter relating to the image correction process given for each temperature. By establishing this correspondence in advance, the predetermined parameter for removing or reducing the error imparted to the image by a temperature change can be set simply by detecting the boundary position of the image circle IC.
  • The imaging apparatus CA of the present embodiment therefore operates as follows: it detects the boundary position of the image circle IC, determines the value of the predetermined parameter relating to the image correction process, and corrects the image of the subject by using the determined parameter.
  • FIG. 5 is a flowchart showing an operation related to image correction in the imaging apparatus of the embodiment.
  • FIG. 6 is a diagram for explaining the luminance distribution of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 6A is a diagram schematically showing the state of the light receiving surface of the image sensor when a subject having a substantially uniform luminance distribution is imaged
  • FIG. 6B shows the luminance distribution along the line X1-X2 shown in FIG. 6A.
  • the horizontal axis in FIG. 6B indicates the position on the X1-X2 line
  • the vertical axis indicates the luminance value.
  • FIG. 7 is a diagram for explaining a determination method for determining the boundary position of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 8 is a diagram for explaining data used for determining the boundary position of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 8A shows the luminance distribution in the vicinity of the boundary of the image circle obtained when a subject having a uniform luminance distribution (uniform subject) is imaged.
  • FIG. 8B shows the luminance distribution BA in the vicinity of the boundary of the image circle obtained when a general subject (general subject A) having a nonuniform luminance distribution is imaged.
  • FIG. 8C shows the luminance distribution BB in the vicinity of the boundary of the image circle obtained when another general subject (general subject B) with a nonuniform luminance distribution is imaged.
  • FIG. 8D shows a luminance distribution BC in the vicinity of the boundary of the image circle obtained when a general subject (general subject C) having a nonuniform luminance distribution is imaged.
  • FIG. 8E shows an average luminance distribution Bave obtained by averaging the luminance distributions BA to BC shown in FIGS. 8B to 8D.
  • FIG. 9 is a diagram for explaining a change in the boundary position of the image circle and a change in luminance distribution due to a temperature change in the imaging apparatus according to the embodiment.
  • FIG. 9A shows a change in the boundary position of the image circle due to a temperature change
  • FIG. 9B shows a change in the luminance distribution of the image circle due to a temperature change.
  • the horizontal axis in FIG. 9B indicates the position, and the vertical axis indicates the luminance value.
  • The control calculation unit 4 executes the control program, the image correction program, and the other programs stored in the storage unit 5.
  • As a result, the control unit 41, the image generation unit 42, the boundary detection unit 43, the parameter determination unit 44, and the image correction unit 45 are functionally configured in the control calculation unit 4.
  • the imaging apparatus CA when an unillustrated shutter button included in the input unit 6 in the imaging apparatus CA is operated, the imaging apparatus CA generates an image of the subject. More specifically, a light beam from the subject enters the imaging optical system 1, and an optical image of the subject is formed on the light receiving surface 30 of the imaging element 3 by the imaging optical system 1.
  • Since the imaging optical system 1 includes the two-dimensional lens array as described above, a plurality of optical images (single-eye images) of the subject, each the size of an image circle IC, are formed on the light-receiving surface 30 of the imaging element 3.
  • the image sensor 3 converts the optical image of the subject into an electric signal having a magnitude corresponding to the amount of light received by each photoelectric conversion element. These electrical signals are output from the image sensor 3 to the control calculation unit 4.
  • For each imaging region 31, the image generation unit 42 of the control calculation unit 4 generates image data of the subject image (single-eye image) based on the electrical signals obtained by the photoelectric conversion elements of that imaging region 31. The image generation unit 42 then stores the generated image data in the image storage unit 51 in association with the imaging region 31.
  • The boundary detection unit 43 of the control calculation unit 4 detects the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signals obtained by the photoelectric conversion elements of the boundary position detection region 32, and notifies the parameter determination unit 44 of the detected boundary position of the image circle IC (S1).
  • the detection of the boundary position of the image circle IC is executed as follows, for example.
  • As shown in FIG. 6B, the luminance distribution along the straight line X1-X2, which includes pixels (photoelectric conversion elements) outside the image circle IC as well as inside it, has the following profile: the luminance value is approximately 0 outside one side of the image circle IC, rises sharply near that boundary of the image circle IC, remains substantially constant at a predetermined value inside the image circle IC, falls sharply near the other boundary, and is approximately 0 again outside the other side. In other words, the amount of light is greatest at the center of the image circle IC and decreases toward its periphery.
  • The intermediate value is the central value between the maximum and minimum luminance values. Such an intermediate value may be set in advance to the central value between the maximum and minimum luminance values that the photoelectric conversion elements of the image sensor 3 can take (static setting).
  • Alternatively, it may be set to the central value between the maximum and minimum luminance values obtained from the actual luminance distribution of the photoelectric conversion elements (pixels) arranged along a predetermined straight line in the detection region 32 (dynamic setting).
  • By setting the threshold value Th to a predetermined nonzero value, the influence of noise is reduced and the detection accuracy of the boundary position of the image circle IC is improved.
  • The boundary detection unit 43 compares the luminance values with the second threshold Th1 and detects the position of the photoelectric conversion element (pixel) at which the luminance value exceeds the second threshold Th1 as the boundary position of the image circle IC.
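As a rough illustration of the thresholding just described, the following sketch scans a one-dimensional luminance profile taken along a line crossing the image circle and reports where the values cross a nonzero threshold. The function name, threshold value, and sample profile are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: find the image-circle boundary pixels on a 1-D
# luminance profile by locating the first and last values above Th1.

def detect_circle_boundaries(profile, th1):
    """Return (left, right) pixel indices where luminance exceeds th1."""
    left = right = None
    for i, v in enumerate(profile):
        if v > th1:
            if left is None:
                left = i   # first pixel inside the image circle
            right = i      # last pixel seen above the threshold
    return left, right

# Synthetic profile: ~0 outside the circle, near-constant plateau inside.
profile = [0, 1, 2, 120, 200, 210, 205, 208, 130, 2, 0, 0]
print(detect_circle_boundaries(profile, th1=50))  # -> (3, 8)
```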
  • The boundary detection unit 43 may be configured to detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element 3.
  • The plurality of optical images may be images of the same subject formed at different points in time, or images of different subjects.
  • The boundary detection unit 43 may be configured to detect the boundary position of the image circle of the imaging optical system 1 based on an integrated signal obtained by integrating the plurality of electrical signals. More specifically, for example, as shown in FIGS. 8B to 8D, a plurality of electrical signals are first obtained by converting each of a plurality of optical images having different luminance distributions with the image sensor 3. These electrical signals are added for each photoelectric conversion element (each pixel) and averaged over the number of optical images, yielding the integrated signal. Since this integrated signal averages a plurality of optical images with various luminance distributions, it has a substantially uniform luminance distribution as shown in FIG. 8E and approximates the luminance distributions shown in FIGS. 7 and 8A.
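The integration step above can be sketched as a per-pixel average over several frames. The tiny frames and function name here are illustrative only.

```python
# Minimal sketch: sum several frames pixel-by-pixel and divide by the
# frame count, producing the "integrated signal" described in the text.

def integrate_frames(frames):
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

frames = [
    [[0, 100, 0], [0, 200, 0]],
    [[0, 300, 0], [0, 100, 0]],
]
print(integrate_frames(frames))  # -> [[0.0, 200.0, 0.0], [0.0, 150.0, 0.0]]
```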
  • The boundary detection unit 43 may be configured to select the maximum luminance value at each corresponding pixel from the plurality of electrical signals and to detect the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values. More specifically, for example, a plurality of electrical signals are first obtained by converting each of a plurality of optical images having different luminance distributions with the imaging element 3. Among these electrical signals, the maximum luminance value is selected for each photoelectric conversion element (each pixel) and used as the luminance value of that element. As a result, a signal composed of the maximum luminance value at each photoelectric conversion element (each pixel) is obtained, and the boundary position of the image circle of the imaging optical system is detected from the luminance distribution of this signal in the same manner as described above.
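The maximum-value selection can likewise be sketched as a per-pixel max over the frames; again, the sample data and names are illustrative, not taken from the patent.

```python
# Minimal sketch: keep the maximum luminance seen at each pixel across
# all frames, producing the max-composite signal described in the text.

def max_select(frames):
    h, w = len(frames[0]), len(frames[0][0])
    return [[max(f[y][x] for f in frames) for x in range(w)]
            for y in range(h)]

frames = [
    [[0, 100, 0], [0, 200, 0]],
    [[0, 300, 0], [0, 100, 0]],
]
print(max_select(frames))  # -> [[0, 300, 0], [0, 200, 0]]
```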
  • The parameter determination unit 44 of the control calculation unit 4 determines a predetermined parameter related to a predetermined image correction process based on the boundary position of the image circle IC detected by the boundary detection unit 43, and notifies the image correction unit 45 of the determined parameter value (S2). This parameter determination is executed, for example, as follows.
  • The parameter table PT, a lookup table stored in advance in the parameter storage unit 52 as described above, is used when determining the parameters. More specifically, the parameter determination unit 44 searches the boundary position x field 5211 and the boundary position y field 5212 using the boundary position of the image circle IC detected by the boundary detection unit 43 as a key, and finds the record in which the detected boundary position of the image circle IC is registered in those fields.
  • From the found record, the parameter determination unit 44 acquires the values of the focal length fx, focal length fy, image center position cx, image center position cy, first distortion coefficient k1, second distortion coefficient k2, third distortion coefficient p1, and fourth distortion coefficient p2 from, respectively, the focal length fx field 5231, focal length fy field 5232, image center cx field 5241, image center cy field 5242, distortion coefficient k1 field 5251, distortion coefficient k2 field 5252, distortion coefficient p1 field 5253, and distortion coefficient p2 field 5254. In this way, the value of each parameter is determined.
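The lookup just described can be sketched as a table keyed by the detected boundary position. All table values below are placeholders, not data from the patent.

```python
# Hypothetical sketch of the parameter-table lookup: the detected boundary
# position (x, y) keys a record holding focal lengths, image center, and
# distortion coefficients. Every number here is an illustrative placeholder.

parameter_table = {
    (7, 4): {"fx": 1.02, "fy": 1.02, "cx": 320.5, "cy": 240.3,
             "k1": -0.11, "k2": 0.03, "p1": 0.001, "p2": -0.002},
}

def determine_parameters(boundary_xy):
    """Return the correction-parameter record for a boundary position."""
    return parameter_table[boundary_xy]

print(determine_parameters((7, 4))["fx"])  # -> 1.02
```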
  • The image correction unit 45 corrects the image of the subject, which was generated by the image generation unit 42 and stored in the image storage unit 51, using the parameter values determined by the parameter determination unit 44 (S3).
  • The image correction unit 45 corrects the image using the focal length fx and focal length fy determined by the parameter determination unit 44, by a known correction method that enlarges or reduces the image with an enlargement ratio corresponding to the focal length shift amount. If the enlargement ratio is greater than 1, the enlargement/reduction process is an enlargement process; if the enlargement ratio is 1, it is an equal-magnification process; and if the enlargement ratio is less than 1, it is a reduction process.
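The enlargement/reduction step can be sketched with a simple nearest-neighbour rescale; the ratio semantics follow the text (greater than 1 enlarges, 1 is identity, less than 1 reduces). A real implementation would use higher-quality interpolation; this function is an illustrative stand-in.

```python
# Illustrative nearest-neighbour enlarge/reduce keyed by an enlargement
# ratio, as described in the text; not the patent's "known correction method".

def rescale(img, ratio):
    """Nearest-neighbour rescale; ratio > 1 enlarges, ratio < 1 reduces."""
    h, w = len(img), len(img[0])
    nh, nw = max(1, round(h * ratio)), max(1, round(w * ratio))
    return [[img[min(h - 1, int(y / ratio))][min(w - 1, int(x / ratio))]
             for x in range(nw)] for y in range(nh)]

img = [[1, 2], [3, 4]]
print(rescale(img, 1))  # identity (equal-magnification process)
print(rescale(img, 2))  # 4x4 enlargement
```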
  • The image correction unit 45 corrects the image by a known correction method that corrects the baseline length according to the shift amount of the center position, using the image center position cx and image center position cy determined by the parameter determination unit 44.
  • When the baseline length has become longer than the original baseline length due to the shift of the center position, the baseline length is corrected to be shorter according to the shift amount of the center position.
  • Conversely, when the baseline length has become shorter than the original baseline length due to the shift of the center position, the baseline length is corrected to be longer according to the shift amount of the center position.
  • The image correction unit 45 corrects the image by a known correction method that corrects the posture (orientation), using the image center position cx and image center position cy determined by the parameter determination unit 44.
  • Using the first to fourth distortion coefficients k1, k2, p1, and p2 determined by the parameter determination unit 44, the image correction unit 45 corrects the image by a known correction method that converts the distorted image into a natural, distortion-free image whose shape approximates the scene as seen with the naked eye.
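The patent names only a "known correction method" here. One standard model that uses coefficients (k1, k2, p1, p2) is the Brown-Conrady radial/tangential distortion model, sketched below for a single normalized point; treating the patent's coefficients as this model is an assumption on our part.

```python
# Brown-Conrady-style distortion model (an assumption: the patent does not
# specify the model, only coefficients k1, k2, p1, p2).

def distort_point(xn, yn, k1, k2, p1, p2):
    """Map an ideal normalized point (xn, yn) to its distorted location."""
    r2 = xn * xn + yn * yn
    radial = 1 + k1 * r2 + k2 * r2 * r2          # radial term (k1, k2)
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2 * yn * yn) + 2 * p2 * xn * yn
    return xd, yd

# The optical axis (0, 0) is unaffected by distortion.
print(distort_point(0.0, 0.0, -0.1, 0.02, 0.001, -0.001))  # -> (0.0, 0.0)
```

Undistorting an image then means resampling each output pixel from the distorted location its ideal coordinates map to.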
  • When correcting rotation, for example, the image correction unit 45 corrects the image by a known correction method that rotates the image by a rotation matrix, using the parameter value determined by the parameter determination unit 44. Likewise, when correcting translation, the image correction unit 45 corrects the image by a known correction method that translates the image by a translation matrix, using the parameter value determined by the parameter determination unit 44.
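The rotation and translation corrections can be sketched at the level of a single point; applying them to a whole image would resample every pixel through these maps. The helper names are illustrative.

```python
import math

def rotate_point(x, y, theta):
    """Apply the 2x2 rotation matrix [[cos,-sin],[sin,cos]] to (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return c * x - s * y, s * x + c * y

def translate_point(x, y, tx, ty):
    """Shift (x, y) by the translation vector (tx, ty)."""
    return x + tx, y + ty

print(rotate_point(1.0, 0.0, math.pi / 2))   # ~ (0.0, 1.0)
print(translate_point(1.0, 2.0, 0.5, -0.5))  # -> (1.5, 1.5)
```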
  • The image correction unit 45 corrects the image by correcting the blur estimation coefficient in super-resolution processing using the blur coefficient determined by the parameter determination unit 44, and then performing known super-resolution processing.
  • This super-resolution processing is a technique for generating a single high-resolution image from a plurality of low-resolution images.
  • In the high-resolution processing, a plurality of overlapping low-resolution images are aligned with sub-pixel accuracy, the pixel values of the high-resolution image are interpolated from them, and a single high-resolution image is then generated by removing blur, noise, and the like.
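The sub-pixel alignment idea behind super-resolution can be hinted at with a toy one-dimensional shift-and-add: two signals offset by half a pixel interleave onto a finer grid. Real super-resolution also estimates the shifts and removes blur and noise; this sketch assumes the offsets are known and is purely illustrative.

```python
# Toy shift-and-add: two 1-D low-resolution signals with a known
# half-pixel offset are interleaved onto a grid of twice the resolution.

def shift_and_add_1d(lr_a, lr_b):
    """Interleave two 1-D signals offset by half a (low-res) pixel."""
    hr = []
    for a, b in zip(lr_a, lr_b):
        hr.extend([a, b])
    return hr

print(shift_and_add_1d([10, 30], [20, 40]))  # -> [10, 20, 30, 40]
```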
  • Such super-resolution processing is disclosed in, for example, Japanese Patent Application Laid-Open Nos.
  • The blur is also caused by the image correction processing related to the internal parameters, such as correction of the focal length shift, correction of the center position shift, and correction of distortion. For this reason, it is preferable to perform the blur correction after the image correction processing related to the internal parameters.
  • The size of the image circle IC increases as shown in FIG. 9A, and the boundary position of the image circle IC shifts accordingly.
  • The luminance distribution also shifts, as shown in FIG. 9B.
  • In process S1, the boundary detection unit 43 detects the boundary position of the shifted image circle IC as, for example, the coordinates (7, 4).
  • In process S2, the parameter determination unit 44 searches the parameter table PT with the coordinates (7, 4) and acquires the parameter values from the fields 5231 to 5254 of the record corresponding to the coordinates (7, 4).
  • In process S3, the parameter values fx4, fy4, cx4, cy4, k14, k24, p14, and p24 are used to correct the image of the subject.
  • In the above description, the parameters are determined each time an image of the subject is formed, but the parameter determination timing is not limited to this.
  • For example, the parameters may be determined when the imaging apparatus CA is activated and used until the imaging apparatus CA is turned off.
  • Alternatively, the parameters may be determined and updated at a preset time interval, or every time the number of subject images formed by operating a shutter button (not shown) reaches a predetermined number.
  • As described above, the imaging apparatus CA and the image processing unit 2 in the present embodiment detect the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signal output from the imaging element 3, determine a predetermined parameter relating to a predetermined image correction process based on the detected boundary position of the image circle IC, and correct the image of the subject using the determined predetermined parameter. Because the imaging apparatus CA and the image processing unit 2 use the image circle IC that the imaging optical system 1 inherently has to determine the parameters relating to the image correction processing, they can detect a change in the optical characteristics of the lens without providing a special optical member as in the conventional technique, and can correct an image in accordance with the change in the optical characteristics.
  • Moreover, the imaging apparatus CA and the image processing unit 2 detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element 3, so the boundary position of the image circle IC can be detected with higher accuracy. Therefore, the imaging apparatus CA and the image processing unit 2 in this embodiment can correct an image with higher accuracy.
  • The imaging apparatus CA in the present embodiment uses, as the imaging optical system 1, a lens array in which small-diameter lenses are arranged, so the back focus of the imaging optical system 1 can be shortened and the height of the apparatus can be reduced.
  • Such an imaging apparatus CA is particularly suitable for mobile phones and smartphones, which are becoming ever thinner.
  • An image processing apparatus according to one aspect includes: an imaging element that converts an optical image of a subject formed on a light receiving surface by an imaging optical system into an electrical signal; an image generation unit that generates an image of the subject based on the electrical signal output from the imaging element; a boundary detection unit that detects a boundary position of the image circle of the imaging optical system based on the electrical signal output from the imaging element; a parameter determination unit that determines a predetermined parameter related to a predetermined image correction process based on the boundary position of the image circle detected by the boundary detection unit; and an image correction unit that corrects the image of the subject generated by the image generation unit using the predetermined parameter determined by the parameter determination unit.
  • Here, the image circle of the imaging optical system is the circular region in which the light passing through the lenses included in the imaging optical system forms an image. For this reason, the boundary of the image circle is affected by the optical characteristics of the lenses included in the imaging optical system.
  • Such an image processing apparatus detects the boundary position of the image circle of the imaging optical system based on the electrical signal output from the imaging element, determines a predetermined parameter relating to a predetermined image correction process based on the detected boundary position of the image circle, and corrects the image of the subject using the determined predetermined parameter. Because the image processing apparatus uses the image circle inherent in the imaging optical system to determine the parameters relating to the image correction processing, it can detect a change in the optical characteristics of the lens without providing a special optical member, such as the light-shielding wall disclosed in Patent Document 1, and can correct an image in accordance with the change in the optical characteristics.
  • The boundary detection unit detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element.
  • the boundary detection unit detects a boundary position of an image circle of the imaging optical system based on an integrated signal obtained by integrating the plurality of electrical signals.
  • The boundary detection unit selects the maximum luminance value at each corresponding pixel from the plurality of electrical signals and detects the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values.
  • Such an image processing apparatus detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element, so it can detect the boundary position of the image circle with higher accuracy. Therefore, such an image processing apparatus can correct an image with higher accuracy.
  • The predetermined parameter is one or more of a focal length, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient.
  • This provides an image processing apparatus that corrects an image using one or more of a focal length, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient as the predetermined parameter.
  • An image processing method according to another aspect includes: an imaging step of converting an optical image of a subject formed on a light receiving surface of an imaging element by an imaging optical system into an electrical signal; an image generation step of generating an image of the subject based on the electrical signal obtained by the imaging step; a boundary detection step of detecting a boundary position of the image circle of the imaging optical system based on the electrical signal obtained by the imaging step; a parameter determination step of determining a predetermined parameter relating to a predetermined image correction process based on the boundary position of the image circle detected by the boundary detection step; and an image correction step of correcting the image of the subject generated by the image generation step using the predetermined parameter determined by the parameter determination step.
  • Such an image processing method detects the boundary position of the image circle of the imaging optical system based on the electrical signal output from the imaging element, determines a predetermined parameter relating to a predetermined image correction process based on the detected boundary position of the image circle, and corrects the image of the subject using the determined predetermined parameter. Because the image processing method uses the image circle inherent in the imaging optical system to determine the parameters relating to the image correction processing, it can detect a change in the optical characteristics of the lens without providing a special optical member, and can correct the image in accordance with the change in the optical characteristics.
  • An imaging apparatus according to another aspect includes an imaging optical system that forms an optical image of a subject on a predetermined surface, and any one of the image processing apparatuses described above as an image processing unit that generates the image of the subject, the image processing unit being assembled so that the light receiving surface of its imaging element coincides with the predetermined surface of the imaging optical system.
  • Because such an imaging apparatus uses any of the image processing apparatuses described above, it can detect a change in the optical characteristics of the lens without providing a special optical member, and can correct the image in accordance with the change in the optical characteristics.
  • the imaging optical system includes a lens array including a plurality of lenses arranged in an array.
  • Because such an imaging apparatus uses a lens array of small-diameter lenses as the imaging optical system, the back focus of the imaging optical system can be shortened and the height of the apparatus can be reduced.
  • Such an imaging apparatus is particularly suitable for mobile phones and smartphones, which are becoming ever thinner.
  • In this way, an image processing device, an image processing method, and an imaging device can be provided.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

In the case of the present image processing device, image processing method, and imaging device, the boundary position of an image circle of an imaging optical system is detected on the basis of an electric signal that is output from an imaging component upon receiving an optical image of a photographic object formed on a light-receiving surface by the imaging optical system, predefined parameters related to predefined image correction processing are determined on the basis of the detected image circle boundary position, and the image of the photographic object is corrected using the determined predefined parameters.

Description

Image processing apparatus, image processing method, and imaging apparatus
 The present invention relates to an image processing apparatus and an image processing method for processing an image, and to an imaging apparatus using the image processing apparatus and the image processing method.
 In recent years, imaging devices (cameras) have been used as information acquisition means in portable devices, in-vehicle devices, medical devices, industrial devices, and the like. In particular, recent mobile phones and smartphones are typically equipped with an imaging device as standard. These imaging devices often employ lenses made of resin materials such as plastics, which are relatively inexpensive and can be mass-produced. Although resin lenses have these advantages, they are easily affected by temperature changes, and the optical characteristics of the lens (focal length, image center, distortion coefficients, and the like) change with temperature. When an image of a subject is generated based on the electrical signal obtained by an imaging element via an imaging optical system including such lenses, the image generation processing is generally performed on the assumption that these optical characteristics do not change, so a change in the optical characteristics directly leads to degradation of image quality. If a change in the optical characteristics can be detected and parameters suited to the change can be used, an image with reduced image-quality degradation can be obtained.
 A technique for correcting such a change in optical characteristics due to temperature change is disclosed, for example, in Patent Document 1. The imaging device disclosed in Patent Document 1 includes: a lens array disposed at a position facing a subject, in which a plurality of lenses are arranged in an array; an imaging element provided on the image-plane side of the lens array, which captures a compound-eye image that is a set of reduced images (single-eye images) of the subject formed by the plurality of lenses; and a computing unit that processes the compound-eye image captured by the imaging element. The device further includes light-shielding walls that prevent crosstalk of light rays between adjacent lenses of the lens array, and the computing unit determines internal parameters relating to the center position, focal length, and lens distortion of each lens based on luminance information incident on the imaging element, and calculates and corrects the amount of distortion caused by the ambient temperature of the light-shielding walls. Patent Document 1 proposes Zhang's method, disclosed in Non-Patent Document 1, as a method of obtaining the camera-specific internal and external parameters at each temperature.
 The imaging device disclosed in Patent Document 1 removes the influence of temperature change by calculating the position of each lens based on the luminance information of the light-shielding walls imaged by the imaging element. The light-shielding walls are therefore indispensable for removing the influence of temperature change, which increases the manufacturing steps and cost of the device.
JP 2011-147079 A
 The present invention has been made in view of the above circumstances, and an object thereof is to provide an image processing apparatus, an image processing method, and an imaging apparatus capable of detecting a change in the optical characteristics of a lens without providing a special optical member and correcting an image in accordance with the change in the optical characteristics.
 In the image processing apparatus, image processing method, and imaging apparatus according to the present invention, the boundary position of the image circle of the imaging optical system is detected based on the electrical signal output from the imaging element upon receiving the optical image of the subject formed on the light receiving surface by the imaging optical system; a predetermined parameter relating to a predetermined image correction process is determined based on the detected boundary position of the image circle; and the image of the subject is corrected using the determined predetermined parameter. Because such an image processing apparatus, image processing method, and imaging apparatus use the image circle that the imaging optical system inherently has to determine the parameters relating to the image correction processing, they can detect a change in the optical characteristics of the lens without providing a special optical member, and can correct the image in accordance with the change in the optical characteristics.
 The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description and the accompanying drawings.
 FIG. 1 is a block diagram showing the configuration of an imaging apparatus according to an embodiment.
 FIG. 2 is a diagram for explaining the light receiving surface (imaging surface) of the imaging element in the imaging apparatus.
 FIG. 3 is a diagram showing the configuration of a parameter table stored in a storage unit of the imaging apparatus.
 FIG. 4 is a diagram for explaining a change in the image circle caused by a temperature change in the imaging apparatus.
 FIG. 5 is a flowchart showing an operation of the imaging apparatus relating to image correction.
 FIG. 6 is a diagram for explaining the luminance distribution of the image circle in the imaging apparatus.
 FIG. 7 is a diagram for explaining a method of determining the boundary position of the image circle in the imaging apparatus.
 FIG. 8 is a diagram for explaining data used to determine the boundary position of the image circle in the imaging apparatus.
 FIG. 9 is a diagram for explaining changes in the boundary position and luminance distribution of the image circle caused by a temperature change in the imaging apparatus.
 Hereinafter, an embodiment of the present invention will be described with reference to the drawings. In the drawings, components given the same reference numerals are identical, and duplicate description is omitted as appropriate. In this specification, a reference numeral without a suffix denotes the components generically, and a reference numeral with a suffix denotes an individual component.
 FIG. 1 is a block diagram showing the configuration of the imaging apparatus according to the embodiment. FIG. 2 is a diagram for explaining the light receiving surface (imaging surface) of the imaging element in the imaging apparatus of the embodiment. FIG. 3 is a diagram showing the configuration of the parameter table stored in the storage unit of the imaging apparatus of the embodiment.
 The imaging apparatus in the present embodiment is an apparatus that captures a subject with an imaging optical system and generates an image of the subject. As shown in FIG. 1, such an imaging apparatus CA includes, for example, an imaging optical system 1 and an image processing unit 2, and may further include at least one of an input unit 6, a display unit 7, and an interface unit (IF unit) 8 as necessary.
 The imaging optical system 1 is an optical system that forms an optical image of a subject on a predetermined surface and includes one or more optical elements, for example an optical aperture and one or more lenses, and, where necessary to prevent so-called stray light, light-shielding plates between the optical elements. The imaging optical system 1 may be a monocular system that forms a single optical image (a single image circle) of the subject, but in the present embodiment it is a compound-eye system that includes a lens array with a plurality of lenses (single eyes) and forms a plurality of optical images (a plurality of image circles) of the subject. The plurality of lenses are arranged in parallel in an array so that their optical axes are parallel to each other. The lens array may be a one-dimensional lens array in which a plurality of lenses are arranged in parallel in a line, but in the present embodiment it is a two-dimensional lens array in which a plurality of lenses are arranged in a two-dimensional array in two linearly independent directions, more specifically in two mutually orthogonal directions. The two-dimensional lens array may also arrange the lenses in a honeycomb structure in the parallel direction. Each single-eye lens may consist of one lens element or of a plurality of lens elements.
 The image processing unit 2 is a device that generates an image of the subject. As shown in FIG. 1, the image processing unit 2 includes, for example, an imaging element 3, a control calculation unit 4, and a storage unit 5. The image processing unit 2 is assembled so that the light receiving surface of the imaging element 3 coincides with the predetermined surface of the imaging optical system 1 (position of the predetermined surface of the imaging optical system 1 = position of the light receiving surface of the imaging element).
The image sensor 3 includes a plurality of photoelectric conversion elements (a plurality of pixels) and uses these photoelectric conversion elements to convert the optical image of the subject, formed by the imaging optical system 1 on the light receiving surface constituted by the photoelectric conversion elements, into electric signals. More specifically, the image sensor 3 is connected to the control calculation unit 4 and, for example, photoelectrically converts the optical image of the subject formed by the imaging optical system 1 into image signals of the R (red), G (green), and B (blue) components according to the amount of light, and outputs them to the control calculation unit 4. In this way, the optical image of the subject is guided along the optical axis by the imaging optical system 1 at a predetermined magnification to the light receiving surface of the image sensor 3, and is captured by the image sensor 3.
In the present embodiment, since the imaging optical system 1 includes a two-dimensional lens array, as many optical images (individual-eye images) of the subject as there are lenses in the two-dimensional lens array are formed in parallel on the light receiving surface of the image sensor 3, in a two-dimensional array corresponding to the arrangement of the lenses in the two-dimensional lens array. For this reason, the light receiving surface (effective pixel region) of the image sensor 3 is divided into a plurality of imaging regions corresponding to the positions of the image circles formed on the light receiving surface by the respective lenses of the two-dimensional lens array, and each of these imaging regions is an imaging region for one lens (one individual eye). More specifically, when the two-dimensional lens array consists of, for example, 16 lenses arranged in parallel in a 4 × 4 two-dimensional array along two mutually orthogonal directions X and Y, the light receiving surface (effective pixel region) 30 is divided, as shown in FIG. 2, into 16 imaging regions (imaging regions for the individual eyes) 31 (31-11 to 31-44) corresponding to the positions of the 16 image circles IC formed on the light receiving surface 30 by the 16 lenses of the two-dimensional lens array, and an optical image of the subject is formed on each of these 16 imaging regions 31. In FIG. 2, only the image circle IC for the imaging region 31-11 in the first row and first column is shown; the others are omitted. These 16 imaging regions 31 are arranged in parallel in a 4 × 4 two-dimensional array along the two mutually orthogonal directions X and Y, in accordance with the arrangement of the 16 lenses described above. Each of the 16 imaging regions 31 is set as a rectangle inscribed in the image circle IC formed under predetermined design conditions (for example, a room temperature of 23 °C). That is, each imaging region 31 is set so that its diagonal equals the diameter of the image circle IC. One imaging region 31 includes a number of pixels (photoelectric conversion elements) that is at most 1/16 of the total number of pixels of the image sensor 3.
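As a minimal sketch of the inscribed-region geometry described above (the 400-pixel circle diameter below is a hypothetical value, not one taken from the embodiment), the side length of a square imaging region follows directly from the constraint that its diagonal equals the image-circle diameter:

```python
import math

def inscribed_square_side(circle_diameter_px: float) -> float:
    """Side length of a square whose diagonal equals the image-circle diameter."""
    # diagonal = side * sqrt(2)  =>  side = diagonal / sqrt(2)
    return circle_diameter_px / math.sqrt(2.0)

# Hypothetical example: a 400-pixel-diameter image circle
side = inscribed_square_side(400.0)
print(round(side, 1))  # ≈ 282.8 pixels per side
```

The pixels of the light receiving surface outside these inscribed rectangles are the ones left available for the boundary position detection regions described next.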
The image sensor 3 further includes a boundary position detection unit for detecting the boundary position of an image circle IC of the imaging optical system 1. More specifically, on the light receiving surface (effective pixel region) 30 of the image sensor 3, a boundary position detection region 32 serving as the boundary position detection unit is set in a region adjacent to an imaging region 31, with a size that contains the boundary position of the image circle IC. In the example shown in FIG. 2, one boundary position detection region 32 is set in a region adjacent to the imaging region 31-11 in the first row and first column. In this way, the boundary position detection region 32 makes use of photoelectric conversion elements on the light receiving surface 30 that are not used as imaging regions 31. A plurality of boundary position detection regions 32 may also be set: for example, a plurality may be set for all the imaging regions 31, for the imaging regions 31 in the first row, for the imaging regions 31 in the first column, for the imaging regions 31 on a diagonal, for the imaging regions 31 in every other row, or for the imaging regions 31 in every other column. Providing a plurality of boundary position detection regions 32 in this way improves the detection accuracy of the boundary position. Here, an integrally molded lens tends to deform under temperature change in a manner that depends on its manufacturing method. When a plurality of lenses that are not integrally molded are used, the manner of deformation often depends on the position of each lens because of the temperature distribution inside the imaging apparatus (a temperature distribution caused, for example, by heat generated by the electronic circuits). For example, in the case of an integrally molded lens array, the array may expand isotropically about the center position of the image sensor 3. In such a case, the distortion of the lenses due to temperature change is largest at the peripheral positions (diagonals, four corners), so by providing a plurality of detection regions for a plurality of individual eyes located in the periphery, the amount of distortion can be measured with high accuracy. This eliminates the need to provide detection regions for all the individual eyes, allowing a plurality of detection regions to be arranged efficiently. Depending on the manufacturing method, a lens may also expand only in the vertical or only in the horizontal direction. In such a case, the amount of distortion can be measured with high accuracy by arranging a plurality of detection regions in one row or one column.
In the above description, the light receiving surface of a single image sensor 3 is divided according to the number of lenses of the two-dimensional lens array; however, an image sensor may instead be provided for each lens of the two-dimensional lens array. In the case of the 4 × 4 two-dimensional lens array described above, 16 image sensors corresponding to the 16 lenses would then be provided.
The storage unit 5 is connected to the control calculation unit 4 and is an element that stores various programs, such as a control program for controlling the entire imaging apparatus CA, an image forming program for forming an image of the subject, and an image correction program for correcting the image of the subject, as well as various data, such as data necessary for executing these programs and data generated during their execution. The storage unit 5 includes, for example, a ROM (Read Only Memory), which is a nonvolatile storage element that stores the various programs; an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable nonvolatile storage element that stores the various data; a volatile storage element such as a RAM (Random Access Memory); and their peripheral circuits. The RAM of the storage unit 5 also functions as a so-called working memory for the CPU, described later, of the control calculation unit 4.
Functionally, the storage unit 5 includes an image storage unit 51 that stores image data, and a parameter storage unit 52 that stores in advance parameters corresponding to each temperature of the imaging optical system 1 (each boundary position of the image circle).
The predetermined parameters relating to the predetermined image correction process are values used when correcting the image of the subject. Examples of such predetermined parameters include internal parameters, external parameters, and a blur coefficient. The internal parameters are, for example, the focal position, the center position, and the distortion coefficients of the imaging optical system 1, while the external parameters are, for example, the translation coefficients and the rotation coefficients of the imaging optical system 1. The blur coefficient is a coefficient used in so-called super-resolution processing. Each value of these predetermined parameters is obtained in advance for each temperature of the imaging optical system 1. In the present embodiment, the temperature of the imaging optical system 1 is obtained from the boundary position of the image circle IC. For this reason, each value of the predetermined parameters is obtained in advance for each boundary position of the image circle IC in the imaging optical system 1. For example, taking into account the specified operating temperature range of the imaging apparatus CA, the boundary position of the image circle IC in the imaging optical system 1 is obtained in advance for each of the temperatures 0 °C, 20 °C, 40 °C, 60 °C, and 80 °C, and each value of the predetermined parameters is obtained in advance. The internal parameters and the external parameters may be obtained in advance by, for example, Zhang's method disclosed in Non-Patent Document 1 described above. In this way, the correspondence between each boundary position of the image circle IC in the imaging optical system 1 and each value of the predetermined parameters is obtained in advance.
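The internal parameters named here (focal lengths fx, fy, image center cx, cy, and distortion coefficients k1, k2, p1, p2) match the radial/tangential distortion model commonly used with Zhang-style calibration; the patent does not spell out the projection equations, so the sketch below is an illustration under that assumption rather than the embodiment's definitive formulation:

```python
def distort_point(xn: float, yn: float,
                  k1: float, k2: float, p1: float, p2: float) -> tuple:
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to a
    normalized image point (xn, yn)."""
    r2 = xn * xn + yn * yn
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = xn * radial + 2.0 * p1 * xn * yn + p2 * (r2 + 2.0 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2.0 * yn * yn) + 2.0 * p2 * xn * yn
    return xd, yd

def to_pixel(xd: float, yd: float,
             fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Project a distorted normalized point to pixel coordinates using the
    internal parameters fx, fy (focal lengths) and cx, cy (image center)."""
    return fx * xd + cx, fy * yd + cy
```

Since every one of these parameters varies with lens temperature, re-selecting them per detected boundary position is what keeps such a correction valid as the optics deform.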
From the correspondence created in advance in this way, a predetermined functional expression or the like representing the correspondence between each boundary position of the image circle and each value of the predetermined parameters may be derived and stored in the parameter storage unit 52, and the predetermined parameters relating to the predetermined image correction process may then be determined from the boundary position of the image circle by using this previously created functional expression. In the present embodiment, however, a lookup table representing the correspondence created in advance in this way is created and stored in the parameter storage unit 52, and the predetermined parameters relating to the predetermined image correction process are determined from the boundary position of the image circle by using this previously created lookup table (parameter table).
As shown in FIG. 3, such a parameter table PT includes, for example, the following fields: a boundary position x field 5211 for registering the boundary position x in the x direction; a boundary position y field 5212 for registering the boundary position y in the y direction; a temperature field 522 for registering the temperature; a focal length fx field 5231 for registering the focal length fx in the x direction; a focal length fy field 5232 for registering the focal length fy in the y direction; an image center cx field 5241 for registering the image center position cx in the x direction; an image center cy field 5242 for registering the image center position cy in the y direction; a distortion coefficient k1 field 5251 for registering the first distortion coefficient k1; a distortion coefficient k2 field 5252 for registering the second distortion coefficient k2; a distortion coefficient p1 field 5253 for registering the third distortion coefficient p1; and a distortion coefficient p2 field 5254 for registering the fourth distortion coefficient p2. The table has one record per temperature.
Thus, in the example shown in FIG. 3, the correspondence between each boundary position of the image circle and each value of the internal parameters (focal position, center position, and distortion coefficients) is registered in the parameter table PT.
For example, in the example shown in FIG. 3, in the record in which "20" is registered in the temperature field 522, "3" is registered in the boundary position x field 5211, "4" in the boundary position y field 5212, "fx2" in the focal length fx field 5231, "fy2" in the focal length fy field 5232, "cx1" in the image center cx field 5241, "cy1" in the image center cy field 5242, "k12" in the distortion coefficient k1 field 5251, "k22" in the distortion coefficient k2 field 5252, "p12" in the distortion coefficient p1 field 5253, and "p22" in the distortion coefficient p2 field 5254.
Of course, the correspondence between each boundary position of the image circle and each value of the external parameters (translation coefficients and rotation coefficients) may also be registered in the parameter table PT, and the correspondence between each boundary position of the image circle and each value of the blur coefficient may likewise be registered in the parameter table PT. In the above description, one parameter table PT is applied to all the lenses in the lens array (shared by all individual eyes), but a parameter table PT may instead be prepared for each lens (each individual eye). By preparing a parameter table PT corresponding to each lens (individual eye) in this way, parameter values can be set appropriately according to the optical characteristics of that lens with respect to temperature change.
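The table lookup described above can be sketched as follows. The numeric values are placeholders standing in for the symbolic entries fx2, cy1, k12, ... of FIG. 3, and nearest-neighbor matching on the detected boundary position is one plausible way to select a record when the detected position falls between registered ones (the patent does not fix the matching rule):

```python
# Each record maps a registered boundary position (x, y) to the internal
# parameters measured for that state; values below are hypothetical.
PARAMETER_TABLE = {
    (2, 3): {"fx": 1000.0, "fy": 1000.0, "cx": 320.0, "cy": 240.0},  # e.g. 0 deg C
    (3, 4): {"fx": 1002.0, "fy": 1002.0, "cx": 320.5, "cy": 240.5},  # e.g. 20 deg C
    (4, 5): {"fx": 1004.0, "fy": 1004.0, "cx": 321.0, "cy": 241.0},  # e.g. 40 deg C
}

def lookup_parameters(table, bx, by):
    """Return the record whose registered boundary position is closest
    (in the Euclidean sense) to the detected boundary position (bx, by)."""
    key = min(table, key=lambda k: (k[0] - bx) ** 2 + (k[1] - by) ** 2)
    return table[key]

# A detected boundary position of (3.2, 4.1) selects the 20 deg C record.
params = lookup_parameters(PARAMETER_TABLE, 3.2, 4.1)
```

Interpolating between neighboring records, rather than snapping to the nearest one, would be a natural refinement for boundary positions between the registered temperatures.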
The control calculation unit 4 is a device that governs the operation of the entire imaging apparatus CA by controlling each unit according to its function, and that forms an image of the subject while correcting the image by using the predetermined parameters relating to the predetermined image correction process. The control calculation unit 4 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing the various programs, the control calculation unit 4 functionally comprises a control unit 41, an image generation unit 42, a boundary detection unit 43, a parameter determination unit 44, and an image correction unit 45.
The control unit 41 governs the operation of the entire imaging apparatus CA by controlling each unit according to its function.
The image generation unit 42 generates an image of the subject based on the electric signals output from the image sensor 3. The image data of the generated image is stored in the image storage unit 51 of the storage unit 5.
The boundary detection unit 43 detects the boundary position of an image circle of the imaging optical system 1 based on the electric signals output from the image sensor 3.
The parameter determination unit 44 determines the predetermined parameters relating to the predetermined image correction process based on the boundary position of the image circle detected by the boundary detection unit 43.
The image correction unit 45 corrects the image of the subject generated by the image generation unit 42 and stored in the image storage unit 51, using the predetermined parameters determined by the parameter determination unit 44. The image data of the corrected image is stored in the image storage unit 51 of the storage unit 5.
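The division of labor among the units 42 to 45 amounts to a simple pipeline: generate, detect boundary, determine parameters, correct. The toy stand-ins below (1-D "signals", a single gain as the correction parameter, a hypothetical lookup table) only illustrate this data flow, not the embodiment's actual processing:

```python
def generate_image(signals):
    """Stand-in for the image generation unit 42: here the raw signals
    are taken directly as the image."""
    return list(signals)

def detect_boundary(signals):
    """Stand-in for the boundary detection unit 43: index of the first
    sample that exceeds half the maximum (a toy 1-D boundary)."""
    threshold = max(signals) / 2.0
    return next(i for i, v in enumerate(signals) if v >= threshold)

def determine_gain(boundary_index):
    """Stand-in for the parameter determination unit 44: map the detected
    boundary position to a correction gain via a hypothetical table."""
    table = {2: 1.00, 3: 1.02, 4: 1.04}
    return table.get(boundary_index, 1.0)

def correct_image(image, gain):
    """Stand-in for the image correction unit 45: apply the gain."""
    return [v * gain for v in image]

signals = [0, 0, 10, 100, 100, 100]
corrected = correct_image(generate_image(signals),
                          determine_gain(detect_boundary(signals)))
```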
The input unit 6 is a device for inputting various instructions, such as an instruction specifying imaging conditions or an instruction to start imaging, to the imaging apparatus CA from outside; it is, for example, a rotary switch or a push-button switch. The display unit 7 is a device for outputting the instruction contents input from the input unit 6 and the processing results of the control calculation unit 4, such as the image of the subject; it is, for example, an LCD (liquid crystal display) or an organic EL display. The display unit 7 may function as a so-called live-view finder. The input unit 6 may include a position input device that detects and inputs an operation position, for example of the resistive-film type or the capacitive type, and this position input device and the display unit 7 may together constitute a so-called touch panel.
In this touch panel, the position input device is provided on the display surface of the display unit, and one or more candidates for input contents are displayed on the display unit; when the user touches the display position showing the input contents to be entered, that position is detected by the position input device and the display contents shown at the detected position are input to the imaging apparatus CA as the user's operation input. With such a touch panel, the user can intuitively grasp the input operation, so an imaging apparatus CA that is easy for the user to handle is provided. Furthermore, by changing the display contents displayed on the display unit 7, various input contents can be entered, and various operations can be performed on the imaging apparatus CA.
The IF unit 6 is a circuit that inputs and outputs data to and from external devices, and is, for example, an interface circuit using the IEEE 802.11 series of standards known as Wi-Fi (Wireless Fidelity), an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA (Infrared Data Association) standard, or an interface circuit using the USB (Universal Serial Bus) standard.
Next, the operation of the imaging apparatus in the present embodiment will be described. FIG. 4 is a diagram for explaining changes in an image circle due to temperature change in the imaging apparatus of the embodiment. FIG. 4A is a diagram for explaining the change in the size of the image circle due to temperature change, and FIG. 4B is a diagram for explaining the change in the center position of the image circle due to temperature change.
In such an imaging apparatus CA, the temperature of the imaging optical system 1 changes with changes in the ambient temperature in which it is used. As a result, expansion, contraction, or distortion may occur in the optical elements of the imaging optical system 1. In particular, when an optical element of the imaging optical system 1 is a lens made of a resin material such as plastic, expansion, contraction, and distortion occur more readily because of its temperature characteristics. When an optical element of the imaging optical system 1 expands or contracts, the size (area) of the image circle IC expands or contracts as shown in FIG. 4A, and the boundary position of the image circle IC changes and shifts accordingly. In the example shown in FIG. 4A, when the temperature of the imaging optical system 1 changes from 20 °C to 60 °C, the image circle IC grows from the image circle IC(20) to the image circle IC(60). Here, IC(t) denotes the image circle IC at a temperature of t °C. When distortion occurs in an optical element of the imaging optical system 1, the center position of the image circle IC (image center position) shifts as shown in FIG. 4B, and the boundary position of the image circle IC changes and shifts accordingly. In the example shown in FIG. 4B, when the temperature of the imaging optical system 1 changes from 20 °C to 60 °C, the center position (cx, cy) of the image circle IC shifts from the center position (cx2, cy2) of the image circle IC(20) to the center position (cx4, cy4) of the image circle IC(60).
Therefore, the boundary position of the image circle IC correlates with the temperature of the imaging optical system 1, and hence with the predetermined parameters for the image correction process given for each temperature of the imaging optical system 1. For this reason, by associating them with each other in advance, it becomes possible to set, simply by detecting the boundary position of the image circle IC, the predetermined parameters for the image correction process that remove or reduce the error introduced into the image by the temperature change.
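To make the correlation concrete, one might model the image-circle radius as growing linearly with temperature, as in first-order thermal expansion. This linear model and the coefficient value below are purely illustrative assumptions; the embodiment relies only on the monotone correspondence between boundary position and temperature, not on any particular functional form:

```python
def image_circle_radius(r_ref: float, alpha: float, t: float,
                        t_ref: float = 20.0) -> float:
    """Toy first-order model: radius at temperature t [deg C], given the
    radius r_ref at a reference temperature t_ref and an assumed linear
    expansion coefficient alpha [1/deg C]."""
    return r_ref * (1.0 + alpha * (t - t_ref))

# With a hypothetical alpha of 1e-4 per deg C, a 200-pixel radius at 20 deg C
# grows by 0.8 pixels at 60 deg C -- a shift the detection region can resolve.
r60 = image_circle_radius(200.0, 1e-4, 60.0)
```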
More specifically, the imaging apparatus CA of the present embodiment operates as follows: it detects the boundary position of the image circle IC, determines the values of the predetermined parameters relating to the image correction process, and corrects the image of the subject by using the determined predetermined parameters.
FIG. 5 is a flowchart showing the operation relating to image correction in the imaging apparatus of the embodiment. FIG. 6 is a diagram for explaining the luminance distribution of an image circle in the imaging apparatus of the embodiment. FIG. 6A is a diagram schematically showing the state of the light receiving surface of the image sensor when a subject with a substantially uniform luminance distribution is imaged, and FIG. 6B is a diagram showing the luminance distribution along the line X1-X2 shown in FIG. 6A. The horizontal axis of FIG. 6B indicates the position on the line X1-X2, and its vertical axis indicates the luminance value. FIG. 7 is a diagram for explaining a method of determining the boundary position of the image circle in the imaging apparatus of the embodiment. The horizontal axis of FIG. 7 indicates the position, and its vertical axis indicates the luminance value. FIG. 8 is a diagram for explaining the data used for determining the boundary position of the image circle in the imaging apparatus of the embodiment. FIG. 8A shows the luminance distribution near the boundary of the image circle obtained when a subject with a uniform luminance distribution (uniform subject) is imaged. FIG. 8B shows the luminance distribution BA near the boundary of the image circle obtained when a typical subject with a nonuniform luminance distribution (general subject A) is imaged. FIG. 8C shows the luminance distribution BB near the boundary of the image circle obtained when another typical subject with a nonuniform luminance distribution (general subject B) is imaged. FIG. 8D shows the luminance distribution BC near the boundary of the image circle obtained when a typical subject with a nonuniform luminance distribution (general subject C) is imaged. FIG. 8E shows the average luminance distribution Bave obtained by averaging the luminance distributions BA to BC shown in FIGS. 8B to 8D. FIG. 9 is a diagram for explaining the change in the boundary position of the image circle and the change in the luminance distribution due to temperature change in the imaging apparatus of the embodiment. FIG. 9A shows the change in the boundary position of the image circle due to temperature change, and FIG. 9B shows the change in the luminance distribution of the image circle due to temperature change. The horizontal axis of FIG. 9B indicates the position, and its vertical axis indicates the luminance value.
For example, when a power switch (not shown) included in the input unit 6 of the imaging apparatus CA is operated and the imaging apparatus CA is activated, the control calculation unit 4 executes the various programs stored in the storage unit 5, such as the control program and the image correction program. As a result, the control unit 41, the image generation unit 42, the boundary detection unit 43, the parameter determination unit 44, and the image correction unit 45 are functionally configured in the control calculation unit 4.
Then, for example, when a shutter button (not shown) included in the input unit 6 of the imaging apparatus CA is operated, the imaging apparatus CA generates an image of the subject. More specifically, a light beam from the subject enters the imaging optical system 1, and an optical image of the subject is formed on the light receiving surface 30 of the image sensor 3 by the imaging optical system 1. In the present embodiment, since the imaging optical system 1 includes the two-dimensional lens array as described above, a plurality of optical images of the subject (individual-eye images), each of the size of an image circle IC, are formed on the light receiving surface 30 of the image sensor 3.
The image sensor 3 converts the optical image of the subject into electrical signals, each photoelectric conversion element producing a signal whose magnitude corresponds to the amount of light it receives. These electrical signals are output from the image sensor 3 to the control calculation unit 4.
The image generation unit 42 of the control calculation unit 4 generates, for each imaging region 31, image data of the subject image (single-eye image) based on the electrical signals obtained by the photoelectric conversion elements corresponding to that imaging region 31. The image generation unit 42 then stores the generated image data in the image storage unit 51 in association with the imaging region 31.
In FIG. 5, the boundary detection unit 43 of the control calculation unit 4 detects the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signals obtained by the photoelectric conversion elements corresponding to the boundary position detection region 32, and notifies the parameter determination unit 44 of the detected boundary position of the image circle IC (S1). This detection of the boundary position of the image circle IC is executed, for example, as follows.
For example, when the subject has a substantially uniform luminance distribution, an optical image that is substantially uniformly bright inside the image circle IC and substantially uniformly dark outside it is formed on the light receiving surface 30 of the image sensor 3, as shown in FIG. 6A. For this reason, the luminance distribution along the straight line X1-X2, which includes pixels (photoelectric conversion elements) not only inside but also outside the image circle IC, has the profile shown in FIG. 6B: the luminance value is approximately 0 outside one edge of the image circle IC, rises sharply near that boundary of the image circle IC, remains approximately constant at a predetermined value inside the image circle IC, falls sharply near the other boundary of the image circle IC, and is approximately 0 again outside the other edge of the image circle IC. That is, the amount of light is greatest at the center of the image circle IC and decreases toward the periphery of the image circle IC.
For this reason, the boundary position of the image circle IC can be detected by setting a threshold Th, used to judge whether a given position is the boundary of the image circle IC, to a predetermined value, for example, a first threshold Th0 of 0 (Th0 = 0) as shown in FIG. 7. That is, the boundary detection unit 43 first determines, for each electrical signal obtained by each photoelectric conversion element (each pixel) arranged along a predetermined straight line in the boundary position detection region 32, whether its luminance value exceeds the first threshold Th0 (= 0). When the boundary detection unit 43 determines that a luminance value exceeds the first threshold Th0, it detects the position of the photoelectric conversion element (pixel) whose luminance value exceeded the first threshold Th0 as the boundary position of the image circle IC. Alternatively, as shown in FIG. 7, the boundary position of the image circle IC can be detected by setting the threshold Th to, for example, a second threshold Th1 equal to an intermediate value (Th1 = intermediate value). That is, the boundary detection unit 43 first determines, for each electrical signal obtained by each photoelectric conversion element (each pixel) arranged along a predetermined straight line in the boundary position detection region 32, whether its luminance value exceeds the second threshold Th1 (= intermediate value). When the boundary detection unit 43 determines that a luminance value exceeds the second threshold Th1, it detects the position of the photoelectric conversion element (pixel) whose luminance value exceeded the second threshold Th1 as the boundary position of the image circle IC. The intermediate value is the value midway between the maximum luminance value and the minimum luminance value. Such an intermediate value may be set in advance to the value midway between the maximum and minimum luminance values that the photoelectric conversion elements of the image sensor 3 can take (static setting), or it may be set to the value midway between the maximum and minimum luminance values obtained from the actual luminance distribution of the photoelectric conversion elements (pixels) arranged along the predetermined straight line in the boundary position detection region 32 (dynamic setting). By setting the threshold Th to a predetermined value other than 0 in this way, the influence of noise is reduced and the detection accuracy of the boundary position of the image circle IC is improved.
In the present embodiment, as shown in FIGS. 2 and 7, an XY coordinate system is set along two adjacent sides of the rectangular boundary position detection region 32. The boundary detection unit 43 then determines, for each electrical signal obtained by each photoelectric conversion element (each pixel) arranged along a straight line parallel to the X axis, more specifically the line Y = 4, whether its luminance value exceeds the second threshold Th1, and when it determines that a luminance value exceeds the second threshold Th1, it detects the position of the photoelectric conversion element (pixel) whose luminance value exceeded the second threshold Th1 as the boundary position of the image circle IC.
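As an illustration of the threshold-based detection described above, the following sketch scans the luminance values along one line of the boundary position detection region and reports the first pixel whose luminance exceeds the threshold. The function names, the list of luminance values, and the mid-value threshold computation are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch of threshold-based boundary detection.
# `row` stands for the luminance values of the pixels along one scan
# line (e.g. Y = 4) in the boundary position detection region.

def mid_value_threshold(row):
    """Dynamic setting: midpoint of the actual min/max luminance."""
    return (max(row) + min(row)) / 2.0

def detect_boundary_x(row, threshold):
    """Return the index of the first pixel whose luminance exceeds the
    threshold, i.e. the detected image-circle boundary position along
    this line, or None if the threshold is never exceeded."""
    for x, luminance in enumerate(row):
        if luminance > threshold:
            return x
    return None

row = [0, 1, 2, 3, 5, 20, 90, 200, 250, 252]  # dark outside -> bright inside
th1 = mid_value_threshold(row)                # second threshold Th1 (mid value)
print(detect_boundary_x(row, th1))            # -> 7
```

Scanning a row from the opposite edge would locate the other boundary of the image circle in the same manner.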
In the above description, the luminance distribution of the subject was assumed to be substantially uniform, but the luminance distribution of an actual subject is not necessarily uniform. For this reason, the boundary detection unit 43 may be configured to detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting a plurality of optical images with the image sensor 3. The plurality of optical images may be formed of the same subject at different points in time, or may be formed of different subjects.
In such a case, the boundary detection unit 43 may be configured to detect the boundary position of the image circle of the imaging optical system 1 based on an integrated signal obtained by integrating the plurality of electrical signals. More specifically, as shown in FIGS. 8B to 8D, a plurality of electrical signals are first obtained by converting a plurality of optical images having different luminance distributions with the image sensor 3. These electrical signals are then added for each photoelectric conversion element (each pixel) and averaged over the number of optical images, which yields the integrated signal. Since this integrated signal is the average of a plurality of optical images with various luminance distributions, it has a substantially uniform luminance distribution, as shown in FIG. 8E, approximating the luminance distributions shown in FIGS. 7 and 8A.
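The per-pixel integration just described can be sketched as follows. This is purely illustrative: the short lists stand for the luminance values along one scan line of the boundary position detection region in several captures.

```python
def average_frames(frames):
    """Add the luminance of each pixel across frames and divide by the
    number of frames, yielding the 'integrated signal' whose profile
    approximates that of a uniform subject (as in FIG. 8E)."""
    n = len(frames)
    return [sum(pixel) / n for pixel in zip(*frames)]

# Three scan lines from subjects with different (nonuniform) luminance:
frame_a = [0, 0, 40, 200, 240]
frame_b = [0, 0, 160, 80, 210]
frame_c = [0, 0, 100, 140, 240]
print(average_frames([frame_a, frame_b, frame_c]))
# -> [0.0, 0.0, 100.0, 140.0, 230.0]
```

The averaged line can then be fed to the same threshold test used for a uniform subject.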
Alternatively, in such a case, the boundary detection unit 43 may be configured to select the maximum luminance value at each corresponding pixel from among the plurality of electrical signals and to detect the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values. More specifically, a plurality of electrical signals are first obtained by converting a plurality of optical images having different luminance distributions with the image sensor 3. Then, for each photoelectric conversion element (each pixel), the maximum luminance value among these electrical signals is selected and taken as the luminance value of that photoelectric conversion element (pixel). This yields a signal composed of the maximum luminance value at each photoelectric conversion element (each pixel), and the boundary position of the image circle of the imaging optical system is detected from the luminance distribution of this signal in the same manner as described above.
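The maximum-value variant described above can be sketched as follows (purely illustrative; the short lists stand for per-pixel luminance along one scan line in different captures):

```python
def max_frames(frames):
    """Select, for each pixel, the maximum luminance across frames;
    the result is treated as one scan line for boundary detection."""
    return [max(pixel) for pixel in zip(*frames)]

frame_a = [0, 0, 40, 200, 240]
frame_b = [0, 0, 160, 80, 210]
print(max_frames([frame_a, frame_b]))  # -> [0, 0, 160, 200, 240]
```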
Next, the parameter determination unit 44 of the control calculation unit 4 determines predetermined parameters relating to predetermined image correction processing based on the boundary position of the image circle IC detected by the boundary detection unit 43, and notifies the image correction unit 45 of the determined parameter values (S2). This parameter determination is executed, for example, as follows.
For example, in the present embodiment, the parameter table PT, a lookup table stored in advance in the parameter storage unit 52 as described above, is used when determining the parameters. More specifically, the parameter determination unit 44 searches the boundary position x field 5211 and the boundary position y field 5212 using the boundary position of the image circle IC detected by the boundary detection unit 43 as a key, and finds the record whose boundary position x field 5211 and boundary position y field 5212 register that detected boundary position. The parameter determination unit 44 then acquires, from the focal length fx field 5231, focal length fy field 5232, image center cx field 5241, image center cy field 5242, distortion coefficient k1 field 5251, distortion coefficient k2 field 5252, distortion coefficient p1 field 5253, and distortion coefficient p2 field 5254 of the found record, the parameter values of the focal length fx, focal length fy, image center position cx, image center position cy, first distortion coefficient k1, second distortion coefficient k2, third distortion coefficient p1, and fourth distortion coefficient p2, respectively. The value of each parameter is thereby determined.
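The table search described above amounts to indexing a lookup table by the detected boundary coordinates. The following sketch is hypothetical: the keys and parameter values are invented placeholders, standing in for the records with fields 5211 to 5254 held in the parameter storage unit 52.

```python
# Hypothetical parameter table keyed by the detected boundary position
# (x, y); all numeric values are invented placeholders.
parameter_table = {
    (6, 4): {"fx": 3.50, "fy": 3.50, "cx": 100.0, "cy": 100.0,
             "k1": -0.10, "k2": 0.01, "p1": 0.001, "p2": 0.001},
    (7, 4): {"fx": 3.52, "fy": 3.52, "cx": 100.5, "cy": 100.5,
             "k1": -0.11, "k2": 0.01, "p1": 0.001, "p2": 0.001},
}

def determine_parameters(boundary_xy):
    """Find the record whose boundary-position fields match the
    detected boundary and return its correction parameters."""
    return parameter_table[boundary_xy]

params = determine_parameters((7, 4))
print(params["fx"], params["k1"])  # -> 3.52 -0.11
```

In a deployed system the table entries would typically be calibrated per boundary position, for example over the expected operating temperature range.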
Next, the image correction unit 45 corrects the image of the subject generated by the image generation unit 42 and stored in the image storage unit 51, using the parameter values determined by the parameter determination unit 44 (S3).
For example, when correcting a shift in the focal length of the imaging optical system 1, the focal length shift appears in the image as a change in image size. For this reason, the image correction unit 45 corrects the image by a known correction method that enlarges or reduces the image at a magnification corresponding to the amount of the focal length shift, using the focal length fx and focal length fy determined by the parameter determination unit 44. When the magnification is greater than 1, the scaling process is an enlargement; when the magnification is 1, it is a unity-magnification process; and when the magnification is less than 1, it is a reduction.
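A minimal sketch of such magnification-dependent scaling follows. Nearest-neighbour resampling is an illustrative choice here; the patent only refers to a known enlargement/reduction method.

```python
def rescale(image, magnification):
    """Nearest-neighbour enlargement/reduction of a 2-D image at the
    given magnification (>1 enlarges, ==1 leaves the size unchanged,
    <1 reduces), as used to compensate a focal-length shift."""
    h, w = len(image), len(image[0])
    new_h = max(1, round(h * magnification))
    new_w = max(1, round(w * magnification))
    return [[image[min(h - 1, int(r / magnification))]
                  [min(w - 1, int(c / magnification))]
             for c in range(new_w)] for r in range(new_h)]

img = [[1, 2], [3, 4]]
print(rescale(img, 2.0))  # 4x4 image, each source pixel duplicated
print(rescale(img, 1.0))  # -> [[1, 2], [3, 4]] (unity magnification)
```

In practice an interpolating resampler would be preferred to avoid blocky artifacts, but the magnification logic is the same.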
Also, for example, when correcting a shift in the center position of the imaging optical system 1, the center position shift appears in a stereo camera as a shift in the baseline length. For this reason, the image correction unit 45 corrects the image by a known correction method that corrects the baseline length according to the amount of the center position shift, using the image center position cx and image center position cy determined by the parameter determination unit 44. When the center position shift makes the baseline length longer than the original baseline length, the baseline length is corrected to be shorter according to the amount of the shift; when the shift makes the baseline length shorter than the original, the baseline length is corrected to be longer according to the amount of the shift. The center position shift also appears as a positional misalignment between the single-eye images (a bias in image position). For this reason, the image correction unit 45 corrects the image by correcting this misalignment with a known correction method, using the image center position cx and image center position cy determined by the parameter determination unit 44.
Also, for example, when correcting distortion in the imaging optical system 1, the distortion appears as distortion of the image, particularly in its peripheral portion. For this reason, the image correction unit 45 corrects the image by a known correction method that, using the first to fourth distortion coefficients k1, k2, p1, and p2 determined by the parameter determination unit 44, transforms the distorted image into a natural, substantially distortion-free image similar in shape to the scene seen with the naked eye.
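The four coefficients k1, k2, p1, and p2 match the form of the widely used Brown-Conrady distortion model (two radial and two tangential terms). Assuming that model for illustration — the patent itself only refers to a known correction method — the distortion mapping and a simple iterative inversion can be sketched as:

```python
def distort(x, y, k1, k2, p1, p2):
    """Brown-Conrady model: map ideal normalized coordinates (x, y)
    to their distorted position. Correction inverts this mapping,
    typically by iteration or by resampling the image through it."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

def undistort(x_d, y_d, k1, k2, p1, p2, iters=20):
    """Invert `distort` by fixed-point iteration (valid for the small
    coefficient magnitudes typical of lens calibration)."""
    x, y = x_d, y_d
    for _ in range(iters):
        ex, ey = distort(x, y, k1, k2, p1, p2)
        x, y = x - (ex - x_d), y - (ey - y_d)
    return x, y

xd, yd = distort(0.3, -0.2, -0.1, 0.01, 0.001, 0.001)
x, y = undistort(xd, yd, -0.1, 0.01, 0.001, 0.001)
print(round(x, 6), round(y, 6))  # recovers approximately (0.3, -0.2)
```

Updating k1, k2, p1, and p2 from the lookup table then changes this mapping to track the lens's current optical state.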
Also, for example, when correcting rotation, the image correction unit 45 corrects the image by a known correction method that rotates the image with a rotation matrix built from the parameter values determined by the parameter determination unit 44. Similarly, when correcting translation, the image correction unit 45 corrects the image by a known correction method that translates the image with a translation matrix built from the parameter values determined by the parameter determination unit 44.
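Such rotation and translation corrections amount to applying a 2-D rigid transform to pixel coordinates; a hedged sketch follows, where the angle and offsets are illustrative stand-ins for the patent's rotation and translation coefficients:

```python
import math

def rigid_transform(points, angle_rad, tx, ty):
    """Apply a 2-D rotation followed by a translation, i.e. the matrix
    [[cos, -sin, tx], [sin, cos, ty]], to pixel coordinates; an image
    is corrected by resampling through the inverse of this map."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

pts = [(1.0, 0.0), (0.0, 1.0)]
print(rigid_transform(pts, math.pi / 2, 10.0, 0.0))
# 90-degree rotation then shift by (10, 0): approximately [(10, 1), (9, 0)]
```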
Also, for example, when correcting blur, the image correction unit 45 corrects the blur estimation coefficient used in super-resolution processing with the blur coefficient determined by the parameter determination unit 44, and then corrects the image by performing known super-resolution processing. Super-resolution processing is a technique for generating one high-resolution image from a plurality of low-resolution images; for example, a plurality of overlapping low-resolution images are aligned with sub-pixel accuracy to complement the pixel values of the high-resolution image, after which blur, noise, and the like are removed to generate one high-resolution image. Such super-resolution processing is disclosed in, for example, Japanese Patent Application Laid-Open Nos. 2003-141529, 2009-65535, and 2011-182237. Note that blur also arises from the image correction processes relating to the internal parameters described above, such as the correction of the focal length shift, the correction of the center position shift, and the correction of distortion. For this reason, the blur correction is preferably performed after the image correction processing relating to the internal parameters.
By performing such correction processing, the image can be corrected in accordance with changes in the optical characteristics of the imaging optical system 1.
In one specific example, when the temperature of the imaging optical system 1 changes from about 20°C to about 60°C, the image circle IC grows larger and its boundary position shifts, as shown in FIG. 9A. As a result, the luminance distribution also shifts, as shown in FIG. 9B. The boundary position (X, 4) of the shifted image circle IC is detected by the boundary detection unit 43 in process S1 as, for example, the coordinates (7, 4). In process S2, the parameter determination unit 44 therefore searches the parameter table PT with the coordinates (7, 4) and acquires, from the fields 5231 to 5254 of the record corresponding to the coordinates (7, 4), the parameter values fx4, fy4, cx4, cy4, k14, k24, p14, and p24 of the focal length fx, focal length fy, image center position cx, image center position cy, first distortion coefficient k1, second distortion coefficient k2, third distortion coefficient p1, and fourth distortion coefficient p2, thereby determining the value of each parameter. In process S3, these parameter values fx4, fy4, cx4, cy4, k14, k24, p14, and p24 are then used to correct the image of the subject.
In the above description, the parameters were determined each time an image of the subject was formed, but the timing of the parameter determination is not limited to this. For example, the parameters may be determined when the imaging apparatus CA is activated and used until the imaging apparatus CA is powered off. Alternatively, the parameters may be determined and updated at a preset time interval. Alternatively, the parameters may be determined and updated each time the number of subject images formed by operating the shutter button (not shown) reaches a preset count.
As described above, the imaging apparatus CA and the image processing unit 2 in the present embodiment detect the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signals output from the image sensor 3, determine predetermined parameters relating to predetermined image correction processing based on the detected boundary position of the image circle IC, and correct the image of the subject using the determined parameters. Because the imaging apparatus CA and the image processing unit 2 in the present embodiment thus use the image circle IC, which the imaging optical system 1 inherently has, to determine the parameters for the image correction processing, they can detect changes in the optical characteristics of the lens and correct the image in accordance with those changes without providing a special optical member as in the conventional technique.
In addition, since the imaging apparatus CA and the image processing unit 2 in the present embodiment detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting a plurality of optical images with the image sensor 3, the boundary position of the image circle IC can be detected with higher accuracy. Accordingly, the imaging apparatus CA and the image processing unit 2 in the present embodiment can correct images with higher accuracy.
Furthermore, since the imaging optical system 1 of the imaging apparatus CA in the present embodiment includes a lens array in which lenses of small diameter are arranged, the back focus of the imaging optical system 1 can be shortened and a low-profile design becomes possible. Such an imaging apparatus CA is particularly suitable for mobile phones and smartphones, which continue to become thinner.
As described above, this specification discloses techniques in various aspects, of which the main techniques are summarized below.
An image processing apparatus according to one aspect includes: an image sensor that converts an optical image of a subject formed on a light receiving surface by an imaging optical system into electrical signals; an image generation unit that generates an image of the subject based on the electrical signals output from the image sensor; a boundary detection unit that detects a boundary position of an image circle of the imaging optical system based on the electrical signals output from the image sensor; a parameter determination unit that determines a predetermined parameter relating to predetermined image correction processing based on the boundary position of the image circle detected by the boundary detection unit; and an image correction unit that corrects the image of the subject generated by the image generation unit using the predetermined parameter determined by the parameter determination unit.
The image circle of an imaging optical system is the circular region within which light that has passed through the lenses of the imaging optical system forms an image. For this reason, the boundary of the image circle is affected by the optical characteristics of the lenses included in the imaging optical system. The above image processing apparatus detects the boundary position of the image circle of the imaging optical system based on the electrical signals output from the image sensor, determines a predetermined parameter relating to predetermined image correction processing based on the detected boundary position of the image circle, and corrects the image of the subject using the determined parameter. Because the above image processing apparatus thus uses the image circle, which the imaging optical system inherently has, to determine the parameter for the image correction processing, it can detect changes in the optical characteristics of the lenses and correct the image in accordance with those changes without providing a special optical member such as the light shielding wall disclosed in Patent Document 1.
In another aspect of the above image processing apparatus, the boundary detection unit detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting a plurality of optical images with the image sensor. Preferably, the boundary detection unit detects the boundary position of the image circle of the imaging optical system based on an integrated signal obtained by integrating the plurality of electrical signals. Also preferably, the boundary detection unit selects the maximum luminance value at each corresponding pixel from among the plurality of electrical signals and detects the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values.
Because such an image processing apparatus detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting a plurality of optical images with the image sensor, it can detect the boundary position of the image circle with higher accuracy. Accordingly, such an image processing apparatus can correct images with higher accuracy.
In another aspect of the above image processing apparatus, the predetermined parameter is one or more of a focal position, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient.
This provides an image processing apparatus that corrects images using one or more of a focal position, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient as the predetermined parameter.
An image processing method according to another aspect includes: an imaging step of converting an optical image of a subject formed on a light receiving surface of an image sensor by an imaging optical system into electrical signals; an image generation step of generating an image of the subject based on the electrical signals obtained in the imaging step; a boundary detection step of detecting a boundary position of an image circle of the imaging optical system based on the electrical signals obtained in the imaging step; a parameter determination step of determining a predetermined parameter relating to predetermined image correction processing based on the boundary position of the image circle detected in the boundary detection step; and an image correction step of correcting the image of the subject generated in the image generation step using the predetermined parameter determined in the parameter determination step.
Such an image processing method detects the boundary position of the image circle of the imaging optical system based on the electrical signals output from the image sensor, determines a predetermined parameter relating to predetermined image correction processing based on the detected boundary position of the image circle, and corrects the image of the subject using the determined parameter. Because the above image processing method thus uses the image circle, which the imaging optical system inherently has, to determine the parameter for the image correction processing, it can detect changes in the optical characteristics of the lenses and correct the image in accordance with those changes without providing a special optical member.
 An imaging apparatus according to another aspect includes an imaging optical system that forms an optical image of a subject on a predetermined surface, and an image processing unit that generates an image of the subject, wherein the image processing unit is any one of the image processing apparatuses described above, assembled so that the light receiving surface of its imaging element coincides with the predetermined surface of the imaging optical system.
 Because such an imaging apparatus uses one of the image processing apparatuses described above, it can detect changes in the optical characteristics of the lens without providing any special optical member, and can correct the image in accordance with those changes.
 In another aspect of the above imaging apparatus, the imaging optical system includes a lens array comprising a plurality of lenses arranged in an array.
 Because the imaging optical system of such an imaging apparatus includes a lens array in which small-diameter lenses are arranged side by side, the back focus of the imaging optical system can be shortened and the overall height reduced. Such an imaging apparatus is particularly suitable for mobile phones and smartphones, which continue to become thinner.
 This application is based on Japanese Patent Application No. 2013-42771 filed on March 5, 2013, the contents of which are incorporated herein.
 To describe the present invention, embodiments have been explained above, appropriately and sufficiently, with reference to the drawings; it should be recognized, however, that a person skilled in the art could easily modify and/or improve the embodiments described above. Accordingly, unless such a modification or improvement departs from the scope of the appended claims, it is to be construed as encompassed within that scope.
 According to the present invention, an image processing device, an image processing method, and an imaging device can be provided.

Claims (8)

  1.  An image processing apparatus comprising:
      an imaging element that converts an optical image of a subject formed on a light receiving surface by an imaging optical system into an electrical signal;
      an image generation unit that generates an image of the subject based on the electrical signal output from the imaging element;
      a boundary detection unit that detects a boundary position of an image circle of the imaging optical system based on the electrical signal output from the imaging element;
      a parameter determination unit that determines a predetermined parameter relating to a predetermined image correction process based on the boundary position of the image circle detected by the boundary detection unit; and
      an image correction unit that corrects the image of the subject generated by the image generation unit using the predetermined parameter determined by the parameter determination unit.
  2.  The image processing apparatus according to claim 1, wherein the boundary detection unit detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element.
  3.  The image processing apparatus according to claim 2, wherein the boundary detection unit detects the boundary position of the image circle of the imaging optical system based on an integrated signal obtained by integrating the plurality of electrical signals.
  4.  The image processing apparatus according to claim 2, wherein the boundary detection unit selects, from the plurality of electrical signals, the maximum luminance value at each corresponding pixel, and detects the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values.
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein the predetermined parameter is one or more of a focus position, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient.
  6.  An image processing method comprising:
      an imaging step of converting an optical image of a subject formed on a light receiving surface of an imaging element by an imaging optical system into an electrical signal;
      an image generation step of generating an image of the subject based on the electrical signal obtained in the imaging step;
      a boundary detection step of detecting a boundary position of an image circle of the imaging optical system based on the electrical signal obtained in the imaging step;
      a parameter determination step of determining a predetermined parameter relating to a predetermined image correction process based on the boundary position of the image circle detected in the boundary detection step; and
      an image correction step of correcting the image of the subject generated in the image generation step using the predetermined parameter determined in the parameter determination step.
  7.  An imaging apparatus comprising:
      an imaging optical system that forms an optical image of a subject on a predetermined surface; and
      an image processing unit that generates an image of the subject,
      wherein the image processing unit is the image processing apparatus according to any one of claims 1 to 5, assembled so that the light receiving surface of the imaging element coincides with the predetermined surface of the imaging optical system.
  8.  The imaging apparatus according to claim 7, wherein the imaging optical system includes a lens array comprising a plurality of lenses arranged in an array.
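Claims 3 and 4 name two ways of combining the electrical signals of multiple frames before boundary detection. The sketch below is an illustration, not code from the patent, and assumes each frame arrives as a 2-D luminance array.

```python
import numpy as np

def combine_by_integration(frames):
    """Claim 3 style: integrate (sum) the electrical signals of several
    frames, so the lit image-circle region accumulates signal and stands
    out against the dark region outside the circle, even in dim frames."""
    return np.sum(np.stack(frames), axis=0)

def combine_by_max_luminance(frames):
    """Claim 4 style: keep, at each corresponding pixel, the maximum
    luminance across the frames, so a pixel is marked lit if it fell
    inside the image circle in any single frame."""
    return np.max(np.stack(frames), axis=0)
```

Either combined signal can then be fed to the same boundary-detection step as a single frame; integration favors low-light robustness, while per-pixel maxima keep the boundary sharp when the scene content varies between frames.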
PCT/JP2014/000889 2013-03-05 2014-02-20 Image processing device, image processing method and imaging device WO2014136392A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015504151A JPWO2014136392A1 (en) 2013-03-05 2014-02-20 Image processing apparatus, image processing method, and imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-042771 2013-03-05
JP2013042771 2013-03-05

Publications (1)

Publication Number Publication Date
WO2014136392A1 true WO2014136392A1 (en) 2014-09-12

Family

ID=51490931


Country Status (2)

JP (1) JPWO2014136392A1 (en)
WO (1) WO2014136392A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011114794A (en) * 2009-11-30 2011-06-09 Canon Inc Photographic apparatus
JP2011147079A (en) * 2010-01-18 2011-07-28 Ricoh Co Ltd Image pickup device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020005840A (en) * 2018-07-06 2020-01-16 株式会社ユニバーサルエンターテインメント Game machine and game device
JP7049200B2 (en) 2018-07-06 2022-04-06 株式会社ユニバーサルエンターテインメント Gaming machines and equipment for gaming

Also Published As

Publication number Publication date
JPWO2014136392A1 (en) 2017-02-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14760310; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2015504151; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14760310; Country of ref document: EP; Kind code of ref document: A1