WO2014136392A1 - Image processing device, image processing method, and image forming device


Info

Publication number
WO2014136392A1
WO2014136392A1 (PCT/JP2014/000889)
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging
optical system
circle
boundary
Prior art date
Application number
PCT/JP2014/000889
Other languages
English (en)
Japanese (ja)
Inventor
壮功 北田
高山 淳
Original Assignee
コニカミノルタ株式会社
Priority date
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2015504151A (JPWO2014136392A1)
Publication of WO2014136392A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"

Definitions

  • The present invention relates to an image processing apparatus and an image processing method for processing an image, and to an imaging apparatus using the image processing method.
  • Imaging devices have been used as information acquisition means in portable devices, in-vehicle devices, medical devices, industrial devices, and the like.
  • Such imaging devices often employ lenses made of resin materials such as plastics because they are relatively inexpensive and can be produced in large quantities.
  • While a resin lens has these advantages, it has the disadvantage of being easily affected by temperature changes: the optical characteristics of the lens (focal length, image center, distortion coefficient, etc.) change with temperature.
  • Image generation processing is generally performed on the assumption that these optical characteristics do not change, so a change in the optical characteristics directly degrades image quality. If the change in optical characteristics can be detected and parameters suited to that change can be used, the degradation of image quality can be suppressed.
  • Patent Document 1 discloses a technique for correcting such a change in optical characteristics due to a temperature change.
  • The imaging device disclosed in Patent Document 1 comprises a lens array, disposed at a position facing the subject, in which a plurality of lenses are arranged in an array; an imaging element, provided on the image-plane side of the lens array, that captures a compound-eye image, i.e., a set of reduced images (single-eye images) of the subject formed by the plurality of lenses; and an arithmetic unit that processes the compound-eye image captured by the imaging element.
  • The device further includes a light-shielding wall that prevents crosstalk of light rays between adjacent lenses of the lens array. Based on the luminance information incident on the image sensor, the arithmetic unit determines internal parameters relating to the center position, focal length, and lens distortion of each lens, and calculates and corrects the amount of distortion of the light-shielding wall caused by the ambient temperature.
  • The method of Zhang disclosed in Non-Patent Document 1 has been proposed as a method of calculating such internal parameters.
  • The image pickup apparatus disclosed in Patent Document 1 removes the influence of temperature change by calculating the position of each lens based on the luminance information of the light-shielding wall imaged by the image sensor. The light-shielding wall is therefore indispensable for removing the influence of the temperature change, which increases the number of manufacturing steps and the cost.
  • The present invention has been made in view of the above circumstances, and its object is to provide an image processing apparatus, an image processing method, and an imaging apparatus capable of detecting a change in the optical characteristics of a lens without providing a special optical member, and of correcting an image in accordance with that change.
  • In the image processing apparatus, image processing method, and imaging apparatus according to one aspect, the boundary position of the image circle of the imaging optical system is detected based on the electrical signal output from the imaging element, which receives the optical image of the subject formed on its light-receiving surface by the imaging optical system.
  • A predetermined parameter relating to a predetermined image correction process is determined based on the detected boundary position of the image circle, and the image is corrected using the determined parameter. Because this image processing apparatus, image processing method, and imaging apparatus use the image circle, which is inherent to the imaging optical system, to determine the parameters for image correction, they can detect a change in the optical characteristics of the lens and correct the image in accordance with that change without providing any special optical member.
  • FIG. 5 is a flowchart illustrating an operation related to image correction in the imaging apparatus. FIG. 6 is a diagram for explaining the luminance distribution of the image circle in the imaging apparatus. FIG. 7 is a diagram for explaining the method of determining the boundary position of the image circle in the imaging apparatus. FIG. 8 is a diagram for explaining the data used for determining the boundary position of the image circle in the imaging apparatus. FIG. 9 is a diagram for explaining the change in the boundary position of the image circle and the change in its luminance distribution due to a temperature change in the imaging apparatus.
  • FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus according to an embodiment.
  • FIG. 2 is a diagram for explaining a light receiving surface (imaging surface) of the image sensor in the imaging apparatus of the embodiment.
  • FIG. 3 is a diagram illustrating a configuration of a parameter table stored in the storage unit in the imaging apparatus according to the embodiment.
  • the imaging apparatus is an apparatus that captures an image of a subject using an imaging optical system and generates an image of the subject.
  • Such an imaging apparatus CA includes, for example, an imaging optical system 1 and an image processing unit 2 as shown in FIG.
  • the imaging apparatus CA may further include at least one of the input unit 6, the display unit 7, and the interface unit (IF unit) 8 as necessary.
  • the imaging optical system 1 is an optical system that forms an optical image of a subject on a predetermined surface, and includes one or more optical elements.
  • The imaging optical system 1 includes, for example, an optical aperture, one or more lenses, and, as necessary, a light-shielding plate between the optical elements to prevent so-called stray light.
  • The imaging optical system 1 may be a monocular system that forms one optical image (one image circle) of the subject, but in this embodiment it is a compound-eye system that includes a plurality of lenses (single eyes) and forms a plurality of optical images (a plurality of image circles) of the subject.
  • the plurality of lenses are arranged in an array in parallel so that their optical axes are parallel to each other.
  • The lens array may be a one-dimensional lens array in which a plurality of lenses are arranged side by side in a line, but in this embodiment it is a two-dimensional lens array in which a plurality of lenses are arranged side by side in two linearly independent directions, more specifically in two mutually orthogonal directions.
  • The two-dimensional lens array may also arrange the plurality of lenses in a honeycomb structure.
  • Each single eye may consist of a single lens or of multiple lenses.
  • the image processing unit 2 is a device that generates an image of a subject.
  • the image processing unit 2 includes an imaging device 3, a control calculation unit 4, and a storage unit 5.
  • The image pickup device 3 includes a plurality of photoelectric conversion elements (a plurality of pixels) and receives, on the light-receiving surface formed by these photoelectric conversion elements, the optical image of the subject formed by the imaging optical system 1.
  • The image sensor 3 is connected to the control calculation unit 4; it photoelectrically converts the optical image of the subject formed by the imaging optical system 1 into image signals of the R (red), G (green), and B (blue) components according to the amount of light, and outputs them to the control calculation unit 4.
  • the optical image of the subject is guided to the light receiving surface of the image sensor 3 along the optical axis by the imaging optical system 1 at a predetermined magnification, and the optical image of the subject is captured by the image sensor 3.
  • Since the imaging optical system 1 includes a two-dimensional lens array, optical images (single-eye images) of the subject, equal in number to the lenses in the two-dimensional lens array, are formed on the light-receiving surface of the imaging element 3 side by side in a two-dimensional array, in a manner corresponding to the arrangement of the lenses in the two-dimensional lens array.
  • the light receiving surface (effective pixel region) of the image pickup device 3 is divided into a plurality of image pickup regions corresponding to the positions of the image circles formed on the light receiving surface by the lenses in the two-dimensional lens array.
  • the imaging area is an imaging area for each lens (each eye).
  • In the example shown in FIG. 2, the two-dimensional lens array consists of 16 lenses arranged in a 4 × 4 two-dimensional array along two mutually orthogonal directions X and Y.
  • Accordingly, the light-receiving surface (effective pixel region) 30 is divided into 16 imaging regions (imaging regions for each eye) 31 (31-11 to 31-44) corresponding to the positions of the 16 image circles IC formed on the light-receiving surface 30 by the 16 lenses, and an optical image of the subject is formed on each of these 16 imaging regions 31.
  • In FIG. 2, only the image circle IC for the imaging region 31-11 in the first row and first column is shown; the others are omitted.
  • These 16 imaging regions 31 are arranged in parallel in a 4 ⁇ 4 two-dimensional array in two directions in the XY directions orthogonal to each other in accordance with the arrangement of the 16 lenses described above.
  • Each of these 16 imaging regions 31 is set in a rectangular shape inscribed in an image circle IC formed under a predetermined design condition (for example, room temperature 23 ° C.). That is, the imaging region 31 is set so that its diagonal line is the diameter of the image circle IC.
  • One imaging region 31 includes a plurality of pixels (photoelectric conversion elements) whose number is 1/16 or less of the total number of pixels of the imaging element 3.
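The geometry described above — a square imaging region inscribed in the image circle so that its diagonal equals the circle's diameter — can be sketched as follows. This is an illustrative calculation only; the function name and pixel values are not from the patent.

```python
import math

def inscribed_region(cx, cy, diameter):
    """Return (left, top, width, height) of the square imaging region
    inscribed in an image circle of the given diameter: the square's
    diagonal equals the circle's diameter, so side = diameter / sqrt(2)."""
    side = diameter / math.sqrt(2.0)
    return (cx - side / 2.0, cy - side / 2.0, side, side)

# Example: an image circle of diameter 400 px centred at (200, 200).
left, top, w, h = inscribed_region(200.0, 200.0, 400.0)
print(w)  # side length ≈ 282.84 px
```

Because the diagonal of a square is √2 times its side, fixing the diagonal to the image-circle diameter maximises the pixel area that stays inside the circle.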
  • The image sensor 3 includes a boundary position detection unit for detecting the boundary position of the image circle IC of the imaging optical system 1. More specifically, on the light-receiving surface (effective pixel region) 30 of the image pickup device 3, a boundary position detection region 32, sized so as to include the boundary position of the image circle IC, is set in a region adjacent to the imaging region 31 and serves as the boundary position detection unit. In the example shown in FIG. 2, one boundary position detection region 32 is set in a region adjacent to the imaging region 31-11 in the first row and first column. In this way, the boundary position detection region 32 uses photoelectric conversion elements on the light-receiving surface 30 that are not used as the imaging region 31.
  • A plurality of boundary position detection regions 32 may be set: for example, one for each of the imaging regions 31, one for each imaging region 31 in the first row, one for each imaging region 31 in the first column, one for each imaging region 31 on the diagonal, or one for each imaging region 31 in every other row or every other column.
  • the lens may expand only in the vertical direction or the horizontal direction. In such a case, the distortion amount can be accurately measured by arranging a plurality of detection areas in one row or one column.
  • the light receiving surface of one image sensor 3 is divided according to the number of lenses of the two-dimensional lens array.
  • Alternatively, an image sensor may be provided for each lens of the two-dimensional lens array; in the example above, 16 image sensors corresponding to the 16 lenses would be provided.
  • The storage unit 5 is connected to the control calculation unit 4 and stores various programs, such as a control program for controlling the entire imaging apparatus CA, an image forming program for forming the image of the subject, and an image correction program for correcting the image of the subject, as well as various data, such as the data necessary for executing these programs and the data generated during their execution.
  • The storage unit 5 includes, for example, a ROM (Read Only Memory), a nonvolatile storage element that stores the various programs; an EEPROM (Electrically Erasable Programmable Read-Only Memory), a rewritable nonvolatile storage element that stores the various data; and a RAM (Random Access Memory).
  • the RAM of the storage unit 5 also functions as a so-called working memory of a CPU described later of the control calculation unit 4.
  • The storage unit 5 functionally includes an image storage unit 51 that stores image data, and a parameter storage unit 52 that stores, in advance, parameters corresponding to each temperature (each boundary position of the image circle) of the imaging optical system 1.
  • the predetermined parameter relating to the predetermined image correction process is a value used when correcting the image of the subject.
  • Examples of the predetermined parameter include an internal parameter, an external parameter, and a blur coefficient.
  • The internal parameters are, for example, the focal length, the image center position, and the distortion coefficients of the imaging optical system 1; the external parameters are, for example, the translation and rotation coefficients of the imaging optical system 1.
  • the blur coefficient is a coefficient used in so-called super-resolution processing.
  • Each value of such a predetermined parameter is obtained in advance for each temperature of the imaging optical system 1. In the present embodiment, since the temperature of the imaging optical system 1 corresponds to the boundary position of the image circle IC, each value of the predetermined parameter is obtained in advance for each boundary position of the image circle IC.
  • For example, the boundary position of the image circle IC in the imaging optical system 1 is determined in advance for each of the temperatures 0 °C, 20 °C, 40 °C, 60 °C, and 80 °C, and each value of the predetermined parameter is determined in advance for each of these boundary positions.
  • the internal parameter and the external parameter may be obtained in advance by, for example, the Zhang method disclosed in Non-Patent Document 1 described above. As a result, the correspondence between each boundary position of the image circle IC in the imaging optical system 1 and each value of the predetermined parameter is obtained in advance.
  • A predetermined function expression indicating the correspondence between each boundary position of the image circle and each value of the predetermined parameter may be derived from the correspondence obtained in advance and stored in the parameter storage unit 52, and the predetermined parameter relating to the predetermined image correction process may then be determined from the boundary position of the image circle using this function expression.
  • In the present embodiment, however, a lookup table (parameter table) indicating the correspondence is created in advance and stored in the parameter storage unit 52, and the predetermined parameter relating to the predetermined image correction process is determined from the boundary position of the image circle using this lookup table.
  • As shown in FIG. 3, such a parameter table PT includes, for example: a boundary position x field 5211 for registering the boundary position x in the x direction; a boundary position y field 5212 for registering the boundary position y in the y direction; a temperature field 522 for registering the temperature; a focal length fx field 5231 for registering the focal length fx in the x direction; a focal length fy field 5232 for registering the focal length fy in the y direction; fields for registering the image center in the x and y directions; a distortion coefficient k2 field 5252 for registering the distortion coefficient k2; a distortion coefficient p1 field 5253 for registering the third distortion coefficient p1; and a distortion coefficient p2 field 5254 for registering the fourth distortion coefficient p2. The parameter table PT holds one record for each temperature.
  • Each boundary position of the image circle and each value of the external parameters may also be registered in the parameter table PT, and likewise each boundary position of the image circle and each value of the blur coefficient may be registered in the parameter table PT.
  • The parameter table PT may be common to all lenses in the lens array (common to all eyes), or a parameter table PT may be prepared for each lens (single eye).
  • By preparing a parameter table PT corresponding to each lens (single eye), the parameter values can be set appropriately according to the optical characteristics of each individual lens with respect to temperature change.
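The lookup described above — a table keyed by the registered boundary positions, with one record per calibration temperature — can be sketched as follows. The field names loosely follow FIG. 3, but the numeric values and the selection rule (nearest registered boundary position to the detected one) are illustrative assumptions, not taken from the patent text.

```python
# Illustrative parameter table PT: one record per calibration temperature.
# bx/by are the registered image-circle boundary positions; fx/fy are
# example focal-length values (all numbers are made up for the sketch).
PARAMETER_TABLE = [
    {"bx": 10.0, "by": 10.0, "temp": 0,  "fx": 5.02, "fy": 5.02},
    {"bx": 10.4, "by": 10.4, "temp": 20, "fx": 5.00, "fy": 5.00},
    {"bx": 10.8, "by": 10.8, "temp": 40, "fx": 4.98, "fy": 4.98},
]

def lookup_parameters(bx, by):
    """Return the record whose registered boundary position is closest
    (in squared distance) to the detected boundary position (bx, by)."""
    return min(PARAMETER_TABLE,
               key=lambda r: (r["bx"] - bx) ** 2 + (r["by"] - by) ** 2)

rec = lookup_parameters(10.5, 10.3)
print(rec["temp"])  # → 20
```

A real implementation might instead interpolate between the two nearest records, which is what the function-expression variant mentioned above would amount to.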
  • The control calculation unit 4 is a device that controls the operation of the entire imaging apparatus CA by controlling each unit according to its function, and that forms the image of the subject while correcting it using the predetermined parameter relating to the predetermined image correction process.
  • the control calculation unit 4 includes, for example, a CPU (Central Processing Unit) and peripheral circuits thereof.
  • the control calculation unit 4 functionally includes a control unit 41, an image generation unit 42, a boundary detection unit 43, a parameter determination unit 44, and an image correction unit 45 by executing various programs.
  • the control unit 41 controls the entire operation of the imaging apparatus CA by controlling each unit according to the function.
  • the image generation unit 42 generates an image of a subject based on the electrical signal output from the image sensor 3.
  • the image data of the generated image is stored in the image storage unit 51 of the storage unit 5.
  • the boundary detection unit 43 detects the boundary position of the image circle of the imaging optical system 1 based on the electrical signal output from the imaging device 3.
  • the parameter determination unit 44 determines a predetermined parameter related to a predetermined image correction process based on the boundary position of the image circle detected by the boundary detection unit 43.
  • the image correction unit 45 corrects the image of the subject generated by the image generation unit 42 and stored in the image storage unit 51 using the predetermined parameter determined by the parameter determination unit 44.
  • the image data of the corrected image is stored in the image storage unit 51 of the storage unit 5.
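The internal parameters listed for the parameter table (focal lengths fx and fy, an image center, and distortion coefficients k and p) match the standard Brown–Conrady lens distortion model used in camera calibration. A minimal per-point correction step under that assumption — a sketch, not the patent's own implementation — might look like:

```python
def undistort_point(u, v, fx, fy, cx, cy, k1, k2, p1, p2):
    """Map an ideal (undistorted) pixel (u, v) to its distorted source
    position using the Brown-Conrady model with radial (k1, k2) and
    tangential (p1, p2) coefficients. Image correction resamples the
    corrected image from these source positions."""
    # Normalised coordinates relative to the image centre.
    x = (u - cx) / fx
    y = (v - cy) / fy
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return (x_d * fx + cx, y_d * fy + cy)

# With all coefficients zero the mapping is the identity (≈ (120.0, 80.0)).
print(undistort_point(120.0, 80.0, 500.0, 500.0, 100.0, 100.0, 0, 0, 0, 0))
```

When the parameter determination unit 44 picks a record for the detected boundary position, the image correction unit 45 would apply a mapping of this kind with that record's coefficients.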
  • the input unit 6 is a device for inputting various instructions such as an instruction for imaging conditions and an instruction to start imaging to the imaging apparatus CA from the outside, and is, for example, a rotary switch or a button switch.
  • The display unit 7 is a device for displaying the instruction contents input from the input unit 6 and the processing results of the control calculation unit 4, such as the image of the subject; it is, for example, an LCD (liquid crystal display) or an organic EL display.
  • the display unit 7 may function as a so-called live view finder.
  • The input unit 6 may include, for example, a position input device that detects and inputs an operation position by a resistive-film method, a capacitance method, or the like, and the position input device and the display unit 7 may constitute a so-called touch panel.
  • In this case, a position input device is provided on the display surface of the display unit, one or more selectable input candidates are displayed on the display unit, and when the user touches the display position of the desired input content, the position is detected by the position input device and the content displayed at the detected position is input to the imaging apparatus CA as the user's operation input.
  • This provides an imaging apparatus CA that is easy for the user to handle. Moreover, by changing the content displayed on the display unit 7, various inputs can be made and various operations can be performed on the imaging apparatus CA.
  • The IF unit 8 is a circuit for inputting and outputting data to and from an external device.
  • Examples include an interface circuit conforming to the IEEE 802.11 series of standards known as Wi-Fi (Wireless Fidelity), an interface circuit conforming to the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA (Infrared Data Association) standard, and an interface circuit conforming to the USB (Universal Serial Bus) standard.
  • FIG. 4 is a diagram for explaining a change in the image circle due to a temperature change in the imaging apparatus according to the embodiment.
  • FIG. 4A is a diagram for explaining a change in the size of the image circle due to a temperature change,
  • FIG. 4B is a diagram for explaining a change in the center position of the image circle due to a temperature change.
  • The temperature of the image pickup optical system 1 changes with the ambient temperature at which it is used. For this reason, expansion, contraction, or distortion may occur in the optical elements of the imaging optical system 1.
  • When the optical element of the imaging optical system 1 is a lens made of a resin material such as plastic, expansion, contraction, and distortion are even more likely to occur because of the temperature characteristics of the resin.
  • When expansion or contraction occurs in the optical element of the imaging optical system 1, as shown in FIG. 4A, the size (area) of the image circle IC expands or contracts, and the boundary position of the image circle IC changes accordingly.
  • For example, when the temperature of the imaging optical system 1 rises from 20 °C to 60 °C, the image circle IC enlarges from the image circle IC(20) to the image circle IC(60). Here, IC(t) denotes the image circle IC at a temperature of t °C.
  • When distortion occurs in the optical element of the imaging optical system 1, as shown in FIG. 4B, the center position of the image circle IC (image center position) shifts, and the boundary position of the image circle IC changes accordingly. In the example shown in FIG. 4B, the center position (cx, cy) of the image circle IC shifts from the center position (cx2, cy2) of the image circle IC(20) to the center position (cx4, cy4) of the image circle IC(60).
  • The boundary position of the image circle IC correlates with the temperature of the imaging optical system 1, and therefore with the predetermined parameters relating to image correction processing given for each temperature of the imaging optical system 1. For this reason, by associating boundary positions with parameter values in advance, the predetermined parameters relating to image correction processing that remove or reduce the error introduced into an image by a temperature change can be set by detecting the boundary position of the image circle IC.
  • Therefore, the imaging apparatus CA of the present embodiment operates as follows: it detects the boundary position of the image circle IC, determines the value of the predetermined parameter relating to the image correction process, and corrects the subject image using the determined parameter.
  • FIG. 5 is a flowchart showing an operation related to image correction in the imaging apparatus of the embodiment.
  • FIG. 6 is a diagram for explaining the luminance distribution of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 6A is a diagram schematically showing the state of the light receiving surface of the image sensor when a subject having a substantially uniform luminance distribution is imaged
  • FIG. 6B is a luminance distribution along line X1-X2 shown in FIG. 6A.
  • the horizontal axis in FIG. 6B indicates the position on the X1-X2 line
  • the vertical axis indicates the luminance value.
  • FIG. 7 is a diagram for explaining a determination method for determining the boundary position of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 8 is a diagram for explaining data used for determining the boundary position of the image circle in the imaging apparatus according to the embodiment.
  • FIG. 8A shows the luminance distribution in the vicinity of the boundary of the image circle obtained when a subject having a uniform luminance distribution (uniform subject) is imaged.
  • FIG. 8B shows the luminance distribution BA in the vicinity of the boundary of the image circle obtained when a general subject (general subject A) having a nonuniform luminance distribution is imaged.
  • FIG. 8C shows the luminance distribution BB in the vicinity of the boundary of the image circle obtained when another general subject (general subject B) with a nonuniform luminance distribution is imaged.
  • FIG. 8D shows a luminance distribution BC in the vicinity of the boundary of the image circle obtained when a general subject (general subject C) having a nonuniform luminance distribution is imaged.
  • FIG. 8E shows an average luminance distribution Bave obtained by averaging the luminance distributions BA to BC shown in FIGS. 8B to 8D.
  • FIG. 9 is a diagram for explaining a change in the boundary position of the image circle and a change in luminance distribution due to a temperature change in the imaging apparatus according to the embodiment.
  • FIG. 9A shows a change in the boundary position of the image circle due to a temperature change
  • FIG. 9B shows a change in the luminance distribution of the image circle due to a temperature change.
  • the horizontal axis in FIG. 9B indicates the position, and the vertical axis indicates the luminance value.
  • The control calculation unit 4 executes the control program, the image correction program, and the other programs stored in the storage unit 5.
  • As a result, the control unit 41, the image generation unit 42, the boundary detection unit 43, the parameter determination unit 44, and the image correction unit 45 are functionally configured in the control calculation unit 4.
  • When a shutter button (not shown) included in the input unit 6 of the imaging apparatus CA is operated, the imaging apparatus CA generates an image of the subject. More specifically, a light beam from the subject enters the imaging optical system 1, and an optical image of the subject is formed on the light-receiving surface 30 of the imaging element 3 by the imaging optical system 1.
  • Since the imaging optical system 1 includes the two-dimensional lens array as described above, a plurality of optical images (single-eye images) of the subject, each the size of its image circle IC, are formed on the light-receiving surface 30 of the imaging element 3.
  • the image sensor 3 converts the optical image of the subject into an electric signal having a magnitude corresponding to the amount of light received by each photoelectric conversion element. These electrical signals are output from the image sensor 3 to the control calculation unit 4.
  • The image generation unit 42 of the control calculation unit 4 generates, for each imaging region 31, image data of the subject image (single-eye image) based on the electrical signals obtained by the photoelectric conversion elements corresponding to that imaging region 31. The image generation unit 42 then stores the generated image data in the image storage unit 51 in association with the imaging region 31.
  • The boundary detection unit 43 of the control calculation unit 4 detects the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signals obtained by the photoelectric conversion elements corresponding to the boundary position detection region 32. The boundary detection unit 43 then notifies the parameter determination unit 44 of the detected boundary position of the image circle IC (S1).
  • the detection of the boundary position of the image circle IC is executed as follows, for example.
  • As shown in FIG. 6B, the luminance distribution on the straight line X1-X2, which includes not only the image circle IC but also pixels (photoelectric conversion elements) outside it, has the following profile: the luminance value is approximately 0 outside one side of the image circle IC, rises sharply near one boundary of the image circle IC, is substantially constant at a predetermined value inside the image circle IC, falls sharply near the other boundary, and is approximately 0 again outside the other side of the image circle IC. That is, the amount of light is largest at the center of the image circle IC and decreases toward its periphery.
  • The intermediate value is the central value between the maximum luminance value and the minimum luminance value. Such an intermediate value may be set in advance to the central value between the maximum and minimum luminance values that the photoelectric conversion elements of the image sensor 3 can take (static setting), or it may be set to the central value between the maximum and minimum luminance values obtained from the actual luminance distribution of the photoelectric conversion elements (pixels) arranged along a predetermined straight line in the detection region 32 (dynamic setting).
  • By setting the threshold value Th to a predetermined nonzero value, the influence of noise is reduced and the detection accuracy of the boundary position of the image circle IC is improved.
  • The boundary detection unit 43 compares each luminance value with the second threshold Th1, and detects the position of the photoelectric conversion element (pixel) at which the luminance value exceeds the second threshold Th1 as the boundary position of the image circle IC.
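As an illustration only (not part of the patent disclosure), the threshold-based boundary search and the dynamic intermediate-value setting described above can be sketched as follows; all function names and sample values are hypothetical:

```python
def detect_circle_boundaries(profile, threshold):
    """Return the indices where the luminance along the scanned line first
    rises above and last falls back below `threshold` -- the two boundary
    positions of the image circle -- or None if no pixel exceeds it."""
    above = [i for i, v in enumerate(profile) if v > threshold]
    if not above:
        return None
    return above[0], above[-1]

def mid_threshold(profile):
    """Dynamic setting: central value between the maximum and minimum
    luminance actually observed along the line."""
    return (max(profile) + min(profile)) / 2.0

# Toy luminance profile along the line X1-X2 (near-zero outside the circle,
# roughly constant inside it).
profile = [0, 0, 5, 120, 200, 210, 205, 198, 90, 3, 0, 0]
th = mid_threshold(profile)
left, right = detect_circle_boundaries(profile, th)
```

The same scan would be repeated on a second line (e.g. Y1-Y2) to obtain the y-coordinate of the boundary.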
  • The boundary detection unit 43 may be configured to detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element 3. The plurality of optical images may be formed from the same subject at different points in time, or from different subjects.
  • The boundary detection unit 43 may be configured to detect the boundary position of the image circle of the imaging optical system 1 based on an integrated signal obtained by integrating the plurality of electrical signals. More specifically, for example, as shown in FIGS. 8B to 8D, a plurality of electrical signals are first obtained by converting each of a plurality of optical images having different luminance distributions with the image sensor 3. The plurality of electrical signals are added for each photoelectric conversion element (each pixel) and averaged over the number of optical images, yielding the integrated signal. Since this integrated signal is an average of optical images having various luminance distributions, it has a substantially uniform luminance distribution, as shown in FIG. 8E, and approximates the luminance distribution shown in FIGS. 7 and 8A.
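A minimal sketch of the integration step, for illustration only (names and sample values are invented): several captures with different luminance distributions are summed per pixel and averaged, which flattens scene content so that the image-circle edge dominates the resulting profile.

```python
def integrate_profiles(profiles):
    """Per-pixel sum of several captures, averaged over the number of
    captures -- the 'integrated signal' described above (1-D for brevity)."""
    n = len(profiles)
    return [sum(col) / n for col in zip(*profiles)]

frames = [
    [0, 10, 200, 40, 0],   # bright on the left inside the circle
    [0, 90, 20, 220, 0],   # bright on the right inside the circle
    [0, 50, 110, 70, 0],   # roughly uniform scene
]
avg = integrate_profiles(frames)   # pixels outside the circle stay ~0
```

Boundary detection then proceeds on `avg` exactly as in the single-capture case.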
  • The boundary detection unit 43 may also be configured to select the maximum luminance value at each corresponding pixel from the plurality of electrical signals and detect the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values. More specifically, for example, a plurality of electrical signals are first obtained by converting each of a plurality of optical images having different luminance distributions with the imaging element 3. For each photoelectric conversion element (each pixel), the maximum luminance value among the plurality of electrical signals is selected and adopted as the luminance value of that photoelectric conversion element. As a result, a signal composed of the maximum luminance value at each photoelectric conversion element is obtained, and the boundary position of the image circle of the imaging optical system is detected from the luminance distribution of this signal in the same manner as described above.
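The maximum-value variant can be sketched in the same toy setting (illustrative only): a region that happens to be dark in one frame is filled in by any frame in which that region was bright, so only pixels outside the image circle remain near zero.

```python
def max_profile(profiles):
    """Per-pixel maximum across captures; pixels outside the image circle
    stay near zero in every capture and therefore stay near zero here."""
    return [max(col) for col in zip(*profiles)]

frames = [
    [0, 10, 200, 40, 0],
    [0, 90, 20, 220, 0],
    [0, 50, 110, 70, 0],
]
peak = max_profile(frames)
```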
  • The parameter determination unit 44 of the control calculation unit 4 determines predetermined parameters related to predetermined image correction processing based on the boundary position of the image circle IC detected by the boundary detection unit 43. The parameter determination unit 44 then notifies the image correction unit 45 of the determined parameter values (S2). This determination is executed, for example, as follows.
  • The lookup-table-format parameter table PT stored in advance in the parameter storage unit 52, as described above, is used when determining the parameters. More specifically, the parameter determination unit 44 searches the boundary position x field 5211 and the boundary position y field 5212 using the boundary position of the image circle IC detected by the boundary detection unit 43 as a key, and finds the record in which that boundary position is registered in the boundary position x field 5211 and the boundary position y field 5212.
  • From the focal length fx field 5231, focal length fy field 5232, image center cx field 5241, image center cy field 5242, distortion coefficient k1 field 5251, distortion coefficient k2 field 5252, distortion coefficient p1 field 5253, and distortion coefficient p2 field 5254 of the found record, the parameter determination unit 44 acquires the values of the focal length fx, focal length fy, image center position cx, image center position cy, first distortion coefficient k1, second distortion coefficient k2, third distortion coefficient p1, and fourth distortion coefficient p2, respectively. The value of each parameter is thereby determined.
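A hypothetical stand-in for the parameter table PT (for illustration only; all numeric values are invented and not taken from the patent): the detected boundary position (x, y) keys a record holding one value per correction parameter.

```python
# Toy parameter table keyed by the detected boundary position of the
# image circle; field names mirror the description above.
PARAMETER_TABLE = {
    (6, 4): {"fx": 2.80, "fy": 2.80, "cx": 320.0, "cy": 240.0,
             "k1": -0.10, "k2": 0.01, "p1": 0.001, "p2": -0.002},
    (7, 4): {"fx": 2.85, "fy": 2.84, "cx": 321.5, "cy": 240.2,
             "k1": -0.12, "k2": 0.02, "p1": 0.002, "p2": -0.001},
}

def determine_parameters(boundary_xy):
    """Look up the correction-parameter record for a detected boundary
    position, as the parameter determination unit 44 does with table PT."""
    return PARAMETER_TABLE[boundary_xy]

params = determine_parameters((7, 4))
```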
  • The image correction unit 45 corrects the image of the subject generated by the image generation unit 42 and stored in the image storage unit 51, using the parameter values determined by the parameter determination unit 44 (S3).
  • The image correction unit 45 corrects the image by a known correction method that enlarges or reduces the image with a magnification corresponding to the focal length shift amount, using the focal length fx and the focal length fy determined by the parameter determination unit 44. If the magnification is greater than 1, the enlargement/reduction process is an enlargement process; if the magnification is 1, it is an equal-magnification process; and if the magnification is less than 1, it is a reduction process.
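Purely as an illustration (the patent leaves the resampling method as a known technique), the magnification can be taken as the ratio between the measured and design focal lengths and applied with a simple nearest-neighbour resampler; all names and values here are invented:

```python
def scale_image(img, ratio):
    """Enlarge (ratio > 1) or reduce (ratio < 1) a 2-D list-of-lists image
    by nearest-neighbour resampling; ratio == 1 leaves the size unchanged."""
    h, w = len(img), len(img[0])
    nh, nw = max(1, round(h * ratio)), max(1, round(w * ratio))
    return [[img[min(h - 1, int(r / ratio))][min(w - 1, int(c / ratio))]
             for c in range(nw)] for r in range(nh)]

fx_design, fx_measured = 2.80, 2.85      # hypothetical values
ratio = fx_measured / fx_design          # > 1 here, so an enlargement
doubled = scale_image([[1, 2], [3, 4]], 2)
```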
  • The image correction unit 45 corrects the baseline length by a known correction method according to the shift amount of the center position, using the image center position cx and the image center position cy determined by the parameter determination unit 44. When the shift of the center position has made the baseline length longer than the original baseline length, the baseline length is corrected to be shorter according to the shift amount of the center position; conversely, when the shift has made the baseline length shorter than the original baseline length, the baseline length is corrected to be longer according to the shift amount of the center position.
  • The image correction unit 45 corrects the image by correcting the posture by a known correction method using the image center position cx and the image center position cy determined by the parameter determination unit 44.
  • The image correction unit 45 corrects the image by a known correction method that removes distortion, using the first to fourth distortion coefficients k1, k2, p1, and p2 determined by the parameter determination unit 44, so that the distorted image becomes a natural, distortion-free image resembling the scene as it would appear to the naked eye.
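The patent does not specify the distortion model, but coefficients named k1, k2 (radial) and p1, p2 (tangential) conventionally refer to the Brown-Conrady model used in common camera-calibration practice. As an assumed illustration, the forward mapping from an undistorted normalized point to its distorted position is:

```python
def distort_point(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to a
    normalized image point. Undistortion inverts this mapping, typically
    by building a remap grid from it."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd
```

With all coefficients zero the mapping is the identity, i.e. no distortion.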
  • When correcting rotation, the image correction unit 45 corrects the image by a known correction method that rotates the image with a rotation matrix, using the parameter value determined by the parameter determination unit 44. Similarly, when correcting translation, the image correction unit 45 corrects the image by a known correction method that translates the image with a translation matrix, using the parameter value determined by the parameter determination unit 44.
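For illustration only (per-point rather than whole-image, and with invented names), the matrix-based rotation and translation corrections amount to the standard 2-D transforms:

```python
import math

def rotate_point(x, y, theta):
    """Apply the 2x2 rotation matrix [[cos, -sin], [sin, cos]] to a point,
    i.e. the rotation correction mentioned above, shown per point."""
    c, s = math.cos(theta), math.sin(theta)
    return c * x - s * y, s * x + c * y

def translate_point(x, y, tx, ty):
    """Apply a translation by (tx, ty) to a point."""
    return x + tx, y + ty
```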
  • The image correction unit 45 corrects the image by correcting the blur estimation coefficient in super-resolution processing using the blur coefficient determined by the parameter determination unit 44, and then performing known super-resolution processing.
  • This super-resolution processing is a technique for generating a single high-resolution image from a plurality of low-resolution images. In the high-resolution processing, a plurality of overlapping low-resolution images are aligned with sub-pixel accuracy, each pixel value of the high-resolution image is interpolated from them, and a single high-resolution image is then generated by removing blur, noise, and the like.
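A toy shift-and-add sketch of the super-resolution idea above, for illustration only: low-resolution samples taken at known sub-pixel offsets are interleaved onto a finer grid. Real pipelines add registration, deblurring, and denoising; all names here are invented.

```python
def shift_and_add_1d(lr_frames, offsets, factor):
    """Place each low-resolution sample at its sub-pixel position on a grid
    `factor` times finer (1-D for brevity; offsets in fine-grid units)."""
    hr = [0.0] * (len(lr_frames[0]) * factor)
    for frame, off in zip(lr_frames, offsets):
        for i, v in enumerate(frame):
            hr[i * factor + off] = v
    return hr

# Two frames shifted by half a pixel reconstruct a 2x-resolution signal.
hr = shift_and_add_1d([[10, 30], [20, 40]], offsets=[0, 1], factor=2)
```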
  • Such super-resolution processing is disclosed in, for example, Japanese Patent Application Laid-Open Nos.
  • Blur is also caused by the image correction processing related to the internal parameters, such as correction of the focal length shift, correction of the center position shift, and correction of distortion. For this reason, it is preferable to perform the blur correction after the image correction processing related to the internal parameters.
  • The size of the image circle IC increases as shown in FIG. 9A, and the boundary position of the image circle IC shifts accordingly.
  • As a result, the luminance distribution also shifts, as shown in FIG. 9B.
  • The boundary position (X, 4) of the shifted image circle IC is detected by the boundary detection unit 43 in process S1 as, for example, the coordinates (7, 4).
  • In process S2, the parameter determination unit 44 searches the parameter table PT with the coordinates (7, 4) as a key and acquires the parameter values fx4, fy4, cx4, cy4, k14, k24, p14, and p24 from the fields 5231 to 5254 of the record corresponding to the coordinates (7, 4). In process S3, these parameter values are used to correct the image of the subject.
  • The parameters are determined each time an image of the subject is formed, but the parameter determination timing is not limited to this.
  • For example, the parameters may be determined when the imaging apparatus CA is activated and used until the imaging apparatus CA is turned off.
  • The parameters may also be determined and updated at a predetermined time interval set in advance, or, for example, every time the number of subject images formed by operating a shutter button (not shown) reaches a predetermined count.
  • As described above, the imaging apparatus CA and the image processing unit 2 in the present embodiment detect the boundary position of the image circle IC of the imaging optical system 1 based on the electrical signal output from the imaging element 3, determine predetermined parameters related to predetermined image correction processing based on the detected boundary position of the image circle IC, and correct the image of the subject using the determined parameters. Because the imaging apparatus CA and the image processing unit 2 in this embodiment use the image circle IC inherently possessed by the imaging optical system 1 to determine the parameters for the image correction processing, they can detect a change in the optical characteristics of the lens without providing a special optical member as in the conventional technique, and can correct the image in accordance with the change in the optical characteristics.
  • The imaging apparatus CA and the image processing unit 2 detect the boundary position of the image circle IC of the imaging optical system 1 based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element 3, so the boundary position of the image circle IC can be detected with higher accuracy. Therefore, the imaging apparatus CA and the image processing unit 2 in this embodiment can correct the image with higher accuracy.
  • The imaging apparatus CA in the present embodiment includes, as the imaging optical system 1, a lens array in which small-diameter lenses are arranged, so the back focus of the imaging optical system 1 can be shortened and the height can be reduced.
  • Such an imaging apparatus CA is particularly suitable for mobile phones and smartphones that are becoming thinner.
  • An image processing apparatus according to one aspect includes: an imaging element that converts an optical image of a subject formed on a light receiving surface by an imaging optical system into an electrical signal; an image generation unit that generates an image of the subject based on the electrical signal output from the imaging element; a boundary detection unit that detects a boundary position of an image circle of the imaging optical system based on the electrical signal output from the imaging element; a parameter determination unit that determines a predetermined parameter related to predetermined image correction processing based on the boundary position of the image circle detected by the boundary detection unit; and an image correction unit that corrects the image of the subject generated by the image generation unit using the predetermined parameter determined by the parameter determination unit.
  • The image circle of the imaging optical system is the circular range in which light passing through the lens included in the imaging optical system forms an image. For this reason, the boundary of the image circle is affected by the optical characteristics of the lens included in the imaging optical system.
  • The image processing apparatus detects the boundary position of the image circle of the imaging optical system based on the electrical signal output from the imaging element, determines a predetermined parameter related to predetermined image correction processing based on the detected boundary position of the image circle, and corrects the image of the subject using the determined predetermined parameter.
  • Because the image processing apparatus uses the image circle inherent in the imaging optical system to determine the parameters for the image correction processing, it can detect a change in the optical characteristics of the lens without providing a special optical member such as the light shielding wall disclosed in Patent Document 1, and can correct the image in accordance with the change in the optical characteristics.
  • In another aspect, the boundary detection unit detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element.
  • the boundary detection unit detects a boundary position of an image circle of the imaging optical system based on an integrated signal obtained by integrating the plurality of electrical signals.
  • In another aspect, the boundary detection unit selects the maximum luminance value at each corresponding pixel from the plurality of electrical signals and detects the boundary position of the image circle of the imaging optical system based on the selected maximum luminance values.
  • Such an image processing apparatus detects the boundary position of the image circle of the imaging optical system based on a plurality of electrical signals obtained by converting each of a plurality of optical images with the imaging element, so the boundary position of the image circle can be detected with higher accuracy. Therefore, such an image processing apparatus can correct the image with higher accuracy.
  • In another aspect, the predetermined parameter is one or more of a focal position, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient.
  • This provides an image processing apparatus that corrects an image using one or more of a focal position, a center position, a distortion coefficient, a translation coefficient, a rotation coefficient, and a blur coefficient as the predetermined parameter.
  • An image processing method according to another aspect includes: an imaging step of converting an optical image of a subject formed by an imaging optical system on a light receiving surface of an imaging element into an electrical signal; an image generation step of generating an image of the subject based on the electrical signal obtained in the imaging step; a boundary detection step of detecting a boundary position of an image circle of the imaging optical system based on the electrical signal obtained in the imaging step; a parameter determination step of determining a predetermined parameter related to predetermined image correction processing based on the boundary position of the image circle detected in the boundary detection step; and an image correction step of correcting the image of the subject generated in the image generation step using the predetermined parameter determined in the parameter determination step.
  • Such an image processing method detects the boundary position of the image circle of the imaging optical system based on the electrical signal output from the imaging element, determines a predetermined parameter related to predetermined image correction processing based on the detected boundary position of the image circle, and corrects the image of the subject using the determined predetermined parameter.
  • Because the image processing method uses the image circle inherent in the imaging optical system to determine the parameters for the image correction processing, a change in the optical characteristics of the lens can be detected without providing a special optical member, and the image can be corrected in accordance with the change in the optical characteristics.
  • An imaging apparatus according to another aspect includes an imaging optical system that forms an optical image of a subject on a predetermined surface, and any one of the above image processing apparatuses as an image processing unit that generates the image of the subject, assembled so that the light receiving surface of the imaging element serves as the predetermined surface of the imaging optical system.
  • Because such an imaging apparatus uses any of the above-described image processing apparatuses, it can detect a change in the optical characteristics of the lens without providing a special optical member and can correct the image in accordance with the change in the optical characteristics.
  • In another aspect, the imaging optical system includes a lens array in which a plurality of lenses are arranged in an array.
  • Because such an imaging apparatus includes, in the imaging optical system, a lens array in which small-diameter lenses are arranged, the back focus of the imaging optical system can be shortened and the height can be reduced.
  • Such an imaging device is particularly suitable for mobile phones and smartphones that are becoming thinner.
  • According to the present invention, an image processing device, an image processing method, and an imaging device can be provided.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

Provided are an image processing device, an image processing method, and an image forming device. The invention is characterized in that: the boundary position of an image circle of an imaging optical system is detected on the basis of an electrical signal output by an imaging element that receives an optical image of a photographic subject formed on a light receiving surface by the imaging optical system; predetermined parameters corresponding to predetermined image correction processing are determined on the basis of the detected boundary position of the image circle; and the image of the photographic subject is corrected using the determined predetermined parameters.
PCT/JP2014/000889 2013-03-05 2014-02-20 Dispositif de traitement d'image, procédé de traitement d'image, et dispositif de formation d'image WO2014136392A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015504151A JPWO2014136392A1 (ja) 2013-03-05 2014-02-20 画像処理装置および画像処理方法ならびに撮像装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013042771 2013-03-05
JP2013-042771 2013-03-05

Publications (1)

Publication Number Publication Date
WO2014136392A1 true WO2014136392A1 (fr) 2014-09-12

Family

ID=51490931

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/000889 WO2014136392A1 (fr) 2013-03-05 2014-02-20 Dispositif de traitement d'image, procédé de traitement d'image, et dispositif de formation d'image

Country Status (2)

Country Link
JP (1) JPWO2014136392A1 (fr)
WO (1) WO2014136392A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011114794A (ja) * 2009-11-30 2011-06-09 Canon Inc 撮影装置
JP2011147079A (ja) * 2010-01-18 2011-07-28 Ricoh Co Ltd 撮像装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020005840A (ja) * 2018-07-06 2020-01-16 株式会社ユニバーサルエンターテインメント 遊技機及び遊技用装置
JP7049200B2 (ja) 2018-07-06 2022-04-06 株式会社ユニバーサルエンターテインメント 遊技機及び遊技用装置

Also Published As

Publication number Publication date
JPWO2014136392A1 (ja) 2017-02-09

Similar Documents

Publication Publication Date Title
US9386216B2 (en) Imaging device, defocus amount calculating method, and imaging optical system
US9407884B2 (en) Image pickup apparatus, control method therefore and storage medium employing phase difference pixels
US9288384B2 (en) Image device and focus control method
JP6031587B2 (ja) 撮像装置、信号処理方法、信号処理プログラム
JP6173156B2 (ja) 画像処理装置、撮像装置及び画像処理方法
WO2013168505A1 (fr) Dispositif d'imagerie et procédé de correction de signaux
EP3606027B1 (fr) Dispositif électronique avec détection de lumière ambiante par caméra
JP6307526B2 (ja) 撮像装置及び合焦制御方法
CN104813212B (zh) 摄像装置及曝光确定方法
US9344651B2 (en) Signal processing apparatus for correcting an output signal of a focus detecting pixel cell to improve a captured image quality
KR102316448B1 (ko) 이미지 장치 및 그것의 뎁쓰 계산 방법
CN103842879A (zh) 成像装置和用于计算相位差像素的灵敏度比率的方法
US20220006956A1 (en) Camera module, method of correcting movement of the module, image stabilization device therefor
US20170187951A1 (en) Imaging device and focus control method
KR20130104756A (ko) 영상 촬영 장치 및 이에 포함된 이미지 센서
JP6462189B2 (ja) 合焦制御装置、合焦制御方法、合焦制御プログラム、レンズ装置、撮像装置
KR20130143381A (ko) 디지털 촬영 장치 및 그의 제어 방법
US20180191976A1 (en) Image sensor and imaging device
JP5493900B2 (ja) 撮像装置
KR20130084415A (ko) 카메라 모듈, 그의 오토 포커스 방법 및 오토 포커스 캘리브레이션하는 방법
JP6136583B2 (ja) 画像処理装置、撮像装置及び画像処理プログラム
WO2014136392A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, et dispositif de formation d'image
WO2018051699A1 (fr) Dispositif de capture d'image, procédé de capture d'image, et programme de capture d'image
KR101797080B1 (ko) 카메라 모듈 및 그의 렌즈 쉐이딩 교정 방법
WO2020021887A1 (fr) Appareil de traitement d'image, appareil d'imagerie, procédé de traitement d'image, et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14760310

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015504151

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14760310

Country of ref document: EP

Kind code of ref document: A1