WO2014042104A1 - Imaging controller and imaging control method and program - Google Patents
Imaging controller and imaging control method and program
- Publication number
- WO2014042104A1 (PCT/JP2013/074170; JP2013074170W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- images
- weighting
- index value
- area
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/04—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
Definitions
- the present invention relates to an imaging controller which can provide appropriate imaging conditions to multiple imaging units, and an imaging control method executed by the imaging controller and a program for realizing the imaging control method.
- omnidirectional imaging system which includes multiple wide-angle lenses such as fisheye lens or super wide-angle lens to capture an image in omnidirections at once. It is configured to project images from the lenses onto a sensor surface and combine the images through image processing to thereby generate an omnidirectional image. For example, by use of two wide-angle lenses with angle of view of over 180 degrees, omnidirectional images can be generated.
- image processing a partial image captured by each lens system is subjected to distortion correction and projection conversion on the basis of a certain projection model with a distortion from an ideal model taken into account. Then, the partial images are connected on the basis of an overlapping portion of the partial images to form a single omnidirectional image.
- Japanese Patent Application Publication No. 2007-329555 discloses an imaging system including multiple imaging units arranged to have an overlapping imaging area to extract an overlapping area from each of the images captured by the imaging units. It is configured to adjust at least one of the exposure and white balance of the imaging units according to each image of the extracted overlapping areas to reduce a difference in the brightness or color of the captured images, for the purpose of abating workloads of post-processing such as synthesis.
- the omnidirectional imaging system it is difficult for the omnidirectional imaging system to acquire a proper exposure by such a related-art exposure correction technique because the optical conditions or photographic circumstances of the imaging units thereof differ.
- the related art disclosed in the above document only concerns an overlapping area so that it cannot obtain appropriate exposure correction values under an unbalanced exposure condition if the overlapping area is small relative to the entire image.
- the imaging area of the omnidirectional imaging system is omnidirectional, a high-brightness subject as the sun is often captured on a sensor, which may cause a flare and an increase in image offset value.
- a proper exposure can be obtained for each sensor, however, it may cause a difference in brightness between the connecting portions of the images and impair the quality of an omnidirectional image.
- the present invention aims to provide an imaging controller and imaging control method and program which can provide to each of imaging units a proper imaging condition to abate a discontinuity at the connecting points of the images captured by the imaging units in synthesizing the images.
- an imaging controller comprises an index calculator to calculate an index value for each of divided areas of images captured by a plurality of imaging units, the index value for evaluating a photographic state of each of the divided areas, an evaluation value calculator to evaluate the images and an overlapping area between the images on the basis of the index value of each divided area calculated by the index calculator and calculate an overall evaluation value, and a condition determiner to determine an imaging condition for each of the imaging units on the basis of the overall evaluation value calculated by the evaluation value calculator.
- FIG. 1 is a cross section view of an omnidirectional imaging system according to the present embodiment
- FIG. 2 shows the hardware configuration of the omnidirectional imaging system in FIG. 1 ;
- FIG. 3 shows a flow of the entire image processing of the omnidirectional imaging system in FIG. 1 ;
- FIGs. 4A, 4B show 0th and 1st images captured by two fisheye lenses, respectively and
- FIG. 4C shows a synthetic image of the 0th and 1st captured images by way of example;
- FIG. 5A, 5B show an area division method according to the present embodiment
- FIG. 6 is a flowchart for exposure control executed by the omnidirectional imaging system according to the present embodiment.
- FIG. 7 is a flowchart for exposure calculation executed by the omnidirectional imaging system according to the present embodiment.
- the present embodiment describes an omnidirectional imaging system which comprises a camera unit including two fisheye lenses and a function to decide an imaging condition on the basis of images captured by the two fisheye lenses.
- the omnidirectional imaging system can comprise a camera unit including three or more fisheye lenses to determine an imaging condition according to the images captured by the fisheye lenses.
- a fisheye lens can include a wide-angle lens or a super wide-angle lens.
- FIG. 1 is a cross section view of the omnidirectional imaging system 10 (hereinafter, simply imaging system). It comprises a camera unit 12, a housing 14 accommodating the camera unit 12 and elements as controller, batteries, and a shutter button 18 provided on the housing 14.
- the camera unit 12 in FIG. 1 comprises two lens systems 20A, 20B and two solid-state image sensors 22A, 22B as CCD (charge coupled device) sensor or CMOS (complementary metal oxide semiconductor).
- CCD charge coupled device
- CMOS complementary metal oxide semiconductor
- each of the pairs of the lens systems 20 and solid-state image sensors 22 are referred to as imaging unit.
- the lens systems 20A, 20B are each comprised of seven lenses in six groups constituting a fisheye lens, for instance.
- the optical elements as lenses, prisms, filters, aperture stops of the lens systems 20A, 20B are positioned relative to the solid-state image sensors 22A, 22B so that the optical axes of the optical elements are orthogonal to the centers of the light receiving areas of the corresponding solid-state image sensors 22 as well as the light receiving areas become the imaging planes of the corresponding fisheye lenses.
- the solid-state image sensors 22 are area image sensors on which photodiodes are two-dimensionally arranged, to convert light gathered by the lens systems 20 to image signals.
- the lens systems 20A, 20B are the same and disposed opposite to each other so that their optical axes coincide.
- the solid-state image sensors 22A, 22B convert light distribution to image signals and output them to a not-shown image processor on the controller.
- the image processor combines partial images from the solid-state image sensors 22A, 22B to generate a synthetic image with a solid angle of 4π steradians, that is, an omnidirectional image.
- the omnidirectional image is captured in all the directions which can be seen from a shooting point. Instead of the omnidirectional image, a panorama image which is captured in a 360-degree range only on a horizontal plane can be generated.
- an overlapping portion of the captured images by the imaging units is used for connecting images as reference data representing the same image.
- Generated omnidirectional images are output to, for instance, a display provided in or connected to the camera unit 12, a printer or an external storage medium such as SD card®, compact flash®.
- FIG. 2 shows the structure of hardware of the imaging system 10 according to the present embodiment.
- the imaging system 10 comprises a digital still camera processor 100 (hereinafter, simply processor), a lens barrel unit 102, and various elements connected with the processor 100.
- the lens barrel unit 102 includes the two pairs of lens systems 20A, 20B and solid-state image sensors 22A, 22B.
- the solid-state image sensors 22A, 22B are controlled by a command from a CPU 130 of the processor 100.
- the processor 100 comprises ISPs (image signal processors) 108A, 108B, a DMAC (direct memory access controller) 110, an arbiter (ARBMEMC) 112 for memory access, a MEMC (memory controller) 114 for memory access, and a distortion correction and image synthesis block 118.
- the ISPs 108A, 108B perform automatic exposure control, white balance setting, and gamma setting on image data signal-processed by the solid-state image sensors 22A, 22B.
- the MEMC 114 is connected to an SDRAM 116 which temporarily stores data used in the processing of the ISPs 108A, 108B and the distortion correction and image synthesis block 118.
- the distortion correction and image synthesis block 118 performs distortion correction and vertical inclination correction on the two partial images from the two imaging units on the basis of information from a triaxial acceleration sensor 120 and synthesizes them.
- the processor 100 further comprises a DMAC 122, an image processing block 124, a CPU 130, an image data transferrer 126, an SDRAMC 128, a memory card control block 140, a USB block 146, a peripheral block 150, an audio unit 152, a serial block 158, an LCD (Liquid Crystal Display) driver 162, and a bridge 168.
- the CPU 130 controls the operations of the elements of the imaging system 10.
- the image processing block 124 performs various kinds of image processing on image data together with a resize block 132, a JPEG block 134, and an H.264 block 136.
- the resize block 132 enlarges or shrinks the size of image data by interpolation.
- the JPEG block 134 is a codec block to compress and decompress image data in JPEG.
- the H.264 block 136 is a codec block to compress and decompress video data in H.264.
- the image data transferrer 126 transfers the images processed by the image processing block 124.
- the SDRAMC 128 controls the SDRAM 138 connected to the processor 100 and temporarily storing image data during image processing by the processor 100.
- the memory card control block 140 controls reading and writing of data to a flash ROM 144 and to a memory card detachably inserted in a memory card slot 142.
- the USB block 146 controls USB communication with an external device such as personal computer connected via a USB connector 148.
- the peripheral block 150 is connected to a power switch 166.
- the audio unit 152 is connected to a microphone 156 for receiving an audio signal from a user and a speaker 154 for outputting the audio signal, to control audio input and output.
- the serial block 158 controls serial communication with the external device and is connected to a wireless NIC (network interface card) 160.
- the LCD driver 162 is a drive circuit for the LCD 164 and converts the image data to signals for displaying various kinds of information on an LCD 164.
- the flash ROM 144 contains a control program written in readable codes by the CPU 130 and various kinds of parameters. Upon power-on of the power switch 166, the control program is loaded onto a main memory.
- the CPU 130 controls the operations of the units and elements of the image processor in compliance with the control program on the main memory, and temporarily stores necessary control data in the SDRAM 138 and a not-shown local SRAM.
- FIG. 3 shows essential function blocks for controlling imaging condition and the flow of the entire image processing of the imaging system 10 according to the present embodiment.
- the solid-state image sensors 22A, 22B capture images under a certain exposure condition and output them.
- the ISPs 108 A, 108B in FIG. 2 perform optical black correction, defective pixel correction, linear correction, shading correction and area division (collectively referred to as first processing) to the images from the solid-state image sensors 22A, 22B and store them in memory.
- the optical black correction is a processing in which an output signal from an effective pixel area is subjected to clamp correction, using the output signals of optical black areas of the solid-state image sensors as a black reference level.
- a solid-state image sensor such as a CMOS sensor may contain defective pixels from which pixel values are not obtainable because of impurities entering the semiconductor substrate in the manufacturing of the image sensor.
- the defective pixel correction is a processing in which the value of a defective pixel is corrected according to a combined signal from neighboring pixels of the defective pixel.
- the linear correction is for each of RGBs.
- the shading correction is to correct a distortion of shading in an effective pixel area by multiplying the output signal of the effective pixel area by a certain correction coefficient.
- the area division is to divide a captured image into small areas and calculate an integrated value or an integrated average value of brightness values for each divided area.
- the ISPs 108A, 108B further perform white balance, gamma correction, Bayer interpolation, YUV conversion, edge enhancement and color correction (collectively referred to as second processing) to the images, and the images are stored in the memory.
- the amount of light transmitting through the color filters of the image sensors changes depending on the color of the filter.
- the white balance correction is to correct a difference in sensitivity to the three colors R (red), G (green), and B (blue) and set a gain for appropriately representing white color in an image.
- a WB (white balance) calculator 220 calculates a white balance parameter according to the RGB integrated value or integrated average value calculated in the area division process.
- the gamma correction is to correct a gamma value of an input signal so that the output linearity of an output device is maintained with the characteristic thereof taken into account.
- each pixel is attached with any of RGB color filters.
- the Bayer interpolation is to interpolate insufficient two colors from neighboring pixels.
- the YUV conversion is to convert RAW data in RGB format to data in YUV format of a brightness signal Y and a color difference signal UV.
- the edge enhancement is to extract the edges of an image according to a brightness signal, apply a gain to the edges, and remove noise in the image in parallel to the edge extraction.
- the color correction includes chroma setting, hue setting, partial hue change, and color suppression.
- the images are subjected to distortion correction and image synthesis.
- a generated omnidirectional image is added with a tag properly and stored in a file in the internal memory or an external storage.
- Inclination correction can be additionally performed on the basis of the information from the triaxial acceleration sensor 120 or a stored image file can be subjected to compression when appropriate.
- a thumb-nail image can be generated by cropping or cutting out the center area of an image.
- the exposure parameter for the solid-state image sensors 22A, 22B is determined and set in an exposure condition register 200 by an exposure condition controller 210.
- the imaging system 10 does not need to include a photometer for measuring the brightness of a subject but uses the outputs of the solid-state image sensors 22A, 22B for exposure control.
- image signals are constantly read from the solid-state image sensors 22A, 22B.
- the exposure condition controller 210 repeatedly conducts a photometry on the basis of a read image signal and determines whether a brightness level is appropriate, to correct the exposure parameter such as F-value, exposure time (shutter speed), amplifier gain (ISO sensitivity) and obtain a proper exposure.
- the two imaging units In omnidirectional photographing with the omnidirectional imaging system 10, the two imaging units generate two images.
- a flare may occur in one of the images as shown in FIGs. 4A, 4B and spread over the entire image from the high-brightness object.
- a synthetic image of the two images or omnidirectional image may be impaired in quality because an increased offset of the one of the images causes a difference in brightness at the connecting portions. Further, no proper object for exposure correction but an extremely white or black object will appear in an overlapping area of the two images.
- the exposure condition controller 210 is configured to evaluate the level of exposure of all of the images with the overlapping area and non-overlapping areas of the images taken into consideration and decide exposure parameters as aperture a, exposure time t and amplifier gain g for the solid-state image sensors 22A, 22B to be set in the exposure condition register 200.
- the exposure condition controller 210 includes an area calculator 212, an overall calculator 214 and an exposure condition determiner 216, and can be realized by the ISPs 108 and CPU 130.
- the ISPs 108 A, 108B calculate the integrated value or integrated average value for each divided area and outputs integrated data for each divided area, and the exposure condition controller 210 reads the integrated data.
- FIGs. 5A, 5B show how to divide an image into small areas by way of example.
- incident light on the lens systems 20A, 20B is imaged on the light-receiving areas of the solid-state image sensors 22 A, 22B in accordance with a certain projection model such as equidistant projection.
- Images are captured on the two-dimensional solid-state area image sensors and image data represented in a plane coordinate system.
- a circular fisheye lens having an image circle diameter smaller than an image diagonal line is used and an obtained image is a planar image including the entire image circle in which the photographic areas in FIGs. 4A, 4B are projected.
- each solid-state image sensor is divided into small areas in a circular polar coordinate system with radius r and argument θ in FIG. 5A or small areas in a planar orthogonal coordinate system with x and y coordinates in FIG. 5B. It is preferable to exclude the outside of the image circle from a subject of integration and averaging since it is a non-exposed outside area.
- each image is divided into small areas as shown in FIGs. 5A, 5B and the integrated value or integrated average value of brightness is calculated for each divided area.
- the integrated value is obtained by integrating the brightness values of all the pixels in each divided area while the integrated average value is obtained by normalizing the integrated value with the size (number of pixels) of each divided area excluding the outside area.
- the area calculator 212 receives the integrated data for each divided area including the integrated average value and calculates an index value for each divided area to evaluate a photographic state thereof.
- the index value is an area brightness level b for evaluating an absolute brightness of each divided area.
- the brightness level b(x, y) of a certain divided area is calculated for each solid-state image sensor 22 by the following equation: where s(x,y) is an integrated average value for a certain divided area, a is an aperture, t is exposure time, and g is amplifier gain.
- the brightness level b(r, θ) of a divided area in a circular coordinate system can be calculated in the same manner.
- the brightness level b (x, y) is an index value to evaluate the brightness of a subject in each divided area and calculated from the brightness of a pixel value of an actual image according to a current exposure parameter (a, t, g).
- the overall calculator 214 evaluates the captured images including the overlapping area as a whole on the basis of the calculated brightness levels b(x, y) and calculates an overall evaluation value with weighting according to an overlapping portion between the photographic areas of the images.
- the solid-state image sensors 22A, 22B are referred to as 0th and 1st image sensors and their brightness levels are referred to as b^0(x, y) and b^1(x, y), respectively.
- the overall evaluation value is an overall brightness level bT to evaluate the brightness of all the areas of the images or subject brightness as a whole with a certain weighting.
- the overall brightness level bT^i is calculated for an i-th solid-state image sensor by the following equation:
- b^j(x, y) is a brightness level for each divided area of each solid-state image sensor j (j ∈ {0, 1}) and w^ji(x, y) is a weighted value in weighted averaging for each divided area.
- different weighted values w^ji(x, y) are used for each solid-state image sensor.
- the weighted values w^ji(x, y) can be adjusted so that a larger value is given to a solid-state image sensor with a lower brightness level to prevent receipt of an influence from a light source. Thus, weighting can be performed properly in accordance with a result of determination about a photographic scene.
- the small areas (x, y) can be zoned into intermediate areas, namely an overlapping (edge) area and a non-overlapping (center) area, as indicated by hatching in FIGs. 5A, 5B.
- the overall brightness level bT^i can be calculated by the following equation, using the brightness level and weighted values for each intermediate area (a numerical sketch of this weighted evaluation is given after this list).
- bE^0 is a brightness level (average) of an edge area (overlapping area) of the 0th image
- bC^0 is a brightness level (average) of a center area (non-overlapping area) of the 0th image
- bE^1 is a brightness level (average) of an edge area (overlapping area) of the 1st image
- bC^1 is a brightness level (average) of a center area (non-overlapping area) of the 1st image
- w1_i to w4_i are weighted values of weighted averaging set for each intermediate area of an i-th solid-state image sensor.
- the basic values of the weighted values w1_i to w4_i can be calculated by the following equations (4):
- the overall calculator 214 can include a weighting setter to set the weighted values w^ji(x, y) according to a signal level of a captured image.
- the weighting setter is configured to create a brightness distribution (histogram) from the brightness levels b^j(x, y) of all the divided areas and analyze a total average value and the brightness distribution for the scene determination. Then, according to a determined scene, it can change the weighted value w^ji(x, y) for a certain divided area depending on the brightness level b^j(x, y) of the divided area in question.
- the weighting setter can set a larger weighted value w^ji(x, y) for a divided area with a larger calculated brightness level b(x, y), for example by adding a predetermined amount according to the area brightness level.
- thereby, a bright subject in a dark scene is highly evaluated for photometry and exposure can be properly controlled according to a result of the photometry.
- the weighting setter can also set a larger weighted value w^ji(x, y) for a divided area with a smaller calculated brightness level b(x, y).
- thereby, a dark subject in a bright scene is highly evaluated for photometry.
- a divided area containing an extremely black or white subject can be detected according to an upper limit threshold and a lower limit threshold to exclude the divided area from the subjects of the photometry. For example, if a divided area with a brightness level equal to or over an upper limit threshold (b_uth), i.e. a white area, and/or a divided area with a brightness level equal to or below a lower limit threshold (b_lth), i.e. a black area, is detected, these areas can be given a smaller weight or zero. Thereby, it is possible to calculate the overall brightness level bT with the divided areas unsuitable for exposure correction given a small weight or not taken into account.
- b_uth: an upper limit threshold
- b_lth: a lower limit threshold
- the weighting setter determines a scene of each of the 0th and 1st images captured by the two solid-state image sensors 22A, 22B on the basis of a relation of the brightness between the two images and sets weighted values appropriate for the scene for each of the solid-state image sensors 22A, 22B. For example, to prevent receipt of an influence from a light source, the weighted values w^ji(x, y) can be adjusted so that a larger value is given to a solid-state image sensor with a lower brightness level.
- the exposure condition determiner 216 determines an exposure parameter (a, t, g) for each i-th solid-state image sensor on the basis of the overall brightness level bT^i calculated by the overall calculator 214.
- the overall brightness level bT is to evaluate the brightness of a subject in the images, and a condition for acquiring a proper exposure can be represented by the following conditions:
- the condition (5) is a transformation of the condition (4) taking 2 as the base of the logarithm of each of the four parameters.
- the exposure condition determiner 216 adjusts the aperture a, exposure time t, and amplifier gain g for each i-th solid-state image sensor 22 according to a current exposure parameter and a measured subject brightness or overall brightness level and acquires a proper exposure.
- the corrected exposure parameter (a', t', g') can be obtained from the brightness level bT^i by referring to a table called a program diagram which is prepared in advance in accordance with the characteristics of the imaging system 10.
- the program diagram refers to a diagram or a table containing the combinations of the amplifier gain g and exposure time t with a fixed aperture a. Exposure values can be determined from the combinations.
- an optimal combination of the aperture a, exposure time t and amplifier gain g is obtainable from the overall brightness level bT by a certain program diagram.
- at least one of the aperture a, exposure time t and amplifier gain g can be manually set and the rest of them can be found from the program diagram.
- Such automatic exposure mode exemplifies shutter priority mode in which exposure time t is manually set, aperture priority mode in which aperture a is manually set, and sensitivity priority mode in which amplifier gain g is manually set.
- FIG. 6 is a flowchart for the exposure control while FIG. 7 is a flowchart for exposure calculation process of the exposure control.
- the operation in FIG. 6 is repeatedly executed every time images are captured by the image sensors 22A, 22B.
- in step S101, the imaging system 10 calculates an integrated average value for each divided area of the two image sensors 22A, 22B by integrating the pixel values thereof.
- in step S102, the exposure calculation in FIG. 7 is called up.
- in step S201, the imaging system 10 takes the statistics of the integrated average values for the divided areas of each solid-state image sensor 22 to calculate an average and a dispersion (or a standard deviation) of each image.
- in step S202, the imaging system 10 determines whether a current exposure parameter is in an allowable range, judging from the average and dispersion of the two images, to sufficiently evaluate a subject brightness. For example, if the average of an image is close to zero and the dispersion is lower than a certain threshold, it is probable that black saturation occurs in the captured images. In contrast, if the average is close to saturation and the dispersion is low, it is probable that white saturation occurs in the captured images. The light amount of captured images with black or white saturation cannot be properly measured, so the exposure parameter indicating black or white saturation is determined to be outside the allowable range.
- if the exposure parameter is outside the allowable range, the imaging system 10 proceeds to step S206, adjusts the exposure parameter to acquire a proper exposure, and completes the operation in step S207.
- in black saturation, the exposure parameter is adjusted so that the aperture is opened and the exposure time and sensitivity are increased.
- in white saturation, the exposure parameter is adjusted so that the aperture is closed and the exposure time and sensitivity are decreased.
- when the gain g is fixed, the exposure parameter (a', t') is adjusted by lowering an exposure value Ev by a predetermined number of steps in black saturation.
- in white saturation, the exposure parameter (a', t') is adjusted by raising the exposure value Ev by a predetermined number of steps.
- when the aperture a is fixed, the exposure parameter (t', g') is adjusted by increasing the exposure time t and amplifier gain g in black saturation and decreasing them in white saturation.
- in step S203, the area calculator 212 calculates a brightness level b(x, y) for each divided area of the 0th and 1st images on the basis of the integrated average value s(x, y) and a current exposure parameter (a, t, g).
- in step S204, the overall calculator 214 determines a scene of the images and reads a weighted value w^ji(x, y) for the determined scene.
- in step S205, the overall calculator 214 calculates the weighted average of the brightness levels b^0(x, y) and b^1(x, y) and calculates the overall brightness level bT^i for each i-th solid-state image sensor by the equations (2) and (3).
- in step S206, the exposure condition determiner 216 adjusts the exposure according to the overall brightness level bT^i to satisfy the conditions (4) and (5) and determines the exposure parameter (a', t', g'). Then, the imaging system 10 completes the exposure calculation and returns to step S103 in FIG. 6.
- in step S103, the exposure parameter in the exposure condition register 200 is updated to the determined exposure parameter (a', t', g'), completing the exposure control operation.
- the exposure condition is set to a proper exposure satisfying the above conditions (4), (5).
- the same subject is captured in the overlapping area, therefore, the brightness levels b of the two images should be the same value.
- the overall exposure level of the captured images including the overlapping area and non-overlapping areas is evaluated to determine the exposure parameter (a, t, g) for each of the imaging units.
- an imaging controller and imaging control method and program which can provide to each of imaging units a proper imaging condition to abate a discontinuity at the connecting points of the images captured by the imaging units in synthesizing the images.
- the imaging system 10 to capture an omnidirectional still image as an example of the imaging controller.
- the present invention should not be limited to such an example.
- the imaging controller can be configured as an omnidirectional video imaging system or unit, a portable data terminal such as a smart phone or tablet having an omnidirectional still or video shooting function, or a digital still camera processor or a controller to control a camera unit of an imaging system.
- the functions of the omnidirectional imaging system can be realized by a computer-executable program written in a legacy programming language such as assembler, C, C++, C#, or Java®, or in an object-oriented programming language.
- such a program can be stored in a storage medium such as ROM, EEPROM, EPROM, flash memory, flexible disc, CD-ROM, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, Blu-ray disc, SD card, or MO and distributed through a telecommunication line.
- a part or all of the above functions can be implemented on, for example, a programmable device (PD) as field programmable gate array (FPGA) or implemented as application specific integrated circuit (ASIC).
- PD programmable device
- FPGA field programmable gate array
- ASIC application specific integrated circuit
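The weighting and exposure-adjustment items above refer to equations (2) to (5), which appear as images in the original publication and are not reproduced in this text. The Python sketch below therefore only illustrates the general idea under stated assumptions: a plain weighted average over divided areas for the overall brightness level, threshold-based exclusion of extremely white or black areas (the names b_uth and b_lth follow the text, the numeric values are placeholders), and a fixed Ev-step adjustment of the exposure time only. None of this is the patent's exact formulation.

```python
import numpy as np

def area_weights(b, w_base=1.0, b_uth=240.0, b_lth=5.0):
    """Illustrative weighting of divided areas: start from a base weight and
    give zero weight to areas at or above the upper threshold (white areas)
    or at or below the lower threshold (black areas), so they are excluded
    from photometry. Threshold values are placeholders."""
    w = np.full(b.shape, w_base, dtype=float)
    w[b >= b_uth] = 0.0
    w[b <= b_lth] = 0.0
    return w

def overall_brightness_level(b0, b1, w0, w1):
    """Weighted average of the per-area brightness levels of the 0th and 1st
    images, including their overlapping areas (an assumed form of the overall
    brightness level bT^i; equation (2) itself is not reproduced here)."""
    total_w = w0.sum() + w1.sum()
    if total_w == 0.0:
        # every area excluded: fall back to an unweighted mean
        return float((b0.mean() + b1.mean()) / 2.0)
    return float(((w0 * b0).sum() + (w1 * b1).sum()) / total_w)

def adjust_exposure_time(bT, target_level, t, step_ev=0.25):
    """Move the exposure towards the target level by a fixed Ev step,
    adjusting only the exposure time t (aperture and gain fixed). The mapping
    between bT and Ev steps is an assumption for illustration."""
    if bT < target_level:      # too dark: lower Ev, i.e. lengthen exposure
        return t * (2.0 ** step_ev)
    if bT > target_level:      # too bright: raise Ev, i.e. shorten exposure
        return t / (2.0 ** step_ev)
    return t

# usage sketch: b0, b1 are per-area brightness levels of the two sensors
b0 = np.random.uniform(0, 255, (16, 16))
b1 = np.random.uniform(0, 255, (16, 16))
w0, w1 = area_weights(b0), area_weights(b1)
bT = overall_brightness_level(b0, b1, w0, w1)
new_t = adjust_exposure_time(bT, target_level=118.0, t=1 / 60)
```

In practice the weights could additionally be biased per sensor (e.g. towards the sensor with the lower overall level, as the text suggests) before the weighted average is taken.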
Abstract
An imaging controller includes an index calculator to calculate an index value for each of divided areas of images captured by a plurality of imaging units, the index value for evaluating a photographic state of each of the divided areas, an evaluation value calculator to evaluate the images and an overlapping area between the images on the basis of the index value of each divided area calculated by the index calculator and calculate an overall evaluation value, and a condition determiner to determine an imaging condition for each of the imaging units on the basis of the overall evaluation value calculated by the evaluation value calculator.
Description
DESCRIPTION
IMAGING CONTROLLER AND IMAGING CONTROL METHOD AND PROGRAM
CROSS REFERENCE TO RELATED APPLICATION
The present application is based on and claims priority from Japanese Patent Application No. 2012-199622, filed on September 11, 2012.
Technical Field
[0001 ]
The present invention relates to an imaging controller which can provide appropriate imaging conditions to multiple imaging units, and an imaging control method executed by the imaging controller and a program for realizing the imaging control method.
Background Art
[0002]
There is a known omnidirectional imaging system which includes multiple wide-angle lenses such as fisheye lenses or super wide-angle lenses to capture an omnidirectional image at once. It is configured to project images from the lenses onto a sensor surface and combine the images through image processing to thereby generate an omnidirectional image. For example, by use of two wide-angle lenses with an angle of view of over 180 degrees, omnidirectional images can be generated. In the image processing, a partial image captured by each lens system is subjected to distortion correction and projection conversion on the basis of a certain projection model, with a distortion from an ideal model taken into account. Then, the partial images are connected on the basis of an overlapping portion of the partial images to form a single omnidirectional image.
[0003]
In related art an exposure correction technique of a digital camera to acquire a proper exposure from a captured image is known. For instance, Japanese Patent Application Publication No. 2007-329555 discloses an imaging system including multiple imaging units arranged to have an overlapping imaging area to extract an overlapping area from each of the images captured by the imaging units. It is configured to adjust at least one of the exposure and white balance of the imaging units according to each image of the extracted overlapping areas to reduce a difference in the brightness or color of the captured images, for the purpose of abating workloads of post-processing such as synthesis.
[0004]
However, it is difficult for the omnidirectional imaging system to acquire a proper exposure by such a related-art exposure correction technique because the optical conditions or photographic circumstances of its imaging units differ. The related art disclosed in the above document only concerns an overlapping area, so it cannot obtain appropriate exposure correction values under an unbalanced exposure condition if the overlapping area is small relative to the entire image. In particular, since the imaging area of the omnidirectional imaging system is omnidirectional, a high-brightness subject such as the sun is often captured on a sensor, which may cause a flare and an increase in the image offset value. Even if a proper exposure is obtained for each sensor, this may cause a difference in brightness between the connecting portions of the images and impair the quality of an omnidirectional image.
Disclosure of the Invention
[0005]
The present invention aims to provide an imaging controller and imaging control method and program which can provide to each of imaging units a proper imaging condition to abate a discontinuity at the connecting points of the images captured by the imaging units in synthesizing the images.
[0006]
According to one aspect of the present invention, an imaging controller comprises an index calculator to calculate an index value for each of divided areas of images captured by a plurality of imaging units, the index value for evaluating a photographic state of each of the divided areas, an evaluation value calculator to evaluate the images and an overlapping area between the images on the basis of the index value of each divided area calculated by the index calculator and calculate an overall evaluation value, and a condition determiner to determine an imaging condition for each of the imaging units on the basis of the overall evaluation value calculated by the evaluation value calculator.
Brief Description of the Drawings
[0007]
Features, embodiments, and advantages of the present invention will become apparent from the following detailed description with reference to the accompanying drawings:
FIG. 1 is a cross section view of an omnidirectional imaging system according to the present embodiment;
FIG. 2 shows the hardware configuration of the omnidirectional imaging system in FIG. 1 ;
FIG. 3 shows a flow of the entire image processing of the omnidirectional imaging system in FIG. 1 ;
FIGs. 4A, 4B show 0th and 1st images captured by two fisheye lenses, respectively and FIG. 4C shows a synthetic image of the 0th and 1st captured images by way of example;
FIG. 5A, 5B show an area division method according to the present embodiment;
FIG. 6 is a flowchart for exposure control executed by the omnidirectional imaging system according to the present embodiment; and
FIG. 7 is a flowchart for exposure calculation executed by the omnidirectional imaging system according to the present embodiment.
Description of Embodiments
[0008]
Hereinafter, an embodiment of an imaging controller and an imaging system will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. By way of example, the present embodiment describes an omnidirectional imaging system which comprises a camera unit including two fisheye lenses and a function to decide an imaging condition on the basis of images captured by the two fisheye lenses. However, the present embodiment should not be limited to such an example. Alternatively, the omnidirectional imaging system can comprise a camera unit including three or more fisheye lenses to determine an imaging condition according to the images captured by the fisheye lenses. Herein, a fisheye lens can include a wide-angle lens or a super wide-angle lens.
[0009]
Referring to FIGs. 1 to 2, the overall configuration of an omnidirectional imaging system 10 is described. FIG. 1 is a cross section view of the
omnidirectional imaging system 10 (hereinafter, simply the imaging system). It comprises a camera unit 12, a housing 14 accommodating the camera unit 12 and components such as a controller and batteries, and a shutter button 18 provided on the housing 14.
[0010]
The camera unit 12 in FIG. 1 comprises two lens systems 20A, 20B and two solid-state image sensors 22A, 22B such as CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) sensors. Herein, each pair of a lens system 20 and a solid-state image sensor 22 is referred to as an imaging unit. The lens systems 20A, 20B are each comprised of seven lenses in six groups constituting a fisheye lens, for instance. In the present embodiment the fisheye lens has a total angle of view of 180 degrees (360 degrees/n, n = 2) or more, preferably 185 degrees or more, more preferably 190 degrees or more.
[001 1 ]
The optical elements as lenses, prisms, filters, aperture stops of the lens systems 20A, 20B are positioned relative to the solid-state image sensors 22A, 22B so that the optical axes of the optical elements are orthogonal to the centers of the light receiving areas of the corresponding solid-state image sensors 22 as well as the light receiving areas become the imaging planes of the corresponding fisheye lenses. The solid-state image sensors 22 are area image sensors on which photodiodes are two-dimensionally arranged, to convert light gathered by the lens systems 20 to image signals.
[0012]
In the present embodiment the lens systems 20A, 20B are the same and disposed opposite to each other so that their optical axes coincide. The solid-state image sensors 22A, 22B convert light distribution to image signals and output them to a not-shown image processor on the controller. The image
processor combines partial images from the solid-state image sensors 22A, 22B to generate a synthetic image with a solid angle of 4π steradians, that is, an omnidirectional image. The omnidirectional image is captured in all the directions which can be seen from a shooting point. Instead of the omnidirectional image, a panorama image which is captured in a 360-degree range only on a horizontal plane can be generated.
[0013]
To form an omnidirectional image with use of the fisheye lenses with total angle of view of more than 180 degrees, an overlapping portion of the captured images by the imaging units is used for connecting images as reference data representing the same image. Generated omnidirectional images are output to, for instance, a display provided in or connected to the camera unit 12, a printer or an external storage medium such as SD card®, compact flash®.
[0014]
FIG. 2 shows the structure of hardware of the imaging system 10 according to the present embodiment. The imaging system 10 comprises a digital still camera processor 100 (hereinafter, simply processor), a lens barrel unit 102, and various elements connected with the processor 100. The lens barrel unit 102 includes the two pairs of lens systems 20A, 20B and solid-state image sensors 22A, 22B. The solid-state image sensors 22A, 22B are controlled by a command from a CPU 130 of the processor 100.
[0015]
The processor 100 comprises ISPs (image signal processors) 108A, 108B, a DMAC (direct memory access controller) 110, an arbiter (ARBMEMC) 112 for memory access, a MEMC (memory controller) 114 for memory access, and a distortion correction and image synthesis block 118. The ISPs 108A, 108B perform automatic exposure control, white balance setting, and gamma setting on image data signal-processed by the solid-state image sensors 22A, 22B.
[0016]
The MEMC 114 is connected to an SDRAM 116 which temporarily stores data used in the processing of the ISPs 108A, 108B and the distortion correction and image synthesis block 118. The distortion correction and image synthesis block 118 performs distortion correction and vertical inclination correction on the two partial images from the two imaging units on the basis of information from a triaxial acceleration sensor 120 and synthesizes them.
[0017]
The processor 100 further comprises a DMAC 122, an image processing block 124, a CPU 130, an image data transferrer 126, an SDRAMC 128, a memory card control block 140, a USB block 146, a peripheral block 150, an audio unit 152, a serial block 158, an LCD (Liquid Crystal Display) driver 162, and a bridge 168.
[0018]
The CPU 130 controls the operations of the elements of the imaging system 10. The image processing block 124 performs various kinds of image processing on image data together with a resize block 132, a JPEG block 134, and an H.264 block 136. The resize block 132 enlarges or shrinks the size of image data by interpolation. The JPEG block 134 is a codec block to compress and decompress image data in JPEG. The H.264 block 136 is a codec block to compress and decompress video data in H.264. The image data transferrer 126 transfers the images processed by the image processing block 124. The SDRAMC 128 controls the SDRAM 138 connected to the processor 100 and temporarily storing image data during image processing by the processor 100.
[0019]
The memory card control block 140 controls reading and writing of data to a flash ROM 144 and to a memory card detachably inserted in a memory card slot 142. The USB block 146 controls USB communication with an external device such as a personal computer connected via a USB connector 148. The peripheral block 150 is connected to a power switch 166.
[0020]
The audio unit 152 is connected to a microphone 156 for receiving an audio signal from a user and a speaker 154 for outputting the audio signal, to control audio input and output. The serial block 158 controls serial communication with the external device and is connected to a wireless NIC (network interface card) 160. The LCD driver 162 is a drive circuit for the LCD 164 and converts the image data to signals for displaying various kinds of information on an LCD 164.
[0021 ]
The flash ROM 144 contains a control program written in codes readable by the CPU 130 and various kinds of parameters. Upon power-on of the power switch 166, the control program is loaded onto a main memory. The CPU 130 controls the operations of the units and elements of the image processor in compliance with the control program on the main memory, and temporarily stores necessary control data in the SDRAM 138 and a not-shown local SRAM.
[0022]
FIG. 3 shows essential function blocks for controlling the imaging condition and the flow of the entire image processing of the imaging system 10 according to the present embodiment. First, the solid-state image sensors 22A, 22B capture images under a certain exposure condition and output them. Then, the ISPs 108A, 108B in FIG. 2 perform optical black correction, defective pixel correction, linear
correction, shading correction and area division (collectively referred to as first processing) to the images from the solid-state image sensors 22A, 22B and store them in memory.
[0023]
The optical black correction is a processing in which an output signal from an effective pixel area is subjected to clamp correction, using the output signals of the optical black areas of the solid-state image sensors as a black reference level. A solid-state image sensor such as a CMOS sensor may contain defective pixels from which pixel values are not obtainable because of impurities entering the semiconductor substrate in the manufacturing of the image sensor. The defective pixel correction is a processing in which the value of a defective pixel is corrected according to a combined signal from neighboring pixels of the defective pixel.
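As a rough illustration of these two corrections (not the patent's implementation), the sketch below clamps the effective pixel area using the mean of an optical black region as the black reference and replaces defective pixels with the mean of their valid 3x3 neighbours; the mask arguments are hypothetical inputs.

```python
import numpy as np

def optical_black_correction(raw, ob_mask):
    """Subtract the mean of the optical black (light-shielded) pixels as the
    black reference level and clamp negative values to zero."""
    black_level = raw[ob_mask].mean()
    return np.clip(raw.astype(float) - black_level, 0.0, None)

def defective_pixel_correction(img, defect_mask):
    """Replace each defective pixel with the mean of its non-defective
    3x3 neighbours."""
    out = img.astype(float).copy()
    for y, x in zip(*np.nonzero(defect_mask)):
        y0, y1 = max(y - 1, 0), min(y + 2, img.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, img.shape[1])
        patch = img[y0:y1, x0:x1]
        valid = ~defect_mask[y0:y1, x0:x1]
        if valid.any():
            out[y, x] = patch[valid].mean()
    return out
```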
[0024]
The linear correction is for each of RGBs. The shading correction is to correct a distortion of shading in an effective pixel area by multiplying the output signal of the effective pixel area by a certain correction coefficient. The area division is to divide a captured image into small areas and calculate an integrated value or an integrated average value of brightness values for each divided area.
[0025]
Returning to FIG. 3, after the first processing the ISPs 108A, 108B further perform white balance, gamma correction, Bayer interpolation, YUV conversion, edge enhancement and color correction (collectively referred to as second processing) to the images, and the images are stored in the memory. The amount of light transmitting through the color filters of the image sensors changes depending on the color of the filter. The white balance correction is to correct a difference in sensitivity to the three colors R (red), G (green), and B (blue) and set a gain for appropriately representing white color in an image. A WB (white
balance) calculator 220 calculates a white balance parameter according to the RGB integrated value or integrated average value calculated in the area division process. The gamma correction is to correct a gamma value of an input signal so that the output linearity of an output device is maintained with the characteristic thereof taken into account.
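The text only states that the WB calculator 220 derives a white balance parameter from the RGB integrated (average) values; the exact formula is not given. A minimal gray-world-style sketch, assuming the gains simply equalize the channel averages against green:

```python
def white_balance_gains(r_avg, g_avg, b_avg):
    """Gray-world-style gains computed from per-channel integrated averages
    (an assumption for illustration, not the patent's formula)."""
    return {"r": g_avg / r_avg, "g": 1.0, "b": g_avg / b_avg}

# e.g. white_balance_gains(90.0, 110.0, 70.0) -> {'r': 1.22..., 'g': 1.0, 'b': 1.57...}
```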
[0026]
Further, in the CMOS sensor each pixel is covered with one of the RGB color filters. The Bayer interpolation is to interpolate the two missing colors of each pixel from neighboring pixels. The YUV conversion is to convert RAW data in RGB format to data in YUV format of a brightness signal Y and a color difference signal UV. The edge enhancement is to extract the edges of an image according to a brightness signal, apply a gain to the edges, and remove noise in the image in parallel to the edge extraction. The color correction includes chroma setting, hue setting, partial hue change, and color suppression.
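As one common instance of the YUV conversion mentioned above (the text does not specify which coefficient set the ISP uses), a BT.601-style conversion looks like this:

```python
def rgb_to_yuv(r, g, b):
    """Convert linear RGB values to a luma signal Y and color-difference
    signals U, V using BT.601-style coefficients (assumed for illustration)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v
```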
[0027]
After the various kinds of processing of the images captured under a certain exposure parameter, the images are subjected to distortion correction and image synthesis. A generated omnidirectional image is appropriately tagged and stored in a file in the internal memory or an external storage. Inclination correction can be additionally performed on the basis of the information from the triaxial acceleration sensor 120, or a stored image file can be subjected to compression when appropriate. A thumbnail image can be generated by cropping out the center area of an image.
[0028]
In the above-described image processing the exposure parameter for the solid-state image sensors 22A, 22B is determined and set in an exposure condition register 200 by an exposure condition controller 210. The imaging system 10
according to the present embodiment does not need to include a photometer for measuring the brightness of a subject but uses the outputs of the solid-state image sensors 22A, 22B for exposure control. To display a captured image on an LCD or an EVF (electronic view finder), image signals are constantly read from the solid-state image sensors 22A, 22B. The exposure condition controller 210 repeatedly conducts photometry on the basis of a read image signal and determines whether the brightness level is appropriate, to correct exposure parameters such as the F-value, exposure time (shutter speed), and amplifier gain (ISO sensitivity) and obtain a proper exposure.
[0029]
In omnidirectional photographing with the omnidirectional imaging system 10, the two imaging units generate two images. In a photographic scene including a high-brightness object such as the sun, a flare may occur in one of the images as shown in FIGs. 4A, 4B and spread over the entire image from the high-brightness object. In such a case, a synthetic image of the two images, i.e. the omnidirectional image, may be impaired in quality because an increased offset of one of the images causes a difference in brightness at the connecting portions. Further, the overlapping area of the two images may contain no object suitable for exposure correction, only an extremely white or black object.
[0030]
In imaging units using fisheye lenses with a total angle of view of over 180 degrees, most of the photographic areas do not overlap except for the partial overlapping areas. Because of this, it is difficult to acquire a proper exposure for the above scene by exposure correction based only on the overlapping area. Further, even with a proper exposure obtained for the individual imaging units, a discontinuity in a color attribute such as brightness may occur at the connecting positions of a synthetic image.
[0031]
To avoid such insufficient exposure control, in the imaging system 10 the exposure condition controller 210 is configured to evaluate the exposure level of all of the images with the overlapping area and non-overlapping areas of the images taken into consideration, and to decide the exposure parameters, namely aperture a, exposure time t and amplifier gain g, for the solid-state image sensors 22A, 22B to be set in the exposure condition register 200.
[0032]
Specifically, the exposure condition controller 210 includes an area calculator 212, an overall calculator 214 and an exposure condition determiner 216, and can be realized by the ISPs 108 and the CPU 130. In the first processing the ISPs 108A, 108B calculate the integrated value or integrated average value for each divided area and output the integrated data for each divided area, and the exposure condition controller 210 reads the integrated data.
[0033]
FIGs. 5A, 5B show how to divide an image into small areas by way of example. In the present embodiment incident light on the lens systems 20A, 20B is imaged on the light-receiving areas of the solid-state image sensors 22A, 22B in accordance with a certain projection model such as equidistant projection. Images are captured on the two-dimensional solid-state area image sensors, and the image data is represented in a plane coordinate system. In the present embodiment a circular fisheye lens having an image circle diameter smaller than the image diagonal line is used, and an obtained image is a planar image including the entire image circle in which the photographic areas in FIGs. 4A, 4B are projected.
[0034]
The entire image captured by each solid-state image sensor is divided into small areas in a circular polar coordinate system with radius r and argument Θ as in FIG. 5A, or into small areas in a planar orthogonal coordinate system with x and y coordinates as in FIG. 5B. It is preferable to exclude the outside of the image circle from the integration and averaging since it is a non-exposed outside area. In the area division of the ISPs 108, each image is divided into small areas as shown in FIGs. 5A, 5B and the integrated value or integrated average value of brightness is calculated for each divided area. The integrated value is obtained by integrating the brightness values of all the pixels in each divided area, while the integrated average value is obtained by normalizing the integrated value with the size (number of pixels) of each divided area excluding the outside area.
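A minimal Python sketch of this per-area statistic is shown below; the 16x16 grid and the image-circle center and radius are hypothetical values chosen for the sketch, since the embodiment does not specify them.

```python
import numpy as np

def area_integrated_averages(image, circle_center, circle_radius, grid=(16, 16)):
    """Divide a fisheye frame into grid cells (x, y) and return the average brightness of
    the pixels inside the image circle for each cell; cells fully outside become NaN."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    inside = (xx - circle_center[0]) ** 2 + (yy - circle_center[1]) ** 2 <= circle_radius ** 2

    gx, gy = grid
    averages = np.full((gy, gx), np.nan)
    for j in range(gy):
        for i in range(gx):
            ys = slice(j * h // gy, (j + 1) * h // gy)
            xs = slice(i * w // gx, (i + 1) * w // gx)
            mask = inside[ys, xs]
            if mask.any():                       # exclude the non-exposed outside area
                averages[j, i] = image[ys, xs][mask].mean()
    return averages

# Example: synthetic 256x256 fisheye frame with an image circle of radius 120
frame = np.random.rand(256, 256)
s = area_integrated_averages(frame, circle_center=(128, 128), circle_radius=120)
print(np.nanmean(s))
```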
[0035]
The area calculator 212 receives the integrated data for each divided area including the integrated average value and calculates an index value for each divided area to evaluate a photographic state thereof. In the present embodiment the index value is an area brightness level b for evaluating the absolute brightness of each divided area. The brightness level b(x, y) of a certain divided area is calculated for each solid-state image sensor 22 by the following equation:

b(x, y) = s(x, y) × a² / (t × g)   ... (1)

where s(x, y) is the integrated average value for the divided area, a is the aperture, t is the exposure time, and g is the amplifier gain. The brightness level b(r, Θ) of a divided area in the circular polar coordinate system can be calculated in the same manner.
[0036]
The brightness level b(x, y) is an index value to evaluate the brightness of a subject in each divided area and is calculated from the brightness of the pixel values of an actual image according to the current exposure parameter (a, t, g).
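A short sketch of this back-calculation, assuming equation (1) as reconstructed above (the exact constant of proportionality is omitted), could look as follows; the sample exposure values are illustrative only.

```python
import numpy as np

def area_brightness_level(s, aperture, exposure_time, gain):
    """Recover a subject brightness level per divided area from the integrated average s(x, y),
    assuming the sensor output scales with t * g / a^2 (equation (1) as reconstructed above)."""
    return s * aperture ** 2 / (exposure_time * gain)

s = np.array([[0.30, 0.55], [0.10, 0.80]])   # integrated averages of four divided areas
b = area_brightness_level(s, aperture=2.0, exposure_time=1 / 125, gain=1.0)
print(b)
```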
[0037]
The overall calculator 214 evaluates the captured images including the overlapping area as a whole on the basis of the calculated brightness levels b^j(x, y) and calculates an overall evaluation value with weighting according to the overlapping portion between the photographic areas of the images. Herein, the solid-state image sensors 22A, 22B are referred to as the 0th and 1st image sensors and their brightness levels are referred to as b^0(x, y) and b^1(x, y), respectively. In the present embodiment the overall evaluation value is an overall brightness level bT to evaluate the brightness of all the areas of the images, or the subject brightness as a whole, with a certain weighting.
[0038]
The overall brightness level bT^i is calculated for the i-th solid-state image sensor by the following equation:

bT^i = ( Σ b^0(x, y) × w^0i(x, y) + Σ b^1(x, y) × w^1i(x, y) ) / ( Σ w^0i(x, y) + Σ w^1i(x, y) )   ... (2)

where b^j(x, y) is the brightness level for each divided area of each solid-state image sensor j (j ∈ {0, 1}) and w^ji(x, y) is a weighted value in the weighted averaging for each divided area.
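The weighted average of equation (2) can be pictured with the sketch below; NaN entries stand in for areas outside the image circle, and the uniform weights are placeholder values.

```python
import numpy as np

def overall_brightness_level(b0, b1, w0, w1):
    """Weighted average over all divided areas of both sensors (equation (2));
    NaN areas (outside the image circle) are ignored in numerator and denominator."""
    num = np.nansum(b0 * w0) + np.nansum(b1 * w1)
    den = np.nansum(np.where(np.isnan(b0), 0.0, w0)) + np.nansum(np.where(np.isnan(b1), 0.0, w1))
    return num / den

b0 = np.array([[0.4, 0.6], [0.5, np.nan]])
b1 = np.array([[0.3, 0.5], [0.7, np.nan]])
w0 = np.ones_like(b0)
w1 = np.ones_like(b1)
print(overall_brightness_level(b0, b1, w0, w1))
```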
[0039]
As expressed by the above equation, a different set of weighted values w^ji(x, y) is used for each solid-state image sensor. The weighted values w^ji(x, y) can be adjusted so that a larger value is given to the solid-state image sensor with the lower brightness level to prevent it from being influenced by a light source. Thus, weighting can be performed properly in accordance with a result of determination about a photographic scene. Further, the small areas (x, y) can be zoned into intermediate areas, namely an overlapping area and a non-overlapping area, as indicated by hatching in FIGs. 5A, 5B. The overall brightness level bT^i can then be calculated by the following equation, using the brightness level and weighted values for each intermediate area:
bT^i = ( w1^i × bE^0 + w2^i × bE^1 + w3^i × bC^0 + w4^i × bC^1 ) / ( w1^i + w2^i + w3^i + w4^i )   ... (3)

where bE^0 is the brightness level (average) of the edge area (overlapping area) of the 0th image, bC^0 is the brightness level (average) of the center area (non-overlapping area) of the 0th image, bE^1 is the brightness level (average) of the edge area (overlapping area) of the 1st image, bC^1 is the brightness level (average) of the center area (non-overlapping area) of the 1st image, and w1^i to w4^i are the weighted values of the weighted averaging set for each intermediate area of the i-th solid-state image sensor. The basic values of the weighted values w1^i to w4^i can be calculated by the following equations:

w1^i = w2^i = Ae / (A0 + A1 + 2Ae)

w3^i = A0 / (A0 + A1 + 2Ae)

w4^i = A1 / (A0 + A1 + 2Ae)

where Ae is the size of the edge areas of the 0th and 1st images, and A0 and A1 are the sizes of the center areas of the 0th and 1st images. The calculated basic values can be corrected in accordance with a result of determination about a photographic scene such that the solid-state image sensor with the lower brightness level is given a larger weight.
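The intermediate-area form and its size-proportional basic weights can be sketched as follows; the area sizes in the example are arbitrary illustrative numbers.

```python
def basic_weights(area_edge, area_center_0, area_center_1):
    """Basic weights proportional to the sizes of the intermediate areas, as in the equations above."""
    total = area_center_0 + area_center_1 + 2 * area_edge
    w1 = w2 = area_edge / total
    w3 = area_center_0 / total
    w4 = area_center_1 / total
    return w1, w2, w3, w4

def overall_level_from_intermediate(bE0, bE1, bC0, bC1, w1, w2, w3, w4):
    """Equation (3): weighted average of edge (overlapping) and center (non-overlapping) levels."""
    return (w1 * bE0 + w2 * bE1 + w3 * bC0 + w4 * bC1) / (w1 + w2 + w3 + w4)

w1, w2, w3, w4 = basic_weights(area_edge=5000, area_center_0=60000, area_center_1=60000)
print(overall_level_from_intermediate(0.45, 0.50, 0.40, 0.75, w1, w2, w3, w4))
```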
[0040]
Preferably, the overall calculator 214 can include a weighting setter to set the weighted values w^ji(x, y) according to the signal level of a captured image. The weighting setter is configured to create a brightness distribution (histogram) from the brightness levels b^j(x, y) of all the divided areas and analyze the total average value and brightness distribution for the scene determination. Then, according to the determined scene, it can change the weighted value w^ji(x, y) for a certain divided area depending on the brightness level b^j(x, y) of the divided area in question.
[0041]
For example, in a dark scene such as a night view, the weighting setter sets the weighted value w^ji(x, y) to a larger value, for example by adding a predetermined amount, for a divided area for which a larger brightness level b(x, y) is calculated. Thereby, a bright subject in a dark scene is highly evaluated for photometry and the exposure can be properly controlled according to the result of the photometry. Meanwhile, in a bright scene the weighting setter sets a larger weighted value w^ji(x, y) for a divided area for which a smaller brightness level b(x, y) is calculated. Thus, a dark subject in a bright scene is highly evaluated for photometry.
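One way such scene-dependent weighting could be sketched is shown below; the scene labels, base weight, boost amount, and median split are all assumptions made for illustration, not values given in the embodiment.

```python
import numpy as np

def scene_adjusted_weights(b, scene, base_weight=1.0, boost=0.5):
    """Raise the weight of bright areas in a dark scene and of dark areas in a bright scene.
    The scene labels and boost amount are illustrative assumptions."""
    w = np.full_like(b, base_weight, dtype=float)
    median = np.nanmedian(b)
    if scene == "dark":
        w[b > median] += boost      # favour bright subjects (e.g. night view)
    elif scene == "bright":
        w[b < median] += boost      # favour dark subjects
    return w

b = np.array([[0.05, 0.90], [0.10, 0.20]])
print(scene_adjusted_weights(b, scene="dark"))
```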
[0042]
Alternatively, a divided area containing an extremely black or white subject can be detected according to an upper limit threshold and a lower limit threshold so as to exclude the divided area from the photometry. For example, if a divided area with a brightness level equal to or over the upper limit threshold (a white area) and/or a divided area with a brightness level equal to or below the lower limit threshold (a black area) is detected, these areas can be given a smaller weight or a weight of zero. Thereby, it is possible to calculate the overall brightness level bT with the divided areas unsuitable for exposure correction given a small weight or not taken into account.
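A minimal sketch of this thresholding follows; the threshold values 0.02 and 0.98 are placeholders, since the embodiment does not specify the actual limits.

```python
import numpy as np

def suppress_extreme_areas(b, w, lower=0.02, upper=0.98, extreme_weight=0.0):
    """Give zero (or a small) weight to divided areas that are saturated white or crushed black,
    so they do not bias the overall brightness level; thresholds here are illustrative."""
    w = w.copy()
    w[(b >= upper) | (b <= lower)] = extreme_weight
    return w

b = np.array([0.01, 0.30, 0.55, 0.99])
w = np.ones_like(b)
print(suppress_extreme_areas(b, w))
```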
[0043]
The weighting setter determines a scene of each of the 0th and 1st images captured by the two solid-state image sensors 22A, 22B on the basis of the relation of the brightness between the two images and sets weighted values appropriate for the scene for each of the solid-state image sensors 22A, 22B. For example, to prevent an influence from a light source, the weighted values w^ji(x, y) can be adjusted so that a larger value is given to the solid-state image sensor with the lower brightness level.
[0044]
The exposure condition determiner 216 determines an exposure parameter (a, t, g) for each i-th solid-state image sensor on the basis of the overall brightness level bT^i calculated by the overall calculator 214. The overall brightness level bT is to evaluate the brightness of a subject in the images, and a condition for acquiring a proper exposure can be represented by the following conditions:

bT × t × g / a² = k   ... (4)

Bv + Sv = Av + Tv   ... (5)

where k is a constant, Bv is a brightness value, Sv is a sensitivity value, Av is an aperture value and Tv is a time value. The condition (5) is a transformation of the condition (4), taking 2 as the base of the logarithm of each of the four parameters.
[0045]
The four parameters are calculated by the following equations:

Av = 2 log2(a)   ... (6)

Tv = log2(1 / t)   ... (7)

Sv = log2(k2 × g)   ... (8)

Bv = log2(bT / k1)   ... (9)

In the equations (8), (9), k1 and k2 are constants.
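Following the conditions and equations reconstructed above, a small sketch of the APEX-style check could look like this; the values of k1, k2 and the sample exposure settings are arbitrary assumptions.

```python
import math

def apex_values(aperture, exposure_time, gain, bT, k1=1.0, k2=1.0):
    """Av, Tv, Sv, Bv per equations (6)-(9) as reconstructed above; k1, k2 are arbitrary constants."""
    Av = 2 * math.log2(aperture)
    Tv = math.log2(1.0 / exposure_time)
    Sv = math.log2(k2 * gain)
    Bv = math.log2(bT / k1)
    return Av, Tv, Sv, Bv

def exposure_error(aperture, exposure_time, gain, bT):
    """Deviation from the proper-exposure condition Bv + Sv = Av + Tv, in EV steps."""
    Av, Tv, Sv, Bv = apex_values(aperture, exposure_time, gain, bT)
    return (Bv + Sv) - (Av + Tv)

print(exposure_error(aperture=2.0, exposure_time=1 / 125, gain=1.0, bT=400.0))
```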
[0046]
Specifically, to satisfy the above exposure conditions, the exposure condition determiner 216 adjusts the aperture a, exposure time t and amplifier gain g for each i-th solid-state image sensor 22 according to the current exposure parameter and the measured subject brightness or overall brightness level, and acquires a proper exposure. The corrected exposure parameter (a', t', g') can be obtained from the brightness level bT^i by referring to a table called a program diagram which is prepared in advance in accordance with the characteristics of the imaging system 10. Herein, the program diagram refers to a diagram or a table containing the combinations of the amplifier gain g and exposure time t with a fixed aperture a. Exposure values can be determined from the combinations.
[0047]
In the present embodiment an optimal combination of the aperture a, exposure time t and amplifier gain g is obtainable from the overall brightness level bT by a certain program diagram. Alternatively, at least one of the aperture a, exposure time t and amplifier gain g can be manually set and the rest can be found from the program diagram. Examples of such an automatic exposure mode include a shutter priority mode in which the exposure time t is manually set, an aperture priority mode in which the aperture a is manually set, and a sensitivity priority mode in which the amplifier gain g is manually set.
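A toy program-diagram lookup is sketched below; the table entries and the lookup rule are invented for illustration, whereas a real table would be tuned to the characteristics of the imaging system.

```python
import bisect
import math

PROGRAM_DIAGRAM = [            # (Ev, exposure_time, gain), sorted by Ev; illustrative values only
    (6.0, 1 / 15, 8.0),
    (8.0, 1 / 60, 4.0),
    (10.0, 1 / 250, 2.0),
    (12.0, 1 / 1000, 1.0),
]

def look_up_exposure(bT, aperture=2.0, k1=1.0, k2=1.0):
    """Pick the table row whose Ev is closest above the brightness-derived target (aperture fixed)."""
    target_ev = math.log2(bT / k1) + math.log2(k2 * 1.0)   # Bv + Sv with g = 1
    evs = [row[0] for row in PROGRAM_DIAGRAM]
    i = min(bisect.bisect_left(evs, target_ev), len(evs) - 1)
    _, t, g = PROGRAM_DIAGRAM[i]
    return aperture, t, g

print(look_up_exposure(bT=900.0))
```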
[0048]
Hereinafter, the exposure control by the imaging system 10 is described, referring to FIGs. 6, 7. FIG. 6 is a flowchart for the exposure control while FIG. 7 is a flowchart for the exposure calculation process of the exposure control. The operation in FIG. 6 is repeatedly executed every time images are captured by the image sensors 22A, 22B. In step S101 the imaging system 10 calculates an integrated average value for each divided area of the two image sensors 22A, 22B by integrating the pixel values thereof. In step S102 the exposure calculation in FIG. 7 is called up.
[0049]
In FIG. 7, in step S201 the imaging system 10 takes the statistics of the integrated average values for the divided areas of each solid-state image sensor 22 to calculate an average and a dispersion (or a standard deviation) of the image. In step S202 the imaging system 10 determines, from the averages and dispersions of the two images, whether the current exposure parameter is in an allowable range for sufficiently evaluating the subject brightness. For example, if the average of the images is close to zero and the dispersion is lower than a certain threshold, it is probable that black saturation occurs in the captured images. In contrast, if the average is close to saturation and the dispersion is low, it is probable that white saturation occurs in the captured images. The light amount of captured images with black or white saturation cannot be properly measured, so an exposure parameter that leads to black or white saturation is determined to be outside the allowable range.
[0050]
With NO in step S202, the imaging system 10 proceeds to step S206, adjusts the exposure parameter to acquire a proper exposure, and completes the operation in step S207. For instance, when black saturation occurs, the exposure parameter is adjusted so that the aperture is opened and the exposure time and sensitivity are increased. In contrast, when white saturation occurs, the exposure parameter is adjusted so that the aperture is closed and the exposure time and sensitivity are decreased. When the gain g is fixed, the exposure parameter (a', t') is adjusted by lowering the exposure value Ev by a predetermined number of steps in black saturation; in white saturation the exposure parameter (a', t') is adjusted by raising the exposure value Ev by a predetermined number of steps. When the aperture a is fixed, the exposure parameter (t', g') is adjusted by increasing the exposure time t and amplifier gain g in black saturation and decreasing them in white saturation.
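The saturation check and the coarse adjustment described in the two preceding paragraphs might be sketched as follows; the thresholds, variance limit and step factor are illustrative assumptions only.

```python
import numpy as np

def coarse_adjust(s, exposure_time, gain, dark_thresh=0.05, bright_thresh=0.95,
                  var_thresh=1e-3, step=2.0):
    """Detect probable black/white saturation from the mean and variance of the area averages
    and nudge exposure time and gain (aperture fixed); all thresholds are placeholders."""
    mean, var = np.nanmean(s), np.nanvar(s)
    if mean < dark_thresh and var < var_thresh:       # probably crushed black
        return exposure_time * step, gain * step, "black saturation"
    if mean > bright_thresh and var < var_thresh:     # probably blown white
        return exposure_time / step, gain / step, "white saturation"
    return exposure_time, gain, "within allowable range"

s = np.array([[0.01, 0.02], [0.00, 0.03]])
print(coarse_adjust(s, exposure_time=1 / 125, gain=1.0))
```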
[0051]
With YES in step S202, in step S203 the area calculator 212 calculates a brightness level b^j(x, y) for each divided area of the 0th and 1st images on the basis of the integrated average value s(x, y) and the current exposure parameter (a, t, g).
[0052]
In step S204 the overall calculator 214 determines a scene of the images and reads the weighted values w^ji(x, y) for the determined scene. In step S205 the overall calculator 214 calculates the weighted average of the brightness levels b^0(x, y) and b^1(x, y) and calculates the overall brightness level bT^i for each i-th solid-state image sensor by the equations (2) and (3). In step S206 the exposure condition determiner 216 adjusts the exposure according to the overall brightness level bT^i to satisfy the conditions (4) and (5) and determines the exposure parameter (a', t', g'). Then, the imaging system 10 completes the exposure calculation and returns to step S103 in FIG. 6.
[0053]
In step S103 the exposure parameter in the exposure condition register 200 is updated to the determined exposure parameter (a', t', g'), completing the exposure control operation. By repeating the operations in FIGs. 6, 7, the exposure condition is set to a proper exposure satisfying the above conditions (4), (5).
[0054]
In omnidirectional photographing with the omnidirectional imaging system 10, the same subject is captured in the overlapping area; therefore, the brightness levels b of the two images should ideally be the same value. In view of the occurrence of flares in one of the images as shown in FIGs. 4A, 4B, according to the present embodiment the overall exposure level of the captured images including the overlapping area and non-overlapping areas is evaluated to determine the exposure parameter (a, t, g) for each of the imaging units. Thereby, it is possible to abate a discontinuity of the brightness at the connecting positions of a synthetic image and generate high-quality synthetic images.
[0055]
Thus, according to the above embodiment, it is possible to provide an imaging controller, an imaging control method, and a program which can provide each of the imaging units with a proper imaging condition so as to abate a discontinuity at the connecting points of the images captured by the imaging units when the images are synthesized.
[0056]
The above embodiment has described an example where two images captured with the lens systems having an angle of view of over 180 degrees are overlapped for synthesis. Alternatively, three or more images captured with multiple imaging units can be overlapped for synthesis.
[0057]
Moreover, the above embodiment has described the imaging system 10, which captures an omnidirectional still image, as an example of the imaging controller. However, the present invention should not be limited to such an example. Alternatively, the imaging controller can be configured as an omnidirectional video imaging system or unit, a portable data terminal such as a smartphone or tablet having an omnidirectional still or video shooting function, or a digital still camera processor or a controller to control a camera unit of an imaging system.
[0058]
The functions of the omnidirectional imaging system can be realized by a computer-executable program written in a legacy programming language such as assembler, C, C++, C# or JAVA®, or in an object-oriented programming language. Such a program can be stored in a storage medium such as a ROM, EEPROM, EPROM, flash memory, flexible disc, CD-ROM, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, Blu-ray disc, SD card or MO, and distributed through an electric communication line. Further, a part or all of the above functions can be implemented on, for example, a programmable device (PD) such as a field programmable gate array (FPGA), or implemented as an application specific integrated circuit (ASIC). To realize the functions on the PD, circuit configuration data such as bit stream data, or data written in HDL (hardware description language), VHDL (very high speed integrated circuits hardware description language) or Verilog-HDL, can be stored in a storage medium and distributed.
[0059]
Although the present invention has been described in terms of exemplary embodiments, it is not limited thereto. It should be appreciated that variations or modifications may be made in the embodiments described by persons skilled in the art without departing from the scope of the present invention as defined by the following claims.
Claims
1. An imaging controller comprising:
an index calculator to calculate an index value for each of divided areas of images captured by a plurality of imaging units, the index value for evaluating a photographic state of each of the divided areas;
an evaluation value calculator to evaluate the images and an overlapping area between the images on the basis of the index value of each divided area calculated by the index calculator and calculate an overall evaluation value; and a condition determiner to determine an imaging condition for each of the imaging units on the basis of the overall evaluation value calculated by the evaluation value calculator.
2. The imaging controller according to claim 1, wherein:
the imaging condition includes an exposure condition;
the index value of a certain divided area is a first brightness level to evaluate a brightness of the certain divided area;
the overall evaluation value is a second brightness level for each of the imaging units to evaluate a total brightness of the images by applying weighting to the overlapping area; and
the condition determiner is configured to determine an imaging condition for each of the imaging units on the basis of the second brightness level.
3. The imaging controller according to either claim 1 or 2, further comprising
a setter to set, for each of the imaging units, a set of weighted values for the divided areas according to a relation of brightness between the images captured by the imaging units.
4. The imaging controller according to any one of claims 1 to 3, further comprising
a weighting setter to apply weighting to a certain divided area in accordance with the index value calculated for the certain divided area.
5. The imaging controller according to claim 4, wherein
the weighting setter is configured to apply weighting to the divided areas such that in a dark scene a divided area for which a larger index value is calculated is given a larger weighting.
6. The imaging controller according to either claim 4 or 5, wherein
the weighting setter is configured to apply weighting to the divided areas such that in a bright scene a divided area for which a smaller index value is calculated is given a larger weighting.
7. The imaging controller according to any one of claims 4 to 6, wherein the weighting setter is configured to apply weighting to the divided areas such that a divided area for which the calculated index value is equal to or more than an upper limit threshold or equal to or less than a lower limit threshold is given a smaller weighting or a weighting of zero.
8. The imaging controller according to any one of claims 1 to 7, wherein the imaging condition includes at least one of an aperture, an exposure time and a gain.
9. An imaging control method comprising the steps of:
calculating an index value for each of divided areas of images captured by a plurality of imaging units, the index value for evaluating a photographic state of each of the divided areas;
evaluating the images and an overlapping area between the images on the basis of the index value of each divided area calculated in the calculating step and calculating an overall evaluation value; and
determining an imaging condition for each of the imaging units on the basis of the overall evaluation value calculated in the evaluating step.
10. A non-transitory computer-readable storage medium storing a program for causing a computer to execute the steps of the imaging control method according to claim 9.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380047263.1A CN104620569A (en) | 2012-09-11 | 2013-09-03 | Imaging controller and imaging control method and program |
US14/419,556 US9756243B2 (en) | 2012-09-11 | 2013-09-03 | Imaging controller and imaging control method and program |
EP13837222.2A EP2896201A4 (en) | 2012-09-11 | 2013-09-03 | Imaging controller and imaging control method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-199622 | 2012-09-11 | ||
JP2012199622A JP6065474B2 (en) | 2012-09-11 | 2012-09-11 | Imaging control apparatus, imaging control method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014042104A1 true WO2014042104A1 (en) | 2014-03-20 |
Family ID=50278216
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/074170 WO2014042104A1 (en) | 2012-09-11 | 2013-09-03 | Imaging controller and imaging control method and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US9756243B2 (en) |
EP (1) | EP2896201A4 (en) |
JP (1) | JP6065474B2 (en) |
CN (1) | CN104620569A (en) |
WO (1) | WO2014042104A1 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015182626A1 (en) * | 2014-05-27 | 2015-12-03 | Ricoh Company, Limited | Image processing system, imaging apparatus, image processing method, and computer-readable storage medium |
WO2016038886A1 (en) * | 2014-09-08 | 2016-03-17 | Ricoh Company, Limited | Imaging device, image processing device, and imaging method |
EP3068126A1 (en) * | 2015-03-10 | 2016-09-14 | Ricoh Company, Ltd. | Imaging apparatus, control system and control method |
CN106793986A (en) * | 2015-05-06 | 2017-05-31 | 皇家飞利浦有限公司 | The optimization energy weighting of dark field signal in differential phase contrast x-ray imaging |
FR3050596A1 (en) * | 2016-04-26 | 2017-10-27 | New Imaging Tech | TWO-SENSOR IMAGER SYSTEM |
EP3349433A4 (en) * | 2015-09-09 | 2018-09-12 | Ricoh Company, Ltd. | Control system, imaging device, and program |
WO2019079403A1 (en) * | 2017-10-18 | 2019-04-25 | Gopro, Inc. | Local exposure compensation |
JP2019088015A (en) * | 2019-01-16 | 2019-06-06 | 株式会社リコー | Image processing system, imaging apparatus, image processing method, and program |
US10432864B1 (en) | 2018-09-19 | 2019-10-01 | Gopro, Inc. | Systems and methods for stabilizing videos |
WO2019226211A1 (en) * | 2018-05-21 | 2019-11-28 | Gopro, Inc. | Image signal processing for reducing lens flare |
US10574894B2 (en) | 2018-05-18 | 2020-02-25 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10607313B2 (en) | 2016-06-30 | 2020-03-31 | Gopro, Inc. | Systems and methods for generating stabilized visual content using spherical visual content |
US11503232B2 (en) | 2019-09-17 | 2022-11-15 | Gopro, Inc. | Image signal processing for reducing lens flare |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6600936B2 (en) | 2014-11-06 | 2019-11-06 | 株式会社リコー | Image processing apparatus, image processing method, image processing system, program, and recording medium |
US9641752B2 (en) * | 2015-02-03 | 2017-05-02 | Jumio Corporation | Systems and methods for imaging identification information |
US9992394B2 (en) * | 2015-03-18 | 2018-06-05 | Gopro, Inc. | Dual-lens mounting for a spherical camera |
CN104835118A (en) * | 2015-06-04 | 2015-08-12 | 浙江得图网络有限公司 | Method for acquiring panorama image by using two fish-eye camera lenses |
CN104954694A (en) * | 2015-07-10 | 2015-09-30 | 王俊懿 | Industrial camera capable of viewing panoramic image in real time through WIFI (wireless fidelity) |
KR102340778B1 (en) * | 2015-08-24 | 2021-12-20 | 엘지이노텍 주식회사 | Camera module |
FR3041134B1 (en) * | 2015-09-10 | 2017-09-29 | Parrot | DRONE WITH FRONTAL VIEW CAMERA WHOSE PARAMETERS OF CONTROL, IN PARTICULAR SELF-EXPOSURE, ARE MADE INDEPENDENT OF THE ATTITUDE. |
CN105163039A (en) * | 2015-09-18 | 2015-12-16 | 联想(北京)有限公司 | Control method and control device |
WO2017104395A1 (en) | 2015-12-15 | 2017-06-22 | 株式会社リコー | Image-processing device and image-processing method |
KR20180113601A (en) | 2016-03-22 | 2018-10-16 | 가부시키가이샤 리코 | Image processing system, image processing method, and program |
DE102016210712A1 (en) * | 2016-06-15 | 2017-12-21 | I-Mmersive Gmbh | Image capture device, image acquisition system, image projection device, image transfer system, method for capturing a 360 ° object region and method for projecting an image |
WO2018025825A1 (en) * | 2016-08-02 | 2018-02-08 | ナーブ株式会社 | Image capture system |
JP2018046430A (en) | 2016-09-15 | 2018-03-22 | ソニー株式会社 | Information processing device, method, and program |
CN106170066B (en) * | 2016-09-26 | 2019-04-26 | 信利光电股份有限公司 | A kind of LSC compensation method of fish-eye camera and device |
CN106934772B (en) * | 2017-03-02 | 2019-12-20 | 深圳岚锋创视网络科技有限公司 | Horizontal calibration method and system for panoramic image or video and portable terminal |
JP6904560B2 (en) * | 2017-08-01 | 2021-07-21 | 株式会社シグマ | Signal processing device |
JP6981106B2 (en) | 2017-08-29 | 2021-12-15 | 株式会社リコー | Image pickup device, image display system, operation method, program |
CN108476291A (en) * | 2017-09-26 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Image generating method, video generation device and machine readable storage medium |
JP6718420B2 (en) * | 2017-09-26 | 2020-07-08 | 日立オートモティブシステムズ株式会社 | Imaging device and adjusting method thereof |
JP6953961B2 (en) * | 2017-09-27 | 2021-10-27 | カシオ計算機株式会社 | Image processing equipment, image processing methods and programs |
US10965894B2 (en) * | 2017-11-20 | 2021-03-30 | Flir Commercial Systems, Inc. | Short wave infrared image sensor with automatic exposure and dynamic range control |
JP7197981B2 (en) * | 2018-01-24 | 2022-12-28 | キヤノン株式会社 | Camera, terminal device, camera control method, terminal device control method, and program |
JP7081473B2 (en) | 2018-03-02 | 2022-06-07 | 株式会社リコー | Imaging optical system, imaging system and imaging device |
CN110231694A (en) | 2018-03-05 | 2019-09-13 | 株式会社理光 | Camera optical system, camera system and photographic device |
JP7098980B2 (en) * | 2018-03-16 | 2022-07-12 | 株式会社リコー | Image pickup device, image processing device and image processing method |
US10852503B2 (en) | 2018-03-20 | 2020-12-01 | Ricoh Company, Ltd. | Joint structure |
JP7124366B2 (en) | 2018-03-20 | 2022-08-24 | 株式会社リコー | Imaging element fixing structure and imaging device |
JP2019164303A (en) | 2018-03-20 | 2019-09-26 | 株式会社リコー | Optical system and imaging apparatus |
JP2020036091A (en) * | 2018-08-27 | 2020-03-05 | キヤノン株式会社 | Imaging device and control method therefor, program, and storage medium |
US11721712B2 (en) | 2018-08-31 | 2023-08-08 | Gopro, Inc. | Image capture device |
US11388332B2 (en) | 2019-01-11 | 2022-07-12 | Ricoh Company, Ltd. | Image capturing apparatus, image capturing method, and recording medium |
JP2020153796A (en) | 2019-03-19 | 2020-09-24 | 株式会社リコー | Distance measuring device and method for measuring distance |
EP3719529A1 (en) | 2019-03-20 | 2020-10-07 | Ricoh Company, Ltd. | Range finding device and range finding method |
JP7205386B2 (en) * | 2019-05-31 | 2023-01-17 | 株式会社リコー | IMAGING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM |
JP6780749B2 (en) * | 2019-08-05 | 2020-11-04 | 株式会社リコー | Imaging equipment, image processing equipment, imaging methods and programs |
JP7063488B2 (en) * | 2020-03-02 | 2022-05-09 | 株式会社オプトエレクトロニクス | Imaging method, imaging device, determination method and program of imaging target |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6201574B1 (en) * | 1991-05-13 | 2001-03-13 | Interactive Pictures Corporation | Motionless camera orientation system distortion correcting sensing element |
JP3035434B2 (en) * | 1993-10-13 | 2000-04-24 | キヤノン株式会社 | Compound eye imaging device |
JP2003141562A (en) * | 2001-10-29 | 2003-05-16 | Sony Corp | Image processing apparatus and method for nonplanar image, storage medium, and computer program |
JP2003244551A (en) * | 2002-02-18 | 2003-08-29 | Sony Corp | Imaging apparatus and method |
CN1512256A (en) | 2002-12-27 | 2004-07-14 | 金宝电子工业股份有限公司 | Automatic exposure sampling and control method for image shooting device |
JP3945430B2 (en) | 2003-03-19 | 2007-07-18 | コニカミノルタホールディングス株式会社 | Method for measuring object by image and imaging device |
JP2007043225A (en) * | 2005-07-29 | 2007-02-15 | Univ Of Electro-Communications | Picked-up processing apparatus and picked-up processing method |
JP4542058B2 (en) | 2006-03-24 | 2010-09-08 | 富士フイルム株式会社 | Imaging apparatus and imaging method |
JP4869795B2 (en) | 2006-06-06 | 2012-02-08 | 富士フイルム株式会社 | Imaging control apparatus, imaging system, and imaging control method |
KR100882011B1 (en) * | 2007-07-29 | 2009-02-04 | 주식회사 나노포토닉스 | Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof |
JP4787292B2 (en) * | 2008-06-16 | 2011-10-05 | 富士フイルム株式会社 | Omni-directional imaging device |
CN101534453B (en) | 2008-12-12 | 2011-07-27 | 昆山锐芯微电子有限公司 | Method for controlling automatic exposure, image processor and optical imaging device |
KR20110099845A (en) * | 2010-03-03 | 2011-09-09 | 삼성전자주식회사 | Apparatus and method for omnidirectional caller detection in video call system |
US8488958B2 (en) * | 2010-05-25 | 2013-07-16 | Apple Inc. | Scene adaptive auto exposure |
US9244258B2 (en) * | 2010-06-24 | 2016-01-26 | Panasonic Corporation | Omnidirectional imaging system |
US8675090B2 (en) * | 2010-12-15 | 2014-03-18 | Panasonic Corporation | Image generating apparatus, image generating method, and recording medium |
JP5910485B2 (en) * | 2012-03-16 | 2016-04-27 | 株式会社リコー | Imaging system |
JP5971207B2 (en) * | 2012-09-18 | 2016-08-17 | 株式会社リコー | Image adjustment apparatus, image adjustment method, and program |
JP6119235B2 (en) | 2012-12-20 | 2017-04-26 | 株式会社リコー | Imaging control apparatus, imaging system, imaging control method, and program |
JP6044328B2 (en) | 2012-12-26 | 2016-12-14 | 株式会社リコー | Image processing system, image processing method, and program |
2012
- 2012-09-11: JP application JP2012199622A, granted as JP6065474B2 (Active)
2013
- 2013-09-03: EP application EP13837222.2A, published as EP2896201A4 (Withdrawn)
- 2013-09-03: US application US14/419,556, granted as US9756243B2 (Expired - Fee Related)
- 2013-09-03: CN application CN201380047263.1A, published as CN104620569A (Pending)
- 2013-09-03: WO application PCT/JP2013/074170, published as WO2014042104A1 (Application Filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002281379A (en) * | 2001-03-21 | 2002-09-27 | Ricoh Co Ltd | Image pickup system |
JP2003244511A (en) * | 2002-02-13 | 2003-08-29 | Fuji Photo Film Co Ltd | Omnidirectional photographing camera |
JP2007053617A (en) * | 2005-08-18 | 2007-03-01 | Fujifilm Holdings Corp | Exposure value operation method and imaging apparatus |
Non-Patent Citations (1)
Title |
---|
See also references of EP2896201A4 * |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10554880B2 (en) | 2014-05-27 | 2020-02-04 | Ricoh Company, Ltd. | Image processing system, imaging apparatus, image processing method, and computer-readable storage medium |
JP2015226144A (en) * | 2014-05-27 | 2015-12-14 | 株式会社リコー | Image processing system, imaging apparatus, image processing method, and program |
WO2015182626A1 (en) * | 2014-05-27 | 2015-12-03 | Ricoh Company, Limited | Image processing system, imaging apparatus, image processing method, and computer-readable storage medium |
WO2016038886A1 (en) * | 2014-09-08 | 2016-03-17 | Ricoh Company, Limited | Imaging device, image processing device, and imaging method |
JP2016058840A (en) * | 2014-09-08 | 2016-04-21 | 株式会社リコー | Imaging apparatus, image processing system, imaging method, and program |
US9871976B2 (en) | 2015-03-10 | 2018-01-16 | Ricoh Company, Ltd. | Imaging apparatus, control system and control method |
EP3068126A1 (en) * | 2015-03-10 | 2016-09-14 | Ricoh Company, Ltd. | Imaging apparatus, control system and control method |
CN106793986A (en) * | 2015-05-06 | 2017-05-31 | 皇家飞利浦有限公司 | The optimization energy weighting of dark field signal in differential phase contrast x-ray imaging |
EP3349433A4 (en) * | 2015-09-09 | 2018-09-12 | Ricoh Company, Ltd. | Control system, imaging device, and program |
US10477106B2 (en) | 2015-09-09 | 2019-11-12 | Ricoh Company, Ltd. | Control system, imaging device, and computer-readable medium |
FR3050596A1 (en) * | 2016-04-26 | 2017-10-27 | New Imaging Tech | TWO-SENSOR IMAGER SYSTEM |
WO2017186647A1 (en) * | 2016-04-26 | 2017-11-02 | New Imaging Technologies | Imager system with two sensors |
US10848658B2 (en) | 2016-04-26 | 2020-11-24 | New Imaging Technologies | Imager system with two sensors |
US10607313B2 (en) | 2016-06-30 | 2020-03-31 | Gopro, Inc. | Systems and methods for generating stabilized visual content using spherical visual content |
WO2019079403A1 (en) * | 2017-10-18 | 2019-04-25 | Gopro, Inc. | Local exposure compensation |
US11363214B2 (en) | 2017-10-18 | 2022-06-14 | Gopro, Inc. | Local exposure compensation |
US10587808B2 (en) | 2018-05-18 | 2020-03-10 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11025824B2 (en) | 2018-05-18 | 2021-06-01 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11696027B2 (en) | 2018-05-18 | 2023-07-04 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10587807B2 (en) | 2018-05-18 | 2020-03-10 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11363197B2 (en) | 2018-05-18 | 2022-06-14 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10574894B2 (en) | 2018-05-18 | 2020-02-25 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10630921B2 (en) | 2018-05-21 | 2020-04-21 | Gopro, Inc. | Image signal processing for reducing lens flare |
US11330208B2 (en) | 2018-05-21 | 2022-05-10 | Gopro, Inc. | Image signal processing for reducing lens flare |
WO2019226211A1 (en) * | 2018-05-21 | 2019-11-28 | Gopro, Inc. | Image signal processing for reducing lens flare |
US11647289B2 (en) | 2018-09-19 | 2023-05-09 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11172130B2 (en) | 2018-09-19 | 2021-11-09 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11228712B2 (en) | 2018-09-19 | 2022-01-18 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10750092B2 (en) | 2018-09-19 | 2020-08-18 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10958840B2 (en) | 2018-09-19 | 2021-03-23 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10432864B1 (en) | 2018-09-19 | 2019-10-01 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11678053B2 (en) | 2018-09-19 | 2023-06-13 | Gopro, Inc. | Systems and methods for stabilizing videos |
US10536643B1 (en) | 2018-09-19 | 2020-01-14 | Gopro, Inc. | Systems and methods for stabilizing videos |
US11979662B2 (en) | 2018-09-19 | 2024-05-07 | Gopro, Inc. | Systems and methods for stabilizing videos |
JP2019088015A (en) * | 2019-01-16 | 2019-06-06 | 株式会社リコー | Image processing system, imaging apparatus, image processing method, and program |
US11503232B2 (en) | 2019-09-17 | 2022-11-15 | Gopro, Inc. | Image signal processing for reducing lens flare |
Also Published As
Publication number | Publication date |
---|---|
JP6065474B2 (en) | 2017-01-25 |
US20150222816A1 (en) | 2015-08-06 |
JP2014057156A (en) | 2014-03-27 |
US9756243B2 (en) | 2017-09-05 |
EP2896201A4 (en) | 2015-09-02 |
CN104620569A (en) | 2015-05-13 |
EP2896201A1 (en) | 2015-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9756243B2 (en) | Imaging controller and imaging control method and program | |
US20140078247A1 (en) | Image adjuster and image adjusting method and program | |
US10477106B2 (en) | Control system, imaging device, and computer-readable medium | |
US10699393B2 (en) | Image processing apparatus and image processing method | |
JP6119235B2 (en) | Imaging control apparatus, imaging system, imaging control method, and program | |
US11258960B2 (en) | Imaging device, information processing system, program, image processing method | |
US9871976B2 (en) | Imaging apparatus, control system and control method | |
CN107682611B (en) | Focusing method and device, computer readable storage medium and electronic equipment | |
JP6299116B2 (en) | Imaging apparatus, imaging method, and recording medium | |
US7864860B2 (en) | Image pickup apparatus and motion vector deciding method | |
KR101337667B1 (en) | Lens roll-off correction operation using values corrected based on brightness information | |
JP7247609B2 (en) | Imaging device, imaging method and program | |
US8781226B2 (en) | Digital image processing apparatus for recognizing fireworks, method of operating the same, and computer-readable storage medium | |
JP7455656B2 (en) | Image processing device, image processing method, and program | |
JP2015119436A (en) | Imaging apparatus | |
JP5943682B2 (en) | Imaging apparatus, control method thereof, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13837222 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 14419556 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |