WO2015087772A1 - Image processing apparatus and method, program, and recording medium - Google Patents
Image processing apparatus and method, program, and recording medium
- Publication number
- WO2015087772A1 (PCT/JP2014/082100)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame period
- unit
- digital gain
- image
- area
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/06—Topological mapping of higher dimensional structures onto lower dimensional surfaces
- G06T3/073—Transforming surfaces of revolution to planar images, e.g. cylindrical surfaces to planar images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
Definitions
- The present invention relates to an image processing apparatus and method that extract a partial area from an image captured at a wide angle of view through a wide-angle lens, for example a fish-eye lens, and display the extracted area enlarged.
- The apparatus and method are particularly suitable for a surveillance camera that pans, tilts, or zooms electronically by switching the position and size of the extracted area, and the invention relates in particular to the exposure control of such an apparatus.
- the present invention also relates to a program for causing a computer to execute part or all of the components of the image processing apparatus or part or all of the processing in the image processing method, and a recording medium on which the program is recorded.
- Japanese Patent Application Laid-Open No. 2004-228688 describes a camera that cuts out and displays a part of an image obtained by imaging and makes the position and size of a cutout region variable.
- In the camera of Patent Document 1, when a part of the image is cut out and displayed, exposure control is performed using the luminance information of the entire screen rather than that of the cutout region, which makes the control less susceptible to local luminance fluctuations of the subject.
- However, when exposure control is always performed on the pre-cropping image as in Patent Document 1, the resulting image is stable, without large luminance fluctuations, but under subject conditions where luminance differs greatly between regions, cutting out and displaying only a low-luminance area darkens the entire display screen and reduces the visibility of the image; conversely, cutting out and displaying only a high-luminance area causes blown-out highlights (saturation of pixel values) and likewise reduces the visibility of the subject.
- An object of the present invention is to provide an image processing apparatus that performs cropping, enlargement, and distortion correction while switching the position and size of the cropping area of an image captured through a lens having a wide angle of view, thereby panning, tilting, or zooming electronically, and that performs stable exposure control with little fluctuation in image brightness even when the image cutout region is switched.
- To this end, the image processing apparatus of the present invention comprises: a digital gain application unit that generates a brightness-adjusted imaging signal by multiplying, by a digital gain, an imaging signal output from an imaging unit that captures images in units of frame periods; a luminance detection unit that detects the luminance of each of a plurality of regions, each forming a part of the captured image represented by the imaging signal generated by the digital gain application unit; a cutout unit that selects one of the plurality of regions of the captured image according to designation information that designates a cutout region, cuts out the image of the selected region, and performs distortion correction; and a control unit that, based on the luminance detected by the luminance detection unit, sets an exposure condition for the imaging unit and sets the digital gain used by the digital gain application unit. When the designation information is changed in a first frame period from information designating a first region to information designating a second region, the control unit changes the luminance used for setting the exposure condition and the digital gain from that of the first region to that of the second region in the frame period following the first frame period, and instructs the cutout unit to change the region cut out from the imaging signal from the first region to the second region after three frame periods have elapsed from the first frame period.
- FIG. 1 is a block diagram illustrating an image generation apparatus including an image processing apparatus according to Embodiment 1 of the present invention. FIG. 2 shows an example of the projection and distortion correction of an image captured through a wide-angle lens.
- FIGS. 3(a)-(d) show the projection of the image on a virtual spherical surface onto the imaging surface, and the division of the imaging surface into regions.
- FIGS. 4(a)-(k) show an example of the sequence of processing in each part of Embodiment 1.
- FIGS. 5(a)-(k) show the sequence of processing in each part of a conventional image processing apparatus.
- FIGS. 6(a)-(k) show another example of the sequence of processing in each part of Embodiment 1.
- FIGS. 7(a)-(d) show the relationship between the plurality of areas used for luminance calculation and the cutout region.
- FIG. 1 shows an image generation apparatus 10 including an image processing apparatus according to Embodiment 1 of the present invention.
- the illustrated image generation device 10 includes a wide-angle image acquisition unit 12 and an image processing device 14.
- the wide angle image acquisition unit 12 includes a wide angle lens 20 and an imaging unit 30.
- the image processing apparatus 14 includes a digital gain application unit 40, a luminance detection unit 50, an image generation unit 60, a cutout unit 70, and an image composition unit 80.
- the wide-angle lens 20 is composed of, for example, a fisheye lens, and forms an image with a wide angle of view on the imaging surface of the imaging unit 30.
- As shown in FIG. 2, an image captured through the fisheye lens can be understood as a projection of the subject OJ in real space onto a virtual spherical surface PL, followed by a projection from the virtual spherical surface PL onto the planar imaging surface SL.
- The image on the imaging surface SL is corrected according to the application of the image, for example to bring it close to the image obtained by normal imaging, that is, the image NL obtained by perspective projection (distortion correction).
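The perspective-style distortion correction described above can be illustrated with a minimal numeric sketch. It assumes an equidistant fisheye model (radius r = f·θ), which the patent does not specify; a real lens would need a calibrated projection model.

```python
import numpy as np

def fisheye_to_perspective_radius(r_fish, f):
    # Equidistant fisheye: r_fish = f * theta, so the incidence angle is
    # theta = r_fish / f; a pinhole (perspective) camera images the same
    # ray at radius f * tan(theta).
    theta = r_fish / f
    return f * np.tan(theta)

f = 100.0                                  # focal length in pixels (example value)
r_fish = f * np.deg2rad(30.0)              # fisheye radius of a 30-degree ray, ~52.36 px
r_persp = fisheye_to_perspective_radius(r_fish, f)   # ~57.74 px after correction
```

A full remapper would apply this radial mapping per pixel and resample; the sketch shows only the geometric core of the correction toward the perspective image NL.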
- The imaging unit 30 is composed of an image sensor that converts the image formed on it into electronic data and outputs that data.
- the imaging unit 30 generates a captured image in a predetermined frame period unit, that is, for each frame period.
- An imaging signal representing the captured image, generated by charge accumulation during exposure in each frame period, is output during the next frame period.
- The imaging unit 30 has R, G, and B pixels arranged in a Bayer array, and the imaging signal described above is composed of the pixel values (color signals) obtained from these R, G, and B pixels.
- the imaging unit 30 can change the exposure time and gain (analog gain), and can adjust the brightness of the captured image by changing the exposure time and gain.
- the digital gain application unit (digital amplifier) 40 multiplies the imaging signal from the imaging unit 30 by a gain (digital gain) (applies a gain magnification).
- the gain used for multiplication can be changed, and the brightness of the image output from the digital gain application unit 40 can be adjusted by controlling the gain in the same manner as the imaging unit 30.
- the imaging unit 30 outputs an imaging signal generated by performing exposure under an exposure condition in a certain frame period (first frame period) in the next frame period (second frame period).
- The digital gain multiplication performed by the digital gain application unit 40 in each frame period therefore targets the imaging signal input from the imaging unit 30, which was generated as a result of exposure in the imaging unit 30 during the preceding frame period.
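As a rough illustration of what the digital gain application unit does (a sketch, not the patent's implementation), the multiplication can be written as a gain applied to the raw signal with clipping to the sensor's code range; the 10-bit range is an assumption here:

```python
import numpy as np

def apply_digital_gain(raw, gain, max_code=1023):
    # Multiply the imaging signal by the digital gain and clip the result
    # to the assumed 10-bit sensor code range [0, max_code].
    out = raw.astype(np.float64) * gain
    return np.clip(out, 0, max_code).astype(np.uint16)

frame = np.array([[100, 200],
                  [400, 900]], dtype=np.uint16)
brightened = apply_digital_gain(frame, 1.5)   # 900 * 1.5 exceeds 1023 and clips
```

Clipping is what makes a too-large gain cause the blown-out highlights (pixel-value saturation) discussed earlier.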
- the control unit 100 instructs the luminance detection unit 50 to divide the image on the imaging surface SL into a plurality of regions and perform luminance detection for each region generated by the division.
- the luminance detection unit 50 detects the luminance for each region generated by the division by the control unit 100.
- Luminance is detected by calculating an average pixel value for each region. In the present embodiment, the luminance is detected for all the divided regions in each frame period. The calculation of the average pixel value for each region in each frame period is completed by the end of the frame period, that is, before the start of the next frame period.
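The per-region luminance detection described above (an average pixel value per region) can be sketched as follows; the region layout and function name are illustrative assumptions:

```python
import numpy as np

def detect_region_luminance(image, regions):
    # Average pixel value per named region, mirroring the luminance
    # detection unit; `regions` maps a name to a (y0, y1, x0, x1) slice.
    return {name: float(image[y0:y1, x0:x1].mean())
            for name, (y0, y1, x0, x1) in regions.items()}

img = np.zeros((4, 8))
img[:, 4:] = 200.0                                   # bright right half, dark left half
lum = detect_region_luminance(img, {"A": (0, 4, 0, 4),
                                    "B": (0, 4, 4, 8)})
# lum["A"] is 0.0 and lum["B"] is 200.0 for this test image
```

In the embodiment this computation runs for every region in every frame period and finishes before the next frame starts, so whichever region's result the control unit requests is already available.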
- The image generation unit 60 performs color interpolation on the image output from the digital gain application unit 40 to supply the color components missing at each pixel position (since the pixels of the imaging unit 30 are in a Bayer array), thereby generating a video signal having all color components at each pixel position, and further performs image processing such as color correction and edge enhancement according to the illuminance condition of the subject.
- The cutout unit 70 selects one of the plurality of regions formed by the division specified by the control unit 100 from the image output from the image generation unit 60, cuts out the image of the selected region, and performs distortion correction and enlargement.
- The image composition unit 80 joins the images of a plurality of areas sequentially output from the cutout unit 70, combining them into one image covering a wider angle of view than that of each individual area.
- The cutout by the cutout unit 70 is a process of cutting out any one of the regions shown in FIG. 3(b), for example one of the regions R1, R2, and so on. The regions R1, R2, ... on the imaging surface SL in FIG. 3(b) correspond to the regions R1, R2, ... on the virtual spherical surface PL.
- the distortion correction by the cutout unit 70 is a process of converting an image on the imaging surface SL into an image according to the application, for example, an image close to an image NL (FIG. 2) obtained by perspective projection.
- The composition by the image composition unit 80 is a process of joining the images of a plurality of regions that have been sequentially cut out and distortion-corrected by the cutout unit 70, for example the images of the regions R1 to R6 (see FIG. 3(b)), into a single image.
- The joined region corresponds to the range indicated by the frame CR in the figure.
- The control unit 100 receives designation information DR designating a cutout region from the outside and, in response, controls the imaging unit 30, the digital gain application unit 40, the luminance detection unit 50, the image generation unit 60, the cutout unit 70, and the image composition unit 80 so as to switch the cutout region and combine the images.
- The designation of the cutout region by the designation information DR includes information indicating the position of the cutout region, for example its center position, and information indicating its size.
- When identification numbers are assigned to the plurality of regions in advance, the designation information includes information indicating the identification number, and the control unit 100 identifies the region to be cut out based on this information.
- the control unit 100 also sets image processing conditions such as color correction and edge enhancement according to the illuminance condition of the subject for the image generation unit 60, and controls these processes.
- the illuminance condition of the subject may be estimated from the luminance value detected by the luminance detection unit 50, for example.
- the control unit 100 further controls the image composition unit 80 regarding image composition.
- The control unit 100 acquires the luminance detection result (average pixel value) output from the luminance detection unit 50 and, based on it, sets the exposure condition for the imaging unit 30 and the digital gain for the digital gain application unit 40.
- the setting of exposure conditions for the imaging unit 30 includes setting of exposure time and setting of gain (analog gain).
- FIGS. 4A to 4K show operation sequences of the control unit 100, the imaging unit 30, the digital gain application unit 40, and the luminance detection unit 50.
- FIGS. 4A to 4K show the operation over the eight-frame period from frame n-4 to frame n+3 when switching from one region (first region) A to another region (second region) B.
- the regions A and B are, for example, any one of the regions R1 to R6.
- FIGS. 4A to 4K assume a case where an instruction to switch regions is given in frame n-1.
- FIG. 4A shows the region cut out by the cutout unit 70. In the illustrated example, region A is cut out up to frame n+1, and region B is cut out from frame n+2 onward.
- 4B, 4C, 4D, and 4E show the processing of the control unit 100.
- FIG. 4B shows a process for acquiring the result of luminance detection by the luminance detecting unit 50.
- In each frame, the control unit 100 acquires the detection result for only one of the regions and uses it to calculate the setting values.
- In the illustrated example, the luminance detection result for region A is acquired up to frame n-1, and that for region B from frame n onward.
- FIG. 4C shows the calculation of the setting values based on the luminance detection result acquired by the process in FIG. 4B.
- FIG. 4D shows the setting, for the imaging unit 30, of the exposure conditions (exposure time, analog gain) calculated in FIG. 4C.
- FIG. 4E shows the setting, for the digital gain application unit 40, of the digital gain value calculated in FIG. 4C.
- FIG. 4F shows the image valid period VA and the vertical blanking period BL in each frame.
- The positions of the blocks along the time axis schematically indicate the timing at which each process is performed. In FIGS. 4B to 4J, a line with an arrow from one process to another indicates the transmission of information or a signal representing the result of the first process to the second.
- The luminance detection result acquired in each frame is the average pixel value calculated in the immediately preceding frame. Since the region whose luminance detection result is to be acquired is only determined at the start of each frame, the luminance of all regions is detected in the preceding frame.
- The setting value calculation of FIG. 4C is performed within the same frame period, after the process of FIG. 4B.
- The setting of the exposure conditions for the imaging unit 30 (FIG. 4D) and of the digital gain for the digital gain application unit 40 (FIG. 4E) is performed within the same frame period, after the process of FIG. 4C; specifically, within the last vertical blanking period BL of the frame period. Performing the setting within the blanking period BL avoids a change of image brightness within a single frame, which would occur if the setting were changed during the image valid period VA.
- FIG. 4G and FIG. 4H show processing in the imaging unit 30.
- FIG. 4G shows the exposure process
- FIG. 4H shows the data output.
- FIG. 4(i) shows the data output of the digital gain application unit 40.
- FIG. 4(j) shows the luminance detection (measurement of pixel values) by the luminance detection unit 50. As described above, in this embodiment the luminance of all regions is detected in every frame, so FIG. 4(j) is labeled "all regions"; the region whose luminance detection result is acquired at the start of the next frame is shown in parentheses after that label. In the illustrated example, up to frame n-2 the luminance detection result of region A in each frame is acquired in the next frame, and from frame n-1 onward the luminance detection result of region B in each frame is acquired at the start of the next frame.
- Next, the flow from luminance detection through exposure condition setting and digital gain setting to the output of data reflecting those settings (that is, until the luminance detection result is reflected in the output data) will be described.
- The luminance of all regions, including region A, is detected in frame n-4 (FIG. 4(j)); the luminance detection result of region A is acquired by the control unit 100 at the start of frame n-3 (FIG. 4(b)); within the same frame n-3 the setting values are calculated (FIG. 4(c)) and set for the imaging unit 30 (FIG. 4(d)) and for the digital gain application unit 40 (FIG. 4(e)); in accordance with these settings, the imaging unit 30 performs exposure in the next frame n-2 (FIG. 4(g)); and the imaging signal generated by that exposure is output from the imaging unit 30 in the following frame n-1 and multiplied by the digital gain in the digital gain application unit 40 within the same frame n-1 (FIG. 4(i)). There is thus a delay of two frame periods from the acquisition of a luminance detection result (frame n-3) to the output of data generated under the corresponding settings (frame n-1).
- When an instruction to switch the cutout region is given in frame n-1, the control unit 100 switches the acquisition target of the detection result to region B at the start of the next frame n (acquiring the luminance detection result of region B measured in frame n-1), and switches the cutout region to B from frame n+2, two frames later (three frames after the switching instruction) (FIG. 4A).
- the reason why the luminance detection result acquisition target can be switched at the head of the frame n next to the frame instructed to switch the cut-out area is that the luminance information of all areas is detected in each frame.
- Up to frame n+1, region A is cut out; in this frame n+1, the output data is the result of exposure (in frame n) under the exposure condition set from the luminance detection result of region A acquired in frame n-1, followed by digital gain multiplication (in frame n+1).
- From frame n+2, after the cutout region has been switched, region B is cut out; in this frame n+2, the output data is the result of exposure (in frame n+1) under the exposure condition set from the luminance detection result of region B acquired in frame n, followed by digital gain multiplication (in frame n+2).
- In other words, both before and after the switch, the data of each cutout region is obtained by performing exposure and digital gain multiplication under conditions set based on the luminance detection result for that same region.
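The frame accounting above can be checked with a toy model. It assumes, following the sequence of FIGS. 4A to 4K, that the acquisition target switches one frame after the request and that a new setting needs two further frames (set, expose, output) before it reaches the output; under those assumptions the area governing the output brightness always matches the area being cut out. The function names are hypothetical.

```python
def output_basis(frame, request_frame):
    # Area whose luminance governs the brightness of the data output in
    # `frame`: that result was acquired 2 frames earlier (set -> expose ->
    # output), and the acquisition target switches 1 frame after the request.
    acquired_in = frame - 2
    return "B" if acquired_in >= request_frame + 1 else "A"

def cutout_area(frame, request_frame):
    # The cutout region switches 3 frames after the switching request.
    return "B" if frame >= request_frame + 3 else "A"

request = 9                            # the request arrives in frame n-1 = 9
for f in range(request - 5, request + 10):
    # The brightness basis and the displayed area agree in every frame.
    assert output_basis(f, request) == cutout_area(f, request)
```

The conventional sequence of FIGS. 5A to 5K breaks this invariant around the switch, which is exactly the frame n+2 brightness mismatch discussed below.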
- In the conventional image processing apparatus (FIGS. 5A to 5K), by contrast, luminance detection in each frame is performed only for the selected region.
- As shown in FIG. 5(k), a switching request is assumed to occur in frame n-1.
- As shown in FIG. 5(j), the luminance of region A is detected up to frame n-1, and the luminance of region B from frame n onward.
- Consequently, the luminance detection result of region A is acquired up to frame n, and that of region B from frame n+1 onward (FIG. 5B).
- Setting value calculation, exposure condition setting, and digital gain setting are therefore based on the luminance detection result of region A up to frame n, and on that of region B from frame n+1 onward.
- As a result, the data output in frame n+2 is exposed under the condition set from the luminance detection result of region A but adjusted with the digital gain set from the luminance detection result of region B; there is no guarantee that its brightness is appropriate, and a difference in brightness between adjacent frames may occur.
- In the present embodiment (FIGS. 4A to 4K), by contrast, before the cutout region is switched the output image is obtained by exposure under the exposure condition set from the luminance detection result of region A and by multiplication by the digital gain set from the same result, and after the switch the output image is obtained by exposure under the exposure condition set from the luminance detection result of region B and by multiplication by the digital gain set from the same result. That is, the brightness of the image is appropriately adjusted both before and after the region switch.
- In the sequence described above, the digital gain is set in the digital gain application unit 40 within the same frame in which the setting value is calculated, and the digital gain application unit 40 delays use of the newly set gain by one frame. Alternatively, as shown in FIGS. 6A to 6K, the digital gain may be set in the digital gain application unit 40 in the frame following the calculation of the setting value, with the digital gain application unit 40 then using the newly set gain for multiplication immediately. In other words, the delay may be implemented in the control unit 100 instead of in the digital gain application unit 40.
- In the example described above, the luminance detection unit detects the luminance of all regions of the image in each frame.
- Alternatively, if the regions that may become the target of extraction are known in advance, luminance detection in each frame may be performed only for the region actually being extracted and for those candidate regions.
- In the example described above, the regions to be cut out do not overlap one another. Alternatively, images with overlapping margins at the region edges may be generated, and at the time of synthesis a weighted addition or selection process may be applied to the overlapping parts.
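One common form of such weighted addition (an assumption here; the patent does not fix the weighting) is a linear cross-fade over the overlapping columns of two adjacent cutout images:

```python
import numpy as np

def blend_overlap(left, right, overlap):
    # Join two horizontally adjacent cutout images that share `overlap`
    # columns, cross-fading the shared columns with linear weights.
    w = np.linspace(0.0, 1.0, overlap)             # 0 -> left only, 1 -> right only
    mixed = left[:, -overlap:] * (1.0 - w) + right[:, :overlap] * w
    return np.concatenate([left[:, :-overlap], mixed, right[:, overlap:]], axis=1)

a = np.full((2, 6), 10.0)
b = np.full((2, 6), 30.0)
out = blend_overlap(a, b, 4)     # shape (2, 8); the seam ramps from 10 to 30
```

A selection process would instead pick one source per overlapping pixel; the cross-fade trades a possible double image for an invisible seam.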
- In the example described above, the images of different regions sequentially cut out by the cutout unit are combined by the image composition unit, but the present invention is not limited to such a configuration; it can also be applied to a configuration in which the image of each cutout region is displayed as it is, without being joined to the images of other regions.
- In that case, too, if the region or regions to be cut out are known in advance, luminance detection may be performed only for those known regions.
- The area occupied by the composite image may be moved gradually by sequentially switching the regions that form part of it. For example, starting from a composite image formed from the regions R1 to R6 in FIG. 2A, the leftmost regions R1 and R2 may be removed and the adjacent regions R7 and R8 added, forming a composite image composed of the regions R3 to R8, and the same process may be repeated thereafter. In this case as well, if the direction of movement is known, the luminance may be detected in each frame not only for the regions R1 to R6 actually being combined but also for the regions R7 and R8 that are to be combined next.
- the luminance in each frame may be detected for adjacent regions (regions that may be the target of synthesis).
- In the example described above, switching is performed between regions occupying different positions in the image. The present invention can also be applied to switching between one region and another region that partially includes it; by alternately selecting and displaying the image of a narrower region and that of a wider region, zoom processing can be performed.
- The present invention can also be applied to switching between a plurality of regions having the same position (center position) but different tilt angles; by selecting and displaying regions of different tilt angles in this way, display can be performed while changing the tilt angle.
- As described above, according to the present invention, stable exposure control with little fluctuation in image brightness can be performed even when the cutout region of the image is switched, and there is little delay from the instruction to switch the cutout region until the switching is executed.
- Embodiment 2. In Embodiment 1, the image cutout region and the region serving as the unit of luminance detection coincide, but the present invention can also be applied when the image cutout region and the region serving as the unit of luminance calculation differ.
- In the following, a region serving as a unit of luminance calculation is called a "block" to distinguish it from the cutout region. If the unit of luminance calculation is smaller than the image cutout region and each cutout region spans a plurality of blocks, that is, if at least parts of a plurality of blocks are included in the cutout region, the luminance of the cutout region may be calculated based on the luminances of those blocks.
- the captured image is divided into a plurality of rectangular areas (blocks), the luminance of each block is detected, and the luminance of the cutout region including at least a part of the block is determined based on the detected luminance of each block. It may be calculated.
- FIG. 7A shows an example in which the image is divided into a plurality of rectangular sections (blocks) BK by grid lines, and the cutout region ER is set so as to span a plurality of blocks.
- In FIGS. 7(b)-(d), the blocks BKe used for calculating the luminance of the cutout region ER are indicated by hatching. In the example of FIG. 7(b), only the blocks BKe entirely included in the cutout region ER are used for calculating the luminance of the cutout region ER; in the example of FIG. 7(c), only the blocks BKe more than half of which is included in the cutout region ER are used; and in the example of FIG. 7(d), all the blocks BKe at least a part of which is included in the cutout region ER are used.
- the calculation of the brightness of the cutout area based on the brightness of the block may be a simple average or a weighted average. In the case of the weighted average, a weight corresponding to the ratio included in the cut-out area in each block may be given.
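The weighted average described above, with each block weighted by the fraction of it lying inside the cutout region, can be sketched as follows; the half-inclusion threshold follows the FIG. 7(c)-style selection rule, and all names are illustrative:

```python
def overlap_ratio(block, region):
    # Rectangles are (y0, y1, x0, x1); return the fraction of `block`
    # that lies inside `region`.
    by0, by1, bx0, bx1 = block
    ry0, ry1, rx0, rx1 = region
    h = max(0, min(by1, ry1) - max(by0, ry0))
    w = max(0, min(bx1, rx1) - max(bx0, rx0))
    return (h * w) / ((by1 - by0) * (bx1 - bx0))

def cutout_luminance(blocks, region, min_ratio=0.5):
    # Keep blocks whose included fraction is at least `min_ratio`, then
    # average their luminances weighted by that fraction.
    pairs = [(overlap_ratio(rect, region), lum) for rect, lum in blocks]
    pairs = [(r, lum) for r, lum in pairs if r >= min_ratio]
    total = sum(r for r, _ in pairs)
    return sum(r * lum for r, lum in pairs) / total

blocks = [((0, 10, 0, 10), 100.0),   # entirely inside the region below
          ((0, 10, 10, 20), 200.0)]  # half inside
lum = cutout_luminance(blocks, (0, 10, 0, 15))   # (1.0*100 + 0.5*200) / 1.5
```

Setting `min_ratio` to 1.0 or to a value just above 0 reproduces the FIG. 7(b) and FIG. 7(d) selection rules, respectively; a simple average corresponds to replacing each weight by 1.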
- Thus, by calculating the luminance of the cutout region from the luminances of the rectangular regions (blocks) formed by dividing the captured image, the luminance of the cutout region can be calculated flexibly.
- the image processing method implemented by the image processing apparatus also forms part of the present invention.
- Part or all of the components of the image processing apparatus described above, or part or all of the processing in the image processing method described above, may be realized by software, that is, by a programmed computer. Therefore, a program for causing a computer to execute part or all of the components of the image processing apparatus, or part or all of the processing in the image processing method, and a computer-readable recording medium on which the program is recorded, also form part of the present invention.
Description
フレーム期間単位で撮像する撮像部から出力された撮像信号に対し、デジタルゲインを乗算して輝度が調整された撮像信号を生成するデジタルゲイン適用部と、
前記デジタルゲイン適用部により生成された撮像信号で表される撮像画像の、それぞれ一部をなす複数の領域の、領域毎の輝度を検出する輝度検出部と、
切出し領域を指定する指定情報に応じて、前記撮像画像の複数の領域の一つを選択して、選択した領域の画像を切り出して、歪補正を行う切り出し部と、
前記輝度検出部で検出された輝度に基づいて、前記撮像部に対して露光の条件を設定し、前記デジタルゲイン適用部で用いる前記デジタルゲインを設定する制御部とを備え、
前記制御部は、
第1のフレーム期間に、前記指定情報が第1の領域を指定する情報から第2の領域を指定する情報へ変更された場合に、
前記第1のフレーム期間の次のフレーム期間に、前記露光の条件の設定及び前記デジタルゲインの設定に用いる輝度を、前記第1の領域の輝度から前記第2の領域の輝度に変更し、
前記切り出し部に対して前記第1のフレーム期間から3フレーム期間経過後に、前記撮像信号から切り出す領域を前記第1の領域から前記第2の領域に変更することを指示する
ことを特徴とする。
FIG. 1 shows an image generating device 10 provided with the image processing apparatus of a first embodiment of the present invention. The illustrated image generating device 10 includes a wide-angle image acquisition unit 12 and an image processing apparatus 14.
The wide-angle image acquisition unit 12 has a wide-angle lens 20 and an imaging unit 30.
The image processing apparatus 14 has a digital gain application unit 40, a luminance detection unit 50, an image generation unit 60, a cutout unit 70, and an image composition unit 80.
As shown in FIG. 2, imaging with a fisheye lens can be understood as a projection of a subject OJ in real space onto a virtual spherical surface PL, followed by a projection from the virtual spherical surface PL onto a planar imaging surface SL. The image on the imaging surface SL is subjected to a correction suited to the intended use of the image, for example a correction (distortion correction) for bringing it close to an image obtained by ordinary imaging, that is, an image NL obtained by perspective projection.
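The two-stage projection just described can be sketched as a per-pixel remapping from the desired perspective image NL back to the fisheye imaging surface SL. The embodiment does not specify the fisheye projection model; the sketch below assumes an equidistant model (r = f·θ), and all focal lengths and image-center coordinates are illustrative parameters.

```python
import math

def perspective_to_fisheye(u, v, f_persp, f_fish, cx, cy):
    """Map a pixel (u, v) of the desired perspective image (principal point
    at the origin) to coordinates on the fisheye imaging surface, assuming
    an equidistant fisheye model r = f_fish * theta."""
    # View ray for the perspective pixel; the optical axis is the z axis.
    x, y, z = u, v, f_persp
    theta = math.atan2(math.hypot(x, y), z)  # angle from the optical axis
    phi = math.atan2(y, x)                   # azimuth around the axis
    r = f_fish * theta                       # equidistant projection radius
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# The center of the perspective image maps to the fisheye image center.
xc, yc = perspective_to_fisheye(0.0, 0.0, 800.0, 400.0, 960.0, 540.0)
```

Distortion correction then amounts to sampling the fisheye image at the mapped coordinates (with interpolation) for every pixel of the output perspective image.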
The luminance detection unit 50 detects the luminance of each of the areas generated by the division performed by the control unit 100. The luminance is detected by calculating the average pixel value of each area. In the present embodiment, the luminance is detected for all divided areas in every frame period.
The calculation of the average pixel value of each area in each frame period is completed by the end of that frame period, that is, before the start of the next frame period.
The areas R1, R2, ... on the imaging surface SL in FIG. 3(b) correspond to the areas R1, R2, ... on the virtual spherical surface PL.
The distortion correction by the cutout unit 70 is a process of converting the image on the imaging surface SL into an image suited to the intended use, for example an image close to the image NL (FIG. 2) obtained by perspective projection.
The designation of the cutout area by the designation information DR includes information indicating the position of the cutout area, for example the position of its center, and information indicating the size of the cutout area. When identification numbers are assigned to the plurality of areas in advance, information indicating the identification number is included, and the control unit 100 identifies the area to be cut out based on this information.
The illuminance condition of the subject may, for example, be estimated from the luminance values detected by the luminance detection unit 50.
The control unit 100 further controls the image composition unit 80 with respect to the composition of images.
The setting of the exposure conditions for the imaging unit 30 includes the setting of the exposure time and the setting of a gain (analog gain).
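The exposure time, the analog gain, and the digital gain together determine the brightness of the output image. The embodiment does not specify how a required brightness correction is apportioned among them; the following is a minimal sketch under the common assumption that exposure time is preferred first, then analog gain, with the remainder left to digital gain. The limits and priority order are illustrative, not taken from the source.

```python
def split_exposure(required_gain, max_time_ratio=8.0, max_analog=8.0):
    """Split a required brightness correction factor into an exposure-time
    ratio, an analog gain, and a digital gain, saturating the preferred
    stages first.  The product of the three equals required_gain."""
    time_ratio = min(required_gain, max_time_ratio)
    rest = required_gain / time_ratio
    analog = min(rest, max_analog)
    digital = rest / analog
    return time_ratio, analog, digital

# A 100x correction: 8x longer exposure, 8x analog gain, rest digital.
t, a, d = split_exposure(100.0)  # -> (8.0, 8.0, 1.5625)
```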
FIGS. 4(a) to 4(k) show the operation over eight frame periods, from frame n-4 to frame n+3, when switching from one area (a first area) A to another area (a second area) B. The areas A and B are, for example, each one of the areas R1 to R6.
FIG. 4(a) shows the area cut out by the cutout unit 70. In the illustrated example, the area A is cut out up to frame n+1, and the area B is cut out from frame n+2 onward.
FIGS. 4(b), (c), (d), and (e) show the processing by the control unit 100.
FIG. 4(b) shows the process of acquiring the result of the luminance detection by the luminance detection unit 50.
In the illustrated example, the luminance detection result for the area A is acquired up to frame n-1, and the luminance detection result for the area B is acquired from frame n onward.
FIG. 4(f) shows the image effective period VA and the vertical blanking period BL in each frame.
In FIGS. 4(b) to 4(j), a line with an arrow from one process to another indicates the transmission, to the latter process, of information or a signal representing the result obtained in the former process.
The setting of the exposure conditions for the imaging unit 30 in FIG. 4(d) and the setting of the digital gain for the digital gain application unit 40 in FIG. 4(e) are performed after the process of FIG. 4(c), within the same frame period; specifically, within the vertical blanking period BL at the end of that frame period.
By performing the setting processes within the blanking period BL, it is possible to prevent the settings from being changed during the image effective period VA, which would cause the brightness of the image to change within a single frame.
FIG. 4(g) shows the exposure process, and FIG. 4(h) shows the data output.
As mentioned above, in the present embodiment the luminance of all areas is detected in each frame; to indicate this, "all areas" is written in FIG. 4(j), and the area whose luminance detection result is to be acquired at the beginning of the next frame is shown in parentheses following the words "all areas" in FIG. 4(j).
In the illustrated example, the luminance detection result for the area A in each frame up to frame n-2 is acquired in the next frame, and from frame n-1 onward, the luminance detection result for the area B in each frame is acquired at the beginning of the next frame.
There is a delay of two frame periods from the acquisition of a luminance detection result (frame n-3) to the output of the data generated by exposure under the exposure conditions set based on the acquired result and by the multiplication by the digital gain (frame n-1).
The target of luminance detection result acquisition can be switched at the beginning of frame n, the frame following the frame in which the switching of the cutout area was instructed, because the luminance information of all areas is detected in each frame.
Meanwhile, in frame n+2, after the switching of the cutout area, the area B is cut out; in this frame n+2, the data output is that obtained by the exposure (frame n+1) under the exposure conditions set based on the luminance detection result for the area B acquired in frame n and by the multiplication by the digital gain (frame n+2).
Thus, both before and after the switching of the cutout area, the data of each cut-out area is data obtained by exposure and digital gain multiplication under conditions set based on the luminance detection result for that same area.
In the conventional method, the luminance detection in each frame is performed only for the selected area. Suppose, for example, that a switching request occurs in frame n-1, as shown in FIG. 5(k).
In this case, as shown in FIG. 5(j), the luminance of the area A is detected up to frame n-1, and the luminance of the area B is detected from frame n onward.
As a result, the luminance detection result for the area A is acquired up to frame n, and the luminance detection result for the area B is acquired from frame n+1 onward (FIG. 5(b)).
Accordingly, the switching of the cutout area is performed between frame n+2 and frame n+3, and the area B is cut out from frame n+3 onward.
Consequently, there is a delay of four frame periods from the beginning of the frame in which the instruction to switch the cutout area was given (FIG. 5(k)) to the switching of the cutout area.
In contrast, in the present invention, the delay from the beginning of the frame in which the instruction to switch the cutout area was given (FIG. 4(k)) to the switching of the cutout area (FIG. 4(a)) is three frame periods.
Moreover, in FIGS. 4(a) to 4(k), before the switching of the cutout area, the output image is obtained by exposure under exposure conditions set based on the luminance detection result for the area A and by multiplication by a digital gain likewise set based on the luminance detection result for the area A; after the switching, the output image is obtained by exposure under exposure conditions set based on the luminance detection result for the area B and by multiplication by a digital gain likewise set based on the luminance detection result for the area B. That is, the brightness of the image is appropriately adjusted both before and after the switching of the area.
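The timing relationships above can be sketched in code. Taking frame n as index 0 and the switch request as occurring in frame n-1, the three-frame cutout delay, the next-frame switch of the metering target, and the two-frame acquisition-to-output pipeline delay can be modeled as follows (function names are illustrative):

```python
def cutout_region(frame, request_frame, old="A", new="B"):
    """Area cut out in a given frame: the switch from old to new takes
    effect three frame periods after the request, as in FIG. 4(a)."""
    return new if frame >= request_frame + 3 else old

def metering_region(frame, request_frame, old="A", new="B"):
    """Area whose luminance detection result is acquired at the start of
    the frame: since all areas are metered every frame, the acquisition
    target switches from the very next frame after the request."""
    return new if frame >= request_frame + 1 else old

def output_metered_region(frame, request_frame, old="A", new="B"):
    """Area whose luminance governed the exposure and digital gain of the
    data output in this frame (two-frame delay from acquisition)."""
    return metering_region(frame - 2, request_frame, old, new)

req = -1  # request in frame n-1
# Area B is cut out from frame n+2, and the frame n+2 output was exposed
# and gained using area B luminance -- the switch stays consistent.
assert cutout_region(2, req) == "B" and output_metered_region(2, req) == "B"
```

Running the same functions for frames n+1 and earlier shows that the area A data is likewise always paired with area A metering, matching the consistency property described above.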
In this case as well, when the direction of movement is known, the luminance may be detected in each frame not only for the areas R1 to R6 that are currently the object of composition, but also for the areas R7 and R8 that are scheduled to become the object of composition next. Also, when the direction of movement is not known, but it is known that adjacent areas are successively added in one direction while areas on the opposite side are removed, the luminance may be detected in each frame for the adjacent areas (areas that may become the object of composition) in addition to the areas currently the object of composition.
In the first embodiment described above, the image cutout areas coincide with the areas serving as the units of luminance detection, but the present invention is also applicable when the image cutout areas differ from the areas serving as the units of luminance calculation. In the following, the areas serving as the units of luminance calculation are called "sections" to distinguish them from the cutout areas. When the sections are smaller than the image cutout areas and each cutout area extends over a plurality of sections, that is, when at least parts of a plurality of sections are included in the cutout area, the luminance of the cutout area may be calculated based on the luminances of those sections. For example, the captured image may be divided into a plurality of rectangular sections (blocks), the luminance of each block may be detected, and the luminance of a cutout area at least partially including certain blocks may be calculated based on the detected luminances of those blocks.
The calculation of the luminance of the cutout area based on the luminances of the blocks may be a simple average or a weighted average. In the case of a weighted average, each block may be given a weight corresponding to the proportion of the block that is included in the cutout area.
Claims (13)
- An image processing apparatus comprising: a digital gain application unit that multiplies an imaging signal output from an imaging unit, which performs imaging in units of frame periods, by a digital gain to generate an imaging signal with adjusted luminance; a luminance detection unit that detects the luminance of each of a plurality of areas each forming a part of the captured image represented by the imaging signal generated by the digital gain application unit; a cutout unit that selects one of the plurality of areas of the captured image in accordance with designation information designating a cutout area, cuts out the image of the selected area, and performs distortion correction; and a control unit that sets exposure conditions for the imaging unit and sets the digital gain used by the digital gain application unit, based on the luminance detected by the luminance detection unit; wherein, when the designation information is changed in a first frame period from information designating a first area to information designating a second area, the control unit changes, in the frame period following the first frame period, the luminance used for the setting of the exposure conditions and the setting of the digital gain from the luminance of the first area to the luminance of the second area, and instructs the cutout unit, after three frame periods have elapsed from the first frame period, to change the area cut out from the imaging signal from the first area to the second area.
- The image processing apparatus according to claim 1, wherein the digital gain application unit multiplies the imaging signal generated as a result of the exposure in each frame period in the imaging unit by the gain in the next frame period and outputs the imaging signal with adjusted luminance; the control unit sets, based on the luminance detected by the luminance detection unit in each frame period, the exposure conditions for the exposure by the imaging unit in the frame period two frame periods after said each frame period, and the digital gain used for the multiplication in the digital gain application unit in the frame period three frame periods after said each frame period; and the control unit starts the setting of the exposure conditions and the digital gain based on the luminance detected for the second area in the first frame period, and causes the cutout unit to cut out the image of the second area from the frame period three frame periods after the first frame period onward.
- The image processing apparatus according to claim 1 or 2, wherein the luminance detection unit detects, in each frame period, the luminance of every one of the areas, among the plurality of areas, that is currently being cut out or that may become the object of the cutout.
- The image processing apparatus according to claim 1 or 2, wherein the luminance detection unit detects the luminance of each of a plurality of sections formed by dividing the captured image, and calculates the luminance of each of the plurality of areas from the luminances of the one or more sections at least partially included in the area.
- The image processing apparatus according to claim 2, wherein the control unit performs computation for determining the exposure conditions and the digital gain in the frame period following said each frame period, and, from the frame period following the first frame period onward, performs the computation for setting the exposure conditions and the digital gain based on the result of the detection of the luminance of the second area.
- The image processing apparatus according to claim 5, wherein the control unit sets the determined exposure conditions and digital gain in the imaging unit and the digital gain application unit in the frame period following said each frame period, and the digital gain application unit delays the use of the set digital gain by one frame period.
- The image processing apparatus according to claim 5, wherein the control unit sets the determined exposure conditions in the imaging unit in the frame period following said each frame period, and sets the determined digital gain in the digital gain application unit in the frame period two frame periods after said each frame period.
- The image processing apparatus according to any one of claims 1 to 7, wherein the setting of the exposure conditions and the setting of the digital gain are performed within the vertical blanking period of each frame period.
- The image processing apparatus according to any one of claims 1 to 8, further comprising an image generation unit that receives the imaging signal output from the digital gain application unit and performs image processing including color interpolation, color correction, and edge enhancement, wherein the cutout unit performs the cutout from the image having undergone the image processing in the image generation unit.
- The image processing apparatus according to any one of claims 1 to 9, further comprising an image composition unit that composites the images of a plurality of areas output from the cutout unit to generate a single continuous image.
- An image processing method comprising: a digital gain application step of multiplying an imaging signal output from an imaging unit, which performs imaging in units of frame periods, by a digital gain to generate an imaging signal with adjusted luminance; a luminance detection step of detecting the luminance of each of a plurality of areas each forming a part of the captured image represented by the imaging signal generated in the digital gain application step; a cutout step of selecting one of the plurality of areas of the captured image in accordance with designation information designating a cutout area, cutting out the image of the selected area, and performing distortion correction; and a control step of setting exposure conditions for the imaging unit and setting the digital gain used in the digital gain application step, based on the luminance detected in the luminance detection step; wherein, when the designation information is changed in a first frame period from information designating a first area to information designating a second area, the control step changes, in the frame period following the first frame period, the luminance used for the setting of the exposure conditions and the setting of the digital gain from the luminance of the first area to the luminance of the second area, and instructs the cutout step, after three frame periods have elapsed from the first frame period, to change the area cut out from the imaging signal from the first area to the second area.
- A program for causing a computer to execute the processing of each step of claim 11.
- A computer-readable recording medium in which the program of claim 12 is recorded.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/028,128 US10171733B2 (en) | 2013-12-11 | 2014-12-04 | Image processing apparatus and method, and program and recording medium |
JP2015552408A JP6038352B2 (ja) | 2013-12-11 | 2014-12-04 | 画像処理装置及び方法、並びにプログラム及び記録媒体 |
GB1606720.9A GB2533750B (en) | 2013-12-11 | 2014-12-04 | Image processing apparatus and method, and program and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013256168 | 2013-12-11 | ||
JP2013-256168 | 2013-12-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015087772A1 true WO2015087772A1 (ja) | 2015-06-18 |
Family
ID=53371078
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/082100 WO2015087772A1 (ja) | 2013-12-11 | 2014-12-04 | 画像処理装置及び方法、並びにプログラム及び記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10171733B2 (ja) |
JP (1) | JP6038352B2 (ja) |
GB (1) | GB2533750B (ja) |
WO (1) | WO2015087772A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018142939A (ja) * | 2017-02-28 | 2018-09-13 | キヤノン株式会社 | 撮像装置、撮像システム、その制御方法およびプログラム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030778B2 (en) * | 2014-03-31 | 2021-06-08 | Healthy.Io Ltd. | Methods and apparatus for enhancing color vision and quantifying color interpretation |
US9262801B2 (en) | 2014-04-01 | 2016-02-16 | Gopro, Inc. | Image taping in a multi-camera array |
JP2022154658A (ja) * | 2021-03-30 | 2022-10-13 | キヤノン株式会社 | 撮像装置、撮像素子、撮像装置の制御方法およびプログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06350888A (ja) * | 1993-06-10 | 1994-12-22 | Casio Comput Co Ltd | 電子スチルカメラ |
JPH0787384A (ja) * | 1993-09-17 | 1995-03-31 | Canon Inc | 撮像装置 |
JPH0918773A (ja) * | 1995-06-27 | 1997-01-17 | Canon Inc | 撮像装置 |
JPH1079882A (ja) * | 1996-09-02 | 1998-03-24 | Canon Inc | 画像入力装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3028830B2 (ja) | 1990-05-18 | 2000-04-04 | 株式会社日立製作所 | ビデオカメラ装置 |
EP0644687B1 (en) | 1993-09-17 | 2003-12-03 | Canon Kabushiki Kaisha | Image pickup apparatus |
JP2000261717A (ja) | 1999-03-05 | 2000-09-22 | Olympus Optical Co Ltd | 電子的撮像装置 |
JP2007074070A (ja) | 2005-09-05 | 2007-03-22 | Auto Network Gijutsu Kenkyusho:Kk | 車両周辺視認装置 |
JP2008187393A (ja) | 2007-01-29 | 2008-08-14 | Sony Corp | 露出制御システム、露出制御方法、そのプログラムと記録媒体およびカメラ制御システムとカメラ |
JP2010252002A (ja) | 2009-04-15 | 2010-11-04 | Panasonic Corp | 階調補正装置および撮像装置 |
JP5471550B2 (ja) * | 2010-02-10 | 2014-04-16 | ソニー株式会社 | 撮像装置、撮像装置の制御方法及びプログラム |
US8497897B2 (en) * | 2010-08-17 | 2013-07-30 | Apple Inc. | Image capture using luminance and chrominance sensors |
US9402034B2 (en) * | 2011-07-29 | 2016-07-26 | Apple Inc. | Adaptive auto exposure adjustment |
JP5879509B2 (ja) | 2011-10-19 | 2016-03-08 | パナソニックIpマネジメント株式会社 | 撮像装置 |
- 2014-12-04 GB GB1606720.9A patent/GB2533750B/en not_active Expired - Fee Related
- 2014-12-04 JP JP2015552408A patent/JP6038352B2/ja not_active Expired - Fee Related
- 2014-12-04 US US15/028,128 patent/US10171733B2/en not_active Expired - Fee Related
- 2014-12-04 WO PCT/JP2014/082100 patent/WO2015087772A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
GB2533750B (en) | 2020-03-18 |
US10171733B2 (en) | 2019-01-01 |
JPWO2015087772A1 (ja) | 2017-03-16 |
GB2533750A (en) | 2016-06-29 |
JP6038352B2 (ja) | 2016-12-07 |
GB2533750A8 (en) | 2016-08-03 |
US20160241782A1 (en) | 2016-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4245185B2 (ja) | 撮像装置 | |
JP5237721B2 (ja) | 撮像装置 | |
JP5609742B2 (ja) | 撮像装置、画像合成方法、及びプログラム | |
JP6038352B2 (ja) | 画像処理装置及び方法、並びにプログラム及び記録媒体 | |
JP6351377B2 (ja) | 画像処理システム、撮像装置及び記録装置 | |
US8836821B2 (en) | Electronic camera | |
JP6513309B2 (ja) | 画像合成装置、画像合成方法、及び画像合成プログラム | |
JP2007274504A (ja) | デジタルカメラ | |
JP5790858B2 (ja) | 画像合成装置、画像合成方法、及びプログラム | |
JP2012244480A (ja) | 全方位監視画像表示処理システム | |
US11722788B2 (en) | Image processing apparatus and method, and image capturing apparatus | |
JP2009164674A (ja) | ディストーション補正装置 | |
JP5197064B2 (ja) | ビデオカメラ | |
WO2017122394A1 (ja) | 撮像制御装置、および撮像装置 | |
JP2020188452A (ja) | 撮像装置およびその制御方法 | |
JP6762843B2 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
JP7008431B2 (ja) | 撮像装置、制御方法、プログラム及び撮像システム | |
CN110692235A (zh) | 图像处理装置、图像处理程序及图像处理方法 | |
US8270673B2 (en) | Motion detecting apparatus | |
JP5780454B2 (ja) | 撮像装置、プログラム及び表示方法 | |
WO2020235400A1 (ja) | 画像処理装置、および画像処理方法、並びにプログラム | |
JP5338248B2 (ja) | 画像処理装置、電子カメラおよび画像処理プログラム | |
JP2001245277A (ja) | 画像変換装置 | |
JP2008147956A (ja) | 映像信号処理装置及び方法、並びにプログラム及び記録媒体 | |
JP6786346B2 (ja) | 画像処理装置、画像処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14869169 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015552408 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15028128 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 201606720 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20141214 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14869169 Country of ref document: EP Kind code of ref document: A1 |