WO2012086120A1 - Image processing apparatus, image pickup apparatus, image processing method, and program - Google Patents


Info

Publication number
WO2012086120A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
parallax
image
resolution
area
Application number
PCT/JP2011/006380
Other languages
French (fr)
Japanese (ja)
Inventor
智生 木村
中村 剛
郁雄 渕上
大橋 政宏
Original Assignee
パナソニック株式会社
Application filed by パナソニック株式会社
Publication of WO2012086120A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/97 Determining parameters from multiple pictures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • The present invention belongs to the technical field of image blurring technology, and relates to an improvement of an image processing apparatus mounted on a 3D camera, an image processing program used as an accessory to the 3D camera, and an image processing method.
  • Image blurring technology refers to image processing that keeps only some subjects in the image in focus and intentionally blurs the remaining foreground/background. Such processing can emphasize the main subject and produce a stereoscopic effect.
  • The 3D camera is a camera that includes a set of a left-eye lens and left-eye optical mechanism and a set of a right-eye lens and right-eye optical mechanism, and can photograph left-eye image data and right-eye image data simultaneously. When these left-eye and right-eye image data are reproduced on a display dedicated to stereoscopic viewing and viewed through liquid-crystal shutter glasses, the viewer can enjoy stereoscopic video.
  • Known inventions related to image blurring techniques are described in Patent Document 1, and known inventions related to 3D cameras are described in Patent Document 2.
  • Patent Document 1 discloses an image processing apparatus that selects a matrix according to the focal length f, F value, and distance difference D of a lens and adds blur by using the selected matrix.
  • Patent Document 2 discloses a 3D camera that calculates an optimal convergence angle for a main subject and performs actual photographing to make the focused subject clear and blur the others.
  • In Patent Document 2, the right-eye and left-eye image sensors are moved by a predetermined amount, and distance information is acquired by calculating the movement amount, before and after the movement, of the main subject in the images captured by the right-eye and left-eye image sensors.
  • The movement of the image sensors and the acquisition of distance information are repeated until data from which the distance L from the digital camera to the main subject can be calculated is acquired.
  • The distance L is the distance between the base line connecting the left-eye lens and the right-eye lens and a line parallel to that base line passing through the main subject.
  • the left-eye imaging optical system and the right-eye imaging optical system are operated by the left-eye optical system drive mechanism and the right-eye optical system drive mechanism so that the convergence angle is optimal for the main subject existing at the calculated distance L.
  • the convergence angle is an angle formed by the line of sight of the right eye pupil and the line of sight of the left eye pupil during stereoscopic reproduction.
  • The convergence angle optimal for the distance L is the angle at which the optical axis of the left-eye imaging optical system and the optical axis of the right-eye imaging optical system intersect at the distance L from the midpoint of the baseline between the left-eye lens and the right-eye lens.
  • The stereoscopic effect accords with the difference between the convergence angle during stereoscopic reproduction and the convergence angle during planar reproduction (Non-Patent Document 3).
  • The 3D camera described in Patent Document 2 calculates the movement amount of the main subject between the left-eye image and the right-eye image, and controls the left-eye and right-eye image sensors according to the distance information based on that movement amount.
  • For the main subject, therefore, there is a guarantee that the convergence angle is appropriate; but for subjects other than the main subject (for example, when the main subject is a person, the sub-subjects captured in the foreground/background of that main subject), there is no guarantee as to what convergence angle arises.
  • The left-eye imaging optical system and the right-eye imaging optical system are completely independent optical systems located at some distance from each other in the 3D camera.
  • Therefore, the position of a sub-subject in the foreground/background may differ greatly between the left-eye image and the right-eye image. Even if the convergence angle is appropriate for the main subject, when images in which the position of a foreground/background sub-subject differs significantly between the left-eye image and the right-eye image are reproduced stereoscopically, the convergence angle of that sub-subject may become extremely large. This is because the convergence angle of sub-subjects in the foreground/background is not adjusted at all by mechanisms such as the feedback control of the left-eye and right-eye image sensors.
  • a pan-focus image is an image taken with a single-focus lens or a lens with a small aperture, that is, a lens with a deep depth of field, in which all persons or objects in the image are in focus.
  • The power focus function of a digital camera does not extend the lens manually; it automatically detects the movement of the subject through the viewfinder and, based on that movement information, extends the lens by a motor.
  • A pan-focus image is also obtained when shooting is performed using a function that interpolates such characteristics.
  • In a left-eye image and a right-eye image that are pan-focus images, both the main subject and the sub-subjects appearing in the image are in focus, so the convergence angles for all of these subjects must be appropriate.
  • By applying the blurring technique described in Patent Document 1 to the left-eye image and the right-eye image, it is possible to blur the left-eye image and the right-eye image based on desired blurring characteristics.
  • However, since the set of left-eye and right-eye images obtained by the 3D camera comes from completely independent optical systems, even if the left-eye image and the right-eye image are individually blurred based on the blur characteristics, a sub-subject that has a large movement amount between the left-eye image and the right-eye image cannot be blurred strongly enough. Therefore, the blurring technique described in Patent Document 1 has the problem that 3D sickness cannot be eliminated.
  • Non-Patent Document 2 instead of the image obtained with the 3D camera.
  • Thus, the prior art is affected by the type of image to be processed and cannot be used for the purpose of uniformly obtaining a uniform stereoscopic effect.
  • The object of the present invention is to provide an image processing apparatus that can achieve a uniform stereoscopic effect regardless of the type of the left-eye image and the right-eye image, even when the position of a sub-subject captured in the foreground/background differs greatly between the left-eye image and the right-eye image.
  • In order to solve the above problem, an image processing apparatus includes: receiving means for receiving input of a plurality of image data; parallax detection means for detecting, for each image data, the parallax with other image data; detection means for detecting, among a plurality of pixel regions in a pair of images subjected to parallax detection, a pixel region having illegal parallax as an illegal parallax region; and processing means for performing blurring processing on each pixel region constituting the illegal parallax region.
  • With this configuration, regardless of whether the input is an image taken with a 3D camera, a pan-focus image, or a planar image, only the pixel regions in which 3D sickness would occur can be uniformly blurred.
  • A flowchart showing the operation of the image processing apparatus 100.
  • A block diagram showing an example of the configuration of the image processing apparatus 700 according to Embodiment 2 of the present invention.
  • A flowchart showing the operation of the image processing apparatus 700.
  • A block diagram showing an example of the configuration of the image processing apparatus 1000 according to Embodiment 3 of the present invention.
  • A diagram showing the relationship between the blur intensity and the distance of the blurring processing.
  • A diagram showing the relationship between the blur intensity and the distance of the blurring processing performed by the image processing apparatus 1200.
  • A diagram showing an example of a user operation screen of the imaging apparatus 2400.
  • A diagram showing an example of the safety level change screen.
  • A diagram showing an example of the screen for changing the blurring processing based on distance information.
  • A diagram showing an example of the display screen when the blurring processing based on distance information is changed.
  • <<Embodiment 1>> <1.1 Overview>
  • The image processing apparatus according to the present embodiment compares the parallax between the left-eye image and the right-eye image used for stereoscopic display with a predetermined threshold, and determines, based on the comparison result, whether the parallax is illegal. It then performs blurring processing on the pixel areas having illegal parallax. By identifying pixel regions whose parallax can cause visual fatigue, discomfort, stereoscopic sickness, and the like to the observer, and adding blur to those regions, visual fatigue and discomfort due to illegal parallax can be reduced.
  • FIG. 1 is a block diagram illustrating an example of the configuration of the image processing apparatus 100.
  • The image processing apparatus 100 includes image data input terminals 101 and 102, a parallax detection unit 103, an illegal parallax detection unit 104, blurred image generation units 105 and 106, and image data output terminals 107 and 108.
  • the image for the left eye and the image for the right eye are images obtained by imaging the scene from different viewpoints, and may be, for example, image data captured by a stereo camera. Further, the image is not limited to a real image, and may be CG (Computer Graphics) created assuming different virtual viewpoints. Further, it may be a still image or a moving image including a plurality of still images that are temporally continuous.
  • the parallax detection unit 103 detects parallax between the left-eye image and the right-eye image input to the image data input terminals 101 and 102.
  • The detection of the parallax is performed in units of 16×16-pixel blocks, and the horizontal offset, in pixels, between corresponding pixel blocks of the left-eye image and the right-eye image is detected as the parallax.
  • the detection of the parallax between the left-eye image and the right-eye image will be specifically described with reference to FIG.
  • FIG. 2 is a diagram illustrating detection of parallax between the left-eye image and the right-eye image.
  • detection of a corresponding pixel block will be described with reference to FIG.
  • FIG. 2A shows a left-eye image and a right-eye image divided into pixel blocks.
  • Although the detection of parallax is performed in units of 16×16-pixel blocks, for ease of explanation the figure shows the images divided into larger pixel blocks.
  • Pixel blocks A and A ′, B and B ′, and C and C ′ respectively indicate corresponding pixel blocks.
  • A corresponding pixel block is obtained by finding the pixel in the right-eye image that corresponds to a pixel belonging to a pixel block of the left-eye image, and setting the pixel block to which that corresponding pixel belongs as the block corresponding to the pixel block of the left-eye image.
  • the search for the corresponding pixel is performed using, for example, a block matching method.
  • The block matching method is a search that uses the pixel density pattern around the pixel for which a corresponding point is sought.
  • FIG. 2B is a diagram illustrating the distance from the pixel block to the center in the horizontal direction of the image.
  • dl(A), dl(B), and dl(C) are the distances from the pixel blocks A, B, and C of the left-eye image to the image center, and dr(A′), dr(B′), and dr(C′) are the distances from the pixel blocks A′, B′, and C′ of the right-eye image to the image center.
  • Each distance is detected as the number of horizontal pixels from the pixel block to the image center.
  • The rightward direction on the screen is taken as positive.
  • The parallax is calculated using the distances from the pixel blocks to the image center detected in this way.
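The block search and the signed-distance bookkeeping described above can be sketched as follows (a minimal illustration, not the patented implementation; the SAD matching criterion, the search range, and the helper names are assumptions):

```python
import numpy as np

def match_block(left, right, by, bx, block=16, search=32):
    """Search the right image for the block best matching the left-image
    block at (by, bx), scanning horizontal offsets and minimizing the
    sum of absolute differences (SAD). Returns the best offset in pixels."""
    w = left.shape[1]
    ref = left[by:by + block, bx:bx + block].astype(int)
    best_off, best_sad = 0, float("inf")
    for off in range(-search, search + 1):
        x = bx + off
        if x < 0 or x + block > w:
            continue
        sad = np.abs(ref - right[by:by + block, x:x + block].astype(int)).sum()
        if sad < best_sad:
            best_sad, best_off = sad, off
    return best_off

def block_parallax(left, right, by, bx, block=16):
    """Parallax of one block as in the text: dl and dr are signed horizontal
    distances from the image center (rightward positive), and d = dl - dr."""
    cx = left.shape[1] // 2
    off = match_block(left, right, by, bx, block)
    dl = bx - cx            # left-image block distance to the image center
    dr = (bx + off) - cx    # matched right-image block distance
    return dl - dr
```

With a right image that is simply the left image shifted, the matched offset equals the shift and the parallax d = dl − dr takes the opposite sign, reflecting the signed-distance convention above.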
  • the illegal parallax detection unit 104 determines whether the parallax detected by the parallax detection unit 103 is illegal parallax.
  • Illegal parallax will be described. In general, it is known that stereoscopic images having excessively large positive or negative parallax may cause visual fatigue, discomfort, stereoscopic sickness, and the like. In the present invention, parallax of a stereoscopic image that can cause visual fatigue, discomfort, stereoscopic sickness, and the like is called illegal parallax.
  • In Non-Patent Document 3, a parallax angle of 1° or less is recommended for comfortable stereoscopic observation. Accordingly, with H and L as the upper and lower limit values of the comfortable parallax that does not cause visual fatigue, discomfort, stereoscopic sickness, and the like, a parallax D is illegal when D > H or D < L.
  • Whether the parallax is illegal is determined by comparing the parallax with predetermined thresholds, namely the upper limit value H and the lower limit value L of the comfortable parallax.
  • As the thresholds, the number of horizontal pixels based on the guideline value recommended by the 3D Consortium may be used to determine whether the parallax is illegal.
  • The upper limit value of the comfortable parallax range is the number of horizontal pixels corresponding to a parallax angle of 1 degree, and the lower limit value is the number of horizontal pixels corresponding to a parallax angle of 0 degrees.
  • the number of horizontal pixels corresponding to the parallax angle varies depending on the size of the display screen, the number of horizontal pixels of the image data, the distance between the display screen and the observer, and the distance between the pupils of the observer.
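As a rough sketch of that dependence, the pixel count for a given parallax angle can be derived from the viewing geometry under a small-angle approximation (the function, its parameter values, and the metre-based units are illustrative assumptions, not figures from the patent):

```python
import math

def parallax_angle_to_pixels(screen_width_m, image_width_px,
                             viewing_distance_m, angle_deg=1.0):
    """Horizontal pixel count corresponding to a parallax angle.
    Small-angle approximation: an angular difference of `angle_deg` between
    the two lines of sight corresponds to an on-screen disparity of roughly
    viewing_distance * angle (in radians), which is then divided by the
    pixel pitch (screen width / horizontal resolution)."""
    disparity_m = viewing_distance_m * math.radians(angle_deg)
    pixel_pitch_m = screen_width_m / image_width_px
    return disparity_m / pixel_pitch_m
```

Moving the observer farther from the screen or raising the horizontal resolution both increase the pixel count, matching the dependence listed above.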
  • For example, in the case of a typical 5-megapixel photograph (2592×1944; distance between pupils E: 65 mm), the number of horizontal pixels corresponding to a parallax angle of 1 degree is 102 pixels (see Non-Patent Document 3). That is, under these observation conditions, a parallax > 102 pixels or a parallax < 0 pixels is determined to be illegal parallax. In the case of a 1920×1080-pixel image/video (distance between pupils E: 65 mm), the number of horizontal pixels corresponding to a parallax angle of 1 degree is 57 pixels (see Non-Patent Document 3).
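The determination itself then reduces to a pair of comparisons; a minimal sketch (the function name is an assumption, with defaults taken from the 2592×1944 example above):

```python
def is_illegal_parallax(d_px, upper=102, lower=0):
    """Illegal when the parallax (in horizontal pixels) exceeds the upper
    limit H of the comfortable range or falls below the lower limit L."""
    return d_px > upper or d_px < lower
```

For 1920×1080 material under the same viewing conditions, the upper limit would be passed as 57 instead.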
  • the blurred image generation unit 105 performs a blurring process on a pixel block having an illegal parallax of an image input from the image data input terminal 101.
  • The blurred image generation unit 106 performs a blurring process on the pixel blocks having illegal parallax of the image input from the image data input terminal 102.
  • the blurring process uses, for example, a 7 ⁇ 7 smoothing filter.
  • FIG. 3 is a diagram illustrating the addition of blur to an illegal parallax region having a negative parallax.
  • FIG. 3A shows an illegal parallax region where the parallax is negative.
  • the hatched portion is a pixel block determined as an illegal parallax region.
  • Here, the tree portion is determined to be an illegal parallax region, and blur is added to this portion; that is, 7×7 smoothing filter processing is performed on the pixels constituting the tree.
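The selective smoothing can be sketched as follows (a box filter standing in for the smoothing kernel, and a boolean mask marking the illegal parallax region; both the mask-based API and the helper name are assumptions):

```python
import numpy as np

def blur_illegal_regions(img, mask, k=7):
    """Apply a k x k box (smoothing) filter to the whole image, then copy
    the blurred pixels back only where `mask` is True, i.e. only over the
    pixel blocks determined to be an illegal parallax region."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    h, w = img.shape
    blurred = np.zeros((h, w), dtype=float)
    for dy in range(k):          # accumulate the k*k shifted windows
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    out = img.astype(float).copy()
    out[mask] = blurred[mask]    # untouched outside the illegal region
    return out
```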
  • FIG. 3B is a diagram illustrating viewing of a stereoscopic image subjected to the blurring process.
  • In this way, by performing blurring processing on the individual pixel areas constituting the illegal parallax region, the blurred image generation units 105 and 106 can reduce the visual fatigue and discomfort caused by illegal parallax.
  • Note that the blurring processing according to the present embodiment does not simply hide the illegal parallax area. Simply hiding or removing the illegal parallax results in an image that appears unnatural to the observer; the blurring processing according to the present embodiment instead prevents the observer from recognizing the illegal parallax, by presenting it as a naturally blurred image. This point will be described below.
  • FIG. 4 is a diagram illustrating the relationship between distance and parallax.
  • In FIG. 4, a is the inter-camera distance (baseline length), F is the focal length of the camera, dl is the distance from the image center to the corresponding pixel C in the left-eye image, dr is the distance from the image center to the corresponding pixel C′ in the right-eye image, and x is the distance from the imaging position to the subject shown in the corresponding pixel.
  • The parallax d is the amount of deviation between the left-eye image and the right-eye image, d = dl − dr, and by triangulation x = aF/d.
  • FIG. 5 is a diagram showing the relationship between the strength of the blur added by the blurring processing and the distance.
  • In a virtual space consisting of distances 0 to 6, when the upper limit value H of the comfortable parallax corresponds to distance 1 and the lower limit value L corresponds to distance 5, a proportional relationship holds between the parallax and the distance.
  • The illegal parallax range is then the ranges of distance 0 to 1 and distance 5 to 6. Therefore, as shown in FIG. 5, blurring processing is performed on the near view (distance 0 to 1) and the distant view (distance 5 to 6).
  • Since the blurring processing is applied to the near view and the distant view, the observer does not recognize the illegal parallax, perceiving it instead as a naturally blurred image. Moreover, since the areas outside the comfortable parallax area are blurred, the subject in the comfortable parallax area is perceived by the observer with an even stronger stereoscopic effect.
  • FIG. 6 is a flowchart showing the operation of the image processing apparatus 100.
  • As shown in FIG. 6, the parallax detection unit 103 first searches for the pixel block of the right-eye image corresponding to a pixel block of the left-eye image (step S601). Specifically, a pixel corresponding to one pixel belonging to the pixel block of the left-eye image is obtained by performing a corresponding-point search on the right-eye image, and the pixel block to which that corresponding pixel belongs is set as the pixel block corresponding to the pixel block of the left-eye image. Next, the parallax detection unit 103 calculates, for each of the left-eye image and the right-eye image, the number of horizontal pixels from the pixel block to the image center (step S602).
  • The number of pixels constituting the parallax is then calculated by subtracting the number of pixels dr from the pixel block of the right-eye image to the image center from the number of pixels dl from the pixel block of the left-eye image to the image center (step S603).
  • Next, the illegal parallax detection unit 104 determines whether the parallax calculated by the parallax detection unit 103 is illegal (step S604). The determination is made by comparing the parallax with the predetermined thresholds (the upper and lower limit values of the comfortable parallax range). The operations from step S601 to step S604 are repeated for all pixel blocks.
  • After the operations from step S601 to step S604 have been performed for all pixel blocks, the blurred image generation units 105 and 106 perform blurring processing on the pixel blocks having illegal parallax (the illegal parallax region) (step S605).
  • The operation of the image processing apparatus 100 has been described above.
  • As described above, the image processing apparatus according to the present embodiment processes a plurality of images constituting stereoscopic viewing by binocular parallax and blurs the illegal parallax region, thereby preventing the observer from recognizing the illegal parallax and reducing the visual fatigue and discomfort it causes.
  • FIG. 7 is a block diagram illustrating an example of the configuration of the image processing apparatus 700.
  • In FIG. 7, the same parts as in the configuration of the image processing apparatus 100 according to Embodiment 1 shown in FIG. 1 are denoted by the same reference numerals and their description is omitted; only the different parts are described.
  • As illustrated in FIG. 7, the image processing apparatus 700 includes image data input terminals 101 and 102, a parallax detection unit 103, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 702 and 703, and image data output terminals 107 and 108.
  • <2.2.1 Interpolated Image Generation Unit 701> When the observer perceives a difference in color or brightness between the two images during stereoscopic viewing, flickering may occur, which may cause visual fatigue, discomfort, stereoscopic sickness, and the like.
  • Therefore, the interpolated image generation unit 701 identifies the one-eye visible pixel area and transfers (overwrites) the image area of the other image in the same image coordinate range onto the one-eye visible pixel area.
  • FIG. 8 is a diagram showing a one-eye visible pixel region.
  • The one-eye visible pixel region is located at the edge of the image data.
  • For example, as shown in FIG. 8A, for the pixel region indicated by the diagonal lines in the left-eye image, no corresponding pixel point is detected in the right-eye image, and the parallax is not calculated.
  • A pixel block region where the parallax is not calculated in this way is identified as a one-eye visible pixel region. Then, the pixel area of the right-eye image in the same image coordinate range, indicated by the diagonal lines, is transferred (overwritten) onto the hatched pixel area of the left-eye image.
  • FIG. 8B is a diagram illustrating the left-eye image after the interpolation image is generated. As shown in FIG. 8B, the hatched pixels of the right-eye image are displayed on the one-eye visible pixel region of the left-eye image.
  • The blurred image generation units 702 and 703 perform blurring processing on the transfer-source pixel area and the transferred pixel area in addition to the pixel blocks having illegal parallax (the illegal parallax region), so that the difference in color/brightness is not recognized by the observer.
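The transfer (overwrite) step can be sketched as follows (the mask-based helper is an assumption; NumPy boolean indexing performs the per-pixel copy at identical image coordinates):

```python
import numpy as np

def fill_one_eye_region(dst, src, one_eye_mask):
    """Overwrite the one-eye visible pixel region of `dst` (e.g. the
    left-eye image) with the pixels of `src` (the other-eye image) at the
    same image coordinates. The same mask then marks the transferred
    region, which is blurred together with the transfer source."""
    patched = dst.copy()
    patched[one_eye_mask] = src[one_eye_mask]
    return patched
```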
  • FIG. 9 is a flowchart showing the operation of the image processing apparatus 700.
  • the parallax detection unit 103 first searches for a pixel block of the right-eye image corresponding to the pixel block of the left-eye image (step S901).
  • Specifically, a pixel corresponding to one pixel belonging to the pixel block of the left-eye image is obtained by performing a corresponding-point search on the right-eye image, and the pixel block to which that corresponding pixel belongs is set as the pixel block corresponding to the pixel block of the left-eye image.
  • Next, it is determined whether a corresponding pixel point was detected in step S901 (step S902).
  • If a corresponding pixel point was detected, the parallax detection unit 103 calculates, for each of the left-eye image and the right-eye image, the number of horizontal pixels from the pixel block to the image center (step S903).
  • The number of pixels constituting the parallax is then calculated by subtracting the number of pixels from the pixel block of the right-eye image to the image center from the number of pixels dl from the pixel block of the left-eye image to the image center (step S904).
  • Next, the illegal parallax detection unit 104 determines whether the parallax calculated by the parallax detection unit 103 is illegal (step S905). The determination is made by comparing the parallax with the predetermined thresholds (the upper and lower limit values of the comfortable parallax range). The operations from step S901 to step S905 are repeated for all pixel blocks.
  • After the operations from step S901 to step S905 have been performed for all pixel blocks, the interpolated image generation unit 701 identifies the pixel blocks for which parallax information is not calculated as one-eye visible pixel regions (step S906). The interpolated image generation unit 701 then transfers (overwrites) the image region of the other image in the same image coordinate range onto each one-eye visible pixel region (step S907). After this interpolated image generation, the blurred image generation units 702 and 703 perform blurring processing on the illegal parallax region, the transfer-source pixel region, and the transferred pixel region of the left-eye image and the right-eye image, respectively (step S908). The operation of the image processing apparatus 700 has been described above.
  • As described above, according to the present embodiment, an image region that exists in only one of the left-eye image and the right-eye image (a one-eye visible pixel region) is identified, and the image area of the other image in the same image coordinate range can be transferred (overwritten) onto it.
  • Furthermore, by performing blurring processing on the transfer-source pixel area and the transferred pixel area, the discontinuity is presented to the observer as a naturally blurred image, removing the sense of unnaturalness and reducing the visual fatigue and discomfort caused by an image region reflected in only one of the left-eye image and the right-eye image.
  • The image processing apparatus according to the present embodiment processes a plurality of images constituting stereoscopic vision by binocular parallax; by blurring the illegal parallax region, it prevents the observer from recognizing the illegal parallax, presenting it as a naturally blurred image, and reduces the visual fatigue and discomfort caused by illegal parallax.
  • In the present embodiment, a distance information generation unit is provided, and distance information from the imaging position of the image data to the subject shown in each pixel block is generated for each pixel block based on the parallax between the left-eye image and the right-eye image.
  • FIG. 10 is a block diagram illustrating an example of the configuration of the image processing apparatus 1000. Portions that are the same as in the configuration of the image processing apparatus 700 according to Embodiment 2 illustrated in FIG. 7 are denoted by the same reference numerals and their description is omitted; only the different portions are described.
  • As illustrated in FIG. 10, the image processing apparatus 1000 includes image data input terminals 101 and 102, a parallax detection unit 103, a distance information generation unit 1001, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1002 and 1003, and image data output terminals 107 and 108.
  • The distance information generation unit 1001 generates, for each pixel block, distance information from the imaging position of the image data to the subject shown in the pixel block, based on the parallax between the left-eye image and the right-eye image detected by the parallax detection unit 103.
  • The distance information can be calculated from the parallax value using the principle of triangulation, that is, the relationship between distance and parallax already described with reference to FIG. 4. <3.2.2 Blurred Image Generation Units 1002 and 1003>
  • The blurred image generation units 1002 and 1003 perform blurring processing on the left-eye image and the right-eye image based on the illegal parallax region, the transfer-source and transferred pixel regions in the transfer to the one-eye visible pixel region, and the distance information generated by the distance information generation unit 1001.
  • Specifically, 7×7 smoothing filter processing is performed on the illegal parallax region and on the transfer-source and transferred pixel regions in the transfer to the one-eye visible pixel region, and the other regions are blurred with an intensity based on their relative distance from a predetermined focus distance.
  • The focus distance may be the distance at the center position of the image, may be obtained from the autofocus operation of the camera, or may be the distance of a pixel region selected by the user.
  • The blurring process based on the distance information uses, for example, the technique described in Patent Document 1: a matrix corresponding to the lens point spread function (PSF) is selected according to the relative distance from the focus distance, and blur is added by using the selected matrix.
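The matrix-selection step can be sketched as follows. Patent Document 1 selects a matrix modeling the lens PSF; as a hedged stand-in, a flat (box) matrix is used here, and the size-versus-distance mapping is an assumption, since the text does not specify it.

```python
def psf_matrix(relative_distance, max_radius=3):
    """Select a smoothing matrix whose size grows with the relative distance
    from the focus distance.

    A flat (box) matrix is substituted for the lens PSF of Patent Document 1;
    the linear size-vs-distance mapping and the 7x7 cap (max_radius=3) are
    illustrative assumptions.
    """
    radius = min(int(abs(relative_distance)), max_radius)
    size = 2 * radius + 1
    weight = 1.0 / (size * size)  # normalized so the matrix sums to 1
    return [[weight] * size for _ in range(size)]
```

At the focus distance the matrix degenerates to 1×1 (no blur); far from it the matrix saturates at 7×7, matching the strongest smoothing used for the illegal parallax region.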
  • FIG. 11 is a diagram showing an example of the blur intensity when illegal parallax is detected at distances 0 to 1 and distances 5 to 6.
  • The blur intensity is maximum at distances 0 to 1 and 5 to 6, which are the illegal parallax areas, while in the range of distances 1 to 5 outside the illegal parallax areas the blur increases with the distance from the focus distance 3. As a result, the illegal parallax and the like are rendered as a more natural blurred image, preventing the observer from recognizing them.
  • The image processing apparatus according to the present embodiment is, like the above, an image processing apparatus that processes a plurality of images constituting stereoscopic viewing by binocular parallax, and performs the blur processing based on the distance information. In addition, a subject detection unit is provided to detect a pixel region (subject region) in which a predetermined subject is captured, and the blurring process is performed based on the illegal parallax area, the distance information, and the subject area. As a result, the subject of interest is not uniformly blurred according to its distance; instead, an image with high visibility of that subject is obtained while the viewer is still prevented from recognizing illegal parallax and the like.
  • FIG. 12 is a block diagram illustrating an example of the configuration of the image processing apparatus 1200.
  • The image processing apparatus 1200 includes image data input terminals 101 and 102, a parallax detection unit 103, a distance information generation unit 1001, a subject detection unit 1201, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1202 and 1203, and image data output terminals 107 and 108.
  • the subject detection unit 1201 detects a pixel region (subject region) where a predetermined subject is captured. In the present embodiment, the subject area is detected by template matching.
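The template-matching detection can be sketched as follows. This is a minimal pure-Python illustration, not the actual implementation: the text does not name a similarity measure, so normalized cross-correlation (NCC) is assumed here.

```python
def match_template(image, template, threshold=0.9):
    """Slide the template over the image (lists of pixel rows) and return the
    top-left (y, x) positions whose normalized cross-correlation with the
    template meets the similarity threshold.

    NCC is an assumed similarity measure; the patent only requires detecting
    a subject with a certain degree of similarity to the template image.
    """
    th, tw = len(template), len(template[0])
    t_mean = sum(map(sum, template)) / (th * tw)
    t = [[v - t_mean for v in row] for row in template]
    t_norm = sum(v * v for row in t for v in row) ** 0.5
    hits = []
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            patch = [row[x:x + tw] for row in image[y:y + th]]
            p_mean = sum(map(sum, patch)) / (th * tw)
            p = [[v - p_mean for v in row] for row in patch]
            p_norm = sum(v * v for row in p for v in row) ** 0.5
            denom = p_norm * t_norm
            if denom == 0:
                continue  # flat patch or flat template: similarity undefined
            score = sum(p[i][j] * t[i][j]
                        for i in range(th) for j in range(tw)) / denom
            if score >= threshold:
                hits.append((y, x))
    return hits
```

In practice a library routine such as OpenCV's `matchTemplate` would replace this double loop; the sketch only shows the mechanism of FIG. 13.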
  • FIG. 13 is a diagram illustrating the template matching mechanism. As shown in FIG. 13, a template image 1301 of a specific object such as a face or a person is prepared in advance. The subject area is then detected by finding, in the input image data 1302, a subject 1303 whose similarity to the template image 1301 exceeds a certain level.
<4.2.2 Blurred Image Generation Units 1202 and 1203>
  • The blurred image generation units 1202 and 1203 perform blurring on the left-eye image and the right-eye image based on the illegal parallax region, the transfer-source and transfer-destination pixel regions used in the transfer to the one-eye visible pixel region, the distance information, and the subject region detected by the subject detection unit 1201.
  • FIG. 14 is a diagram showing an example of the blur intensity when the illegal parallax region is detected at distances 0 to 1 and 5 to 6 and the subject of interest is detected at distances 2, 3, and 4.
  • By reducing the blur intensity in the distance range 2 to 4 where the subject of interest is located, the subject of interest is left unblurred and an image that does not impair its sharpness is obtained.
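The blur-intensity schedules of FIGS. 11 and 14 can be sketched together as follows. The exact curve shape is not specified in the text, so a linear ramp around the focus distance is assumed; the illegal parallax ranges, the focus distance, and the maximum intensity are example values matching the figures.

```python
def blur_intensity(distance, focus=3.0, illegal_ranges=((0, 1), (5, 6)),
                   subject_ranges=(), max_blur=3.0):
    """Illustrative blur-intensity schedule:
    - maximum blur inside the illegal parallax distance ranges,
    - zero blur at distances where the subject of interest sits,
    - otherwise blur growing with the offset from the focus distance.

    The linear ramp and max_blur cap are assumptions; only the ordering of
    the three cases comes from the text (FIGS. 11 and 14).
    """
    if any(lo <= distance <= hi for lo, hi in illegal_ranges):
        return max_blur
    if any(lo <= distance <= hi for lo, hi in subject_ranges):
        return 0.0
    return min(abs(distance - focus), max_blur)
```

With `subject_ranges=((2, 4),)` the schedule reproduces FIG. 14: full blur in the illegal ranges, no blur on the subject, and a ramp elsewhere.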
  • In this way, a pixel region (subject region) in which a predetermined subject is captured can be specified. By performing the blurring process based on the subject area in addition to the illegal parallax region and the distance information, the subject of interest is not uniformly blurred according to its distance, and the viewer is prevented from recognizing illegal parallax and the like as a blurred image while visibility of the subject remains high.
  • Like the embodiments above, the image processing apparatus according to this embodiment processes a plurality of images constituting stereoscopic vision by binocular parallax and uses the illegal parallax region and the like to reduce the visual fatigue and discomfort caused by illegal parallax. In addition, it is equipped with a high resolution conversion unit that converts input low-resolution image data into image data of the same resolution as the other, high-resolution image data. Detection of the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region, and the blurring processing according to Embodiments 1 to 4, are then performed on the image data after the high resolution conversion. As a result, the image processing according to Embodiments 1 to 4 can be applied to image data of different resolutions, and the visual fatigue and discomfort caused by illegal parallax can be reduced.
  • The image processing apparatus 1500 includes image data input terminals 1501 and 1502, a high resolution conversion unit 1503, a parallax detection unit 103, a distance information generation unit 1001, a subject detection unit 1201, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1504 and 1505, and image data output terminals 107 and 108.
  • The image data input terminal 1501 receives high-resolution image data of either the left-eye image or the right-eye image used for stereoscopic display. The image data input terminal 1502 receives low-resolution image data of a viewpoint different from that of the image input to the image data input terminal 1501.
  • The high resolution conversion unit 1503 converts the low-resolution image data input to the image data input terminal 1502 into image data having the same resolution as the high-resolution image data input to the image data input terminal 1501. Nearest neighbor interpolation is used to generate the high-resolution image from the low-resolution image; that is, resolution conversion is performed by assigning to each interpolation point the value of the nearest source pixel. Note that the high-resolution image may instead be generated by another resolution conversion technique such as bilinear interpolation.
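Nearest neighbor upscaling by an integer factor can be sketched as follows; the function name and the integer-factor restriction are illustrative simplifications (the unit itself only needs to match the two resolutions).

```python
def nearest_neighbour_upscale(image, scale):
    """Upscale a 2-D image (list of pixel rows) by an integer factor.

    Each output pixel takes the value of the nearest source pixel, i.e. the
    source pixel at (y // scale, x // scale) -- the "pixel value closest to
    the interpolation point" described in the text.
    """
    h, w = len(image), len(image[0])
    return [[image[y // scale][x // scale] for x in range(w * scale)]
            for y in range(h * scale)]
```

Bilinear interpolation would instead blend the four surrounding source pixels, trading blockiness for softness.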
  • By converting the low-resolution image data input to the image data input terminal 1502 into image data having the same resolution as the high-resolution image data input to the image data input terminal 1501, corresponding points between the left-eye image and the right-eye image can be searched for and the parallax can be detected. The illegal parallax region can then be detected from the parallax, the distance information can be calculated from the parallax, pixel blocks for which no parallax is calculated can be identified so that the one-eye visible pixel region can be detected, and the subject area can be detected by template matching.
  • The blurred image generation units 1504 and 1505 perform blurring on the high-resolution image data input to the image data input terminal 1501 and on the image data high-resolution-converted by the high resolution conversion unit 1503, based on the illegal parallax region, the transfer-source and transfer-destination pixel regions used in the transfer to the one-eye visible pixel region, the distance information, and the subject area.
  • As described above, according to the present embodiment, even when the input image data have different resolutions, the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region can be detected. By performing blurring based on the illegal parallax area, the transfer-source and transfer-destination pixel regions used in the transfer to the one-eye visible pixel region, the distance information, and the subject area, the illegal parallax is rendered as a so-called blurred image, and the visual fatigue and discomfort it causes can be reduced.
  • Furthermore, in a system including a plurality of cameras (for example, a stereo camera), some of the cameras can be low-performance cameras, which reduces cost.
<<Embodiment 6>>
<6.1 Outline>
  • Like the image processing apparatus according to Embodiment 5, the image processing apparatus according to Embodiment 6 can perform the image processing according to Embodiments 1 to 4 on input image data of different resolutions, but its implementation method differs.
  • Specifically, the input high-resolution image data is converted into image data of the same resolution as the other, low-resolution image data, and the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region are detected on the image data after the low resolution conversion. Each piece of information detected for the low-resolution image data is then converted into the corresponding information for the high-resolution image data to be blurred, and blurring is performed based on the converted information. Since the computationally heavy detection of the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region can be performed at the low-resolution image size, the processing load can be reduced.
  • FIG. 16 is a block diagram illustrating an example of the configuration of the image processing apparatus 1600. As shown in FIG. 16, the image processing apparatus 1600 includes image data input terminals 1501 and 1502, a low resolution conversion unit 1601, a parallax detection unit 103, a high resolution conversion unit 1602, a distance information generation unit 1001, a subject detection unit 1201, an illegal parallax detection unit 104, an interpolated image generation unit 701, a conversion unit 1603, blurred image generation units 1604 and 1605, and image data output terminals 107 and 108.
  • The low resolution conversion unit 1601 converts the high-resolution image data input to the image data input terminal 1501 into image data having the same resolution as the low-resolution image data input to the image data input terminal 1502. Nearest neighbor interpolation is used to generate the low-resolution image from the high-resolution image; that is, resolution conversion is performed by assigning to each interpolation point the value of the nearest source pixel. Note that the low-resolution image may instead be generated by another resolution conversion technique such as bilinear interpolation.
  • The high resolution conversion unit 1602 converts the low-resolution image data input to the image data input terminal 1502 into the high-resolution image data on which blurring is performed; specifically, into image data having the same resolution as the high-resolution image data input to the image data input terminal 1501.
  • The conversion unit 1603 converts each piece of information (illegal parallax region, one-eye visible pixel region, distance information, and subject region) detected for the low-resolution-converted high-resolution image data and for the low-resolution image data input to the image data input terminal 1502 into the corresponding information for the high-resolution image data input to the image data input terminal 1501 and for the high-resolution-converted low-resolution image data.
  • Specifically, the pixel interpolated at the time of resolution conversion is treated as the corresponding pixel, and each pixel of the high-resolution image data is assumed to carry the illegal parallax region, one-eye visible pixel region, distance information, and subject region information of the corresponding pixel of the low-resolution image data.
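The corresponding-pixel mapping can be sketched as follows: each high-resolution pixel inherits the per-pixel information of its nearest low-resolution pixel. The function name is illustrative, and the nearest-pixel correspondence mirrors the nearest neighbor interpolation the conversion units use.

```python
def map_info_to_high_res(info_low, high_h, high_w):
    """Propagate a per-pixel information map (illegal-parallax flag,
    distance value, subject flag, ...) from the low-resolution grid to a
    high_h x high_w grid via the corresponding (nearest) low-res pixel."""
    low_h, low_w = len(info_low), len(info_low[0])
    return [[info_low[y * low_h // high_h][x * low_w // high_w]
             for x in range(high_w)]
            for y in range(high_h)]
```

The same routine serves all four information maps, since they are all defined per pixel (or per pixel block) of the low-resolution image.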
  • The blurred image generation units 1604 and 1605 perform blurring on the high-resolution image data input to the image data input terminal 1501 and on the high-resolution-converted low-resolution image data, based on each piece of information (illegal parallax region, one-eye visible pixel region, distance information, and subject region) converted by the conversion unit 1603.
  • FIG. 17 is a flowchart showing the operation of the image processing apparatus 1600.
  • First, the low resolution conversion unit 1601 converts the high-resolution input image data input to the image data input terminal 1501 into image data of the same resolution as the low-resolution input image data input to the image data input terminal 1502 (step S1701). Next, the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region are detected from the low-resolution-converted high-resolution input image data and the low-resolution input image data (step S1702).
  • Next, the high resolution conversion unit 1602 converts the low-resolution input image data input to the image data input terminal 1502 into image data of the same resolution as the high-resolution input image data input to the image data input terminal 1501 (step S1703).
  • Next, the conversion unit 1603 converts each piece of information (illegal parallax region, one-eye visible pixel region, distance information, and subject region) detected in step S1702 for the low-resolution-converted high-resolution input image data into the corresponding information for the high-resolution input image data, and converts each piece of information detected for the low-resolution input image data into the corresponding information for the high-resolution-converted low-resolution input image data (step S1704).
  • Specifically, the pixel interpolated at the time of resolution conversion is treated as the corresponding pixel, and each pixel of the high-resolution image data is assumed to carry the illegal parallax region, one-eye visible pixel region, distance information, and subject region information of the corresponding pixel of the low-resolution image data.
  • In step S1704, the conversion unit 1603 also transfers the image region of the other image in the same image coordinate range to the converted one-eye visible pixel region.
  • In step S1705, the blurred image generation units 1604 and 1605 perform blurring based on each piece of information (illegal parallax region, one-eye visible pixel region, distance information, and subject region) converted by the conversion unit 1603.
  • Specifically, the blurred image generation unit 1604 performs blurring on the high-resolution input image data, and the blurred image generation unit 1605 performs blurring on the high-resolution-converted low-resolution input image data, each based on the illegal parallax region, one-eye visible pixel region, distance information, and subject region converted by the conversion unit 1603.
  • The operation of the image processing apparatus 1600 has been described above.
  • As described above, according to the present embodiment, the computationally heavy detection of the illegal parallax region, the one-eye visible pixel region, the distance information, and the subject region can be performed at the low-resolution image size, so the image processing according to Embodiments 1 to 4 can be executed at high speed and can be handled by a low-performance processor.
  • FIG. 18 is a block diagram illustrating an example of the configuration of the image processing apparatus 1800.
  • The image processing apparatus 1800 includes an image data input terminal 1801, a distance information input terminal 1802, a parallax calculation unit 1803, a parallax image generation unit 1804, an illegal parallax detection unit 104, a subject detection unit 1201, blurred image generation units 1805 and 1806, and image data output terminals 107 and 108.
<Image Data Input Terminal 1801, Distance Information Input Terminal 1802>
  • One piece of two-dimensional image data is input to the image data input terminal 1801.
  • the distance information input terminal 1802 receives distance information for each pixel block of the image data input to the image data input terminal 1801.
  • Here, the distance information refers to the distance from the imaging position of the image data to the subject appearing in the pixel block.
  • the distance information may be acquired by a distance sensor such as a TOF (Time Of Flight) type distance sensor. Further, the distance information may be acquired from the focus position of the lens during the autofocus operation of the camera.
  • the distance information may be acquired through an external recording device, broadcast radio waves, a network, or the like.
  • The parallax calculation unit 1803 calculates, for each pixel block, the horizontal shift length (parallax) used to create the parallax image, based on the distance information input from the distance information input terminal 1802.
  • The parallax can be calculated using the relationship between distance information and parallax described with reference to FIG.
<7.2.4 Parallax Image Generation Unit 1804>
  • the parallax image generation unit 1804 generates a parallax image by shifting each pixel block in the horizontal direction by the length of the parallax calculated by the parallax calculation unit 1803.
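The two steps above can be sketched as follows. Parameter names are hypothetical; only the triangulation relationship (parallax = focal length × baseline / distance) and the per-block horizontal shift come from the text, and the single-row shift with zero padding is a simplification of shifting a full pixel block.

```python
def parallax_from_distance(distance, focal_length_px, baseline):
    """Inverse of the triangulation relationship: parallax = f x B / Z,
    with the focal length in pixels and baseline/distance in the same unit."""
    return focal_length_px * baseline / distance

def shift_block_row(row, parallax_px, fill=0):
    """Shift one row of a pixel block horizontally by parallax_px pixels to
    synthesize the other-viewpoint row, padding the exposed edge with `fill`.
    (A real implementation must also handle occlusions and negative shifts.)"""
    p = int(round(parallax_px))
    if p <= 0:
        return list(row)
    return [fill] * p + list(row)[:-p]
```

A nearer subject yields a larger parallax, so its blocks are shifted further than background blocks, producing the depth-dependent displacement of the parallax image.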
  • FIG. 19 is a flowchart showing the operation of the image processing apparatus 1800.
  • the parallax calculation unit 1803 calculates the parallax based on the distance information input to the distance information input terminal 1802 (step S1901).
  • the parallax image generation unit 1804 generates a parallax image based on the parallax calculated by the parallax calculation unit 1803 (step S1902).
  • Next, the illegal parallax detection unit 104 detects the illegal parallax region, and the subject detection unit 1201 detects the subject region, for the input image data and the parallax image (step S1903).
  • Next, the blurred image generation unit 1805 performs blurring based on the illegal parallax region, the distance information, and the subject region (step S1904).
  • As described above, according to the present embodiment, a stereoscopic image is created from one piece of two-dimensional image data and the distance information corresponding to that image data, and blurring based on the illegal parallax area, the distance information, and the subject area is performed, so the visual fatigue and discomfort caused by illegal parallax can be reduced.
<<Embodiment 8>>
<8.1 Overview>
  • The image processing apparatus according to Embodiment 8 differs from the image processing apparatus according to Embodiment 7 in that it does not receive distance information; instead, the distance information is extracted from the single piece of input image data. As a result, the blurring process based on the illegal parallax area, the distance information, and the subject area can be performed from a single input image.
  • The image processing apparatus 2000 includes an image data input terminal 1801, a distance information extraction unit 2001, a parallax calculation unit 1803, an illegal parallax detection unit 104, a parallax image generation unit 1804, a subject detection unit 1201, blurred image generation units 1805 and 1806, and image data output terminals 107 and 108.
  • The distance information extraction unit 2001 extracts distance information from the two-dimensional image data input from the image data input terminal 1801. Specifically, the technique described in Non-Patent Document 2 is used: the image is first divided into sets of pixels called "superpixels" whose attributes such as color and brightness are highly homogeneous, and the distance from the observer is estimated by comparing each superpixel with its neighboring superpixels and analyzing changes in texture gradient and the like.
  • Embodiment 9 is an example of an imaging apparatus including any one of the image processing apparatuses according to Embodiments 1 to 8.
  • The imaging apparatus includes a camera and a user interface that allows selection/change of the parallax range to be recognized as illegal parallax, selection/change of the subject of interest, and selection/change of the blur intensity based on the distance information.
  • FIGS. 21, 22, and 23 are diagrams illustrating examples of the imaging apparatus according to the present embodiment: FIG. 21 shows a digital still camera, FIG. 22 shows a mobile terminal device, and FIG. 23 shows a stereo camera connected to a television.
  • The imaging device shown in each figure applies blurring, using any one of the image processing apparatuses according to Embodiments 1 to 8, to an image photographed by its camera or input from the outside, and shows the processed image on its display. It then receives user operations via a touch panel, a remote controller, or the like for selecting/changing the parallax range recognized as illegal parallax, selecting/changing the subject of interest, and selecting/changing the blur intensity based on the distance information, and updates the blurred image displayed on the display based on those operations. Next, the configuration of this imaging apparatus is described specifically.
  • FIG. 24 is a block diagram illustrating an example of a configuration of the imaging apparatus 2400.
  • the imaging apparatus 2400 includes an image signal processing unit 2401, a control unit 2402, camera units 2403 and 2404, image decoding units 2405 and 2406, image encoding units 2407 and 2408, an image recording unit 2409, A data transmission unit 2410, a 3D / 2D display unit 2411, a touch panel 2412, and a vibration unit 2413 are included.
  • The image signal processing unit 2401 includes any one of the image processing apparatuses according to Embodiments 1 to 8, and performs blurring on the image data captured by the cameras.
  • The control unit 2402 controls the operation of each component of the imaging device 2400.
  • The image decoding units 2405 and 2406 decode compressed image data in formats such as JPEG, MPEG, H.264, and FLASH video, and the image encoding units 2407 and 2408 compress image data into such formats.
  • the image recording unit 2409 stores the image data captured by the camera units 2403 and 2404 and the image data processed by the image signal processing unit 2401 in a recording medium such as a memory card, HDD, or optical disk.
  • The data transmission unit 2410 is a network transmitter/receiver and broadcast-wave tuner supporting computer data communication networks such as the Internet and intranets, mobile phone communication networks, and the like, and transmits the image data captured by the camera units 2403 and 2404 and the image data processed by the image signal processing unit 2401.
<3D/2D Display Unit 2411>
  • The 3D/2D display unit 2411 displays the image data captured by the camera units 2403 and 2404 and the images processed by the image signal processing unit 2401 on a liquid crystal display, a plasma display, an organic EL display, or the like.
<Touch Panel 2412>
  • The touch panel 2412 receives the user's operations for selecting/changing the parallax range to be recognized as illegal parallax, selecting/changing the subject of interest, and selecting/changing the blur intensity based on the distance information.
  • While viewing the image displayed on the 3D/2D display unit 2411 and the menu screen drawn by the control unit 2402, the user can intuitively issue operation instructions to the imaging device 2400 via the touch panel 2412.
  • The vibration unit 2413 performs an appropriate vibration operation in response to user operations on the touch panel 2412 or the like. In addition to intuitive on-screen operation, the user can perceive a clear tactile response from the image signal processing system through vibration.
  • FIG. 25 is a diagram illustrating an example of the display screen for selecting/changing the parallax range to be recognized as illegal parallax, selecting/changing the subject of interest, and selecting/changing the blur intensity based on the distance information. As shown in FIG. 25, the blurring process is started by touching the "Boke" (blur) portion.
  • The simple blurring process sets the center of the image as the focus position and adds blur whose strength corresponds to the distance from the image center; it can be processed at high speed.
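The center-focused strength rule can be sketched as follows. The linear scaling and the maximum strength are assumptions; the text only states that the strength corresponds to the length from the image center.

```python
def simple_blur_strength(x, y, width, height, max_strength=7.0):
    """Blur strength proportional to the distance of pixel (x, y) from the
    image center, which is treated as the focus position.

    The linear ramp and max_strength value are illustrative assumptions.
    """
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    r = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5       # distance from center
    r_max = (cx ** 2 + cy ** 2) ** 0.5               # corner distance
    return max_strength * r / r_max if r_max else 0.0
```

Because the rule needs no parallax, distance, or subject detection, it can run immediately and serve as the first preview while the heavier detections proceed in parallel.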
  • The processes of (a) simple blurring and display, (b) illegal parallax area detection, (c) distance information calculation, and (d) subject area detection are performed in parallel, and the blurred image is updated each time a piece of information is detected.
  • When the distance information calculation finishes, the user is requested to change/determine the blurring process based on the distance information, and the user can do so by the operation described later in, for example, section <9.3.3>.
  • When the blurring process based on the distance information is changed or determined, the blurred image displayed on the screen is updated.
  • Similarly, when the subject area detection finishes, the user is requested to change/determine the subject of interest, and the user can change/determine the subject area by the operation described later in, for example, section <9.3.1>.
  • When the subject area is changed or determined, the blurred image displayed on the screen is updated.
  • At this time, the vibration unit 2413 may be activated when the user touches the vicinity of a detected subject, so that the user can intuitively sense the subject through vibration.
  • When the subject of interest is selected or canceled, the blurring process is performed again based on the changed subject area. Because the user can thus select/cancel the subject of interest, an image with high visibility of the subject of interest is obtained.
<9.3.2 Illegal Parallax Range Change/Determination Operation>
  • Next, the operation for changing/determining the illegal parallax range is described. In FIG. 25, the "Option" portion allows the user to determine the parallax range to be recognized as illegal parallax.
  • FIG. 26 is a diagram showing a safety level change screen.
  • Three safety levels are provided. Each safety level has a parallax range to be recognized as illegal parallax, and the user can change the illegal parallax range by selecting a safety level.
  • For example, safety level 1 is defined so that parallax corresponding to a parallax angle of 1.0 or more or -1.0 or less is recognized as illegal parallax, and safety level 2 is defined so that parallax corresponding to a parallax angle of 1.2 or more or -1.2 or less is recognized as illegal parallax.
  • When the safety level is changed, the illegal parallax area is detected again, and blurring is applied to the changed illegal parallax area.
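The per-level thresholding can be sketched as follows. Only levels 1 and 2 are given numeric thresholds in the text; the function and table names are illustrative.

```python
# Parallax-angle limits per safety level, as stated in the text
# (level 1: +/-1.0, level 2: +/-1.2). The third level mentioned in
# the text has no stated threshold and is therefore omitted here.
SAFETY_LEVEL_THRESHOLDS = {1: 1.0, 2: 1.2}

def is_illegal_parallax(parallax_angle, safety_level):
    """A parallax is recognized as illegal when its parallax angle is at or
    beyond the +/- threshold of the selected safety level."""
    limit = SAFETY_LEVEL_THRESHOLDS[safety_level]
    return parallax_angle >= limit or parallax_angle <= -limit
```

Raising the safety level from 1 to 2 narrows the illegal range, so a parallax angle of 1.1 is flagged at level 1 but not at level 2.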
  • FIG. 27 is a diagram for explaining a blurring process change / decision operation based on distance information.
  • In the "Option" portion, the blurring process based on the distance information can also be changed. Specifically, the position of the vertex of the blur intensity curve shown in FIG. is changed by sliding the scroll bar.
  • FIG. 28 to FIG. 35 are diagrams showing a blurring process that changes by sliding the scroll bar.
  • In these figures, the portion displayed in white indicates a region where blurring based on the distance information is not applied; that is, the blurring process is performed around the subject portion displayed in white.
  • FIG. 36 is a diagram showing the blur intensity curve changed by the user operation. If the scrolling is stopped at the distance 3 position, the blur indicated by the blur intensity curve 3501 is added; if it is stopped at the distance 2 position, the blur indicated by the blur intensity curve 3502 is added.
  • the present invention may be an application execution method disclosed by the processing procedure described in each embodiment.
  • the present invention may be a computer program including program code that causes a computer to operate according to the processing procedure.
  • the present invention can also be implemented as an LSI that controls the application execution apparatus.
  • FIG. 36 shows an example in which the image processing apparatus according to the present invention is embodied by LSI.
  • CPU: central processing unit
  • DSP: digital signal processor
  • ENC: encoder/decoder
  • AIF: audio interface
  • VIF: video interface
  • PERI: control module for controlling peripheral devices
  • NIF: network interface
  • MIF: memory interface
  • RAM/ROM: random access memory / read-only memory
  • The name LSI is used here, but depending on the degree of integration, it may also be called IC, system LSI, super LSI, or ultra LSI.
  • the method of circuit integration is not limited to LSI, and implementation with a dedicated circuit or a general-purpose processor is also possible.
  • An FPGA (Field Programmable Gate Array), or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
  • In Embodiments 1 to 9, examples of image processing on two pieces of input image data have been described, but the input image data may be three or more multi-viewpoint images; similar image processing can be performed on three or more input images.
  • In the above embodiments, parallax detection is performed in units of 16×16-pixel blocks, but it may be performed in other pixel block units (for example, 32×32 pixels or 8×8 pixels), or on a pixel-by-pixel basis.
  • As described above, with the image processing device of the present invention, it is possible to identify pixel regions having parallax (illegal parallax) that may cause visual fatigue, discomfort, stereoscopic sickness, or the like in the observer, and to perform blurring on those regions. This renders the illegal parallax as a so-called blurred image that the viewer does not recognize as such, and reduces the visual fatigue and discomfort caused by illegal parallax.

Abstract

A parallax detection unit (103) detects the parallax between a left-eye image and a right-eye image input from image data input terminals (101, 102). An improper-parallax detection unit (104) compares the parallax detected by the parallax detection unit (103) with a prescribed threshold value, and evaluates from the result of the comparison whether the detected parallax is an improper parallax. Blurred image generation units (105, 106) execute blurring on pixel areas having improper parallax.

Description

Image processing apparatus, imaging apparatus, image processing method, and program
 The present invention belongs to the technical field of image blurring, and relates to improvements in image processing apparatuses mounted on 3D cameras, and in image processing programs and image processing methods used as accessories of 3D cameras.
 Image blurring refers to image processing that focuses on only some subjects in an image and intentionally blurs the remaining foreground and background. Such processing can make the main subject stand out and produce a sense of depth.
 A 3D camera is a camera that includes a pair consisting of a left-eye lens and left-eye optics and a pair consisting of a right-eye lens and right-eye optics, and that can capture left-eye image data and right-eye image data simultaneously. By reproducing the left-eye and right-eye image data on a display dedicated to stereoscopic viewing and watching them through liquid-crystal shutter glasses, the user can enjoy stereoscopic video.
 A known invention concerning image blurring is described in Patent Document 1, and a known invention concerning 3D cameras is described in Patent Document 2.
 Patent Document 1 discloses an image processing apparatus that selects a matrix according to the focal length f, the F-number, and the distance difference D of a lens, and adds blur by using the selected matrix.
 Patent Document 2 discloses a 3D camera that calculates the optimal convergence angle for the main subject and then performs the actual shot, so that the in-focus subject is sharp while everything else is blurred. In the 3D camera described in that document, the right-eye and left-eye image sensors are moved by a predetermined amount, and distance information is acquired by calculating the in-image displacement of the main subject as captured by the right-eye and left-eye image sensors before and after the movement.
 The movement of the image sensors and the acquisition of distance information are repeated until data from which the distance L from the digital camera to the main subject can be calculated is obtained. Here, the distance L is the distance between the baseline connecting the left-eye lens and the lens driving unit, and a line parallel to that baseline passing through the main subject. Once the acquisition of distance information is complete, the distance L from the 3D camera to the main subject is calculated by the principle of triangulation from the correlation between the movement amounts of the right-eye and left-eye image sensors and the in-image displacement of the main subject.
 Finally, the left-eye imaging optics and the right-eye imaging optics are driven by the left-eye and right-eye optical-system drive mechanisms so as to obtain the convergence angle optimal for the main subject located at the calculated distance L. Here, the convergence angle is the angle formed, during stereoscopic playback, between the line of sight of the right pupil and the line of sight of the left pupil; the convergence angle optimal for the distance L is the angle at which the intersection of the optical axes of the left-eye and right-eye imaging optics lies at the distance L from the midpoint of the baseline between the left-eye lens and the right-eye lens. The stereoscopic effect depends on the difference between the convergence angle during stereoscopic playback and that during planar playback (Non-Patent Document 3).
JP-A-9-181966; JP 2010-114577 A
 Because the 3D camera described in Patent Document 2 is premised on feedback control in which the in-image displacement of the main subject is calculated from the left-eye and right-eye images and the left-eye and right-eye image sensors are driven according to distance information based on that displacement, the convergence angle is guaranteed to be appropriate only for the main subject. There is no guarantee of what convergence angle will result for subjects other than the main subject (for example, when the main subject is a person, the sub-subjects appearing in that person's foreground or background). In a conventional 3D camera, the left-eye and right-eye imaging optics are entirely independent optical systems located some distance apart within the camera, so when the left-eye and right-eye images captured by them are compared, the position of a sub-subject in the foreground or background may differ greatly between the two images. Even if the convergence angle is correct for the main subject, attempting stereoscopic playback of images in which the position of a foreground or background sub-subject differs greatly between the left-eye and right-eye images can make the convergence angle for that sub-subject extremely large, because no adjustment, such as feedback control of the left-eye and right-eye image sensors, has been performed for it. Consequently, if the displacement of a foreground or background sub-subject between the left-eye and right-eye images is large, the stereoscopic effect for that sub-subject becomes extreme and induces 3D sickness in the user.
 The above concerns left-eye and right-eye images captured by the 3D camera of Patent Document 2; when the left-eye and right-eye images are pan-focus images, this 3D-sickness problem becomes even more pronounced. A pan-focus image is an image captured with a single-focal-length lens or a small-aperture lens, that is, a lens with a deep depth of field, so that all persons, objects, and so on in the image appear in focus. A pan-focus image is also often obtained when shooting with a digital camera's power-focus function (a function that, instead of extending the lens manually, automatically senses the movement of the subject through the viewfinder and, based on that movement information, extends the lens with a motor and interpolates the lens characteristics). In left-eye and right-eye images that are pan-focus images, the main subject and every sub-subject appearing in the image are all in focus, so the convergence angles for all of these subjects must be appropriate. With the technique of Patent Document 2, however, it is practically impossible to adjust the optical mechanism so that the convergence angle is correct for every subject; the convergence angle of some subject inevitably becomes improperly large, resulting in excessive 3D sickness.
 One might consider applying the blurring technique of Patent Document 1 to the left-eye and right-eye images so as to blur them according to a desired blur characteristic. However, as described above, a pair of left-eye and right-eye images obtained with a 3D camera comes from two entirely independent optical systems, so even if the left-eye and right-eye images are blurred individually according to a blur characteristic, it is not possible to strongly blur only a sub-subject whose displacement between the left-eye and right-eye images is large. The blurring technique of Patent Document 1 therefore cannot fully eliminate 3D sickness.
 The above concerns left-eye and right-eye images obtained with a 3D camera, but some pairs of left-eye and right-eye images are generated not by a 3D camera but from a planar (2D) image, as described in Non-Patent Document 2. Because such a pair is not obtained by photographing real objects, there is no guarantee that the displacement of any subject is appropriate. The techniques of Patent Documents 1 and 2 therefore cannot eliminate 3D sickness in a stereoscopic image generated from a planar image, and the prior art cannot be used for the purpose of uniformly obtaining a good stereoscopic image regardless of the type of image to be processed.
 An object of the present invention is to provide a processing apparatus that can achieve an appropriate stereoscopic effect uniformly, regardless of the type of left-eye and right-eye image, even when the position of a sub-subject in the foreground or background differs greatly between the left-eye image and the right-eye image.
 To achieve the above object, an image processing apparatus according to the present invention comprises: receiving means for receiving input of a plurality of image data; a parallax detection unit that, for each image data, detects parallax with respect to the other image data; detecting means for detecting, among the plurality of pixel regions of a pair of images subjected to parallax detection, those whose parallax is improper, as an improper-parallax region; and processing means for applying blurring to each pixel region constituting the improper-parallax region.
 The convergence angle depends on the difference between a pixel region in the image corresponding to one viewpoint and the corresponding pixel region in the image corresponding to another viewpoint, that is, on the parallax between regions of the plural image data. Because the pixel regions whose parallax is improper are detected and blurred, regions containing improper parallax can be strongly blurred, which reduces the visual fatigue and discomfort caused by improper parallax.
 In the present invention, pixel regions with improper parallax are detected and blurred, so good stereoscopic playback is possible even when the left-eye and right-eye images are pan-focus images. Stereoscopic shooting therefore no longer requires feedback control of the kind described in Patent Document 2, in which the left-eye and right-eye image sensors are driven to adjust the convergence angle, and a 3D camera capable of capturing stereoscopic images can be developed at low cost. If both the left-eye and right-eye images are pan-focus images, every person, building, tree, and road appearing in the image is given a sense of depth during stereoscopic playback, yielding an excellent stereoscopic impression.
 The present invention can process not only images and video captured by a 3D camera, but also image and video data obtained by converting 2D images and video using the prior art described in Non-Patent Documents 1 and 2. Therefore, when a 2D camera has an extended function for converting 2D images and video into stereoscopic images and video, mounting the processing apparatus according to the present invention in the 2D camera can increase the camera's added value.
 Furthermore, in the present invention, blurring can be applied uniformly to only those pixel regions that would cause 3D sickness, regardless of whether the image was captured by a 3D camera, is a pan-focus image, or was converted from a planar image.
FIG. 1 is a block diagram showing an example of the configuration of an image processing apparatus 100 according to Embodiment 1 of the present invention.
FIG. 2 is a diagram for explaining detection of the parallax between the left-eye and right-eye images.
FIG. 3 is a diagram for explaining the addition of blur to an improper-parallax region.
FIG. 4 is a diagram for explaining the relationship between distance and parallax.
FIG. 5 is a diagram showing the relationship between the blur intensity of the blurring processing by the image processing apparatus 100 and distance.
FIG. 6 is a flowchart showing the operation of the image processing apparatus 100.
FIG. 7 is a block diagram showing an example of the configuration of an image processing apparatus 700 according to Embodiment 2 of the present invention.
FIG. 8 is a diagram for explaining a pixel region visible to only one eye.
FIG. 9 is a flowchart showing the operation of the image processing apparatus 700.
FIG. 10 is a block diagram showing an example of the configuration of an image processing apparatus 1000 according to Embodiment 3 of the present invention.
FIG. 11 is a diagram showing the relationship between the blur intensity of the blurring processing by the image processing apparatus 1000 and distance.
FIG. 12 is a block diagram showing an example of the configuration of an image processing apparatus 1200 according to Embodiment 4 of the present invention.
FIG. 13 is a diagram for explaining detection of a subject region.
FIG. 14 is a diagram showing the relationship between the blur intensity of the blurring processing by the image processing apparatus 1200 and distance.
FIG. 15 is a block diagram showing an example of the configuration of an image processing apparatus 1500 according to Embodiment 5 of the present invention.
FIG. 16 is a block diagram showing an example of the configuration of an image processing apparatus 1600 according to Embodiment 6 of the present invention.
FIG. 17 is a flowchart showing the operation of the image processing apparatus 1600.
FIG. 18 is a block diagram showing an example of the configuration of an image processing apparatus 1800 according to Embodiment 7 of the present invention.
FIG. 19 is a flowchart showing the operation of the image processing apparatus 1800.
FIG. 20 is a block diagram showing an example of the configuration of an image processing apparatus 2000 according to Embodiment 8 of the present invention.
FIGS. 21 to 23 are diagrams each showing an example of how the imaging apparatus according to Embodiment 9 of the present invention is used.
FIG. 24 is a block diagram showing an example of the configuration of an imaging apparatus 2400 according to Embodiment 9 of the present invention.
FIG. 25 is a diagram showing an example of a user operation screen of the imaging apparatus 2400.
FIG. 26 is a diagram showing an example of a safety-level change screen.
FIG. 27 is a diagram showing an example of a change screen for the blurring processing based on distance information.
FIGS. 28 to 35 are diagrams each showing an example of the display screen when the blurring processing based on distance information is changed.
FIG. 36 is a diagram showing a blur-intensity curve changed by a user operation.
FIG. 37 is a diagram showing an example in which the image processing apparatus according to the present invention is realized as an LSI.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<< Embodiment 1 >>
<1.1 Overview>
The image processing apparatus according to Embodiment 1 compares the parallax between the left-eye image and the right-eye image used for stereoscopic display with a predetermined threshold and, based on the comparison result, determines whether that parallax is improper. Blurring is then applied to the pixel regions having improper parallax. By thus identifying pixel regions whose parallax can cause visual fatigue, discomfort, stereoscopic sickness, and the like in the observer, and adding blur to those regions, the visual fatigue and discomfort caused by improper parallax can be reduced.
<1.2 Configuration>
First, the configuration of the image processing apparatus 100 according to Embodiment 1 is described. FIG. 1 is a block diagram showing an example of the configuration of the image processing apparatus 100. As shown in FIG. 1, the image processing apparatus 100 comprises image data input terminals 101 and 102, a parallax detection unit 103, an improper-parallax detection unit 104, blurred-image generation units 105 and 106, and image data output terminals 107 and 108.
<1.2.1 Image Data Input Terminals 101 and 102>
The image data input terminals 101 and 102 receive the left-eye image and the right-eye image, respectively, used for stereoscopic display. The left-eye and right-eye images are images obtained by capturing a scene from different viewpoints; they may be, for example, image data captured by a stereo camera. They are not limited to photographed images and may be CG (computer graphics) created assuming different virtual viewpoints, and they may be still images or moving images consisting of a plurality of temporally consecutive still images.
<1.2.2 Parallax detection unit 103>
The parallax detection unit 103 detects the parallax between the left-eye and right-eye images input to the image data input terminals 101 and 102. Parallax detection is performed in units of 16 × 16-pixel blocks, and the number of horizontal pixels separating corresponding pixel blocks of the left-eye and right-eye images is detected as the parallax. The detection of the parallax between the left-eye and right-eye images is described in detail below with reference to FIG. 2.
 FIG. 2 illustrates detection of the parallax between the left-eye and right-eye images. First, detection of corresponding pixel blocks is described with reference to FIG. 2(a), which shows the left-eye and right-eye images divided into pixel blocks. Parallax detection is actually performed in units of 16 × 16-pixel blocks, but for clarity the figure is drawn with larger blocks. Pixel blocks A and A', B and B', and C and C' are corresponding pairs. A corresponding pixel block is found by taking the pixel block containing the pixel that corresponds to a pixel in a block of the left-eye image as the block corresponding to that left-eye block. The search for the corresponding pixel uses, for example, a block-matching method, in which the search is performed using the pixel-intensity pattern around the pixel for which a corresponding point is sought. Next, parallax detection is described with reference to FIGS. 2(b) and 2(c).
FIG. 2(b) illustrates the distance from each pixel block to the horizontal center of the image. dl(A), dl(B), and dl(C) denote the distances from pixel blocks A, B, and C of the left-eye image to the image center, and dr(A'), dr(B'), and dr(C') denote the distances from pixel blocks A', B', and C' of the right-eye image to the image center. Each distance is measured as the number of horizontal pixels between the pixel block and the image center, with the rightward direction on the screen taken as positive. The parallax is calculated from these distances. FIG. 2(c) shows the parallax of each pixel block: the parallax is obtained by subtracting the distance dr from the right-eye pixel block to the image center from the distance dl from the left-eye pixel block to the image center, that is, the parallax is dl - dr. For example, the parallax of pixel block A is dl(A) - dr(A') = 10. In this way, the parallax between the left-eye and right-eye images can be detected.
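The correspondence search and the dl - dr parallax computation described above can be sketched as follows. This is a minimal illustration under stated assumptions (a sum-of-absolute-differences matching cost, a fixed horizontal search range, and a synthetic test image), not the patent's actual implementation:

```python
import numpy as np

def block_disparity(left, right, y, x, block=16, search=32):
    """Find the horizontal parallax of the block x block patch whose
    top-left corner is (y, x) in the left image, by scanning the right
    image along the same row and minimizing the sum of absolute
    differences (SAD)."""
    patch = left[y:y + block, x:x + block].astype(np.int32)
    best_dx, best_cost = 0, None
    for dx in range(-search, search + 1):
        xr = x + dx
        if xr < 0 or xr + block > right.shape[1]:
            continue
        cand = right[y:y + block, xr:xr + block].astype(np.int32)
        cost = np.abs(patch - cand).sum()
        if best_cost is None or cost < best_cost:
            best_cost, best_dx = cost, dx
    # Following the text, parallax = dl - dr: a block sitting further
    # left in the right image (dx < 0) yields a positive parallax.
    return -best_dx

# Synthetic example: a square shifted 10 pixels to the left in the
# right-eye image should yield a parallax of +10.
left = np.zeros((64, 128), dtype=np.uint8)
right = np.zeros((64, 128), dtype=np.uint8)
left[16:32, 48:64] = 200   # block at x = 48 in the left image
right[16:32, 38:54] = 200  # same content at x = 38 in the right image
print(block_disparity(left, right, 16, 48))  # -> 10
```

In a full detector this search would be repeated for every 16 × 16 block of the left-eye image to build a per-block parallax map.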
<1.2.3 Improper-Parallax Detection Unit 104>
The improper-parallax detection unit 104 determines whether the parallax detected by the parallax detection unit 103 is improper. First, improper parallax is defined. It is generally known that stereoscopic images having excessively large positive or negative parallax can cause visual fatigue, discomfort, stereoscopic sickness, and so on; in the present invention, parallax that can cause such symptoms in the observer is called improper parallax. For example, the safety guidelines issued by the 3D Consortium (Non-Patent Document 3) recommend a parallax angle within 1 degree for comfortable stereoscopic viewing. Accordingly, between the upper limit H and lower limit L of the parallax range that does not cause visual fatigue, discomfort, stereoscopic sickness, etc. (the comfortable parallax range) and an improper parallax D, the relation D > H or D < L holds.
 Next, the determination of improper parallax is described. Whether a parallax is improper is determined by comparing it with predetermined threshold values, that is, with the upper limit H and the lower limit L of the comfortable parallax range. As the thresholds, the numbers of horizontal pixels based on the guideline values recommended by the 3D Consortium may be used. In this embodiment, to generate a more comfortable stereoscopic image, the upper limit of the comfortable parallax range is set to the number of horizontal pixels corresponding to a parallax angle of 1 degree, and the lower limit to the number of horizontal pixels corresponding to a parallax angle of 0 degrees. The number of horizontal pixels corresponding to a given parallax angle varies with the size of the display screen, the number of horizontal pixels of the image data, the distance from the display screen to the observer, and the observer's interpupillary distance.
For a typical 5-megapixel photograph (2592 × 1944 pixels) with an interpupillary distance E of 65 mm, the number of horizontal pixels corresponding to a parallax angle of 1 degree is 102 (see Non-Patent Document 3); under these viewing conditions, parallax > 102 pixels or parallax < 0 pixels is judged to be improper. For a 1920 × 1080 image or video (interpupillary distance E: 65 mm), the number of horizontal pixels corresponding to a parallax angle of 1 degree is 57 (see Non-Patent Document 3); under those viewing conditions, parallax > 57 pixels or parallax < 0 pixels is judged to be improper. Improper parallax can thus be determined.
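The threshold comparison above can be expressed as a small helper function. This is a sketch; the function name and structure are assumptions, with the 102-pixel and 57-pixel upper bounds taken from the viewing conditions stated in the text:

```python
def is_improper(parallax_px, upper=102, lower=0):
    """Return True when a block parallax (in horizontal pixels) falls
    outside the comfortable range lower <= parallax <= upper.
    The defaults follow the text's example for a 2592x1944 photograph
    viewed under the stated conditions (1-degree angle ~ 102 px)."""
    return parallax_px > upper or parallax_px < lower

# Example block parallaxes in pixels under the photograph conditions:
for p in (10, 120, -3):
    print(p, is_improper(p))        # -> 10 False / 120 True / -3 True

# For 1920x1080 video the text's upper bound would instead be 57 px:
print(is_improper(60, upper=57))    # -> True
```

The per-block results of such a check form the improper-parallax region passed to the blurred-image generation units.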
<1.2.4 Blur Image Generation Units 105 and 106>
The blurred image generation unit 105 applies a blurring process to the pixel blocks with illegal parallax in the image input from the image data input terminal 101, and the blurred image generation unit 106 applies a blurring process to the pixel blocks with illegal parallax in the image input from the image data input terminal 102. The blurring process uses, for example, a 7 × 7 smoothing filter. The generation of a blurred image is described concretely below with reference to FIG. 3.
 FIG. 3 illustrates the addition of blur to an illegal parallax region with negative parallax. FIG. 3(a) shows such a region: the hatched portion is the set of pixel blocks judged to be an illegal parallax region. In this example the tree has been judged to be an illegal parallax region, and blur is added to it; that is, a 7 × 7 smoothing filter is applied to the pixels that make up the tree. FIG. 3(b) illustrates viewing of the stereoscopic image after the blurring process. When the left-eye image is seen by the left eye and the right-eye image by the right eye through 3D glasses or the like, the stereoscopic image shown in the figure is perceived, with the tree, i.e. the illegal parallax region, fully blurred. As described above, by blurring each pixel region that makes up an illegal parallax region, the blurred image generation units 105 and 106 can reduce the visual fatigue and discomfort caused by illegal parallax.
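A blurring step of this kind might be sketched as below. A plain 7 × 7 mean (box) filter stands in for the smoothing filter, the mask marks the pixel blocks judged illegal, and border pixels are averaged over the part of the window that lies inside the image; the function and variable names are illustrative.

```python
def blur_masked(image, mask, k=7):
    """Apply a k x k mean filter to pixels where mask is True.

    image -- 2D list of grayscale values
    mask  -- 2D list of booleans (True = inside an illegal parallax region)
    """
    h, w = len(image), len(image[0])
    r = k // 2
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue  # leave comfortable-parallax pixels untouched
            window = [image[j][i]
                      for j in range(max(0, y - r), min(h, y + r + 1))
                      for i in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = sum(window) / len(window)
    return out
```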
 Note that the blurring process of this embodiment does more than simply hide the illegal parallax region. Simply hiding or removing the illegal parallax would produce an image that looks unnatural to the viewer. Instead, the blurring process renders the region as a pleasingly defocused part of the image, so that the illegal parallax is not perceived by the viewer. This point is explained below.
 First, the relationship between parallax and the distance from the imaging position of the image data to the subject is explained with reference to FIG. 4, which illustrates this relationship. Here a is the inter-camera distance (baseline length), F is the focal length of the cameras, dl is the distance from the image center to the corresponding pixel C in the left-eye image, dr is the distance from the image center to the corresponding pixel C′ in the right-eye image, and x is the distance from the imaging position to the subject appearing at the corresponding pixels. Since the parallax d is the displacement between the left-eye and right-eye images, it equals the distance from the image center to C in the left-eye image minus the distance from the image center to C′ in the right-eye image, i.e. d = dl − dr. The length between C and C′ is then a − dl + dr = a − d. From the similarity of triangles DCC′ and DAB, a : x = (a − d) : (x − F), which rearranges to x = aF/d. The distance is thus inversely proportional to the parallax. This completes the description of the relationship between distance and parallax. Based on this relationship, the relationship between the blurring process and distance is described next.
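Under the geometry above, the distance recovery can be sketched directly; the baseline, focal length, and parallax values below are illustrative, not taken from the embodiment.

```python
def distance_from_parallax(a, F, d):
    """x = a*F/d: subject distance from baseline a, focal length F, parallax d."""
    if d == 0:
        return float("inf")  # zero parallax: subject at infinity
    return a * F / d

# doubling the parallax halves the estimated distance (inverse proportionality)
x1 = distance_from_parallax(a=65.0, F=20.0, d=10.0)  # 130.0
x2 = distance_from_parallax(a=65.0, F=20.0, d=20.0)  # 65.0
```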
 FIG. 5 shows the relationship between the strength of the added blur and distance. Consider a virtual space spanning distances 0 to 6 in which the upper limit H of the comfortable parallax corresponds to distance 1 and the lower limit L to distance 5. Because parallax and distance are directly related through x = aF/d, the illegal parallax ranges map to the distance ranges 0 to 1 and 5 to 6. As shown in FIG. 5, blurring is therefore applied to the near view (distances 0 to 1) and the far view (distances 5 to 6). Blurring the illegal parallax regions thus amounts to blurring the near and far views, so that the illegal parallax is rendered as a pleasingly defocused image that the viewer does not perceive as such. Moreover, since everything outside the comfortable parallax range is blurred, subjects within the comfortable range are perceived by the viewer with an enhanced sense of depth.
By identifying the illegal parallax region and adding blur in this way, the illegal parallax can be rendered as a pleasingly defocused image that the viewer does not perceive, reducing the visual fatigue and discomfort it would otherwise cause.
<1.2.5 Image Data Output Terminals 107 and 108>
The image data output terminal 107 outputs the blurred image generated by the blurred image generation unit 105, and the image data output terminal 108 outputs the blurred image generated by the blurred image generation unit 106. This concludes the description of the configuration of the image processing apparatus 100. Next, its operation is described.
<1.3 Operation>
FIG. 6 is a flowchart showing the operation of the image processing apparatus 100. As shown in the figure, the parallax detection unit 103 first searches for the pixel block of the right-eye image that corresponds to a pixel block of the left-eye image (step S601). Specifically, a corresponding-point search on the right-eye image finds the pixel corresponding to one pixel belonging to the left-eye pixel block, and the pixel block containing that corresponding pixel is taken as the block corresponding to the left-eye pixel block. Next, for each of the left-eye and right-eye images, the parallax detection unit 103 calculates the number of pixels between the pixel block and the image center (step S602). It then subtracts dr, the pixel count from the right-eye block to the image center, from dl, the pixel count from the left-eye block to the image center, to obtain the parallax in pixels (step S603). The illegal parallax detection unit 104 judges whether the parallax calculated by the parallax detection unit 103 is illegal (step S604) by comparing it with the predetermined thresholds (the upper and lower limits of the comfortable parallax range). Steps S601 to S604 are repeated for every pixel block. After all pixel blocks have been processed, the blurred image generation units 105 and 106 apply blurring to the pixel blocks with illegal parallax (the illegal parallax region) (step S605). This concludes the description of the operation of the image processing apparatus 100.
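The flow of steps S601 to S604 might be sketched as below. The corresponding-block search is stubbed out as a lookup supplied by the caller, since the embodiment's actual matcher is a corresponding-point search, and all names are illustrative.

```python
def detect_illegal_blocks(left_blocks, match, image_center_x, upper, lower=0):
    """Steps S601-S604: flag left-image pixel blocks whose parallax is illegal.

    left_blocks    -- list of block x-coordinates in the left-eye image
    match          -- callable mapping a left block x to the matched right
                     block x (stand-in for the corresponding-point search)
    image_center_x -- x-coordinate of the image center
    upper, lower   -- comfortable-parallax limits H and L in pixels
    """
    illegal = []
    for xl in left_blocks:
        xr = match(xl)                  # S601: corresponding block
        dl = xl - image_center_x        # S602: offsets from the image center
        dr = xr - image_center_x
        d = dl - dr                     # S603: parallax in pixels
        if d > upper or d < lower:      # S604: threshold comparison
            illegal.append(xl)
    return illegal                      # S605 would then blur these blocks
```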
As described above, according to this embodiment, regions of a stereoscopic image whose parallax can cause the viewer visual fatigue, discomfort, stereoscopic sickness, and the like are identified and blurred, so that the illegal parallax is rendered as a pleasingly defocused image the viewer does not perceive, reducing the visual fatigue and discomfort caused by the illegal parallax.
<< Embodiment 2 >>
<2.1 Overview>
Like the image processing apparatus 100 according to Embodiment 1, the image processing apparatus according to Embodiment 2 processes a plurality of images that together form a stereoscopic view based on binocular parallax, and blurs illegal parallax regions so that the illegal parallax is rendered as a pleasingly defocused image the viewer does not perceive, reducing the resulting visual fatigue and discomfort. In addition, it handles image regions that appear in only one of the left-eye and right-eye images (one-eye visible pixel regions): being visible to one eye only, such regions cause flicker during stereoscopic playback and can cause the viewer visual fatigue, discomfort, stereoscopic sickness, and the like. For each one-eye visible pixel region, the apparatus transfers (overwrites onto it) the region of the other image occupying the same image coordinate range, and then blurs both the transfer-source pixel region and the transferred pixel region. As a result, the differences in color and luminance are rendered as a pleasingly defocused image the viewer does not perceive, reducing the visual fatigue and discomfort caused by regions that appear in only one of the left-eye and right-eye images.
<2.2 Configuration>
First, the configuration of the image processing apparatus 700 according to Embodiment 2 is described. FIG. 7 is a block diagram showing an example of this configuration. Parts identical to those of the image processing apparatus 100 according to Embodiment 1 shown in FIG. 1 are given the same reference signs and their description is omitted; only the differing parts are described. As shown in FIG. 7, the image processing apparatus 700 comprises image data input terminals 101 and 102, a parallax detection unit 103, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 702 and 703, and image data output terminals 107 and 108.
<2.2.1 Interpolated Image Generation Unit 701>
When a region appears in only one of the two images, the viewer perceives a difference in color and luminance between the eyes during stereoscopic viewing; the resulting flicker can cause visual fatigue, discomfort, stereoscopic sickness, and the like. The interpolated image generation unit 701 identifies each such one-eye visible pixel region and transfers (overwrites onto it) the region of the other image occupying the same image coordinate range. This is explained concretely with reference to FIG. 8, which shows a one-eye visible pixel region; such regions lie at the edges of the image data. As shown in FIG. 8(a), for the hatched pixel region of the left-eye image, for example, no corresponding pixel points are detected in the right-eye image and no parallax is calculated. A pixel block region for which no parallax is calculated in this way is identified as a one-eye visible pixel region. The hatched pixel region of the right-eye image occupying the same image coordinate range is then transferred (overwritten) onto the hatched pixel region of the left-eye image. Since the one-eye visible pixel region lies at the edge of the image data, the overwritten pixels are likewise edge pixels. FIG. 8(b) shows the left-eye image after the interpolated image has been generated: the hatched pixels of the right-eye image are displayed over the one-eye visible pixel region of the left-eye image.
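The transfer step might be sketched as follows; pixel rows are plain lists, the mask marks the blocks for which no parallax was calculated, and the names are illustrative.

```python
def transfer_one_eye_regions(target, source, one_eye_mask):
    """Overwrite one-eye visible pixels of `target` with the `source`
    pixels at the same image coordinates (FIG. 8(a) -> FIG. 8(b))."""
    out = [row[:] for row in target]
    for y, row in enumerate(one_eye_mask):
        for x, only_one_eye in enumerate(row):
            if only_one_eye:          # no parallax was calculated here
                out[y][x] = source[y][x]
    return out
```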

<2.2.2 Blur Image Generation Units 702 and 703>
In addition to the pixel blocks with illegal parallax (the illegal parallax region), the blurred image generation units 702 and 703 blur the transfer-source pixel region and the transferred pixel region. As a result, the differences in color and luminance are no longer perceived by the viewer. Moreover, since the one-eye visible pixel region lies at the edge of the image data, rendering it as a pleasingly defocused area removes any sense of unnaturalness for the viewer. This concludes the description of the configuration of the image processing apparatus 700. Next, its operation is described.
<2.3 Operation>
FIG. 9 is a flowchart showing the operation of the image processing apparatus 700. As shown in the figure, the parallax detection unit 103 first searches for the pixel block of the right-eye image that corresponds to a pixel block of the left-eye image (step S901): a corresponding-point search on the right-eye image finds the pixel corresponding to one pixel belonging to the left-eye pixel block, and the pixel block containing that corresponding pixel is taken as the corresponding block. It is then judged whether a corresponding pixel point was detected in step S901 (step S902). If so (step S902: YES), the parallax detection unit 103 calculates, for each of the left-eye and right-eye images, the number of pixels between the pixel block and the image center (step S903), and subtracts dr, the pixel count from the right-eye block to the image center, from dl, the pixel count from the left-eye block to the image center, to obtain the parallax in pixels (step S904). The illegal parallax detection unit 104 judges whether this parallax is illegal (step S905) by comparing it with the predetermined thresholds (the upper and lower limits of the comfortable parallax range). Steps S901 to S905 are repeated for every pixel block.
After all pixel blocks have been processed, the interpolated image generation unit 701 identifies the pixel blocks for which no parallax information was calculated as one-eye visible pixel regions (step S906) and transfers (overwrites onto each) the region of the other image occupying the same image coordinate range (step S907). After this interpolated image generation, the blurred image generation units 702 and 703 blur the illegal parallax regions, the transfer-source pixel regions, and the transferred pixel regions of the left-eye and right-eye images, respectively (step S908). This concludes the description of the operation of the image processing apparatus 700.
As described above, according to this embodiment, image regions that exist in only one of the left-eye and right-eye images (one-eye visible pixel regions) are identified, and the region of the other image occupying the same image coordinate range is transferred (overwritten) onto each. By blurring the transfer-source and transferred pixel regions in addition to the illegal parallax regions whose parallax can cause the viewer visual fatigue, discomfort, stereoscopic sickness, and the like, a pleasingly defocused image is obtained that removes any sense of unnaturalness for the viewer and reduces the visual fatigue and discomfort caused by regions appearing in only one of the left-eye and right-eye images.
<< Embodiment 3 >>
<3.1 Overview>
Like the image processing apparatus 700 according to Embodiment 2, the image processing apparatus according to Embodiment 3 processes a plurality of images that together form a stereoscopic view based on binocular parallax, and blurs illegal parallax regions so that illegal parallax and the like are rendered as a pleasingly defocused image the viewer does not perceive, reducing the resulting visual fatigue and discomfort. In addition, it includes a distance information generation unit that, based on the parallax between the left-eye and right-eye images, generates for each pixel block distance information from the imaging position of the image data to the subject appearing in that block. Blurring is then performed based on both the illegal parallax regions and the distance information, so that illegal parallax and the like can be hidden within an image whose defocus looks even more natural.
<3.2 Configuration>
FIG. 10 is a block diagram showing an example of the configuration of the image processing apparatus 1000. Parts identical to those of the image processing apparatus 700 according to Embodiment 2 shown in FIG. 7 are given the same reference signs and their description is omitted; only the differing parts are described. As shown in FIG. 10, the image processing apparatus 1000 comprises image data input terminals 101 and 102, a parallax detection unit 103, a distance information generation unit 1001, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1002 and 1003, and image data output terminals 107 and 108.
<3.2.1 Distance Information Generation Unit 1001>
Based on the parallax between the left-eye and right-eye images detected by the parallax detection unit 103, the distance information generation unit 1001 generates, for each pixel block, distance information from the imaging position of the image data to the subject appearing in that block. The distance information can be calculated from the parallax value using the principle of triangulation, i.e. the relationship between distance and parallax already explained with reference to FIG. 4.
<3.2.2 Blur Image Generation Units 1002 and 1003>
The blurred image generation units 1002 and 1003 blur the left-eye and right-eye images based on the illegal parallax regions, the transfer-source and transferred pixel regions of the transfer to one-eye visible pixel regions, and the distance information generated by the distance information generation unit 1001. Specifically, a 7 × 7 smoothing filter is applied to the illegal parallax regions and to the transfer-source and transferred pixel regions, while the remaining regions are blurred with a strength based on their relative distance from a predetermined focus distance. The focus distance may be the distance at the center position of the image, may be obtained from the autofocus operation of the camera, or may be the distance of a pixel region selected by the user.
 The distance-based blurring uses, for example, the technique described in Patent Document 1: a matrix corresponding to the point spread function (PSF) of a lens is selected according to the relative distance from the focus distance, and the blur is applied using the selected matrix.
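One way to sketch this selection step, with illustrative kernel widths standing in for the PSF matrices of Patent Document 1, is:

```python
def select_psf_kernel_size(distance, focus_distance, sizes=(1, 3, 5, 7)):
    """Pick a smoothing-kernel width from the relative distance to the
    focus plane; larger offsets get a wider (stronger) kernel.

    sizes -- candidate kernel widths in order of increasing blur,
             stand-ins for the PSF matrices of Patent Document 1.
    """
    offset = abs(distance - focus_distance)
    index = min(int(offset), len(sizes) - 1)  # one distance unit per step
    return sizes[index]
```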
 FIG. 11 shows an example of the blur strength when illegal parallax is detected at distances 0 to 1 and 5 to 6. As shown in FIG. 11, the blur strength is maximal at distances 0 to 1 and 5 to 6, the illegal parallax regions, while over the remaining range of distances 1 to 5 it follows an inverse-hyperbolic curve centered on the focus distance 3. This allows illegal parallax and the like to be hidden within an image whose defocus looks even more natural.
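A profile of this shape could be sketched as below, using 1 − 1/(1 + |x − focus|) as an illustrative stand-in for the inverse-hyperbolic curve of FIG. 11 and clamping the illegal ranges to the maximum strength:

```python
def blur_strength(distance, focus=3.0, near_limit=1.0, far_limit=5.0):
    """Blur strength in [0, 1]: maximal in the illegal ranges
    (distance < near_limit or distance > far_limit), and rising
    away from the focus distance in between (cf. FIG. 11)."""
    if distance < near_limit or distance > far_limit:
        return 1.0  # illegal parallax region: full-strength blur
    return 1.0 - 1.0 / (1.0 + abs(distance - focus))
```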
As described above, according to this embodiment, distance information from the imaging position of the image data to the subject appearing in each pixel block is generated per block. In addition to blurring the illegal parallax regions, the remaining regions are given blur whose strength follows the distance information, so that illegal parallax and the like can be hidden within an image whose defocus looks even more natural.
<< Embodiment 4 >>
<4.1 Overview>
Like the image processing apparatus according to Embodiment 3, the image processing apparatus according to Embodiment 4 processes a plurality of images that together form a stereoscopic view based on binocular parallax and performs blurring based on the illegal parallax regions and the distance information. In addition to the configuration of Embodiment 3, it includes a subject detection unit that detects pixel regions in which a predetermined subject appears (subject regions), and the blurring is performed based on the illegal parallax regions, the distance information, and the subject regions. By taking the subject regions into account in addition to the distance information, a subject of interest is not rendered indistinct by blur applied uniformly according to its distance; instead, the result is a pleasingly defocused image in which the subject remains clearly visible while illegal parallax and the like are not perceived by the viewer.
<4.2 Configuration>
FIG. 12 is a block diagram showing an example of the configuration of the image processing apparatus 1200. Parts identical to those of the image processing apparatus 1000 shown in FIG. 10 are given the same reference signs and their description is omitted; only the differing parts are described. As shown in FIG. 12, the image processing apparatus 1200 comprises image data input terminals 101 and 102, a parallax detection unit 103, a distance information generation unit 1001, a subject detection unit 1201, an illegal parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1202 and 1203, and image data output terminals 107 and 108.
<4.2.1 Subject Detection Unit 1201>
The subject detection unit 1201 detects pixel regions in which a predetermined subject appears (subject regions). In this embodiment, subject regions are detected by template matching. FIG. 13 illustrates the template-matching mechanism: a specific object such as a face or a person is prepared in advance as a template image 1301, and the subject region is detected by comparing the template image 1301 against the input image data 1302 and detecting a subject 1303 whose similarity to the template exceeds a fixed level.
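Template matching of this kind might be sketched as below, using the sum of absolute differences as an illustrative similarity measure; the embodiment only requires that the similarity to the template exceed a fixed level.

```python
def match_template(image, template, max_sad):
    """Slide `template` over `image`; return (y, x) positions whose sum
    of absolute differences is at most max_sad, i.e. whose similarity
    to the template is high enough."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    hits = []
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            sad = sum(abs(image[y + j][x + i] - template[j][i])
                      for j in range(th) for i in range(tw))
            if sad <= max_sad:
                hits.append((y, x))
    return hits
```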
<4.2.2 Blur Image Generation Units 1202 and 1203>
Based on the illegal parallax regions, the transfer-source and transferred pixel regions of the transfer to one-eye visible pixel regions, the distance information, and the subject regions detected by the subject detection unit 1201, the blurred image generation units 1202 and 1203 blur the left-eye and right-eye images. Specifically, a 7 × 7 smoothing filter is applied to the illegal parallax regions and to the transfer-source and transferred pixel regions, while the remaining regions are blurred based on the distance information and the subject regions. FIG. 14 shows an example of the blur strength when illegal parallax regions are detected at distances 0 to 1 and 5 to 6 and the subject of interest is detected at distances 2, 3, and 4. As shown in the figure, by lowering the blur strength over the interval of distances 2 to 4 where the subject of interest is located, the subject of interest is left unblurred and an image that does not lose its sharpness is obtained.
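The FIG. 14 profile might be sketched by extending the distance-based strength with a subject-interval override; the curve shape and the numeric ranges below are illustrative.

```python
def blur_strength_with_subject(distance, focus=3.0, near_limit=1.0,
                               far_limit=5.0, subject_range=(2.0, 4.0)):
    """Distance-based blur strength, forced to zero over the interval
    where the subject of interest was detected (cf. FIG. 14)."""
    if distance < near_limit or distance > far_limit:
        return 1.0  # illegal parallax region: full-strength blur
    lo, hi = subject_range
    if lo <= distance <= hi:
        return 0.0  # subject of interest: keep it sharp
    return 1.0 - 1.0 / (1.0 + abs(distance - focus))
```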
As described above, according to the present embodiment, a pixel region (subject region) in which a predetermined subject is captured can be specified. By performing the blurring processing based on the subject region in addition to the incorrect parallax region and the distance information, the subject of interest is not uniformly blurred according to its distance; instead, the result is an image with a pleasing bokeh in which the subject remains clearly visible, while the incorrect parallax and the like are kept from being perceived by the observer.
<< Embodiment 5 >>
<5.1 Overview>
Similar to the image processing apparatus according to the fourth embodiment, the image processing apparatus according to the fifth embodiment processes a plurality of images that constitute a stereoscopic view based on binocular parallax, and reduces the visual fatigue and discomfort caused by incorrect parallax by performing blurring processing based on the incorrect parallax region and the like. It differs in that it includes a high-resolution conversion unit, which converts input low-resolution image data into image data having the same resolution as the other, high-resolution image data. The image processing according to Embodiments 1 to 4, such as detection of the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region, as well as the blurring processing, is then performed on the image data after the high-resolution conversion. As a result, the image processing according to Embodiments 1 to 4 can be performed on image data of different resolutions, and the visual fatigue and discomfort caused by incorrect parallax can be reduced.
<5.2 Configuration>
First, the configuration of the image processing apparatus according to the fifth embodiment will be described. Note that portions that are the same as those of the configuration of the image processing apparatus 1200 according to Embodiment 4 illustrated in FIG. 12 are denoted by the same reference numerals, description thereof is omitted, and only the differing portions are described. FIG. 15 is a block diagram illustrating an example of the configuration of the image processing apparatus 1500. As shown in FIG. 15, the image processing apparatus 1500 includes image data input terminals 1501 and 1502, a high-resolution conversion unit 1503, a parallax detection unit 103, a distance information generation unit 1001, a subject detection unit 1201, an incorrect parallax detection unit 104, an interpolated image generation unit 701, blurred image generation units 1504 and 1505, and image data output terminals 107 and 108.
<5.2.1 Image Data Input Terminals 1501 and 1502>
High-resolution image data of one of the left-eye image and the right-eye image used for stereoscopic display is input to the image data input terminal 1501. Low-resolution image data of an image captured from a viewpoint different from that of the image input to the image data input terminal 1501 is input to the image data input terminal 1502.
<5.2.2 High Resolution Converter 1503>
The high-resolution conversion unit 1503 converts the low-resolution image data input to the image data input terminal 1502 into image data having the same resolution as the high-resolution image data input to the image data input terminal 1501. Nearest-neighbor interpolation is used to generate the high-resolution image from the low-resolution image; that is, the high-resolution conversion is performed by arranging, at each interpolation point, the value of the nearest source pixel. Note that the high-resolution image may instead be generated by another resolution conversion technique such as bilinear interpolation. Converting the low-resolution image data input to the image data input terminal 1502 into image data having the same resolution as the high-resolution image data input to the image data input terminal 1501 makes it possible to search for corresponding points between the left-eye image and the right-eye image and thus to detect the parallax. The incorrect parallax region can then be detected from the parallax, distance information can be calculated from the parallax, pixel blocks for which no parallax information has been calculated can be identified so that the one-eye visible pixel region can be detected, and the subject region can be detected by template matching.
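The nearest-neighbor conversion described above can be sketched in a few lines. This is an illustrative implementation over 2-D lists, assuming integer target dimensions; the floor-based index mapping is one common way of picking the nearest source pixel:

```python
def upscale_nearest(image, dst_h, dst_w):
    """Nearest-neighbour interpolation: each destination pixel copies
    the source pixel closest to its interpolation point."""
    src_h, src_w = len(image), len(image[0])
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h                       # nearest source row
        row = [image[sy][x * src_w // dst_w]          # nearest source column
               for x in range(dst_w)]
        out.append(row)
    return out
```

For example, doubling a 2 × 2 image simply replicates each pixel into a 2 × 2 block; bilinear interpolation would instead blend the neighbouring source pixels.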
<5.2.3 Blur Image Generation Units 1504 and 1505>
The blurred image generation units 1504 and 1505 perform blurring processing on the high-resolution image data input to the image data input terminal 1501 and on the image data that has undergone high-resolution conversion by the high-resolution conversion unit 1503, based on the incorrect parallax region, the transfer source pixel region and transferred pixel region used in the transfer to the one-eye visible pixel region, the distance information, and the subject region.
As described above, according to the present embodiment, even when the input image data have different resolutions, the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region can be detected. By performing blurring processing based on the incorrect parallax region, the transfer source pixel region and transferred pixel region used in the transfer to the one-eye visible pixel region, the distance information, and the subject region, the visual fatigue and discomfort caused by incorrect parallax can be reduced while producing a so-called bokeh image. In particular, in a system including the image processing apparatus according to the present embodiment (for example, a stereo camera), some of the system's cameras can be low-performance cameras, so that the cost can be reduced.
<< Embodiment 6 >>
<6.1 Outline>
Similar to the image processing apparatus according to the fifth embodiment, the image processing apparatus according to the sixth embodiment can perform the image processing according to Embodiments 1 to 4 on input image data having different resolutions, but it achieves this in a different way.
The input high-resolution image data is converted into image data having the same resolution as the other, low-resolution image data, and the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region are detected on the image data after the low-resolution conversion. The incorrect parallax region, one-eye visible pixel region, distance information, and subject region detected on the low-resolution image data are then converted into the corresponding information for the high-resolution image data to be blurred, and the blurring processing is performed based on that information. Since the computationally heavy detection of the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region can be performed at the low-resolution image size, the processing load can be reduced.
<6.2 Configuration>
First, the configuration of the image processing apparatus 1600 according to Embodiment 6 will be described. Note that portions that are the same as those of the configuration of the image processing apparatus 1200 according to Embodiment 4 illustrated in FIG. 12 are denoted by the same reference numerals, description thereof is omitted, and only the differing portions are described. FIG. 16 is a block diagram illustrating an example of the configuration of the image processing apparatus 1600. As shown in FIG. 16, the image processing apparatus 1600 includes image data input terminals 1501 and 1502, a low-resolution conversion unit 1601, a parallax detection unit 103, a high-resolution conversion unit 1602, a distance information generation unit 1001, a subject detection unit 1201, an incorrect parallax detection unit 104, an interpolated image generation unit 701, a conversion unit 1603, blurred image generation units 1604 and 1605, and image data output terminals 107 and 108.
<6.2.1 Low Resolution Conversion Unit 1601>
The low-resolution conversion unit 1601 converts the high-resolution image data input to the image data input terminal 1501 into image data having the same resolution as the low-resolution image data input to the image data input terminal 1502. Nearest-neighbor interpolation is used to generate the low-resolution image from the high-resolution image; that is, the resolution conversion is performed by arranging, at each interpolation point, the value of the nearest source pixel. Note that the low-resolution image may instead be generated by another resolution conversion technique such as bilinear interpolation. Converting the high-resolution image data input to the image data input terminal 1501 into image data having the same resolution as the low-resolution image data input to the image data input terminal 1502 makes it possible to search for corresponding points between the left-eye image and the right-eye image and thus to detect the parallax. The incorrect parallax region can then be detected from the parallax, distance information can be calculated from the parallax, pixel blocks for which no parallax information has been calculated can be identified so that the one-eye visible pixel region can be detected, and the subject region can be detected by template matching. Since the computationally heavy detection of the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region can be performed at the low-resolution image size, the processing load can be reduced.
<6.2.2 High Resolution Conversion Unit 1602>
The high-resolution conversion unit 1602 converts the low-resolution image data input to the image data input terminal 1502 into the high-resolution image data on which the blurring processing is performed, that is, into image data having the same resolution as the high-resolution image data. Nearest-neighbor interpolation is used to generate the high-resolution image from the low-resolution image; that is, the high-resolution conversion is performed by arranging, at each interpolation point, the value of the nearest source pixel. Note that the high-resolution image may instead be generated by another resolution conversion technique such as bilinear interpolation.
<6.2.3 Conversion Unit 1603>
The conversion unit 1603 converts the incorrect parallax region, one-eye visible pixel region, distance information, and subject region detected on the low-resolution-converted high-resolution image data and on the low-resolution image data input to the image data input terminal 1502 into the corresponding information for the high-resolution image data input to the image data input terminal 1501 and for the low-resolution image data that has undergone high-resolution conversion. Specifically, the pixels interpolated during the resolution conversion are treated as corresponding pixels, and the incorrect parallax region, one-eye visible pixel region, distance information, and subject region of each corresponding pixel of the low-resolution image data are assigned to the corresponding pixel of the high-resolution image data. Then, for the converted one-eye visible pixel region, the image region of the other image lying in the same image coordinate range as the one-eye visible pixel region is transferred.
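The per-pixel information mapping performed by the conversion unit can be sketched as follows, assuming the detection results are held as a per-pixel mask (for example, 1 where the incorrect parallax region was detected at low resolution). The same nearest-neighbor correspondence used for the resolution conversion assigns each high-resolution pixel the value of its corresponding low-resolution pixel; the function name is illustrative:

```python
def upsample_mask(mask_lo, dst_h, dst_w):
    """Propagate per-pixel detection results (e.g. an incorrect-parallax
    mask, or per-pixel distance values) from the low-resolution grid to
    the high-resolution grid: each high-resolution pixel inherits the
    value of its nearest corresponding low-resolution pixel."""
    src_h, src_w = len(mask_lo), len(mask_lo[0])
    return [[mask_lo[y * src_h // dst_h][x * src_w // dst_w]
             for x in range(dst_w)]
            for y in range(dst_h)]
```

Because only the small low-resolution mask is searched and the upsampling is a cheap index lookup, the heavy detection work stays at the low-resolution image size, which is exactly the saving this embodiment aims for.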
<6.2.4 Blur Image Generation Units 1604 and 1605>
The blurred image generation units 1604 and 1605 perform blurring processing on the high-resolution image data input to the image data input terminal 1501 and on the low-resolution image data that has undergone high-resolution conversion, based on the incorrect parallax region, one-eye visible pixel region, distance information, and subject region converted by the conversion unit 1603.
The above is the description of the configuration of the image processing apparatus 1600. Next, the operation of the image processing apparatus 1600 will be described.
<6.3 Operation>
FIG. 17 is a flowchart showing the operation of the image processing apparatus 1600. As shown in this figure, the low-resolution conversion unit 1601 first converts the high-resolution input image data input to the image data input terminal 1501 into image data having the same resolution as the low-resolution input image data input to the image data input terminal 1502 (step S1701). Next, the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region are detected from the low-resolution-converted high-resolution input image data and the low-resolution input image data (step S1702). The details of the detection of the incorrect parallax region, one-eye visible pixel region, distance information, and subject region have been described in Embodiments 1 to 4 and are therefore omitted here. The high-resolution conversion unit 1602 then converts the low-resolution input image data input to the image data input terminal 1502 into image data having the same resolution as the high-resolution input image data input to the image data input terminal 1501 (step S1703). The conversion unit 1603 converts the incorrect parallax region, one-eye visible pixel region, distance information, and subject region detected in step S1702 for the low-resolution-converted high-resolution input image data into the corresponding information for the high-resolution input image data, and converts the incorrect parallax region, one-eye visible pixel region, distance information, and subject region of the low-resolution input image data into the corresponding information for the low-resolution input image data that has undergone high-resolution conversion (step S1704).
Specifically, the pixels interpolated during the resolution conversion are treated as corresponding pixels, and the incorrect parallax region, one-eye visible pixel region, distance information, and subject region of each corresponding pixel of the low-resolution image data are assigned to the corresponding pixel of the high-resolution image data. The conversion unit 1603 then transfers, to the one-eye visible pixel region converted in step S1704, the image region of the other image lying in the same image coordinate range as the one-eye visible pixel region (step S1705). After the above processing, the blurred image generation units 1604 and 1605 perform blurring processing based on the incorrect parallax region, one-eye visible pixel region, distance information, and subject region converted by the conversion unit 1603 (step S1706). Specifically, the blurred image generation unit 1604 blurs the high-resolution input image data, and the blurred image generation unit 1605 blurs the low-resolution input image data that has undergone high-resolution conversion, each based on the incorrect parallax region, one-eye visible pixel region, distance information, and subject region converted by the conversion unit 1603. The operation of the image processing apparatus 1600 has been described above.
As described above, according to the present embodiment, the computationally heavy detection of the incorrect parallax region, the one-eye visible pixel region, the distance information, and the subject region can be performed at the low-resolution image size, so the image processing according to Embodiments 1 to 4 can be executed at high speed. The processing can also be performed by a low-performance processor.
<< Embodiment 7 >>
<7.1 Outline>
Similar to the image processing apparatuses according to Embodiments 1 to 4, the image processing apparatus according to the seventh embodiment reduces the visual fatigue and discomfort caused by incorrect parallax by performing blurring processing on the incorrect parallax region, but it differs in that there is only one piece of input image data. It receives a single input image and distance information corresponding to that image, detects incorrect parallax based on the parallax calculated from the distance information, and performs blurring processing on the parallax image generated from the input image and the distance information. The subject region is also detected on the parallax image generated from the input image and the distance information. In this way, blurring processing based on the incorrect parallax region, the distance information, and the subject region can be performed from a single input image and its corresponding distance information.
<7.2 Configuration>
First, the configuration of an image processing apparatus 1800 according to Embodiment 7 will be described. FIG. 18 is a block diagram illustrating an example of the configuration of the image processing apparatus 1800. Note that portions that are the same as those of the configuration of the image processing apparatus 1200 according to Embodiment 4 illustrated in FIG. 12 are denoted by the same reference numerals, description thereof is omitted, and only the differing portions are described. As shown in FIG. 18, the image processing apparatus 1800 includes an image data input terminal 1801, a distance information input terminal 1802, a parallax calculation unit 1803, a parallax image generation unit 1804, an incorrect parallax detection unit 104, a subject detection unit 1201, blurred image generation units 1805 and 1806, and image data output terminals 107 and 108.
<7.2.1 Image Data Input Terminal 1801, Distance Information Input Terminal 1802>
A single piece of two-dimensional image data is input to the image data input terminal 1801. Distance information for each pixel block of the image data input to the image data input terminal 1801 is input to the distance information input terminal 1802. Here, the distance information refers to the distance from the image-capturing position of the image data to the subject appearing in the pixel block. The distance information may be acquired by a distance sensor such as a TOF (Time Of Flight) distance sensor, or from the focus position of the lens during the camera's autofocus operation. It may also be acquired through an external recording device, broadcast waves, a network, or the like.
<7.2.2 Parallax Calculation Unit 1803>
The parallax calculation unit 1803 calculates, for each pixel block, the length by which the block should be shifted horizontally when creating the parallax image (the parallax), based on the distance information input from the distance information input terminal 1802. The parallax can be calculated using the relationship between distance information and parallax described with reference to FIG. 4.
<7.2.4 Parallax Image Generation Unit 1804>
The parallax image generation unit 1804 generates a parallax image by shifting each pixel block in the horizontal direction by the length of the parallax calculated by the parallax calculation unit 1803.
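The two steps above (parallax from distance, then a horizontal shift) can be sketched as follows. The FIG. 4 distance-parallax relationship is not reproduced here, so this sketch assumes the conventional stereo relation in which parallax is inversely proportional to distance; the baseline and focal-length values, the function names, and the zero fill value are all illustrative assumptions:

```python
def disparity_from_distance(distance, baseline=6.5, focal=50.0):
    """Conventional stereo relation: parallax (in pixels) is inversely
    proportional to subject distance.  distance must be non-zero; the
    baseline/focal constants are placeholders, not values from the patent."""
    return int(round(baseline * focal / distance))

def make_parallax_image(image, distances, baseline=6.5, focal=50.0, fill=0):
    """Generate the second viewpoint by shifting each pixel horizontally
    by the parallax derived from its distance value.  Destination pixels
    that receive no source pixel keep the fill value (occlusions)."""
    h, w = len(image), len(image[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d = disparity_from_distance(distances[y][x], baseline, focal)
            nx = x + d
            if 0 <= nx < w:
                out[y][nx] = image[y][x]
    return out
```

In the patent the shift is applied per pixel block rather than per pixel, but the per-pixel version shows the same mechanism; the unfilled pixels left behind by the shift correspond to the one-eye visible regions discussed in the earlier embodiments.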
<7.2.5 Blur Image Generation Units 1805 and 1806>
The blurred image generation unit 1805 performs a blurring process on the image data input from the image data input terminal 1801 based on the incorrect parallax area, the distance information, and the subject area. The blurred image generation unit 1806 performs a blurring process on the parallax image generated by the parallax image generation unit 1804 based on the incorrect parallax region, the distance information, and the subject region. The above is the description of the configuration of the image processing apparatus 1800. Next, the operation of the image processing apparatus 1800 will be described.
<7.3 Operation>
FIG. 19 is a flowchart showing the operation of the image processing apparatus 1800. As shown in the figure, the parallax calculation unit 1803 calculates the parallax based on the distance information input to the distance information input terminal 1802 (step S1901). Next, the parallax image generation unit 1804 generates a parallax image based on the parallax calculated by the parallax calculation unit 1803 (step S1902). Then, the incorrect parallax detection unit 104 detects the incorrect parallax region, and the subject detection unit 1201 detects the subject region, for the input image data and the parallax image respectively (step S1903). After the above processing, the blurred image generation unit 1805 performs blurring processing based on the incorrect parallax region, the distance information, and the subject region (step S1904).
As described above, according to the present embodiment, a stereoscopic image can be created from a single piece of two-dimensional image data and the distance information corresponding to that image data, and blurring processing based on the incorrect parallax region, the distance information, and the subject region can be performed, thereby reducing the visual fatigue and discomfort caused by incorrect parallax.
<< Embodiment 8 >>
<8.1 Overview>
The image processing apparatus according to the eighth embodiment is different from the image processing apparatus according to the seventh embodiment in that distance information is not received. The distance information is extracted from one input image data. Thereby, the blurring process based on the incorrect parallax area, the distance information, and the subject area can be performed from one input image.
<8.2 Configuration>
FIG. 20 is a block diagram showing an example of the configuration of the image processing apparatus 2000. Note that portions that are the same as those of the configuration of the image processing apparatus 1800 according to Embodiment 7 shown in FIG. 18 are denoted by the same reference numerals, description thereof is omitted, and only the differing portions are described. As illustrated in FIG. 20, the image processing apparatus 2000 includes an image data input terminal 1801, a distance information extraction unit 2001, a parallax calculation unit 1803, an incorrect parallax detection unit 104, a parallax image generation unit 1804, a subject detection unit 1201, blurred image generation units 1805 and 1806, and image data output terminals 107 and 108.
The distance information extraction unit 2001 will now be described. The distance information extraction unit 2001 extracts distance information from the two-dimensional image data input from the image data input terminal 1801. Specifically, the technique described in Non-Patent Document 2 is used: the image is first divided into pixel sets called "superpixels" whose attributes such as color and brightness are highly homogeneous, and the distance from the observer is then estimated by comparing each superpixel with its neighboring superpixels and analyzing changes such as texture gradients.
As described above, according to the present embodiment, a stereoscopic image can be created from a single piece of two-dimensional image data, and blurring processing based on the incorrect parallax region, the distance information, and the subject region can be performed, thereby reducing the visual fatigue and discomfort caused by incorrect parallax.
<< Embodiment 9 >>
<9.1 Outline>
The ninth embodiment is an example related to an imaging apparatus including any one of the image processing apparatuses according to the first to eighth embodiments. The imaging apparatus includes a camera and a user interface capable of selecting / changing a range of parallax recognized as illegal parallax, selecting / changing a subject of interest, and selecting / changing blur intensity based on distance information. Based on the selection / change of the parallax range that the user recognizes as the incorrect parallax, the selection / change of the subject of interest, and the distance information with respect to the image subjected to the blurring process by any of the image processing apparatuses of the first to eighth embodiments By selecting / changing the blur intensity, an image with a desired blur can be generated.
<9.2 Configuration>
First, a usage pattern of the imaging device according to the present embodiment will be described. 21, 22, and 23 are diagrams illustrating an example of an imaging apparatus according to the present embodiment. 21 shows a digital still camera, FIG. 22 shows a mobile terminal device, and FIG. 23 shows a stereo camera connected to a television. The imaging device shown in each figure performs blurring processing on an image photographed by a camera or an image input from the outside by any one of the image processing devices according to the first to eighth embodiments, and after image processing. Show the image on the display. Thereafter, a user operation by a touch panel, a remote controller, or the like is received, and selection / change of a parallax range recognized as illegal parallax, selection / change of a subject of interest, and selection / change of blur intensity based on distance information are performed. Based on the user operation, the blurred image displayed on the display is updated. Next, the configuration of this imaging apparatus will be specifically described.
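The 2-D-to-stereo conversion summarized for Embodiment 8 above (generating a left-eye and a right-eye image from one image plus per-pixel parallax) can be sketched minimally as follows. Hole filling and the distance-to-parallax conversion are omitted, disparities are assumed to be non-negative even integers so each eye receives a symmetric half-shift, and occluded positions are simply left at zero; a real implementation would interpolate them.

```python
import numpy as np

def make_stereo_pair(image, disparity):
    """Sketch of stereo-pair generation from one image and a per-pixel
    disparity map: each pixel is shifted horizontally by +/- half its
    disparity to form the left- and right-eye images.  Positions that
    receive no pixel (disocclusions) stay zero instead of being filled."""
    h, w = image.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            d = disparity[y, x]
            xl, xr = x + d // 2, x - d // 2  # assumes even, non-negative d
            if 0 <= xl < w:
                left[y, xl] = image[y, x]
            if 0 <= xr < w:
                right[y, xr] = image[y, x]
    return left, right
```

For a one-row image with a single pixel carrying disparity 2, that pixel lands one column to the right in the left-eye image and one column to the left in the right-eye image, leaving a hole at its original column.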
 FIG. 24 is a block diagram showing an example of the configuration of the image pickup apparatus 2400. As shown in FIG. 24, the image pickup apparatus 2400 includes an image signal processing unit 2401, a control unit 2402, camera units 2403 and 2404, image decoding units 2405 and 2406, image encoding units 2407 and 2408, an image recording unit 2409, a data transmission unit 2410, a 3D/2D display unit 2411, a touch panel 2412, and a vibration unit 2413.
<9.2.1 Image Signal Processing Unit 2401>
 The image signal processing unit 2401 includes the image processing apparatus of any one of Embodiments 1 to 8 and applies blurring to image data captured by the cameras and the like. Note that the detection of invalid parallax areas, distance information, one-eye-visible pixel areas, and subject areas may be performed in parallel; parallel processing allows the blurring to be performed at high speed.
<9.2.2 Control Unit 2402>
 The control unit 2402 controls the operation of each component of the image pickup apparatus 2400.
<9.2.3 Camera Units 2403 and 2404>
 The camera units 2403 and 2404 each include an image sensor and capture the left-eye and right-eye images.
<9.2.4 Image Decoding Units 2405 and 2406>
 The image decoding units 2405 and 2406 decode image data compressed in formats such as JPEG, MPEG, H.264, and Flash Video.
<9.2.5 Image Encoding Units 2407 and 2408>
 The image encoding units 2407 and 2408 compress image data into formats such as JPEG, MPEG, H.264, and Flash Video.
<9.2.6 Image Recording Unit 2409>
 The image recording unit 2409 stores image data captured by the camera units 2403 and 2404 and image data processed by the image signal processing unit 2401 on a recording medium such as a memory card, HDD, or optical disc.
<9.2.7 Data Transmission Unit 2410>
 The data transmission unit 2410 is a network transmitter/receiver compatible with computer data communication networks such as the Internet and intranets, mobile telephone networks, and the like, or a tuner that receives broadcast waves; it transmits, for example, image data captured by the camera units 2403 and 2404 or processed by the image signal processing unit 2401.
<9.2.8 3D/2D Display Unit 2411>
 The 3D/2D display unit 2411 displays image data captured by the camera units 2403 and 2404 and images processed by the image signal processing unit 2401, using a liquid crystal display, plasma display, organic EL display, or the like.
<9.2.9 Touch Panel 2412>
 The touch panel 2412 accepts the user's operations for selecting or changing the range of parallax regarded as invalid, the subject of interest, and the blur intensity based on distance information. As described in detail in section <9.3>, the user can intuitively issue operation instructions to the image pickup apparatus 2400 via the touch panel 2412 while viewing the image displayed on the 3D/2D display unit 2411 and the menu screens drawn by the control unit 2402.
<9.2.10 Vibration Unit 2413>
 The vibration unit 2413 vibrates as appropriate in response to user operations on the touch panel 2412 and the like. In addition to the intuitive on-screen operation, the vibration gives the user a clear tactile acknowledgement from the image signal processing system.
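The note in <9.2.1> that the independent detection passes may run in parallel can be sketched with standard worker pools. The pass names and the dictionary-based interface below are hypothetical; the point is only that each result becomes available as soon as its pass finishes, so the preview can be updated incrementally.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_analysis_passes(image, passes):
    """Run independent analysis passes (e.g. invalid-parallax detection,
    distance estimation, one-eye-visible region detection, subject
    detection) concurrently.  Each finished result is collected via
    as_completed(), which is where a real UI would refresh the blurred
    preview incrementally."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(passes)) as pool:
        futures = {pool.submit(fn, image): name for name, fn in passes.items()}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()  # update preview here
    return results
```

With two toy passes over the "image" value 3, both results are gathered regardless of completion order.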
 This concludes the description of the configuration of the image pickup apparatus 2400. Next, the user operations for selecting or changing the range of parallax regarded as invalid, the subject of interest, and the blur intensity based on distance information are described.
<9.3 User Operations and Processing>
 FIG. 25 shows an example of the display screen during these operations. As shown in FIG. 25, touching the "Boke" button starts the blurring process. First, a preview produced by a simple blurring process is displayed. In this simple blurring process, the center of the image is taken as the focus position and blur whose strength increases with the distance from the image center is added, so it can be performed at high speed. The processes of (a) simple blurring and display, (b) invalid parallax area detection, (c) distance information calculation, and (d) subject area detection are performed in parallel, and the blurred image is updated each time a piece of information is obtained. After the distance information has been calculated, the user is prompted to change or confirm the distance-based blurring, which can be done, for example, by the user operation described later in section <9.3.3>; when the distance-based blurring is changed or confirmed, the blurred image on the screen is updated. Likewise, after the subject areas have been detected, the user is prompted to change or confirm the subject of interest, for example by the operation described later in section <9.3.1>; when a subject area is changed or confirmed, the blurred image on the screen is updated.
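The simple preview blur described above (focus at the image center, blur strength growing with distance from the center) can be sketched as follows. The embodiment only states that strength grows with that distance; the linear radius law and the box-average kernel below are assumptions chosen for brevity.

```python
import numpy as np

def simple_center_blur(image, max_radius=3):
    """Quick preview blur: the image centre is the focus position, and
    each pixel is box-averaged over a window whose half-width grows
    linearly from 0 at the centre to max_radius at the corners
    (an assumed strength law)."""
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_dist = np.hypot(cy, cx)
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            r = int(round(max_radius * np.hypot(y - cy, x - cx) / max_dist))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

The center pixel keeps its original value (radius 0), while a corner pixel is averaged over the full max_radius window.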
 By performing processes (a) to (d) in parallel in this way and updating the blurred image and accepting user operations as soon as each process finishes, the blurred image can be presented to the user in real time.
<9.3.1 Changing and Confirming the Subject Area>
 Next, the operation for changing and confirming the subject area is described. In FIG. 25, each subject of interest detected by the subject detection unit is displayed enclosed in a rectangle. Touching a subject sets it as, or releases it from being, the subject of interest. The touch panel may also be linked with the vibration unit 2413 so that the vibration unit 2413 is activated when the user touches the vicinity of a detected subject, letting the user sense the subject intuitively through vibration. When the subject of interest is changed, the blurring process is performed again based on the changed subject area. By letting the user select or release the subject of interest and re-blurring based on the changed subject area, an image in which the subject of interest is clearly visible can be obtained.
<9.3.2 Changing and Confirming the Invalid Parallax Range>
 Next, the operation for changing and confirming the invalid parallax range is described. In FIG. 25, the "Option" button is used to set the range of parallax regarded as invalid. FIG. 26 shows the safety level selection screen. For example, three safety levels are provided, as shown in the figure. Each safety level defines a range of parallax regarded as invalid, and the user changes that range by selecting a safety level. For example, safety level 1 may be defined to regard parallax corresponding to a parallax angle of 1.0 or more, or -1.0 or less, as invalid, and safety level 2 to regard parallax corresponding to a parallax angle of 1.2 or more, or -1.2 or less, as invalid. When the safety level is changed, the invalid parallax areas are detected again and the newly detected invalid parallax areas are blurred.
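The mapping from safety level to invalid-parallax detection described above can be sketched as a simple threshold on the parallax-angle magnitude. The limits for levels 1 and 2 come from the embodiment's example; the level 3 value and the mapping's exact form are assumptions.

```python
import numpy as np

# Hypothetical level-to-limit table.  The embodiment gives +/-1.0 for
# level 1 and +/-1.2 for level 2; level 3's value is an assumed example.
SAFETY_LEVEL_LIMITS = {1: 1.0, 2: 1.2, 3: 1.5}

def invalid_parallax_mask(parallax_angles, safety_level):
    """Return a boolean mask marking regions whose parallax angle falls
    outside the range allowed by the chosen safety level; when the user
    changes the level, this mask is recomputed and the flagged regions
    are blurred again."""
    limit = SAFETY_LEVEL_LIMITS[safety_level]
    angles = np.asarray(parallax_angles, dtype=float)
    return np.abs(angles) >= limit
```

Raising the safety level from 1 to 2 widens the allowed range, so an angle of -1.1 stops being flagged.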
 An interface may also be provided through which the user can enter the display size assumed for displaying the stereoscopic image. Based on the entered display size, the pixels constituting invalid parallax are updated and the invalid parallax areas are detected again.
<9.3.3 Changing and Confirming the Distance-Based Blurring>
 Next, the operation for changing and confirming the distance-based blurring is described. FIG. 27 illustrates this operation. Touching the "DEPTH" button allows the blurring based on distance information to be changed. Specifically, the position of the vertex of the blur intensity curve shown in FIG. 11, that is, the distance at which no distance-based blurring is applied, can be changed by sliding the scroll bar displayed to the right of "DEPTH". FIGS. 28 to 35 show how the blurring changes as the scroll bar is slid; the areas shown in white are those to which no distance-based blurring is applied, so the blurring is applied around the subject shown in white. As the scroll bar is slid, the vertex of the blur intensity curve moves and the re-blurred image is displayed continuously. FIG. 36 shows the blur intensity curves produced by this user operation: stopping the scroll bar at distance 3 adds the blur shown by blur intensity curve 3501, and stopping it at distance 2 adds the blur shown by blur intensity curve 3502.
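The blur intensity curve with a movable vertex described above can be sketched as a function of the pixel's distance. The embodiment only specifies a curve whose vertex (zero blur) sits at the user-chosen focus distance; the linear shape and the `slope` parameter below are assumptions.

```python
def blur_strength(distance, focus_distance, slope=1.0):
    """Blur-strength curve sketch: zero at the focus distance (the
    curve's vertex, moved by the DEPTH slider) and growing with the
    depth difference.  The linear law and `slope` are assumed; only the
    vertex position comes from the embodiment."""
    return slope * abs(distance - focus_distance)
```

Moving the slider from distance 3 to distance 2 simply shifts where this function evaluates to zero, which is why the re-blurred preview can be recomputed continuously as the slider moves.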
 This concludes the description of the user operations. Through these operations, the user can select or change the range of parallax regarded as invalid, the subject of interest, and the blur intensity based on distance information. By performing the blurring process again based on the information selected or changed by each operation, an image with the blurring the user desires can be generated.
<< Supplement >>
 Although the present invention has been described based on the above embodiments, it is of course not limited to them. The following cases are also included in the present invention.
(a) The present invention may be an application execution method following the processing procedures described in the embodiments, or a computer program containing program code that causes a computer to operate according to those procedures.
(b) The present invention can also be implemented as an LSI that controls an application execution apparatus. FIG. 36 shows an example in which the image processing apparatus according to the present invention is embodied as an LSI. As shown in the figure, it includes, for example, a CPU, a DSP, an ENC/DEC (encoder/decoder), an AIF (audio interface), a VIF (video interface), a PERI (control module for peripheral devices), an NIF (network interface), an MIF (memory interface), and RAM/ROM. The processing procedures described in the embodiments are stored as program code in the RAM/ROM and executed by the CPU or DSP.
 Although an LSI is referred to here, it may also be called an IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
 The method of circuit integration is not limited to LSI; it may be realized with a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor whose circuit cell connections and settings can be reconfigured, may also be used.
 Furthermore, if circuit integration technology that replaces LSI emerges as a result of advances in semiconductor technology or another derived technology, the functional blocks and components may of course be integrated using that technology. The application of biotechnology is one such possibility.
(c) Embodiments 1 to 9 describe examples in which image processing is performed on two pieces of input image data, but the input may be three or more multi-viewpoint images; the same image processing can be applied to three or more input images.
(d) In Embodiments 1 to 9, parallax is detected in units of 16x16-pixel blocks, but it may be detected in other pixel block units (for example, 32x32 or 8x8 pixels), or on a per-pixel basis.
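The block-based parallax detection with a configurable block size, as noted in (d) above, can be sketched with exhaustive SAD (sum of absolute differences) block matching. The embodiments do not specify the matching criterion, so SAD and the search range below are assumptions; only the block-unit granularity comes from the text.

```python
import numpy as np

def block_disparity(left, right, block=16, max_disp=8):
    """Per-block disparity by exhaustive SAD block matching.  The block
    size is configurable (16x16 in the embodiments; 32x32, 8x8, or even
    per-pixel matching are equally possible).  For each block of `left`,
    the horizontally shifted block of `right` with the lowest SAD wins."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block]
            best, best_d = None, 0
            for d in range(max_disp + 1):
                if x + d + block > w:
                    break  # shifted candidate would fall off the image
                cand = right[y:y + block, x + d:x + d + block]
                sad = np.abs(ref - cand).sum()
                if best is None or sad < best:
                    best, best_d = sad, d
            disp[by, bx] = best_d
    return disp
```

When the right image is the left image shifted by 4 columns, the interior blocks recover a disparity of 4 exactly (the SAD there is zero).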
 According to the image processing apparatus of the present invention, pixel areas having parallax that may cause visual fatigue, discomfort, stereoscopic sickness, or the like in the observer (invalid parallax) are identified and blurred. This renders the invalid parallax unrecognizable to the observer, as in a deliberately defocused image, and usefully reduces the visual fatigue and discomfort caused by invalid parallax.
 100, 600, 900, 1200, 1500, 1600, 1800, 2000 image processing apparatus
 101, 102, 1501, 1502, 1801 image data input terminal
 103 parallax detection unit
 104 invalid parallax detection unit
 105, 106, 602, 603, 902, 903, 1202, 1203, 1504, 1505, 1604, 1605, 1805, 1806 blurred image generation unit
 107, 108 image data output terminal
 601 interpolated image generation unit
 901 distance information generation unit
 1201 subject detection unit
 1301 template image
 1302 input image data
 1303 subject
 1503, 1602 high resolution conversion unit
 1601 low resolution conversion unit
 1603 conversion unit
 1802 distance information input terminal
 1803 parallax calculation unit
 1804 parallax image generation unit
 2001 distance information extraction unit
 2400 image pickup apparatus
 2401 image signal processing unit
 2402 control unit
 2403, 2404 camera unit
 2405, 2406 image decoding unit
 2407, 2408 image encoding unit
 2409 image recording unit
 2410 data transmission unit
 2411 3D/2D display unit
 2412 touch panel
 2413 vibration unit
 3501, 3502, 3503 blur intensity curve

Claims (23)

  1.  An image processing apparatus that performs image processing on a plurality of pieces of image data used for stereoscopic image display, comprising:
     receiving means for receiving input of the plurality of pieces of image data;
     a parallax detection unit that detects, for each piece of image data, parallax with respect to the other image data;
     detecting means for detecting, among a plurality of pixel areas in a pair of images on which parallax detection has been performed, pixel areas whose parallax is invalid, as an invalid parallax area; and
     processing means for performing blurring on each pixel area constituting the invalid parallax area.
  2.  The image processing apparatus of claim 1, wherein the detecting means compares the parallax with a predetermined threshold and detects the invalid parallax area according to the comparison result.
  3.  The image processing apparatus of claim 2, wherein,
     when a pixel area that exists only in one piece of image data of the plurality of pieces of image data, because it exists at an edge of the one piece of image data and does not exist in the other piece of image data, and that therefore causes flicker by being visible to only one eye in stereoscopic viewing, is defined as a one-eye-visible pixel area,
     the image processing apparatus further comprises an interpolated image generation unit that overwrites the one-eye-visible pixel area with image data of the other piece of image data that is used for stereoscopic viewing jointly with a pixel area adjacent to the one-eye-visible pixel area, thereby interpolating the one-eye-visible portion, and
     the processing means further blurs both the pixel area overwritten for the one-eye-visible interpolation and the image area of the other piece of image data containing the pixel data used for the overwriting.
  4.  The image processing apparatus of claim 2, further comprising
     a distance information generation unit that generates, based on the parallax, distance information indicating, for each pixel area, the distance from the image capture position of the image data to the subject shown in the pixel area,
     wherein the processing means further performs blurring based on the distance information on the pixel areas excluding the invalid parallax area.
  5.  The image processing apparatus of claim 2, further comprising:
     a distance information generation unit that generates, based on the parallax, distance information indicating, for each pixel area, the distance from the image capture position of the image data to the subject shown in the pixel area; and
     a subject detection unit that detects a subject pixel area indicating the pixel areas constituting a predetermined subject,
     wherein the processing means further performs blurring based on the distance information on the pixel areas excluding the invalid parallax area and the subject pixel area.
  6.  An image processing apparatus comprising:
     receiving means for receiving input of one piece of image data;
     a parallax calculation unit that calculates, for each pixel area of the image data, parallax based on distance information indicating the distance from the image capture position of the image data to the subject shown in the pixel area;
     a stereo image generation unit that generates, based on the image data and the parallax, a stereo image consisting of a left-eye image and a right-eye image;
     detecting means for detecting, among a plurality of pixel areas in the left-eye image and the right-eye image, pixel areas whose parallax is invalid, as an invalid parallax area; and
     processing means for performing blurring on each pixel area constituting the invalid parallax area.
  7.  The image processing apparatus of claim 6, wherein the detecting means compares the parallax with a predetermined threshold and detects the invalid parallax area according to the comparison result.
  8.  The image processing apparatus of claim 7, wherein the processing means further performs blurring based on the distance information on the pixel areas of the left-eye image and the right-eye image excluding the invalid parallax area.
  9.  The image processing apparatus of claim 7, further comprising
     a subject detection unit that detects a subject pixel area indicating the pixel areas constituting a predetermined subject in the left-eye image and the right-eye image,
     wherein the processing means further performs blurring based on the distance information on the pixel areas of the left-eye image and the right-eye image excluding the invalid parallax area and the subject pixel area.
  10.  The image processing apparatus of claim 6, further comprising a distance information calculation unit that calculates the distance information from the one piece of image data.
  11.  An image processing apparatus that performs image processing on a plurality of pieces of image data used for stereoscopic image display, wherein the plurality of pieces of image data include at least one piece of high-resolution image data and one piece of low-resolution image data, the image processing apparatus comprising:
     receiving means for receiving input of the plurality of pieces of image data;
     a resolution conversion unit that converts the low-resolution image data into image data of the same resolution as the high-resolution image data;
     a parallax detection unit that detects parallax between the high-resolution image data and the resolution-converted image data;
     detecting means for detecting, among a plurality of pixel areas in the high-resolution image data and the resolution-converted image data, pixel areas whose parallax is invalid, as an invalid parallax area; and
     processing means for performing blurring on each pixel area constituting the invalid parallax area in the high-resolution image data and the resolution-converted low-resolution image data.
  12.  The image processing apparatus of claim 11, wherein the detecting means compares the parallax with a predetermined threshold and detects the invalid parallax area according to the comparison result.
  13.  The image processing apparatus of claim 12, wherein,
     when a pixel area that exists only in one piece of image data of the plurality of pieces of image data, because it exists at an edge of the one piece of image data and does not exist in the other piece of image data, and that therefore causes flicker by being visible to only one eye in stereoscopic viewing, is defined as a one-eye-visible pixel area,
     the image processing apparatus further comprises an interpolated image generation unit that overwrites the one-eye-visible pixel area of the high-resolution image data or of the resolution-converted image data with image data of the other of the high-resolution image data and the resolution-converted image data that is used for stereoscopic viewing jointly with a pixel area adjacent to the one-eye-visible pixel area, thereby interpolating the one-eye-visible portion, and
     blurring is performed on both the pixel area overwritten on the other piece of image data for the one-eye-visible interpolation and the source pixel area used for the overwriting.
  14.  The image processing apparatus according to claim 12, further comprising
     a distance information generation unit that, based on the parallax, generates for each pixel area of the high-resolution image data and the resolution-converted image data distance information indicating the distance from the imaging position of the image data to the subject shown in that pixel area,
     wherein the processing means further applies blur processing based on the distance information to the pixel areas of the high-resolution image data and the resolution-converted image data other than the invalid parallax areas.
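The distance information of claim 14 is derived from parallax; for a rectified stereo pair this is commonly done with the relation Z = f · B / d (focal length times baseline over parallax). The patent does not specify the formula, so the following is a hedged sketch with assumed names and constants:

```python
def distance_from_parallax(parallax_px, focal_px, baseline_m):
    """Distance (metres) from the imaging position to the subject in one pixel area."""
    if parallax_px == 0:
        return float("inf")  # zero parallax: subject effectively at infinity
    return focal_px * baseline_m / parallax_px

# e.g. focal length 1000 px, camera baseline 0.1 m, measured parallax 50 px
z = distance_from_parallax(50, 1000.0, 0.1)
# z -> 2.0 (metres)
```

A distance-dependent blur (stronger for areas farther from the focal plane) would then be applied to all pixel areas except the invalid parallax areas.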
  15.  The image processing apparatus according to claim 12, further comprising:
     a distance information generation unit that, based on the parallax, generates for each pixel area of the high-resolution image data and the resolution-converted image data distance information indicating the distance from the imaging position of the image data to the subject shown in that pixel area; and
     a subject detection unit that detects subject pixel areas, that is, pixel areas constituting a predetermined subject in the high-resolution image data and the resolution-converted image data,
     wherein the processing means further applies blur processing based on the distance information to the pixel areas of the high-resolution image data and the resolution-converted image data other than the invalid parallax areas and the subject pixel areas.
  16.  An image processing apparatus that performs image processing on a plurality of image data used for stereoscopic image display, wherein the plurality of image data includes at least one piece of high-resolution image data and one piece of low-resolution image data, the image processing apparatus comprising:
     reception means for receiving input of the plurality of image data;
     a low-resolution conversion unit that converts the high-resolution image data into image data of the same resolution as the low-resolution image data;
     a parallax detection unit that detects the parallax between the low-resolution-converted image data and the low-resolution image data;
     detection means for detecting, as invalid parallax areas, those of a plurality of pixel areas in the low-resolution-converted image data and the low-resolution image data whose parallax is invalid;
     a high-resolution conversion unit that converts the low-resolution image data into image data of the same resolution as the high-resolution image data;
     an invalid parallax area conversion unit that converts the invalid parallax areas of the low-resolution-converted image data into invalid parallax areas of the high-resolution image data, and the invalid parallax areas of the low-resolution image data into invalid parallax areas of the high-resolution-converted image data; and
     processing means for applying blur processing to the individual pixel areas constituting the invalid parallax areas in the high-resolution image data and the high-resolution-converted image data.
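The resolution-matching idea of claim 16, detecting invalid parallax areas on the cheap low-resolution pair and then carrying the resulting area map back up to the high resolution, can be illustrated with a nearest-neighbour mask upscale. The integer scale factor and all names are assumptions for the example:

```python
def upscale_mask(mask, factor):
    """Nearest-neighbour upscale of a 2-D boolean area mask by an integer factor."""
    out = []
    for row in mask:
        # widen each value horizontally, then repeat the widened row vertically
        wide = [v for v in row for _ in range(factor)]
        out.extend(list(wide) for _ in range(factor))
    return out

low_mask = [[False, True],
            [True, False]]
high_mask = upscale_mask(low_mask, 2)
# each low-resolution invalid area becomes a 2x2 block at the high resolution
```

This is the conversion performed, in spirit, by the claimed invalid parallax area conversion unit; a real implementation would match whatever scale ratio separates the two sensors.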
  17.  The image processing apparatus according to claim 16, wherein the detection means compares the parallax with a predetermined threshold and detects invalid parallax areas according to the result of the comparison.
  18.  The image processing apparatus according to claim 17, wherein, where a pixel area that exists only in one of the plurality of image data, being present at an edge of that image data and absent from the other image data and therefore causing flicker due to one-eye visibility during stereoscopic viewing, is defined as a one-eye-visible pixel area,
     the image processing apparatus further comprises:
     a one-eye-visible pixel area conversion unit that converts a one-eye-visible pixel area between the low-resolution-converted image data and the low-resolution image data into a one-eye-visible pixel area between the high-resolution image data and the high-resolution-converted image data; and
     an interpolation image generation unit that performs one-eye-visible interpolation by overwriting the converted one-eye-visible pixel area with the other image data shared for stereoscopic viewing with the pixel area adjacent to the converted one-eye-visible pixel area,
     and the processing means further applies blur processing both to the pixel area overwritten onto the other image data and to the source pixel area of the overwriting for the one-eye-visible interpolation.
  19.  The image processing apparatus according to claim 17, further comprising:
     a distance information generation unit that, based on the parallax, generates for each pixel area of the low-resolution-converted image data and the low-resolution image data distance information indicating the distance from the imaging position of the image data to the subject shown in that pixel area; and
     a distance information conversion unit that converts the distance information of the low-resolution-converted image data into distance information for the high-resolution image data, and the distance information of the low-resolution image data into distance information for the high-resolution-converted image data,
     wherein the processing means further applies blur processing based on the distance information to the pixel areas of the high-resolution image data and the high-resolution-converted image data other than the invalid parallax areas.
  20.  The image processing apparatus according to claim 17, further comprising:
     a distance information generation unit that, based on the parallax, generates for each pixel area of the low-resolution-converted image data and the low-resolution image data distance information indicating the distance from the imaging position of the image data to the subject shown in that pixel area;
     a subject detection unit that detects subject pixel areas, that is, pixel areas constituting a predetermined subject in the low-resolution-converted image data and the low-resolution image data;
     a distance information conversion unit that converts the distance information of the low-resolution-converted image data into distance information for the high-resolution image data, and the distance information of the low-resolution image data into distance information for the high-resolution-converted image data; and
     a subject area conversion unit that converts the subject areas of the low-resolution-converted image data into subject areas of the high-resolution image data, and the subject areas of the low-resolution image data into subject areas of the high-resolution-converted image data,
     wherein the processing means further applies blur processing based on the distance information to the pixel areas of the high-resolution image data and the high-resolution-converted image data other than the invalid parallax areas and the subject pixel areas.
  21.  An image pickup apparatus comprising the image processing apparatus according to claim 1 and an image sensor.
  22.  An image processing method for performing image processing on a plurality of image data used for stereoscopic image display, the method comprising:
     a reception step of receiving input of the plurality of image data;
     a parallax detection step of detecting, for each image data, the parallax with respect to the other image data;
     a detection step of detecting, as invalid parallax areas, those of a plurality of pixel areas in a pair of images subjected to parallax detection whose parallax is invalid; and
     a processing step of applying blur processing to the individual pixel areas constituting the invalid parallax areas.
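The final processing step, blurring only the flagged pixel areas, can be sketched with a simple 3x3 mean filter applied under a mask; the filter size and all names are simplifying assumptions, and the parallax map is taken as given rather than computed:

```python
def blur_invalid_areas(image, invalid_mask):
    """Replace each flagged pixel with the mean of its 3x3 neighbourhood."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not invalid_mask[y][x]:
                continue  # untouched outside the invalid parallax areas
            vals = [image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out

img = [[10, 10, 10],
       [10, 100, 10],
       [10, 10, 10]]
mask = [[False] * 3, [False, True, False], [False] * 3]
blurred = blur_invalid_areas(img, mask)
# centre pixel becomes the 3x3 mean: (8 * 10 + 100) // 9 = 20
```

Softening only these areas suppresses the depth conflict they would otherwise cause, while the rest of the stereoscopic pair keeps its full sharpness.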
  23.  A program that causes a computer to execute image processing on a plurality of image data used for stereoscopic image display, the program causing the computer to execute:
     a reception step of receiving input of the plurality of image data;
     a parallax detection step of detecting, for each image data, the parallax with respect to the other image data;
     a detection step of detecting, as invalid parallax areas, those of a plurality of pixel areas in a pair of images subjected to parallax detection whose parallax is invalid; and
     a processing step of applying blur processing to the individual pixel areas constituting the invalid parallax areas.
PCT/JP2011/006380 2010-12-24 2011-11-16 Image processing apparatus, image pickup apparatus, image processing method, and program WO2012086120A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010287821 2010-12-24
JP2010-287821 2010-12-24

Publications (1)

Publication Number Publication Date
WO2012086120A1 true WO2012086120A1 (en) 2012-06-28

Family

ID=46313415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/006380 WO2012086120A1 (en) 2010-12-24 2011-11-16 Image processing apparatus, image pickup apparatus, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2012086120A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06194602A (en) * 1992-12-24 1994-07-15 Nippon Telegr & Teleph Corp <Ntt> Binocular stereoscopic viewing device
JPH10134187A (en) * 1996-10-31 1998-05-22 Nec Corp Three-dimensional structure estimating device
JP2010171628A (en) * 2009-01-21 2010-08-05 Nikon Corp Image processing device, program, image processing method, recording method, and recording medium
WO2012001970A1 (en) * 2010-06-30 2012-01-05 富士フイルム株式会社 Image processing device, method, and program


Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10015469B2 (en) 2012-07-03 2018-07-03 Gopro, Inc. Image blur based on 3D depth information
US9185387B2 (en) 2012-07-03 2015-11-10 Gopro, Inc. Image blur based on 3D depth information
JP2014014076A (en) * 2012-07-03 2014-01-23 Woodman Labs Inc Image blur based on 3d depth information
US10776629B2 (en) 2014-07-23 2020-09-15 Gopro, Inc. Scene and activity identification in video summary generation
US10339975B2 (en) 2014-07-23 2019-07-02 Gopro, Inc. Voice-based video tagging
US10074013B2 (en) 2014-07-23 2018-09-11 Gopro, Inc. Scene and activity identification in video summary generation
US11776579B2 (en) 2014-07-23 2023-10-03 Gopro, Inc. Scene and activity identification in video summary generation
US11069380B2 (en) 2014-07-23 2021-07-20 Gopro, Inc. Scene and activity identification in video summary generation
US9792502B2 (en) 2014-07-23 2017-10-17 Gopro, Inc. Generating video summaries for a video using video summary templates
US10262695B2 (en) 2014-08-20 2019-04-16 Gopro, Inc. Scene and activity identification in video summary generation
US10643663B2 (en) 2014-08-20 2020-05-05 Gopro, Inc. Scene and activity identification in video summary generation based on motion detected in a video
US10192585B1 (en) 2014-08-20 2019-01-29 Gopro, Inc. Scene and activity identification in video summary generation based on motion detected in a video
US10559324B2 (en) 2015-01-05 2020-02-11 Gopro, Inc. Media identifier generation for camera-captured media
US10096341B2 (en) 2015-01-05 2018-10-09 Gopro, Inc. Media identifier generation for camera-captured media
US10338955B1 (en) 2015-10-22 2019-07-02 Gopro, Inc. Systems and methods that effectuate transmission of workflow between computing platforms
US10078644B1 (en) 2016-01-19 2018-09-18 Gopro, Inc. Apparatus and methods for manipulating multicamera content using content proxy
US9787862B1 (en) 2016-01-19 2017-10-10 Gopro, Inc. Apparatus and methods for generating content proxy
US10402445B2 (en) 2016-01-19 2019-09-03 Gopro, Inc. Apparatus and methods for manipulating multicamera content using content proxy
US9871994B1 (en) 2016-01-19 2018-01-16 Gopro, Inc. Apparatus and methods for providing content context using session metadata
US10129464B1 (en) 2016-02-18 2018-11-13 Gopro, Inc. User interface for creating composite images
US9972066B1 (en) 2016-03-16 2018-05-15 Gopro, Inc. Systems and methods for providing variable image projection for spherical visual content
US10740869B2 (en) 2016-03-16 2020-08-11 Gopro, Inc. Systems and methods for providing variable image projection for spherical visual content
US10817976B2 (en) 2016-03-31 2020-10-27 Gopro, Inc. Systems and methods for modifying image distortion (curvature) for viewing distance in post capture
US10402938B1 (en) 2016-03-31 2019-09-03 Gopro, Inc. Systems and methods for modifying image distortion (curvature) for viewing distance in post capture
US11398008B2 (en) 2016-03-31 2022-07-26 Gopro, Inc. Systems and methods for modifying image distortion (curvature) for viewing distance in post capture
US10341712B2 (en) 2016-04-07 2019-07-02 Gopro, Inc. Systems and methods for audio track selection in video editing
US9838730B1 (en) 2016-04-07 2017-12-05 Gopro, Inc. Systems and methods for audio track selection in video editing
US10229719B1 (en) 2016-05-09 2019-03-12 Gopro, Inc. Systems and methods for generating highlights for a video
US9953679B1 (en) 2016-05-24 2018-04-24 Gopro, Inc. Systems and methods for generating a time lapse video
US11223795B2 (en) 2016-06-15 2022-01-11 Gopro, Inc. Systems and methods for bidirectional speed ramping
US9967515B1 (en) 2016-06-15 2018-05-08 Gopro, Inc. Systems and methods for bidirectional speed ramping
US10742924B2 (en) 2016-06-15 2020-08-11 Gopro, Inc. Systems and methods for bidirectional speed ramping
US9922682B1 (en) 2016-06-15 2018-03-20 Gopro, Inc. Systems and methods for organizing video files
US10045120B2 (en) 2016-06-20 2018-08-07 Gopro, Inc. Associating audio with three-dimensional objects in videos
US10395119B1 (en) 2016-08-10 2019-08-27 Gopro, Inc. Systems and methods for determining activities performed during video capture
US11508154B2 (en) 2016-08-23 2022-11-22 Gopro, Inc. Systems and methods for generating a video summary
US11062143B2 (en) 2016-08-23 2021-07-13 Gopro, Inc. Systems and methods for generating a video summary
US9953224B1 (en) 2016-08-23 2018-04-24 Gopro, Inc. Systems and methods for generating a video summary
US10726272B2 (en) 2016-08-23 2020-07-28 Gopro, Inc. Systems and methods for generating a video summary
US10282632B1 (en) 2016-09-21 2019-05-07 Gopro, Inc. Systems and methods for determining a sample frame order for analyzing a video
US10268898B1 (en) 2016-09-21 2019-04-23 Gopro, Inc. Systems and methods for determining a sample frame order for analyzing a video via segments
US10397415B1 (en) 2016-09-30 2019-08-27 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10044972B1 (en) 2016-09-30 2018-08-07 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10560591B2 (en) 2016-09-30 2020-02-11 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10560655B2 (en) 2016-09-30 2020-02-11 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10643661B2 (en) 2016-10-17 2020-05-05 Gopro, Inc. Systems and methods for determining highlight segment sets
US10002641B1 (en) 2016-10-17 2018-06-19 Gopro, Inc. Systems and methods for determining highlight segment sets
US10923154B2 (en) 2016-10-17 2021-02-16 Gopro, Inc. Systems and methods for determining highlight segment sets
US10776689B2 (en) 2017-02-24 2020-09-15 Gopro, Inc. Systems and methods for processing convolutional neural network operations using textures
US10339443B1 (en) 2017-02-24 2019-07-02 Gopro, Inc. Systems and methods for processing convolutional neural network operations using textures
US9916863B1 (en) 2017-02-24 2018-03-13 Gopro, Inc. Systems and methods for editing videos based on shakiness measures
US10817992B2 (en) 2017-04-07 2020-10-27 Gopro, Inc. Systems and methods to create a dynamic blur effect in visual content
US10360663B1 (en) 2017-04-07 2019-07-23 Gopro, Inc. Systems and methods to create a dynamic blur effect in visual content
US10817726B2 (en) 2017-05-12 2020-10-27 Gopro, Inc. Systems and methods for identifying moments in videos
US10395122B1 (en) 2017-05-12 2019-08-27 Gopro, Inc. Systems and methods for identifying moments in videos
US10614315B2 (en) 2017-05-12 2020-04-07 Gopro, Inc. Systems and methods for identifying moments in videos
US10614114B1 (en) 2017-07-10 2020-04-07 Gopro, Inc. Systems and methods for creating compilations based on hierarchical clustering
US10402698B1 (en) 2017-07-10 2019-09-03 Gopro, Inc. Systems and methods for identifying interesting moments within videos
CN113938604A (en) * 2021-09-22 2022-01-14 深圳市汇顶科技股份有限公司 Focusing method, focusing device, electronic equipment and storage medium
CN113938604B (en) * 2021-09-22 2023-05-09 深圳市汇顶科技股份有限公司 Focusing method, focusing device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
WO2012086120A1 (en) Image processing apparatus, image pickup apparatus, image processing method, and program
US9007442B2 (en) Stereo image display system, stereo imaging apparatus and stereo display apparatus
EP2549762B1 (en) Stereovision-image position matching apparatus, stereovision-image position matching method, and program therefor
EP2706504A2 (en) An apparatus, a method and a computer program for image processing
CN102722080B (en) A kind of multi purpose spatial image capture method based on many lens shootings
WO2012153447A1 (en) Image processing device, image processing method, program, and integrated circuit
JP2013005259A (en) Image processing apparatus, image processing method, and program
JP2011188004A (en) Three-dimensional video imaging device, three-dimensional video image processing apparatus and three-dimensional video imaging method
US10222910B1 (en) Method and apparatus for creating an adaptive Bayer pattern
US8866881B2 (en) Stereoscopic image playback device, stereoscopic image playback system, and stereoscopic image playback method
JP5533529B2 (en) Image processing apparatus and image processing system
JP2009251141A (en) Stereoscopic image display
JP5755571B2 (en) Virtual viewpoint image generation device, virtual viewpoint image generation method, control program, recording medium, and stereoscopic display device
JP2014501086A (en) Stereo image acquisition system and method
US20160180514A1 (en) Image processing method and electronic device thereof
WO2014148031A1 (en) Image generation device, imaging device and image generation method
KR20110113923A (en) Image converting device and three dimensional image display device including the same
TW201445977A (en) Image processing method and image processing system
CN104853080A (en) Image processing device
US20170309055A1 (en) Adjusting parallax of three-dimensional display material
EP2701389A1 (en) Apparatus and method for depth-based image scaling of 3D visual content
US20120121163A1 (en) 3d display apparatus and method for extracting depth of 3d image thereof
US20130011047A1 (en) Method, System and Computer Program Product for Switching Between 2D and 3D Coding of a Video Sequence of Images
JP2013201688A (en) Image processing apparatus, image processing method, and image processing program
US11758101B2 (en) Restoration of the FOV of images for stereoscopic rendering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11850386

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11850386

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP