GB2558286A - Image processing - Google Patents

Image processing

Info

Publication number
GB2558286A
GB2558286A (Application GB1622195.4A)
Authority
GB
United Kingdom
Prior art keywords
image
dimensional images
images
distortion
depth parameter
Legal status
Withdrawn
Application number
GB1622195.4A
Other versions
GB201622195D0 (en)
Inventor
Ian Henry Bickerstaff
Current Assignee
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Application filed by Sony Interactive Entertainment Inc
Priority to GB1622195.4A
Publication of GB201622195D0
Publication of GB2558286A
Legal status: Withdrawn

Classifications

    • G06T 5/80
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G06T 2200/32: Indexing scheme for image data processing or generation, involving image mosaicing
    • G06T 2207/10012: Stereo images
    • G06T 2207/20221: Image fusion; image merging
    • H04N 2013/0088: Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Processing (AREA)

Abstract

Image processing apparatus comprises a detector to detect a depth parameter for one or more image features at the periphery 500, 510 of a set of two or more captured three-dimensional images; an image processor (120, Fig. 1) to apply an image distortion to the image features in at least one of the three-dimensional images of the set, in dependence upon the detected depth parameter; and a combiner to generate a composite image from the set of three-dimensional images. The 3D images could be stereoscopic images. The fields of view of each image may partially overlap. The distortion applied may be a horizontal expansion or contraction.

Description

(54) Title of the Invention: Image processing
Abstract Title: Stitching multiple 3D images together to create a single 3D image
[Drawings: Figures 1 to 10 of the application, not reproduced in this text version]
IMAGE PROCESSING
BACKGROUND
Field
This disclosure relates to image processing.
Description of Related Art
Panoramic images can be generated by combining or “stitching together” multiple partially overlapping images.
In the case of three-dimensional panoramic images, errors can occur in the stitching process because of parallax differences between the peripheries of adjacent images.
SUMMARY
The present disclosure is defined by the appended claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary, but are not restrictive, of the present technology.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 schematically illustrates an image capture and processing apparatus;
Figure 2 schematically illustrates a stereoscopic image;
Figure 3 schematically illustrates images captured by the arrangement of Figure 1;
Figure 4 schematically illustrates a combined image;
Figure 5 schematically illustrates overlapping image regions;
Figure 6 schematically illustrates an image processing apparatus;
Figure 7 schematically illustrates a comparison of disparities;
Figure 8 schematically illustrates a disparity comparison output; and
Figures 9 and 10 are schematic flowcharts illustrating methods.
DESCRIPTION OF THE EMBODIMENTS
Figure 1 schematically illustrates an image capture and processing apparatus. The apparatus comprises two or more three-dimensional cameras 100, 110 and an image processing apparatus 120. The three-dimensional cameras may be, for example, stereoscopic cameras generating stereoscopic images. More than two such cameras may be used, but the drawing shows just two for clarity of explanation. Alternatively, a single camera may be used, moved between image capture operations from one camera position to another so as to provide functionality similar to that of the apparatus of Figure 1. It will be appreciated that such variations in the number of cameras and the number of captured images are within the scope of the present embodiments.
Therefore, example arrangements provide a camera arrangement to capture two or more three-dimensional images. The camera arrangement may comprise two or more cameras disposed relative to one another so as to capture respective three-dimensional images. The respective fields of view of the two or more cameras may partially overlap.
Each of the cameras 100, 110 captures a respective three-dimensional image. If the cameras are video cameras, each camera captures successive three-dimensional images at a repetition rate defined by an image period of the video signal.
The techniques to be discussed below are applicable to individual three-dimensional images (or groups of them as captured by the plural cameras) or to video signals comprising successive three-dimensional images. Therefore, references to “images” in the discussion below should be taken to encompass the use of the same techniques in respect of video signals.
The cameras 100, 110 are angled towards one another so that their respective fields of view 105, 115 overlap partially. Each camera therefore captures some image content representing features which are common to image content captured by the other camera, and some image content which is not common to image content captured by the other camera. If more than two cameras are involved, each could overlap with its neighbour in an angled-in arrangement, so that some image content is common to a neighbour on one side and some content is common to a neighbour on the other side.
Note that the cameras do not have to be angled in. An arrangement in which the cameras are angled outwards, but still have fields of view overlapping one another, could also be used.
The techniques to be discussed below relate to operations to combine images captured by the two (or more) cameras into a single panoramic image.
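The description below concentrates on what is done with the overlap once known; it does not prescribe how the overlapping region is located. Purely as an illustrative sketch of one common approach (not the patented method), local features can be matched between the two views and a homography fitted with RANSAC. The OpenCV calls below are standard, but all parameter values are assumptions:

```python
import cv2
import numpy as np

def estimate_overlap_homography(img_a, img_b, min_matches=10):
    # Match ORB features between the two views and fit a homography
    # mapping img_b's pixels into img_a's frame (RANSAC rejects outliers).
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None  # not enough common content to estimate an overlap
    src = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return homography
```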
Note that the cameras in Figure 1 may be relatively laterally displaced, vertically displaced, or both.
Figure 2 is a schematic example of a stereoscopic image comprising a pair of left and right images for display to the user’s left and right eyes respectively.
In the image pair of Figure 2, the depth of an image feature is represented by its so-called disparity as between the two images of the pair. The disparity represents a lateral displacement of that image feature between the two images. In some examples, an image feature representing an object at infinity would generally have zero disparity, with the disparity increasing as the object is closer to the camera viewpoint. In other arrangements, however, a virtual display screen can be arranged at a position other than at infinity, in which case the image disparity is zero at the depth of the virtual display screen. In either arrangement, however, assuming the three-dimensional cameras 100, 110 are set up so as to provide comparable three-dimensional images (or an appropriate calibration factor is applied), the disparity of left and right images generated by one camera can be compared, in the techniques to be discussed below, with the disparity in images generated by the other of the cameras.
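As background for the comparison technique described below, here is a minimal sketch (not part of the patent) of how a per-pixel disparity map might be computed for one rectified stereoscopic pair, using OpenCV's semi-global block matcher; the matcher parameters are hypothetical and scene-dependent:

```python
import cv2
import numpy as np

def disparity_map(left_img, right_img):
    # Per-pixel disparity for a rectified left/right pair. Parameter
    # values are illustrative; numDisparities must be divisible by 16.
    left_g = cv2.cvtColor(left_img, cv2.COLOR_BGR2GRAY)
    right_g = cv2.cvtColor(right_img, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=128,
                                    blockSize=5)
    # StereoSGBM returns fixed-point disparities scaled by 16.
    return matcher.compute(left_g, right_g).astype(np.float32) / 16.0
```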
Figure 3 schematically illustrates images captured by the arrangement of Figure 1.
Here, only one image of each pair is illustrated, such as the left image captured by each camera. With the angled-in example shown in Figure 1, an image 300 represents an image captured by the camera 110 and an image 310 represents an image captured by the camera 100. It can be seen that the captured images overlap partially, so that there is image content in each which is not found in the other, and some image content which is common to both images.
Figure 4 represents a panoramic image generated by combining or “stitching together” the images of Figure 3. In the context of three-dimensional images, the panoramic image of Figure 4 is represented by an image pair of a left and right panoramic image.
In the context of three-dimensional image capture, errors can occur where multiple images are stitched together because of so-called parallax between overlapping portions of adjacent images. The present techniques aim to alleviate this issue.
Figure 5 schematically illustrates overlapping image regions in the example images of Figure 3. A region 500 represents substantially the same image material as a region 510 in the other of the images.
Figure 6 schematically illustrates an image processing apparatus which can be used to process and combine (stitch together) images of the type shown in Figure 3, namely overlapping three-dimensional images. The apparatus of Figure 6 comprises a detector 600 configured to detect a depth parameter for one or more image features at the periphery of a set of two or more captured three-dimensional images, for example by detecting image disparity. The one or more image features may comprise image features which appear in overlapping portions of two or more of the three-dimensional images. An image processor 610 is arranged to apply an image distortion to the image features in at least one of the three-dimensional images, in dependence upon the detected depth parameter. A combiner 620 operates to generate a composite image (of the form shown schematically in Figure 4, as a left and right image pair) from the set of three-dimensional images (to which the distortion mentioned above has been applied).
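Purely as a hypothetical structural sketch, the three elements of Figure 6 can be pictured as a three-stage pipeline. The function names and interfaces below are illustrative inventions, not taken from the patent:

```python
def stitch_3d(stereo_pairs, detect, distort, combine):
    """Mirror of the Figure 6 structure, as a hypothetical pipeline.

    stereo_pairs : list of (left, right) image pairs, one per camera
    detect       : element 600 -- depth parameters (e.g. disparity
                   differences) for features at the peripheries
    distort      : element 610 -- pairs with peripheral features
                   stretched or compressed per the detected parameters
    combine      : element 620 -- a single composite (left, right) pair
    """
    depth_params = detect(stereo_pairs)
    return combine(distort(stereo_pairs, depth_params))
```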
The nature of the distortion and its application will now be described with reference to Figures 7 and 8.
Figure 7 is an expanded view of the overlapping sections shown in Figure 5. The detector 600 detects image disparity between the pair of images by detecting corresponding image features in each of the left and right images of an image pair. So, for example, for the image feature 700 representing the sun, the detector 600 detects the disparity in the left and right image pair captured by the camera 100 and, separately, the disparity in the left and right image pair captured by the camera 110 of the same image feature 700. The detector subtracts one disparity from the other to obtain a disparity difference between the two images. In the present example, the disparity difference is zero in respect of the image feature 700, because (in this example) the image disparity of that feature is zero in each of the two images. Figure 8 schematically illustrates a disparity comparison output in which a region 800 is associated with a disparity difference of zero.
Regarding a different image feature, for example the garage door 710, once again the detector 600 operates to detect the image disparity as between left and right images captured by the camera 100 and, separately, the image disparity as between left and right images captured by the camera 110, and then determines the difference between the two detected disparities. In this particular schematic example, a non-zero disparity difference is detected in respect of the region 810 associated with the garage door feature. The detector 600 carries out this process in respect of the whole of the overlapping regions of the two adjacent images. The output of Figure 8 is therefore built up for each image region of the overlapping portion.
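Building on the disparity sketch given earlier, the comparison output of Figure 8 could be approximated as follows. This sketch assumes the two overlap regions have been brought to the same shape and that any inter-camera calibration factor has already been applied:

```python
import numpy as np

def disparity_difference(pair_a, pair_b, region_a, region_b):
    # pair_a, pair_b: (left, right) images from the two cameras;
    # region_a, region_b: numpy slices (e.g. np.s_[y0:y1, x0:x1])
    # selecting the overlap areas 500 and 510, assumed same shape.
    disp_a = disparity_map(*pair_a)[region_a]
    disp_b = disparity_map(*pair_b)[region_b]
    # Zero where a feature (the sun, region 800) sits at the same depth
    # in both views; non-zero where parallax differs (region 810).
    return disp_a - disp_b
```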
The processor 610 then applies an image distortion in dependence upon the detected image disparities or, in this example, the difference between the detected image disparities. In the present example, the distortion is a stretch or compression of the relevant image feature in a horizontal image direction. In example arrangements, a stretch is applied to the image feature in whichever of the overlapping images represents that image feature as being further away, that is, as having a greater depth parameter. By applying horizontal stretches in this manner, problems with uncovering otherwise occluded areas are avoided. In examples, the stretch can be dependent upon (for example, proportional to) the disparity difference, so that in examples the image processor is configured to apply a degree of distortion to an image feature in dependence upon a difference in depth parameter for that image feature in two or more of the three-dimensional images. The distortion can be a horizontal expansion or contraction.
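A horizontal stretch or compression of this kind can be implemented as a one-dimensional remap of pixel x-coordinates. The sketch below rescales a rectangular band about its centre; deriving the stretch factor from the disparity difference via a gain constant is an assumption for illustration, not a value given in the patent:

```python
import cv2
import numpy as np

def horizontal_stretch(image, region, factor):
    # Stretch (factor > 1) or compress (factor < 1) a rectangular band
    # of `image` horizontally, about the band's centre; rows outside
    # the band and the vertical direction are untouched.
    y0, y1, x0, x1 = region
    band = image[y0:y1, x0:x1]
    h, w = band.shape[:2]
    # For each output column, sample the input column rescaled about centre.
    xs = (np.arange(w, dtype=np.float32) - w / 2.0) / factor + w / 2.0
    map_x = np.tile(xs, (h, 1))
    map_y = np.repeat(np.arange(h, dtype=np.float32)[:, None], w, axis=1)
    out = image.copy()
    out[y0:y1, x0:x1] = cv2.remap(band, map_x, map_y, cv2.INTER_LINEAR)
    return out

# One hypothetical policy for choosing the factor (gain is an assumed
# tuning constant, not from the patent):
#   factor = 1.0 + gain * float(np.mean(diff_for_feature))
```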
Figure 9 is a schematic flowchart illustrating an image processing method comprising:
detecting (at a step 900) a depth parameter for one or more image features at the periphery of a set of two or more captured three-dimensional images;
applying (at a step 910) an image distortion to the image features in at least one of the three-dimensional images of the set, in dependence upon the detected depth parameter; and
generating (at a step 920) a composite image from the set of three-dimensional images.
Figure 10 is a schematic flowchart providing slightly more detail of the step 910 of Figure 9, in which, at a step 1000, the detector 600 detects a depth or disparity difference between corresponding features of the overlapping portions of the adjacent images, and, at a step 1010, the processor 610 applies a stretch distortion in a horizontal direction. The combiner 620 then acts on the images, including the one or ones to which the stretch has been applied.
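The combiner's stitching operation itself is not detailed here; as a minimal sketch under that caveat, a linear cross-fade over the overlap, applied once to the left-eye images and once to the right-eye images, illustrates the final step (a production stitcher would typically add seam finding and multi-band blending):

```python
import numpy as np

def linear_blend(img_a, img_b, overlap_px):
    # Cross-fade two horizontally adjacent colour images over
    # `overlap_px` columns; assumes img_a's right edge covers the same
    # scene content as img_b's left edge and both share a height.
    # Run once for the left-eye panorama and once for the right-eye one.
    alpha = np.linspace(1.0, 0.0, overlap_px, dtype=np.float32)[None, :, None]
    seam = (img_a[:, -overlap_px:] * alpha
            + img_b[:, :overlap_px] * (1.0 - alpha))
    return np.hstack([img_a[:, :-overlap_px],
                      seam.astype(img_a.dtype),
                      img_b[:, overlap_px:]])
```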
It will be appreciated that example embodiments can be implemented by computer software operating on a general purpose computing system such as a games machine. In these examples, computer software which, when executed by a computer, causes the computer to carry out any of the methods discussed above is considered as an embodiment of the present disclosure. Similarly, embodiments of the disclosure are provided by a non-transitory, machine-readable storage medium which stores such computer software.
It will also be apparent that numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practised otherwise than as specifically described herein.

Claims (12)

1. Image processing apparatus comprising:
a detector to detect a depth parameter for one or more image features at the periphery of a set of two or more captured three-dimensional images;
an image processor to apply an image distortion to the image features in at least one of the three-dimensional images of the set, in dependence upon the detected depth parameter; and
a combiner to generate a composite image from the set of three-dimensional images.
2. Apparatus according to claim 1, in which the three-dimensional images are stereoscopic images.
3. Apparatus according to claim 2, in which the detector is configured to detect image disparity of the one or more image features.
4. Apparatus according to any one of claims 1 to 3, comprising a camera arrangement to capture two or more three-dimensional images.
5. Apparatus according to claim 4, in which the camera arrangement comprises two or more cameras disposed relative to one another so as to capture respective three-dimensional images.
6. Apparatus according to claim 5, in which the respective fields of view of the two or more cameras partially overlap.
7. Apparatus according to any one of the preceding claims, in which the one or more image features comprise image features which appear in overlapping portions of two or more of the three-dimensional images.
8. Apparatus according to any one of the preceding claims, in which the image processor is configured to apply a degree of distortion to an image feature in dependence upon a difference in depth parameter for that image feature in two or more of the three-dimensional images.
9. Apparatus according to any one of the preceding claims, in which the distortion is a horizontal expansion or contraction.
10. An image processing method comprising:
detecting a depth parameter for one or more image features at the periphery of a set of two or more captured three-dimensional images;
applying an image distortion to the image features in at least one of the three-dimensional images of the set, in dependence upon the detected depth parameter; and
generating a composite image from the set of three-dimensional images.
11. Computer software which, when executed by a computer, causes the computer to perform the method of claim 10.
12. A non-transitory, machine-readable storage medium which stores computer software according to claim 11.
Intellectual Property Office, Application No: GB1622195.4
GB1622195.4A 2016-12-23 2016-12-23 Image processing Withdrawn GB2558286A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1622195.4A GB2558286A (en) 2016-12-23 2016-12-23 Image processing

Publications (2)

Publication Number Publication Date
GB201622195D0 GB201622195D0 (en) 2017-02-08
GB2558286A 2018-07-11

Family

Family ID: 58360709

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1622195.4A Withdrawn GB2558286A (en) 2016-12-23 2016-12-23 Image processing

Country Status (1)

Country Link
GB (1) GB2558286A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160088280A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Camera system for three-dimensional video

Also Published As

Publication number Publication date
GB201622195D0 (en) 2017-02-08

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)