CN103053169B - Image processing apparatus and image processing method


Info

Publication number
CN103053169B
Authority
CN
China
Prior art keywords
blank region
image
transformed image
value
Legal status
Expired - Fee Related
Application number
CN201280002319.7A
Other languages
Chinese (zh)
Other versions
CN103053169A (en)
Inventor
矶贝邦昭
三崎正之
田川润一
河村岳
藤井隆志
Current Assignee
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Matsushita Electric Industrial Co Ltd
Application filed by Matsushita Electric Industrial Co Ltd
Publication of CN103053169A
Application granted
Publication of CN103053169B
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G06T 15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/111 Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/128 Adjusting depth or disparity

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The apparatus includes: a data input unit (101) that receives an input image (121), depth data (122), and imaging parameters (123); a parameter input unit (102) that receives a transformation parameter (124), which is a parameter of a projective transformation applied to a three-dimensional (3D) model; a transformed image generation unit (103) that generates a transformed image (131) by performing the projective transformation based on the transformation parameter (124) on the 3D model obtained from the input image (121), the depth data (122), and the imaging parameters (123); a blank region detection unit (104) that detects, in the transformed image (131), a blank region, i.e., a set of blank pixels that have no corresponding pixel in the input image (121); and an output unit (105) that outputs the transformed image (131) when a blank value representing the size of the blank region is equal to or smaller than a threshold.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus and an image processing method that generate a transformed image by performing a projective transformation on a three-dimensional (3D) model obtained from an input image.
Background Art
In recent years, it has become common for users to edit images captured with a digital still camera, a digital video camera, or the like. For example, a user specifies a partial region of an input image via a GUI (Graphical User Interface) and enlarges the image of the subject captured in that region. In such a case, however, the image of the subject is merely scaled up or down, and an image that looks as if the camera had moved closer to the subject cannot be obtained.
A method has also been proposed in which input images of a subject captured from multiple directions are combined to generate a 3D model, and a subject image is generated by performing a projective transformation on the generated 3D model from an arbitrary viewpoint position (see, for example, Patent Literature 1). With such a method, it is possible, for example, to obtain a subject image that looks as if the camera had moved closer to the subject, or a subject image captured from a direction different from that of the actual camera.
Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2002-290964
Brief Summary of the Invention
Problem to Be Solved by the Invention
However, with the conventional method described above, the image quality of the transformed image may deteriorate significantly.
Summary of the Invention
The present invention has been made in view of the above problem, and an object thereof is to provide an image processing apparatus and an image processing method capable of suppressing deterioration of the image quality of a transformed image when the transformed image is generated by performing a projective transformation on a 3D model obtained from an input image.
Means for Solving the Problem
An image processing apparatus according to an aspect of the present invention includes: a data input unit that receives an input image, depth data representing the depth of the input image, and imaging parameters of the input image; a parameter input unit that receives a transformation parameter, which is a parameter of a projective transformation applied to a 3D model; a transformed image generation unit that generates a transformed image by performing the projective transformation based on the transformation parameter on the 3D model obtained from the input image, the depth data, and the imaging parameters; a blank region detection unit that detects, in the transformed image, a blank region that is a set of blank pixels, a blank pixel being a pixel having no corresponding pixel in the input image; and an output unit that outputs the transformed image when a blank value representing the size of the blank region is equal to or smaller than a threshold.
These general or specific aspects may be implemented as a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), or as any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
Effects of the Invention
According to an aspect of the present invention, deterioration of the image quality of a transformed image can be suppressed when the transformed image is generated by performing a projective transformation on a 3D model obtained from an input image.
Brief Description of the Drawings
Fig. 1 is a block diagram showing the functional configuration of an image processing apparatus according to Embodiment 1.
Fig. 2 is a flowchart showing the processing operation of the image processing apparatus according to Embodiment 1.
Fig. 3 is a diagram for explaining an example of the processing performed by the transformed image generation unit according to Embodiment 1.
Fig. 4 is a diagram for explaining an example of the processing performed by the transformed image generation unit according to Embodiment 1.
Fig. 5 is a diagram for explaining an example of the processing performed by the transformed image generation unit according to Embodiment 1.
Fig. 6 is a block diagram showing the functional configuration of an image processing apparatus according to Embodiment 2.
Fig. 7 is a flowchart showing the processing operation of the image processing apparatus according to Embodiment 2.
Fig. 8 is a block diagram showing the functional configuration of an image processing apparatus according to Embodiment 3.
Fig. 9 is a flowchart showing the processing operation of the image processing apparatus according to Embodiment 3.
Fig. 10A is a diagram showing an example of an input image.
Fig. 10B is a diagram showing an example of depth data.
Fig. 11 is a diagram showing an example of the GUI in its initial state.
Fig. 12 is a diagram showing an example of the GUI displaying a transformed image generated based on an input transformation parameter.
Fig. 13 is a diagram showing an example of the GUI displaying a transformed image (blank value > threshold) generated based on an interpolated transformation parameter.
Fig. 14 is a diagram showing an example of the GUI displaying a transformed image (blank value = threshold) generated based on an interpolated transformation parameter.
Fig. 15 is a diagram showing an example of a transformed image obtained by filling the blank region.
Fig. 16 is a block diagram showing the functional configuration of a display device according to a variation of Embodiments 1 to 3.
Description of Embodiments
(Outline of the Present Invention)
With the conventional method, the image quality of the transformed image may deteriorate significantly. For example, a blank region having no corresponding image in the input image may appear in the transformed image generated by the projective transformation, and in such a case the image quality of the transformed image deteriorates significantly.
Therefore, an image processing apparatus according to an aspect of the present invention includes: a data input unit that receives an input image, depth data representing the depth of the input image, and imaging parameters of the input image; a parameter input unit that receives a transformation parameter, which is a parameter of a projective transformation applied to a 3D model; a transformed image generation unit that generates a transformed image by performing the projective transformation based on the transformation parameter on the 3D model obtained from the input image, the depth data, and the imaging parameters; a blank region detection unit that detects, in the transformed image, a blank region that is a set of blank pixels, a blank pixel being a pixel having no corresponding pixel in the input image; and an output unit that outputs the transformed image when a blank value representing the size of the blank region is equal to or smaller than a threshold.
With this configuration, whether to output the transformed image can be switched according to the size of the blank region detected in the transformed image. This prevents a transformed image whose image quality has deteriorated significantly because of a large blank region from being output. Furthermore, when the blank region is large, performing the projective transformation based on, for example, another transformation parameter can make the blank region smaller, so that deterioration of the image quality of the transformed image can be suppressed.
For example, the blank region detection unit may detect a plurality of blank regions such that each set of mutually adjacent blank pixels forms one blank region, and the output unit may output the transformed image when the largest of the blank values of the plurality of blank regions is equal to or smaller than the threshold.
In general, a small number of large blank regions degrades image quality more than many small blank regions scattered over the image. With this configuration, the transformed image can be output when the largest of the blank values of the plurality of blank regions is equal to or smaller than the threshold. That is, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
For example, the blank value may be weighted such that the closer the position of a blank region is to the center of the transformed image, the larger the blank value of that blank region becomes.
In general, a blank region located at the center of the image degrades image quality more than a blank region located at the edge of the image. With this configuration, whether to output the transformed image can be switched using a blank value weighted such that the closer a blank region is to the center of the transformed image, the larger its blank value becomes. That is, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
For example, the blank value may be weighted such that the smaller the aspect ratio of a blank region is, the larger the blank value of that blank region becomes.
In general, for the same area, a blank region with a smaller aspect ratio (that is, whose vertical and horizontal extents are closer to each other) degrades image quality more. With this configuration, whether to output the transformed image can be switched using a blank value weighted such that the smaller the aspect ratio of a blank region is, the larger its blank value becomes. That is, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
For example, the image processing apparatus may further include a filling unit that fills the pixel values of the pixels of the blank region based on the pixel values of pixels near the blank region, and the output unit may output, when the blank value is equal to or smaller than the threshold, the transformed image in which the pixel values of the pixels of the blank region have been filled.
With this configuration, the pixel values of the pixels of the blank region can be filled based on the pixel values of pixels near the blank region, so that deterioration of the image quality caused by the blank region can be suppressed.
For example, the parameter input unit may accept from a user an instruction to change the viewpoint position for the input image as the input of the transformation parameter, and the transformed image generation unit may generate the transformed image by performing the projective transformation from the viewpoint position determined by the transformation parameter.
With this configuration, the projective transformation can be performed based on the instruction, accepted from the user, to change the viewpoint position for the input image.
For example, the output unit may further output, when the blank value is larger than the threshold, information requesting the input of another transformation parameter.
With this configuration, the input of another transformation parameter can be requested when the deterioration of the image quality of the transformed image is large. As a result, a transformed image can also be generated based on the other transformation parameter, so that a transformed image in which the deterioration of image quality caused by the blank region is suppressed can be generated.
For example, the output unit may output, when the blank value is larger than the threshold, information for highlighting an object used to accept from the user an instruction to initialize the transformation parameter, as the information requesting the input of the other transformation parameter.
With this configuration, when the deterioration of the image quality of the transformed image is large, the object used to accept from the user an instruction to initialize the transformation parameter can be highlighted. This prompts the user to initialize the transformation parameter, so that deterioration of the image quality of the transformed image can be suppressed. In addition, the operability for the user can be improved in a user interface used for image processing.
For example, the output unit may output, when the blank value is larger than the threshold, a transformed image that was output previously.
With this configuration, a previously output transformed image can be output when the deterioration of the image quality of the transformed image is large, so that a transformed image with deteriorated image quality can be prevented from being output.
For example, the image processing apparatus may further include a parameter calculation unit that, when the blank value is larger than the threshold, calculates an interpolated transformation parameter by interpolating between the transformation parameter corresponding to the input image and the transformation parameter accepted by the parameter input unit, and the transformed image generation unit may further newly generate a transformed image by performing the projective transformation based on the calculated interpolated transformation parameter.
With this configuration, when the deterioration of the image quality of the transformed image is large, an interpolated transformation parameter closer to the transformation parameter corresponding to the input image than the accepted transformation parameter can be calculated automatically, so that deterioration of the image quality of the transformed image can be suppressed.
For example, the parameter calculation unit may successively calculate interpolated transformation parameters so that the interpolated transformation parameter gradually approaches the transformation parameter corresponding to the input image, starting from the transformation parameter accepted by the parameter input unit, until the blank value becomes equal to or smaller than the threshold; the transformed image generation unit may successively generate transformed images based on the interpolated transformation parameters; and the output unit may successively output the transformed images.
With this configuration, the interpolated transformation parameter can be calculated so as to gradually approach the transformation parameter corresponding to the input image. As a result, a transformation parameter with which the transformed image satisfies a preset image quality can be calculated automatically. That is, deterioration of the image quality of the transformed image can be suppressed.
Furthermore, with this configuration, the transformed images generated based on the interpolated transformation parameters calculated in this way can be output successively, so that the improvement of the image quality of the transformed image can be presented to the user as an animation.
For example, the output unit may further output information representing the relation between the blank value and the threshold.
With this configuration, the degree of image quality deterioration can be presented to the user, so that the operability for the user can be improved in a user interface used for image processing.
These general or specific aspects may be implemented as a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
Embodiments of the present invention are described below with reference to the drawings.
The embodiments described below each show a specific example of the present invention. The numerical values, shapes, materials, constituent elements, arrangement and connection of the constituent elements, steps, order of the steps, and the like shown in the following embodiments are merely examples and are not intended to limit the present invention. Among the constituent elements of the following embodiments, those not recited in the independent claims representing the broadest concept are described as optional constituent elements.
(Embodiment 1)
Fig. 1 is a block diagram showing the functional configuration of an image processing apparatus according to Embodiment 1. This image processing apparatus 100 includes a data input unit 101, a parameter input unit 102, a transformed image generation unit 103, a blank region detection unit 104, and an output unit 105.
The data input unit 101 receives an input image 121, depth data 122, and imaging parameters 123. Hereinafter, the combination of the input image 121, the depth data 122, and the imaging parameters 123 is referred to as input data.
Specifically, the data input unit 101 reads the input data from, for example, a storage device such as an HDD (Hard Disk Drive) or a flash memory. Alternatively, the data input unit 101 obtains the input data from, for example, an external device connected via a network.
The depth data 122 is data representing the depth of the input image. When the input image is an image captured by a camera, the depth represents the distance from the camera to the subject. Specifically, the depth data 122 includes, for example, a depth value for each pixel of the input image.
The imaging parameters 123 are parameters representing the conditions under which the input image was captured. In this embodiment, the imaging parameters include the angle of view, the near-end distance of the depth, and the far-end distance of the depth. Note that the imaging parameters 123 do not necessarily have to include the angle of view, the near-end distance of the depth, and the far-end distance of the depth. For example, if the depth values included in the depth data 122 are normalized values, the imaging parameters 123 need not include the near-end distance and the far-end distance of the depth.
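As a rough, non-limiting illustration, the input data described above can be grouped as in the following sketch (Python/NumPy). The class and field names (InputData, fov_y, z_near, z_far) are ours, not the patent's; the sketch assumes a per-pixel depth map and the three imaging parameters named in this embodiment.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class InputData:
    """Combination of input image, depth data, and imaging parameters (hypothetical names)."""
    image: np.ndarray    # H x W x 3 input image 121
    depth: np.ndarray    # H x W depth values (depth data 122), one per pixel
    fov_y: float         # angle of view, in radians (imaging parameter 123)
    z_near: float        # near-end distance of the depth
    z_far: float         # far-end distance of the depth

    @property
    def aspect(self) -> float:
        h, w = self.depth.shape
        return w / h
```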
The parameter input unit 102 receives a transformation parameter 124, which is a parameter of the projective transformation applied to the 3D model. For example, the parameter input unit 102 accepts from the user an instruction to change the viewpoint position of the input image 121 as the input of the transformation parameter.
The transformation parameter 124 does not necessarily have to be an instruction to change the viewpoint position. For example, the transformation parameter may be information indicating the type of projective transformation (for example, perspective projection or orthogonal projection). As another example, the transformation parameter 124 may be information indicating the angle of view.
The transformed image generation unit 103 generates a transformed image 131 by performing the projective transformation based on the transformation parameter 124 on the 3D model obtained from the input image 121, the depth data 122, and the imaging parameters 123. The details of the transformed image generation unit 103 are described later with reference to other drawings.
The blank region detection unit 104 detects, in the transformed image 131, a blank region that is a set of blank pixels. In this embodiment, the blank region detection unit 104 detects blank regions such that each set of mutually adjacent blank pixels forms one blank region.
Here, a blank pixel is a pixel in the transformed image that has no corresponding pixel in the input image. That is, a blank pixel is a pixel at a position corresponding to an image that was not captured in the input image. In other words, a blank pixel is a pixel that has no pixel value or whose pixel value remains at its initial value.
Note that the blank region detection unit 104 does not necessarily have to detect a set of mutually adjacent blank pixels as one blank region. For example, the blank region detection unit 104 may detect a set of (mutually non-adjacent) blank pixels located discretely within a predetermined range as one blank region.
The output unit 105 outputs the transformed image 131 when the blank value representing the size of the blank region is equal to or smaller than a threshold. In this embodiment, the output unit 105 outputs the transformed image 131 when the largest of the blank values of the plurality of blank regions is equal to or smaller than the threshold.
Here, the ratio of the number of pixels in the blank region to the number of pixels in the transformed image 131 is used as the blank value. The blank value does not necessarily have to be such a value, however. For example, the blank value may be the number of pixels or the area of the blank region. As another example, the blank value may be at least one of the horizontal extent and the vertical extent of the blank region.
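The following sketch illustrates one way the blank region detection unit 104 and the output decision of the output unit 105 could be implemented, assuming the transformed image comes with a boolean mask of blank pixels and using 4-connectivity for "mutually adjacent". The function names are ours; the patent does not prescribe this implementation.

```python
import numpy as np

def detect_blank_regions(blank_mask: np.ndarray) -> list[np.ndarray]:
    """Group mutually adjacent blank pixels (4-connectivity) into blank regions."""
    h, w = blank_mask.shape
    labels = -np.ones((h, w), dtype=int)
    regions = []
    for y in range(h):
        for x in range(w):
            if blank_mask[y, x] and labels[y, x] < 0:
                # flood-fill one connected set of blank pixels
                stack, pixels = [(y, x)], []
                labels[y, x] = len(regions)
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and blank_mask[ny, nx] and labels[ny, nx] < 0:
                            labels[ny, nx] = len(regions)
                            stack.append((ny, nx))
                regions.append(np.array(pixels))
    return regions

def should_output(blank_mask: np.ndarray, threshold: float) -> bool:
    """Output the transformed image only if the largest blank value is <= threshold.

    Here the blank value of a region is its pixel count divided by the total pixel count.
    """
    regions = detect_blank_regions(blank_mask)
    if not regions:
        return True
    max_blank_value = max(len(r) for r in regions) / blank_mask.size
    return max_blank_value <= threshold
```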
Next, the operation of the image processing apparatus 100 configured as described above is explained.
Fig. 2 is a flowchart showing the processing operation of the image processing apparatus 100 according to Embodiment 1. The following describes the case where an instruction to change the viewpoint position is input by the user.
First, the data input unit 101 receives the input data (S101). Next, the parameter input unit 102 accepts from the user an instruction to change the viewpoint position (the transformation parameter) (S102). Specifically, the parameter input unit 102 obtains, for example, an instruction to change the viewpoint position for the input image displayed on the screen. In this case, the user inputs the viewpoint change instruction by, for example, indicating a position on the screen via an input device such as a mouse.
The transformed image generation unit 103 generates a transformed image by changing the viewpoint position according to the change instruction and performing the projective transformation on the 3D model obtained from the input image, the depth data, and the imaging parameters (S103). The process of generating the transformed image is described later.
Next, the blank region detection unit 104 detects blank regions in the transformed image (S104). Here, the blank region detection unit 104 detects each set of mutually adjacent blank pixels as one blank region.
The blank region detection unit 104 then determines whether the blank value is equal to or smaller than the threshold (S105). Specifically, when a plurality of blank regions are detected, the blank region detection unit 104 determines whether the maximum blank value is equal to or smaller than the threshold. The threshold may be set experimentally or empirically according to the degree to which the image quality of the transformed image deteriorates because of blank regions.
If the blank value is equal to or smaller than the threshold (Yes in S105), the output unit 105 outputs the transformed image (S106), and the processing ends. As a result, the transformed image converted according to the viewpoint change instruction is displayed on the screen, for example.
On the other hand, if the blank value is larger than the threshold (No in S105), the image processing apparatus 100 ends the processing without outputting the transformed image. As a result, the input image remains displayed on the screen, for example. That is, the change of the viewpoint position by the user is restricted.
As described above, by switching whether to output the transformed image according to the size of the blank regions, the image processing apparatus 100 can restrict the image editing performed by the user.
Next, the process of generating the transformed image performed by the transformed image generation unit 103 is described in detail with reference to Figs. 3 to 5. Here, the case where an instruction to change the viewpoint position is input as the transformation parameter is described.
Figs. 3 to 5 are diagrams for explaining an example of the processing performed by the transformed image generation unit 103 according to Embodiment 1. In Figs. 3 to 5, the X-axis and Y-axis directions represent the horizontal and vertical directions, and the Z-axis direction represents the depth direction.
First, as shown in Fig. 3, the transformed image generation unit 103 uses the input image 121 expressed in the screen coordinate system (Fig. 3(a)) and the depth data 122 to generate a 3D model expressed in the projection coordinate system (Fig. 3(b)). That is, the transformed image generation unit 103 uses the input image 121 and the depth data 122 to calculate, for each pixel, a vector Vp(X, Y, Z) representing the position of the pixel in the projection coordinate system.
The screen coordinate system is the two-dimensional coordinate system corresponding to the display screen. The projection coordinate system, also called the clip coordinate system or the device coordinate system, is the three-dimensional coordinate system obtained by performing a projective transformation on the camera coordinate system. The camera coordinate system, also called the view coordinate system, is the three-dimensional coordinate system defined by the viewpoint (camera) position and the viewing direction. In this embodiment, in the camera coordinate system, the viewpoint position coincides with the origin and the viewing direction coincides with the Z-axis direction (depth direction).
Next, the transformed image generation unit 103 uses the imaging parameters to convert the 3D model from the projection coordinate system to the camera coordinate system (Fig. 3(c)). Specifically, the transformed image generation unit 103 converts the vector Vp representing the position of each pixel in the projection coordinate system into a vector Vc representing the position of that pixel in the camera coordinate system, as shown in Equations (1) to (3). In Equations (1) to (3), the vectors Vp and Vc are expressed in homogeneous coordinates.
[Numerical Expression 1]
$$V_c = \begin{bmatrix} x'/w' & y'/w' & z'/w' & 1 \end{bmatrix} \qquad (1)$$
Here, x', y', z', and w' are calculated by Equation (2).
[Numerical Expression 2]
$$\begin{bmatrix} x' & y' & z' & w' \end{bmatrix} = V_p \times M_{pc} = \begin{bmatrix} x & y & z & 1 \end{bmatrix} \times M_{pc} \qquad (2)$$
Here, the matrix Mpc is the inverse of the projection matrix Mcp. The projection matrix Mcp is expressed as Equation (3), using the near-end distance zn of the depth, the far-end distance zf of the depth, the angle of view fovY, and the aspect ratio Aspect.
[Numerical Expression 3]
$$M_{cp} = \begin{bmatrix} \dfrac{\cot\left(\frac{fovY}{2}\right)}{Aspect} & 0 & 0 & 0 \\ 0 & \cot\left(\frac{fovY}{2}\right) & 0 & 0 \\ 0 & 0 & \dfrac{zf}{zf - zn} & 1 \\ 0 & 0 & \dfrac{-zn \cdot zf}{zf - zn} & 0 \end{bmatrix} \qquad (3)$$
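For illustration, Equation (3) corresponds to the following row-vector projection matrix, from which Mpc of Equation (2) is obtained as the matrix inverse. This is a sketch in Python/NumPy; the function and argument names are ours.

```python
import numpy as np

def projection_matrix(fov_y: float, aspect: float, z_near: float, z_far: float) -> np.ndarray:
    """Projection matrix Mcp of Equation (3), for row vectors: v_clip = v_cam @ Mcp."""
    cot = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [cot / aspect, 0.0, 0.0,                                0.0],
        [0.0,          cot, 0.0,                                0.0],
        [0.0,          0.0, z_far / (z_far - z_near),           1.0],
        [0.0,          0.0, -z_near * z_far / (z_far - z_near), 0.0],
    ])

# Mpc, used in Equation (2), is simply the inverse:
# Mpc = np.linalg.inv(projection_matrix(fov_y, aspect, z_near, z_far))
```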
Next, as shown in Fig. 4, the transformed image generation unit 103 changes the viewpoint position of the 3D model in the camera coordinate system according to the viewpoint change instruction (Fig. 4(a) and Fig. 4(b)). Specifically, the transformed image generation unit 103 calculates a vector Vc' representing the position of each pixel after the viewpoint change by moving the vector Vc, which represents the position of each pixel in the camera coordinate system, in the direction opposite to the movement of the viewpoint position.
The transformed image generation unit 103 then converts the 3D model after the viewpoint change from the camera coordinate system to the projection coordinate system (Fig. 4(c)). Specifically, the transformed image generation unit 103 converts the vector Vc' representing the position of each pixel after the viewpoint change in the camera coordinate system into a vector Vp' representing the position of that pixel in the projection coordinate system. That is, the transformed image generation unit 103 calculates the vector Vp' by performing the projective transformation shown in Equation (4).
[Numerical Expression 4]
$$V_p' = V_c' \times M_{cp} \qquad (4)$$
In this embodiment, the projection matrix Mcp in Equation (4) is the same as the projection matrix Mcp in Equation (3). However, the projection matrix used here does not necessarily have to be this projection matrix. It may be, for example, an orthogonal projection matrix, or a projection matrix involving rotation or inversion. That is, when the transformation parameter includes information indicating a projection matrix, the projection matrix indicated by the transformation parameter may be used.
Finally, as shown in Fig. 5, the transformed image generation unit 103 generates the transformed image 131 expressed in the screen coordinate system (Fig. 5(a) and Fig. 5(b)) from the 3D model expressed in the projection coordinate system. As shown in Fig. 5(b), a blank region 132 appears in the transformed image 131. The blank region 132 is a set of blank pixels that have no corresponding pixel in the input image. Whether to output the transformed image 131 is determined using the blank value (for example, the number of pixels) representing the size of this blank region 132.
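Putting Equations (1) to (4) together, the per-pixel processing of Figs. 3 to 5 can be read as a forward warp: unproject each input pixel with Mpc, move it opposite to the viewpoint shift, reproject it with Mcp, and mark every destination pixel that receives no source pixel as a blank pixel. The sketch below follows that reading under simplifying assumptions (nearest-pixel splatting, no depth test); it is not the patent's exact implementation.

```python
import numpy as np

def generate_transformed_image(image: np.ndarray, proj_xyz: np.ndarray,
                               mcp: np.ndarray, viewpoint_shift) -> tuple:
    """Forward-warp `image` for a viewpoint change and report the blank pixels.

    image:           H x W x 3 input image 121
    proj_xyz:        H x W x 3 per-pixel positions Vp in the projection coordinate
                     system (x, y in [-1, 1], z derived from the depth data 122)
    mcp:             4 x 4 projection matrix Mcp of Equation (3)
    viewpoint_shift: (dx, dy, dz) movement of the viewpoint in camera coordinates
    """
    h, w, _ = image.shape
    mpc = np.linalg.inv(mcp)                      # Mpc of Equation (2)

    out = np.zeros_like(image)
    blank = np.ones((h, w), dtype=bool)

    # Vp = [x y z 1] in homogeneous coordinates, one row per pixel.
    vp = np.concatenate([proj_xyz.reshape(-1, 3), np.ones((h * w, 1))], axis=1)

    # Equations (1)-(2): camera coordinates Vc, de-homogenized.
    vc = vp @ mpc
    vc = vc / vc[:, 3:4]

    # Fig. 4: move every point opposite to the viewpoint movement.
    vc[:, :3] -= np.asarray(viewpoint_shift, dtype=float)

    # Equation (4): back to the projection coordinate system.
    vp2 = vc @ mcp
    w_clip = np.where(np.abs(vp2[:, 3:4]) < 1e-6, 1e-6, vp2[:, 3:4])  # avoid division by zero
    vp2 = vp2 / w_clip

    # Fig. 5: rasterize to screen coordinates (nearest pixel); destination pixels
    # that receive no source pixel remain blank pixels of the blank region 132.
    xs = np.round((vp2[:, 0] * 0.5 + 0.5) * (w - 1)).astype(int)
    ys = np.round((0.5 - vp2[:, 1] * 0.5) * (h - 1)).astype(int)
    ok = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
    out[ys[ok], xs[ok]] = image.reshape(-1, 3)[ok]
    blank[ys[ok], xs[ok]] = False

    return out, blank
```

The boolean mask returned here is the kind of input the blank region detection sketched earlier would operate on.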
As described above, the image processing apparatus 100 according to this embodiment can switch whether to output the transformed image according to the size of the blank region detected in the transformed image. This prevents a transformed image whose image quality has deteriorated significantly because of a large blank region from being output. Furthermore, when the blank region is large, performing the projective transformation based on, for example, another transformation parameter can reduce the blank region, so that deterioration of the image quality of the transformed image can be suppressed.
In general, a small number of large blank regions degrades image quality more than many small blank regions scattered over the image. With the image processing apparatus 100 according to this embodiment, the transformed image can be output when the largest of the blank values of the plurality of blank regions is equal to or smaller than the threshold. That is, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
(Embodiment 2)
Next, Embodiment 2 is described. The image processing apparatus according to this embodiment differs from the image processing apparatus according to Embodiment 1 mainly in that it fills the pixel values of the pixels of the blank region formed in the transformed image and outputs the filled transformed image.
Fig. 6 is a block diagram showing the functional configuration of an image processing apparatus 110 according to Embodiment 2. In Fig. 6, components identical to those in Fig. 1 are given the same reference numerals, and their description is omitted as appropriate.
As shown in Fig. 6, the image processing apparatus 110 according to this embodiment further includes a filling unit 111.
The filling unit 111 fills the pixel values of the pixels of the blank region based on the pixel values of pixels near the blank region. Specifically, the filling unit 111 sets, for example, the pixel value of a pixel adjacent to the blank region as the pixel value of a pixel of the blank region. Alternatively, the filling unit 111 may interpolate the pixel values of the pixels of the blank region using the pixel values of pixels near the blank region.
When the blank value is equal to or smaller than the threshold, the output unit 105 outputs the transformed image in which the pixel values of the pixels of the blank region have been filled by the filling unit 111.
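A minimal sketch of the kind of filling the filling unit 111 could perform is shown below, assuming the simple strategy described above: pixel values of non-blank neighbours are propagated (and averaged) into the blank region, layer by layer, until the region is filled. The function name is ours, and other interpolation schemes are equally possible.

```python
import numpy as np

def fill_blank_pixels(image: np.ndarray, blank_mask: np.ndarray) -> np.ndarray:
    """Fill blank pixels from adjacent non-blank neighbours, layer by layer."""
    out = image.copy()
    blank = blank_mask.copy()
    h, w = blank.shape
    while blank.any():
        progressed = False
        for y, x in zip(*np.nonzero(blank)):
            neighbours = [(y + dy, x + dx)
                          for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                          if 0 <= y + dy < h and 0 <= x + dx < w and not blank[y + dy, x + dx]]
            if neighbours:
                # average of the already-known neighbouring pixel values
                out[y, x] = np.mean([out[ny, nx] for ny, nx in neighbours], axis=0)
                blank[y, x] = False
                progressed = True
        if not progressed:      # nothing left to propagate from (fully blank image)
            break
    return out
```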
Next, the operation of the image processing apparatus 110 configured as described above is explained.
Fig. 7 is a flowchart showing the processing operation of the image processing apparatus 110 according to Embodiment 2. In Fig. 7, steps in which the same processing as in Fig. 2 is performed are given the same reference numerals, and their description is omitted as appropriate.
As in Embodiment 1, the image processing apparatus 110 performs the processing of steps S101 to S105. If the blank value is equal to or smaller than the threshold (Yes in S105), the filling unit 111 fills the pixel values of the pixels of the blank region based on the pixel values of pixels near the blank region (S111). The output unit 105 then outputs the transformed image in which the pixel values of the pixels of the blank region have been filled (S112).
On the other hand, if the blank value is larger than the threshold (No in S105), the image processing apparatus 110 ends the processing as it is. That is, the transformed image is not output.
As described above, the image processing apparatus 110 according to this embodiment can fill the pixel values of the pixels of the blank region based on the pixel values of pixels near the blank region. This allows the image processing apparatus 110 to suppress the deterioration of image quality caused by the blank region.
(execution mode 3)
The image processing apparatus of one embodiment 3 automatically generates other transformation parameters when blank value is larger than threshold value.Further, the image processing apparatus of one embodiment 3 is based on the newly-generated newly-generated changing image of other transformation parameters.Below, with reference to accompanying drawing, the image processing apparatus about present embodiment is specifically described.
Fig. 8 is the block diagram of the functional structure of the image processing apparatus 115 representing one embodiment 3.In addition, in fig. 8, give identical label for the component part same with Fig. 1, suitably omit the description.
As shown in Figure 8, image processing apparatus 115 possesses data input part 101, parameters input portion 102, clear area detecting part 104, calculation of parameter portion 116, changing image generating unit 117 and efferent 118.
Calculation of parameter portion 116, when blank value is larger than threshold value, by carrying out interpolation between the transformation parameter corresponding with input picture and the transformation parameter accepted by parameters input portion 102, calculates interpolation transformation parameter.Such as, calculation of parameter portion 116 calculates interpolation transformation parameter by linear interpolation.
Here, the transformation parameter corresponding with input picture, is used to the transformation parameter obtaining input picture by carrying out projective transformation in 3 dimension module obtained according to input picture, depth data and acquisition parameters.Below, the transformation parameter corresponding with this input picture is called initial transformation parameter.In addition, the transformation parameter accepted by parameters input portion 102 is called Input transformation parameter.
Changing image generating unit 117 is same with the changing image generating unit 103 of execution mode 1, by carrying out the projective transformation based on Input transformation parameter, generates changing image.And then, changing image generating unit 117 when calculating interpolation transformation parameter by calculation of parameter portion 116, by carrying out the projective transformation based on the interpolation transformation parameter calculated, newly-generated changing image.
No matter whether efferent 118 blank value is below threshold value, all export the changing image generated by changing image generating unit 117.And then the information of the relation representing blank value and threshold value exports by efferent 118.Such as, expression blank value exports relative to the information of the ratio of threshold value or the difference of threshold value and blank value by efferent 118.
Next, the operation of the image processing apparatus 115 configured as described above is explained.
Fig. 9 is a flowchart showing the processing operation of the image processing apparatus 115 according to Embodiment 3. In Fig. 9, steps in which the same processing as in Fig. 2 is performed are given the same reference numerals, and their description is omitted as appropriate.
The transformed image generation unit 117 generates a transformed image by performing the projective transformation based on the transformation parameter accepted in step S102 (the input transformation parameter) (S116). The output unit 118 outputs the generated transformed image (S117).
Next, the blank region detection unit 104 detects blank regions in the transformed image (S104). The blank region detection unit 104 then determines whether the blank value representing the size of the detected blank region is equal to or smaller than the threshold (S105). If the blank value is equal to or smaller than the threshold (Yes in S105), the processing ends.
On the other hand, if the blank value is larger than the threshold (No in S105), the parameter calculation unit 116 calculates an interpolated transformation parameter (S118). Here, the parameter calculation unit 116 calculates the interpolated transformation parameter so that it gradually approaches the initial transformation parameter from the input transformation parameter. That is, when an interpolated transformation parameter has already been calculated, the parameter calculation unit 116 calculates a new interpolated transformation parameter that is closer to the initial transformation parameter than the previously calculated one.
The transformed image generation unit 117 then newly generates a transformed image by performing the projective transformation based on the calculated interpolated transformation parameter (S103). The processing of steps S117, S104, and S105 is then performed again.
As described above, the parameter calculation unit 116 successively calculates interpolated transformation parameters so that the interpolated transformation parameter gradually approaches the initial transformation parameter from the input transformation parameter, until the blank value becomes equal to or smaller than the threshold. The transformed image generation unit 117 successively generates transformed images based on the interpolated transformation parameters, and the output unit 118 successively outputs the transformed images.
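The loop of Fig. 9 can be sketched as follows for a numeric transformation parameter such as a viewpoint offset, assuming linear interpolation; generate, blank_value_of, and the step count are placeholders that the patent does not define.

```python
def refine_until_acceptable(input_param, initial_param, generate, blank_value_of,
                            threshold, steps=10):
    """Step the interpolated parameter from input_param toward initial_param (S118)
    until the blank value of the generated transformed image is <= threshold (S105)."""
    outputs = []
    param = input_param
    for i in range(steps + 1):
        image = generate(param)             # S103 / S116: projective transformation
        outputs.append(image)               # S117: each intermediate image is output
        if blank_value_of(image) <= threshold:
            break                           # preset image quality reached
        t = (i + 1) / steps                 # move closer to the initial parameter
        param = (1 - t) * input_param + t * initial_param
    return outputs
```

The step size is not specified in the embodiment; with input_param = 30, initial_param = 0, and steps = 6, this sketch happens to reproduce the 30, 25, 20 progression of the GUI example described next.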
Next, a specific example of the GUI used in the image processing described above is explained. The following describes the case where the data input unit 101 has received the input image 121 and the depth data 122 shown in Fig. 10A and Fig. 10B.
Fig. 11 is a diagram showing an example of the GUI 300 in its initial state. In the initial state, the input image 121 is displayed. The user can specify transformation parameters (here, the viewpoint position and the angle of view) via this GUI 300. As shown in Fig. 11, the GUI 300 has objects 301 to 305.
The object 301 has text boxes 301x, 301y, and 301z for displaying and inputting values representing the viewpoint position in the X-axis (horizontal), Y-axis (vertical), and Z-axis (depth) directions, respectively. Here, the amount of change from the viewpoint position corresponding to the input image is used as the value representing the viewpoint position. By entering values into the text boxes 301x to 301z, the user can change the viewpoint position of the currently displayed image (here, the input image 121).
The object 302 has buttons 302x to 302z for changing the viewpoint position. The button 302x is used to change the viewpoint position in the positive or negative X-axis direction, the button 302y in the positive or negative Y-axis direction, and the button 302z in the positive or negative Z-axis direction. By pressing these buttons 302x to 302z, the user can change the viewpoint position of the currently displayed image.
The object 303 has a text box for displaying and inputting a value representing the angle of view. By entering a value into the text box, the user can change the angle of view of the currently displayed image.
The object 304 has a slider for changing the angle of view. By moving this slider left or right, the user can change the angle of view of the currently displayed image.
The object 305 has text and an image for displaying information representing the relation between the blank value and the threshold. Here, the ratio of the blank value to the threshold is used as the information representing the relation between the blank value and the threshold. The shaded area increases or decreases according to the magnitude of the ratio of the blank value to the threshold.
Next, the change of the GUI when "30" is input as the amount of change of the viewpoint position in the X-axis direction is described using Figs. 12 to 14. Fig. 12 is a diagram showing an example of the GUI displaying a transformed image 131a generated based on the input transformation parameter. Fig. 13 is a diagram showing an example of the GUI displaying a transformed image 131b (blank value > threshold) generated based on an interpolated transformation parameter. Fig. 14 is a diagram showing an example of the GUI displaying a transformed image 131c (blank value = threshold) generated based on an interpolated transformation parameter. Here, the blank regions are shown in black.
When "30" is input as the amount of change of the viewpoint position in the X-axis direction, the following processing is performed first.
First, the parameter input unit 102 accepts the input of a transformation parameter indicating "30" as the amount of change of the viewpoint position in the X-axis direction. The transformed image generation unit 117 then generates the transformed image 131a by performing the projective transformation based on the accepted transformation parameter. The output unit 118 outputs the generated transformed image 131a. Furthermore, the output unit 118 outputs "130%" as the information representing the relation between the threshold and the blank value, which represents the size of the blank region detected in the transformed image 131a by the blank region detection unit 104.
As a result, as shown in Fig. 12, the transformed image 131a is displayed and the object 305 is updated. Since the blank value is larger than the threshold, the following processing is performed next.
First, the parameter calculation unit 116 calculates an interpolated transformation parameter by interpolating between the initial transformation parameter and the input transformation parameter. Here, by linearly interpolating between the viewpoint position represented by the input transformation parameter and the viewpoint position represented by the initial transformation parameter, the parameter calculation unit 116 calculates an interpolated transformation parameter indicating "25" as the amount of change of the viewpoint position in the X-axis direction. The transformed image generation unit 117 then generates the transformed image 131b by performing the projective transformation based on the calculated interpolated transformation parameter. The output unit 118 outputs the generated transformed image 131b. Furthermore, the output unit 118 outputs "110%" as the information representing the relation between the threshold and the blank value, which represents the size of the blank region detected in the transformed image 131b by the blank region detection unit 104.
As a result, as shown in Fig. 13, the transformed image 131b is displayed, and the objects 301 and 305 are updated. Since the blank value is still larger than the threshold, the following processing is performed again.
First, the parameter calculation unit 116 calculates an interpolated transformation parameter by interpolating between the initial transformation parameter and the input transformation parameter. Specifically, the parameter calculation unit 116 calculates an interpolated transformation parameter that is closer to the initial transformation parameter than the previously calculated interpolated transformation parameter. Here, the parameter calculation unit 116 calculates an interpolated transformation parameter indicating "20" as the amount of change of the viewpoint position in the X-axis direction. The transformed image generation unit 117 then generates the transformed image 131c by performing the projective transformation based on the calculated interpolated transformation parameter. The output unit 118 outputs the generated transformed image 131c. Furthermore, the output unit 118 outputs "100%" as the information representing the relation between the threshold and the blank value, which represents the size of the blank region detected in the transformed image 131c by the blank region detection unit 104.
As a result, as shown in Fig. 14, the transformed image 131c is displayed, and the objects 301 and 305 are updated. Since the blank value now equals the threshold, the processing ends.
As described above, with the image processing apparatus 115 according to this embodiment, when the deterioration of the image quality of the transformed image is large, an interpolated transformation parameter closer to the initial transformation parameter than the input transformation parameter can be calculated automatically. This allows the image processing apparatus 115 to suppress the deterioration of the image quality of the transformed image.
Furthermore, with the image processing apparatus 115 according to this embodiment, the interpolated transformation parameter can be calculated so as to gradually approach the initial transformation parameter. As a result, the image processing apparatus 115 can automatically calculate a transformation parameter with which the transformed image satisfies a preset image quality. That is, the image processing apparatus 115 can suppress the deterioration of the image quality of the transformed image.
Furthermore, the image processing apparatus 115 can successively output the transformed images generated based on the interpolated transformation parameters calculated in this way. This allows the image processing apparatus 115 to present the improvement of the image quality of the transformed image to the user as an animation.
In addition, with the image processing apparatus 115 according to this embodiment, the degree of image quality deterioration can be presented to the user. This allows the image processing apparatus 115 to improve the operability for the user in a user interface used for image processing.
In this embodiment, the pixel values of the blank region are left at their initial value (black), but as in Embodiment 2, they may be filled based on the pixel values near the blank region. In that case, as shown in Fig. 15, for example, the image processing apparatus 115 generates a transformed image 131d by filling the pixel values of the pixels of the blank region in the transformed image 131c, and outputs the generated transformed image 131d. This allows the image processing apparatus 115 to suppress the deterioration of image quality caused by the blank region.
The parameter calculation unit 116 does not necessarily have to calculate the transformation parameter by interpolation as described above. For example, the parameter input unit 102 may search, using a general search algorithm, for another transformation parameter that makes the blank region smaller. As a result, a transformed image whose blank value is equal to or smaller than the threshold can be found automatically.
The image processing apparatus according to an aspect of the present invention has been described above based on the embodiments, but the present invention is not limited to these embodiments. Forms obtained by applying various modifications conceivable to those skilled in the art to the embodiments, and forms constructed by combining constituent elements of different embodiments, are also included within the scope of one or more aspects of the present invention, as long as they do not depart from the gist of the present invention.
For example, in the embodiments described above, the blank value is based on the number of pixels forming the blank region, but the blank value does not necessarily have to be based on a pixel count. For example, the blank value may be a value weighted according to the position of the blank region. Specifically, the blank value may be weighted such that the closer the position of a blank region is to the center of the transformed image, the larger the blank value of that blank region becomes.
In general, a blank region located at the center of the image degrades image quality more than a blank region located at the edge of the image. Therefore, by using a blank value weighted such that the closer a blank region is to the center of the transformed image, the larger its blank value becomes, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
The blank value may also be a value weighted according to the shape of the blank region. Specifically, the blank value may be weighted such that the smaller the aspect ratio of a blank region is, the larger its blank value becomes. In other words, the blank value may be weighted such that the smaller the variation of the distance from the center of the blank region to its boundary is, the larger its blank value becomes. That is, the blank value may be weighted such that the closer the shape of a blank region is to a circle, the larger its blank value becomes.
In general, for the same area, a blank region with a smaller aspect ratio (that is, whose vertical and horizontal extents are closer to each other) degrades image quality more. Therefore, by using a blank value weighted such that the smaller the aspect ratio of a blank region is, the larger its blank value becomes, whether to output the transformed image can be switched according to the blank region in the plurality of blank regions that affects image quality the most, so that a transformed image whose image quality has deteriorated significantly can be prevented from being output.
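The two weightings discussed above could be combined as in the following sketch. The exact weighting functions are not specified in the embodiments; the centroid-distance weight, the extent-ratio weight, and their value ranges used here are our own assumptions.

```python
import numpy as np

def weighted_blank_value(region_pixels: np.ndarray, image_shape: tuple) -> float:
    """Blank value of one region = pixel ratio x center weight x shape weight.

    region_pixels: N x 2 array of (y, x) coordinates of the region's blank pixels.
    """
    h, w = image_shape
    ratio = len(region_pixels) / (h * w)

    # Center weighting: regions near the image center count more (weight in [1, 2]).
    center = np.array([(h - 1) / 2.0, (w - 1) / 2.0])
    dist = np.linalg.norm(region_pixels.mean(axis=0) - center)
    max_dist = np.linalg.norm(center)
    center_weight = 2.0 - dist / max_dist

    # Shape weighting: the closer the vertical and horizontal extents are to each
    # other (aspect ratio near 1), the larger the weight (weight in [1, 2]).
    height = np.ptp(region_pixels[:, 0]) + 1
    width = np.ptp(region_pixels[:, 1]) + 1
    shape_weight = 1.0 + min(height, width) / max(height, width)

    return ratio * center_weight * shape_weight
```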
In addition, in Embodiments 1 and 2 described above, the processing simply ends when the blank value is larger than the threshold value, but the efferent 105 may instead perform processing as follows.
For example, the efferent 105 may output, when the blank value is larger than the threshold value, information for requesting input of another transformation parameter. The image processing apparatus can thus request input of another transformation parameter when the deterioration of the image quality of the changing image is large. As a result, the image processing apparatus can generate a changing image based on the other transformation parameter, and can therefore generate a changing image in which the deterioration of image quality caused by the white space is suppressed.
In addition, for example, the efferent 105 may output, when the blank value is larger than the threshold value, information indicating that the changing image cannot be output. The user can thereby recognize, for example, that a changing image based on the input transformation parameter cannot be output.
In addition, for example, the efferent 105 may output, when the blank value is larger than the threshold value, information for highlighting an object for accepting from the user an instruction to initialize the transformation parameter, as the information for requesting input of another transformation parameter.
Here, initialization of the transformation parameter means resetting the transformation parameter to the transformation parameter corresponding to the input image. The object for accepting an instruction from the user is, for example, a GUI object such as a reset button. The information for highlighting is, for example, information indicating that the object should blink, information indicating that the color of the object should be changed to a more conspicuous color, or information indicating that the size of the object should be increased.
When the information for highlighting the object is output in this way, for example in the GUI shown in Figure 12, the object (not shown) for accepting from the user the instruction to initialize the transformation parameter is highlighted.
The image processing apparatus can thereby prompt the user to initialize the transformation parameter, and can thus suppress the deterioration of the image quality of the changing image. In addition, the image processing apparatus can also improve operability for the user.
In addition, for example, the efferent 105 may output, when the blank value is larger than the threshold value, a changing image that was output previously. Specifically, in a situation such as that shown in Figure 12, the efferent 105 may output, instead of the changing image 131a, the changing image that was output when the blank value was equal to or smaller than the threshold value. The image processing apparatus can thereby prevent a changing image with deteriorated image quality from being output.
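The output behaviour described in this group of variations could be summarised, under stated assumptions, by a small decision routine such as the following sketch; the message strings and the fall-back to a previously output changing image are illustrative stand-ins for the information the efferent 105 would actually output, not the patent's exact behaviour.

```python
def decide_output(blank_value, threshold, changing_image, last_good_image=None):
    # Output the changing image only when the blank value is small enough;
    # otherwise emit the kinds of information described above and, if
    # available, fall back to the changing image output previously.
    if blank_value <= threshold:
        return changing_image, []
    messages = [
        "the changing image cannot be output",
        "please input another transformation parameter",
        "highlight: transformation-parameter reset button",
    ]
    return last_good_image, messages
```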
In addition, the image processing apparatus according to each of the embodiments described above may be incorporated into a display device. Figure 16 is a block diagram showing the functional structure of a display unit 200 according to a variation of Embodiment 1, 2, or 3.
The display unit 200 is, for example, a television set, a digital still camera, a digital camera, a personal computer, or a portable phone. As shown in Figure 16, the display unit 200 comprises the image processing apparatus 100, 110, or 115 and a display part 201. The image processing apparatus 100, 110, or 115 outputs the changing image to the display part 201. When the display part 201 obtains the changing image from the image processing apparatus 100, 110, or 115, it displays the changing image on a screen. Specifically, the display part 201 displays, for example, GUIs such as those shown in Figures 11 to 14.
In addition, in each of the embodiments described above, the changing image is a 2-dimensional image, but it may also be a 3-dimensional image. That is, the changing image generating unit may generate a 3-dimensional image. For example, the changing image generating unit may generate the above-mentioned changing image as one of a left-eye image and a right-eye image. In this case, the changing image generating unit may generate, as the other of the left-eye image and the right-eye image, an image whose viewpoint position differs from that of the above-mentioned changing image. Further, the left-eye image and the right-eye image generated in this way may be displayed three-dimensionally by the display part.
In addition, when the left-eye image and the right-eye image are generated, the clear area detecting part may detect white spaces in both the left-eye image and the right-eye image. In this case, the blank value may be a statistical representative value (for example, a maximum value or a mean value) of a value representing the size of the white space in the left-eye image and a value representing the size of the white space in the right-eye image.
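A minimal sketch of how the two per-view blank values might be combined into a single statistical representative value, using the maximum and mean mentioned above; the function name and interface are assumptions of this sketch.

```python
def stereo_blank_value(left_blank_value, right_blank_value, mode="max"):
    # Combine the blank values of the left-eye and right-eye images into a
    # single statistical representative value ("max" and "mean" are the
    # examples given in the text).
    if mode == "max":
        return max(left_blank_value, right_blank_value)
    if mode == "mean":
        return (left_blank_value + right_blank_value) / 2.0
    raise ValueError("unsupported mode: " + mode)
```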
In addition, part or all of the constituent elements of the image processing apparatus according to each of the embodiments described above may be constituted by a single system LSI (Large Scale Integration: large-scale integrated circuit). For example, the image processing apparatus 100 may be constituted by a system LSI having the data input part 101, the parameters input portion 102, the changing image generating unit 103, the clear area detecting part 104, and the efferent 105.
A system LSI is a super multi-functional LSI manufactured by integrating a plurality of constituent parts on a single chip, and is specifically a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), and so on. A computer program is stored in the ROM. The system LSI achieves its functions through the microprocessor operating according to the computer program.
Although the term system LSI is used here, the circuit may also be called an IC, an LSI, a system LSI, a super LSI, or an ultra LSI depending on the degree of integration. The method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
Furthermore, if circuit integration technology that replaces LSI emerges as a result of progress in semiconductor technology or other derived technologies, that technology may of course be used to integrate the functional blocks. Application of biotechnology is one such possibility.
In addition, in each of the embodiments described above, each constituent element may be constituted by dedicated hardware, or may be realized by executing a software program suited to that constituent element. Each constituent element may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory. Here, the software that realizes the image processing apparatus and the like of each of the embodiments described above is a program such as the following.
That is, this program causes a computer to execute: a data input step of accepting input of an input image, depth data representing a depth of the above-mentioned input image, and acquisition parameters of the above-mentioned input image; a parameters input step of accepting input of a transformation parameter, which is a parameter relating to a projective transformation of a 3-dimensional model; a changing image generation step of generating a changing image by performing, on a 3-dimensional model obtained from the above-mentioned input image, the above-mentioned depth data, and the above-mentioned acquisition parameters, a projective transformation based on the above-mentioned transformation parameter; a white space detecting step of detecting a white space in the above-mentioned changing image, the white space being a set of blank pixels, and a blank pixel being a pixel for which no corresponding pixel exists in the above-mentioned input image; and an output step of outputting the above-mentioned changing image when a blank value representing a size of the above-mentioned white space is equal to or smaller than a threshold value.
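For orientation only, the following Python sketch mirrors the sequence of steps listed above, with the full 3-dimensional model construction and projective transformation replaced by a simple depth-dependent horizontal shift; this stand-in, the function name, and the pixel-count blank value are assumptions of the sketch, not the patent's method.

```python
import numpy as np

def generate_and_maybe_output(input_image, depth, shift_scale, threshold):
    # input_image: H x W x C array; depth: H x W array (larger = nearer).
    # shift_scale plays the role of the transformation parameter, and the
    # depth-dependent horizontal shift below is only a stand-in for the
    # projective transformation of the 3-dimensional model.
    h, w = depth.shape
    changing_image = np.zeros_like(input_image)
    filled = np.zeros((h, w), dtype=bool)

    # Forward-warp every input pixel; pixels left unfilled are blank pixels.
    for y in range(h):
        for x in range(w):
            nx = x + int(round(shift_scale * depth[y, x]))
            if 0 <= nx < w:
                changing_image[y, nx] = input_image[y, x]
                filled[y, nx] = True

    blank_value = float((~filled).sum())   # white-space size as a pixel count
    if blank_value <= threshold:
        return changing_image              # output step
    return None                            # blank value too large: not output
```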
In addition, images from the following papers were used as the originals of the images shown in Figures 10A to 15.
(1) D. Scharstein and C. Pal. Learning conditional random fields for stereo. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2007), Minneapolis, MN, June 2007.
(2) H. Hirschmuller and D. Scharstein. Evaluation of cost functions for stereo matching. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2007), Minneapolis, MN, June 2007.
Industrial applicability
The present invention can be used as an image processing apparatus that transforms an input image using depth data, or as a television set, digital still camera, digital camera, personal computer, portable phone, or the like provided with such an image processing apparatus.
Label declaration
100,110,115 image processing apparatus
101 data input part
102 parameters input portions
103,117 changing image generating units
104 clear area detecting part
105,118 efferents
111 supplementary portion
116 calculation of parameter portions
121 input pictures
122 depth datas
123 acquisition parameters
124 transformation parameters
131,131a, 131b, 131c, 131d changing image
132 white spaces
200 display unit
201 display parts
300 GUI
301,302,303,304,305 objects
301x, 301y, 301z text box
302x, 302y, 302z button

Claims (13)

1. An image processing apparatus, characterized by comprising:
a data input part that accepts input of an input image, depth data representing a depth of the above-mentioned input image, and acquisition parameters of the above-mentioned input image;
a parameters input portion that accepts input of a transformation parameter, which is a parameter relating to a projective transformation of a 3-dimensional model;
a changing image generating unit that generates a changing image by performing, on a 3-dimensional model obtained from the above-mentioned input image, the above-mentioned depth data, and the above-mentioned acquisition parameters, a projective transformation based on the above-mentioned transformation parameter;
a clear area detecting part that detects a white space in the above-mentioned changing image, the white space being a set of blank pixels, and a blank pixel being a pixel for which no corresponding pixel exists in the above-mentioned input image; and
an efferent that outputs the above-mentioned changing image when a blank value representing a size of the above-mentioned white space is equal to or smaller than a threshold value, and does not output the above-mentioned changing image when the blank value is larger than the threshold value;
wherein the above-mentioned blank value is a value weighted such that the closer the position of the white space is to the center of the changing image, the larger the blank value of that white space.
2. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned clear area detecting part detects a plurality of the above-mentioned white spaces such that a set of blank pixels adjoining each other forms one white space; and
the above-mentioned efferent outputs the above-mentioned changing image when the largest blank value among the respective blank values of the plurality of above-mentioned white spaces is equal to or smaller than the above-mentioned threshold value.
3. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned blank value is a value weighted such that the smaller the ratio of width to height of the white space, the larger the blank value of that white space.
4. The image processing apparatus as claimed in claim 1, characterized in that:
the image processing apparatus further comprises a supplementary portion that supplements pixel values of pixels of the above-mentioned white space based on pixel values of pixels near the above-mentioned white space; and
when the above-mentioned blank value is equal to or smaller than the above-mentioned threshold value, the above-mentioned efferent outputs the above-mentioned changing image after the pixel values of the pixels of the above-mentioned white space have been supplemented.
5. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned parameters input portion accepts, from a user, a change instruction for a viewpoint position of the above-mentioned input image as the input of the above-mentioned transformation parameter; and
the above-mentioned changing image generating unit generates the above-mentioned changing image by performing the projective transformation according to the viewpoint position determined by the above-mentioned transformation parameter.
6. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned efferent further outputs, when the above-mentioned blank value is larger than the above-mentioned threshold value, information for requesting input of another transformation parameter.
7. The image processing apparatus as claimed in claim 6, characterized in that:
the above-mentioned efferent outputs, when the above-mentioned blank value is larger than the above-mentioned threshold value, information for highlighting an object for accepting from a user an instruction to initialize the above-mentioned transformation parameter, as the above-mentioned information for requesting input of another transformation parameter.
8. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned efferent further outputs, when the above-mentioned blank value is larger than the above-mentioned threshold value, a changing image that was output previously.
9. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned image processing apparatus further comprises a calculation of parameter portion that, when the above-mentioned blank value is larger than the above-mentioned threshold value, calculates an interpolation transformation parameter by interpolating between the transformation parameter corresponding to the above-mentioned input image and the transformation parameter accepted by the above-mentioned parameters input portion; and
the above-mentioned changing image generating unit further newly generates a changing image by performing a projective transformation based on the calculated above-mentioned interpolation transformation parameter.
10. The image processing apparatus as claimed in claim 9, characterized in that:
the above-mentioned calculation of parameter portion successively calculates the above-mentioned interpolation transformation parameter such that it gradually approaches the transformation parameter corresponding to the above-mentioned input image from the transformation parameter accepted by the above-mentioned parameters input portion, until the above-mentioned blank value becomes equal to or smaller than the above-mentioned threshold value;
the above-mentioned changing image generating unit successively generates the above-mentioned changing image based on the above-mentioned interpolation transformation parameter; and
the above-mentioned efferent successively outputs the above-mentioned changing image.
11. The image processing apparatus as claimed in claim 1, characterized in that:
the above-mentioned efferent further outputs information representing a relation between the above-mentioned blank value and the above-mentioned threshold value.
12. The image processing apparatus according to any one of claims 1 to 11, characterized in that:
the above-mentioned image processing apparatus is formed as an integrated circuit.
13. An image processing method, characterized by comprising:
a data input step of accepting input of an input image, depth data representing a depth of the above-mentioned input image, and acquisition parameters of the above-mentioned input image;
a parameters input step of accepting input of a transformation parameter, which is a parameter relating to a projective transformation of a 3-dimensional model;
a changing image generation step of generating a changing image by performing, on a 3-dimensional model obtained from the above-mentioned input image, the above-mentioned depth data, and the above-mentioned acquisition parameters, a projective transformation based on the above-mentioned transformation parameter;
a white space detecting step of detecting a white space in the above-mentioned changing image, the white space being a set of blank pixels, and a blank pixel being a pixel for which no corresponding pixel exists in the above-mentioned input image; and
an output step of outputting the above-mentioned changing image when a blank value representing a size of the above-mentioned white space is equal to or smaller than a threshold value, and not outputting the above-mentioned changing image when the blank value is larger than the threshold value;
wherein the above-mentioned blank value is a value weighted such that the closer the position of the white space is to the center of the changing image, the larger the blank value of that white space.
CN201280002319.7A 2011-06-08 2012-06-05 Image processing apparatus and image processing method Expired - Fee Related CN103053169B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2011-128651 2011-06-08
JP2011128651 2011-06-08
PCT/JP2012/003682 WO2012169174A1 (en) 2011-06-08 2012-06-05 Image processing device and image processing method

Publications (2)

Publication Number Publication Date
CN103053169A CN103053169A (en) 2013-04-17
CN103053169B true CN103053169B (en) 2016-03-16

Family

ID=47295759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280002319.7A Expired - Fee Related CN103053169B (en) 2011-06-08 2012-06-05 Image processing apparatus and image processing method

Country Status (4)

Country Link
US (1) US9082183B2 (en)
JP (1) JP5927541B2 (en)
CN (1) CN103053169B (en)
WO (1) WO2012169174A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870103B (en) * 2012-12-14 2017-05-24 联想(北京)有限公司 Method for information processing and electronic device
CN105705903A (en) * 2013-11-06 2016-06-22 凸版印刷株式会社 3D-shape measurement device, 3D-shape measurement method, and 3D-shape measurement program
CN111183457A (en) * 2017-09-27 2020-05-19 夏普株式会社 Video image generation device, video image capturing system, video image generation method, control program, and recording medium
CN110675384B (en) * 2019-09-24 2022-06-07 广东博智林机器人有限公司 Image processing method and device
WO2021182130A1 (en) * 2020-03-12 2021-09-16 ソニーグループ株式会社 Image processing device, method, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101388967A (en) * 2008-10-20 2009-03-18 四川虹微技术有限公司 Gap filling method for view synthesis
CN101610423A (en) * 2009-07-13 2009-12-23 清华大学 A kind of method and apparatus of rendering image
CN101695139A (en) * 2009-10-14 2010-04-14 宁波大学 Gradable block-based virtual viewpoint image drawing method

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5280280A (en) * 1991-05-24 1994-01-18 Robert Hotto DC integrating display driver employing pixel status memories
JP3059590B2 (en) * 1992-09-30 2000-07-04 富士通株式会社 Stereoscopic display method and apparatus
US20020114521A1 (en) * 1993-10-14 2002-08-22 Tooru Fujii Image processing device and method for identifying an input image and copier including same
KR100414629B1 (en) * 1995-03-29 2004-05-03 산요덴키가부시키가이샤 3D display image generation method, image processing method using depth information, depth information generation method
US5923820A (en) * 1997-01-23 1999-07-13 Lexmark International, Inc. Method and apparatus for compacting swath data for printers
JP4112077B2 (en) * 1998-06-11 2008-07-02 株式会社トプコン Image measurement processing method and apparatus, and recording medium recording image measurement processing program
US6442293B1 (en) 1998-06-11 2002-08-27 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US6633685B1 (en) * 1998-08-05 2003-10-14 Canon Kabushiki Kaisha Method, apparatus, and storage media for image processing
US6631206B1 (en) * 1999-08-30 2003-10-07 University Of Washington Image filtering in HSI color space
US20020126910A1 (en) * 2001-01-02 2002-09-12 Eastman Kodak Company Method of calculating noise from multiple digital images utilizing common noise characteristics
JP3505575B2 (en) 2001-03-23 2004-03-08 独立行政法人産業技術総合研究所 Digital mirror device
US7031548B2 (en) * 2001-10-04 2006-04-18 Hewlett-Packard Development Company, L.P. Method and apparatus for filtering noise from a digital image
JP3945279B2 (en) * 2002-03-15 2007-07-18 ソニー株式会社 Obstacle recognition apparatus, obstacle recognition method, obstacle recognition program, and mobile robot apparatus
US20050196070A1 (en) * 2003-02-28 2005-09-08 Fujitsu Limited Image combine apparatus and image combining method
US8593542B2 (en) * 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US7542034B2 (en) * 2004-09-23 2009-06-02 Conversion Works, Inc. System and method for processing video images
US8320641B2 (en) * 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US8164594B2 (en) * 2006-05-23 2012-04-24 Panasonic Corporation Image processing device, image processing method, program, storage medium and integrated circuit
CA2653815C (en) * 2006-06-23 2016-10-04 Imax Corporation Methods and systems for converting 2d motion pictures for stereoscopic 3d exhibition
TWI356364B (en) * 2006-10-17 2012-01-11 Chimei Innolux Corp Liquid crystal display device and image display me
US7968831B2 (en) * 2007-06-12 2011-06-28 The Boeing Company Systems and methods for optimizing the aimpoint for a missile
JP2009005008A (en) * 2007-06-20 2009-01-08 Olympus Corp Image data processing device and image data processing method
US8005319B2 (en) * 2007-09-24 2011-08-23 Arcsoft, Inc. Method for digitally magnifying images
US8073190B2 (en) * 2007-11-16 2011-12-06 Sportvision, Inc. 3D textured objects for virtual viewpoint animations
JP2010250452A (en) * 2009-04-14 2010-11-04 Tokyo Univ Of Science Arbitrary viewpoint image synthesizing device
JP2011044828A (en) * 2009-08-19 2011-03-03 Fujifilm Corp Stereoscopic image generator, stereoscopic image printing device, and stereoscopic image generation method
JP2011081605A (en) * 2009-10-07 2011-04-21 Fujifilm Corp Image processing apparatus, method and program
US8593483B2 (en) * 2009-10-20 2013-11-26 Apple Inc. Temporal filtering techniques for image signal processing
JP5560721B2 (en) * 2010-01-12 2014-07-30 セイコーエプソン株式会社 Image processing apparatus, image display system, and image processing method
JP2011211474A (en) * 2010-03-30 2011-10-20 Sony Corp Image processing apparatus and image signal processing method
WO2012015359A1 (en) * 2010-07-26 2012-02-02 Agency For Science, Technology And Research Method and device for image processing
US8989448B2 (en) * 2011-03-22 2015-03-24 Morpho, Inc. Moving object detecting device, moving object detecting method, moving object detection program, moving object tracking device, moving object tracking method, and moving object tracking program
JP5799634B2 (en) * 2011-07-22 2015-10-28 株式会社リコー Image processing apparatus and image processing system
JP5891722B2 (en) * 2011-11-10 2016-03-23 セイコーエプソン株式会社 Control device, electro-optical device, electronic apparatus, and control method
US8941750B2 (en) * 2011-12-27 2015-01-27 Casio Computer Co., Ltd. Image processing device for generating reconstruction image, image generating method, and storage medium
US9105078B2 (en) * 2012-05-31 2015-08-11 Apple Inc. Systems and methods for local tone mapping


Also Published As

Publication number Publication date
US20130136342A1 (en) 2013-05-30
WO2012169174A1 (en) 2012-12-13
JPWO2012169174A1 (en) 2015-02-23
JP5927541B2 (en) 2016-06-01
CN103053169A (en) 2013-04-17
US9082183B2 (en) 2015-07-14

Similar Documents

Publication Publication Date Title
US10783683B2 (en) Image stitching
Liu et al. Video frame synthesis using deep voxel flow
USRE47534E1 (en) System, method and a computer readable medium for providing an output image
CN103053169B (en) Image processing apparatus and image processing method
JP5536115B2 (en) Rendering of 3D video images on stereoscopic display
US9165401B1 (en) Multi-perspective stereoscopy from light fields
US8368768B2 (en) Image processing apparatus, image processing method, and program
WO2014076868A1 (en) Image processing device and image processing method
KR102461232B1 (en) Image processing method and apparatus, electronic device, and storage medium
Chen et al. Content-aware image resizing by quadratic programming
CA2745380A1 (en) Devices and methods for processing images using scale space
WO2015156149A1 (en) Image processing apparatus and image processing method
Kanchana et al. Revealing disocclusions in temporal view synthesis through infilling vector prediction
Gallea et al. Physical metaphor for streaming media retargeting
JP7265316B2 (en) Image processing device and image processing method
KR20130098675A (en) Face detection processing circuit and image pick-up device including the same
Kim et al. FPGA implementation of stereoscopic image proceesing architecture base on the gray-scale projection
Somraj et al. Temporal view synthesis of dynamic scenes through 3D object motion estimation with multi-plane images
JP5478200B2 (en) Image processing apparatus, image processing method, and image processing program
US9736456B1 (en) Two dimensional to three dimensional video conversion
Boufama The use of homographies for view synthesis
US20140198176A1 (en) Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data
Liang et al. Fast single frame super-resolution using scale-invariant self-similarity
US20180096488A1 (en) Method of texture synthesis and image processing apparatus using the same
WO2023139757A1 (en) Pose estimation apparatus, pose estimation method, and non-transitory computer-readable storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160412

Address after: Osaka Japan

Patentee after: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT Co.,Ltd.

Address before: Osaka Japan

Patentee before: Matsushita Electric Industrial Co.,Ltd.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160316

Termination date: 20190605