WO2013133057A1 - Image processing apparatus and method, and program - Google Patents
Image processing apparatus and method, and program
- Publication number
- WO2013133057A1 (PCT/JP2013/054670)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- viewpoint
- eye
- eye image
- images
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
Definitions
- the present technology relates to an image processing apparatus, method, and program, and more particularly, to an image processing apparatus, method, and program capable of presenting a more natural stereoscopic image.
- the convergence point of a stereoscopic image obtained by the above-described technique, that is, the point where the optical axes of the two photographing units intersect, is single for one stereoscopic image. Therefore, when the user views the obtained stereoscopic image and gazes at a position on the stereoscopic image different from the position where the photographing units were converged, the parallax distribution of the stereoscopic image differs from that seen when viewing the actual object, and the image feels unnatural.
- the straight line PL11 is the line-of-sight direction of the user's left eye EL
- the straight line PL12 is the line-of-sight direction of the user's right eye ER
- the point P1 is the convergence point.
- the left side surface SD11 of the object OB11 in the drawing is observed in the left eye EL of the user, but the right side surface SD12 of the object OB11 in the drawing is not observed. Further, in the user's left eye EL, the left side surface SD13 of the object OB12 in the drawing is observed, but the right side surface SD14 of the object OB12 in the drawing is not observed.
- in the right eye ER of the user, the left side surface SD11 and the right side surface SD12 of the object OB11 are observed, and the left side surface SD13 and the right side surface SD14 of the object OB12 are observed.
- the point P2 is the convergence point.
- the left side surface SD11 and the right side surface SD12 of the object OB11 are observed in the left eye EL of the user, and the left side surface SD13 and the right side surface SD14 of the object OB12 are observed.
- the right side surface SD12 of the object OB11 is observed in the right eye ER of the user, but the left side surface SD11 of the object OB11 is not observed. Further, the right side surface SD14 of the object OB12 is observed in the right eye ER of the user, but the left side surface SD13 of the object OB12 is not observed.
- the parallax distribution is therefore different. For example, when the line-of-sight direction changes by 15 degrees, the lens surface of the human eye moves approximately 3.6 mm, and such a change in the parallax distribution occurs. When the user also changes the orientation of the face, the amount of movement becomes even larger, and the change in the parallax distribution increases accordingly.
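- as a rough sanity check of the 3.6 mm figure (an illustration added here, not part of the original text): if the lens surface lies at roughly $r \approx 13.9$ mm from the eye's center of rotation, a pure 15-degree rotation of the gaze displaces it by about

$$ r \sin 15^\circ \approx 13.9\,\mathrm{mm} \times 0.259 \approx 3.6\,\mathrm{mm}, $$

which is consistent with the quoted value.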
- the parallax distribution varies depending on the position of the convergence point. Therefore, in a stereoscopic image with a single convergence point, when the user gazes at a position different from the convergence point on the stereoscopic image, the parallax distribution differs from that observed for an actual object, and the image looks unnatural to the user.
- human eyes are highly sensitive to parallax, and such a difference in parallax distribution is perceived by the user.
- human sensitivity to spatial resolution is on the order of minutes of arc, whereas sensitivity to parallax is about one order of magnitude finer than the sensitivity to spatial resolution.
- the difference in the parallax distribution when looking at a position different from the convergence point is one factor that causes an unnatural impression due to the difference from the real object.
- the present technology has been made in view of such a situation, and makes it possible to present a more natural stereoscopic image.
- An image processing apparatus according to one aspect of the present technology includes: a position determination unit that, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, arranges the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and a synthesis processing unit that, for each viewpoint, synthesizes the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
- the image group for each gazing point may each consist of a pair of viewpoint images and have one convergence point.
- the synthesis processing unit can generate the synthesized viewpoint image by applying weights according to position within the region where the plurality of viewpoint images overlap and performing averaging filter processing on the viewpoint images.
- the plurality of the image groups may be taken at the same time.
- the plurality of the image groups may be taken at different times for each of the image groups.
- An image processing method or program according to one aspect of the present technology includes the steps of: arranging, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, the viewpoint images on a new coordinate system for each viewpoint so that the same subject overlaps; and synthesizing, for each viewpoint, the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
- in one aspect of the present technology, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, the viewpoint images are arranged on a new coordinate system so that the same subject on the viewpoint images overlaps for each viewpoint, and for each viewpoint, the plurality of viewpoint images arranged on the coordinate system are synthesized to generate a synthesized viewpoint image, whereby a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints, is generated.
- a more natural stereoscopic image can be presented.
- the present technology is for generating a natural stereoscopic image that causes less discomfort when observed by the user. First, generation of a stereoscopic image according to the present technology will be described.
- the stereoscopic image generated by the present technology includes, for example, a left-eye image observed with the user's left eye and a right-eye image observed with the user's right eye during stereoscopic display.
- the stereoscopic image may be composed of viewpoint images of three or more different viewpoints.
- hereinafter, the description will be continued on the assumption that the stereoscopic image is composed of a left-eye image and a right-eye image, that is, viewpoint images of two different viewpoints.
- a plurality of images with different convergence points are used when generating one stereoscopic image. That is, an image pair including a left eye image and a right eye image having a predetermined convergence point is prepared for each of a plurality of different convergence points. For example, when attention is paid to a right eye image constituting a stereoscopic image, as shown in FIG. 2, four right eye images PC11 to PC14 having different convergence points are combined to generate a final right eye image.
- the right eye images PC11 to PC14 are images obtained by photographing two human busts OB21 and OB22 as subjects.
- the right eye image PC11 is an image captured so that the position of the right eye of the bust OB21 becomes the convergence point, and the right eye image PC12 is an image captured so that the position of the left eye of the bust OB21 becomes the convergence point.
- likewise, the right eye image PC13 is an image captured so that the position of the right eye of the bust OB22 becomes the convergence point, and the right eye image PC14 is an image captured so that the position of the left eye of the bust OB22 becomes the convergence point.
- the four right eye images PC11 to PC14 having different convergence points are arranged on a new coordinate system (plane) so that the same subject on the images overlaps. Then, the superimposed right eye images PC11 to PC14 are combined so as to be smoothly connected to form a final right-eye image (hereinafter also referred to as a right-eye synthesized image).
- at this time, the right eye images are synthesized (that is, weighted and added) with weights according to position within the region where they overlap, to generate the right-eye synthesized image.
- for example, in a region where the right eye image PC11 and the right eye image PC13 overlap, at a position closer to the right eye image PC11 the weight for the right eye image PC11 is made larger than the weight for the right eye image PC13.
- here, a position closer to the right eye image PC11 means, for example, a position that is closer to the center of the right eye image PC11 than to the center of the right eye image PC13.
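- below is a minimal sketch of this weighting rule (our illustration, assuming NumPy; the text only requires that the nearer image's weight be larger, so the exact weight function and all names are assumptions):

```python
import numpy as np

def blend_overlap(patch_a, patch_b, xs, ys, center_a, center_b):
    """Blend the overlap of two placed images, pixel by pixel.

    patch_a, patch_b: pixel values of the two images over the overlap region.
    xs, ys: projection-plane coordinates of each pixel in the overlap region.
    center_a, center_b: (x, y) centers of the two images on that plane.
    """
    d_a = np.hypot(xs - center_a[0], ys - center_a[1])  # distance to A's center
    d_b = np.hypot(xs - center_b[0], ys - center_b[1])  # distance to B's center
    w_a = d_b / (d_a + d_b + 1e-8)  # nearer to A's center -> w_a approaches 1
    return w_a * patch_a + (1.0 - w_a) * patch_b
```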
- similarly, a plurality of left eye images having different convergence points are combined to form a final left-eye image (hereinafter also referred to as a left-eye synthesized image).
- in this way, as shown in FIG. 3 for example, a stereoscopic image composed of the right-eye synthesized image PUR and the left-eye synthesized image PUL is obtained.
- the same reference numerals are given to the portions corresponding to those in FIG. 2, and description thereof will be omitted as appropriate.
- in the addition average image PA obtained by averaging the right-eye synthesized image PUR and the left-eye synthesized image PUL, the outlines of the busts OB21 and OB22 as subjects are slightly blurred.
- this outline blur is caused by the parallax between the right-eye synthesized image PUR and the left-eye synthesized image PUL; since the outline blur of the busts OB21 and OB22 in the addition average image PA is small, it can be seen that the parallax between the right-eye synthesized image PUR and the left-eye synthesized image PUL is appropriate.
- for example, in the region near the right eye of the bust OB21, the parallax between the right-eye synthesized image PUR and the left-eye synthesized image PUL is close to the parallax of the right eye image and left eye image captured with the position of the right eye of the bust OB21 as the convergence point. That is, in the region near the right eye of the bust OB21 in the stereoscopic image, the parallax distribution is close to that obtained when the user actually gazes at the right eye of the bust OB21.
- similarly, in the region near the left eye of the bust OB22, the parallax between the right-eye synthesized image PUR and the left-eye synthesized image PUL is close to the parallax of the right eye image and left eye image captured with the position of the left eye of the bust OB22 as the convergence point. That is, the region near the left eye of the bust OB22 in the stereoscopic image has a parallax distribution close to that obtained when the user actually gazes at the left eye of the bust OB22.
- this is because the right-eye synthesized image PUR and the left-eye synthesized image PUL are generated by weighting images having different convergence points and combining them so that they are smoothly joined together.
- the convergence points of the right eye images and left eye images used to obtain the right-eye synthesized image PUR and the left-eye synthesized image PUL are set to parts of the subject that the user is likely to gaze at.
- a plurality of right eye images and left eye images having different convergence points can be obtained, for example, by photographing with a plurality of photographing devices arranged in a direction substantially perpendicular to the optical axis of each photographing device, as indicated by an arrow Q31 in FIG. 4.
- the photographing device 11R-1, the photographing device 11L-1, the photographing device 11R-2, the photographing device 11L-2, the photographing device 11R-3, and the photographing device 11L-3 are arranged in order.
- the photographing device 11R-1, the photographing device 11R-2, and the photographing device 11R-3 are photographing devices for photographing right-eye images having different convergence points.
- the photographing device 11L-1, the photographing device 11L-2, and the photographing device 11L-3 are photographing devices for photographing left-eye images having different convergence points.
- the imaging device 11R-1 and the imaging device 11L-1, the imaging device 11R-2 and the imaging device 11L-2, and the imaging device 11R-3 and the imaging device 11L-3 each form a device pair with a different convergence point.
- hereinafter, when it is not necessary to particularly distinguish the photographing devices 11R-1 to 11R-3, they are also simply referred to as the photographing devices 11R; likewise, when it is not necessary to particularly distinguish the photographing devices 11L-1 to 11L-3, they are also simply referred to as the photographing devices 11L.
- the photographing device 11R and the photographing device 11L may be arranged separately.
- a half mirror 12 that transmits half of the light from the direction of the subject and reflects the remaining half is disposed.
- on one side of the half mirror 12 in the drawing, the photographing device 11L-1, the photographing device 11L-2, and the photographing device 11L-3 are arranged in this order from the back side toward the front side. On the other side of the half mirror 12 in the drawing, the imaging device 11R-1, the imaging device 11R-2, and the imaging device 11R-3 are arranged in order from the back side toward the front side.
- each imaging device 11L captures a left-eye image by receiving light that is emitted from the subject and transmitted through the half mirror 12, and each imaging device 11R captures a right-eye image by receiving light that is emitted from the subject and reflected by the half mirror 12.
- the optical axes of the imaging devices 11R are located between the optical axes of the adjacent imaging devices 11L.
- the optical axis of the imaging device 11R-1 is located between the optical axis of the imaging device 11L-1 and the optical axis of the imaging device 11L-2.
- alternatively, a plurality of right eye images having different convergence points may be photographed almost simultaneously by one photographing device 11R-1, and a plurality of left eye images having different convergence points may be photographed almost simultaneously by one photographing device 11L-1.
- in this case, the imaging device 11R-1 and the imaging device 11L-1 are rotated about a straight line RT11 or a straight line RT12 that is substantially perpendicular to the optical axes of the imaging devices. This makes it possible to perform imaging while moving the convergence point of the imaging device 11R-1 and the imaging device 11L-1 to an arbitrary position at high speed. In this case, for example, an image pair consisting of a right eye image and a left eye image having one convergence point is captured at a different time for each convergence point.
- for example, cameras capable of photographing 240 frames per second are used as the photographing device 11R-1 and the photographing device 11L-1.
- an electronic shutter may also be used in combination.
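- a toy sketch of such time-multiplexed capture (our illustration; the 240 frames-per-second rate comes from the text, while the number of convergence points and all names are assumptions):

```python
# Assign each captured frame to a convergence point in round-robin order,
# so the camera pair sweeps through all convergence points at high speed.
FPS = 240        # frame rate quoted in the text
N_POINTS = 4     # assumed number of convergence points visited in turn

def convergence_point_for_frame(frame_index: int) -> int:
    """Index of the convergence point the camera pair aims at for this frame."""
    return frame_index % N_POINTS

pairs_per_point_per_second = FPS // N_POINTS  # here: 60 image pairs per point
```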
- more generally, the stereoscopic image may be composed of viewpoint images of M viewpoints (where 3 ≤ M).
- in that case, the M viewpoint images are taken for each of a plurality of convergence points (gazing points).
- then, for each m-th viewpoint (where 1 ≤ m ≤ M), the viewpoint images arranged on the new coordinate system are synthesized into a synthesized viewpoint image, and a stereoscopic image composed of the synthesized viewpoint images of the M viewpoints, that is, an M-viewpoint image, is generated.
- FIG. 5 is a diagram illustrating a configuration example of an embodiment of a display processing system to which the present technology is applied.
- the display processing system in FIG. 5 includes a photographing unit 41, an image processing device 42, a display control unit 43, and a display unit 44.
- the imaging unit 41 captures a right eye image and a left eye image based on the control of the image processing device 42 and supplies the captured images to the image processing device 42.
- the photographing unit 41 includes a right eye image photographing unit 61, a left eye image photographing unit 62, a wide angle right eye image photographing unit 63, and a wide angle left eye image photographing unit 64.
- the right-eye image capturing unit 61 and the left-eye image capturing unit 62 are a pair of capturing devices that capture a right eye image and a left eye image at a predetermined convergence point, and correspond, for example, to the imaging device 11R-1 and the imaging device 11L-1 indicated by the arrow Q33 in FIG. 4.
- note that the right-eye image photographing unit 61 may be composed of the photographing devices 11R-1 to 11R-3 indicated by the arrows Q31 and Q32 in FIG. 4, and the left-eye image photographing unit 62 may be composed of the photographing devices 11L-1 to 11L-3 indicated by the arrows Q31 and Q32 in FIG. 4.
- the right eye image capturing unit 61 and the left eye image capturing unit 62 capture a plurality of right eye images and left eye images having different convergence points, and supply the obtained right eye images and left eye images to the image processing device 42.
- hereinafter, it is assumed that N pairs of a right eye image and a left eye image are taken, and the n-th (where 1 ≤ n ≤ N) right eye image and left eye image are also referred to as the right eye image Rn and the left eye image Ln, respectively.
- an image pair composed of the right eye image Rn and the left eye image Ln is an image pair having one convergence point.
- the wide-angle right-eye image capturing unit 63 and the wide-angle left-eye image capturing unit 64 capture images with a wider angle of view than the right eye image Rn and the left eye image Ln, namely a wide-angle right eye image Rg and a wide-angle left eye image Lg, respectively, and supply them to the image processing device 42.
- based on the right eye images Rn, the left eye images Ln, the wide-angle right eye image Rg, and the wide-angle left eye image Lg supplied from the imaging unit 41, the image processing device 42 generates a right-eye synthesized image and a left-eye synthesized image and supplies them to the display control unit 43.
- the image processing device 42 includes a position determination unit 71, a composition processing unit 72, and a cutout unit 73.
- the position determination unit 71 determines the position of each right eye image Rn on a new coordinate system based on the wide-angle right eye image Rg (hereinafter also referred to as a projection coordinate system) so that each right eye image Rn overlaps the wide-angle right eye image Rg.
- here, the projection coordinate system based on the wide-angle right eye image Rg is, for example, a two-dimensional coordinate system with its origin at the center position of the wide-angle right eye image Rg.
- similarly, the position of each left eye image Ln on the projection coordinate system based on the wide-angle left eye image Lg is determined so that each left eye image Ln overlaps the wide-angle left eye image Lg.
- the synthesis processing unit 72 synthesizes the right eye images Rn arranged on the projection coordinate system, and synthesizes the left eye images Ln arranged on the projection coordinate system.
- the cutout unit 73 cuts out (trims) a predetermined region of the image obtained by synthesizing the right eye images Rn on the projection coordinate system to generate the right-eye synthesized image, and cuts out a predetermined region of the image obtained by synthesizing the left eye images Ln on the projection coordinate system to generate the left-eye synthesized image.
- the display control unit 43 supplies the right-eye synthesized image and the left-eye synthesized image supplied from the image processing device 42 to the display unit 44 and causes them to be displayed stereoscopically.
- the display unit 44 includes, for example, an autostereoscopic display device, and displays a stereoscopic image by displaying the right-eye synthesized image and the left-eye synthesized image supplied from the display control unit 43.
- in step S11, the photographing unit 41 captures the right eye image Rn and the left eye image Ln for each of a plurality of convergence points, as well as the wide-angle right eye image Rg and the wide-angle left eye image Lg.
- that is, the right eye image capturing unit 61 and the left eye image capturing unit 62 capture the right eye images Rn and the left eye images Ln (where 1 ≤ n ≤ N) at N convergence points, and supply them to the image processing device 42. Further, the wide-angle right eye image capturing unit 63 and the wide-angle left eye image capturing unit 64 capture the wide-angle right eye image Rg and the wide-angle left eye image Lg and supply them to the image processing device 42.
- at this time, the image processing device 42 controls the photographing unit 41 so that parts of the subject that the user is likely to gaze at are set as convergence points, and the right eye images Rn and the left eye images Ln are captured with those convergence points.
- for example, the image processing device 42 determines, as the position of a convergence point, a region with high contrast, that is, a region that is not flat and has some luminance change, from the wide-angle right eye image Rg and the wide-angle left eye image Lg, and controls the imaging unit 41 so that that region becomes a convergence point.
- alternatively, the image processing device 42 may select a person's eyes or the center of a person's face as a convergence point. Further, for example, when a plurality of people appear in the wide-angle right eye image Rg and the wide-angle left eye image Lg, the face located at the center, left, or right of the screen may be selected from those people's faces as a convergence point. For detecting a human face from the wide-angle right eye image Rg and the wide-angle left eye image Lg, a face recognition function may be used.
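- a minimal sketch of this convergence-point selection (our illustration, assuming OpenCV and NumPy; the block size, ranking by variance, and the specific Haar cascade are assumptions, not the patent's method):

```python
import cv2
import numpy as np

def candidate_convergence_points(wide_gray, block=64, top_k=4):
    """Return (x, y) positions for convergence points: faces first, then
    high-contrast (non-flat) blocks ranked by local luminance variance."""
    points = []
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    for (x, y, w, h) in cascade.detectMultiScale(wide_gray, 1.1, 5):
        points.append((x + w // 2, y + h // 2))  # center of a detected face
    h_img, w_img = wide_gray.shape
    scores = []
    for by in range(0, h_img - block, block):
        for bx in range(0, w_img - block, block):
            patch = wide_gray[by:by + block, bx:bx + block]
            scores.append((float(patch.var()),
                           (bx + block // 2, by + block // 2)))
    scores.sort(reverse=True)  # highest-contrast blocks first
    points.extend(p for _, p in scores)
    return points[:top_k]
```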
- in step S12, the position determination unit 71 arranges the right eye images Rn and the left eye images Ln on the new projection coordinate system.
- specifically, by computing the correlation and the sum of absolute differences between each right eye image Rn and the wide-angle right eye image Rg, the position determination unit 71 finds, on the projection coordinate system based on the wide-angle right eye image Rg, the position where the central region of the right eye image Rn best overlaps the wide-angle right eye image Rg. The position determination unit 71 then arranges each right eye image Rn at the position determined for it.
- here, the central region of the right eye image Rn is, for example, a circular region of diameter h/2 centered on the center of the right eye image Rn, where h is the height of the right eye image Rn.
- similarly, for each left eye image Ln, the position determination unit 71 determines, on the projection coordinate system based on the wide-angle left eye image Lg, the position where the central region of the left eye image Ln best overlaps the wide-angle left eye image Lg, and arranges the left eye image Ln at that position.
- as a result, the right eye images Rn are arranged on the projection coordinate system so that the same subject in each right eye image Rn overlaps, and the left eye images Ln are likewise arranged on the projection coordinate system so that the same subject in each left eye image Ln overlaps.
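- a minimal sketch of this placement (our illustration, assuming OpenCV/NumPy; OpenCV has no direct sum-of-absolute-differences mode, so TM_SQDIFF, the sum of squared differences, stands in for it, and a square patch approximates the circular central region):

```python
import cv2

def place_on_projection_coords(view_img, wide_img):
    """Return the (x, y) top-left offset of view_img on the wide image's
    coordinate system, by matching the image's central region."""
    h, w = view_img.shape[:2]
    cy, cx, r = h // 2, w // 2, h // 4  # central region of diameter h/2
    patch = view_img[cy - r:cy + r, cx - r:cx + r]
    result = cv2.matchTemplate(wide_img, patch, cv2.TM_SQDIFF)
    _, _, min_loc, _ = cv2.minMaxLoc(result)  # minimum = best overlap
    # Convert the patch's best position back to the whole image's offset.
    return (min_loc[0] - (cx - r), min_loc[1] - (cy - r))
```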
- in step S13, the synthesis processing unit 72 synthesizes the right eye images Rn and the left eye images Ln arranged on the projection coordinate system.
- that is, the synthesis processing unit 72 applies averaging filter processing using a Gaussian filter or the like to the right eye images Rn so that the overlapping portions of the right eye images Rn arranged on the projection coordinate system are smoothly continuous, thereby synthesizing the N right eye images Rn.
- at this time, the right eye images Rn may be deformed by geometric transformation such as affine transformation, if necessary.
- the right eye images Rn after deformation are superimposed with weights according to position within the region where the images overlap. As a result, the right eye images Rn are smoothly synthesized so that the boundaries between mutually overlapping right eye images Rn do not stand out.
- for example, geometric transformation such as affine transformation may be performed on each right eye image Rn so that each part of the right eye image Rn overlaps the corresponding part of the wide-angle right eye image Rg.
- similarly, the synthesis processing unit 72 applies averaging filter processing to the left eye images Ln so that the overlapping portions of the left eye images Ln arranged on the projection coordinate system are smoothly continuous, thereby synthesizing the N left eye images Ln.
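- a minimal sketch of this weighted synthesis (our illustration, assuming NumPy; the Gaussian-shaped, center-peaked weight is one way to realize "weights according to position", not the patent's exact filter):

```python
import numpy as np

def synthesize(placed):
    """placed: list of (float image HxWx3, (x_off, y_off)) pairs already
    positioned on the projection coordinate system. Returns the blend."""
    x1 = min(x for _, (x, y) in placed)
    y1 = min(y for _, (x, y) in placed)
    x2 = max(x + img.shape[1] for img, (x, y) in placed)
    y2 = max(y + img.shape[0] for img, (x, y) in placed)
    acc = np.zeros((y2 - y1, x2 - x1, 3))
    wsum = np.zeros((y2 - y1, x2 - x1, 1))
    for img, (x, y) in placed:
        h, w = img.shape[:2]
        yy, xx = np.mgrid[0:h, 0:w]
        # Weight peaks at the image center and decays toward the edges,
        # so overlapping boundaries blend smoothly instead of standing out.
        d2 = ((xx - w / 2) / (w / 2)) ** 2 + ((yy - h / 2) / (h / 2)) ** 2
        weight = np.exp(-2.0 * d2)[..., None]
        acc[y - y1:y - y1 + h, x - x1:x - x1 + w] += img * weight
        wsum[y - y1:y - y1 + h, x - x1:x - x1 + w] += weight
    return acc / np.maximum(wsum, 1e-8)  # normalized weighted average
```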
- in step S14, the cutout unit 73 generates a stereoscopic image based on the synthesized right eye images Rn and left eye images Ln, and supplies the stereoscopic image to the display control unit 43.
- that is, a predetermined region of the image obtained by synthesizing the right eye images Rn is cut out to form the right-eye synthesized image, and a predetermined region of the image obtained by synthesizing the left eye images Ln is cut out to form the left-eye synthesized image.
- thus, a stereoscopic image composed of the right-eye synthesized image and the left-eye synthesized image is obtained.
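- the trimming itself is simple; a minimal sketch (our illustration, assuming NumPy arrays; the choice of rectangle is an assumption, since the text only specifies that a predetermined region is cut out):

```python
import numpy as np

def cut_out(synth_plane: np.ndarray, rect) -> np.ndarray:
    """Trim a fixed rectangle (x, y, w, h) from a synthesized plane."""
    x, y, w, h = rect
    return synth_plane[y:y + h, x:x + w]

# Using the same rectangle on both planes keeps the two views aligned:
# right_composite = cut_out(right_synth, rect)
# left_composite = cut_out(left_synth, rect)
```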
- in step S15, the display control unit 43 supplies the stereoscopic image supplied from the cutout unit 73 to the display unit 44 for display, and the stereoscopic image generation process ends.
- the stereoscopic image composed of the right eye composite image and the left eye composite image may be a still image or a moving image.
- the stereoscopic image may be a multi-viewpoint image including three or more viewpoint images.
- as described above, the display processing system synthesizes a plurality of right eye images and left eye images having different convergence points, and generates a stereoscopic image composed of the right-eye synthesized image and the left-eye synthesized image.
- since the stereoscopic image obtained in this way has a plurality of convergence points, a more natural stereoscopic image can be presented. That is, when the user observes the stereoscopic image, the difference from the actual parallax distribution is suppressed at each gazing point, and a natural, easy-to-view, high-quality stereoscopic image can be presented.
- in the example of FIG. 5, the imaging unit 41 and the display control unit 43 are connected to the image processing device 42, but the imaging unit 41 may be provided in the image processing device 42, and the display control unit 43 and the display unit 44 may likewise be provided in the image processing device 42.
- the series of processes described above can be executed by hardware or can be executed by software.
- a program constituting the software is installed in the computer.
- the computer includes, for example, a computer incorporated in dedicated hardware, or a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 7 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- in the computer, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, and a RAM (Random Access Memory) 203 are connected to one another by a bus 204.
- An input / output interface 205 is further connected to the bus 204.
- An input unit 206, an output unit 207, a recording unit 208, a communication unit 209, and a drive 210 are connected to the input / output interface 205.
- the input unit 206 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 207 includes a display, a speaker, and the like.
- the recording unit 208 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 209 includes a network interface and the like.
- the drive 210 drives a removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- in the computer configured as described above, the CPU 201 loads, for example, the program recorded in the recording unit 208 into the RAM 203 via the input / output interface 205 and the bus 204 and executes it, whereby the above-described series of processes is performed.
- the program executed by the computer (CPU 201) can be provided by being recorded in the removable medium 211 as a package medium or the like, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the recording unit 208 via the input / output interface 205 by attaching the removable medium 211 to the drive 210.
- the program can be received by the communication unit 209 via a wired or wireless transmission medium and installed in the recording unit 208.
- the program can be installed in the ROM 202 or the recording unit 208 in advance.
- the program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- the present technology can be configured as follows.
- [1] An image processing apparatus including: a position determination unit that, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, arranges the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and a synthesis processing unit that, for each viewpoint, synthesizes the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
- [2] The image processing device according to [1], wherein the image group for each gazing point each consists of a pair of viewpoint images and has one convergence point.
- [3] The image processing device according to [1] or [2], wherein the synthesis processing unit generates the synthesized viewpoint image by applying weights according to position within a region where the plurality of viewpoint images overlap and performing averaging filter processing on the viewpoint images.
- [4] The image processing device according to any one of [1] to [3], wherein the plurality of image groups are taken at the same time.
- [5] The image processing device according to any one of [1] to [3], wherein the plurality of image groups are taken at different times for each of the image groups.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
[Regarding Generation of Stereoscopic Images]
The present technology is for generating a natural stereoscopic image that causes less discomfort when observed by the user. First, generation of a stereoscopic image according to the present technology will be described.
In the above, it has been explained that the right-eye synthesized image and the left-eye synthesized image constituting a stereoscopic image according to the present technology are each generated by synthesizing a plurality of right eye images or left eye images having different convergence points. Next, the capturing of the right eye images and left eye images used to generate these right-eye and left-eye synthesized images will be described.
Next, specific embodiments to which the present technology is applied will be described. FIG. 5 is a diagram illustrating a configuration example of an embodiment of a display processing system to which the present technology is applied.
When the display processing system of FIG. 5 is instructed to generate and display a stereoscopic image, the display processing system performs a stereoscopic image generation process and displays the stereoscopic image. Hereinafter, the stereoscopic image generation process by the display processing system will be described with reference to the flowchart of FIG. 6.
An image processing apparatus including:
a position determination unit that, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, arranges the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and
a synthesis processing unit that, for each viewpoint, synthesizes the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
[2]
The image processing apparatus according to [1], wherein the image group for each gazing point each consists of a pair of viewpoint images and has one convergence point.
[3]
The image processing apparatus according to [1] or [2], wherein the synthesis processing unit generates the synthesized viewpoint image by applying weights according to position within a region where the plurality of viewpoint images overlap and performing averaging filter processing on the viewpoint images.
[4]
The image processing apparatus according to any one of [1] to [3], wherein the plurality of image groups are captured at the same time.
[5]
The image processing apparatus according to any one of [1] to [3], wherein the plurality of image groups are captured at a different time for each image group.
Claims (7)
- An image processing apparatus comprising: a position determination unit that, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, arranges the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and a synthesis processing unit that, for each viewpoint, synthesizes the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
- The image processing apparatus according to claim 1, wherein the image group for each gazing point each consists of a pair of viewpoint images and has one convergence point.
- The image processing apparatus according to claim 2, wherein the synthesis processing unit generates the synthesized viewpoint image by applying weights according to position within a region where the plurality of viewpoint images overlap and performing averaging filter processing on the viewpoint images.
- The image processing apparatus according to claim 3, wherein the plurality of image groups are captured at the same time.
- The image processing apparatus according to claim 3, wherein the plurality of image groups are captured at a different time for each image group.
- An image processing method comprising the steps of: arranging, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and synthesizing, for each viewpoint, the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
- A program for causing a computer to execute processing comprising the steps of: arranging, based on a plurality of image groups each consisting of viewpoint images of a plurality of different viewpoints and each having a different gazing point, the viewpoint images on a new coordinate system for each viewpoint so that the same subject on the viewpoint images overlaps; and synthesizing, for each viewpoint, the plurality of viewpoint images arranged on the coordinate system to generate a synthesized viewpoint image, thereby generating a stereoscopic image having a plurality of gazing points, composed of the synthesized viewpoint images of the respective viewpoints.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA 2861212 CA2861212A1 (en) | 2012-03-07 | 2013-02-25 | Image processing device and method, and program |
US14/381,248 US20150116202A1 (en) | 2012-03-07 | 2013-02-25 | Image processing device and method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012050211 | 2012-03-07 | ||
JP2012-050211 | 2012-03-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013133057A1 true WO2013133057A1 (ja) | 2013-09-12 |
Family
ID=49116539
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/054670 WO2013133057A1 (ja) | 2013-02-25 | Image processing apparatus and method, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150116202A1 (ja) |
CA (1) | CA2861212A1 (ja) |
WO (1) | WO2013133057A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5982751B2 (ja) | 2011-08-04 | 2016-08-31 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10477192B2 (en) * | 2016-09-14 | 2019-11-12 | Semiconductor Energy Laboratory Co., Ltd. | Display system and electronic device |
US11627299B1 (en) * | 2019-08-20 | 2023-04-11 | Opic, Inc. | Method and apparatus for a stereoscopic smart phone |
WO2023235093A1 (en) * | 2022-05-31 | 2023-12-07 | Douglas Robert Edwin | A method and apparatus for a stereoscopic smart phone |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10262176A (ja) * | 1997-03-19 | 1998-09-29 | Teiichi Okochi | Video forming method |
JPH11261797A (ja) * | 1998-03-12 | 1999-09-24 | Fuji Photo Film Co Ltd | Image processing method |
JP2003087550A (ja) * | 2001-09-12 | 2003-03-20 | Sanyo Electric Co Ltd | Image composition device, image composition method, and computer-readable recording medium recording an image composition processing program |
JP2011035643A (ja) * | 2009-07-31 | 2011-02-17 | Fujifilm Corp | Multi-view photographing method and apparatus, and program |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100499130B1 (ko) * | 2001-11-27 | 2005-07-04 | Samsung Electronics Co., Ltd. | Image retrieval method and apparatus robust to illumination change |
US7097619B2 (en) * | 2002-09-03 | 2006-08-29 | Siemens Medical Solutions Usa, Inc. | Elevation beam pattern variation for ultrasound imaging |
US7466856B2 (en) * | 2002-09-26 | 2008-12-16 | Samsung Electronics Co., Ltd. | Image retrieval method and apparatus independent of illumination change |
JP4318465B2 (ja) * | 2002-11-08 | 2009-08-26 | Konica Minolta Holdings, Inc. | Person detection device and person detection method |
US20070083114A1 (en) * | 2005-08-26 | 2007-04-12 | The University Of Connecticut | Systems and methods for image resolution enhancement |
JP2008067230A (ja) * | 2006-09-08 | 2008-03-21 | Sony Corp | Image processing apparatus, image processing method, and program |
JP4961965B2 (ja) * | 2006-11-15 | 2012-06-27 | Nikon Corporation | Subject tracking program, subject tracking device, and camera |
JPWO2009001467A1 (ja) * | 2007-06-28 | 2010-08-26 | Fujitsu Limited | Electronic device for improving brightness of captured image in low-illuminance environment |
JP4811433B2 (ja) * | 2007-09-05 | 2011-11-09 | Sony Corporation | Image selection apparatus, image selection method, and program |
WO2010002070A1 (en) * | 2008-06-30 | 2010-01-07 | Korea Institute Of Oriental Medicine | Method for grouping 3d models to classify constitution |
KR101445185B1 (ko) * | 2008-07-10 | 2014-09-30 | Samsung Electronics Co., Ltd. | Flexible image capturing apparatus having a plurality of image capturing units and method of manufacturing the same |
US8472670B2 (en) * | 2008-08-08 | 2013-06-25 | Panasonic Corporation | Target detection device and target detection method |
JP5230376B2 (ja) * | 2008-11-28 | 2013-07-10 | Samsung Electronics Co., Ltd. | Imaging apparatus and imaging method |
JP2011205374A (ja) * | 2010-03-25 | 2011-10-13 | Fujifilm Corp | Display device |
EA201200255A1 (ru) * | 2012-02-22 | 2013-08-30 | Closed Joint Stock Company "Impulse" | Method for noise suppression in digital radiographs |
US8774508B2 (en) * | 2012-02-27 | 2014-07-08 | Denso It Laboratory, Inc. | Local feature amount calculating device, method of calculating local feature amount, corresponding point searching apparatus, and method of searching corresponding point |
-
2013
- 2013-02-25 CA CA 2861212 patent/CA2861212A1/en not_active Abandoned
- 2013-02-25 US US14/381,248 patent/US20150116202A1/en not_active Abandoned
- 2013-02-25 WO PCT/JP2013/054670 patent/WO2013133057A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20150116202A1 (en) | 2015-04-30 |
CA2861212A1 (en) | 2013-09-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13757336 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2861212 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14381248 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13757336 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |