WO2005112475A1 - Image processing apparatus - Google Patents
Image processing apparatus
- Publication number
- WO2005112475A1 (PCT/JP2005/007865)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pattern
- images
- monocular
- monocular images
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
Definitions
- the present invention relates to an image processing apparatus suitable for an electronic camera or the like that captures a stereo image.
- the system controller includes a stereo adapter detector that detects whether the stereo adapter is attached, and an automatic exposure (AE) controller that analyzes the subject image signal related to the photometric area and calculates the photometric information required for exposure control.
- AE automatic exposure
- a photometry area setting unit for setting the above photometry area is provided, and this photometry area setting unit has a function of setting different photometry areas in the normal shooting mode and the stereo shooting mode. That is, the reference discloses a technique for setting an optimal photometric area for each of the normal shooting mode and the stereo shooting mode.
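The mode-dependent photometry idea above can be sketched as follows. The concrete region choices (one centered area in normal mode, one centered area per monocular half in stereo mode) are illustrative assumptions, not the cited reference's actual areas.

```python
def photometry_area(mode, width, height):
    """Return a list of metering rectangles (x, y, w, h) for the frame.

    Sketch only: in "normal" mode a single centered area is used; in
    "stereo" mode one centered area is placed inside each half of the
    frame, where the two monocular images are assumed to land.
    """
    if mode == "normal":
        return [(width // 4, height // 4, width // 2, height // 2)]
    # Stereo shooting mode: one area per monocular image half.
    half = width // 2
    w, h = half // 2, height // 2
    return [(half // 4, height // 4, w, h),
            (half + half // 4, height // 4, w, h)]
```

With a 640x480 frame, normal mode meters one 320x240 center box, while stereo mode meters a 160x240 box centered in each half, so the exposure calculation never straddles the boundary between the two monocular images.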
- a method of projecting two images of the same subject (hereinafter referred to as monocular images), mutually shifted according to parallax, onto the left and right halves of one image frame.
- in order to obtain a three-dimensional effect from one image in which the monocular images are arranged side by side (hereinafter referred to as an integrated image), the left and right monocular images in the integrated image are observed with the right eye or the left eye, respectively.
- when the two images observed with the left and right eyes are combined into one, an image having a three-dimensional effect can be recognized.
- This perceived image (hereinafter referred to as a fusion image) gives perspective to each part according to the amount of displacement of each part of the left and right monocular images.
- a black-level band-shaped image (hereinafter referred to as a supplementary image) is arranged along the periphery of the integrated image and at its horizontal center, and only the areas surrounded by the supplementary image are made effective, whereby a method is adopted that makes the range of each effective monocular image easy to recognize.
- because the integrated-image format is not widely known, an integrated image having a supplementary image may not be recognized as a stereoscopic image.
- as a result, the integrated image may be subjected to the same image processing as a general monocular image, or may be mistaken for an image resulting from a shooting error and not subjected to the necessary image processing. This has been a problem.
- the fused image is obtained by observing the left and right monocular images with the left and right eyes, respectively.
- the present invention aims to provide an image processing apparatus that can easily distinguish an integrated image from a monocular image and that can effectively support the fusion needed to recognize a fused image from the integrated image.
- the image processing apparatus according to the present invention comprises: supplementary image setting means for setting supplementary images arranged on a part or all of the edges of a plurality of monocular images of the same subject obtained with a predetermined parallax; image pattern setting means for setting image patterns arranged on the supplementary image, each in a predetermined relative positional relationship with one of the plurality of monocular images; and integrated image generating means for generating one integrated image based on the plurality of monocular images and the supplementary image on which the image patterns are arranged.
- the supplementary image setting means sets supplementary images to be arranged on a part or all of the edges of the plurality of monocular images.
- the image pattern setting means sets each image pattern arranged on the incidental image with a predetermined relative positional relationship to each of the plurality of monocular images.
- the integrated image generating means is supplied with a plurality of monocular images of the same subject obtained with a predetermined parallax, and generates a single integrated image based on the plurality of monocular images and the supplementary images on which the image patterns are arranged.
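The three means above (supplementary image setting, image pattern setting, integrated image generation) can be sketched as one compositing step. This is an illustrative, single-channel sketch, not the patent's exact algorithm: the band width, the between-image band, and the pattern placement in the lower band are assumptions.

```python
import numpy as np

def make_integrated_image(left, right, band=16, pattern=None, rel_pos=(0, 0)):
    """Compose two monocular images side by side, surround them with a
    black-level band-shaped supplementary image, and stamp the same small
    image pattern at the same offset relative to each monocular image frame
    (identical offsets -> the fused pattern sits on the image-frame plane)."""
    assert left.shape == right.shape, "monocular images share one frame size"
    h, w = left.shape[:2]
    # Black canvas: a band on every edge plus a band between the two frames.
    out = np.zeros((h + 2 * band, 2 * w + 3 * band), dtype=left.dtype)
    origins = [(band, band), (band, w + 2 * band)]  # top-left of each frame
    for (oy, ox), mono in zip(origins, (left, right)):
        out[oy:oy + h, ox:ox + w] = mono
        if pattern is not None:
            py, px = rel_pos  # same offset from each frame's lower-left corner
            ph, pw = pattern.shape[:2]
            # Place the pattern in the supplementary band below this frame.
            out[oy + h + py:oy + h + py + ph, ox + px:ox + px + pw] = pattern
    return out
```

Because both stamps use the same `rel_pos`, their relative positions with respect to the two monocular images coincide, which is the zero-disparity arrangement the embodiment describes.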
- FIG. 1 is a block diagram showing an electronic camera incorporating an image processing device according to a first embodiment of the present invention.
- FIG. 2 is an explanatory diagram showing a configuration of a mirror type stereo adapter in FIG. 1.
- FIG. 3 is an explanatory diagram for explaining an integrated image.
- FIG. 4 is an explanatory view showing another example of an incidental image.
- FIG. 5 is an explanatory view showing another example of the accompanying image.
- FIG. 6 is an explanatory diagram showing the image pattern of the character “T” when a three-dimensional “STEREO” character string is adopted as the fusion support pattern.
- FIG. 7 is an explanatory diagram showing an example of a data format of an image file.
- (First Embodiment) FIG. 1 is a block diagram showing an electronic camera in which the image processing device according to the first embodiment of the present invention is incorporated.
- the electronic camera includes a camera body 1, a lens unit 5 having a lens barrel, and a stereo adapter 10 for capturing a stereo image.
- a mirror type stereo adapter 10 is detachably attached to the lens unit 5.
- the stereo adapter 10 has a configuration in which mirrors 11 and 12 are arranged at positions separated by the parallax, and mirrors 13 and 14 are arranged to guide the light reflected by the mirrors 11 and 12 to the camera side.
- the light that has passed through the mirrors 11 and 13 and the mirrors 12 and 14 in the stereo adapter 10 passes through the photographing lens group 21 and the exposure control mechanism 22 in the lens unit 5, and is guided to the half mirror 31 in the camera body 1.
- the lens unit 5 includes a photographic lens group 21, an exposure control mechanism 22, a lens driving mechanism 23, a lens driver 24, and an exposure control driver 25.
- the imaging lens group 21 is the main imaging optical system, capable of normal monocular imaging when the stereo adapter 10 is not attached, and is driven by the lens driving mechanism 23 to adjust focusing and zooming.
- the lens driving mechanism 23 is controlled by a lens driver 24.
- the exposure control mechanism 22 controls the aperture of the photographing lens group 21 and a shutter device (not shown).
- the exposure control mechanism 22 is controlled by an exposure control driver 25.
- the light guided from the lens unit 5 to the camera body 1 passes through the half mirror 31 and is guided via the low-pass and infrared cut filter system 32 to the CCD color image sensor 34, where it forms an image.
- the CCD color image sensor 34 is driven and controlled by a CCD driver 35 to convert an optical image of a subject into an electric signal.
- as the CCD color imaging device 34, for example, an interline type with a vertical overflow drain structure and progressive (sequential) scanning is employed.
- FIG. 2 is an explanatory diagram showing a configuration of the mirror type stereo adapter in FIG.
- a stereo adapter 10 of a set of mirrors is detachable from a lens unit 5 having a lens barrel mounted on the camera body 1.
- mirrors 11 and 12 are respectively arranged at positions apart from each other by parallax.
- mirrors 13 and 14 for guiding the light reflected by the mirrors 11 and 12 to the camera are arranged.
- the light incident on the right-eye viewing mirror 11 of the stereo adapter 10 forms an image on the region R of the imaging surface 34 a of the CCD color imaging device 34 via the mirror 13 and the imaging lens group 21.
- the light incident on the left-eye viewing mirror 12 is imaged on the area L of the imaging surface 34a of the CCD color imaging device 34 via the mirror 14 and the imaging lens group 21.
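Since the adapter images the two views onto regions L and R of a single sensor, later processing needs to separate them again. A minimal sketch under assumed geometry (the two regions taken as the left and right halves of an even-width frame, with an optional margin trimmed where vignetting at the boundary is expected):

```python
import numpy as np

def split_stereo_frame(frame, trim=0):
    """Split one sensor frame into its two monocular regions.

    Assumption: region L is the left half and region R the right half of
    the frame. `trim` crops a uniform margin from each half to discard
    vignetting and image-forming-position shift at the edges.
    """
    h, w = frame.shape[:2]
    half = w // 2
    left = frame[:, :half]
    right = frame[:, half:]
    if trim:
        left = left[trim:h - trim, trim:half - trim]
        right = right[trim:h - trim, trim:half - trim]
    return left, right
```

The trimmed rectangles correspond to the effective "image frames" that the supplementary image later surrounds.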
- the signal photoelectrically converted by the CCD color image sensor 34 is passed through a pre-processing circuit 36 that performs A/D conversion and the like, and is given to a digital process circuit 39 that performs color signal generation processing, matrix conversion processing, and other various digital processing.
- color image data is generated by processing the digitized image signal.
- an LCD display unit 40 is connected to the digital process circuit 39, and a memory card 42 such as a CF (CompactFlash) card or SmartMedia is connected via a card interface (IF) 41.
- the LCD display section 40 performs display based on color image data, and the memory card 42 stores color image data.
- the memory card 42 can be loaded into an external personal computer 60. Then, the image recorded on the memory card 42 can be displayed on the personal computer 60 and image processing can be performed. Further, the image recorded on the memory card 42 can be printed out by a printer (not shown).
- the half mirror 31 is configured so that an incident subject image is partially reflected, and is configured to guide the reflected light to the AF sensor module 45.
- the AF sensor module 45 performs focus detection based on a light beam that has entered through the photographing lens group 21.
- the AF sensor module 45 includes a separator lens 46 for dividing a pupil and an AF sensor 47 including a line sensor.
- a system controller 50 constituted by a CPU and the like controls each unit in the camera body 1 and the lens unit 5 as a whole.
- to the system controller 50 are connected the lens driver 24, the exposure control driver 25, the CCD driver 35, the pre-process circuit 36, the digital process circuit 39, the AF sensor module 45, an operation switch section 52, an operation display section 53, a nonvolatile memory (EEPROM) 51, and a stereo switching switch (SW) 54.
- the operation switch unit 52 includes various switches such as a release switch and shooting mode setting switches.
- the operation display unit 53 is a display unit for displaying an operation state, a mode state, and the like of the camera.
- the EEPROM 51 is a memory for storing various setting information and the like.
- the stereo switching switch 54 is a switch for switching modes when the stereo adapter 10 is mounted on the lens unit 5. Note that although the shooting mode is switched here by operating the stereo switching switch 54, the switching is not limited to this; for example, a detection function may be provided in the stereo adapter 10 so that the shooting mode is switched automatically.
- the system controller 50 controls the exposure control mechanism 22 and the CCD driver 35 to drive the CCD color image sensor 34 to perform exposure (charge accumulation) and signal reading.
- the system controller 50 supplies the output of the CCD 34 to the digital process circuit 39 via the pre-process circuit 36 to perform various kinds of signal processing, and records the signals on the memory card 42 via the card interface 41.
- the strobe 57 emits flash light and is controlled by the system controller 50 via the exposure control driver 25 in the lens unit 5.
- the system controller 50 further includes an exposure control unit 50d and a photometric area setting unit 50e.
- the exposure control unit 50d analyzes the subject image signal relating to the photometric area and calculates exposure information necessary for exposure control.
- the photometric area setting unit 50e sets a photometric area for the exposure control unit 50d.
- the system controller 50 further includes an additional image setting unit 50a, an integrated image generation unit 50b, and an image file generation unit 50c.
- the integrated image generation unit 50b is for generating an integrated image including two monocular images.
- the image file generation unit 50c can convert the integrated image into an electronic image file in a predetermined format and output the electronic image file.
- the image file generation unit 50c performs a compression process on the integrated image as necessary, and generates a digital image file of a predetermined format shown in FIG. 7 to which attached data (metadata) is added.
- the incidental image setting section 50a is for setting an incidental image that defines an effective range of each monocular image included in the integrated image.
- the supplementary image is set as a band of a predetermined level on the edge of the integrated image.
- the incidental image setting unit 50a instructs the integrated image generation unit 50b on the incidental image to be set, and the integrated image generation unit 50b generates the incidental image in the integrated image according to the instruction of the incidental image setting unit 50a.
- the additional image setting unit 50a can set an arbitrary image pattern as the additional image.
- vignetting actually occurs at a boundary portion, or a shift occurs in an image forming position.
- a predetermined portion in the L and R areas is trimmed to set the range of an effective monocular image (hereinafter referred to as an image frame).
- the additional image setting unit 50a sets an additional image so as to surround this image frame.
- the supplementary image setting unit 50a, which also serves as an image pattern setting means, can set the supplementary image to include at least one of a predetermined image pattern for identifying that the image is an integrated image (hereinafter referred to as an identification pattern) and a predetermined image pattern for supporting fusion so that a fusion image can easily be obtained when the integrated image is observed (hereinafter referred to as a fusion support pattern).
- the supplementary image setting unit 50a adopts the character string "STEREO” as an image pattern that also serves as the identification pattern and the fusion support pattern.
- the auxiliary image setting unit 50a arranges the image pattern “STEREO” in the lower or upper auxiliary image portion of each monocular image.
- the supplementary image setting unit 50a arranges the same pattern of “STEREO”, which is the fusion support pattern, in such a manner that their relative positions are coincident with each monocular image, for example. That is, assuming that the image frame size and magnification of the left and right monocular images are the same, and the shape and size of the fusion support pattern are the same, the relative positions with respect to each monocular image match.
- the fusion support pattern is located at a reference position where depth is not felt in the fusion image, that is, on the image frame plane.
- the positions of the fusion support patterns arranged corresponding to the respective monocular images do not necessarily have to match; by appropriately setting the shape and size of the patterns and their relative positions to the monocular images, it is also possible to support fusion by localizing the pattern at a predetermined depth or by presenting it as a three-dimensional shape.
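The placement rule above reduces to a disparity calculation: identical pattern offsets relative to the two monocular images give zero disparity (the fused pattern sits on the image-frame plane), while opposite horizontal shifts localize it in depth. A minimal sketch, with the sign convention (positive disparity appears in front) an assumption:

```python
def pattern_offsets(base_x, disparity_px):
    """Return the horizontal offset of the fusion support pattern relative
    to the left and right monocular image frames, respectively.

    disparity_px == 0 reproduces the embodiment: matching relative
    positions, so the fused pattern is perceived on the image-frame plane.
    Nonzero disparity splits the shift symmetrically between the two eyes.
    """
    left_x = base_x + disparity_px / 2.0
    right_x = base_x - disparity_px / 2.0
    return left_x, right_x
```

For example, `pattern_offsets(100, 0)` places both patterns at offset 100 (no depth), while `pattern_offsets(100, 10)` shifts them to 105 and 95 so the fused pattern localizes off the image-frame plane.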
- FIG. 3 is an explanatory diagram for explaining an integrated image.
- the subject optical image incident via the stereo adapter 10 forms an image on the imaging surface of the CCD color imaging device 34 via the imaging lens group 21, the exposure control mechanism 22, the half mirror 31, and the filter system 32.
- one image including the left and right monocular images L and R is obtained by the CCD color imaging device.
- the image signal from the CCD color image sensor 34 is input to the controller 50 via the pre-processing circuit 36.
- the supplementary image setting unit 50a in the controller 50 sets a band-like supplementary image area that delimits the range of each effective monocular image, in consideration of vignetting and deviation of the image forming position. Based on the setting of the supplementary image setting section 50a, the integrated image generating section 50b generates an integrated image including the supplementary image.
- FIG. 3 shows an integrated image generated by the integrated image generation unit 50b.
- the shaded portion in FIG. 3 indicates the band-shaped supplementary image set by the supplementary image setting section 50a. That is, as shown in FIG. 3, the integrated image includes left and right monocular images 71L and 71R and a band-shaped supplementary image 71s arranged on their edges. Note that L and R in FIG. 3 indicate where the left and right monocular images are arranged.
- the supplementary image setting unit 50a sets the character patterns 72L and 72R of “STEREO”, an image pattern serving as both the identification pattern and the fusion support pattern, on the supplementary image 71s.
- the integrated image generation unit 50b arranges the character patterns 72L and 72R in the supplementary image 71s below the monocular images 71L and 71R, respectively.
- these character patterns 72L and 72R are configured, for example, with the same shape and the same size, and are set to have the same relative positional relationship to the monocular images 71L and 71R, respectively.
- the supplementary image setting unit 50a can set the relative positional relationship of the character patterns 72L and 72R to the monocular images 71L and 71R based on the image frame of each monocular image 71L, 71R.
- the integrated image shown in FIG. 3 includes, on the supplementary image 71s, the character patterns 72L and 72R of “STEREO” indicating that the image is an integrated image, so it can be easily identified as such.
- the character patterns 72L and 72R have the same relative positional relationship to the monocular images 71L and 71R, respectively, and have the same shape and size, so the fused pattern is located on the image frame plane. Therefore, it is easy to form a fusion image by attempting fusion so that the two “STEREO” character strings overlap when viewing the image.
- the image file generation unit 50c converts the integrated image generated by the integrated image generation unit 50b into an electronic image file in a predetermined format. That is, the image file generation unit 50c compresses the integrated image as necessary and generates a digital image file of a predetermined format to which attached data (metadata) is added. For example, the image file generating unit 50c encodes the range and luminance level of the supplementary image on the integrated image and outputs them as metadata.
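The metadata step can be sketched as follows. The field names and JSON layout are hypothetical (the patent refers to a predetermined format in FIG. 7 but its fields are not given here); the point is that a reader of the file can tell it is an integrated stereo image and recover the effective monocular frames and the supplementary image's luminance level.

```python
import json

def build_stereo_metadata(frame_l, frame_r, band_level=0):
    """Serialize assumed metadata for an integrated image file.

    frame_l / frame_r: (x, y, w, h) rectangles of the effective left and
    right monocular image frames within the integrated image.
    band_level: luminance level of the band-shaped supplementary image.
    """
    meta = {
        "image_type": "stereo_integrated",  # distinguishes it from a monocular image
        "monocular_frames": {
            "left": {"x": frame_l[0], "y": frame_l[1], "w": frame_l[2], "h": frame_l[3]},
            "right": {"x": frame_r[0], "y": frame_r[1], "w": frame_r[2], "h": frame_r[3]},
        },
        "supplementary_image": {"luminance_level": band_level},
    }
    return json.dumps(meta)
```

A viewer or editor reading this record would know not to treat the file as an ordinary monocular photograph, which is exactly the failure mode the Background section describes.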
- the electronic image file of the integrated image from the image file generation unit 50c is provided to the digital process circuit 39.
- the digital process circuit 39 can display the integrated image on the display screen of the LCD 40 based on the input electronic image file. Further, the digital process circuit 39 can also provide the input electronic image file to the memory card 42 via the card IF 41 and record it.
- in the embodiment described above, two image patterns of the same size and shape, each having the same relative positional relationship to its corresponding left or right monocular image, are included as both an identification pattern and a fusion support pattern.
- the identification pattern is not limited to characters as long as it is an image pattern that can be identified as indicating an integrated image; its arrangement position is also arbitrary, and it may be composed of figures alone.
- as the fusion support pattern, a pair of image patterns arranged in a predetermined positional relationship with the respective monocular images may be adopted; depending on the intended display position in the fusion image, including the depth direction, image patterns whose shapes and sizes have a predetermined relationship to each other can be adopted.
- although the example describes applying the present invention to an electronic camera, the invention can also be applied to a stand-alone image processing apparatus that processes an image captured by an electronic camera, and similar functions can be achieved by a program, for example on a personal computer, that processes a captured image.
- although the example describes supplementary images arranged in a band shape over the entire edge of the left and right monocular images, the fusion support pattern need only be arranged on a supplementary image above or below the left and right monocular images. Therefore, the supplementary image 75D may be arranged only below the integrated image as shown in FIG. 4, or, as shown in FIG. 5, the supplementary image 77s may be arranged on the edges of the left and right monocular images 71L and 71R. Although not shown, supplementary images may be arranged only above the integrated image. Further, the supplementary image may be arranged at only a part of one edge of each monocular image.
- the two fusion support patterns provided for the left and right monocular images, respectively, are formed in shapes and the like corresponding to the depth direction in the fusion image.
- the fusion support pattern itself can be configured to be stereoscopically viewed.
- FIG. 6 is an explanatory diagram showing the image pattern of the character “T” in the case where a three-dimensional “STEREO” character string is adopted as the fusion support pattern.
- the pattern of the front face of the character T (the solid portion in FIG. 6) has a relative positional relationship to the left and right monocular images that differs between the two, while the pattern of the back face of the character T (not shown) has the same relative position with respect to the left and right monocular images. That is, the back face of the character T is localized in the image frame plane, which is the reference position in the depth direction of the fusion image; the front face of the character T protrudes forward; and the side face of the character T is recognized as the shaded portion shown in FIG. 6, so that a three-dimensional character is perceived. The same applies to the other characters.
- furthermore, a pattern having unevenness on its front face can be set, and the pattern can be set so that the color and brightness of its side face have gradation.
- although the stereo system has been described as a binocular system corresponding to the left and right eyes, the present invention can of course be similarly applied to a general multi-view stereo system having three or more viewpoints.
- in each of the above embodiments, an example is described in which one image with monocular images arranged side by side is acquired by a camera and a supplementary image is set for this image; however, it is clear that the invention can also be applied to a case where monocular images photographed separately by two cameras are input separately and one integrated image is generated based on the monocular images and the supplementary image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05737297A EP1744564A4 (en) | 2004-04-26 | 2005-04-26 | IMAGE PROCESSOR |
US11/586,044 US20070035618A1 (en) | 2004-04-26 | 2006-10-24 | Image processing apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004130126A JP2005311983A (ja) | 2004-04-26 | 2004-04-26 | Image processing apparatus |
JP2004-130126 | 2004-04-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/586,044 Continuation US20070035618A1 (en) | 2004-04-26 | 2006-10-24 | Image processing apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005112475A1 true WO2005112475A1 (ja) | 2005-11-24 |
WO2005112475A9 WO2005112475A9 (ja) | 2006-01-05 |
Family
ID=35394533
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/007865 WO2005112475A1 (ja) | 2005-04-26 | Image processing apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US20070035618A1 (ja) |
EP (1) | EP1744564A4 (ja) |
JP (1) | JP2005311983A (ja) |
CN (1) | CN1947430A (ja) |
WO (1) | WO2005112475A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4993578B2 (ja) * | 2007-01-15 | 2012-08-08 | Olympus Imaging Corp. | Image file playback device, image file processing and editing device |
US8487982B2 (en) * | 2007-06-07 | 2013-07-16 | Reald Inc. | Stereoplexing for film and video applications |
US9524700B2 (en) * | 2009-05-14 | 2016-12-20 | Pure Depth Limited | Method and system for displaying images of various formats on a single display |
TW201119353A (en) | 2009-06-24 | 2011-06-01 | Dolby Lab Licensing Corp | Perceptual depth placement for 3D objects |
CN102498720B (zh) * | 2009-06-24 | 2015-09-02 | Dolby Laboratories Licensing Corp. | Method for embedding subtitles and/or graphic overlays in 3D or multi-view video data |
KR101801017B1 (ko) * | 2010-02-09 | 2017-11-24 | Koninklijke Philips N.V. | 3D video format detection |
JP5873813B2 (ja) * | 2010-02-19 | 2016-03-01 | Thomson Licensing | Stereo logo insertion |
US9426441B2 (en) | 2010-03-08 | 2016-08-23 | Dolby Laboratories Licensing Corporation | Methods for carrying and transmitting 3D z-norm attributes in digital TV closed captioning |
WO2012145191A1 (en) | 2011-04-15 | 2012-10-26 | Dolby Laboratories Licensing Corporation | Systems and methods for rendering 3d images independent of display size and viewing distance |
JP6017735B2 (ja) | 2014-10-14 | 2016-11-02 | Olympus Corp. | Imaging system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09160143A (ja) * | 1995-12-12 | 1997-06-20 | Nikon Corp | Stereo camera |
JPH09327042A (ja) * | 1996-06-04 | 1997-12-16 | Sony Corp | Video camera device for shooting for a three-dimensional stereoscopic video signal conversion device, and optical adapter device used with the device |
JPH1188912A (ja) * | 1997-09-10 | 1999-03-30 | Canon Inc | Compound-eye camera and display control method for a compound-eye camera |
JP2002077943A (ja) * | 2000-08-29 | 2002-03-15 | Olympus Optical Co Ltd | Image handling device |
JP2003284096A (ja) * | 2002-01-16 | 2003-10-03 | Olympus Optical Co Ltd | Stereo photographing apparatus, finder, mark presenting member, and photographing method of the stereo photographing apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09322199A (ja) * | 1996-05-29 | 1997-12-12 | Olympus Optical Co Ltd | Stereoscopic image display device |
US6765568B2 (en) * | 2000-06-12 | 2004-07-20 | Vrex, Inc. | Electronic stereoscopic media delivery system |
EP1501317A4 (en) * | 2002-04-25 | 2006-06-21 | Sharp Kk | Image data generation device, image data reproduction device and image data recording medium |
-
2004
- 2004-04-26 JP JP2004130126A patent/JP2005311983A/ja active Pending
-
2005
- 2005-04-26 EP EP05737297A patent/EP1744564A4/en not_active Withdrawn
- 2005-04-26 WO PCT/JP2005/007865 patent/WO2005112475A1/ja not_active Application Discontinuation
- 2005-04-26 CN CNA2005800131454A patent/CN1947430A/zh active Pending
-
2006
- 2006-10-24 US US11/586,044 patent/US20070035618A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP1744564A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN1947430A (zh) | 2007-04-11 |
JP2005311983A (ja) | 2005-11-04 |
EP1744564A4 (en) | 2010-08-04 |
WO2005112475A9 (ja) | 2006-01-05 |
EP1744564A1 (en) | 2007-01-17 |
US20070035618A1 (en) | 2007-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3841630B2 (ja) | Image handling device | |
WO2005112475A1 (ja) | Image processing apparatus | |
US8599245B2 (en) | Image processing apparatus, camera, and image processing method | |
KR100947394B1 (ko) | 3d 화상 파일, 촬상 장치, 화상 재생 장치, 및 화상 가공장치 | |
JP5827988B2 (ja) | Stereoscopic image capturing device | |
WO2005115016A1 (ja) | Image processing device, image processing/editing device, image file playback device, image processing method, image processing/editing method, and image file playback method | |
US20110090313A1 (en) | Multi-eye camera and method for distinguishing three-dimensional object | |
CN102959467A (zh) | 单眼立体成像装置 | |
US20130113793A1 (en) | Image processing device, image processing method, and image processing program | |
JP5530322B2 (ja) | Display device and display method | |
JP2010068182A (ja) | Three-dimensional imaging device, method, and program | |
US8648953B2 (en) | Image display apparatus and method, as well as program | |
CN102986232B (zh) | 图像处理装置及方法 | |
CN102959967B (zh) | 图像输出装置及方法 | |
JP2011035643A (ja) | Multi-view imaging method and device, and program | |
CN103782234B (zh) | 立体图像捕捉设备和方法 | |
US20120307016A1 (en) | 3d camera | |
JP2002218506A (ja) | Imaging device | |
JP4536231B2 (ja) | Imaging device | |
JP3939127B2 (ja) | Imaging device | |
JP4589651B2 (ja) | Image processing device, image processing/editing device, image file playback device, image processing method, image processing/editing method, and image file playback method | |
JP5580486B2 (ja) | Image output device, method, and program | |
JP4398197B2 (ja) | Camera | |
CN104054333A (zh) | 图像处理装置、方法以及程序及其记录介质 | |
JP2006238086A (ja) | Digital camera for stereoscopic photography and stereoscopic photographing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
COP | Corrected version of pamphlet |
Free format text: PAGES 2/4-3/4, DRAWINGS, REPLACED BY NEW PAGES 2/4-3/4 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005737297 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11586044 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580013145.4 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005737297 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11586044 Country of ref document: US |