KR20130080508A - Stereoscopic camera system - Google Patents

Stereoscopic camera system

Info

Publication number
KR20130080508A
Authority
KR
South Korea
Prior art keywords
image
images
correction value
gaze
focus control
Prior art date
Application number
KR1020120001333A
Other languages
Korean (ko)
Other versions
KR101334570B1 (en)
Inventor
이영섭
김상조
김창훈
Original Assignee
에쓰이에이치에프코리아 (주)
Priority date
Filing date
Publication date
Application filed by 에쓰이에이치에프코리아 (주)
Priority to KR1020120001333A
Publication of KR20130080508A
Application granted
Publication of KR101334570B1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras

Landscapes

  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)

Abstract

According to an aspect of the present invention, a stereo camera system includes: first and second camera modules, spaced apart from each other, which capture a subject according to a focus control value and output first and second images; a memory which stores a data table indicating a correspondence between a gaze-point correction value and at least one of a focus control value, a focal length, and a distance between the subject and the first and second camera modules; a controller which provides the focus control value to the first and second camera modules and outputs a gaze-point correction value for the first and second images based on the data table; and an image synthesizer which corrects the first and second images according to the gaze-point correction value and synthesizes the corrected first and second images to output a stereoscopic image.

Description

Stereo camera system {STEREOSCOPIC CAMERA SYSTEM}

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a stereo camera system and, more particularly, to a gaze-point control method in a parallel-axis stereoscopic camera system.

Unlike general monocular cameras, stereo (stereoscopic) cameras require precise control of the focus and the gaze point in order to minimize eye strain and produce images with an excellent stereoscopic effect.

Binocular disparity is the difference between the left and right images that arises because the human eyes are separated by about 6.5 cm. Within a single scene, the disparity varies with the distance between the camera and the object, and it is known to be the most important factor in creating a sense of depth.

The point where the parallax becomes zero is called the gaze point, and controlling it in a stereoscopic camera is called gaze-point control or convergence-angle control. Controlling the gaze point prevents excessive parallax in the scene, which is one of the most important requirements for producing a high-quality stereoscopic image.

In a conventional stereo camera, the gaze point is controlled by adjusting the position and angle of the cameras, and a stereoscopic image is acquired accordingly.

In a parallel-axis stereo camera, however, the position and angle of the cameras cannot be adjusted, so the gaze point can be controlled only through image processing.

In addition, various auto-focus control methods exist, but they control only a single camera, which makes them difficult to apply to a binocular camera in which the left and right cameras must be focused simultaneously.

In a conventional fixed stereoscopic camera, the position and angle of the cameras cannot be adjusted to set the focus and the gaze point. Instead, one of the left and right cameras is used to adjust the focus, and the gaze point is controlled accordingly to acquire a stereoscopic image. As a result, the user cannot view the stereoscopic image while the focus is being adjusted and can capture stereoscopic images only in limited environments.

It is an object of certain embodiments of the present invention to at least partially solve, alleviate, or eliminate at least one of the problems and/or disadvantages associated with the prior art.

One object of the present invention is to provide a technique in which the focus and the gaze point are controlled automatically in a fixed stereoscopic camera whose camera angles are constrained, such as a parallel-axis camera, so that a user can easily and conveniently capture 3D images with an excellent stereoscopic effect.

According to an aspect of the present invention, a stereo camera system includes: first and second camera modules, spaced apart from each other, which capture a subject according to a focus control value and output first and second images; a memory which stores a data table indicating a correspondence between a gaze-point correction value and at least one of a focus control value, a focal length, and a distance between the subject and the first and second camera modules; a controller which provides the focus control value to the first and second camera modules and outputs a gaze-point correction value for the first and second images based on the data table; and an image synthesizer which corrects the first and second images according to the gaze-point correction value and synthesizes the corrected first and second images to output a stereoscopic image.

The present invention enables convenient and practical operation of a stereo camera by implementing an automatic focus-gaze linkage function, using image processing, in a parallel-axis stereo camera.

In addition, the present invention removes the parallax between the left and right images generated by the parallel-axis stereo camera according to the focal length, so that the viewer can observe for a long time without additional eye fatigue, and further enhances the stereoscopic effect.

FIG. 1 is a diagram showing the configuration of a parallel-axis stereo camera system according to an embodiment of the present invention;
FIG. 2 is a diagram for explaining gaze-point correction of the first and second images;
FIG. 3 is a flowchart for explaining the focus-gaze automatic linkage method performed by the controller;
FIG. 4 is a diagram for explaining the movement of the gaze point according to the movement of an object;
FIGS. 5 and 6 are diagrams for explaining a method of calculating the gaze-point correction value; and
FIG. 7 is a diagram for explaining a method of calculating the distance between the subject and the first lens along the optical axis of the first lens.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, specific details such as particular components are set forth to aid a more general understanding of the invention; it will be apparent to those of ordinary skill in the art that the invention may be practiced without these specific details. Well-known functions and constructions are not described in detail, since they would obscure the invention with unnecessary detail.

In the embodiments of the present invention below, ordinal terms such as "first" and "second" are used only to distinguish objects of the same name from one another; their order may be determined arbitrarily, and a description given for an earlier-ordered object may be applied mutatis mutandis to a later-ordered one.

FIG. 1 shows the configuration of a parallel-axis stereo camera system according to an embodiment of the present invention. The stereo camera system 100 includes a stereo camera 110 which captures first and second images, a focus-gaze automatic linkage device 160 which automatically controls the gaze point according to the focus, and a stereoscopic image synthesizer 150 which generates a stereoscopic image based on the first and second images.

The stereo camera 110 may include a first camera module 120 which captures the first image (the left image as viewed from the stereo camera 110), a second camera module 130 which captures the second image (the right image), and a first memory 140. The stereo camera 110 has a parallel-axis structure in which the first and second camera modules 120 and 130 are spaced a predetermined distance apart and arranged in parallel. Although the first to third memories 140, 152, and 164 are illustrated as separate components in this example, they may be integrated into a single memory. The following description focuses on the first camera module 120, but it applies equally to the second camera module 130.

The first camera module 120 includes at least one first lens 121, a first actuator 122 including a first motor 123 and a first guide 124, and a first image sensor 125. Likewise, the second camera module 130 includes at least one second lens 131, a second actuator 132 including a second motor 133 and a second guide 134, and a second image sensor 135. Since the two camera modules have the same configuration, the description below refers to the first camera module 120.

The at least one first lens 121 forms an image of the subject by converging light incident from the outside. The first lens 121 may be a convex lens, an aspherical lens, or the like. The first lens 121 is symmetric about the axis passing through its center, and this central axis is defined as the optical axis. Although only one lens is shown in FIG. 1, a plurality of lenses may be used.

The first actuator 122 moves the first lens 121 according to the focus control value input from the controller 166, and includes a first motor 123 which provides a driving force and a first guide 124 which moves the first lens 121 along the optical axis using that driving force. The focus control value is a value provided to the first motor 123 to move the first lens 121 to a position corresponding to a specific focus; for example, it may correspond to a maximum driving voltage.

The first image sensor 125 detects, as an electrical image signal, the optical image formed by external light incident through the first lens 121. The first image sensor 125 may include a plurality of pixel units arranged in an M x N matrix, each pixel unit including a photodiode and a plurality of transistors. A pixel unit accumulates electric charge generated by incident light, and the voltage produced by the accumulated charge indicates the illuminance of that light. When an image constituting a still picture or a moving picture is processed, the image signal output from the first image sensor 125 is the set of voltages (i.e., pixel values) output from the pixel units, and represents one frame (i.e., one still picture) composed of M x N pixels. As the first image sensor 125, a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or the like may be used.

The first memory 140 stores information about the stereo camera 110, such as position deviation information of each lens 121 and 131, position deviation information of each image sensor 125 and 135, lens-position control deviation information of each actuator 122 and 132, the focal length according to the moving position (or focus control value) of each lens 121 and 131, and gaze-point information according to the focal length. Each item of deviation information refers to the difference between a design value and the actual value caused by errors in the manufacturing process.

The stereoscopic image synthesizer 150 includes a second memory 152, an image processor 154, and an image synthesizer 156.

The second memory 152 stores a first image input from the first camera module 120 and a second image input from the second camera module 130.

The image processor 154 performs image correction, such as color correction and distortion correction, on each of the first and second images stored in the second memory 152. In addition, the image processor 154 corrects the gaze points of the first and second images according to the gaze-point correction values input from the controller 166 of the focus-gaze automatic linkage device 160.

FIG. 2 is a diagram for explaining gaze-point correction of the first and second images.

Referring to FIG. 2A, the image processor 154 crops the outer horizontal edge portion of the first image (as viewed from the stereo camera system) by a width corresponding to the gaze-point correction value D, and crops the vertical edge portions of the first image according to its aspect ratio. The image processor 154 may then restore the cropped first image 210 to the size of the original first image. The cropped first image 210 is shown at the top of FIG. 2A, and the restored first image 212 at the bottom.

Similarly, referring to FIG. 2B, the image processor 154 crops the outer horizontal edge portion of the second image (as viewed from the stereo camera system) by a width corresponding to the gaze-point correction value D, and crops the vertical edge portions of the second image according to its aspect ratio. Optionally, the image processor 154 may restore the cropped second image 220 to the size of the original second image. The cropped second image 220 is shown at the top of FIG. 2B, and the restored second image 222 at the bottom.
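The crop-and-restore operation described above can be sketched as follows. This is a minimal illustration that assumes a grayscale numpy image, a correction value already converted to pixels, and nearest-neighbor restoration; the function name and the scaling method are our choices, not the patent's:

```python
import numpy as np

def crop_and_restore(image, d_px, left_image=True):
    """Crop one horizontal edge by the gaze-point correction value (in
    pixels), trim the vertical edges to keep the aspect ratio, then
    restore the crop to the original size with nearest-neighbor scaling."""
    h, w = image.shape[:2]
    # Crop the outer horizontal edge (mirrored between the two views).
    cropped = image[:, d_px:] if left_image else image[:, :w - d_px]
    # Trim top/bottom symmetrically to preserve the original aspect ratio.
    new_h = round(cropped.shape[1] * h / w)
    top = (h - new_h) // 2
    cropped = cropped[top:top + new_h]
    # Restore to the original size (nearest-neighbor for simplicity).
    rows = (np.arange(h) * cropped.shape[0] / h).astype(int)
    cols = (np.arange(w) * cropped.shape[1] / w).astype(int)
    return cropped[rows][:, cols]

left = np.arange(480 * 640, dtype=np.uint8).reshape(480, 640)
out = crop_and_restore(left, d_px=40, left_image=True)
print(out.shape)  # (480, 640)
```

A production implementation would likely use a proper resampling filter instead of nearest-neighbor indexing, but the cropping geometry is the same.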

The image synthesizer 156 generates a stereoscopic image by combining the corrected first and second images input from the image processor 154 into a preset format. For example, the image synthesizer 156 may combine the first and second images side by side or top and bottom.
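The two layouts mentioned can be sketched as simple array concatenations, assuming same-size numpy views (the function name is illustrative):

```python
import numpy as np

def synthesize(left, right, layout="side_by_side"):
    """Combine corrected left/right views into one stereoscopic frame."""
    if layout == "side_by_side":
        return np.hstack([left, right])  # L | R
    if layout == "top_and_bottom":
        return np.vstack([left, right])  # L over R
    raise ValueError(f"unknown layout: {layout}")

l = np.zeros((480, 640), dtype=np.uint8)
r = np.ones((480, 640), dtype=np.uint8)
print(synthesize(l, r).shape)                    # (480, 1280)
print(synthesize(l, r, "top_and_bottom").shape)  # (960, 640)
```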

The focus-gaze automatic linkage device 160 includes a display unit 162, a third memory 164, and a controller 166.

The display unit 162 displays the stereoscopic image input from the controller 166 to the user. The display unit 162 may be a plasma display panel (PDP), a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an LED display, or the like.

The third memory 164 stores a stereoscopic image input from the controller 166.

The controller 166 controls the overall operation of the stereo camera system 100 and performs the focus-gaze automatic linkage method.

FIG. 3 is a flowchart illustrating the focus-gaze automatic linkage method performed by the controller.

The focus-gaze automatic linkage method includes steps S110 to S260.

Step S110 is an initialization step. The controller 166 performs interpolation and correction based on the position deviation information of the first and second lenses 121 and 131, the position deviation information of the first and second image sensors 125 and 135, and the position control deviation information of the first and second actuators 122 and 132 stored in the first memory 140, and sets the initial gaze-point correction value and the initial focus control value applied when the stereo camera system 100 is first driven. The gaze-point correction value and the variable focus control value are subsequently calculated from the amount of change relative to these initial values. For example, the initial gaze-point correction value and the initial focus control value may each be set to 0.

In step S120, the controller 166 detects a focus shift (or change) caused by user manipulation or automatic setting. The controller 166 proceeds to step S130 when a focus shift occurs, and ends the method when there is no focus change.

In step S130, the controller 166 determines whether the auto-focus function is set. If it is, the controller 166 performs step S150; otherwise, it performs step S200.

In step S150, the controller 166 controls the actuators 122 and 132 to move each lens 121 and 131 in a predetermined pattern (for example, moving from the initial position to the end position, or moving from the initial position to an intermediate position and then on to the end position), and calculates the focus control value based on the image information input from the image processor 154. That is, the controller 166 computes the contrast of the subject (or of the subject's edges) in the first image and/or the second image and uses that contrast to determine the focus control value. For example, if during the auto-focus sweep the contrast of the current image is higher than that of the previous and next images, the current image is judged to be in focus, and the control value provided to each actuator 122 and 132 for the current image is determined as the focus control value.
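The contrast-maximizing selection in step S150 can be sketched as follows, using a simple gradient-energy focus measure. The measure and the synthetic sweep are illustrative assumptions; the patent does not fix a particular contrast metric:

```python
import numpy as np

def contrast(image):
    """Simple focus measure: sum of squared horizontal and vertical
    intensity differences (higher means sharper edges)."""
    gx = np.diff(image.astype(float), axis=1)
    gy = np.diff(image.astype(float), axis=0)
    return (gx ** 2).sum() + (gy ** 2).sum()

def autofocus(images_by_control_value):
    """Pick the focus control value whose image has the highest
    contrast, mirroring the sweep described in step S150."""
    return max(images_by_control_value,
               key=lambda v: contrast(images_by_control_value[v]))

# Synthetic sweep: the image at control value 2 has the strongest edges.
rng = np.random.default_rng(0)
sweep = {v: rng.normal(scale=s, size=(32, 32))
         for v, s in [(0, 0.1), (1, 0.5), (2, 2.0), (3, 0.5)]}
print(autofocus(sweep))  # 2
```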

In step S160, the controller 166 calculates the focal length corresponding to the focus control value. The first memory 140 stores a first data table in which focus control values and focal lengths correspond to each other, and the controller 166 calculates the focal length corresponding to the focus control value through interpolation or mathematical calculation based on the first data table.
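The table lookup with interpolation in step S160 can be sketched as follows; the table values here are hypothetical stand-ins for the first data table stored in the first memory 140:

```python
from bisect import bisect_left

# Hypothetical first data table: (focus control value, focal length in mm).
FIRST_TABLE = [(0, 3.970), (100, 4.050), (200, 4.180), (300, 4.380)]

def focal_length(focus_control_value):
    """Linearly interpolate the focal length for a focus control value,
    clamping to the table ends, as the controller might in step S160."""
    xs = [v for v, _ in FIRST_TABLE]
    i = bisect_left(xs, focus_control_value)
    if i == 0:
        return FIRST_TABLE[0][1]
    if i == len(xs):
        return FIRST_TABLE[-1][1]
    (x0, y0), (x1, y1) = FIRST_TABLE[i - 1], FIRST_TABLE[i]
    return y0 + (y1 - y0) * (focus_control_value - x0) / (x1 - x0)

print(focal_length(150))  # midway between 4.050 and 4.180, i.e. ~4.115
```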

In operation S170, the controller 166 calculates a gaze correction value corresponding to the focal length.

FIG. 4 illustrates the movement of the gaze point according to the movement of an object. When the object 420 is at a far position, the human eyes 410 and 412 view it at a convergence angle θ'. As the object moves from a far position to a near position, the gaze of the eyes 410 and 412 also moves from far to near, and the convergence angle changes to θ'' because of the separation between the eyes. In a fixed stereo camera system such as a parallel-axis system, the convergence angle and the binocular distance cannot be varied according to the distance of the object, so a stereoscopic effect must be created by means other than changing the convergence angle. In the present invention, in order to let the user perceive depth most comfortably with a fixed stereo camera system, gaze-point correction is performed on the images so that the difference in convergence angle according to the distance of the object is compensated.

FIGS. 5 and 6 are diagrams for describing the method of calculating the gaze-point correction value. FIG. 5 shows the distance and angle relationship between the gaze point and the first lens 121, and FIG. 6 shows the distance and angle relationship between the first lens 121 and the image center point. Since the first and second camera modules 120 and 130 have the same configuration, the description below refers to the first camera module 120. In FIGS. 5 and 6, some components of the first and second camera modules 120 and 130 are omitted.

Referring to FIG. 5, let X be the distance between the subject and the first lens 121 along the optical axis 126 of the first lens 121, L the distance between the optical axis and the gaze point, and θ the viewing angle between the optical axis and the straight line connecting the center of the first lens to the gaze point. The viewing angle θ as a function of the distances X and L is given by Equation 1 below. The same description applies to the second lens 131 and its optical axis 136.

θ = arctan(L / X)    (1)

Referring to FIG. 6, when the first lens 121 moves away from the image sensor 125 as the focus changes, the initial focal length F changes to (F + Z). The first lens at the moved position is denoted by reference numeral 121a in FIG. 6. The first image of the subject is formed on the light-receiving surface of the image sensor 125. FIG. 6 shows the center point A of the first image corresponding to the initial gaze point and the center point A' corresponding to the changed gaze point. The distance between the initial center point A and the moved center point A' is defined as the gaze-point correction value D. The angle between the line connecting A to the vertex of the first lens 121a and the line connecting A' to the vertex of the first lens 121a is the viewing angle θ.

The gaze point correction value D may be calculated by Equation 2 below.

D = (F + Z) · tan θ    (2)

The focal length change Z resulting from a change in the distance X between the subject 510 and the first lens 121 may be calculated by Equation 3 below. The focal length change corresponds to the amount of movement (or position change) of the first lens 121.

Z = F² / (X - F)    (3)

Table 1 below is a second data table showing the correspondence among the distance X between the subject 510 and the first lens 121 along the optical axis 126 of the first lens 121, the focal length change Z, the viewing angle θ, and the gaze-point correction value D.

Object distance X (mm) | Z (mm)   | θ (degree) | D (mm)   | D (pixel)
30     | 0.60549  | 45        | 4.57549  | 3268.207
60     | 0.281294 | 26.56505  | 2.125647 | 1518.319
100    | 0.164125 | 16.69924  | 1.240237 | 885.8839
200    | 0.0804   | 8.530766  | 0.60756  | 433.9715
300    | 0.053241 | 5.710593  | 0.402324 | 287.3743
400    | 0.039797 | 4.289153  | 0.300735 | 214.8106
500    | 0.031774 | 3.43363   | 0.240106 | 171.5046
600    | 0.026443 | 2.862405  | 0.199822 | 142.7301
700    | 0.022644 | 2.454032  | 0.171113 | 122.2238
800    | 0.019799 | 2.147585  | 0.149617 | 106.8696
900    | 0.01759  | 1.909152  | 0.13292  | 94.94261
1,000  | 0.015824 | 1.718358  | 0.119575 | 85.41051
1,200  | 0.013178 | 1.432096  | 0.099579 | 71.12817
1,500  | 0.010535 | 1.145763  | 0.079611 | 56.86479
2,000  | 0.007896 | 0.859372  | 0.059668 | 42.62032
3,000  | 0.005261 | 0.572939  | 0.039753 | 28.39472
5,000  | 0.003155 | 0.343771  | 0.023839 | 17.02781
10,000 | 0.001577 | 0.171887  | 0.011915 | 8.510522
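The rows of Table 1 are mutually consistent with Equations 1 to 3 if one assumes F = 3.97 mm, L = 30 mm (half of a roughly 60 mm baseline), and a 1.4 µm pixel pitch; these parameter values are inferred from the table itself rather than stated in the text. A short sketch that reproduces a few rows:

```python
import math

F_MM = 3.97        # initial focal length (inferred from the table)
L_MM = 30.0        # optical-axis-to-gaze-point distance (inferred)
PIXEL_MM = 0.0014  # pixel pitch (inferred from the D mm / D pixel ratio)

def gaze_correction(x_mm):
    """Apply Equations 1-3: viewing angle, lens travel, and the
    gaze-point correction value in mm and pixels."""
    theta = math.atan(L_MM / x_mm)       # Eq. 1
    z = F_MM ** 2 / (x_mm - F_MM)        # Eq. 3 (thin-lens relation)
    d_mm = (F_MM + z) * math.tan(theta)  # Eq. 2
    return z, math.degrees(theta), d_mm, d_mm / PIXEL_MM

for x in (30, 500, 10_000):
    print(x, gaze_correction(x))
# 30 mm  -> Z ~ 0.60549, theta ~ 45.0 deg, D ~ 4.57549 mm ~ 3268.2 px
# 500 mm -> Z ~ 0.03177, theta ~ 3.4336 deg, D ~ 0.24011 mm ~ 171.5 px
```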

FIG. 7 is a diagram for describing a method of calculating the distance between the subject and the first lens along the optical axis of the first lens. The first memory 140 further stores a focus control value Fv1 for a far object 710, the distance d1 between the far object 710 and the first lens, a focus control value Fv2 for a near object 720, and the distance d2 between the near object 720 and the first lens. The first lens at the moved position is denoted by reference numeral 121a in FIG. 7.

Referring to FIGS. 5 to 7, the distance X between the subject 510 and the first lens 121 corresponding to a focus control value Focus may be calculated by Equation 4 below.

X = d1 + (d2 - d1) × (Focus - Fv1) / (Fv2 - Fv1)    (4)
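Equation 4 can be sketched as an interpolation between the two stored calibration points. The linear form and the numbers below are assumptions, since only the inputs d1, d2, Fv1, Fv2, and Focus are named in the text:

```python
def subject_distance(focus, fv1, d1, fv2, d2):
    """Estimate the subject distance X for a focus control value by
    linear interpolation between the stored calibration points
    (far object: Fv1/d1, near object: Fv2/d2). Linearity is an
    assumption; the patent's exact functional form is not visible."""
    return d1 + (d2 - d1) * (focus - fv1) / (fv2 - fv1)

# Hypothetical calibration: control value 10 focuses at 5,000 mm,
# control value 90 at 100 mm.
print(subject_distance(50, fv1=10, d1=5000, fv2=90, d2=100))  # 2550.0
```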

Referring back to FIG. 3, in step S160 the controller 166 may, instead of using the first data table, substitute the focus control value Focus into Equation 4 to calculate the distance X between the subject 510 and the first lens 121, and then calculate the focal length (F + Z) based on the calculated X and the initial focal length F corresponding to the initial focus control value.

In step S170, the controller 166 substitutes the known values d1, d2, Fv1, and Fv2 and the focus control value Focus into Equation 4 to determine the distance X between the subject 510 and the first lens 121, and calculates the gaze-point correction value D by referring to the second data table, which indicates the correspondence between the distance X and the gaze-point correction value D. The second data table may further include a focal length field or a focus control value field; in that case the controller 166 calculates the focal length in step S160 and calculates the gaze-point correction value D by referring to the second data table in step S170. That is, the controller 166 can calculate the gaze-point correction value D based on a second data table representing the correspondence between the gaze-point correction value D and at least one of the focus control value Focus, the focal length, and the distance X between the subject 510 and the first lens 121.

In step S200, the controller 166 determines which of focus control and gaze-point control the user has selected, performs step S210 when focus control is selected, and performs step S250 when gaze-point control is selected.

In step S210, the controller 166 calculates the focal length corresponding to a focus control value input by the user.

In operation S220, the controller 166 calculates a gaze correction value corresponding to the focal length.

In operation S250, the controller 166 calculates a focal length corresponding to a gaze correction value according to a user input. For example, the controller 166 may calculate a focal length corresponding to the gaze point correction value based on the second data table.

In step S260, the controller 166 calculates the focus control value corresponding to the calculated focal length. For example, the controller 166 may calculate the focus control value corresponding to the focal length based on the first data table.

It will be appreciated that embodiments of the present invention may be implemented in hardware, software, or a combination of the two. Any such software may be stored, whether or not it is erasable or rewritable, in a machine-readable storage medium such as a volatile or non-volatile storage device (e.g., a ROM), a memory (e.g., a RAM, a memory chip, a device, or an integrated circuit), or an optically or magnetically recordable medium (e.g., a CD, a DVD, a magnetic disk, or a magnetic tape). A memory that can be included in a host device is an example of a machine-readable storage medium suitable for storing a program or programs containing instructions that implement embodiments of the present invention. Accordingly, the present invention includes a program comprising code for implementing the apparatus or method described in any claim herein, and a machine-readable storage medium storing such a program. Such a program may also be transferred electronically through any medium, such as a communication signal transmitted over a wired or wireless connection, and the present invention appropriately includes equivalents thereof.

Reference Signs List: 100: stereo camera system, 110: stereo camera, 120: first camera module, 130: second camera module, 150: stereoscopic image synthesizer, 160: focus-gaze automatic linkage device

Claims (5)

A stereo camera system comprising:
First and second camera modules, spaced apart from each other, configured to capture a subject according to a focus control value and to output first and second images;
A memory storing a data table indicating a correspondence between a gaze-point correction value and at least one of a focus control value, a focal length, a focal length change amount, and a distance between the subject and the first and second camera modules;
A controller configured to provide the focus control value to the first and second camera modules and to output a gaze-point correction value for the first and second images based on the data table; and
An image synthesizer configured to correct the first and second images according to the gaze-point correction value and to synthesize the corrected first and second images to output a stereoscopic image.
The stereo camera system of claim 1,
Further comprising a display unit configured to display the stereoscopic image to a user.
The stereo camera system of claim 1,
Wherein the first and second camera modules are arranged in a parallel-axis configuration, fixed in parallel to each other.
The stereo camera system of claim 1,
Wherein the image synthesizer crops an edge portion of each of the first and second images by a width corresponding to the gaze-point correction value.
The stereo camera system of claim 1,
Wherein the image synthesizer synthesizes the first and second images side by side or top and bottom.
KR1020120001333A 2012-01-05 2012-01-05 Stereoscopic camera system KR101334570B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120001333A KR101334570B1 (en) 2012-01-05 2012-01-05 Stereoscopic camera system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120001333A KR101334570B1 (en) 2012-01-05 2012-01-05 Stereoscopic camera system

Publications (2)

Publication Number Publication Date
KR20130080508A true KR20130080508A (en) 2013-07-15
KR101334570B1 KR101334570B1 (en) 2013-11-28

Family

ID=48992581

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120001333A KR101334570B1 (en) 2012-01-05 2012-01-05 Stereoscopic camera system

Country Status (1)

Country Link
KR (1) KR101334570B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10482627B2 (en) 2016-09-22 2019-11-19 Samsung Electronics Co., Ltd Method and electronic device for calibration of stereo camera
KR20230076490A (en) * 2021-11-24 2023-05-31 (주)에스지유 Method for automatically controlling convergence point in stereoscopic camera and stereoscopic camera system using the same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100778318B1 (en) 2006-02-23 2007-11-22 (주)브이쓰리아이 Stereo camera system for moving lens horizontally
KR101069209B1 (en) * 2008-12-22 2011-09-30 (주)브이쓰리아이 Method and apparatus for controlling convergence in parallel axis typed sterocamera


Also Published As

Publication number Publication date
KR101334570B1 (en) 2013-11-28

Similar Documents

Publication Publication Date Title
US10051258B2 (en) Image processing device, image processing method and image processing computer program product
US8300086B2 (en) Image processing for supporting a stereoscopic presentation
CN102318331B (en) Stereoscopic image pick-up apparatus
US8502863B2 (en) Stereoscopic imaging apparatus
US20140043323A1 (en) Three-dimensional image display apparatus and three-dimensional image processing method
US20130016188A1 (en) Camera module and image capturing method
JP5594067B2 (en) Image processing apparatus and image processing method
WO2012101917A1 (en) Image processing device, image-capturing device, reproduction device, and image processing method
US20130093855A1 (en) Parallel axis stereoscopic camera
US20120062707A1 (en) Method and apparatus for determining a convergence angle of a stereo camera
US20160044305A1 (en) Multiview image display apparatus and control method thereof
WO2019026929A1 (en) Medical observation device
US20130208097A1 (en) Three-dimensional imaging system and image reproducing method thereof
TWI589150B (en) Three-dimensional auto-focusing method and the system thereof
US20130128002A1 (en) Stereography device and stereography method
KR101334570B1 (en) Stereoscopic camera system
US9154771B2 (en) Apparatus for capturing stereoscopic image
WO2012002347A1 (en) Image processing device, imaging device and image processing method
US20140293006A1 (en) Image output device, method, and recording medium therefor
KR20160073613A (en) Image pick-up apparatus, portable terminal including the same and image pick-up method using the apparatus
KR101691292B1 (en) Table-top 3D display system and method
KR102242923B1 (en) Alignment device for stereoscopic camera and method thereof
CN103098479A (en) Image processing device, method and program
KR101746717B1 (en) Output method of view images in three-dimensional display with a different focus lens array
KR20140098040A (en) 3-dimensional moving picture photographing apparatus and camera

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
N231 Notification of change of applicant
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20161028

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20171030

Year of fee payment: 5

LAPS Lapse due to unpaid annual fee