US20040150728A1 - Image pick-up apparatus for stereoscope - Google Patents

Image pick-up apparatus for stereoscope

Info

Publication number
US20040150728A1
US20040150728A1 (Application US 10/763,756)
Authority
US
United States
Prior art keywords
image pick
image
distance
optical
sight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/763,756
Other languages
English (en)
Inventor
Shigeru Ogino
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP33299597A (JP3976860B2)
Priority claimed from JP10323691A (JP2000152282A)
Application filed by Individual filed Critical Individual
Priority to US10/763,756
Publication of US20040150728A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189: Recording image signals; Reproducing recorded image signals
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/207: Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/211: Image signal generators using a single 2D image sensor using temporal multiplexing
    • H04N 13/239: Image signal generators using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/246: Calibration of cameras
    • H04N 13/257: Colour aspects
    • H04N 13/296: Synchronisation thereof; Control thereof
    • H04N 13/30: Image reproducers
    • H04N 13/302: Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/305: Autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N 13/31: Autostereoscopic displays using parallax barriers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/341: Displays for viewing with the aid of special glasses or HMDs using temporal multiplexing
    • H04N 13/344: Displays with head-mounted left-right displays
    • H04N 13/366: Image reproducers using viewer tracking
    • H04N 13/383: Image reproducers using viewer tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N 13/398: Synchronisation thereof; Control thereof
    • H04N 2013/0074: Stereoscopic image analysis
    • H04N 2013/0081: Depth or disparity estimation from stereoscopic image signals

Definitions

  • the present invention relates to an image pick-up apparatus for stereoscope for picking up parallax images for stereoscopic viewing. More particularly, the present invention relates to an image pick-up apparatus for picking up, without user intervention, parallax images that satisfy binocular fusion conditions and an image pick-up apparatus that lets a user know whether the binocular fusion conditions are satisfied.
  • Some known stereoscopic image pick-up apparatuses capture parallax images from a plurality of view points using a plurality of cameras.
  • a plurality of cameras are mounted on tripod heads, and a user manually adjusts a spacing and a convergence angle between camera axes, based on his or her operational experience, depending on a scene to be captured, and image-captures the scene while actually observing the stereoscopic view on a monitor.
  • Head-mounted displays and eyeglass-type displays have been developed today; these displays allow an image for the right eye to be selectively presented to the right eye and an image for the left eye to be selectively presented to the left eye.
  • a stereoscopic image producing a depth perception is thus observed.
  • a liquid-crystal display is combined with a lenticular sheet having a predetermined pitch or a mask with apertures and non-apertures formed in a predetermined pattern to impart a directivity to a light ray from the liquid-crystal display, and by making the directivity match the pattern of the image presented on the liquid-crystal display, the observer watches the image for the right eye on the right eye and the image for the left eye on the left eye. The observer thus enjoys the image presenting depth.
  • the image presented is typically acquired through a binocular-type camera having two lens sets.
  • An apparatus disclosed in Japanese Examined Patent Publication No. 8-27499 (telescopic television image pick-up apparatus) does not require two lens sets.
  • This apparatus includes two liquid-crystal shutters, a total reflection mirror and a half mirror and picks up alternately left and right parallax images through a single lens set.
  • the above binocular-type camera employs two lens sets, one for forming the image for the right eye and the other for forming the image for the left eye. The user feels fatigue when observing a stereoscopic image, or cannot attain image fusion at all, if performance differences due to manufacturing error (for example, in magnification, deviation in optical axis, tint, brightness, distortion, field tilt and the like) are present between the two lenses.
  • To counter this, the accuracy of components needs to be heightened, and adjustment is further required if the improved accuracy of components is still not sufficient.
  • Alternatively, special means is used; for example, images are electronically corrected.
  • Moreover, the zoom variator operations of the left and right lens sets must be interlocked with these performance differences factored in. This arrangement is costly and time-consuming to manufacture, and is inadequate for mass production.
  • the time-division camera proposed in the above-cited Japanese Examined Patent Publication No. 8-27499 combines the optical paths for left and right parallax images at a half mirror into one optical path to guide images to a single lens. When images are transmitted through or reflected from the half mirror before entering the lens, the quantity of light is reduced to half or smaller.
  • the arrangement disclosed in the above-cited Japanese Examined Patent Publication No. 8-27499 theoretically presents a difference between the lengths of the optical paths of the left and right parallax images, and therefore suffers a magnification difference between the left and right images. This causes fatigue in the user observing the picked-up images; as a result, the user cannot fuse the images and cannot observe them in a stereoscopic view.
  • the object of the present invention is achieved by the image pick-up apparatus for capturing images for stereoscopic viewing, which comprises image pick-up means for picking up left and right parallax images of a main subject respectively for the left and right eyes, display means for displaying the left and right parallax images picked up by the image pick-up means, line-of-sight detection means for detecting the lines of sight of the left and right eyes looking at the respective images displayed by the display means, and determining means for determining, based on the output of the line-of-sight detection means, whether the main subject falls within an image-fusible range.
  • the apparatus detects the lines of sight of an observer who observes the display means on which the left parallax image and right parallax image are presented, and, instead of the observer, determines, based on the lines of sight, whether the subject falls within the image-fusible range.
  • the determination result provided by the determining means is reported to the user.
  • an image-fusible range is defined by the inter-pupillary distance and the distance of distinct vision of a user.
  • the image data of the parallax images is stored in a memory in response to the output of the determining means.
  • the line-of-sight detection means further comprises converting means for converting the left and right lines of sight of a user into left and right direction vectors that are respectively expressed in left and right coordinate systems of the image pick-up means, and coordinates calculator means for calculating the coordinate values of the crossing point of the left and right direction vectors in the world coordinate system.
  • the image-fusible range is expressed according to a farthest position and a nearest position from the view point of the user in the direction of depth.
  • the farthest position of the image-fusible range is set to be a point so that the horizontal distance between two second points on the left and right image planes of the image pick-up means corresponding to a first point (A) of the farthest position is substantially equal to the inter-pupillary distance of the user.
  • the nearest position of the image-fusible range is set to be a point (C) such that the position (C′) at which the two corresponding points on the left and right image planes of the image pick-up means appear to stand out to the user, through perspective transformation based on the left and right view points of the user, lies approximately at the distance of distinct vision of the user.
  • the image pick-up apparatus which comprises a first optical system and a second optical system with a predetermined convergence angle made therebetween, a first electronic shutter and a second electronic shutter for electronically blocking the respective optical paths of the first and second optical systems, control means for driving the first and second electronic shutters in a time-division manner, optical path integrator means for integrating the optical paths of the first and second optical systems, a third optical system having the integrated optical path, a charge-coupled device for photoelectrically converting an optical image transmitted through the third optical system, reading means for reading the output of the charge-coupled device in a time-division manner in synchronization with time-division driving of the first and second electronic shutter by the control means, distance measuring means for measuring a distance to a subject, and adjusting means for adjusting the convergence angle between the first and second optical systems in accordance with the measured distance.
  • the image pick-up apparatus is further made compact by incorporating the optical path integrator means which comprises a prism arranged at the entrance of the optical path of the third optical system, a first mirror for deflecting the optical path of the first optical system toward the prism, a second mirror for deflecting the optical path of the second optical system toward the prism, and the adjusting means which comprises angle adjusting means for controlling the angles of pivot of the first and second mirrors.
  • the angle adjusting means pivots the first and second mirrors by the same angle but in opposite directions.
  • the distance measuring means uses a triangulation method.
  • distance measurement is performed based on position information about the plurality of lenses used in the first and second optical systems.
  • the image pick-up apparatus further comprises detector means for detecting a change, in the distance to the subject, in excess of a predetermined value, and activating means for activating the adjusting means in response to the output of the detector means.
  • the image pick-up apparatus further comprises detector means for detecting a predetermined number or larger number of occurrences of changes in the distance to the subject, each change in excess of a predetermined value, and activating means for activating the adjusting means in response to the output of the detector means.
  • the image pick-up apparatus further comprises a camera main unit for processing an output signal from the charge-coupled device, a lens unit for driving the optical systems, and an interconnection unit for electrically connecting the camera main unit to the lens unit.
  • the image pick-up apparatus further comprises a camera mount for the camera main unit and a lens mount for the lens unit, wherein the lens unit is detachably mounted onto the camera main unit.
  • FIG. 1 is a block diagram showing the construction of a first embodiment of the present invention
  • FIG. 2 is an explanatory diagram showing the position of the eyes of an observer relative to an optical system of an image pick-up apparatus of the first embodiment of the present invention
  • FIG. 3 is a flow diagram showing a control process for measuring a distance to a subject based on the line-of-sight data of the observer in the first embodiment of the present invention
  • FIG. 4 is an explanatory diagram showing the operation executed in step S 8 through step S 10 in the control process shown in FIG. 3;
  • FIG. 5 is an explanatory diagram illustrating the principle of determining an image-fusible range
  • FIG. 6 is an explanatory diagram illustrating the principle of determining the farthest position of the image-fusible range
  • FIG. 7 is an explanatory diagram illustrating the principle of determining the nearest position of the image-fusible range
  • FIG. 8 is an explanatory diagram illustrating the principle of determining the nearest position of the image-fusible range
  • FIG. 9 is a flow diagram showing the control process for determining the image-fusible range
  • FIG. 10 is a block diagram of an image pick-up apparatus according to a second embodiment of the present invention.
  • FIG. 11 is an explanatory diagram showing a distance measurement method (triangulation) employed in the second embodiment of the present invention.
  • FIG. 12 is an explanatory diagram illustrating a drive signal of a liquid-crystal shutter of the second embodiment of the present invention.
  • FIG. 13 is an explanatory diagram illustrating the generation process of the drive signal shown in FIG. 12.
  • FIG. 14 is a flow diagram illustrating a convergence control according to the second embodiment of the present invention.
  • FIG. 1 is a block diagram showing the construction of a “stereoscopic image pick-up apparatus” according to a first embodiment of the present invention.
  • Shown in FIG. 1 is a stereo camera 1 for capturing a plurality of parallax images.
  • the stereo camera 1 comprises a right optical system 101 for capturing a parallax image for the right eye and CCD 103 of a photoelectric converter and a left optical system 102 for capturing a parallax image for the left eye and CCD 104 of a photoelectric converter.
  • the pair of optical system 101 and CCD 103 is identical in optical specification to the pair of optical system 102 and CCD 104 .
  • the optical system 101 and CCD 103 and the optical system 102 and CCD 104 are arranged so that the spacing between the two pairs (hereinafter referred to as a base line distance) and the angle made between the two pairs (hereinafter referred to as a convergence angle) are variably set with an unshown mechanism.
  • the base line distance and the convergence angle are adjusted by a known convergence/base line distance adjuster 11 .
  • a convergence angle/base line distance detector 10 detects the convergence angle and base line distance with its unshown encoder.
  • FIG. 2 shows the positions of the optical systems 101 and 102 relative to a main subject.
  • Let θ b represent the convergence angle between the optical systems 101 and 102 , and let 2k represent the base line distance.
  • the adjuster 11 adjusts the convergence angle θ b and the base line distance 2k, and the adjusted angle is detected by the detector 10 , and is then sent to a calculating unit 3 (FIG. 1) to be described later.
  • An image display 2 includes a right display unit 201 for the right eye, a left display unit 202 for the left eye, a right line-of-sight detector 203 , and a left line-of-sight detector 204 .
  • the right display unit 201 and left display unit 202 are identical in specification, and are so-called retinal displays, each equipped with an observation optical system, that provide an image through the after-image effect by irradiating and scanning the retina with a light beam from a liquid crystal, CRT, LED or laser.
  • the display units 201 and 202 respectively scan the left and right retinas by a light beam from an LED circuit 12 to give left and right stereoscopic images to the user.
  • a right line-of-sight detector 203 and left line-of-sight detector 204 in image display 2 employ the method of detecting the line of sight from the so-called corneal reflection light.
  • the image display 2 includes liquid-crystal displays LCD 205 and LCD 206 . This method is disclosed in detail in Japanese Unexamined Patent Publication No. 5-68188, and the discussion of the method is omitted here.
  • the line-of-sight information about the eyeball obtained by the line-of-sight detectors 203 and 204 is sent to the calculating unit 3 to be described later, and serves as a basis for the calculation of the direction vector of the line of sight.
  • the calculating unit 3 of the image pick-up apparatus calculates an image-pickable range of the stereo camera 1 that assures that the left and right parallax images picked up by the stereo camera 1 cause image fusion for the user (such a range is hereinafter referred to as the “image-fusible range”).
  • the calculating unit 3 , substantially realized by computer software, is shown in a functional block diagram in FIG. 1, and comprises a right line-of-sight range calculator 301 , a left line-of-sight range calculator 302 , a line-of-sight vector calculator 303 , a crossing point calculator 307 , a distance calculator 305 , an image-pickable range calculator 304 , and a comparator 306 .
  • the two line-of-sight range calculators 301 and 302 are respectively connected to the right line-of-sight detector 203 and the left line-of-sight detector 204 , and calculate the line-of-sight ranges for the left and right eyes of the user based on the outputs from the two detectors 203 and 204 .
  • the left and right line-of-sight ranges calculated are input to the line-of-sight vector calculator 303 , which calculates the direction vectors of the left and right lines of sight (d R and d L to be described later).
  • the crossing point calculator 307 determines the coordinates of the crossing point of the lines of sight, based on the detected direction vectors d R and d L of the lines of sight.
  • the distance calculator 305 calculates a distance l from the view point of the user to the subject, based on the detected direction vectors of the lines of sight.
  • the image-pickable range calculator 304 calculates the image-pickable range (namely, between the distal limit A of the range and the proximal limit B of the range in the direction of depth).
  • the comparator 306 compares the distance l to the subject, calculated by the distance calculator 305 , to the limit values A and B, calculated by the image-pickable range calculator 304 , to determine whether the subject meets an image fusion condition. Specifically, if the distance to the subject, calculated by the distance calculator 305 , meets the following equation:
  • B ≦ l ≦ A
  • the subject is within the image-pickable range, and is determined to meet the image fusion condition.
  • An image controller 4 controls the units in the image pick-up apparatus. Specifically, the image controller 4 controls right and left camera processors 5 and 6 , for the right and left eyes, which convert image signals from CCDs 103 and 104 into a predetermined image format. These processors 5 and 6 , under the control of the image controller 4 , convert the left and right parallax images picked up by the stereo camera 1 into the predetermined format and store them in an image memory 9 .
  • the images in the image memory 9 are read and converted into video signals by the image controller 4 , and the video signals are then sent to the display units 201 and 202 via drivers 7 and 8 , respectively.
  • the image controller 4 exchanges data with the image memory 9 .
  • the image controller 4 also sends to the LED circuit 12 a signal that controls the lighting and extinction of an unshown LED installed in the image display 2 .
  • FIG. 3 shows the operation starting at the line-of-sight detectors 203 and 204 and ending with the calculating unit 3 , particularly the process for determining the direction vectors of the lines of sight based on the data detected by the line-of-sight detectors 203 and 204 .
  • the subject is picked up using the stereo camera, and the parallax images in stereoscopic view are presented to the user using the display. Specifically, observing the left and right parallax images presented on the display, the observer indirectly looks at the subject. The image pick-up apparatus then detects the lines of sight of the observer's eyes.
  • Step S 1 corresponds to the line-of-sight detectors 203 and 204 shown in FIG. 1
  • steps S 2 and S 3 correspond to the line-of-sight range detectors 301 and 302 shown in FIG. 1
  • step S 4 corresponds to the line-of-sight vector calculator 303
  • steps S 5 , S 6 , S 8 , S 9 and S 10 correspond to the crossing point calculator 307
  • step S 7 corresponds to the distance calculator 305 .
  • In step S 1 , the lines of sight, namely, the directions of rotation of the eyeballs, are detected based on the corneal reflection phenomenon.
  • each of the line-of-sight detectors 203 and 204 has its own LCD and CCD.
  • LCD 205 and LCD 206 irradiate the eyeballs of the observer with infrared light beams having a known pattern. The patterned beams are reflected by the corneas, and are then captured by CCD 207 and CCD 208 as images.
  • In step S 2 , the coordinate values of each beam spot in each LCD coordinate system are determined.
  • the coordinate values of the beam spot in each LCD coordinate system represent the angle of rotation of the respective eyeball.
  • In step S 3 , the coordinate values of the beam spot in each LCD coordinate system obtained in step S 2 are converted into coordinate values in the coordinate systems of the stereo camera 1 (namely, the coordinate systems of CCD 103 and CCD 104 ).
  • Step S 4 determines direction vectors d R and d L of straight lines that connect the coordinates of the left beam spot and right beam spot, expressed in the coordinate systems of the stereo camera 1 , to the reference points of the optical systems 101 and 102 .
  • the direction vectors are lines of sight expressed in the world coordinate system.
  • the line of sight expressed in the world coordinate system is hereinafter referred to as “line-of-sight vector” or simply “direction vector”.
  • Step S 5 determines whether the right direction vector d R and left direction vector d L are present in the same plane.
  • If they are, the controller 4 goes to step S 6 to calculate the crossing point (designated X) of the right direction vector d R and left direction vector d L in the world coordinate system.
  • If they are not, in step S 8 a plane P in which the right direction vector d R and the reference point of the right optical system 101 lie is determined, as shown in FIG. 4.
  • In step S 9 , an orthogonal projection vector d L ′ of the left direction vector d L onto the plane P is determined.
  • In step S 10 , the coordinates of the crossing point (designated X′) of the right direction vector d R and the projection vector d L ′ are calculated in the world coordinate system.
  • Step S 7 determines the distance l between the predetermined reference point of the stereo camera 1 (the center point between the reference point of the right optical system 101 and the reference point of the left optical system 102 ) and the crossing point determined in either step S 6 or step S 10 (X or X′).
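The geometry of steps S 4 through S 10 can be sketched as follows. This is a minimal Python/NumPy illustration rather than the patent's implementation: the function name is invented here, and for skew rays it takes the midpoint of the shortest segment between the two rays as a common stand-in for the plane-projection construction of steps S 8 through S 10.

```python
import numpy as np

def gaze_crossing_distance(o_r, d_r, o_l, d_l, eps=1e-12):
    """Distance l from the stereo camera reference point (the midpoint of the
    two lens reference points) to the crossing point of the lines of sight.

    o_r, o_l -- reference points of the right/left optical systems (world coords)
    d_r, d_l -- direction vectors of the right/left lines of sight (step S 4)
    """
    o_r, o_l = np.asarray(o_r, float), np.asarray(o_l, float)
    d_r = np.asarray(d_r, float) / np.linalg.norm(d_r)
    d_l = np.asarray(d_l, float) / np.linalg.norm(d_l)

    w = o_r - o_l
    b = float(np.dot(d_r, d_l))
    denom = 1.0 - b * b              # zero when the two rays are parallel
    if abs(denom) < eps:
        raise ValueError("lines of sight are parallel; no crossing point")

    # Closest points on the two rays; if the rays are coplanar these coincide
    # (the crossing point X of step S 6), otherwise their midpoint stands in
    # for the crossing point X' of steps S 8 through S 10.
    t_r = (b * np.dot(d_l, w) - np.dot(d_r, w)) / denom
    t_l = (np.dot(d_l, w) - b * np.dot(d_r, w)) / denom
    x = 0.5 * ((o_r + t_r * d_r) + (o_l + t_l * d_l))

    return float(np.linalg.norm(x - 0.5 * (o_r + o_l)))   # step S 7
```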
  • a range within which humans can fuse left and right parallax images presented on display screens as a stereoscopic image differs between the distal side (beyond the display screens) and the proximal side (on this side of the display screens).
  • the image-fusible range depends on the characteristics of the human eyes. Considering that the inter-pupillary distance of the human is about 65 mm wide, and that the distance of distinct vision of the human is about 200 mm, this system determines a distal limit A based on the human inter-pupillary distance (about 65 mm) and a proximal limit C based on the distance of distinct vision.
  • the y axis represents the direction of depth
  • the z axis represents the vertical direction of the apparatus.
  • the x axis is perpendicular to the direction of depth.
  • FIG. 5 is drawn at a constant value of z.
  • In FIG. 5 there are shown a reference point O L of the left optical system 102 (the lens center when the optical system is considered as a single thin lens) and a reference point O R of the right optical system 101 (the lens center when the optical system is considered as a single thin lens).
  • the reference points O L and O R are fixed points, and the crossing point B of the optical axes of the optical systems 101 and 102 also becomes fixed.
  • Let A represent a distal limit point of the image-fusible range with respect to the point B, and let C represent a proximal limit point of the image-fusible range with respect to the point B.
  • the point B then coincides with the center of the image screen of the left and right optical systems 102 and 101 . Since the point B has zero parallax, the point B appears as a point on the display screen to the observer looking at the stereoscopic image displayed. To the same observer, the point A appears as point present beyond the display screen and the point C appears as a point on this side of the display screen.
  • the point B is the crossing point of the optical axes of the two optical systems 101 and 102 as already described, it is possible to make an image pick-up plane coincide with a plane which is perpendicular to each optical axis and in which the point B lies.
  • a virtual image pick-up plane of the left optical system 102 is set up to be aligned on the point B.
  • the distal point A and the proximal point C respectively become a point A L and a point C L on the virtual image pick-up plane.
  • Let 2θ W represent the lens view angle of each of the optical systems 101 and 102 .
  • the display units 201 and 202 presenting the parallax images each have a horizontal width of 2W S , and are positioned in front of the observer with a predetermined distance d S allowed therebetween.
  • the length of the virtual image pick-up plane shown in FIG. 6 is preferably equal to the horizontal dimension of each of the display units 201 and 202 .
  • the size of the virtual image pick-up plane is 2W S .
  • the points A and C are present within an angle range of 2θ W .
  • the left eye of the observer sees the point A as the point A L ′ on the image pick-up plane, though the point A is actually present straight ahead of the observer.
  • the reason why there is a distance BA′ L between the point B and the point A′ L is that the two eyes of a human are spaced apart, producing a parallax.
  • the distance BA′ L is the quantity that may be called a parallax of the subject, and is expressed as a deviation on the image.
  • Let D A represent the deviation of the point A and let D C represent the deviation of the point C (θ a and θ c being the angles that the points A and C subtend from the optical axis at the lens reference point).
  • D A and D C are thus expressed by the following equations:
  • D A = W S · tan θ a / tan θ W   (EQ1)
  • D C = W S · tan θ c / tan θ W   (EQ2)
  • the left and right parallax images are obtained by picking up a subject 200 on the stereo camera 1 of FIG. 1 (with its base line distance 2k and convergence angle θ b ) and the parallax images are displayed on the display units 201 and 202 (with their horizontal dimensions of 2W S ) placed at a position in front of the observer with the predetermined distance d S allowed. A determination is made of whether the left and right parallax images are displayed in a fused state to the eyes of the observer.
  • the determination is made depending on whether the distance l from the stereo camera 1 to the subject satisfies the above parameters (namely, the fusion condition, to be described later, determined by the base line distance of 2k, the convergence angle of θ b , the distance d S to the display units and the size 2W S of the display units).
  • the following equation thus holds:
  • l ≦ k · (d h · tan θ W + W S · tan θ b ) / (W S − d h · tan θ b · tan θ W )   (EQ6)
  • the right side of EQ6 signifies the distal limit of the image-fusible range beyond the screen with the point B aligned with the center of the screen. Specifically, if the subject 200 is beyond the point B but on this side of the point A defined by the right side of EQ6, the subject 200 is within the fusible range.
  • k and θ b are obtained by the convergence angle/base line distance detector 10 .
  • θ W is a value known from lens data.
  • W S is also a value known from the display condition.
  • d h is set here at about 65 mm, though the value of d h is not limited to this.
  • FIG. 7 shows not only the virtual image pick-up plane 250 L but also the virtual image pick-up plane 250 R. Specifically, images A L and C L , corresponding to the points A and C, are obtained on the left image pick-up plane 250 L, and images A R and C R , corresponding to the points A and C, are obtained on the right image pick-up plane 250 R. If the observer looks at the image pick-up planes with both eyes, the left and right fields of view are combined on a single virtual plane 400 . On the virtual plane 400 , points A′ L and A′ R are formed for the point A, and points C′ L and C′ R are formed for the point C.
  • the point A appears to be present at a crossing point A′ where a straight line 300 L connecting the left eye to the point A′ L intersects a straight line 300 R connecting the right eye to the point A′ R .
  • the point C appears to be present at a crossing point C′ where a straight line 350 L connecting the left eye to the point C′ L intersects a straight line 350 R connecting the right eye to the point C′ R .
  • the point A appears to stand back by a distance d B .
  • the point C appears to be projected forward, the projected position C′ lying at a distance d P from the view point, as shown in FIG. 7.
  • FIG. 8 shows the relationship of the projection distance d P , the distance d S to the display unit and the deviation D C .
  • D C = d h · (d S − d P ) / d P   (EQ7)
  • Solving EQ2 and EQ7 for the distance l yields EQ10:
  • l ≧ k · {d P · W S · tan θ b − d h · (d S − d P ) · tan θ W } / {d P · W S + d h · (d S − d P ) · tan θ b · tan θ W }   (EQ10)
  • the right side of EQ10 signifies the proximal limit value of the image-fusible range with the screen center aligned with the point B. Specifically, the image-fusible range on this side of the point B is beyond the point C defined by the right side of EQ10.
  • k and θ b are obtained by the convergence angle/base line distance detector 10 .
  • θ W is a value known from lens data.
  • W S and d S are values known from the display condition.
  • the distance l to the subject 200 that satisfies the fusion condition is expressed by the following equation:
  • k · (d h · tan θ W + W S · tan θ b ) / (W S − d h · tan θ b · tan θ W ) ≧ l ≧ k · {d P · W S · tan θ b − d h · (d S − d P ) · tan θ W } / {d P · W S + d h · (d S − d P ) · tan θ b · tan θ W }   (EQ11)
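As a numerical illustration of EQ6, EQ10 and EQ11 as reconstructed above, the two limits can be computed as follows. This is a sketch only: the function names and the SI units are assumptions of this illustration, the angle convention for θ b follows the reconstructed equations, and the defaults d h = 65 mm and d P = 200 mm follow the values the text gives for the inter-pupillary distance and the distance of distinct vision.

```python
import math

def fusion_limits(k, theta_b, theta_w, w_s, d_s, d_h=0.065, d_p=0.2):
    """Distal limit A (EQ6) and proximal limit C (EQ10) of the image-fusible
    range, in the same units as k, w_s and d_s (metres here).

    k       -- half the base line distance
    theta_b -- convergence angle, in radians
    theta_w -- half the lens view angle (the full view angle is 2*theta_w)
    w_s     -- half the horizontal display size (the display is 2*w_s wide)
    d_s     -- distance from the observer to the display units
    d_h     -- inter-pupillary distance (about 65 mm per the text)
    d_p     -- distance of distinct vision (about 200 mm per the text)
    """
    tb, tw = math.tan(theta_b), math.tan(theta_w)
    a = k * (d_h * tw + w_s * tb) / (w_s - d_h * tb * tw)              # EQ6
    c = k * (d_p * w_s * tb - d_h * (d_s - d_p) * tw) / \
        (d_p * w_s + d_h * (d_s - d_p) * tb * tw)                      # EQ10
    return a, c

def meets_fusion_condition(l, a, c):
    """EQ11: the subject at distance l is fusible when C <= l <= A."""
    return c <= l <= a
```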
  • the image pick-up system has a function of determining whether the subject being captured meets the fusion condition, namely, falls within the image-pickable range (C ≦ l ≦ A), and informing the user of the determination result.
  • FIG. 9 is a flow diagram showing the control process that incorporates the user informing function.
  • In step S 11 , the convergence angle θ b and the base line distance 2k, set by the user for the optical systems 101 and 102 , are read.
  • In step S 12 , the known size W S of the display units, the distance d S to the display units, the inter-pupillary distance d h , and the projection distance d P are also read.
  • In step S 13 , the distal fusion limit point A is determined according to EQ6.
  • In step S 14 , the proximal fusion limit point C is determined according to EQ10.
  • In step S 15 , the image-pickable range is determined.
  • the comparator 306 compares the distance l to the subject which the user looks at, calculated by the distance calculator 305 (namely, the distance to the subject now being image-picked up), with the distal fusion limit point A and the proximal fusion limit point C, namely, with the image-pickable range determined in step S 15 . The comparator 306 thus determines whether the object of the view point falls within or out of the image-pickable range. The comparison result is sent to the image controller 4 .
  • Either the display unit 201 or display unit 202 or both are provided with LEDs (not shown), and the display units 201 and 202 light or flash the LEDs to indicate whether the subject is within or out of the range.
  • the LEDs may be lit to indicate that the subject is out of the range and may be extinguished to indicate that the subject is within the range.
  • the LEDs may be flashed to indicate that the subject is out of the range and may be extinguished to indicate that the subject is within the range.
  • the image display 2 may present the result of the above determination.
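Put together, the informing function of FIG. 9 reduces to a small decision rule. The sketch below is illustrative only; the led object with off() and flash() methods is a hypothetical stand-in for the unshown LEDs of the display units 201 and 202, and the flash-when-out-of-range policy is one of the alternatives listed above.

```python
def report_fusion_state(l, a, c, led):
    """Compare the subject distance l with the image-pickable range [C, A]
    (FIG. 9) and report the result to the user via an LED driver."""
    if c <= l <= a:
        led.off()      # subject within range: LED extinguished
    else:
        led.flash()    # subject out of range: LED flashed (lighting it is an alternative)
```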
  • the present invention is not limited to the first embodiment, and a diversity of modifications of the first embodiment are contemplated.
  • the observer looks at the image of the subject displayed on the display units and the lines of sight of the observer are then detected.
  • the present invention also works to detect the lines of sight of the observer (user) who directly looks at the subject. This is because the distance l to the subject is also determined from the lines of sight of the user who directly looks at the subject.
  • No particular optical specifications are set for the optical system 101 and CCD 103 . Any systems are perfectly acceptable as long as the pair of the optical system 101 and CCD 103 is identical, in optical specification, to the pair of the optical system 102 and CCD 104 .
  • In the first embodiment, image picking is performed in a cross method in which the camera optical axes cross each other.
  • Alternatively, image picking may be performed in a parallel method in which the camera optical axes run in parallel.
  • the line-of-sight detection of the present invention is not limited to the method in which the angle of rotation of the eyeball is detected.
  • any other method may be used, including the EOG (electro-oculography) method making use of a voltage difference across the eyeballs, the method using a difference in reflectance between the white (sclera) of the eye and the pupil, and the search coil method in which a contact lens with an embedded coil is worn in the presence of a magnetic field to measure the movement of the eyeballs.
  • the image display 2 in the first embodiment is a fixed type.
  • the present invention also works in any other appropriate display means such as a head-mounted display (HMD).
  • the medium of the image memory 9 in the first embodiment is a magnetic tape.
  • the present invention is not limited to this.
  • the image memory 9 may be an IC memory, a magneto-optical disc, a DVD disc, a compact disc, or a PD disc.
  • the present invention permits the left and right parallax images to be picked up through a single lens set, without requiring two lens sets.
  • the present invention thus implements compact and low-cost design in the apparatus, eliminates the characteristic difference between the left and right lenses, and picks up a high-quality stereoscopic image with the simple construction.
  • the optical path lengths of the left and right parallax images to the subject are equalized. With this arrangement, the image pick-up apparatus is thus free from a magnification difference between the left and right images, presenting a high-quality stereoscopic image.
  • the lens of the system may be used, and the camera section needs no particular modification in order to pick up a stereoscopic image.
  • An ordinary two-dimensional lens may be used in the same camera. This arrangement provides expandability and increases the merit of the camera.
  • a plurality of lenses, including the lens for stereoscopic image picking and lenses of different specifications, for example, different focal lengths, may be commonly used on the same camera.
  • FIG. 10 shows the basic construction of a second embodiment of a stereoscopic image pick-up apparatus 1000 .
  • the stereoscopic image pick-up apparatus 1000 includes a lens unit 1001 and a main unit 1002 .
  • the interchangeable lens unit 1001 is standardized according to a predetermined format, and includes an image pick-up optical system 1100 , a liquid-crystal control circuit 1123 , an IG driver 1124 , motor drivers 1125 and 1126 , a lens microcomputer 1127 for controlling these units, an image input terminal 1128 , and a lens mount (not shown) and a contact block (not shown), both standardized according to the predetermined format.
  • the main unit 1002 includes the unshown camera mount and the contact block, which are standardized according to the predetermined format.
  • the lens mount of the lens unit 1001 is mechanically coupled with the camera mount in a detachable manner.
  • the contact block of the lens unit 1001 and the contact block of the main unit 1002 are mated with each other, with the contacts of the two contact blocks connected.
  • the operation represented by an arrow 1007 , namely, data communications between the lens microcomputer 1127 and a camera microcomputer 1208 , is performed by a predetermined protocol.
  • power is supplied from the camera side to the lens side through the respective contact blocks of the camera and lens sides.
  • the image pick-up optical system 1100 has total reflection mirrors 1107 and 1112 , both pivotally supported about respective predetermined axes and driven by drive units 1009 and 1011 .
  • the drive units 1009 and 1011 employ stepping motors in this embodiment.
  • DC motors or ultrasonic motors may be used.
  • drivers 1010 and 1012 feed drive signals to the stepping motors 1009 and 1011 .
  • the lens microcomputer 1127 counts the number of steps of the stepping motors 1009 and 1011 to detect the angle of rotation of each stepping motor.
  • when DC motors or ultrasonic motors are employed, the drivers 1010 and 1012 may be provided with encoders, by which the angles of pivot of the mirrors are detected.
  • the mirrors 1107 and 1112 are pivoted about their respective predetermined axes to variably change the directions of optical axes 1004 and 1005 .
  • the mirror 1107 ( 1112 ) turns about a straight line as the axis of pivot, which passes in the vicinity of the intersection of the optical axis 1004 ( 1005 ) and the mirror 1107 ( 1112 ), and is aligned perpendicular to the page of FIG. 10, namely, aligned in the vertical direction of the screen.
  • the optical axes 1004 and 1005 of the left and right images generally lie in the same plane, and intersect (or converge) at any predetermined position including infinity.
  • the mirrors 1107 and 1112 are pivoted about the respective predetermined axes to change the converged position of the optical axes. Making the convergence variable is essentially important to pick up a natural-looking image.
  • the spacing between the intersection of the optical axis 1004 and the reflective surface of the mirror 1107 and the intersection of the optical axis 1005 and the reflective surface of the mirror 1112 (hereinafter referred to as a “base line distance”) is about 63 mm in the second embodiment, though the present invention is not limited to this value.
  • the base line distance is set to be equal to the average inter-pupillary distance of humans, namely, 63 mm.
  • a subject distance measuring unit 1008 measures a distance to the subject using a triangulation method.
  • FIG. 11 illustrates the principle of distance measurement using the triangulation.
  • the subject distance measuring unit 1008 includes a projection lens and light receiving lenses, an infrared LED (IRED) of light emitting means, line sensors, each having a plurality of photocells of light receiving means arranged in a line, and calculator means for calculating the distance l to the subject in response to the output of the line sensors.
  • the light ray emitted from IRED is reflected from the subject, and is then received by the left light receiving lens L and the right light receiving lens R, and is then focused on the left line sensor L and right line sensor R. It is now determined which of the photocells receives the light ray.
  • Suppose that the photocell X L of the sensor L receives the light ray and that the photocell X R of the sensor R receives the light ray.
  • the subject distance l is then determined from the positions of X L and X R .
  • the subject distance information is calculated using the triangulation in this embodiment. It is also possible to derive the subject distance information from the position information of lenses.
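A sketch of the triangulation of FIG. 11 follows. The patent does not spell out the formula, so the standard active-triangulation relation l = (baseline × focal length) / (x L + x R) is assumed here, with x L and x R the positions of the lit photocells measured from each receiving lens axis toward the opposite lens; the function name is likewise an assumption.

```python
def subject_distance(baseline, focal_length, x_l, x_r):
    """Distance l to the subject from the positions of the photocells X_L
    and X_R that receive the reflected IRED spot.

    baseline     -- spacing between the two light receiving lenses L and R
    focal_length -- distance from each receiving lens to its line sensor
    x_l, x_r     -- lit-photocell offsets from each lens axis (assumed sign
                    convention: positive toward the opposite lens)
    """
    disparity = x_l + x_r
    if disparity <= 0:
        return float("inf")   # reflected spot effectively at infinity
    return baseline * focal_length / disparity
```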
  • the image pick-up apparatus in the first embodiment employs two separate, left and right optical systems, while the image pick-up apparatus in the second embodiment has a common portion.
  • the optical system of the second embodiment is now discussed.
  • the optical system of the second embodiment includes a set of lenses 1108 , 1113 , and 1115 , each constructed of a single or a plurality of elements and having a negative refractive power, a lens set of lenses 1116 , 1117 , and 1119 , each constructed of a single or a plurality of elements and having a positive refractive power, a prism 1110 with its surfaces 1109 and 1111 featuring total reflection, polarizers 1101 , 1103 , 1104 and 1106 , and liquid-crystal elements 1102 and 1105 . On the left optical path, the polarizers 1101 and 1103 and liquid-crystal element 1102 are combined.
  • the electric field applied on the liquid crystal is controlled to transmit or not to transmit a luminous flux through the liquid crystal.
  • the same is true of the combination of the polarizers 1104 and 1106 and liquid-crystal element 1105 .
  • the liquid-crystal elements 1102 and 1105 serve as liquid-crystal shutters in their respective optical paths.
  • the liquid-crystal elements employ FLC (ferroelectric liquid crystal).
  • Alternatively, TN (twisted nematic liquid crystal) or STN (super twisted nematic liquid crystal) may be used.
  • the polarizers 1101 and 1103 , and the polarizers 1104 and 1106 may be respectively glued onto the liquid crystals 1102 and 1105 using an adhesive agent, or may be supported on their own.
  • An aperture 1114 serves as a light quantity adjuster.
  • the aperture 1114 is arranged in front of the lens set, on its side facing the object, so that the effective diameter of the fore-element is reduced.
  • An IG meter 1120 controls the opening of the aperture 1114 .
  • Step motors 1121 and 1122 move respective lenses.
  • the lens set 1108 , 1113 and 1115 is fixed.
  • the lens 1116 is a variator
  • the lens 1117 is a compensator
  • the lens 1119 has the focusing function, and all these lenses are movable.
  • the variator lens 1116 and the compensator lens 1117 are mechanically interlocked with a barrel 1118 with a cam and are supported in such a manner that allows the variator lens 1116 and the compensator lens 1117 to be moved in the direction of optical axis. With the cam barrel 1118 rotated by the stepping motor 1121 , zooming is performed.
  • the driving method is not limited to this.
  • the lenses 1116 and 1117 may be separately driven by separate drive means without using the cam barrel.
  • the stepping motor 1122 drives the lens 1119 .
  • electromagnetic motors such as DC motors, ultrasonic motors, and electrostatic motors may be used.
  • the positions of the lenses 1116 , 1117 and 1119 in the direction of optical axis are detected by counting drive pulses of the stepping motors that drive these lenses and converting them into the positions of the lenses.
  • the present invention is not limited to this position detection means.
  • a variable resistor method, an electrostatic capacity method, or an optical method with PSD or IRED may be used.
  • the IG meter 1120 adjusts the light quantity by driving the aperture 1114 .
  • An unshown ND filter is arranged in the lens unit 1001 .
  • a rear-focus zoom type is used here. Specifically, during zooming, the lens microcomputer 1127 drives lenses 1116 , 1117 and 1119 to move them in a predetermined relationship. The present invention is not limited to this zoom type.
  • the main unit 1002 includes image pick-up elements 1201 , 1202 , and 1203 in a three-plate image pick-up unit 1200 , amplifiers 1204 , 1205 , and 1206 respectively connected to the image pick-up elements 1201 , 1202 , and 1203 , a signal processing circuit 1207 connected to each of the amplifiers 1204 , 1205 , and 1206 , the camera microcomputer 1208 connected to the signal processing circuit 1207 , unshown zoom switch and AF switch, both connected to the camera microcomputer 1208 , an image output terminal 1209 , and an electronic view finder (hereinafter referred to as EVF) 1003 .
  • EVF electronic view finder
  • the signal processing circuit 1207 includes a camera signal processing circuit 1207 a and an AF signal processing circuit 1207 b .
  • the output of the camera signal processing circuit 1207 a is output as a video signal, and the output of the camera microcomputer 1208 is output to the lens microcomputer 1127 in the lens unit 1001 .
  • the three-plate image pick-up unit 1200 separates an incident light ray picked up by the image pick-up optical system 1100 into the three primary colors through a first through a third prism (hereinafter referred to as color separation prisms).
  • the red component of the three primary colors is focused on the image pick-up element 1201
  • the green component is focused on the image pick-up element 1202
  • the blue component is focused on the image pick-up element 1203 .
  • the images of the subject formed on the image pick-up elements 1201 , 1202 , and 1203 are photoelectrically converted into electric signals, which are then respectively fed to the amplifiers 1204 , 1205 , and 1206 .
  • the electric signals are amplified by the respective amplifiers 1204 , 1205 , and 1206 to their proper levels, and are then converted by the camera signal processing circuit 1207 a into a standard television signal to be output as the video signal.
  • the electric signals are also fed to the AF signal processing circuit 1207 b .
  • the AF signal processing circuit 1207 b generates an AF assessment value signal using the three primary color signals from the amplifiers 1204 , 1205 , and 1206 .
  • the camera microcomputer 1208 reads the AF assessment value signal generated in the AF signal processing circuit 1207 b and transfers it to the lens microcomputer 1127 .
  • the lens microcomputer 1127 drives the lens 1119 for focus control, based on the incoming AF assessment value signal.
  • the image output terminal 1209 of the main unit 1002 is connected to the image input terminal 1128 of the lens unit 1001 via a cable 1129 , and the image signal picked up in the main unit is fed to the liquid-crystal control circuit 1123 of the lens unit side.
  • the video signal in the second embodiment is an NTSC interlaced signal. A video signal of 60 fields per second is thus output.
  • the vertical synchronizing pulse and horizontal synchronizing pulse of the video signal assure synchronization.
  • the vertical synchronization pulse is superimposed at the head portion of each video signal of 60 fields.
  • the liquid-crystal control circuit 1123 separates a vertical synchronization pulse at a rate of 1/60 second from the video signal input through the image input terminal 1128 , as shown in FIG. 12 and FIG. 13.
  • An odd/even field signal is derived from the input video signal to identify whether it is an odd field or an even field.
  • the determination of the odd field or even field is made depending on whether the vertical synchronizing pulse coincides with the edge of the horizontal synchronizing pulse (odd field) or is delayed from the edge of the horizontal synchronizing pulse by 1/2 H, where H is the horizontal synchronization period (even field).
  • a left liquid-crystal drive signal for the left eye and right liquid-crystal drive signal for the right eye are derived from the vertical synchronization pulse and the odd/even field signal, and are output to CCD 1200 and the liquid-crystal control circuit 1123 .
  • Each of the left and right liquid-crystal drive signals is alternately reversed in logic value in a time-division manner, and provides image pick-up timing to CCD 1200 so that CCD 1200 picks up the left and right parallax images in a time-division manner.
  • the time-division driving of the liquid crystals 1102 and 1105 is synchronized with the selection of the left and right parallax images by CCD 1200 .
  • While CCD 1200 selectively picks up one parallax image, the liquid-crystal shutter corresponding to this parallax image is in its transmissive state while the other liquid-crystal shutter is in its non-transmissive state.
  • each liquid-crystal shutter stays in the non-transmissive state with a positive voltage applied thereon, while it stays in the transmissive state with a negative voltage applied thereon.
  • the drive signals are applied to the liquid crystals 1102 and 1105 so that the left liquid crystal is in the transmissive state when the right liquid crystal is in the non-transmissive state and is in the non-transmissive state while the right liquid crystal is in the transmissive state.
  • In one field, the image transmitted through the liquid crystal 1105 is picked up by CCD 1200 .
  • In the other field, the image transmitted through the liquid crystal 1102 is picked up by CCD 1200 . Since the odd/even field signal is available as information in this embodiment, the left video signal is picked up in the odd field while the right video signal is picked up in the even field.
  • CCD 1200 alternates between the left parallax image and the right parallax image to pick up 30 fields of the left parallax image and 30 fields of the right parallax image, a total of 60 fields of parallax images per second.
  • the reading timing for reading image data from CCD 1200 is synchronized with the image picking, and the left parallax image signal and right parallax image signal are alternately output from the signal processing circuit 1207 as the video signal.
  • any known techniques may be used for the separation of the vertical synchronization pulse and the detection of the field.
  • the video signal is fed to the liquid-crystal control circuit 1123 via the cable 1129 .
  • a signal indicative of vertical synchronization and information indicative of the odd/even field may be transmitted in the course of the predetermined data communication between the lens microcomputer 1127 and the camera microcomputer 1208 .
  • the rising edge of the right liquid-crystal drive signal and the falling edge of the left liquid-crystal drive signal are aligned in synchronization with the falling edge of the vertical synchronizing pulse. It will be perfectly acceptable if the rising edge of the right liquid-crystal drive signal and the falling edge of the left liquid-crystal drive signal are within the vertical blanking period ( 20 H).
  • the left image signal is picked up during the odd field while the right image signal is picked up during the even field. Conversely, the left image signal may be picked up during the even field while the right image signal is picked up during the odd field.
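The field-synchronized shutter logic just described can be summarized in two small functions. This is a sketch: the function names and boolean conventions are assumptions, and the left-in-odd/right-in-even assignment follows this embodiment, which, as noted above, may be reversed.

```python
def is_odd_field(vsync_coincides_with_hsync_edge):
    """Odd/even determination per the text: the field is odd when the
    vertical synchronizing pulse coincides with the edge of the horizontal
    synchronizing pulse, and even when it is delayed by 1/2 H."""
    return vsync_coincides_with_hsync_edge

def shutter_states(odd_field):
    """Drive states of the liquid-crystal shutters for one field: returns
    (left 1102 transmissive, right 1105 transmissive). The two shutters are
    always driven in opposite states, so CCD 1200 picks up 30 left and 30
    right fields per second in a time-division manner."""
    return (odd_field, not odd_field)
```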
  • the second embodiment employs a single EVF 1003. Since the image signal contains a time series of alternating left and right parallax images, the left and right parallax images are observed as a double image through EVF 1003.
  • the left and right images from the time-division image signal may be presented on separate display means.
  • An HMD having such a function may be employed.
  • the HMD may be used as the EVF.
  • the EVF may be provided with the left image signal or right image signal only.
  • the distance l to the subject is variable.
  • the image-fusible range is determined by the convergence angle and the subject distance, and a given convergence angle assures image fusion within a certain range of subject distance. If the subject distance varies greatly, the fusion condition may not be met. In other words, the convergence angle is preferably adjusted in accordance with the subject distance.
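The dependence of the convergence angle on the subject distance follows from simple geometry. Assuming a symmetric arrangement in which the left and right optical axes are separated by a distance e and are to cross at a subject distance l, the full convergence angle is θ = 2·arctan(e/(2l)). The sketch below merely illustrates this textbook relation; the function name and the example values are assumptions, not taken from the disclosure.

    import math

    def convergence_angle(axis_separation_m, subject_distance_m):
        """Full convergence angle (radians) at which two optical axes,
        separated by axis_separation_m, cross at subject_distance_m.
        Assumes a symmetric geometry; illustrative only."""
        return 2.0 * math.atan(axis_separation_m / (2.0 * subject_distance_m))

    # Example: a 65 mm axis separation and a 2 m subject give about 1.86 degrees.
    print(math.degrees(convergence_angle(0.065, 2.0)))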
  • the image pick-up apparatus of the second embodiment features an automatic convergence angle setting capability, in which the convergence angle is automatically set in accordance with the subject distance when the subject distance satisfies a specified condition. This convergence control is discussed referring to FIG. 14.
  • in step S51, the convergence control starts. This command is issued by the lens microcomputer 1127.
  • in step S52, a distance measuring unit 8 determines the distance l.
  • n represents the count of a counter.
  • in step S53, the absolute value of the difference between the present subject distance l n and the previous subject distance is compared with a threshold Δ. If the absolute value is greater than Δ, the lens microcomputer 1127 goes to step S55.
  • if the absolute value is not greater than Δ, the lens microcomputer 1127 goes to step S54.
  • by the absolute value being greater than Δ is meant that there is a great difference between the present subject distance l n and the previous subject distance.
  • the lens microcomputer 1127 resets the count of its counter to 1 in step S54.
  • the lens microcomputer 1127 increments the count n of its counter by 1 in step S55.
  • in step S56, a determination is made of whether the value of n is greater than a threshold q. If the value of n is greater than the threshold q, the lens microcomputer 1127 goes to step S57. If the value of n is not greater than the threshold q, the lens microcomputer 1127 returns to step S52.
  • by YES in step S56 is meant that a change in the subject distance equal to or greater than the distance Δ has occurred more than q times in succession.
  • the convergence distance l ref is updated with the presently detected distance l n in step S57.
  • in step S58, the counter is reset to 1.
  • in step S59, the convergence angle corresponding to the convergence distance l ref is determined.
  • the lens microcomputer 1127 then adjusts the angles of the mirrors 1107 and 1112 until the convergence angle determined in step S59 is reached.
  • the mirrors 1107 and 1112 are driven in the directions respectively represented by arrows 1013 and 1015.
  • the optical axes 1004 and 1005 are deflected as represented by arrows 1014 and 1016, and thus converge at the convergence distance l ref.
  • the convergence is set to be variable when the number of occurrences of changes in the distance to the subject, each change greater than the predetermined value ⁇ , is greater than a predetermined number of times.
  • the threshold Δ and the number of times q may be determined depending on the lens system.
  • Δ is determined in consideration of the current convergence and of the range of subject distance within which fusion is possible without fatigue.
  • q is an integer multiple of the field frequency.
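Steps S51 through S59 amount to a counter-based hysteresis on the measured subject distance: the convergence distance is updated only after a significant change has persisted. The following is a hedged software sketch of that flow; measure_distance and set_convergence are hypothetical stand-ins for the distance measuring unit and the mirror drive, and the values of DELTA and Q are illustrative only, not taken from the patent.

    # Hypothetical sketch of the convergence-control flow of FIG. 14.
    # measure_distance() and set_convergence() are assumed stand-ins for the
    # distance measuring unit and the mirror drive; DELTA and Q are illustrative.

    DELTA = 0.5   # distance change regarded as significant, in metres (assumed)
    Q = 30        # required consecutive occurrences, e.g. half a second at 60 fields/s

    def convergence_control(measure_distance, set_convergence, l_ref):
        n = 1                                  # S51: convergence control starts
        while True:                            # runs until externally stopped
            l_n = measure_distance()           # S52: measure the subject distance
            if abs(l_n - l_ref) > DELTA:       # S53: significant change in distance?
                n += 1                         # S55: count the consecutive changes
            else:
                n = 1                          # S54: reset the counter
            if n > Q:                          # S56: has the change persisted?
                l_ref = l_n                    # S57: update the convergence distance
                n = 1                          # S58: reset the counter
                set_convergence(l_ref)         # S59: set the angle; mirrors then driven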
  • triangulation is used for the distance measurement in the second embodiment.
  • the present invention is not limited to this method.
  • distance measurement may also be performed by detecting the position of each lens in each optical system: in the in-focus state, the subject distance is determined based on the zoom values of the left and right optical systems.
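For reference, classic triangulation relates the subject distance to the disparity observed between two viewpoints with a known baseline. The sketch below illustrates that textbook relation under stated assumptions; it is not a description of the patent's particular distance measuring unit, and all names and example values are hypothetical.

    def distance_by_triangulation(baseline_m, focal_length_m, disparity_m):
        """Subject distance from the image disparity between two viewpoints
        separated by baseline_m, each with focal length focal_length_m.
        Textbook relation l = b * f / d; illustrative only."""
        if disparity_m <= 0:
            raise ValueError("zero disparity: subject effectively at infinity")
        return baseline_m * focal_length_m / disparity_m

    # Example: a 65 mm baseline, 10 mm focal length and 0.2 mm disparity
    # place the subject at 3.25 m.
    print(distance_by_triangulation(0.065, 0.010, 0.0002))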
  • the second embodiment permits the left and right parallax images to be picked up through a single lens set, without requiring two lens sets.
  • the present invention thus implements a compact and low-cost design in the apparatus, eliminates the characteristic difference between the left and right lenses, and picks up a high-quality stereoscopic image with a simple construction.
  • the image pick-up apparatus is thus free from a magnification difference between the left and right images, presenting a high-quality stereoscopic image. Since the left and right parallax images are picked up by a single pick-up device in the image pick-up system, no extra electronic circuit is required. Compact and low-cost design is thus implemented in the image pick-up apparatus.
  • the lens of the system may be used as-is, and the camera section needs no particular modification in order to pick up a stereoscopic image.
  • An ordinary two-dimensional lens may be used on the same camera. This arrangement provides expandability and increases the usefulness of the camera.
  • a plurality of lenses, including the lens for stereoscopic image pick-up and lenses of different specifications, for example, different focal lengths, may be used in common on the same camera.
US10/763,756 1997-12-03 2004-01-22 Image pick-up apparatus for stereoscope Abandoned US20040150728A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/763,756 US20040150728A1 (en) 1997-12-03 2004-01-22 Image pick-up apparatus for stereoscope

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP33299597A JP3976860B2 (ja) 1997-12-03 1997-12-03 Stereoscopic video image pickup apparatus
JP9-332995 1997-12-03
JP10-323691 1998-11-13
JP10323691A JP2000152282A (ja) 1998-11-13 1998-11-13 Stereoscopic video photographing apparatus
US09/203,997 US6762794B1 (en) 1997-12-03 1998-12-01 Image pick-up apparatus for stereoscope
US10/763,756 US20040150728A1 (en) 1997-12-03 2004-01-22 Image pick-up apparatus for stereoscope

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/203,997 Division US6762794B1 (en) 1997-12-03 1998-12-01 Image pick-up apparatus for stereoscope

Publications (1)

Publication Number Publication Date
US20040150728A1 true US20040150728A1 (en) 2004-08-05

Family

ID=26571274

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/203,997 Expired - Fee Related US6762794B1 (en) 1997-12-03 1998-12-01 Image pick-up apparatus for stereoscope
US10/763,756 Abandoned US20040150728A1 (en) 1997-12-03 2004-01-22 Image pick-up apparatus for stereoscope

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/203,997 Expired - Fee Related US6762794B1 (en) 1997-12-03 1998-12-01 Image pick-up apparatus for stereoscope

Country Status (3)

Country Link
US (2) US6762794B1
EP (1) EP0921694B1
DE (1) DE69830459T2


Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6483484B1 (en) 1998-12-18 2002-11-19 Semiconductor Energy Laboratory Co., Ltd. Goggle type display system
KR100354840B1 (ko) * 2000-05-09 2002-10-05 최부진 Photographing apparatus and photographing method for acquiring stereoscopic images
JP2002027496A (ja) * 2000-07-03 2002-01-25 Canon Inc Photographing lens unit, photographing apparatus, and photographing system
JP2002095015A (ja) * 2000-09-11 2002-03-29 Canon Inc Image pickup system, lens unit, and image pickup apparatus
DE10063380B4 (de) * 2000-12-19 2005-06-09 Heraeus Med Gmbh Method and device for video recording of an illuminated field
US8085293B2 (en) * 2001-03-14 2011-12-27 Koninklijke Philips Electronics N.V. Self adjusting stereo camera system
JP2003016435A (ja) * 2001-06-27 2003-01-17 Matsushita Electric Ind Co Ltd Individual authentication apparatus and method
AU2003215836A1 (en) * 2002-03-29 2003-10-13 Koninklijke Philips Electronics N.V. Method, system and computer program for stereoscopic viewing of 3d medical images
JP3788394B2 (ja) * 2002-06-13 2006-06-21 ソニー株式会社 Imaging apparatus and imaging method, and display apparatus and display method
US20040041906A1 (en) * 2002-09-04 2004-03-04 Fruit Theodoric Benjamin Method for producing a real-time, moving, stereoscopic, preview image
JP3926244B2 (ja) * 2002-09-26 2007-06-06 株式会社島津製作所 Photographing apparatus
JP2004179892A (ja) * 2002-11-26 2004-06-24 Sanyo Electric Co Ltd Imaging apparatus
JP2006005608A (ja) * 2004-06-17 2006-01-05 Hitachi Ltd Imaging apparatus
US8878924B2 (en) * 2004-09-24 2014-11-04 Vivid Medical, Inc. Disposable microscope and portable display
US8827899B2 (en) * 2004-09-24 2014-09-09 Vivid Medical, Inc. Disposable endoscopic access device and portable display
US9033870B2 (en) * 2004-09-24 2015-05-19 Vivid Medical, Inc. Pluggable vision module and portable display for endoscopy
US8858425B2 (en) * 2004-09-24 2014-10-14 Vivid Medical, Inc. Disposable endoscope and portable display
US8982181B2 (en) * 2006-06-13 2015-03-17 Newbery Revocable Trust Indenture Digital stereo photographic system
US11315307B1 (en) 2006-12-28 2022-04-26 Tipping Point Medical Images, Llc Method and apparatus for performing rotating viewpoints using a head display unit
US10795457B2 (en) 2006-12-28 2020-10-06 D3D Technologies, Inc. Interactive 3D cursor
US11228753B1 (en) 2006-12-28 2022-01-18 Robert Edwin Douglas Method and apparatus for performing stereoscopic zooming on a head display unit
US11275242B1 (en) 2006-12-28 2022-03-15 Tipping Point Medical Images, Llc Method and apparatus for performing stereoscopic rotation of a volume on a head display unit
JP4714176B2 (ja) * 2007-03-29 2011-06-29 富士フイルム株式会社 Stereoscopic photographing apparatus and optical axis adjustment method
JP2009055554A (ja) * 2007-08-29 2009-03-12 Fujifilm Corp Flexible imaging apparatus equipped with a plurality of image sensors
US8115825B2 (en) * 2008-02-20 2012-02-14 Apple Inc. Electronic device with two image sensors
CN101588513B (zh) * 2009-01-07 2011-05-18 深圳市掌网立体时代视讯技术有限公司 Stereoscopic camera apparatus and method
EP2608552A1 (en) * 2009-01-19 2013-06-26 Minoru Inaba Stereoscopic video imaging display system
JP5409107B2 (ja) * 2009-05-13 2014-02-05 任天堂株式会社 Display control program, information processing apparatus, display control method, and information processing system
US8291322B2 (en) * 2009-09-30 2012-10-16 United Video Properties, Inc. Systems and methods for navigating a three-dimensional media guidance application
JP5405264B2 (ja) 2009-10-20 2014-02-05 任天堂株式会社 Display control program, library program, information processing system, and display control method
JP4754031B2 (ja) * 2009-11-04 2011-08-24 任天堂株式会社 Display control program, information processing system, and program used for controlling stereoscopic display
US20110137727A1 (en) * 2009-12-07 2011-06-09 Rovi Technologies Corporation Systems and methods for determining proximity of media objects in a 3d media environment
JP2011139296A (ja) * 2009-12-28 2011-07-14 Sony Corp Video signal processing apparatus and video signal processing method
EP2355526A3 (en) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
JP5425305B2 (ja) * 2010-05-31 2014-02-26 富士フイルム株式会社 Stereoscopic image control apparatus, operation control method thereof, and operation control program thereof
EP2395766B1 (en) 2010-06-14 2016-03-23 Nintendo Co., Ltd. Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method
WO2012023168A1 (ja) * 2010-08-19 2012-02-23 パナソニック株式会社 Stereoscopic video imaging apparatus and stereoscopic video imaging method
US8718973B2 (en) * 2011-09-09 2014-05-06 Kabushiki Kaisha Toshiba Method, device, and system for calculating a geometric system model using an area-simulating-volume algorithm in three dimensional reconstruction
US20130083170A1 (en) * 2011-10-03 2013-04-04 Canon Kabushiki Kaisha Stereoscopic image pickup apparatus
CN103809737A (zh) * 2012-11-13 2014-05-21 华为技术有限公司 Human-computer interaction method and apparatus
US9161020B2 (en) * 2013-04-26 2015-10-13 B12-Vision Co., Ltd. 3D video shooting control system, 3D video shooting control method and program
US9503705B2 (en) * 2013-12-12 2016-11-22 Lenovo (Singapore) Pte. Ltd. Stereoscopic image generation
US9507241B1 (en) 2015-11-17 2016-11-29 Lenovo (Singapore) Pte, Ltd. Adjustable camera field of view
EP3415078A1 (en) * 2017-06-16 2018-12-19 Essilor International Method and system for determining a pupillary distance of an individual
CN109936677B (zh) * 2017-12-15 2021-07-27 浙江舜宇智能光学技术有限公司 Video synchronization method applied to multi-view cameras


Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8430980D0 (en) 1984-12-07 1985-01-16 Robinson M Generation of apparently three-dimensional images
DE3640731A1 (de) 1986-11-28 1988-06-09 Standard Elektrik Lorenz Ag Arrangement for transmitting stereoscopic video images
JPH0827499B2 (ja) 1987-12-03 1996-03-21 松下電器産業株式会社 Imaging apparatus for stereoscopic television
DE69210303T2 (de) * 1991-05-23 1996-11-14 Hitachi Ltd Wide-screen television receiver with aspect-ratio conversion function and method for displaying an enlarged section
JPH0568188A (ja) 1991-09-06 1993-03-19 Canon Inc Video camera
JPH05292543A (ja) * 1992-04-15 1993-11-05 Komatsu Ltd Visual device
JP2888713B2 (ja) * 1993-01-14 1999-05-10 キヤノン株式会社 Compound-eye imaging apparatus
US5625408A (en) * 1993-06-24 1997-04-29 Canon Kabushiki Kaisha Three-dimensional image recording/reconstructing method and apparatus therefor
JPH0767023A (ja) 1993-08-26 1995-03-10 Canon Inc Compound-eye imaging apparatus
US5864360A (en) * 1993-08-26 1999-01-26 Canon Kabushiki Kaisha Multi-eye image pick-up apparatus with immediate image pick-up
EP0888017A2 (en) * 1993-08-26 1998-12-30 Matsushita Electric Industrial Co., Ltd. Stereoscopic image display apparatus and related system
JP3054002B2 (ja) * 1993-09-01 2000-06-19 キヤノン株式会社 Compound-eye imaging apparatus
JPH0847000A (ja) * 1994-08-02 1996-02-16 Canon Inc Compound-eye imaging apparatus, image signal conversion apparatus, display apparatus, and compound-eye image recording/reproducing apparatus
JP3478606B2 (ja) * 1994-10-12 2003-12-15 キヤノン株式会社 Stereoscopic image display method and apparatus
US5864359A (en) * 1995-05-30 1999-01-26 Smith & Nephew, Inc. Stereoscopic autofocusing based on comparing the left and right eye images
US5532777A (en) * 1995-06-06 1996-07-02 Zanen; Pieter O. Single lens apparatus for three-dimensional imaging having focus-related convergence compensation
US5828913A (en) 1995-06-06 1998-10-27 Zanen; Pieter O. Method for three dimensional measurement and imaging having focus-related convergence compensation
US6088006A (en) * 1995-12-20 2000-07-11 Olympus Optical Co., Ltd. Stereoscopic image generating system for substantially matching visual range with vergence distance
JP3771964B2 (ja) * 1996-03-12 2006-05-10 オリンパス株式会社 Stereoscopic video display apparatus
JPH1070742A (ja) * 1996-08-29 1998-03-10 Olympus Optical Co Ltd Twin-lens video display apparatus
JP2000354257A (ja) * 1999-06-10 2000-12-19 Sony Corp Image processing apparatus, image processing method, and program providing medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3251933A (en) * 1962-10-31 1966-05-17 Vare Ind Inc Three-dimensional television system
US4437745A (en) * 1982-09-30 1984-03-20 Stephen Hajnal Three dimensional camera system
US5003385A (en) * 1988-08-24 1991-03-26 Kabushiki Kaisha Toshiba Stereoscopic television system
US6606113B2 (en) * 1995-05-24 2003-08-12 Olympus Optical Co., Ltd. Stereoscopic endocsope system and TV imaging system for endoscope
US6683652B1 (en) * 1995-08-29 2004-01-27 Canon Kabushiki Kaisha Interchangeable lens video camera system having improved focusing
US6864910B1 (en) * 1999-06-30 2005-03-08 Canon Kabushiki Kaisha Optical apparatus
US6862140B2 (en) * 2000-02-01 2005-03-01 Canon Kabushiki Kaisha Stereoscopic image pickup system

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7108378B1 (en) 2001-01-29 2006-09-19 Maguire Jr Francis J Method and devices for displaying images for viewing with varying accommodation
US20090041309A1 (en) * 2002-01-16 2009-02-12 Iritech, Inc. System And Method For Iris Identification Using Stereoscopic Face Recognition
US7715595B2 (en) * 2002-01-16 2010-05-11 Iritech, Inc. System and method for iris identification using stereoscopic face recognition
US9682320B2 (en) 2002-07-22 2017-06-20 Sony Interactive Entertainment Inc. Inertially trackable hand-held controller
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8797260B2 (en) 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US8976265B2 (en) 2002-07-27 2015-03-10 Sony Computer Entertainment Inc. Apparatus for image and sound capture in a game environment
US20080220867A1 (en) * 2002-07-27 2008-09-11 Sony Computer Entertainment Inc. Methods and systems for applying gearing effects to actions based on input data
US9381424B2 (en) 2002-07-27 2016-07-05 Sony Interactive Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US10406433B2 (en) 2002-07-27 2019-09-10 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US10099130B2 (en) 2002-07-27 2018-10-16 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US10220302B2 (en) 2002-07-27 2019-03-05 Sony Interactive Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US11010971B2 (en) 2003-05-29 2021-05-18 Sony Interactive Entertainment Inc. User-driven three-dimensional interactive gaming environment
US8758132B2 (en) 2003-09-15 2014-06-24 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8251820B2 (en) 2003-09-15 2012-08-28 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8303411B2 (en) 2003-09-15 2012-11-06 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
WO2006023268A3 (en) * 2004-08-19 2007-07-12 Sony Computer Entertainment Inc Portable augmented reality device and method
US10099147B2 (en) 2004-08-19 2018-10-16 Sony Interactive Entertainment Inc. Using a portable device to interface with a video game rendered on a main display
US8547401B2 (en) * 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US20060044265A1 (en) * 2004-08-27 2006-03-02 Samsung Electronics Co., Ltd. HMD information apparatus and method of operation thereof
US10279254B2 (en) 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
US20080013184A1 (en) * 2006-07-14 2008-01-17 Motorola, Inc. Head's up display ambiguity eliminator
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US8717420B2 (en) * 2007-03-12 2014-05-06 Canon Kabushiki Kaisha Head mounted image-sensing display device and composite image generating apparatus
US20100026787A1 (en) * 2007-03-12 2010-02-04 Canon Kabushiki Kaisha Head mounted image-sensing display device and composite image generating apparatus
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US7802883B2 (en) 2007-12-20 2010-09-28 Johnson & Johnson Vision Care, Inc. Cosmetic contact lenses having a sparkle effect
US8840470B2 (en) 2008-02-27 2014-09-23 Sony Computer Entertainment America Llc Methods for capturing depth data of a scene and applying computer actions
US8502908B2 (en) 2008-03-03 2013-08-06 Panasonic Corporation Imaging apparatus, imaging apparatus body and reporting terminal
US8164671B2 (en) * 2008-03-03 2012-04-24 Panasonic Corporation Imaging apparatus, imaging apparatus body and reporting terminal
US8368753B2 (en) 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
US8323106B2 (en) 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8287373B2 (en) 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
US20100149342A1 (en) * 2008-12-12 2010-06-17 Panasonic Corporation Imaging apparatus
US8237799B2 (en) * 2008-12-12 2012-08-07 Panasonic Corporation Imaging apparatus
US8527657B2 (en) 2009-03-20 2013-09-03 Sony Computer Entertainment America Llc Methods and systems for dynamically adjusting update rates in multi-player network gaming
US8342963B2 (en) 2009-04-10 2013-01-01 Sony Computer Entertainment America Inc. Methods and systems for enabling control of artificial intelligence game characters
US8142288B2 (en) 2009-05-08 2012-03-27 Sony Computer Entertainment America Llc Base station movement detection and compensation
US8393964B2 (en) 2009-05-08 2013-03-12 Sony Computer Entertainment America Llc Base station for position location
US8961313B2 (en) 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US20110051239A1 (en) * 2009-08-31 2011-03-03 Casio Computer Co., Ltd. Three dimensional display device and method of controlling parallax barrier
US8817369B2 (en) * 2009-08-31 2014-08-26 Samsung Display Co., Ltd. Three dimensional display device and method of controlling parallax barrier
WO2011035561A1 (zh) * 2009-09-27 2011-03-31 深圳市掌网立体时代视讯技术有限公司 Stereoscopic digital imaging convergence apparatus and method
US8780185B2 (en) * 2009-11-25 2014-07-15 Olympus Imaging Corp. Image pickup apparatus having a display controlled using interchangeable lens information and/or finder information
US20110122233A1 (en) * 2009-11-25 2011-05-26 Tsubasa Kasai Image pickup apparatus
US20140368616A1 (en) * 2009-11-25 2014-12-18 Olympus Imaging Corp. Image pickup apparatus
US20110187834A1 (en) * 2010-02-03 2011-08-04 Takafumi Morifuji Recording device and recording method, image processing device and image processing method, and program
US9188849B2 (en) 2010-03-05 2015-11-17 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US9128367B2 (en) 2010-03-05 2015-09-08 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US9049434B2 (en) 2010-03-05 2015-06-02 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US8994788B2 (en) * 2010-05-25 2015-03-31 Panasonic Intellectual Property Corporation Of America Image coding apparatus, method, program, and circuit using blurred images based on disparity
US20120056997A1 (en) * 2010-09-08 2012-03-08 Samsung Electronics Co., Ltd. Digital photographing apparatus for generating three-dimensional image having appropriate brightness, and method of controlling the same
US20120188335A1 (en) * 2011-01-26 2012-07-26 Samsung Electronics Co., Ltd. Apparatus and method for processing 3d video
US9723291B2 (en) * 2011-01-26 2017-08-01 Samsung Electronics Co., Ltd Apparatus and method for generating 3D video data
US20140205185A1 (en) * 2011-09-13 2014-07-24 Sharp Kabushiki Kaisha Image processing device, image pickup device, and image display device
US9456147B2 (en) 2013-01-08 2016-09-27 Pixart Imaging Inc. Video generating system with multiple image sensors and related method thereof
US9210449B2 (en) 2013-01-08 2015-12-08 Pixart Imaging Inc. Video generating system with multiple image sensors and related method thereof
US20140267814A1 (en) * 2013-03-15 2014-09-18 Panasonic Corporation Camera controller device
US9445012B2 (en) * 2013-03-15 2016-09-13 Panasonic Intellectual Property Management Co., Ltd. Camera controller device
US9955141B2 (en) 2014-04-29 2018-04-24 Eys3D Microelectronics, Co. Portable three-dimensional scanner and method of generating a three-dimensional scan result corresponding to an object
WO2018076202A1 (zh) * 2016-10-26 2018-05-03 中国科学院深圳先进技术研究院 Head-mounted visual device capable of human eye tracking, and human eye tracking method
CN109710477A (zh) * 2018-12-28 2019-05-03 哈尔滨拓博科技有限公司 System and method for automated testing of planar interactive projection

Also Published As

Publication number Publication date
DE69830459T2 (de) 2006-03-23
EP0921694A2 (en) 1999-06-09
DE69830459D1 (de) 2005-07-14
EP0921694A3 (en) 2001-03-28
EP0921694B1 (en) 2005-06-08
US6762794B1 (en) 2004-07-13

Similar Documents

Publication Publication Date Title
US6762794B1 (en) Image pick-up apparatus for stereoscope
US6862140B2 (en) Stereoscopic image pickup system
US6864910B1 (en) Optical apparatus
EP0837603B1 (en) System for controlling the focus of variable-focus lenses by detection of the spatial relationship of the eyes of a person
JP3186448B2 (ja) Stereoscopic television camera
JP4863527B2 (ja) Stereoscopic video imaging apparatus
EP0744036B1 (en) Image display apparatus
US20140368616A1 (en) Image pickup apparatus
US6751020B2 (en) Stereoscopic image pickup system
JPH08194274A (ja) Stereoscopic imaging apparatus
JP4208351B2 (ja) Imaging apparatus, convergence distance determination method, and storage medium
JP5474530B2 (ja) Stereoscopic image display apparatus
JP2001016619A (ja) Imaging apparatus, convergence distance determination method therefor, storage medium, and optical apparatus
JP2005173270A (ja) Optical device for stereoscopic photographing, photographing apparatus, stereoscopic photographing system, and stereoscopic photographing apparatus
JPH06105339A (ja) Stereoscopic camera apparatus
JPH08205200A (ja) Stereoscopic imaging apparatus
JP3520197B2 (ja) Stereoscopic camera apparatus
JP2001016617A (ja) Imaging apparatus, convergence control method therefor, storage medium, and optical apparatus
JP2000152282A (ja) Stereoscopic video photographing apparatus
JP2004233600A (ja) Stereoscopic photographing apparatus
JP2002084555A (ja) Stereoscopic video photographing apparatus
JP2003092769A (ja) Stereoscopic video photographing apparatus
JP2002027498A (ja) Stereoscopic video photographing apparatus
JP2002191059A (ja) Stereoscopic photographing apparatus
JP2004126290A (ja) Stereoscopic photographing apparatus

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION