WO2011151872A1 - 3-dimensional image data generating method


Info

Publication number
WO2011151872A1
WO2011151872A1 (PCT/JP2010/007620)
Authority
WO
WIPO (PCT)
Prior art keywords
image
left eye
right eye
image data
eye
Prior art date
Application number
PCT/JP2010/007620
Other languages
French (fr)
Inventor
Hirofumi Tahara
Masaya Kosaka
Toru Fujii
Tsuneyuki Kubo
Original Assignee
Olympus Visual Communications Corp.
Priority date
Filing date
Publication date
Application filed by Olympus Visual Communications Corp.
Priority to EP10852483.6A (published as EP2577394A4)
Publication of WO2011151872A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 35/00 Stereoscopic photography
    • G03B 35/08 Stereoscopic photography by simultaneous recording
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 35/00 Stereoscopic photography
    • G03B 35/02 Stereoscopic photography by sequential recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/211 Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing

Definitions

  • the present invention relates to the technology of generating 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye which are 2-dimensional images having parallax to each other, and allowing an observer to observe the images.
  • Conventionally, the technologies described in (1) through (4) are known. That is, (1) is the anaglyph technology of obtaining a 3-dimensional picture using the parallax of the right and left eyes by synthesizing the images for the left and right eyes using different colors, that is, red and blue, and attaching blue and red filters to the glasses of the observer so that the left eye can observe only one of the blue and red images and the right eye can observe the other.
  • a parallel viewing method and a cross-eyed viewing method have been conventionally known as methods of generating 3-dimensional picture data (stereoscopic view data).
  • FIG. 1A is an explanatory view of the parallel viewing method.
  • the parallel viewing method is to observe a subject by parallelizing the directions of the lines of sight from two points of view corresponding to the left and right eyes of an observer.
  • In CG (computer graphics), for example, a left point of view 61 as a point of view of observing an image for a left eye to be directed to the left eye of the observer and a right point of view 62 as a point of view of observing an image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a virtual space, and the observation line of sight from the left point of view 61 is set parallel to the observation line of sight from the right point of view 62.
  • a left-eye camera 61c for capturing the image for a left eye to be directed to the left eye of the observer and a right-eye camera 62c for capturing the image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a real space, and the optical axis of shooting of the left-eye camera 61c is set parallel to the optical axis of shooting of the right-eye camera 62c.
  • the distance between the left point of view 61 (left-eye camera 61c) and the right point of view 62 (right-eye camera 62c) is, for example, about 6.5 cm based on the human interpupillary distance.
  • FIG. 1B is an explanatory view of the cross-eyed viewing method.
  • the cross-eyed viewing method is to observe a subject by crossing the directions of the lines of sight from two points of view corresponding to the left and right eyes of an observer.
  • a left point of view 63 as a point of view of observing an image for a left eye to be directed to the left eye of the observer and a right point of view 64 as a point of view of observing an image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a virtual space, and the observation line of sight from the left point of view 63 is arranged to cross the observation line of sight from the right point of view 64.
  • a left-eye camera 63c for capturing the image for a left eye to be directed to the left eye of the observer and a right-eye camera 64c for capturing the image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a real space, and the optical axis of shooting of the left-eye camera 63c is arranged to cross the optical axis of shooting of the right-eye camera 64c.
  • the distance between the left point of view 63 (left-eye camera 63c) and the right point of view 64 (right-eye camera 64c) is, for example, about 6.5 cm as described above.
  • the position at which the lines of sight cross each other is normally set about 1 to 3 m ahead of the left point of view 63 (right point of view 64).
  • Patent Document 1 Japanese Patent Application Publication No. 2006-72429
  • an image obtained by observing a subject in the first direction from the observation position is set as an image for a left eye
  • an image obtained by observing a subject from the same observation position in the second direction leftward from the first direction is set as an image for a right eye.
  • the image for a left eye and the image for a right eye can be obtained in an arbitrary order or simultaneously.
  • the image for a left eye and the image for a right eye are observed from the same observation position (one point of view or one camera), so it is not necessary to have two points of view or two cameras unlike the conventional parallel viewing method and the cross-eyed viewing method.
  • the image captured in the observation direction relatively rightward is defined as an image for a left eye
  • the image captured in the observation direction relatively leftward is defined as an image for a right eye.
  • the leftward direction refers to the counterclockwise direction viewed from above about the vertical central axis passing through the observation position.
  • the rightward direction refers to the clockwise direction viewed from above about the vertical central axis passing through the observation position.
  • the angle made by the two observation directions is to be within the range where the images for left and right eyes have a common observation area, and is preferably or less.
  • the 3-dimensional image data generating method generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and performs: a left eye image data acquiring process of acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring process of acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  • the predetermined position is an observation position in which a subject is observed, and is an arbitrary observation position with respect to the subject. From the predetermined position, the image data of the subject obtained by observing the subject in the first direction using an image pickup device, for example, a camera etc. is acquired as image data for a left eye, and the image data obtained by observing the subject in the second direction is acquired as image data for a right eye, and these acquired data are stored in a record medium.
  • the 3-dimensional image data generating system generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image data acquiring device for acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring device for acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  • the 3-dimensional image data generating program generates, using a computer, 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image data acquiring step of directing the computer to acquire image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring step of directing the computer to acquire image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  • the 3-dimensional image data generating program according to the present invention can also be defined as an invention of a record medium storing a 3-dimensional image data generating program by storing the program in a computer-readable record medium.
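  • As a concrete illustration of the acquisition steps above, the following is a minimal sketch of how such a generating program might alternate the observation direction from a single position; it is not code from the patent, and the names set_optical_axis, capture_image, and the direction constants are hypothetical stand-ins for whatever camera or CG renderer API is actually used.

```python
# Hypothetical sketch of the left/right acquisition described above.
# set_optical_axis() and capture_image() stand in for the real camera or
# CG renderer interface; they are not defined by the patent.

FIRST_DIRECTION = "first"    # observation direction for the image for a left eye
SECOND_DIRECTION = "second"  # direction slightly leftward from the first direction

def acquire_stereo_pair(set_optical_axis, capture_image):
    """Acquire one left-eye / right-eye image pair from a single position."""
    set_optical_axis(FIRST_DIRECTION)    # left eye: observe in the first direction
    left_image = capture_image()
    set_optical_axis(SECOND_DIRECTION)   # right eye: observe leftward from the first direction
    right_image = capture_image()
    return left_image, right_image
```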
  • the 3-dimensional image observing method allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and performs: a left eye image projecting process of projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and a right eye image projecting process of projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the same predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
  • the 3-dimensional image observing system allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image projecting device for projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and a right eye image projecting device for projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the same predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
  • the 3-dimensional image pickup device captures a subject and obtains 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: an optical image pickup system; an image pickup element for converting an image formed by the optical image pickup system into an electric signal; an optical axis transition mechanism for changing directions of an optical axis of the optical image pickup system between a first direction and a second direction leftward from the first direction; an image pickup circuit for capturing the image for a left eye when the optical axis of the optical image pickup system is in the first direction, and capturing the image for a right eye when the optical axis of the optical image pickup system is in the second direction; and a record medium for storing the image captured by the image pickup circuit.
  • the 3-dimensional image data processed in the present invention includes data obtained by an actual shooting process in a real 3-dimensional space, data obtained by CG (computer graphics) in a virtual 3-dimensional space, and data obtained by combining the actually shot image data and the CG data.
  • FIG. 1A is an explanatory view of the 3-dimensional picture data for realizing a stereoscopic view, and an example of the parallel viewing method
  • FIG. 1B is an explanatory view of the 3-dimensional picture data for realizing the conventional stereoscopic view, and an example of the cross-eyed viewing method
  • FIG. 2 is a block diagram for explanation of the configuration of the computer used in the present embodiment
  • FIG. 3 is a configuration of the 3-dimensional image pickup device for realizing the present embodiment
  • FIG. 4 is a setting example in a studio when a subject is shot using an image pickup device
  • FIG. 5 is a schematic diagram of the image pickup device as viewed from above
  • FIG. 6 is a schematic diagram of the image pickup device with another configuration as viewed from above
  • FIG. 7 is a schematic diagram of the image pickup device with another configuration as viewed from above;
  • FIG. 8A is an explanatory view of the basic concept of the present embodiment, and illustrates the cross-eyed viewing method;
  • FIG. 8B is an explanatory view of the basic concept of the present embodiment, and illustrates a basic configuration;
  • FIG. 9A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method, and an image for a left eye is acquired;
  • FIG. 9B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method, and an image for a right eye is acquired;
  • FIG. 10A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the cross-eyed viewing method, and an image for a left eye is acquired
  • FIG. 10B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the cross-eyed viewing method, and an image for a right eye is acquired
  • FIG. 11A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the method according to the present embodiment, and an image for a left eye is acquired
  • FIG. 11B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the method according to the present embodiment, and an image for a right eye is acquired;
  • FIG. 12A illustrates the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method, and an example of the case in which an image for a left eye is acquired;
  • FIG. 12B illustrates the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method, and an example of the case in which an image for a right eye is acquired;
  • FIG. 13A illustrates the state of an observation in which a ball approaches from a distant point in the method according to the present embodiment, and an example of the case in which an image for a left eye is acquired
  • FIG. 13B illustrates the state of an observation in which a ball approaches from a distant point in the method according to the present embodiment, and an example of the case in which an image for a right eye is acquired
  • FIG. 14 is a flowchart for explanation of the process according to the first embodiment
  • FIG. 15 is a timing chart for explanation of the process according to the first embodiment
  • FIG. 16A is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the parallel viewing method
  • FIG. 16B is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the cross-eyed viewing method
  • FIG. 16C is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the present embodiment
  • FIG. 17A is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a left eye acquired in the processing method according to the first embodiment
  • FIG. 17B is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a right eye acquired in the processing method according to the first embodiment
  • FIG. 17C is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a left eye overlapping the image for a right eye illustrated in FIGS. 17A and 17B above;
  • FIG. 17D is an explanatory view of the method of improving the parallax of images according to the second embodiment, and is an example of changing the relative positions of the image for a left eye and the image for a right eye, and improving the visibility;
  • FIG. 18 is a flowchart for explanation of the process according to the second embodiment;
  • FIG. 19A illustrates the concept for explanation of the third embodiment, and the state in which a 3-dimensional image obtained in the method according to the first and second embodiments is displayed on the screen;
  • FIG. 19B illustrates the concept for explanation of the third embodiment, and the state in which the perspective effect is emphasized in the cross-eyed viewing method and the parallel viewing method, and, for example, an approaching ball is observed;
  • FIG. 19C illustrates the concept for explanation of the third embodiment, and is an example of a 3-dimensional image represented according to the third embodiment
  • FIG. 20 is a flowchart for explanation of the process according to the third embodiment
  • FIG. 21 is a schematic diagram of the image pickup device without a barrel
  • FIG. 22 is a schematic diagram of the image pickup device in which an actuator is attached only to an optical image pickup system without a barrel
  • FIG. 23 is a schematic diagram of the image pickup device in which an actuator is attached only to an image pickup element without a barrel
  • FIG. 24 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a reflecting optical system is attached
  • FIG. 25 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a reflecting optical system
  • FIG. 26 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a polarizing plate and a double refraction optical element is attached
  • FIG. 27 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a polarizing plate and a double refraction optical element
  • FIG. 28 is an explanatory view of the characteristic of the material having the property of double refraction
  • FIG. 29 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a polarizing conversion element is attached
  • FIG. 30 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a polarizing conversion element;
  • FIG. 31A is a schematic diagram of the image pickup device provided with first and second pairs of piezoelectric elements on the left and right sides of the optical image pickup system, and is an example of the first pair of piezoelectric elements performing drive and optical axis transition;
  • FIG. 31B is a schematic diagram of the image pickup device provided with first and second pairs of piezoelectric elements on the left and right sides of the optical image pickup system, and is an example of the second pair of piezoelectric elements performing drive and optical axis transition;
  • FIG. 32A is an explanatory view of the fourth embodiment, and illustrates the state in which the liquid crystal shutter before the left eye of the observer is in the transmission state
  • FIG. 32B is an explanatory view of the fourth embodiment, and illustrates the state in which the liquid crystal shutter before the right eye of the observer is in the transmission state
  • FIG. 33 is a flowchart for explanation of the process according to the fourth embodiment
  • FIG. 34 is a flowchart for explanation of the process according to the fourth embodiment.
  • FIG. 3 is a configuration of the system of the device to which the 3-dimensional image data generating method according to the present embodiment is applied.
  • the system configuration includes an image pickup device 1, a computer 2, an input console 3, a display 4, and an external storage device 5.
  • the image pickup device 1 is, for example, a camera capable of shooting moving pictures, shoots a subject through an optical image pickup system, and transmits the shot image data to the computer 2 through a network 6 such as the Internet etc.
  • the image pickup device 1 can also be connected directly to the computer 2 without the network 6. Alternatively, the image pickup device 1 does not have to be connected to the computer 2 through a communication circuit at all, and the data can be exchanged using, for example, a portable record medium. In addition, the functions corresponding to the computer 2, the input console 3, the display 4, and the external storage device 5 can be included in the image pickup device 1 itself.
  • the 3-dimensional image data acquired by the 3-dimensional image data generating method according to the present embodiment can be acquired by an actual shooting process in a real 3-dimensional space or by CG in a virtual 3-dimensional space.
  • the image pickup device 1 is not required when the 3-dimensional image data is generated by the CG without the actual shooting process.
  • FIG. 2 is a block diagram for explanation of the computer 2, and the computer 2 is configured by a CPU 7, an internal semiconductor memory 8, an internal hard disk (hereinafter referred to as an HDD) 9, an interface for an external storage device (hereinafter referred to as an external storage device I/F) 10, an interface for a network (hereinafter referred to as a network I/F) 11, an interface for an input console (hereinafter referred to as an input console I/F) 12, and an interface for a display (hereinafter referred to as a display I/F) 13.
  • the CPU 7, the internal semiconductor memory 8, the internal HDD 9, etc. are connected through an internal data bus 14, and can communicate data with one another.
  • the internal HDD 9 stores a system program for driving the computer 2 according to the present embodiment, and the CPU 7 performs the process described later according to the program read from the internal HDD 9 to the internal semiconductor memory 8.
  • the internal semiconductor memory 8 can also be used as a work area, and temporarily stores the data being processed by the CPU 7.
  • the external storage device 5 communicates data with the computer 2 through the external storage device I/F 10, stores the image data acquired by, for example, the image pickup device 1, and transmits the image data stored in the external storage device 5 to the computer 2.
  • the display 4 displays the image data transmitted from the computer 2 through the display I/F 13. For example, the captured image input to the computer 2 through the network 6 is displayed, and the data of the captured image stored in the external storage device 5 is displayed.
  • FIG. 4 is a setting example in a studio when a subject is shot using an image pickup device 1, and an example of studio setting when 3-dimensional image data is obtained by the actual shooting process.
  • the image pickup device 1 is arranged toward subjects 15 and 16, and the image pickup device 1 is provided on, for example, a tripod 17.
  • the optical axis of shooting of the image pickup device 1 is configured so that its direction can be shifted by a predetermined angle.
  • the image pickup device 1 has a configuration for changing the direction of the optical axis of shooting.
  • the images can also be shot from the same position in the two shooting directions shifted by the predetermined angle by swinging the entire image pickup device 1.
  • the angle can be set, for example, up to a certain maximum.
  • this maximum corresponds to the limit of the forward divergence of human eyes, considering the adjustable angle of each of the left and right eyes when the left eye and the right eye simultaneously diverge leftward and rightward, respectively.
  • FIG. 5 is a schematic diagram of the image pickup device 1 as viewed from above, and the image pickup device 1 is configured with a barrel 22 to which an image pickup optical system 20 and an image pickup element 21 are directly or indirectly attached, a drive circuit 23, an image pickup circuit 24, and a record medium 25.
  • the image pickup optical system 20 is a lens having its focal point at the position of the image pickup element 21, and forms an image from the rays from the subjects 15 and 16 on the image pickup element 21.
  • the image pickup element 21 can be a photoelectric conversion element such as a CCD (charge coupled device) sensor, a CMOS (complementary metal oxide semiconductor) sensor, etc.
  • the image pickup circuit 24 captures a subject image formed on the image pickup element 21, and records the captured data converted into an electric signal on the record medium 25. The captured image can be sequentially transmitted to the computer 2 directly through the network 6 without recording it on the record medium 25.
  • Actuators 26a through 26d are provided at four points of the barrel 22, and are driven according to the drive signal from the drive circuit 23.
  • the actuators 26a through 26d are configured by, for example, piezoelectric elements; the actuators 26a and 26b are attached to the barrel 22 at the image pickup optical system 20 side, and the actuators 26c and 26d are attached to the barrel 22 at the image pickup element 21 side.
  • the actuators 26a and 26b and the actuators 26c and 26d are driven by the drive signal output from the drive circuit 23, and the actuators 26a through 26d cooperatively change the angle of the barrel 22 with respect to the body, thereby changing the optical axis of shooting by the predetermined angle. Therefore, by providing the barrel 22 in the image pickup device 1 as described above and changing the angle of the barrel 22 itself, the optical axis of shooting can be correctly changed while the relative positional relationship between the image pickup optical system 20 and the image pickup element 21 is maintained.
  • the image pickup device 1 can also be configured to change the angle of the optical axis of shooting by attaching only the actuators 26a and 26b to the barrel 22 and driving the barrel 22 at the side of the image pickup optical system 20.
  • the image pickup device 1 can also be configured to change the angle of the optical axis of shooting by attaching only the actuators 26c and 26d to the barrel 22 and driving the barrel 22 at the side of the image pickup element 21.
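  • As an illustration of how the drive circuit 23 might alternate the barrel 22 between the two shooting directions, here is a hedged sketch; the drive_actuator function and the displacement values are hypothetical, since the patent describes the mechanism only at the level of cooperating piezoelectric actuators.

```python
# Hypothetical sketch: tilting the barrel 22 between two shooting directions by
# driving the actuator pair at the optical-system end (26a, 26b) and the pair at
# the image-sensor end (26c, 26d) in opposite senses.

def set_shooting_direction(drive_actuator, direction: str) -> None:
    """Tilt the barrel toward the 'first' or 'second' shooting direction.

    drive_actuator(name, displacement) is a stand-in for the real drive circuit;
    positive and negative displacements push opposite ends of the barrel in
    opposite ways, which changes the optical-axis angle while the lens and the
    sensor stay aligned with each other inside the barrel.
    """
    sign = 1 if direction == "first" else -1
    drive_actuator("26a", +sign)  # front end of the barrel, one side
    drive_actuator("26b", -sign)  # front end of the barrel, other side
    drive_actuator("26c", -sign)  # rear end, driven oppositely to swing the axis
    drive_actuator("26d", +sign)
```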
  • FIG. 8A illustrates the cross-eyed viewing method.
  • when a left eye point of view 31 (left eye camera 31c) is advanced to a cross point 33 along a left eye line of sight 32, and a right eye point of view 34 (right eye camera 34c) is advanced to the cross point 33 along a right eye line of sight 35, the configuration illustrated in FIG. 8B is obtained.
  • in FIG. 8A, the left eye line of sight 32 (optical axis of the left eye camera) and the right eye line of sight 35 (optical axis of the right eye camera) exchange their left and right positions after the cross point 33 as compared with the positions before the cross point 33.
  • in FIG. 8B according to the present embodiment, there is no such exchange of the right and left positions, thereby enabling a natural observation of a subject similar to a human view.
  • the present embodiment has a layout of obtaining an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other as the layout illustrated in FIG. 8B.
  • FIGS. 9A and 9B are explanatory views of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method.
  • FIG. 9A illustrates acquiring an image for a left eye
  • FIG. 9B illustrates acquiring an image for a right eye.
  • a left eye point of view 37 (left eye camera 37c) and a right eye point of view 38 (right eye camera 38c) are arranged at the left and right positions at a predetermined distance (for example, 6.5 cm as a human interpupillary distance), a sequence of white balls arranged at predetermined intervals is arranged in parallel to the direction of the line of sight on the left of the left eye point of view 37, and a sequence of black balls arranged at predetermined intervals is arranged in parallel to the direction of the line of sight on the right of the right eye point of view 38.
  • a screen 39 arranged forward and perpendicular to the direction of each line of sight is substituted for the image pickup surface.
  • in FIG. 9A, the position where the line (expressed by a broken line) which connects the left eye point of view 37 to each ball crosses the screen 39 is expressed by a small circle.
  • on the image pickup surface, each ball is captured at the position corresponding to the small circle.
  • the image for a left eye A illustrated in FIG. 9A is obtained on the screen 39.
  • in FIG. 9B, an image for a right eye B is similarly obtained.
  • FIGS. 10A and 10B illustrate the cases of capturing and observing the scenes illustrated in FIGS. 9A and 9B with the cross-eyed viewing method.
  • since the screen 39 is perpendicular to the direction of the line of sight, it is inclined with respect to the sequences of balls.
  • the image displayed in C of FIG. 10A is obtained as an image for a left eye, and the image displayed in D of FIG. 10B is obtained as an image for a right eye.
  • in the image for a left eye, the nearest ball among the white balls is out of the image pickup surface, but all black balls are on the image pickup surface, thereby capturing a total of nine balls.
  • in the image for a right eye, the nearest ball among the black balls is out of the image pickup surface, but all white balls are on the image pickup surface, thereby capturing a total of nine balls.
  • the nearest black ball exists in the image for a left eye only
  • the nearest white ball exists in the image for a right eye only.
  • FIGS. 11A and 11B are examples of using the present embodiment, and illustrate capturing and observing the scenes similar to those in FIGS. 10A and 10B. Also in this case, the screen 39 is perpendicular to the direction of the line of sight, and is inclined to the sequences of balls.
  • the image illustrated in E of FIG. 11A is obtained as an image for a left eye
  • the image illustrated in F of FIG. 11B is obtained as an image for a right eye.
  • in the present embodiment, as in the parallel viewing method, the nearest white ball and the nearest black ball are out of the image pickup surface, and a total of eight balls, that is, four left balls and four right balls, are captured. That is, in the method of the present embodiment, the numbers of the left and right balls match each other, and there is no lost subject as occurs in the cross-eyed viewing method.
  • therefore, the event of losing a subject in only one of the left and right images is largely improved, thereby realizing a more natural stereoscopic effect as compared with the cross-eyed viewing method.
  • FIGS. 12A and 12B illustrate the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method.
  • FIG. 12A illustrates the case in which an image for a left eye is acquired
  • FIG. 12B illustrates the case in which an image for a right eye is acquired.
  • the balls move from a distant point to a near point on a symmetry axis 43 passing through the center of a left eye point of view 41 and a right eye point of view 42.
  • when a ball is at a distant point, it is expressed in black.
  • when a ball is at an intermediate point, it is expressed by shading.
  • when a ball is at a near point, it is expressed in white.
  • in the image for a left eye, the ball is positioned on the left of the center of the vision when the ball is at a distant point (refer to I of FIG. 12A), positioned near the center of the vision when the ball is at an intermediate point (refer to II of FIG. 12A), and positioned on the right of the center of the vision when the ball is at a near point (refer to III of FIG. 12A).
  • in the image for a right eye, the ball is positioned on the right of the center of the vision when the ball is at a distant point (refer to I of FIG. 12B), positioned near the center of the vision when the ball is at an intermediate point (refer to II of FIG. 12B), and positioned on the left of the center of the vision when the ball is at a near point (refer to III of FIG. 12B).
  • FIGS. 13A and 13B illustrate the method according to the present embodiment, and the state of an observation in which a ball approaches from a distant point.
  • FIG. 13A corresponds to the case in which an image for a left eye is acquired.
  • FIG. 13B corresponds to the case in which an image for a right eye is acquired.
  • the images for a left eye illustrated in I through III of FIG. 13A and the images for a right eye illustrated in I through III of FIG. 13B are acquired.
  • a change in position on the images is hardly detected although the balls approach, and the exchange of the right and left positions does not occur. Therefore, the method of the present embodiment can generate a 3-dimensional image having a more natural stereoscopic effect.
  • FIG. 4 is an example of setting in a studio when a subject is shot using the image pickup device 1, and acquiring 3-dimensional image data by the actual shooting process.
  • FIG. 14 is a flowchart for explanation of the process according to the present embodiment, and the process is performed by driving each circuit of the image pickup device 1 with the configuration illustrated in FIG. 5.
  • FIG. 15 is a timing chart for explanation of the process according to the present embodiment.
  • an image for a left eye obtained by an observation in the first direction from a predetermined observation position (capturing position) is acquired (S3).
  • the image pickup circuit 24 is driven, and the captured image data including the subjects 15 and 16 is converted into an electric signal by the image pickup element 21.
  • the process timing is expressed by b1 in FIG. 15.
  • the acquired image data for a left eye is recorded on the record medium 25 (S4).
  • the drive circuit 23 is driven, and the optical axis of shooting is moved in the second direction (S5, c1 illustrated in FIG. 15). Then, an image for a right eye obtained by observing the subject in the second direction from the same predetermined observation position (capturing position) is acquired (S6, d1 in FIG. 15). In addition, the acquired image data for a right eye is recorded on the record medium 25 (S7).
  • it is determined whether a shooting termination instruction has been issued (S8), and the processes above are repeated until the shooting termination instruction is issued (NO in S8). That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S2), an image for a left eye is acquired (S3), and the acquired image data for a left eye is stored on the record medium 25 (S4). Furthermore, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the second direction (S5), an image for a right eye is acquired (S6), and the acquired image data for a right eye is recorded on the record medium 25 (S7).
  • the processes above are repeated, the image for a left eye of the subject is acquired with the timing of b2, b3, b4, --- illustrated in FIG. 15, the image for a right eye of the subject is acquired with the timing of d2, d3, d4, ---, and the image data is sequentially recorded on the record medium 25.
  • the record medium 25 alternately stores the image data for a left eye and the image data for a right eye, and the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6 at the instruction from, for example, the input console 3, and stored in the external storage device 5.
  • when a 3-dimensional still image is acquired, one image for a left eye and one image for a right eye are acquired for one still image.
  • for 3-dimensional moving pictures, the image for a left eye is acquired once every 1/30 second and the image for a right eye is acquired once every 1/30 second, for example by moving the optical axis of shooting every 1/60 second.
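  • The flow of FIG. 14 and the timing of FIG. 15 can be summarized by the following sketch; it is only an illustration under assumed names (move_optical_axis, capture_frame, record, stop_requested), not code from the patent, and the 1/60 second value is the example given above.

```python
import time

FRAME_INTERVAL = 1 / 60  # the optical axis is moved every 1/60 s in the example above

def shoot_3d_movie(move_optical_axis, capture_frame, record, stop_requested):
    """Alternate left/right capture from one position until termination (S2-S8 of FIG. 14)."""
    while not stop_requested():                # S8: repeat until the termination instruction
        move_optical_axis("first")             # S2: first direction
        record("left", capture_frame())        # S3, S4: acquire and record the image for a left eye
        time.sleep(FRAME_INTERVAL)
        move_optical_axis("second")            # S5: second direction, leftward from the first
        record("right", capture_frame())       # S6, S7: acquire and record the image for a right eye
        time.sleep(FRAME_INTERVAL)
```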
  • 3-dimensional image data can also be generated by CG in a virtual 3-dimensional space.
  • a 3-dimensional image data generating program stored on, for example, the internal HDD 9 of the computer 2 can be read to the internal semiconductor memory 8 at an instruction from the input console 3, and the CPU 7 performs control based on the program, thereby generating 3-dimensional image data.
  • the 3-dimensional image data generating process by CG is similar to the 3-dimensional image data generating process described with reference to the flowchart in FIG. 14, and the similar process is performed by the program instead of by the operations of the drive circuit 23 and the image pickup circuit 24 provided in the image pickup device 1.
  • the generated 3-dimensional image data is stored on, for example, the external storage device 5.
  • the captured image data acquired by the image pickup device 1 is the same as the image data acquired by the left eye camera 41c and the right eye camera 42c illustrated in FIGS. 13A and 13B. That is, when the optical axis of shooting of the image pickup device 1 is set in the first direction, the image pickup device 1 acquires an image for a left eye. When the optical axis of shooting is set in the second direction, it acquires an image for a right eye.
  • the captured image data alternately including the image data for a left eye and the image data for a right eye is the same as the image data acquired by the left eye camera 41c and the right eye camera 42c illustrated in FIGS. 13A and 13B.
  • the display 4 can display a stereoscopic 3-dimensional image with a stereoscopic effect and less fatigue for an observer.
  • the second embodiment of the present invention is an invention for improving the visibility by changing the relative position of the image for a left eye and the image for a right eye so that, in addition to the contents of the first embodiment described above, the shorter the distance of a subject from the position of the image pickup device 1 (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  • FIGS. 16A through 16C are explanatory views of the parallax between an image for a left eye and an image for a right eye by the methods of generating various 3-dimensional data.
  • FIG. 16A is an example of the parallel viewing method
  • FIG. 16B is an example of the cross-eyed viewing method
  • FIG. 16C is an example of the present embodiment.
  • FIG. 16A is a view in which the image for a left eye A illustrated in FIG. 9A is arranged with the image for a right eye B illustrated in FIG. 9B.
  • FIG. 16B is a view in which the image for a left eye illustrated in C of FIG. 10A is arranged with the image for a right eye illustrated in D of FIG. 10B.
  • FIG. 16C is a view in which the image for a left eye illustrated in E of FIG. 11A is arranged with the image for a right eye illustrated in F of FIG. 11B.
  • d indicates the parallax (difference in position between the images of the same subject in the image for a left eye and the image for a right eye) for a subject of a long distance
  • D indicates the parallax for a subject of a short distance
  • in the parallel viewing method in FIG. 16A, the parallax D of a short distance is larger than the parallax d of a long distance, and the difference between them is large.
  • in the cross-eyed viewing method in FIG. 16B, both the parallax d of the long distance and the parallax D of the short distance are large, and their lengths are substantially equal.
  • in the present embodiment in FIG. 16C, the parallax D of the short distance is larger than the parallax d of the long distance, but the difference between them is not large.
  • although the difference of the parallax is almost the same as in the cross-eyed viewing method, the problems of a loss of a subject and an exchange of the right and left images are considerably improved.
  • the relative position between the image for a left eye and the image for a right eye is corrected so that the shorter the distance of a subject from the position of the image pickup device (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  • FIGS. 17A through 17D are explanatory views of the method of improving the parallax of images according to the second embodiment.
  • FIG. 17A illustrates the image for a left eye acquired in the processing method according to the first embodiment.
  • FIG. 17B illustrates an image for a right eye.
  • FIG. 17C illustrates the image for a left eye overlapping the image for a right eye illustrated in FIGS. 17A and 17B, and is an explanatory view of the parallax according to the present embodiment.
  • the relative position between the image for a left eye and the image for a right eye is changed so that the shorter the distance of a subject from the position of the image pickup device 1 (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye, thereby improving the visibility.
  • FIG. 18 is a flowchart for explanation of the process according to the present embodiment.
  • a process similar to the flowchart described with reference to FIG. 14 is performed, and the data of the image for a left eye and the image for a right eye is stored on the record medium 25, for example. That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S1, S2), an image for a left eye is acquired and stored on the record medium 25 (S3, S4), the drive circuit 23 is further driven, the optical axis of shooting of the image pickup device 1 is moved in the second direction, and an image for a right eye is acquired and stored on the record medium 25 (S5 through S8). Then, the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6, and stored in the external storage device 5.
  • the relative position of the image for a left eye and the image for a right eye is changed so that a subject at a shorter distance from the predetermined position has smaller parallax between the image for a left eye and the image for a right eye than a subject at a longer distance (S9).
  • the data of the image for a left eye and the image for a right eye stored in, for example, the external storage device 5 is read, the parallax d for a long distance and the parallax D for a short distance are corrected from the relationship illustrated in FIG. 17C to the relationship illustrated in FIG. 17D, and the result is stored in the external storage device 5 again.
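  • As a rough illustration of this relative-position correction, the following sketch shifts one image horizontally so that the parallax of a near subject can be brought close to zero, as in FIG. 17D; the shift amount, the NumPy array representation, and the function name are assumptions for illustration, not part of the patent.

```python
import numpy as np

def shift_right_image(right_image: np.ndarray, shift_px: int) -> np.ndarray:
    """Shift the right-eye image horizontally by shift_px pixels.

    Shifting the whole right-eye image changes the relative position of the
    left-eye and right-eye images; choosing the shift so that the parallax D of
    a near subject approaches zero yields a relationship like FIG. 17D.
    Vacated columns are filled with zeros (black) in this simple sketch.
    """
    shifted = np.zeros_like(right_image)
    if shift_px > 0:
        shifted[:, shift_px:] = right_image[:, :-shift_px]
    elif shift_px < 0:
        shifted[:, :shift_px] = right_image[:, -shift_px:]
    else:
        shifted = right_image.copy()
    return shifted

# Hypothetical usage: reduce the near-subject parallax by a few pixels.
# corrected_right = shift_right_image(right_image, shift_px=-5)
```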
  • the process above can be performed by generating 3-dimensional image data by CG as in the first embodiment, and making a correction of the relative position of the image for a left eye and the image for a right eye to the 3-dimensional image data generated by CG.
  • as a result, the parallax of a person at a short distance is small. Therefore, no unnatural effect is generated even when the person is gazed at and observed. In addition, the parallax for a short distance and for a long distance becomes small, thereby generating an easily observable image.
  • the data corrected for the parallax d for a long distance and the parallax D for a short distance is stored again in the external storage device 5.
  • alternatively, the data of an image for a left eye and an image for a right eye generated by the actual shooting process or CG and stored in the external storage device 5 before the correction is read, the parallax d for a long distance and the parallax D for a short distance are then corrected, and the resultant data is displayed on the display 4.
  • the correction of the parallax d for a long distance and the parallax D for a short distance can be performed at each time when the data of an image for a left eye and an image for a right eye is acquired by the image pickup device 1 that captures a subject, thereby storing the corrected captured image data in the external storage device 5.
  • the third embodiment of the present invention is an invention of superposing an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other which are generated in other ways, respectively, on the image for a left eye and the image for a right eye obtained in the methods according to the first and second embodiments.
  • FIGS. 19A through 19C illustrate the concept describing the present embodiment.
  • FIG. 19A illustrates the state in which the 3-dimensional image obtained in the methods according to the first and second embodiments is displayed on the screen. In this state, a natural and easily visible image can be observed, but a subject which pops out of the screen cannot be observed.
  • FIG. 19B illustrates the state in which a subject (for example, an approaching ball) is observed with the setting of emphasizing the perspective effect in other methods, for example, the cross-eyed viewing method and the parallel viewing method.
  • the images displayed on the screens in FIGS. 19A and 19B are superposed, thereby generating the 3-dimensional image in FIG. 19C.
  • FIG. 20 is a flowchart for explanation of the process according to the present embodiment.
  • a process similar to the process according to the flowchart described with reference to FIG. 14 is performed, and the data of an image for a left eye and an image for a right eye is stored on, for example, the record medium 25. That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S1, S2), and the image for a left eye is acquired and stored on the record medium 25 (S3, S4). Furthermore, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the second direction, an image for a right eye is acquired and stored on the record medium 25 (S5 through S8). Then, the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6, and stored in the external storage device 5.
  • the image for a left eye and the image for a right eye as 2-dimensional images generated in another method and having parallax to each other are respectively superposed on the image for a left eye and image for a right eye acquired in the process above (S10).
  • the data of the image for a left eye and the image for a right eye stored in the external storage device 5 is read, the image for a left eye and the image for a right eye as 2-dimensional images generated in another method and having parallax to each other, which are stored in the same external storage device 5, are respectively superposed on them, and the result is stored in the external storage device 5 again.
  • as in the second embodiment, the process above can also be performed by generating 3-dimensional image data by CG and superposing the data of the image for a left eye and the image for a right eye generated in another method on the 3-dimensional image data generated by CG.
  • the data obtained by superposing the image for a left eye and image for a right eye generated in another method is stored in the external storage device 5 again.
  • the resultant data can be displayed on the display 4.
  • in the example above, the data of the image for a left eye and the image for a right eye is first stored in the external storage device 5, and then the data of the image for a left eye and the image for a right eye as 2-dimensional images generated in another method and having parallax to each other is superposed on it.
  • alternatively, each time the image pickup device 1 captures a subject and the data of an image for a left eye and an image for a right eye is acquired, the data of an image for a left eye and an image for a right eye generated in another method can be superposed, and the resultant captured image data can be stored in the external storage device 5.
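  • To make the superposition step concrete, the following is a minimal sketch, not from the patent, of compositing a left/right pair generated in another method over the pair obtained by the present method; the mask, the array types, and the variable names in the usage comments are assumptions for illustration.

```python
import numpy as np

def superpose(base: np.ndarray, overlay: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Superpose an overlay image onto a base image.

    base and overlay are HxWx3 arrays for the same eye (left or right); mask is
    an HxW array in [0, 1] that is 1 where the overlay (for example, the
    approaching ball of FIG. 19B) should be visible.
    """
    mask3 = mask[..., np.newaxis]                       # broadcast the mask over the color channels
    return (overlay * mask3 + base * (1.0 - mask3)).astype(base.dtype)

# Hypothetical usage: combine the two pairs as in FIG. 19C.
# left_combined = superpose(left_present_method, left_other_method, ball_mask)
# right_combined = superpose(right_present_method, right_other_method, ball_mask)
```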
  • although the configuration of the image pickup device 1 is described above with reference to FIGS. 5 through 7, the configuration of the image pickup device 1 is not limited to the examples above.
  • the image pickup devices 1 illustrated in FIGS. 21 through 23 are not provided with, for example, the barrel 22, and the actuators are attached to the image pickup optical system 20 and the image pickup element 21, or to one of them.
  • FIG. 21 illustrates the image pickup device 1 with this configuration.
  • the actuators 26a and 26b are attached to the image pickup optical system 20, the actuators 26c and 26d are attached to the image pickup element 21, and the actuators 26a through 26d are cooperatively driven, so that the image pickup device 1 can substantially move the optical axis with a simple configuration without the barrel 22.
  • FIG. 22 illustrates the image pickup device 1 in which the actuators 26a and 26b are attached to the image pickup optical system 20 only and are driven according to the drive signal from the drive circuit 23.
  • FIG. 23 illustrates the image pickup device 1 in which the actuators 26c and 26d are attached to the image pickup element 21 only and are driven according to the drive signal from the drive circuit 23. In these cases as well, the image pickup device 1 can substantially move the optical axis with a simple configuration without the barrel 22.
  • FIG. 24 is a variation example of the image pickup device 1 used in the present embodiment.
  • An adapter 45 is attached to the image pickup device 1, and an optical axis transition mechanism 46 is provided in the adapter 45.
  • the image pickup device 1 provided with the adapter 45 can be regarded as one image pickup device.
  • the optical axis transition mechanism 46 is configured by reflecting optical systems 47 and 48 and actuators 26e and 26f provided at both ends of the reflecting optical system 47; the actuators 26e and 26f are driven according to the drive signal provided from the drive circuit 23, and the reflecting optical system 47 is rotated (vibrated) about a support point 47a within the range of the predetermined angle.
  • the body of the image pickup device 1 is configured with the image pickup optical system 20, the image pickup element 21, the image pickup circuit 24, and the record medium 25, and the optical information of the subject reflected by the reflecting optical systems 47 and 48 in the adapter 45 is formed on the image pickup element 21 by the image pickup optical system 20, thereby acquiring the data of the image for a left eye and the image for a right eye.
  • FIG. 25 illustrates a device of the configuration of the adapter 45 incorporated into the image pickup device 1.
  • the reflecting optical systems 47 and 48 and the actuators 26e and 26f provided at both ends of the reflecting optical system 47 are provided, the actuators 26e and 26f are driven by the drive signal from the drive circuit 23, the reflecting optical system 47 is rotated about the support point 47a, the image pickup optical system 20 forms the optical information about the subject on the image pickup element 21, and the data of the image for a left eye and the image for a right eye is acquired.
  • An L-shaped optical path can be configured without the reflecting optical system 48
  • FIGS. 26 and 27 illustrate variation examples of the image pickup device 1 which moves the optical axis of shooting using double refraction.
  • in double refraction, the refractive index varies with the plane of polarization of the incident light, and the incident light is divided into two beams.
  • double refraction is found in the six crystal systems other than the cubic system among the seven existing crystal systems. Among them, calcite and quartz (rock crystal) are well known.
  • using a material having the double refraction property (double refractive optical element 50), two light beams 51a and 51b which enter from different directions and have different planes of polarization can be led in the same direction 51. That is, when the double refractive optical element 50 is viewed from the output side of the light beams, two directions can be observed. This means that two optical axes of shooting exist simultaneously.
  • a rotation mechanism 53 is configured by placing a polarizing plate 52 in front of the double refractive optical element 50 and rotating the polarizing plate 52 with respect to the double refractive optical element 50, thereby switching the polarization of the light incident on the double refractive optical element 50.
  • FIG. 26 illustrates an image pickup device capable of capturing 3-dimensional image data with the body of an image pickup device and an adapter 54.
  • FIG. 27 illustrates a device in which the configuration of the adapter 54 is incorporated into the body of the image pickup device 1.
  • The image pickup device 1 includes the polarizing plate 52, the double refractive optical element 50, and the rotation mechanism 53 for switching the polarization of light entering the double refractive optical element 50. By rotating the polarizing plate 52, the direction of the optical axis of shooting of the image pickup optical system 20 can be switched between two directions.
  • FIGS. 29 and 30 also illustrate variation examples of the image pickup device 1 using a polarizing conversion element 27 for rotating the plane of polarization of incident light according to the drive signal from the drive circuit 23.
  • FIG. 29 is an example of attaching an adapter 55 provided with the polarizing conversion element 27 to the image pickup device 1.
  • FIG. 30 is an example of providing the polarizing conversion element 27 in the body of the image pickup device 1.
  • The example illustrated in FIG. 29 has, in order, a polarizing plate 28, the polarizing conversion element 27, and a double refractive optical element 50 in the adapter 55; the plane of polarization is rotated by the polarizing conversion element 27 according to the drive signal from the drive circuit 23, thereby moving the optical axis of shooting.
  • As the polarizing conversion element 27, for example, a twisted nematic liquid crystal can be used.
  • the example illustrated in FIG. 30 has the configuration of the adapter 55 arranged in the body of the image pickup device 1.
  • the plane of polarization of the polarizing conversion element 27 is rotated by the drive signal from the drive circuit 23, and the optical axis of shooting can be moved.
  • the image pickup device 1 illustrated in FIGS. 29 and 30 acquires the data of the image for a left eye and the image for a right eye of 3-dimensional images based on the image formed on the image pickup element 21 by the image pickup optical system 20.
  • FIGS. 31A and 31B are examples of the image pickup device capable of automatically switching the transition direction of the optical axis by the optical axis transition mechanism depending on the direction of the image pickup device 1.
  • the image pickup device 1 is provided with an attitude sensor 57, for example, a gravity sensor etc., and the drive circuit 23 drives the optical axis transition mechanism based on a detection result of the attitude sensor 57.
  • a first pair of piezoelectric elements 56a and 56b is provided on the left and right sides of the image pickup optical system 20, and a second pair of piezoelectric elements 56c and 56d is provided on the upper and lower sides of the image pickup optical system 20.
  • When the image pickup device 1 is held in the orientation illustrated in FIG. 31A, the drive circuit 23 drives the horizontally arranged pair of piezoelectric elements 56a and 56b, and does not drive the vertically arranged pair of piezoelectric elements 56c and 56d.
  • When the attitude sensor 57 detects that the image pickup device 1 is in the orientation illustrated in FIG. 31B, the drive circuit 23 drives the second pair of piezoelectric elements 56c and 56d, which are then arranged horizontally, and does not drive the pair of piezoelectric elements 56a and 56b, which are then arranged vertically. Therefore, with the above-mentioned configuration, the orientation of the image pickup device 1 is detected, the pair of piezoelectric elements to be driven (56a and 56b, or 56c and 56d) is automatically selected, and the optical axis of shooting is automatically switched.
  • The present embodiment (the fourth embodiment) relates to displaying a 3-dimensional image generated according to the first through third embodiments on the display 4, and allowing an observer to observe the image.
  • FIGS. 32A and 32B are schematic diagrams illustrating an example of a 3-dimensional image observing system for observing the 3-dimensional image.
  • a reproduction device 49 stores the 3-dimensional image data, and reproduces the 3-dimensional image data.
  • The 3-dimensional image data reproduced by the reproduction device 49 is output to the display 4, with the image data for a left eye and the image data for a right eye output alternately.
  • the reproduction device 49 outputs a timing signal to a timing indicator 4a provided for the display 4.
  • the image data for a left eye and the image data for a right eye are alternately input to the display 4 in synchronization with the timing signal.
  • the timing indicator 4a radiates the infrared light modulated according to the timing signal to an image selection device 60.
  • The image selection device 60 is a glasses-type device worn by an observer who observes the 3-dimensional image displayed on the display 4, and is provided with a sensor 60a at its side portion for detecting the infrared light.
  • the image selection device 60 receives the infrared light with the sensor 60a, and at the timing when the image for a left eye is displayed on the display 4, places a liquid crystal shutter 59a positioned before a left eye 58a of the observer in a transmission state and places a liquid crystal shutter 59b positioned before a right eye 58b of the observer in a non-transmission state (FIG. 32A).
  • At the timing when the image for a right eye is displayed on the display 4, the liquid crystal shutter 59a positioned before the left eye 58a is placed in the non-transmission state, and the liquid crystal shutter 59b positioned before the right eye 58b of the observer is placed in the transmission state (FIG. 32B).
  • Alternatively, the image for a left eye can be projected only to the left eye by an independent projection optical system, and the image for a right eye can be projected only to the right eye by another independent projection optical system.
  • the method of the above-mentioned anaglyph can also be used.
  • the polarization direction of the image for a left eye and the image for a right eye can be differentiated to allow the image for a left eye to pass through the left eye side of the polarizing filters provided before the left eye and right eye of the observer, and allow the image for a right eye to pass through the right eye side of the filters.
  • an optical element for changing the projection direction of the image for a left eye and the image for a right eye is provided before the display 4 to project only the image for a left eye to the left eye of the observer and project only the image for a right eye to the right eye.
  • FIG. 33 is a flowchart for explanation of the method of observing the 3-dimensional image.
  • the observing method is used by reading the 3-dimensional image data recorded on, for example, a record medium.
  • the image for a left eye obtained by an observation in the first direction from a predetermined observation position is projected to the left eye of the observer and not projected to the right eye (step (hereinafter expressed by ST) 1).
  • the image for a right eye obtained by an observation in the second direction leftward from the first direction from the same predetermined observation position is projected to the right eye of the observer, and not projected to the left eye (ST 2).
  • The above-mentioned processes are repeated, so that the image for a left eye is observed only by the left eye of the observer and the image for a right eye is observed only by the right eye of the observer (a minimal timing sketch of this alternation is given after this list).
  • FIG. 34 is a flowchart of the case in which the relative position adjustment between the image for a left eye and the image for a right eye described with reference to the second embodiment, and the superposition with a 3-dimensional stereoscopic image generated in another system as described with reference to the third embodiment, are performed at the stage of image observation. That is, the image for a left eye obtained by an observation in the first direction from a predetermined position is acquired (step (hereinafter expressed by STP) 1), and then the image for a right eye obtained by an observation in the second direction leftward from the first direction from the same predetermined position is acquired (STP 2).
  • Next, the relative position adjustment between the image for a left eye and the image for a right eye described with reference to the second embodiment is performed (STP 3), and the superposing process with the 3-dimensional stereoscopic image generated in another system described with reference to the third embodiment is performed (STP 4). Finally, the image for a left eye is projected to the left eye of the observer but not to the right eye, and the image for a right eye is projected to the right eye of the observer but not to the left eye (STP 5).
  • Thus, the observer can observe a 3-dimensional image to which the relative position adjustment between the image for a left eye and the image for a right eye of the second embodiment has been applied, and can furthermore observe a 3-dimensional image obtained by superposition with the 3-dimensional stereoscopic image generated in another system as in the third embodiment.
  • a 3-dimensional image observable with less fatigue than in the prior art can be obtained.
  • Since image data for a left eye and image data for a right eye are obtained by observations in two directions from the same position, only one image pickup device is required.
  • A 3-dimensional camera can therefore be configured with only one optical system.
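The alternating display and shutter switching summarized above (FIGS. 32A/32B and ST1/ST2 of FIG. 33) reduces to a simple timing loop. The following Python sketch is only an illustration: the display, shutter_glasses, and frame-source objects are hypothetical stand-ins for the display 4, the image selection device 60, and the reproduction device 49, and the 1/60-second field period is an assumed value, not one specified in this document.

```python
import time

FIELD_PERIOD = 1 / 60  # assumed display period per eye (illustrative only)

def observe_3d(frames_left, frames_right, display, shutter_glasses):
    """Alternately present left- and right-eye images (ST1/ST2 of FIG. 33).

    `display` and `shutter_glasses` are hypothetical interfaces standing in
    for the display 4 and the liquid-crystal-shutter image selection device 60.
    """
    for image_left, image_right in zip(frames_left, frames_right):
        # ST1: show the image for a left eye; only the left shutter transmits.
        display.show(image_left)
        shutter_glasses.set(left_open=True, right_open=False)
        time.sleep(FIELD_PERIOD)

        # ST2: show the image for a right eye; only the right shutter transmits.
        display.show(image_right)
        shutter_glasses.set(left_open=False, right_open=True)
        time.sleep(FIELD_PERIOD)
```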

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

A 3-dimensional image data generating method generates a natural 3-dimensional image having a stereoscopic effect and providing an observer with little fatigue in a technique of acquiring a stereoscopic view of images for left and right eyes as 2-dimensional images having parallax to each other. The 3-dimensional image data generating method generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and performs: a left eye image data acquiring process of acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring process of acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.

Description

3-DIMENSIONAL IMAGE DATA GENERATING METHOD
The present invention relates to the technology of generating 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye which are 2-dimensional images having parallax to each other, and allowing an observer to observe the images.
Conventionally known is the technology of separately providing a picture for a left eye and a picture for a right eye which are 2-dimensional pictures having parallax each other to the left eye and the right eye of an observer, thereby realizing a stereoscopic view. For example, the technology described in (1) through (4) is known. That is, it is the technology of anaglyph of obtaining a 3-dimensional picture using the parallax of right and left eyes by synthesizing the images for left and right eyes using different colors, that is, red and blue, attaching blue and red filters to the glasses of the observer so that the left eye can observe only one of the blue and red images and the right eye can observe the other (1). There is also a method of alternately displaying a picture for a left eye and a picture for a right eye, opening and closing the right and left shutters of the glasses of the observer in synchronization with the alternate display, and allowing the right eye to see only the image for a right eye and the left eye to see only the image for a left eye (2). There is another method of setting different polarization directions between the picture for a left eye and the picture for a right eye to allow only one image to pass through the left and right polarizing filters of the glasses of the observer, and allow the right eye to see only the image for a right eye and the left eye to see only the image for a left eye (3). In addition, there is a method of using a device provided with an optical system for providing an image independently for the left and right eyes of the observer such as an HMD (head mounted display) etc. (4) These technologies are disclosed in, for example, Patent Document 1.
Furthermore, a parallel viewing method and a cross-eyed viewing method have been conventionally known as methods of generating 3-dimensional picture data (stereoscopic view data).
FIG. 1A is an explanatory view of the parallel viewing method. The parallel viewing method is to observe a subject by parallelizing the directions of the lines of sight from two points of view corresponding to the left and right eyes of an observer. When generating 3-dimensional image data by CG (computer graphics), a left point of view 61 as a point of view of observing an image for a left eye to be directed to the left eye of the observer and a right point of view 62 as a point of view of observing an image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a virtual space, and the observation line of sight from the left point of view 61 is set parallel to the observation line of sight from the right point of view 62.
When 3-dimensional image data is generated during the actual shooting process, a left-eye camera 61c for capturing the image for a left eye to be directed to the left eye of the observer and a right-eye camera 62c for capturing the image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a real space, and the optical axis of shooting of the left-eye camera 61c is set parallel to the optical axis of shooting of the right-eye camera 62c. The distance between the left point of view 61 (left-eye camera 61c) and the right point of view 62 (right-eye camera 62c) is, for example, about 6.5 cm based on the human interpupillary distance.
On the other hand, FIG. 1B is an explanatory view of the cross-eyed viewing method. The cross-eyed viewing method is to observe a subject by crossing the directions of the lines of sight from two points of view corresponding to the left and right eyes of an observer. When generating 3-dimensional image data by CG, a left point of view 63 as a point of view of observing an image for a left eye to be directed to the left eye of the observer and a right point of view 64 as a point of view of observing an image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a virtual space, and the observation line of sight from the left point of view 63 is arranged to cross the observation line of sight from the right point of view 64.
When 3-dimensional image data is generated during the actual shooting process, a left-eye camera 63c for capturing the image for a left eye to be directed to the left eye of the observer and a right-eye camera 64c for capturing the image for a right eye to be directed to the right eye of the observer are arranged at a predetermined distance in a real space, and the optical axis of shooting of the left-eye camera 63c is arranged to cross the optical axis of shooting of the right-eye camera 64c.
Also in the cross-eyed viewing method, the distance between the left point of view 63 (left-eye camera 63c) and the right point of view 64 (right-eye camera 64c) is, for example, about 6.5 cm as described above. The position at which the lines of sight cross each other is normally set about 1 to 3 m ahead of the left point of view 63 (right point of view 64).
Patent Document 1: Japanese Patent Application Publication No. 2006-72429
In the present invention, an image obtained by observing a subject in the first direction from the observation position is set as an image for a left eye, and an image obtained by observing a subject from the same observation position in the second direction leftward from the first direction is set as an image for a right eye. The image for a left eye and the image for a right eye can be obtained in an optional order or simultaneously.
In the present invention, the image for a left eye and the image for a right eye are observed from the same observation position (one point of view or one camera), so it is not necessary to have two points of view or two cameras unlike the conventional parallel viewing method and the cross-eyed viewing method. Between the images captured in two different observation directions from one observation position, the image captured in the observation direction relatively rightward is defined as an image for a left eye, and the image captured in the observation direction relatively leftward is defined as an image for a right eye. The leftward direction refers to the counterclockwise direction viewed from above about the vertical central axis passing through the observation position. The rightward direction refers to the clockwise direction viewed from above about the vertical central axis passing through the observation position. The angle made by the two observation directions is to be within the range where the images for left and right eyes have a common observation area, and is preferably
Figure JPOXMLDOC01-appb-I000001
or less.
The detailed configuration is described below. The 3-dimensional image data generating method according to the present invention generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and performs: a left eye image data acquiring process of acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring process of acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
The predetermined position is an observation position in which a subject is observed, and is an arbitrary observation position with respect to the subject. From the predetermined position, the image data of the subject obtained by observing the subject in the first direction using an image pickup device, for example, a camera etc. is acquired as image data for a left eye, and the image data obtained by observing the subject in the second direction is acquired as image data for a right eye, and these acquired data are stored in a record medium.
The 3-dimensional image data generating system according to the present invention generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image data acquiring device for acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring device for acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
The 3-dimensional image data generating program according to the present invention generates, using a computer, 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image data acquiring step of directing the computer to acquire image data for a left eye obtained by an observation in a first direction from a predetermined position; and a right eye image data acquiring step of directing the computer to acquire image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position. The 3-dimensional image data generating program according to the present invention can also be defined as an invention of a record medium storing a 3-dimensional image data generating program by storing the program in a computer-readable record medium.
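When the program generates the data by CG rather than by actual shooting, the left eye image data acquiring step and the right eye image data acquiring step amount to rendering the same virtual camera position in two slightly different directions. The sketch below is a minimal illustration under stated assumptions: render() is a hypothetical function supplied by the CG environment, yaw is measured counterclockwise (leftward) when viewed from above, and the 2-degree divergence angle is an arbitrary example, not a value taken from this document.

```python
def generate_stereo_pair(render, camera_position, first_direction_deg, divergence_deg=2.0):
    """Render one left/right image pair from a single camera position.

    The image for a left eye is an observation in the first direction; the
    image for a right eye is an observation in a second direction rotated
    leftward (counterclockwise seen from above) from the first direction.
    """
    # Left-eye image: observation in the first direction.
    image_left = render(camera_position, yaw_deg=first_direction_deg)
    # Right-eye image: observation rotated leftward by the divergence angle.
    image_right = render(camera_position, yaw_deg=first_direction_deg + divergence_deg)
    return image_left, image_right
```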
The 3-dimensional image observing method according to the present invention allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and performs: a left eye image projecting process of projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and a right eye image projecting process of projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the same predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
The 3-dimensional image observing system according to the present invention allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: a left eye image projecting device for projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and a right eye image projecting device for projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the same predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
The 3-dimensional image pickup device according to the present invention captures a subject and obtains 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, and includes: an optical image pickup system; an image pickup element for converting an image formed by the optical image pickup system into an electric signal; an optical axis transition mechanism for changing directions of an optical axis of the optical image pickup system between a first direction and a second direction leftward from the first direction; an image pickup circuit for capturing the image for a left eye when the optical axis of the optical image pickup system is in the first direction, and capturing the image for a right eye when the optical axis of the optical image pickup system is in the second direction; and a record medium for storing the image captured by the image pickup circuit.
The 3-dimensional image data processed in the present invention includes data obtained by an actual shooting process in a real 3-dimensional space, data obtained by CG (computer graphics) in a virtual 3-dimensional space, and data obtained by combining the actually shot image data and the CG data.
FIG. 1A is an explanatory view of the 3-dimensional picture data for realizing a stereoscopic view, and an example of the parallel viewing method; FIG. 1B is an explanatory view of the 3-dimensional picture data for realizing the conventional stereoscopic view, and an example of the cross-eyed viewing method; FIG. 2 is a block diagram for explanation of the configuration of the computer used in the present embodiment; FIG. 3 is a configuration of the 3-dimensional image pickup device for realizing the present embodiment; FIG. 4 is a setting example in a studio when a subject is shot using an image pickup device; FIG. 5 is a schematic diagram of the image pickup device as viewed from above; FIG. 6 is a schematic diagram of the image pickup device with another configuration as viewed from above; FIG. 7 is a schematic diagram of the image pickup device with another configuration as viewed from above; FIG. 8A is an explanatory view of the basic concept of the present embodiment, and illustrates the cross-eyed viewing method; FIG. 8B is an explanatory view of the basic concept of the present embodiment, and illustrates a basic configuration; FIG. 9A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method, and an image for a left eye is acquired; FIG. 9B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method, and an image for a right eye is acquired; FIG. 10A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the cross-eyed viewing method, and an image for a left eye is acquired; FIG. 10B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the cross-eyed viewing method, and an image for a right eye is acquired; FIG. 11A is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the method according to the present embodiment, and an image for a left eye is acquired; FIG. 11B is an explanatory view of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the method according to the present embodiment, and an image for a right eye is acquired; FIG. 12A illustrates the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method, and an example of the case in which an image for a left eye is acquired; FIG. 12B illustrates the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method, and an example of the case in which an image for a right eye is acquired; FIG. 13A illustrates the state of an observation in which a ball approaches from a distant point in the method according to the present embodiment, and an example of the case in which an image for a left eye is acquired; FIG. 13B illustrates the state of an observation in which a ball approaches from a distant point in the method according to the present embodiment, and an example of the case in which an image for a right eye is acquired; FIG. 
14 is a flowchart for explanation of the process according to the first embodiment; FIG. 15 is a timing chart for explanation of the process according to the first embodiment; FIG. 16A is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the parallel viewing method; FIG. 16B is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the cross-eyed viewing method; FIG. 16C is an explanatory view of the parallax between an image for a left eye and an image for a right eye by the method of generating various 3-dimensional data, and an explanatory view of an example of the present embodiment; FIG. 17A is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a left eye acquired in the processing method according to the first embodiment; FIG. 17B is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a right eye acquired in the processing method according to the first embodiment; FIG. 17C is an explanatory view of the method of improving the parallax of images according to the second embodiment, and illustrates the image for a left eye overlapping the image for a right eye illustrated in FIGS. 17A and 17B above; FIG. 17D is an explanatory view of the method of improving the parallax of images according to the second embodiment, and is an example of changing the relative positions of the image for a left eye and the image for a right eye, and improving the visibility; FIG. 18 is a flowchart for explanation of the process according to the second embodiment; FIG. 19A illustrates the concept for explanation of the third embodiment, and the state in which a 3-dimensional image obtained in the method according to the first and second embodiments is displayed on the screen; FIG. 19B illustrates the concept for explanation of the third embodiment, and the state in which the perspective effect is emphasized in the cross-eyed viewing method and the parallel viewing method, and, for example, an approaching ball is observed; FIG. 19C illustrates the concept for explanation of the third embodiment, and is an example of a 3-dimensional image represented according to the third embodiment; FIG. 20 is a flowchart for explanation of the process according to the third embodiment; FIG. 21 is a schematic diagram of the image pickup device without a barrel; FIG. 22 is a schematic diagram of the image pickup device in which an actuator is attached only to an optical image pickup system without a barrel; FIG. 23 is a schematic diagram of the image pickup device in which an actuator is attached only to an image pickup element without a barrel; FIG. 24 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a reflecting optical system is attached; FIG. 25 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a reflecting optical system; FIG. 26 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a polarizing plate and a double refraction optical element is attached; FIG. 
27 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a polarizing plate and a double refraction optical element is attached; FIG. 28 is an explanatory view of the characteristic of the material having the property of double refraction; FIG. 29 is a schematic diagram of the image pickup device to which an adapter provided with an optical axis transition mechanism by a polarizing conversion element is attached; FIG. 30 is a schematic diagram of the image pickup device whose body is provided with an optical axis transition mechanism by a polarizing conversion element; FIG. 31A is a schematic diagram of the image pickup device provided with first and second pairs of piezoelectric elements on the left and right sides of the optical image pickup system, and is an example of the first pair of piezoelectric elements performing drive and optical axis transition; FIG. 31B is a schematic diagram of the image pickup device provided with first and second pairs of piezoelectric elements on the left and right sides of the optical image pickup system, and is an example of the second pair of piezoelectric elements performing drive and optical axis transition; FIG. 32A is an explanatory view of the fourth embodiment, and illustrates the state in which the liquid crystal shutter before the left eye of the observer is in the transmission state; FIG. 32B is an explanatory view of the fourth embodiment, and illustrates the state in which the liquid crystal shutter before the right eye of the observer is in the transmission state; FIG. 33 is a flowchart for explanation of the process according to the fourth embodiment; and FIG. 34 is a flowchart for explanation of the process according to the fourth embodiment.
An embodiment of the present invention is described below with reference to the attached drawings.
(First Embodiment)
FIG. 3 illustrates the configuration of the system to which the 3-dimensional image data generating method according to the present embodiment is applied. In FIG. 3, the system configuration includes an image pickup device 1, a computer 2, an input console 3, a display 4, and an external storage device 5. The image pickup device 1 is, for example, a camera capable of shooting moving pictures; it shoots a subject through an optical image pickup system and transmits the shot image data to the computer 2 through a network 6 such as the Internet.
The image pickup device 1 can also be connected directly to the computer 2 without the network 6. Alternatively, the image pickup device 1 need not be connected to the computer 2 by a communication circuit at all, and the data can be exchanged using, for example, a portable record medium. In addition, the functions corresponding to the computer 2, the input console 3, the display 4, and the external storage device 5 can be included in the image pickup device 1 itself.
Furthermore, the 3-dimensional image data acquired by the 3-dimensional image data generating method according to the present embodiment can be acquired by an actual shooting process in a real 3-dimensional space or by CG in a virtual 3-dimensional space. The image pickup device 1 is not required when the 3-dimensional image data is generated by the CG without the actual shooting process.
FIG. 2 is a block diagram for explanation of the computer 2, and the computer 2 is configured by a CPU 7, an internal semiconductor memory 8, an internal hard disk (hereinafter referred to as an HDD) 9, an interface for an external storage device (hereinafter referred to as an external storage device I/F) 10, an interface for a network (hereinafter referred to as a network I/F) 11, an interface for an input console (hereinafter referred to as an input console I/F) 12, and an interface for a display (hereinafter referred to as a display I/F) 13. The CPU 7, the internal semiconductor memory 8, the internal HDD 9, etc. are connected through an internal data bus 14, and can communicate data with one another.
The internal HDD 9 stores a system program for driving the computer 2 according to the present embodiment, and the CPU 7 performs the process described later according to the program read from the internal HDD 9 to the internal semiconductor memory 8. The internal semiconductor memory 8 can also be used as a work area, and temporarily stores the data being processed by the CPU 7.
The external storage device 5 communicates data with the computer 2 through the external storage device I/F 10, stores the image data acquired by, for example, the image pickup device 1, and transmits the image data stored in the external storage device 5 to the computer 2. The display 4 displays the image data transmitted from the computer 2 through the display I/F 13. For example, the captured image input to the computer 2 through the network 6 is displayed, and the data of the captured image stored in the external storage device 5 is displayed.
FIG. 4 is a setting example in a studio when a subject is shot using the image pickup device 1, that is, an example of studio setting when 3-dimensional image data is obtained by the actual shooting process. In FIG. 4, the image pickup device 1 is arranged toward subjects 15 and 16, and the image pickup device 1 is provided on, for example, a tripod 17. As illustrated in FIG. 4, the optical axis of shooting of the image pickup device 1 is configured so that its direction can be switched by the angle
Figure JPOXMLDOC01-appb-I000002
.
In the present embodiment, the image pickup device 1 has a configuration for changing the direction of the optical axis of shooting. However, even without this configuration (an angle transition mechanism), the images can be shot from the same position in the two shooting directions shifted by the angle of
Figure JPOXMLDOC01-appb-I000003
by swinging the entire image pickup device.
The angle
Figure JPOXMLDOC01-appb-I000004
can be, for example,
Figure JPOXMLDOC01-appb-I000005
at maximum. The angle of
Figure JPOXMLDOC01-appb-I000006
is referred to as the limit of the forward divergence of human eyes by considering the adjustable angle of
Figure JPOXMLDOC01-appb-I000007
for each of the left and right eyes when the left eye and the right eye simultaneously diverge leftward and rightward respectively.
FIG. 5 is a schematic diagram of the image pickup device 1 as viewed from above, and the image pickup device 1 is configured with a barrel 22 to which an image pickup optical system 20 and an image pickup element 21 are directly or indirectly attached, a drive circuit 23, an image pickup circuit 24, and a record medium 25.
The image pickup optical system 20 is a lens having its focus in the position of the image pickup element 21, and forms an image from the rays from the subjects 15 and 16 on the image pickup element 21. The image pickup element 21 can be a photoelectric conversion element such as a CCD (charge coupled device) sensor, a CMOS (complementary metal oxide semiconductor) sensor, etc. The image pickup circuit 24 captures a subject image formed on the image pickup element 21, and records the captured data converted into an electric signal on the record medium 25. The captured image can also be sequentially transmitted to the computer 2 directly through the network 6 without recording it on the record medium 25.
Actuators 26a through 26d are provided at four points of the barrel 22, and are driven according to the drive signal from the drive circuit 23. The actuators 26a through 26d are configured by, for example, piezoelectric elements; the actuators 26a and 26b are attached to the barrel 22 at the image pickup optical system 20 side, and the actuators 26c and 26d are attached to the barrel 22 at the image pickup element 21 side.
The actuators 26a and 26b and the actuators 26c and 26d are driven by the drive signal output from the drive circuit 23, and the actuators 26a through 26d cooperatively change the angle of the barrel 22 with respect to the body, thereby changing the optical axis of shooting by the angle
Figure JPOXMLDOC01-appb-I000008
. Therefore, while maintaining the relative positional relationship between the image pickup optical system 20 and the image pickup element 21, the optical axis of shooting can be correctly changed by providing the barrel 22 in the image pickup device 1 as described above and changing the angle of the barrel 22 itself.
As illustrated in FIG. 6, the image pickup device 1 can also be configured to change the angle of the optical axis of shooting by attaching only the actuators 26a and 26b to the barrel 22 and driving the barrel 22 on the side of the image pickup optical system 20.
In addition, as illustrated in FIG. 7, the image pickup device 1 can also be configured to change the angle of the optical axis of shooting by attaching only the actuators 26c and 26d to the barrel 22 and driving the barrel 22 on the side of the image pickup element 21.
With the configuration above, the basic concept of the present embodiment for acquiring the 3-dimensional image data is described below with reference to FIGS. 8A and 8B. First, FIG. 8A illustrates the cross-eyed viewing method. In the cross-eyed viewing method, a left eye point of view 31 (left eye camera 31c) is advanced to a cross point 33 along a left eye line of sight 32, and a right eye point of view 34 (right eye camera 34c) is advanced to the cross point 33 along a right eye line of sight 35, thereby obtaining the configuration illustrated in FIG. 8B.
In the state illustrated in FIG. 8B, from the position of the cross point 33 (predetermined position), an observation is made in the direction of the left eye line of sight 32 (first direction) and an image for a left eye is acquired. Similarly, from the position of the cross point 33 (predetermined position), an observation is made in the direction of the right eye line of sight 35 (second direction) and an image for a right eye is acquired. The direction of the right eye line of sight 35 is to the left of the direction of the left eye line of sight 32. In this state, the subject before the cross point 33 can be obtained as a stereoscopic image because, in the present embodiment (FIG. 8B), the points of view (cameras) are advanced along the lines of sight (optical axes of shooting) as compared with the cross-eyed viewing method illustrated in FIG. 8A.
On the other hand, in the cross-eyed viewing method in FIG. 8A, the left eye line of sight 32 (optical axis of the left eye camera) and the right eye line of sight 35 (optical axis of the right eye camera) exchange their left and right positions after the cross point 33 as compared with the positions before the cross point 33. However, in FIG. 8B according to the present embodiment, there is no exchange in right and left positions, thereby enabling a natural observation of a subject similar to a human view. The present embodiment has a layout of obtaining an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other as the layout illustrated in FIG. 8B.
Next, the characteristics of the image for a left eye and the image for a right eye are described below by comparing them with those in the conventional parallel viewing method and cross-eyed viewing method. FIGS. 9A and 9B are explanatory views of the state in which a sequence of black balls and a sequence of white balls arranged at predetermined intervals are observed and captured in the parallel viewing method. FIG. 9A illustrates acquiring an image for a left eye, and FIG. 9B illustrates acquiring an image for a right eye.
First, a left eye point of view 37 (left eye camera 37c) and a right eye point of view 38 (right eye camera 38c) are arranged at the left and right positions at a predetermined distance (for example, 6.5 cm as a human interpupillary distance), a sequence of white balls arranged at predetermined intervals is placed parallel to the direction of the line of sight on the left of the left eye point of view 37, and a sequence of black balls arranged at predetermined intervals is placed parallel to the direction of the line of sight on the right of the right eye point of view 38. For convenience of display in the figures, a screen 39 arranged forward and perpendicular to the direction of each line of sight is substituted for the image pickup surface.
In FIG. 9A, the position where the line (expressed by a broken line) which connects the left eye point of view 37 to each ball crosses the screen 39 is expressed by a small circle. On the image pickup surface, the ball is captured at the position corresponding to the small circle. As a result, the image for a left eye A illustrated in FIG. 9A is obtained on the screen 39. In this case, there are five white balls and five black balls (a total of ten balls), but the nearest white and black balls are out of the image pickup surface, and a total of eight balls are captured. Also in FIG. 9B, an image for a right eye B is similarly obtained.
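The construction used in FIGS. 9A and 9B, marking where the broken line from the point of view to each ball crosses the screen 39, is an ordinary pinhole projection and can be reproduced with a few lines of plane geometry. This is only an illustrative calculation; the coordinates, screen distance, and screen width below are assumptions, not values read off the figures.

```python
def project_to_screen(viewpoint, ball, screen_distance, screen_half_width):
    """Top-view projection of a ball onto the screen 39 (cf. FIGS. 9A/9B).

    `viewpoint` and `ball` are (x, y) positions seen from above, with the
    line of sight along +y. Returns the horizontal position on a screen
    placed `screen_distance` ahead of the viewpoint, or None if the ball
    projects outside the image pickup surface (as the nearest balls do).
    """
    dx = ball[0] - viewpoint[0]
    dy = ball[1] - viewpoint[1]
    if dy <= screen_distance:
        return None  # at or behind the screen; not captured
    x_on_screen = screen_distance * dx / dy
    return x_on_screen if abs(x_on_screen) <= screen_half_width else None

# Example: left eye point of view at x = -3.25 (half of a 6.5 cm interpupillary
# distance); one white ball 50 units ahead and 20 units to the left.
print(project_to_screen((-3.25, 0.0), (-20.0, 50.0), screen_distance=10.0, screen_half_width=8.0))
```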
On the other hand, FIGS. 10A and 10B illustrate the cases of capturing and observing the scenes illustrated in FIGS. 9A and 9B with the cross-eyed viewing method. In this case, since the screen 39 is perpendicular to the direction of the line of sight, it is inclined with respect to the sequences of balls.
As a result, the image displayed in C of FIG. 10A is obtained as an image for a left eye, and the image displayed in D of FIG. 10B is obtained as an image for a right eye. In the image for a left eye, the nearest ball among the white balls is out of the image pickup surface, but all black balls are on the image pickup surface, thereby capturing a total of nine balls. On the other hand, in the image for a right eye, the nearest ball among the black balls is out of the image pickup surface, but all white balls are on the image pickup surface, thereby capturing a total of nine balls. In this case, the nearest black ball exists in the image for a left eye only, and the nearest white ball exists in the image for a right eye only.
Thus, when a subject is lost in only one of the left and right images, the left and right images of the subject cannot be merged, thereby generating an unnatural 3-dimensional image.
Next, FIGS. 11A and 11B are examples of using the present embodiment, and illustrate capturing and observing the cases similar to those in FIGS. 10A and 10B. Also in this case, since the screen 39 is perpendicular to the direction of the line of sight, it is inclined with respect to the sequences of balls.
As a result, the image illustrated in E of FIG. 11A is obtained as an image for a left eye, and the image illustrated in F of FIG. 11B is obtained as an image for a right eye. As clearly indicated in the figures, the nearest white ball and black ball are out of the image pickup surface, and a total of eight balls, that is, four left balls and four right balls, can be captured in the present embodiment as in the parallel viewing method. That is, in the method of the present embodiment, the numbers of the left and right balls match each other, and there is no lost subject which occurs in the cross-eyed viewing method.
As described above, according to the present embodiment, the event of losing a subject in only one of the left and right images can be greatly reduced, thereby realizing a more natural stereoscopic effect as compared with the cross-eyed viewing method.
Described next is the comparison of the event of alternating right and left positions of the image between the cross-eyed viewing method and the method according to the present embodiment. FIGS. 12A and 12B illustrate the state of an observation in which a ball approaches from a distant point in the cross-eyed viewing method.
FIG. 12A illustrates the case in which an image for a left eye is acquired, and FIG. 12B illustrates the case in which an image for a right eye is acquired. It is assumed that the balls move from a distant point to a near point on a symmetry axis 43 passing through the center of a left eye point of view 41 and a right eye point of view 42. When a ball is at a distant point, it is expressed in black; when a ball is at an intermediate point, it is expressed in shades; and when a ball is at a near point, it is expressed in white. As a result, the images for a left eye illustrated in I through III of FIG. 12A and the images for a right eye illustrated in I through III of FIG. 12B are obtained.
That is, in the image for a left eye, the ball is positioned on the left of the center of the vision when the ball is at a distant point (refer to I of FIG. 12A), positioned near the center of the vision when the ball is at an intermediate point (refer to II of FIG. 12A), and positioned on the right of the center of the vision when the ball is at a near point (refer to III in FIG. 12A). On the other hand, in the image for a right eye, the ball is positioned on the right of the center of the vision when the ball is at a distant point (refer to I of FIG. 12B), positioned near the center of the vision when the ball is at an intermediate point (refer to II of FIG. 12B), and positioned on the left of the center of the vision when the ball is at a near point (refer to III in FIG. 12B).
As described above, as a ball approaches, the positions of the ball in the left and right images approach each other, coincide, and then exchange their left and right positions. In the above-mentioned image for a left eye and image for a right eye, the left and right positions of a subject are exchanged depending on its distance, thereby generating an unnatural 3-dimensional image, because such an exchange of positions never occurs in the real world.
On the other hand, FIGS. 13A and 13B illustrate the method according to the present embodiment, and the state of an observation in which a ball approaches from a distant point. FIG. 13A corresponds to the case in which an image for a left eye is acquired, and FIG. 13B corresponds to the case in which an image for a right eye is acquired. As a result, the images for a left eye illustrated in I through III of FIG. 13A and the images for a right eye illustrated in I through III of FIG. 13B are acquired. In the present embodiment, as indicated in FIGS. 13A and 13B, a change in position on the images is hardly detected even though the ball approaches, and the exchange of the right and left positions does not occur. Therefore, the method of the present embodiment can generate a 3-dimensional image having a more natural stereoscopic effect.
Based on the characteristics of the present invention described above, the process of acquiring 3-dimensional image data by a device having the computer 2 in FIG. 3 is described below. FIG. 4 is an example of setting in a studio when a subject is shot using the image pickup device 1, and acquiring 3-dimensional image data by the actual shooting process.
FIG. 14 is a flowchart for explanation of the process according to the present embodiment, and the process is performed by driving each circuit of the image pickup device 1 with the configuration illustrated in FIG. 5. FIG. 15 is a timing chart for explanation of the process according to the present embodiment.
First, when a shoot instruction is issued by a button operation provided on the image pickup device 1 but not illustrated in the attached drawings (YES in step (hereinafter expressed by S) 1), the drive circuit 23 is driven and the optical axis of shooting is changed from the initial position to the first direction (S2). In the timing chart in FIG. 15, the timing of driving the optical axis of shooting to the rightmost position is expressed by a1. When the optical axis of shooting of the image pickup device 1 is already set in the first direction in the initial state, the process (S2) is omitted.
Next, an image for a left eye obtained by an observation in the first direction from a predetermined observation position (capturing position) is acquired (S3). For example, the image pickup circuit 24 is driven, and the captured image data including the subjects 15 and 16 is converted into an electric signal by the image pickup element 21. The process timing is expressed by b1 in FIG. 15. Next, the acquired image data for a left eye is recorded on the record medium 25 (S4).
Then, the drive circuit 23 is driven, and the optical axis of shooting is moved in the second direction (S5, c1 in FIG. 15). Next, an image for a right eye is obtained by observing the subject in the second direction from the same predetermined observation position (capturing position) (S6, d1 in FIG. 15). In addition, the acquired image data for a right eye is recorded on the record medium 25 (S7).
Then, it is determined whether or not a shooting terminate instruction is issued (S8), and the processes above are repeated until the shooting terminate instruction is issued (NO in S8). That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S2), an image for a left eye is acquired (S3), and the acquired image data for a left eye is stored on the record medium 25 (S4). Furthermore, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the second direction (S5), an image for a right eye is acquired (S6), and the acquired image data for a right eye is recorded on the record medium 25 (S7).
Furthermore, the processes above are repeated, the image for a left eye of the subject is acquired with the timing of b2, b3, b4, --- illustrated in FIG. 15, the image for a right eye of the subject is acquired with the timing of d2, d3, d4, ---, and the image data is sequentially recorded on the record medium 25.
Afterwards, when a shooting terminate instruction is issued (YES in S8), the capturing process is terminated. Therefore, in the processes above, the record medium 25 alternately stores the image data for a left eye and the image data for a right eye, and the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6 at the instruction from, for example, the input console 3, and stored in the external storage device 5.
For example, when a 3-dimensional still image is acquired, one image for a left eye and one image for a right eye are to be acquired for one still image. When 3-dimensional moving pictures are to be acquired, for example, the optical axis of shooting is moved every 1/60 second so that the image for a left eye is acquired once every 1/30 second and the image for a right eye is acquired once every 1/30 second.
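The capture sequence of FIG. 14 amounts to a loop that alternates the optical axis of shooting between the two directions and records each captured frame. The sketch below is an illustration only: drive_circuit, image_pickup_circuit, record_medium, and stop_requested are hypothetical interfaces standing in for the drive circuit 23, the image pickup circuit 24, the record medium 25, and the shooting terminate instruction, and the 1/60-second interval follows the moving-picture example just given.

```python
import time

SWITCH_INTERVAL = 1 / 60  # the optical axis of shooting is moved every 1/60 s

def capture_3d(drive_circuit, image_pickup_circuit, record_medium, stop_requested):
    """Alternately acquire and record left- and right-eye images (S2-S8 of FIG. 14)."""
    while not stop_requested():                        # S8: repeat until a terminate instruction
        drive_circuit.set_axis("first direction")      # S2: optical axis to the first direction
        record_medium.write(image_pickup_circuit.capture(), eye="left")   # S3, S4
        time.sleep(SWITCH_INTERVAL)

        drive_circuit.set_axis("second direction")     # S5: axis leftward to the second direction
        record_medium.write(image_pickup_circuit.capture(), eye="right")  # S6, S7
        time.sleep(SWITCH_INTERVAL)
```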
The processes above are performed when, for example, a studio setting as illustrated in FIG. 4 is prepared, but 3-dimensional image data can also be generated by CG in a virtual 3-dimensional space. In this case, a 3-dimensional image data generating program stored on, for example, the internal HDD 9 of the computer 2 can be read to the internal semiconductor memory 8 at an instruction from the input console 3, and the CPU 7 performs control based on the program, thereby generating 3-dimensional image data. The 3-dimensional image data generating process by CG is similar to the 3-dimensional image data generating process described with reference to the flowchart in FIG. 14, and a similar process is performed by the program instead of the operations of the drive circuit 23 and the image pickup circuit 24 provided in the image pickup device 1. The generated 3-dimensional image data is stored on, for example, the external storage device 5.
As described above, the captured image data acquired by the image pickup device 1 is the same as the image data acquired by a left eye camera 41c and a right eye camera 42c illustrated in FIG. 13. That is, when the optical axis of shooting of the image pickup device 1 is set in the first direction, the image pickup device 1 acquires an image for a left eye. When the optical axis of shooting is set in the second direction, it acquires an image for a right eye. The captured image data alternately including the image data for a left eye and the image data for a right eye are the same as the image data acquired by the left eye camera 41c and the right eye camera 42c illustrated in FIG. 13.
Therefore, when the captured data thus acquired is stored in the external storage device 5 and displayed on the display 4 under the control of the computer 2 (CPU 7), the display 4 can display a 3-dimensional image with a stereoscopic effect and less fatigue for an observer.
(Second Embodiment)
Described next is the second embodiment of the present invention.
The second embodiment of the present invention is an invention which, in addition to the contents of the first embodiment described above, improves the visibility by changing the relative position of the image for a left eye and the image for a right eye so that the shorter the distance of a subject from the position of the image pickup device 1 (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
FIGS. 16A through 16C are explanatory views of the parallax between an image for a left eye and an image for a right eye obtained by the various methods of generating 3-dimensional data. FIG. 16A is an example of the parallel viewing method, FIG. 16B is an example of the cross-eyed viewing method, and FIG. 16C is an example of the present embodiment. FIG. 16A is a view in which the image for a left eye A illustrated in FIG. 9A is arranged with the image for a right eye B illustrated in FIG. 9B. FIG. 16B is a view in which the image for a left eye illustrated in C of FIG. 10A is arranged with the image for a right eye illustrated in D of FIG. 10B. FIG. 16C is a view in which the image for a left eye illustrated in E of FIG. 11A is arranged with the image for a right eye illustrated in F of FIG. 11B.
In each of the figures above, d indicates the parallax (difference in position between the images of the same subject in the image for a left eye and the image for a right eye) for a subject of a long distance, and D indicates the parallax for a subject of a short distance.
With reference to FIGS. 16A through 16C, when the parallax at a long distance and the parallax at a short distance are compared in the parallel viewing method and the cross-eyed viewing method, the parallax D at a short distance is larger than the parallax d at a long distance, and the difference between them is large in the parallel viewing method in FIG. 16A. In the cross-eyed viewing method in FIG. 16B, both the parallax d at the long distance and the parallax D at the short distance are large, and their lengths are substantially equal. In the method according to the present embodiment in FIG. 16C, the parallax D at the short distance is larger than the parallax d at the long distance, but the difference between them is not large.
When a user gazes at a subject, the lines of sight of both eyes converge on the subject. The point of attention of the user is therefore positioned at the central portion of the visual field, and the subject is observed with almost no parallax between the eyes. Accordingly, when a subject at a long distance is observed by the parallel viewing method, a natural observation can be made; however, when a subject at a short distance is observed, the amount of parallax is large and the observation becomes unnatural. This is especially true when the line of sight is set alternately on a subject at a long distance and a subject at a short distance. For example, when a user drives a racing car in a TV game, the racing car is driven while the user alternately gazes at the background of the racing course at a long distance and at the cars running near the user's car at a short distance. It is then hard for the eyes and the brain of the user to adjust appropriately, which incurs great fatigue.
In the cross-eyed viewing method, on the other hand, there is little difference in parallax between a subject at a long distance and a subject at a short distance, so the problem that occurs in the parallel viewing method does not arise. However, it is still unnatural to gaze at a subject with parallax, and, as described before, the cross-eyed viewing method suffers from the problems of a loss of a subject and an exchange of the right and left images.
In the method of the present embodiment, the difference in parallax is almost as small as in the cross-eyed viewing method, while the problems of a loss of a subject and an exchange of the right and left images are considerably reduced. In the second embodiment, in addition to the contents of the first embodiment, the relative position between the image for a left eye and the image for a right eye is corrected so that the shorter the distance of a subject from the position of the image pickup device (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye. Since a subject at a short distance is often gazed at by the user and moves quickly, the visibility is greatly improved if the parallax at the short distance becomes smaller. Since a subject at a long distance is gazed at less frequently than a subject at a short distance and moves slowly, the brain of the user can cope appropriately even with some parallax.
FIGS. 17A through 17D are explanatory views of the method of improving the parallax of images according to the second embodiment. FIG. 17A illustrates the image for a left eye acquired by the processing method according to the first embodiment, and FIG. 17B illustrates the image for a right eye. FIG. 17C illustrates the image for a left eye of FIG. 17A overlapping the image for a right eye of FIG. 17B, and is an explanatory view of the parallax according to the present embodiment. In this case, as described above, there is the parallax d for a long distance and the parallax D for a short distance.
In the present embodiment, as illustrated in FIG. 17D, the relative position between the image for a left eye and the image for a right eye is changed so that the shorter the distance of a subject from the position of the image pickup device 1 (predetermined position), the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye, thereby improving the visibility.
FIG. 18 is a flowchart for explaining the process according to the present embodiment. First, a process similar to that of the flowchart described with reference to FIG. 14 is performed, and the data of the image for a left eye and the image for a right eye is stored on, for example, the record medium 25. That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S1, S2), and an image for a left eye is acquired and stored on the record medium 25 (S3, S4). The drive circuit 23 is then driven again, the optical axis of shooting of the image pickup device 1 is moved in the second direction, and an image for a right eye is acquired and stored on the record medium 25 (S5 through S8). Then, the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6, and stored in the external storage device 5.
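A rough Python sketch of steps S1 through S8 is shown below. The interfaces for the drive circuit 23, the image pickup circuit 24, and the record medium 25 are hypothetical stand-ins introduced only to make the order of operations concrete; this is not the actual device firmware.

```python
def capture_stereo_sequence(drive_circuit, image_pickup_circuit, record_medium, num_pairs):
    # Alternately acquire and store left-eye and right-eye images, as in FIG. 18.
    for _ in range(num_pairs):
        drive_circuit.set_direction("first")       # S1, S2: move the optical axis to the first direction
        left = image_pickup_circuit.capture()      # S3: acquire the image for a left eye
        record_medium.store("L", left)             # S4: store it on the record medium 25
        drive_circuit.set_direction("second")      # S5, S6: move the optical axis to the second direction
        right = image_pickup_circuit.capture()     # S7: acquire the image for a right eye
        record_medium.store("R", right)            # S8: store it on the record medium 25
```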
Next, the relative position of the image for a left eye and the image for a right eye is changed so that a subject at a shorter distance from the predetermined position has smaller parallax between the image for a left eye and the image for a right eye than a subject at a longer distance (S9). For example, the data of the image for a left eye and the image for a right eye stored in the external storage device 5 is read, the parallax d for a long distance and the parallax D for a short distance are corrected from the state illustrated in FIG. 17C toward the state illustrated in FIG. 17D, and the result is stored in the external storage device 5 again.
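One simple way to realize a change of relative position such as that of step S9 is a uniform horizontal shift of one image with respect to the other. The sketch below assumes the images are numpy arrays and that shift_px has been chosen (by whatever means) so that the near-distance parallax D of FIG. 17C is reduced toward the state of FIG. 17D; it is a minimal illustration, not the method fixed by the embodiment.

```python
import numpy as np

def correct_relative_position(left_img, right_img, shift_px):
    # Shift the right-eye image horizontally by shift_px pixels (padding the
    # exposed border with zeros) so that near subjects end up with smaller
    # parallax; images are assumed to have shape (height, width, channels).
    shifted = np.zeros_like(right_img)
    if shift_px > 0:
        shifted[:, shift_px:] = right_img[:, :-shift_px]
    elif shift_px < 0:
        shifted[:, :shift_px] = right_img[:, -shift_px:]
    else:
        shifted[:] = right_img
    return left_img, shifted
```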
The process above can also be performed when 3-dimensional image data is generated by CG as in the first embodiment, by applying the same correction of the relative position of the image for a left eye and the image for a right eye to the 3-dimensional image data generated by CG.
When the 3-dimensional image thus generated is displayed on the display 4 by driving the computer 2 (CPU 7) in response to an instruction from, for example, the input console 3, the parallax of a person at a short distance is small, so no unnatural effect arises even when the person is gazed at and observed. In addition, the parallax for a short distance and for a long distance becomes small, thereby producing an easily observable image.
In the explanation of the embodiment above, the data in which the parallax d for a long distance and the parallax D for a short distance have been corrected is stored in the external storage device 5 again. However, the system can also be configured so that the data of the image for a left eye and the image for a right eye generated by the actual shooting process or by CG, which is stored in the external storage device 5 before the correction, is read, the parallax is then corrected, and the resultant data is displayed on the display 4.
Furthermore, in the explanation of the embodiment above, the parallax d for a long distance and the parallax D for a short distance are corrected after the data of the image for a left eye and the image for a right eye is stored in the external storage device 5. Alternatively, the correction of the parallax d for a long distance and the parallax D for a short distance can be performed each time the data of an image for a left eye and an image for a right eye is acquired while the image pickup device 1 captures a subject, and the corrected captured image data can then be stored in the external storage device 5.
(Third Embodiment)
Described next is the third embodiment of the present invention.
The third embodiment of the present invention superposes an image for a left eye and an image for a right eye, which are 2-dimensional images having parallax to each other generated by other methods, respectively on the image for a left eye and the image for a right eye obtained by the methods according to the first and second embodiments.
FIGS. 19A through 19C illustrate the concept of the present embodiment. FIG. 19A illustrates the state in which the 3-dimensional image obtained by the methods according to the first and second embodiments is displayed on the screen. In this state, a natural and easily visible image can be observed, but a subject that pops out of the screen cannot be observed. On the other hand, FIG. 19B illustrates the state in which a subject (for example, an approaching ball) is observed with a setting that emphasizes the perspective effect by another method, for example, the cross-eyed viewing method or the parallel viewing method. In the present embodiment, the images displayed on the screens in FIGS. 19A and 19B are superposed, thereby generating the 3-dimensional image in FIG. 19C.
FIG. 20 is a flowchart for explaining the process according to the present embodiment. First, a process similar to that of the flowchart described with reference to FIG. 14 is performed, and the data of an image for a left eye and an image for a right eye is stored on, for example, the record medium 25. That is, the drive circuit 23 is driven, the optical axis of shooting of the image pickup device 1 is moved in the first direction (S1, S2), and the image for a left eye is acquired and stored on the record medium 25 (S3, S4). The drive circuit 23 is then driven again, the optical axis of shooting of the image pickup device 1 is moved in the second direction, and an image for a right eye is acquired and stored on the record medium 25 (S5 through S8). Then, the captured image data stored on the record medium 25 is transmitted to the computer 2 through the network 6, and stored in the external storage device 5.
Next, the image for a left eye and the image for a right eye, which are 2-dimensional images generated by another method and having parallax to each other, are respectively superposed on the image for a left eye and the image for a right eye acquired in the process above (S10). For example, the data of the image for a left eye and the image for a right eye stored in the external storage device 5 is read, the image for a left eye and the image for a right eye generated by another method and stored in the same external storage device 5 are respectively superposed on it, and the result is stored in the external storage device 5 again.
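As a sketch of step S10, and under the assumption that both image pairs are available as float arrays of the same size, the superposition can be expressed as a per-pixel alpha blend. The alpha mask (for example, the silhouette of the approaching ball of FIG. 19B rendered with emphasized parallax) is an illustrative assumption rather than something specified by the embodiment.

```python
import numpy as np

def superpose_pair(base_left, base_right, overlay_left, overlay_right, alpha_mask):
    # All inputs are assumed to be float numpy arrays of the same shape, with
    # alpha_mask holding per-pixel overlay opacities in [0, 1].
    out_left = overlay_left * alpha_mask + base_left * (1.0 - alpha_mask)
    out_right = overlay_right * alpha_mask + base_right * (1.0 - alpha_mask)
    return out_left, out_right
```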
The process above can also be performed, as in the second embodiment, when 3-dimensional image data is generated by CG, by superposing the data of the image for a left eye and the image for a right eye generated by another method on the 3-dimensional image data generated by CG.
Therefore, when the 3-dimensional image thus generated is displayed on the display 4 by driving the computer 2 (CPU 7) in response to an instruction from the input console 3, a more visible and realistic 3-dimensional image can be displayed.
In the explanation of the embodiment, the data obtained by superposing the image for a left eye and the image for a right eye generated by another method is stored in the external storage device 5 again. However, it is also possible to read the data of the image for a left eye and the image for a right eye generated by the actual shooting process or by CG and stored in the external storage device 5 before the superposition, superpose on it the data of the image for a left eye and the image for a right eye generated by another method, and display the resultant data on the display 4.
Furthermore, in the explanation of the embodiment, the data of the image for a left eye and the image for a right eye, which are 2-dimensional images generated by another method and having parallax to each other, is superposed after the data of the image for a left eye and the image for a right eye is stored in the external storage device 5. Alternatively, each time the data of an image for a left eye and an image for a right eye is acquired while the image pickup device 1 captures a subject, the data of an image for a left eye and an image for a right eye generated by another method can be superposed on it, and the resultant captured image data can be stored in the external storage device 5.
Described next is a variation example of the image pickup device 1 which can be used in the first through third embodiments.
Although the configuration of the image pickup device 1 is described above with reference to FIGS. 5 through 7, the configuration of the image pickup device 1 is not limited to those examples. In the image pickup device 1 illustrated in FIGS. 21 and 22, for example, the barrel 22 is not provided; instead, actuators are attached to the image pickup optical system 20 and the image pickup element 21, or to only one of them.
For example, FIG. 21 illustrates an image pickup device 1 with this configuration. The actuators 26a and 26b are attached to the image pickup optical system 20, the actuators 26c and 26d are attached to the image pickup element 21, and the actuators 26a through 26d are driven cooperatively, thereby providing an image pickup device 1 that can substantially move the optical axis with a simple configuration and without the barrel 22.
In addition, FIG. 22 illustrates an image pickup device 1 in which the actuators 26a and 26b are attached only to the image pickup optical system 20 and are driven according to the drive signal from the drive circuit 23. FIG. 23 illustrates an image pickup device 1 in which the actuators 26c and 26d are attached only to the image pickup element 21 and are driven by the drive signal from the drive circuit 23. In these cases as well, an image pickup device 1 capable of substantially moving the optical axis with a simple configuration and without the barrel 22 is provided.
Furthermore, FIG. 24 is a variation example of the image pickup device 1 used in the present embodiment. An adapter 45 is attached to the image pickup device 1, and an optical axis transition mechanism 46 is provided in the adapter 45. The image pickup device 1 provided with the adapter 45 can be regarded as one image pickup device. In this case, the optical axis transition mechanism 46 is configured by reflecting optical systems 47 and 48, and actuators 26e and 26f provided at both ends of the reflecting optical system 47. The actuators 26e and 26f are driven by the drive signal provided from the drive circuit 23, and the reflecting optical system 47 is rotated (vibrated) about a support point 47a within a predetermined angular range. The body of the image pickup device 1 is configured with the image pickup optical system 20, the image pickup element 21, the image pickup circuit 24, and the record medium 25, and the optical information of the subject reflected by the reflecting optical systems 47 and 48 in the adapter 45 is formed on the image pickup element 21 by the image pickup optical system 20, thereby acquiring the data of the image for a left eye and the image for a right eye.
In addition, FIG. 25 illustrates a device in which the configuration of the adapter 45 is incorporated into the image pickup device 1. In this image pickup device 1, the reflecting optical systems 47 and 48 and the actuators 26e and 26f provided at both ends of the reflecting optical system 47 are provided; the actuators 26e and 26f are driven by the drive signal from the drive circuit 23, the reflecting optical system 47 is rotated about the support point 47a, the image pickup optical system 20 forms the optical information about the subject on the image pickup element 21, and the data of the image for a left eye and the image for a right eye is acquired. An L-shaped optical path can also be configured without the reflecting optical system 48.
FIGS. 26 and 27 illustrate variation examples of the image pickup device 1 that moves the optical axis of shooting using double refraction. For example, when light enters a material having the property of double refraction as illustrated in FIG. 28, the refractive index varies depending on the plane of polarization of the incident light, and the incident light is split into two beams. Double refraction is found in the six crystal systems other than the cubic system among the seven existing crystal systems; among them, calcite and quartz are well known.
Using a material having the double refraction property (double refractive optical element) 50, two light beams 51a and 51b that enter from different directions and have different planes of polarization can be led into the same direction 51. That is, when the double refractive optical element 50 is viewed from the output side of the light beams, two directions can be observed, which means that two optical axes of shooting exist simultaneously.
As illustrated in FIG. 26, a polarizing plate 52 is placed in front of the double refractive optical element 50, and a rotation mechanism 53 rotates the polarizing plate 52 with respect to the double refractive optical element 50, thereby switching the polarization of the light incident on the double refractive optical element 50.
With the above-mentioned configuration, the direction of the optical axis of shooting of the image pickup optical system 20 can be switched between the two directions above. That is, FIG. 26 illustrates an image pickup device capable of capturing 3-dimensional image data with the body of an image pickup device and an adapter 54.
FIG. 27 illustrates a configuration in which the adapter 54 is incorporated into the image pickup device 1. The image pickup device 1 includes the polarizing plate 52, the double refractive optical element 50, and the rotation mechanism 53 for switching the polarization of the light entering the double refractive optical element 50. By rotating the polarizing plate 52, the direction of the image pickup optical axis of the image pickup optical system 20 can be switched between the two directions.
Furthermore, FIGS. 29 and 30 also illustrate variation examples of the image pickup device 1 using a polarizing conversion element 27 that rotates the plane of polarization of the incident light according to the drive signal from the drive circuit 23. FIG. 29 is an example in which an adapter 55 provided with the polarizing conversion element 27 is attached to the image pickup device 1, and FIG. 30 is an example in which the polarizing conversion element 27 is provided in the body of the image pickup device 1.
That is, in the example illustrated in FIG. 29, the adapter 55 sequentially has a polarizing plate 28, the polarizing conversion element 27, and a double refractive optical element 50; the polarizing conversion element 27 rotates the plane of polarization according to the drive signal from the drive circuit 23, thereby moving the optical axis of shooting. As the polarizing conversion element 27, for example, a twisted nematic liquid crystal can be used.
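A very small sketch of how a drive signal might select between the two optical axes is given below. The method names are hypothetical stand-ins for the drive circuit 23 and the polarizing conversion element 27, and the 0/90-degree switching of the plane of polarization is only an illustrative assumption.

```python
def switch_optical_axis(drive_circuit, polarizing_conversion_element, use_second_direction):
    # Rotating the plane of polarization selects which of the two refraction
    # paths of the double refractive optical element 50 reaches the image
    # pickup element 21.  apply_rotation() is a hypothetical interface.
    angle_deg = 90 if use_second_direction else 0   # illustrative 0/90-degree switching
    drive_circuit.apply_rotation(polarizing_conversion_element, angle_deg)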
The example illustrated in FIG. 30 has the configuration of the adapter 55 arranged in the body of the image pickup device 1. As in the example illustrated in FIG. 29, the polarizing conversion element 27 rotates the plane of polarization according to the drive signal from the drive circuit 23, and the optical axis of shooting can be moved. The image pickup devices 1 illustrated in FIGS. 29 and 30 acquire the data of the image for a left eye and the image for a right eye of the 3-dimensional image based on the image formed on the image pickup element 21 by the image pickup optical system 20.
FIGS. 31A and 31B illustrate an example of an image pickup device capable of automatically switching the transition direction of the optical axis by the optical axis transition mechanism depending on the orientation of the image pickup device 1. The image pickup device 1 is provided with an attitude sensor 57, for example a gravity sensor, and the drive circuit 23 drives the optical axis transition mechanism based on the detection result of the attitude sensor 57.
For example, a first pair of piezoelectric elements 56a and 56b is provided on the left and right sides of the image pickup optical system 20, and a second pair of piezoelectric elements 56c and 56d is provided on the upper and lower sides of the image pickup optical system 20. When the attitude sensor 57 detects that the image pickup device 1 is in the state illustrated in FIG. 31A (horizontal layout), the drive circuit 23 drives the horizontally arranged pair of piezoelectric elements 56a and 56b, and does not drive the vertically arranged pair of piezoelectric elements 56c and 56d. On the other hand, when the attitude sensor 57 detects that the image pickup device 1 is in the state illustrated in FIG. 31B (vertical layout), the drive circuit 23 drives the second pair of piezoelectric elements 56c and 56d, which now lie horizontally, and does not drive the pair of piezoelectric elements 56a and 56b, which now lie vertically. With this configuration, the orientation of the image pickup device 1 is detected, the pair of piezoelectric elements 56a and 56b or 56c and 56d to be driven is automatically selected, and the optical axis of shooting is automatically switched.
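The selection logic reduces to choosing whichever pair of piezoelectric elements currently lies horizontally. The sketch below assumes hypothetical interfaces for the attitude sensor 57 and the drive circuit 23 and is only an illustration of the decision, not the actual control circuit.

```python
def drive_for_current_attitude(attitude_sensor, drive_circuit):
    # attitude_sensor.is_horizontal(), drive_circuit.drive_pair() and
    # drive_circuit.stop_pair() are hypothetical interfaces.
    if attitude_sensor.is_horizontal():              # FIG. 31A: horizontal layout
        drive_circuit.drive_pair("56a", "56b")       # left/right pair moves the optical axis
        drive_circuit.stop_pair("56c", "56d")
    else:                                            # FIG. 31B: vertical layout
        drive_circuit.drive_pair("56c", "56d")       # upper/lower pair now lies horizontally
        drive_circuit.stop_pair("56a", "56b")
```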
(Fourth Embodiment)
Described next is the fourth embodiment of the present invention.
The present embodiment relates to the invention of displaying a 3-dimensional image generated according to the first through third embodiments on the display 4, and allowing an observer to observe the image.
FIGS. 32A and 32B are schematic diagrams illustrating an example of a 3-dimensional image observing system for observing the 3-dimensional image. In FIGS. 32A and 32B, a reproduction device 49 stores and reproduces the 3-dimensional image data. The 3-dimensional image data reproduced by the reproduction device 49 is output to the display 4 such that the image data for a left eye and the image data for a right eye are output alternately. The reproduction device 49 also outputs a timing signal to a timing indicator 4a provided on the display 4, and the image data for a left eye and the image data for a right eye are alternately input to the display 4 in synchronization with the timing signal.
In addition, the timing indicator 4a radiates infrared light modulated according to the timing signal toward an image selection device 60. The image selection device 60 is a glasses-type device worn by the observer who observes the 3-dimensional image displayed on the display 4, and is provided with a sensor 60a for detecting the infrared light at its side portion.
The image selection device 60 receives the infrared light with the sensor 60a. At the timing when the image for a left eye is displayed on the display 4, it places a liquid crystal shutter 59a positioned before a left eye 58a of the observer in a transmission state and places a liquid crystal shutter 59b positioned before a right eye 58b of the observer in a non-transmission state (FIG. 32A). On the other hand, at the timing when the image for a right eye is displayed on the display 4, the liquid crystal shutter 59a positioned before the left eye 58a is placed in the non-transmission state, and the liquid crystal shutter 59b positioned before the right eye 58b of the observer is placed in the transmission state (FIG. 32B). By repeating the processes above, the image for a left eye is observed only by the left eye of the observer, and the image for a right eye is observed only by the right eye of the observer.
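The shutter control amounts to a loop that opens only the shutter corresponding to the image currently displayed. The Python sketch below uses hypothetical interfaces for the sensor 60a and the liquid crystal shutters 59a and 59b, and is only an illustration of the synchronization, not the actual device logic.

```python
def run_image_selection_device(sensor, left_shutter, right_shutter):
    # sensor.wait_for_next_field() is a hypothetical call that blocks until the
    # next infrared timing pulse from the timing indicator 4a and returns "L"
    # or "R" depending on which image is about to appear on the display 4.
    while True:
        eye = sensor.wait_for_next_field()
        left_shutter.set_transmissive(eye == "L")    # open only for the image for a left eye
        right_shutter.set_transmissive(eye == "R")   # open only for the image for a right eye
```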
However, this is only an example. The image for a left eye can be projected only to the left eye by an independent projection optical system, and the image for a right eye only to the right eye by another independent projection optical system. The above-mentioned anaglyph method can also be used.
In addition, the polarization directions of the image for a left eye and the image for a right eye can be differentiated so that the image for a left eye passes only through the left-eye side of polarizing filters provided before the left eye and the right eye of the observer, and the image for a right eye passes only through the right-eye side of the filters. Furthermore, an optical element for changing the projection direction of the image for a left eye and the image for a right eye can be provided before the display 4 to project only the image for a left eye to the left eye of the observer and only the image for a right eye to the right eye.
FIG. 33 is a flowchart for explaining the method of observing the 3-dimensional image. In this observing method, the 3-dimensional image data recorded on, for example, a record medium is read. As described above, the image for a left eye obtained by an observation in the first direction from a predetermined observation position is projected to the left eye of the observer and not projected to the right eye (step (hereinafter expressed as ST) 1). Next, the image for a right eye obtained by an observation in the second direction leftward from the first direction from the same predetermined observation position is projected to the right eye of the observer, and not projected to the left eye (ST 2). The above-mentioned processes are then repeated, so that the image for a left eye is observed only by the left eye of the observer, and the image for a right eye only by the right eye of the observer.
In the explanation above, the process (ST 1) is performed first, but the process (ST 2) can be performed first.
FIG. 34 is a flowchart for the case in which the relative position adjustment between the image for a left eye and the image for a right eye described with reference to the second embodiment, and the superposition with a 3-dimensional stereoscopic image generated by another system described with reference to the third embodiment, are performed at the stage of image observation. That is, the image for a left eye obtained by an observation in the first direction from a predetermined position is acquired (step (hereinafter expressed as STP) 1), and then the image for a right eye obtained by an observation in the second direction leftward from the first direction from the same predetermined position is acquired (STP 2).
Next, the relative position adjustment between the image for a left eye and the image for a right eye described with reference to the second embodiment is performed (STP 3), and the superposing process with the 3-dimensional stereoscopic image generated by another system described with reference to the third embodiment is performed (STP 4). Then, the image for a left eye is projected to the left eye of the observer but not to the right eye, and the image for a right eye is projected to the right eye of the observer but not to the left eye (STP 5).
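Expressed as a pipeline, the flow of FIG. 34 can be sketched as follows; each step is passed in as a hypothetical callable so that only the ordering of STP 1 through STP 5 is illustrated, not any particular implementation of the individual steps.

```python
def observe_3d_image(acquire_left, acquire_right, adjust_relative_position,
                     superpose_other_system, project):
    left = acquire_left()                                  # STP 1
    right = acquire_right()                                # STP 2
    left, right = adjust_relative_position(left, right)    # STP 3 (second embodiment)
    left, right = superpose_other_system(left, right)      # STP 4 (third embodiment)
    project(left, eye="left")                              # STP 5: only to the left eye
    project(right, eye="right")                            #        only to the right eye
```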
By performing the processes above, the observer can observe a 3-dimensional image in which the relative position between the image for a left eye and the image for a right eye has been adjusted as described with reference to the second embodiment, and which has furthermore been superposed with the 3-dimensional stereoscopic image generated by another system as described with reference to the third embodiment.
According to the present invention, as described above with reference to the embodiments, a 3-dimensional image observable with less fatigue than in the prior art can be obtained. In addition, since the image data for a left eye and the image data for a right eye are obtained by observations in two directions from the same position, only one image pickup device is required. A conventional camera for 3-dimensional images requires one optical system for capturing the image for a left eye and a separate optical system for capturing the image for a right eye, but in the present invention, a 3-dimensional camera is configured with only one optical system.

Claims (23)

  1. A 3-dimensional image data generating method which generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    a left eye image data acquiring process of acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and
    a right eye image data acquiring process of acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  2. The method according to claim 1, wherein
    an angle made by the first direction and the second direction is 4 degrees or less.
  3. The method according to claim 1, further comprising
    an image relative position changing process of changing a relative position between the image for a left eye and the image for a right eye so that the shorter the distance of a subject from the predetermined position, the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  4. The method according to claim 1, further comprising
    an image superposing process of superposing image data for a left eye and image data for a right eye as 2-dimensional images having parallax to each other generated in a method different from the left eye image data acquiring process and the right eye image data acquiring process respectively to image data for a left eye and image data for a right eye generated by the left eye image data acquiring process and the right eye image data acquiring process.
  5. A 3-dimensional image data generating system which generates 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    a left eye image data acquiring device acquiring image data for a left eye obtained by an observation in a first direction from a predetermined position; and
    a right eye image data acquiring device acquiring image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  6. The system according to claim 5, wherein
    an angle made by the first direction and the second direction is 4 degrees or less.
  7. The system according to claim 5, further comprising
    an image relative position changing device changing a relative position between the image for a left eye and the image for a right eye so that the shorter the distance of a subject, the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  8. The system according to claim 5, further comprising
    an image superposing device superposing image data for a left eye and image data for a right eye as 2-dimensional images having parallax to each other generated in a method different from the left eye image data acquiring process and the right eye image data acquiring process respectively to image data for a left eye and image data for a right eye generated by the left eye image data acquiring process and the right eye image data acquiring process.
  9. A 3-dimensional image data generating program which generates, using a computer, 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    a left eye image data acquiring step of directing the computer to acquire image data for a left eye obtained by an observation in a first direction from a predetermined position; and
    a right eye image data acquiring step of directing the computer to acquire image data for a right eye obtained by an observation in a second direction leftward from the first direction from the same predetermined position.
  10. The program according to claim 9, wherein
    an angle made by the first direction and the second direction is 4 degrees or less.
  11. The program according to claim 9, further comprising
    an image relative position changing step of directing a computer to change a relative position between the image for a left eye and the image for a right eye so that the shorter the distance of a subject from the predetermined position, the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  12. The program according to claim 9, further comprising
    an image superposing step of directing a computer to superpose image data for a left eye and image data for a right eye as 2-dimensional images having parallax to each other generated in a method different from the left eye image data acquiring process and the right eye image data acquiring process respectively to image data for a left eye and image data for a right eye generated by the left eye image data acquiring process and the right eye image data acquiring process.
  13. A 3-dimensional image observing method which allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    a left eye image projecting process of projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and
    a right eye image projecting process of projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the same predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
  14. The method according to claim 13, further comprising
    an image relative position changing process of changing a relative position between the image for a left eye and the image for a right eye so that the shorter the distance of a subject from the predetermined position, the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye, before performing the left eye image projecting process and the right eye image projecting process.
  15. A 3-dimensional image observing system which allows an observer to observe 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    a left eye image projecting device projecting an image for a left eye obtained by observing a subject in a first direction from a predetermined position to the left eye of an observer but not projecting the image to the right eye of the observer; and
    a right eye image projecting device projecting an image for a right eye obtained by observing the subject in a second direction leftward from the first direction from the predetermined position to the right eye of the observer but not projecting the image to the left eye of the observer.
  16. The system according to claim 15, further comprising
    an image relative position changing device changing a relative position between the image for a left eye and the image for a right eye so that the shorter the distance of a subject from the predetermined position, the smaller the parallax of the subject captured in the image for a left eye and the image for a right eye.
  17. A 3-dimensional image pickup device which captures a subject and obtains 3-dimensional image data for a stereoscopic view including an image for a left eye and an image for a right eye as 2-dimensional images having parallax to each other, comprising:
    an optical image pickup system;
    an image pickup element converting an image formed by the optical image pickup system into an electric signal;
    an optical axis transition mechanism changing directions of an optical axis of the optical image pickup system between a first direction and a second direction leftward from the first direction;
    an image pickup circuit capturing the image for a left eye when the optical axis of the optical image pickup system is in the first direction, and capturing the image for a right eye when the optical axis of the optical image pickup system is in the second direction; and
    a record medium recording the images captured by the image pickup circuit.
  18. The device according to claim 17, further comprising
    a barrel to which the optical image pickup system and the image pickup element are directly or indirectly attached, wherein the optical axis transition mechanism is an actuator for moving the barrel by predetermined degrees.
  19. The device according to claim 17, wherein
    the optical axis transition mechanism moves at least one of the optical image pickup system and the image pickup element.
  20. The device according to claim 17, wherein
    the optical axis transition mechanism is an actuator for moving a reflecting optical system located on an optical axis of the optical image pickup system by predetermined degrees.
  21. The device according to claim 17, wherein
    the optical axis transition mechanism comprises: a double refraction optical element; a polarization element provided in front of the double refraction optical element; and a rotation mechanism for switching polarization of light incident to the double refraction optical element by rotating the polarization element with respect to the double refraction optical element.
  22. The device according to claim 17, wherein
    the optical axis transition mechanism comprises: a double refraction optical element; a polarization element provided in front of the double refraction optical element; and an optical element provided between the polarization element and the double refraction optical element for rotating a plane of polarization according to an electric signal.
  23. The device according to claim 17, further comprising
    an attitude sensor for detecting an attitude of the 3-dimensional image pickup device, wherein the optical axis transition mechanism holds a moving direction of an optical axis clockwise and counterclockwise using a detection result of the attitude sensor regardless of the attitude of the 3-dimensional image pickup device.
PCT/JP2010/007620 2010-05-31 2010-12-29 3-dimensional image data generating method WO2011151872A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10852483.6A EP2577394A4 (en) 2010-05-31 2010-12-29 3-dimensional image data generating method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010124083A JP5627930B2 (en) 2010-05-31 2010-05-31 3D image data generation method
JP2010-124083 2010-05-31

Publications (1)

Publication Number Publication Date
WO2011151872A1 true WO2011151872A1 (en) 2011-12-08

Family

ID=45066274

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/007620 WO2011151872A1 (en) 2010-05-31 2010-12-29 3-dimensional image data generating method

Country Status (3)

Country Link
EP (1) EP2577394A4 (en)
JP (1) JP5627930B2 (en)
WO (1) WO2011151872A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5889719B2 (en) 2012-05-31 2016-03-22 カシオ計算機株式会社 Imaging apparatus, imaging method, and program
TWI766316B (en) * 2020-07-22 2022-06-01 財團法人工業技術研究院 Light transmitting display system, image output method thereof and processing device thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0630446A (en) * 1992-07-13 1994-02-04 Ricoh Co Ltd Stereoscopic image recorder
JP2001042464A (en) * 1999-08-03 2001-02-16 Matsushita Electric Ind Co Ltd Method and device for producing stereoscopic image
JP2003005311A (en) * 2001-06-19 2003-01-08 Olympus Optical Co Ltd Stereoscopic image photographing device
JP2008154027A (en) * 2006-12-19 2008-07-03 Seiko Epson Corp Photographing device, photographing method, and program
JP2010041381A (en) * 2008-08-05 2010-02-18 Nikon Corp Electronic camera, stereo image generation method, and stereo image generation system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3190220B2 (en) * 1994-12-20 2001-07-23 シャープ株式会社 Imaging device
JPH08194274A (en) * 1995-01-13 1996-07-30 Olympus Optical Co Ltd Stereoscopic image pickup device
JP2006005686A (en) * 2004-06-18 2006-01-05 Toshiba Corp Picture photographing apparatus with driving mount base and camera monitoring system
JP2007288229A (en) * 2004-08-09 2007-11-01 Sharp Corp Imaging device
JP4630149B2 (en) * 2005-07-26 2011-02-09 シャープ株式会社 Image processing device
TWI336810B (en) * 2006-12-21 2011-02-01 Altek Corp Method of generating image data having parallax using a digital image-capturing device and digital image-capturing device
JP2009021761A (en) * 2007-07-11 2009-01-29 Fujifilm Corp Imaging device unit, imaging apparatus, imaging device positioning implement, imaging device mounting method and imaging device
JP5060231B2 (en) * 2007-09-21 2012-10-31 株式会社バンダイナムコゲームス Image generating method, stereoscopic printed matter, manufacturing method and program
JP2009075376A (en) * 2007-09-21 2009-04-09 Nikon Corp Optical axis adjusting device and imaging apparatus
JP5115799B2 (en) * 2008-01-21 2013-01-09 ソニー株式会社 Image processing apparatus and method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0630446A (en) * 1992-07-13 1994-02-04 Ricoh Co Ltd Stereoscopic image recorder
JP2001042464A (en) * 1999-08-03 2001-02-16 Matsushita Electric Ind Co Ltd Method and device for producing stereoscopic image
JP2003005311A (en) * 2001-06-19 2003-01-08 Olympus Optical Co Ltd Stereoscopic image photographing device
JP2008154027A (en) * 2006-12-19 2008-07-03 Seiko Epson Corp Photographing device, photographing method, and program
JP2010041381A (en) * 2008-08-05 2010-02-18 Nikon Corp Electronic camera, stereo image generation method, and stereo image generation system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2577394A4 *

Also Published As

Publication number Publication date
JP2011250352A (en) 2011-12-08
EP2577394A1 (en) 2013-04-10
EP2577394A4 (en) 2017-05-31
JP5627930B2 (en) 2014-11-19

Similar Documents

Publication Publication Date Title
JP4635403B2 (en) Stereoscopic image creation method and apparatus
JP5641200B2 (en) Image processing apparatus, image processing method, image processing program, and recording medium
US8780185B2 (en) Image pickup apparatus having a display controlled using interchangeable lens information and/or finder information
SG186947A1 (en) Variable three-dimensional camera assembly for still photography
CN102165785A (en) Three-dimensional imaging device, method, and program
KR20110114620A (en) Stereoscopic imaging apparatus and method
US20150301313A1 (en) Stereoscopic lens for digital cameras
CN103329549B (en) Dimensional video processor, stereoscopic imaging apparatus and three-dimensional video-frequency processing method
JP5638791B2 (en) Imaging device
JP6907616B2 (en) Stereoscopic image imaging / display combined device and head mount device
EP2566166A1 (en) Three-dimensional imaging device
WO2011151872A1 (en) 3-dimensional image data generating method
TWI505708B (en) Image capture device with multiple lenses and method for displaying stereo image thereof
CN106534831A (en) Three-dimensional video shooting method and user terminal
JP2006267767A (en) Image display device
JPH0784326A (en) Stereoscopic image photographing display system
JP2002218501A (en) Image pickup device
JP2004258594A (en) Three-dimensional image display device realizing appreciation from wide angle
JP2005115251A (en) Stereoscopic video photographing and reproducing apparatus
JP2002344997A (en) Edit method for stereoscopic video signal, and optical adaptor for a video camera for stereoscopic video photographing
JP5222718B2 (en) Stereoscopic image reproduction apparatus, stereoscopic image reproduction program, and imaging apparatus
JP4496122B2 (en) Stereoscopic video shooting and playback device
JPH07134345A (en) Stereoscopic image pickup device, attachment for picking up stereoscopic image and stereoscopic image enjoying device
JP2002341289A (en) Stereoscopic video observation device
JP6036784B2 (en) Image processing apparatus, image processing method, image processing program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10852483

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2010852483

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010852483

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE