WO2007029686A1 - 3D image recording/reproduction system - Google Patents

3D image recording/reproduction system

Info

Publication number
WO2007029686A1
WO2007029686A1 PCT/JP2006/317531
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
angle
image data
recording
Prior art date
Application number
PCT/JP2006/317531
Other languages
English (en)
Japanese (ja)
Inventor
Ryuji Kitaura
Original Assignee
Sharp Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha filed Critical Sharp Kabushiki Kaisha
Priority to JP2007534423A priority Critical patent/JP4619412B2/ja
Publication of WO2007029686A1 publication Critical patent/WO2007029686A1/fr

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/18Stereoscopic photography by simultaneous viewing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding

Definitions

  • the present invention relates to a system for recording and reproducing stereoscopic images.
  • left-eye image and right-eye image with binocular parallax are prepared and projected to the left and right eyes, respectively.
  • 3D means three-dimensional or stereoscopic.
  • 2D means two-dimensional.
  • Stereoscopic image data is 3D image data, and normal two-dimensional image data is 2D image data.
  • FIG. 18 is a conceptual diagram for explaining the parallax barrier method.
  • FIG. 18 (a) is a diagram illustrating the principle of the occurrence of parallax.
  • FIG. 18 (b) is a diagram showing a screen displayed by the parallax barrier method.
  • the image display panel 100 displays an image in which the left-eye image and the right-eye image are arranged alternately every other pixel in the horizontal direction as shown in Fig. 18 (b).
  • the left-eye image reaches only the left eye 102, and the right-eye image reaches only the right eye 103, so stereoscopic observation is possible.
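The pixel-column interleaving of Fig. 18(b) can be sketched as follows (an illustrative example using NumPy; the array layout and the convention that even columns carry the left-eye image are assumptions, not stated in the text):

```python
import numpy as np

def interleave_for_barrier(left, right):
    """Alternate the columns of the left-eye and right-eye images one
    pixel at a time in the horizontal direction, as required by the
    parallax barrier display of Fig. 18(b). Even columns are taken from
    the left-eye image, odd columns from the right-eye image."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[:, 0::2] = left[:, 0::2]
    out[:, 1::2] = right[:, 1::2]
    return out

# Tiny 2x4 grayscale example: left image all 1s, right image all 2s.
left = np.ones((2, 4), dtype=np.uint8)
right = np.full((2, 4), 2, dtype=np.uint8)
mixed = interleave_for_barrier(left, right)
```

During playback, the same column assignment run in reverse recovers the two per-eye images from the composite of Fig. 19(c).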
  • FIG. 19 is a conceptual diagram showing an example of the recording data format of such a composite image.
  • the left-eye image 104 shown in FIG. 19 (a) and the right-eye image 105 shown in FIG. 19 (b) are arranged side by side to create and record one composite image 106 shown in FIG. 19 (c).
  • During playback, this composite image 106 is rearranged into an image suitable for the display format, as shown in Fig. 18 (b).
  • In Patent Document 1, the observer observes from an angle of 90 degrees with respect to the display surface on which the image is displayed.
  • In Patent Document 2, the display surface is arranged horizontally and the observer observes it from an oblique direction.
  • Patent Document 2 describes a method of capturing a left-eye image and a right-eye image from an obliquely upward direction with respect to an object placed on a reference plane, and then correcting the depth-direction perspective produced in each captured image; this method will be briefly described with reference to FIGS. 20 to 24.
  • FIG. 20 is a diagram showing how the left-eye image and the right-eye image are captured at this time.
  • a reference plane 108 having a horizontal width H and a vertical width V is set horizontally on a horizontal plane 107.
  • An object 109 is placed on the reference plane 108, and a camera 110 for taking a right-eye image and a camera 111 for taking a left-eye image are respectively set obliquely above the reference plane 108 and the camera interval is set.
  • the camera 110 and the camera 111 are directed toward the object 109 to take an image.
  • the lines of sight of the camera 110 and the camera 111 are set to have the same angle ⁇ 1 with respect to the reference plane 108, respectively.
  • FIG. 21 is a diagram showing a left-eye image and a right-eye image taken at this time.
  • FIG. 21 (a) is an image for the left eye
  • FIG. 21 (b) is an image for the right eye
  • a reference plane 108 and an object 109 are captured in each image.
  • the four corner points of the reference plane 108 in FIG. 21 (a) are Pl, P2, P3, and P4, respectively
  • the four corner points of the reference plane 108 in FIG. 21 (b) are P5, P6, P7, and P8, respectively.
  • FIG. 22 is a diagram showing a state in which the perspective is corrected for the left-eye image.
  • the reference plane 108 is cut out from Fig. 21 (a) as shown in Fig. 22 (a), and the points P1, P2, P3, and P4 of the cut-out reference plane 108 are deformed and developed so that they become P9, P10, P11, and P12 of the left-eye image 113 shown in Fig. 22 (b).
  • the aspect ratio of the image in Fig. 22 (b) is set to H:V, that is, the same aspect ratio as that of the actual reference plane 108.
  • FIG. 23 is a diagram showing a state in which the perspective is corrected for the right-eye image.
  • the reference plane 108 is cut out from Fig. 21 (b) as shown in Fig. 23 (a).
  • the points P5, P6, P7, and P8 of the cut-out reference plane 108 are developed into P13, P14, P15, and P16 of the right-eye image 114 as shown in Fig. 23 (b).
  • here too, the developed reference plane is given the same aspect ratio as the actual reference plane.
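The corner-to-corner development described above (P1..P4 mapped onto the rectangle P9..P12) is a projective transform. A minimal sketch of computing it from the four corner pairs, using hypothetical coordinates (the actual pixel positions are not given in the text):

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve for the 3x3 projective transform H mapping each src corner
    to its dst corner (the P1..P4 -> P9..P12 development of Fig. 22).
    Standard direct linear transform with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    # Apply H in homogeneous coordinates and dehomogenize.
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Trapezoidal reference plane as photographed -> H x V rectangle
# (corner coordinates are assumed for illustration).
src = [(10, 0), (90, 0), (100, 60), (0, 60)]   # P1..P4
dst = [(0, 0), (80, 0), (80, 60), (0, 60)]     # P9..P12, aspect H:V
H = homography_from_corners(src, dst)
```

Warping every pixel of the cut-out region through H (e.g. with an image library's perspective warp) yields the developed, perspective-free image.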
  • FIG. 24 is a diagram illustrating a state in which stereoscopic viewing is performed using the left-eye image and the right-eye image subjected to perspective correction.
  • an anaglyph image is created using the perspective-corrected image for the left eye and the image for the right eye.
  • An anaglyph image is a single image created by extracting only the R component from the RGB data of the left-eye image and only the G or B component from the RGB data of the right-eye image, and combining them. The observer can view this image stereoscopically by wearing red-and-blue glasses.
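The channel combination just described can be sketched as follows (a minimal example using nested lists of RGB tuples; it takes both G and B from the right-eye image, one of the variants the text permits, and real code would use an image library):

```python
def make_anaglyph(left_rgb, right_rgb):
    """Combine the R channel of the left-eye image with the G and B
    channels of the right-eye image into a single anaglyph image for
    red/blue-glasses viewing. Images are rows of (R, G, B) tuples."""
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left_rgb, right_rgb)
    ]

# One-row example images (pixel values are arbitrary).
left = [[(200, 10, 10), (180, 20, 20)]]
right = [[(10, 100, 150), (20, 110, 160)]]
ana = make_anaglyph(left, right)
```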
  • the created anaglyph image is printed on the printed matter 115 at the same size as the reference plane 108 at the time of photographing, and is placed on the horizontal plane 107.
  • wearing the red-and-blue glasses, the observer positions the right eye 117 and the left eye 118 relative to the printed matter 115 so that the angle θ1 between the line of sight 119 and the printed matter 115 is the same as that of the cameras at the time of shooting; by looking at the printed matter 115 in this way, a stereoscopic image in which the object 116 rises above the printed matter 115 can be observed.
  • Patent Document 1 JP 2002-125246 A
  • Patent Document 2 Japanese Patent No. 3579683
  • the reference plane does not necessarily need to be parallel to the horizontal plane; for example, information such as that included in the EXIF (Exchangeable Image File Format) data attached to an image taken with a digital camera can be used.
  • the present invention has been made to solve the above-described problems. An image is taken from an oblique direction with respect to a reference plane, the angle formed by the reference plane and the line-of-sight direction of the camera at the time of shooting is recorded in the header of the captured image data as an observation angle, and the captured image data is converted into image data corrected to eliminate the perspective in the depth direction.
  • An object of the present invention is to provide a stereoscopic image recording/reproducing system in which, when the created image data is reproduced, the observation angle in the header is presented to the observer so that the observer can perform stereoscopic viewing from the correct direction.
  • the present invention is a stereoscopic image recording / reproducing system that generates, records, and reproduces stereoscopic image data from a plurality of image data corresponding to a plurality of viewpoints.
  • It includes 3D image input means for outputting shooting angle information, which is information on the angle formed by the line-of-sight direction of the imaging device at the time of shooting and the reference plane on which the subject is placed, together with image data and control information, and control means for calculating, from the shooting angle information, an observation angle with respect to the display means used for stereoscopic viewing; the calculated observation angle information is recorded in the stereoscopic image data as 3D image control information together with the control information.
  • the 3D image recording means includes the imaging means, and the 3D image input means further includes shooting angle measuring means that measures the inclination of the line-of-sight direction of the imaging means, generates positional information of the imaging means relative to the reference plane based on the inclination, and calculates the shooting angle from the positional information.
  • the 3D image input means is characterized in that an arbitrary value input from the outside is added as an offset to the shooting angle, and a newly calculated value is used as shooting angle information.
  • In the 3D image input means, the 3D image recording means records an arbitrary value input from the outside as offset angle information in the 3D image control information, and the 3D image reproduction means analyzes the offset angle information from the 3D image control information and outputs it to the display means.
  • the 3D image recording means is characterized in that it creates the observation angle information by substituting the value of the shooting angle for the observation angle.
  • the 3D image reproduction means is characterized in that it analyzes the observation angle information included in the 3D image control information and outputs it to the display means.
  • the 3D image reproduction means analyzes the observation angle information included in the 3D image control information, and the display means has operating means capable of tilting the display according to the value of the observation angle information.
  • the observation angle is recorded as observation angle information in the control information of the stereoscopic image data, the observation angle information is read from the control information by the reproduction means, and the observation angle is presented to the user at the time of output.
  • the observer can perform stereoscopic viewing from an accurate direction, and thus can perform stereoscopic viewing without distortion.
  • the observation angle information is recorded in the 3D image data to be recorded, so that the management and handling of the data becomes very simple.
  • the camera tilt at the time of shooting is measured, the position of the camera relative to the reference line is estimated from the measured tilt, and the shooting angle is obtained from the relationship between the camera tilt and the camera's position relative to the reference line, so the observation angle can be obtained easily.
  • an appropriate observation angle when the observer looks at the center of the display is displayed on the display.
  • the observer can thus easily know the appropriate observation angle and, as a result, can perform stereoscopic viewing from the correct direction, without distortion of the generated stereoscopic image.
  • According to the present invention, by adding an offset angle to the shooting angle when shooting or recording 3D image data, the observation angle, which would otherwise be uniquely determined by the shooting angle, can be set arbitrarily. Therefore, even if the shooting angle at the time of shooting is small, by adding an offset angle and setting the observation angle freely, it is possible to prevent the creation of 3D image data that is difficult or impossible to observe depending on the display.
  • Even when the offset angle is not added to the observation angle at the time of input or recording, it is possible to show the observer an appropriate angle for stereoscopic viewing by presenting the offset angle to the observer during playback.
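The offset-angle adjustment described above can be sketched as follows (an illustrative example; the clamping of the result to the 0-90 degree shooting-angle range is an assumption, since the text does not state how out-of-range sums are handled):

```python
def shooting_angle_with_offset(measured_angle, offset_angle):
    """Add a user-supplied offset angle to the measured shooting angle,
    as in the offset-angle variant of the invention, and clamp the
    result to the 0..90 degree range defined for shooting angles."""
    return max(0.0, min(90.0, measured_angle + offset_angle))
```

For example, a small measured angle of 20 degrees with a 30 degree offset yields a more comfortable 50 degree observation angle.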
  • FIG. 1 is a block diagram showing a configuration of a stereoscopic image recording / reproducing system according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the data configuration of 3D image data.
  • FIG. 3 is a diagram illustrating an example of a left-eye image and a right-eye image.
  • FIG. 4 is a diagram showing an example of image data in 3D image data.
  • FIG. 5 is a diagram showing an example of image data in 3D image data.
  • FIG. 6 is a diagram for explaining an example of image data in which the image aspect ratio is changed.
  • FIG. 7 is a block diagram showing a configuration of 3D image input means 2.
  • FIG. 8 is a diagram showing the relationship between the camera tilt angle and the shooting angle.
  • FIG. 9 is a block diagram showing a configuration of 3D image recording means 3.
  • FIG. 10 is a flowchart for explaining the operation of the 3D image recording means 3.
  • FIG. 11 is a block diagram showing a configuration of 3D image reproduction means 4.
  • FIG. 12 is a flowchart for explaining the operation of the 3D image reproducing means 4.
  • FIG. 13 is a diagram for explaining a method of creating display image data from decoded image data.
  • FIG. 14 is a diagram for explaining a new reference line and a shooting angle when an offset angle is added to the shooting angle.
  • FIG. 15 is a diagram for explaining a new reference line and a shooting angle when an offset angle is added to the shooting angle.
  • FIG. 16 is a diagram showing an example of the data configuration of 3D image data when offset angle information is recorded.
  • FIG. 17 is a diagram showing a state of observation when the display surface is tilted.
  • FIG. 18 is a conceptual diagram for explaining the parallax barrier method.
  • FIG. 19 is a conceptual diagram showing an example of a recording data format of a composite image.
  • FIG. 20 is a diagram showing a state of taking a left-eye image and a right-eye image.
  • FIG. 21 is a diagram showing captured left-eye images and right-eye images.
  • FIG. 22 is a diagram showing how perspective is corrected for the left-eye image.
  • FIG. 23 is a diagram showing how the perspective is corrected for the right-eye image.
  • FIG. 24 is a diagram showing a state in which stereoscopic vision is performed using a perspective-corrected left-eye image and right-eye image.
  • Image cropping means
  • Image correction means
  • Image composition means
  • Compression means
  • Header information creation means
  • Multiplexing means
  • Separating means
  • 3D control information analysis means
  • Display image creation means
  • Display means
  • FIG. 1 is a block diagram showing a configuration of a stereoscopic image recording / playback system according to an embodiment of the present invention.
  • the stereoscopic image recording / reproducing system 1 includes a 3D image input unit 2, a 3D image recording unit 3, and a 3D image reproducing unit 4.
  • the 3D image input means 2 inputs a plurality of image data corresponding to a plurality of viewpoints from the outside and generates, as control information of the image data, shooting angle information indicating the angle at which the input image data of each viewpoint was shot, the horizontal image size and vertical image size of the input image data of each viewpoint, and horizontal viewpoint number information and vertical viewpoint number information indicating the numbers of viewpoints included in the 3D image data.
  • the 3D image recording means 3 records the image data input from the 3D image input means 2 as 3D image data.
  • the 3D image playback means 4 plays back the 3D image data recorded by the 3D image recording means 3.
  • the 3D image data is image data for stereoscopic viewing, and is data composed of image data and 3D control information.
  • each means will be described later; first, the 3D image data and the 3D control information are described.
  • FIG. 2 is a diagram illustrating an example of a data configuration of 3D image data.
  • the 3D image data includes a header 5 and image data 6.
  • the header 5 includes image size information of the image data 6 and 3D control information.
  • Examples of the header 5 include the headers of EXIF (Exchangeable Image File Format), AVI (Audio Video Interleaved), ASF (Advanced Streaming Format), WMV (Windows Media Video), and the MP4 file format.
  • Examples of the image data include uncompressed image data and image data compressed by a compression method such as JPEG (Joint Photographic Experts Group) or MPEG (Moving Picture Experts Group).
  • The 3D control information includes information on the configuration of the image data in the 3D image data and information for controlling the display when displaying 3D images, and contains the numbers of horizontal and vertical viewpoints and the observation angle information.
  • the number of viewpoints in the horizontal direction and the vertical direction indicates information on the number of image data having different viewpoints included in the 3D image data.
  • a 3D image without distortion can be observed by viewing the display surface on which the 3D image is displayed from a predetermined angle. Information on this predetermined angle is the observation angle information.
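The contents of the 3D control information listed above can be sketched as a simple record (field names are hypothetical; the patent only lists what the control information contains, not its encoding):

```python
from dataclasses import dataclass

@dataclass
class ThreeDControlInfo:
    """Sketch of the 3D control information carried in the header 5:
    the numbers of viewpoints and the observation angle information."""
    horizontal_viewpoints: int   # number of views arranged side by side
    vertical_viewpoints: int     # number of rows of views
    observation_angle: float     # degrees between line of sight and display

# The two-view, left/right case described in the text, with an
# assumed observation angle of 30 degrees for illustration.
info = ThreeDControlInfo(horizontal_viewpoints=2,
                         vertical_viewpoints=1,
                         observation_angle=30.0)
```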
  • FIG. 3 is a diagram for explaining an example of an image for the left eye and an image for the right eye
  • FIGS. 4 and 5 are diagrams for explaining an example of the image data 6 in the 3D image data.
  • FIG. 3A shows a left-eye image
  • FIG. 3B shows a right-eye image
  • the horizontal image size h and the vertical image size v are the same for both images.
  • the left eye image and the right eye image are arranged side by side in the order of viewpoint as shown in Fig. 4 to obtain a single image data.
  • the number of viewpoints of the image data is 2 in the horizontal direction and 1 in the vertical direction.
  • the image size of this 3D image data is 2 × h horizontally and v vertically.
  • FIG. 5 is an example in which the number of horizontal viewpoints is 4 and the number of vertical viewpoints is 2; the images of the 8 viewpoints are numbered in the same manner as in the description of FIG. 4.
  • the images are arranged in raster scan from the upper left to the lower right in the order of viewpoints, such as 1 to 8, and are used as one image data.
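The raster-scan arrangement just described can be sketched as follows (an illustrative example; per-viewpoint images are represented as nested lists of pixel rows, and the 1×1 "images" below stand in for real viewpoint images):

```python
def tile_viewpoints(views, h_views, v_views):
    """Arrange equal-size per-viewpoint images in raster-scan order,
    h_views across and v_views down, into one composite image, as in
    Fig. 5 (8 viewpoints tiled as a 4 x 2 grid)."""
    assert len(views) == h_views * v_views
    rows_per_view = len(views[0])
    out = []
    for gy in range(v_views):              # grid rows, top to bottom
        for r in range(rows_per_view):     # pixel rows within a grid row
            row = []
            for gx in range(h_views):      # grid columns, left to right
                row.extend(views[gy * h_views + gx][r])
            out.append(row)
    return out

# Eight 1x1 "images" numbered 1..8, tiled 4 across and 2 down.
views = [[[n]] for n in range(1, 9)]
tiled = tile_viewpoints(views, 4, 2)
```

With h_views=2 and v_views=1 the same function produces the side-by-side left/right layout of Fig. 4.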
  • the image data at this time may have its image aspect ratio changed.
  • the image aspect ratio is information indicating the value obtained by dividing the vertical scaling factor of the image data by the horizontal scaling factor.
  • FIG. 6 is a diagram for explaining an example of image data in which the image aspect ratio is changed.
  • the image data in Fig. 4 and Fig. 5 is created without changing the image aspect ratio, so its image aspect ratio is 1.
  • image data created from the image data of Fig. 4 by reducing the horizontal scale to 1/2 without changing the vertical scale has a horizontal size of h and a vertical size of v, and its image aspect ratio is 2.
  • in this case, the value "2" is recorded as the image aspect ratio information.
  • the image aspect ratio is fixed to 1 for the sake of simplicity.
  • FIG. 7 is a block diagram showing the configuration of the 3D image input means 2.
  • the 3D image input unit 2 includes an imaging unit 9, a control unit 10, and an imaging angle measurement unit 11.
  • the imaging means 9 is composed of at least one imaging device such as a CCD camera, for example, and captures external video and outputs it as an input image.
  • the control means 10 is a means for controlling the imaging means 9, for example, controlling the left and right angles and position of the imaging means 9, and is realized by a CPU or the like, not shown.
  • the shooting angle measuring means 11 uses a general digital angle meter employing a liquid sensor or the like, or a general gyro sensor; since these devices themselves are not the subject of the present invention, they are not described in detail.
  • the shooting angle measuring means 11 measures the inclination angle of the imaging means 9 with respect to the horizontal plane in the imaging direction, and determines and outputs the shooting angle from the measured value.
  • FIG. 8 is a diagram showing the relationship between the camera tilt angle and the shooting angle.
  • the shooting angle is defined as the angle αn (n is an integer from 1 to 4) formed by the camera's line-of-sight direction and the reference line, and its value ranges from 0 to 90 degrees.
  • the reference line is parallel to the horizontal line.
  • the camera tilt angle βn (n is an integer from 1 to 4; hereinafter "camera tilt angle") is measured using a digital angle meter or gyro sensor.
  • the value of the camera tilt angle βn ranges from 0 to less than 360 degrees. A method for calculating the shooting angle from the measured value is described below.
  • the camera tilt angle βn is assumed to be 0 degrees when the camera is parallel to the reference line and the top and bottom of the captured image are not inverted; viewed from the side, the camera tilt angle increases as the camera is rotated clockwise about its center, returning to the 0-degree state after one full revolution.
  • FIG. 8 (a) shows the relationship with the shooting angle α1 when the camera tilt angle β1 is not less than 0 degrees and not more than 90 degrees.
  • the shooting angle α1, which is the angle formed by the line-of-sight direction 13 of the camera 12 and the reference line 14, coincides with the camera tilt angle β1.
  • FIG. 8 (b) shows the shooting angle α2 when the camera tilt angle β2 is greater than 90 degrees and not more than 180 degrees.
  • the shooting angle α2, which is the angle formed by the line-of-sight direction 13 of the camera 12 and the reference line 14, is (180 − β2).
  • when β3 and β4, as shown in FIG. 8 (c) and FIG. 8 (d), are larger than 180 degrees and less than 360 degrees, the image is taken above the reference line 14 described with reference to FIGS. 8 (a) and 8 (b), so the reference line 14 is not included in the captured image. Therefore, the line 15 that is parallel to the reference line 14 and above the camera is used as a new reference line.
  • FIG. 8 (c) shows the shooting angle α3 when the camera tilt angle β3 is larger than 180 degrees and smaller than 270 degrees.
  • the angle formed by the reference line 15 and the line-of-sight direction 13 of the camera 12 is defined as the shooting angle α3.
  • the value of α3 in this case is (β3 − 180).
  • FIG. 8 (d) shows the shooting angle α4 when the camera tilt angle β4 is larger than 270 degrees and smaller than 360 degrees (360 degrees being the same as 0 degrees).
  • the angle formed by the reference line 15 and the line-of-sight direction 13 of the camera 12 is defined as the shooting angle α4.
  • the value of α4 is (360 − β4).
  • the shooting angle measuring means 11 measures the tilt of the camera at the time of shooting, estimates the camera's position relative to the reference line from the measured tilt, and obtains the shooting angle from the relationship between the camera tilt and that position; it can thereby output the angle between the line-of-sight direction of the camera used for shooting and the reference plane containing the reference line in the shot image.
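The four cases of Fig. 8 can be collected into one tilt-to-shooting-angle mapping (an illustrative sketch; the formula for the fourth case, α4 = 360 − β4, is reconstructed from the symmetry of the other three cases, since the text truncates there):

```python
def shooting_angle(beta):
    """Map the measured camera tilt angle beta (0 <= beta < 360 degrees)
    to the shooting angle alpha in 0..90 degrees, following the four
    cases illustrated in Fig. 8."""
    beta = beta % 360.0
    if beta <= 90.0:          # Fig. 8(a): alpha1 = beta1
        return beta
    if beta <= 180.0:         # Fig. 8(b): alpha2 = 180 - beta2
        return 180.0 - beta
    if beta < 270.0:          # Fig. 8(c): alpha3 = beta3 - 180
        return beta - 180.0
    return 360.0 - beta       # Fig. 8(d): alpha4 = 360 - beta4
```

Each quadrant of camera tilt thus folds into the same 0..90 degree shooting-angle range.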
  • the shooting angle obtained above is information used for obtaining the observation angle necessary for distortion-free stereoscopic viewing; how the observation angle is obtained is described later.
  • here, the imaging means 9 consists of two CCD cameras that output left-eye image data and right-eye image data, and the image data of each viewpoint has the same image size.
  • the imaging means 9 outputs the image data captured by the two CCD cameras as input image data.
  • a rectangular reference plane is set up, with paper or the like, parallel to the horizontal line under the object to be photographed, and shooting is performed while viewing the captured image so that the object on the reference plane fits within the reference plane.
  • alternatively, specific marks may be placed at positions corresponding to the four corners of the reference plane, and shooting may be performed so that the object fits within the square having these marks as vertices, which serves as a new reference plane.
  • both the reference plane and the mark may be installed, and shooting may be performed so that both are included in the image to be captured.
  • the above-mentioned marks and the outer frame constituting the reference plane may be prepared in advance as predetermined images, overwritten at predetermined positions in the input image, and the result used as a new input image.
  • the position of the marks, the size of the reference-plane frame, and the position of a specific point in the reference plane may be made freely settable by the user from the outside.
  • the information on the mark position and the size and position of the frame of the reference surface may be output together with the image data without overwriting the mark and the reference surface. This information can be used by the subsequent 3D image recording means to determine the size and position of the reference plane in the image.
  • the reference plane or the position of the mark may be set so that the aspect ratio of the actual size of the reference plane to be imaged is the same as the aspect ratio of the image data to be captured.
  • the reference plane or mark position may be set so that the aspect ratio of the actual size is a specific value.
  • photographing is performed such that the center of the photographed image is positioned on a horizontal line passing through the center of the reference plane.
  • the control means 10 outputs the horizontal image size and the vertical image size of the input image at that time, along with the number of horizontal viewpoints as 2 and the number of vertical viewpoints as 1.
  • the photographing angle measuring means 11 outputs the photographing angle at this time as photographing angle information.
  • here, the shooting angle measuring means 11 automatically measures and outputs the shooting angle, but shooting angle input means may be provided instead, with the photographer entering the shooting angle as a numerical value from the outside.
  • the 3D image input means 2 is an image signal input device that receives a video signal or the like instead of the imaging means 9, an image display device that receives and displays a TV signal, a video or DVD Any device that outputs image data, such as an image reproducing device that reproduces image data, an image reading device such as a scanner, or an image data file reading device, is not limited thereto.
  • in that case, the shooting angle information is input by the user from the outside.
  • the 3D image input means 2 uses a plurality of pieces of image data corresponding to a plurality of viewpoints as 3D photographed image data, and photographing angle information, a horizontal image size, a vertical image size, and a horizontal The number of viewpoints in the direction and the number of viewpoints in the vertical direction can be output.
  • the shooting angle information can be calculated in the same way in other cases. If the number of vertical viewpoints is 2 or more, the shooting angle information is calculated in the same way for each set of image data in the same vertical position, and as many shooting angle values as the number of vertical viewpoints are calculated and output.
  • FIG. 9 is a block diagram showing the configuration of the 3D image recording means 3.
  • the 3D image recording means 3 includes image cropping means 16 that cuts out part of the input image data and outputs the resulting cropped image data, image correction means 17 that corrects the depth-direction perspective of the cropped image data and outputs corrected image data, image composition means 18 that combines the corrected image data and outputs composite image data, and compression means that compression-encodes the composite image data into compressed encoded data.
  • control means 22 is realized by a CPU or the like (not shown), and is a means for controlling each means in the 3D image recording means 3.
  • The control means 22 controls each means connected to it using the input information, so that the encoded image data of the 3D image and its header are created and multiplexed.
  • FIG. 10 is a flowchart for explaining the operation of the 3D image recording means 3.
  • Here, the description assumes that the horizontal viewpoint number information is 2, the vertical viewpoint number information is 1, and the input image data consists of two image data for the left and right eyes.
  • In step S1, the 3D image recording means 3 starts the 3D image recording process, and the process proceeds to step S2.
  • In determination step S2, the control means 22 determines whether or not input image data and control information have been input to the 3D image recording means 3. If they have not been input, the process returns to determination step S2; otherwise, the input image data together with, as control information, the horizontal image size, the vertical image size, the horizontal viewpoint number information, the vertical viewpoint number information, and the shooting angle information of the input image data of each viewpoint are input to the 3D image recording means 3, and the process proceeds to step S3. At this time, within the 3D image recording means 3, the input image data is supplied to the image cropping means 16, while the horizontal image size, vertical image size, horizontal viewpoint number information, vertical viewpoint number information, and shooting angle information are supplied to the control means 22.
  • Image encoded data is created from the input image data by the processing of steps S3 to S6 described below. The image cropping method and the image correction method performed by the image cropping means 16 and the image correcting means 17 in these steps are the same as the methods disclosed in Patent Document 2 and are not directly related to the present invention, so their detailed explanation is omitted.
  • In step S3, the input image data of the left and right viewpoints are input to the image cropping means 16.
  • The image cropping means 16 is a means that performs processing on the input image data of each viewpoint.
  • The image cropping means 16 obtains a specific reference plane from the input image data by image matching or the like. When marks are photographed instead of the reference plane, the marks are likewise found by matching or the like, and the inside of the rectangle containing the four marks is used as the reference plane.
  • An image obtained by cutting out the reference plane is output as a cut-out image for each of the left and right viewpoints, and the process proceeds to step S4.
  • a specific area may be cut out as the reference plane, or the user may input the reference plane directly from the outside.
  • A plurality of different reference planes may be prepared, and the user may select from the outside which reference plane to use. Also, if the mark positions described in the explanation of the 3D image input means 2, or the size and position of the frame of the reference plane, are input, the reference plane may be obtained from that information.
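When the reference plane is given by four photographed marks, the crop region described above is the rectangle containing them. A minimal sketch, with hypothetical mark coordinates (the mark-detection step itself, e.g. template matching, is assumed to have been done elsewhere):

```python
# Illustrative sketch: deriving a crop rectangle from four detected marks.
# Mark coordinates below are hypothetical example values.

def crop_rect_from_marks(marks):
    """Return (left, top, right, bottom) of the axis-aligned rectangle
    that contains all four mark positions."""
    xs = [x for x, y in marks]
    ys = [y for x, y in marks]
    return (min(xs), min(ys), max(xs), max(ys))

# Example: marks photographed near the corners of the reference plane.
marks = [(120, 80), (520, 90), (115, 390), (525, 395)]
print(crop_rect_from_marks(marks))  # (115, 80, 525, 395)
```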
  • In step S4, the cut-out image data of the left and right viewpoints are input to the image correcting means 17.
  • The image correcting means 17 is a means that performs processing on the input image data of each viewpoint.
  • The image correcting means 17 corrects the cut-out image of FIG. 22(a) or FIG. 23(a) so that the reference plane is developed in the same way as in FIG. 22(b) or FIG. 23(b). Here, the reference plane aspect ratio is the aspect ratio of the actual reference plane.
  • The aspect ratio of the reference plane may be the aspect ratio of the input image data, may be set in advance, or may be input by the user from the outside.
  • the reference plane aspect ratio value may be treated as a specific value set in the stereoscopic image recording system.
  • In that case, the position of the reference plane or the marks is adjusted at the time of shooting so that the aspect ratio of the actual size of the photographed reference plane matches this reference plane aspect ratio value.
  • The image correcting means 17 outputs the corrected image data for each of the left and right viewpoints, and the process proceeds to step S5.
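The correction method itself follows Patent Document 2 and is not disclosed here. As a generic illustration only (not the patented method), a front-view correction of this kind can be expressed as a four-point perspective mapping from the photographed corners of the reference plane to an upright rectangle with the reference plane aspect ratio. All coordinates below are hypothetical:

```python
import numpy as np

def homography_4pt(src, dst):
    # Standard direct linear transform for 4 point correspondences:
    # solve the 8x8 system for H (3x3, with h33 fixed to 1) so that
    # H maps each src[i] to dst[i] exactly.
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    # Apply the homography to one point (homogeneous coordinates).
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical corners of the obliquely photographed reference plane
# (a trapezoid) mapped to a 2:1 rectangle matching the reference
# plane aspect ratio.
src = [(50, 40), (350, 60), (20, 260), (380, 240)]
dst = [(0, 0), (400, 0), (0, 200), (400, 200)]
H = homography_4pt(src, dst)
```

A full implementation would then resample every pixel of the cut-out image through the inverse of `H`; the sketch only shows the geometric mapping.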
  • In step S5, the image synthesizing means 18 performs the process of synthesizing 3D image data from the input image data.
  • The image synthesizing means 18 is a means that arranges the input image data, i.e., the image data of each viewpoint, according to the horizontal viewpoint number information and the vertical viewpoint number information in the same manner as described in FIG. 4, FIG. 5, and FIG., and creates 3D image data.
  • As above, the horizontal viewpoint number information is 2, the vertical viewpoint number information is 1, and the input image data consists of the two corrected image data for the left and right eyes.
  • the corrected image data is input to the image composition means 18.
  • the control unit 22 transmits the horizontal viewpoint number information and the vertical viewpoint number information to the image synthesizing unit 18 and controls the image synthesizing unit 18 to synthesize 3D image data.
  • At this time, image aspect ratio information is also created; since the images are combined so that the image aspect ratio becomes 1, the image aspect ratio information is set to 1.
  • The image synthesizing means 18 outputs the created 3D image data to the compression means 19, outputs the horizontal image size, vertical image size, and image aspect ratio information of the 3D image data to the control means 22, and the process proceeds to step S6.
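The side-by-side synthesis for two horizontal viewpoints can be sketched as follows. Treating the image aspect ratio information as the aspect ratio of a single viewpoint image is an assumption made for this illustration (it reproduces the value 1 used in the text for square inputs):

```python
import numpy as np

def compose_side_by_side(left, right):
    # Arrange the two corrected viewpoint images horizontally
    # (horizontal viewpoints = 2, vertical viewpoints = 1).
    assert left.shape == right.shape
    combined = np.hstack([left, right])
    h, w = combined.shape[:2]
    # Assumed interpretation: aspect ratio of one viewpoint image.
    aspect = (w / 2) / h
    return combined, w, h, aspect

left = np.zeros((240, 240, 3), np.uint8)
right = np.full((240, 240, 3), 255, np.uint8)
img, w, h, aspect = compose_side_by_side(left, right)
# img.shape == (240, 480, 3); aspect == 1.0
```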
  • In step S6, the compression means 19 performs the process of encoding the input image data using an encoding method such as JPEG or MPEG and outputting the encoded data.
  • the compression means 19 is composed of a general-purpose compression means and is not related to the present invention, so the configuration is omitted.
  • 3D image data is input to the compression means 19.
  • the compression means 19 encodes the input image data, outputs the encoded data, and proceeds to step S7.
  • The control means 22 transmits, as information necessary for creating the header, information including the horizontal image size of the entire encoded image, the vertical image size, the horizontal viewpoint number information, the vertical viewpoint number information, the shooting angle information, and the image aspect ratio information to the header information creation means 20.
  • the header information creating means 20 creates and outputs the header 5 including 3D control information using the information input from the control means 22 as described in FIG.
  • The 3D control information is created by substituting the shooting angle for the observation angle constituting the 3D control information.
  • That is, the observation angle information is obtained from the shooting angle and used as part of the 3D control information.
  • In step S8, the multiplexing means 21 performs the process of multiplexing the input encoded data and the header.
  • The multiplexing means 21 multiplexes the encoded data input from the compression means 19 and the header input from the header information creation means 20 to create multiplexed data, outputs it as 3D image data, and the process proceeds to step S9.
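Steps S7 and S8 amount to serializing the control information into a header and prepending it to the encoded data. A minimal sketch; the magic bytes and field layout below are invented for illustration and are not the header format defined by the specification:

```python
import struct

def make_header(width, height, h_views, v_views, obs_angle, aspect):
    # Hypothetical fixed layout: magic, sizes, viewpoint counts,
    # observation angle and image aspect ratio (big-endian, packed).
    return struct.pack(">4sHHBBff", b"3DCI", width, height,
                       h_views, v_views, obs_angle, aspect)

def multiplex(header, encoded):
    # Header first, then the compressed image data, as in step S8.
    return header + encoded

header = make_header(480, 240, 2, 1, 30.0, 1.0)
data = multiplex(header, b"<encoded JPEG data>")
# len(header) == 18; data begins with the header bytes
```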
  • step S9 the 3D image data created by the multiplexing means 21 is recorded.
  • the stereoscopic image recording / reproducing system 1 includes data recording / reproducing means (not shown) inside.
  • This data recording/reproducing means can record the 3D image data output from the multiplexing means 21 in the 3D image recording means 3, for example on recording media such as removable memory cards, hard disks, optical disks, and magnetic tapes, and can read data from such recording media. Since the recording/reproducing means itself is a general one and its configuration is not related to the present invention, its description is omitted.
  • the data recording / reproducing means is included in the stereoscopic image recording / reproducing system 1.
  • the data recording / reproducing means may be provided outside.
  • The data recording/playback means may be a device that can exchange data with the outside, such as an external hard disk, an optical disk recording/playback device, or a card reader for a general personal computer (hereinafter referred to as a "PC"), or it may be the PC itself.
  • A digital video camera or a digital video recorder may also be used.
  • the transmission path may be considered as the Internet, and the data recording / reproducing means may be a server connected to the Internet. Further, the data recorded in the data recording / reproducing means can be freely read by the 3D image reproducing means 4.
  • step S9 the 3D image data is recorded by the data recording / reproducing means, and the process proceeds to determination step S10.
  • In determination step S10, it is determined whether or not the recording process of the 3D image recording means 3 is to be ended. If it is determined that the recording process is to be ended, the process proceeds to step S11 and the recording process of the 3D image recording means 3 ends; if not, the process returns to step S2 and 3D image recording continues.
  • The factors for determining in determination step S10 that recording is to be ended are the same as for a normal recording device, for example a user's interruption operation, a lack of recording-medium capacity, an interruption of power supply such as the battery running out, a failure such as disconnection, or an accident.
  • In this way, the 3D image recording means 3 can record 3D image data.
  • FIG. 11 is a block diagram showing the configuration of the 3D image reproduction means 4.
  • The 3D image reproduction means 4 comprises a separation means 23 that separates the 3D image data into a header and encoded image data and outputs them, a decoding means 24 that decodes the image data from the input encoded data and outputs it to the display image creating means 26, a 3D control information analyzing means 25 that analyzes the 3D control information and outputs it to the control means 27, a display image creating means 26 that creates and outputs a display image from the decoded image data, a control means 27 that controls the display image creating means 26 and the display means 28, and a display means 28 that displays the input 3D image data.
  • The display means 28 is assumed to be a means that performs stereoscopic display using, for example, a parallax barrier as shown in FIG.
  • FIG. 12 is a flowchart for explaining the operation of the 3D image reproduction means 4.
  • In step S12, the 3D image playback means 4 starts the playback process.
  • the 3D image reproducing means 4 accesses the data recording / reproducing means described in the 3D image recording means 3.
  • In determination step S13, it is determined whether or not 3D image data has been input to the 3D image playback means 4. If it has, the process proceeds to step S14; if not, the process returns to step S13.
  • In step S14, the 3D image data is input to the separating means 23. The separating means 23 separates the encoded data and the header from the input 3D image data, outputs the encoded data to the decoding means 24 and the header to the 3D control information analyzing means 25, and the process proceeds to step S15. In step S15, the encoded data is input to the decoding means 24. The decoding means 24 decodes the input encoded data, outputs the decoded image data to the display image creating means 26, and the process proceeds to step S16.
  • the decoding unit 24 is a unit that decodes input encoded data, such as JPEG or MPEG, and outputs decoded image data.
  • The decoding means 24 is composed of a general-purpose decoding means and is not related to the present invention; its configuration is therefore omitted.
  • In step S16, the header is input to the 3D control information analysis means 25.
  • The 3D control information analysis means 25 analyzes the 3D control information contained in the header, outputs the number of viewpoints in the horizontal direction, the number of viewpoints in the vertical direction, the image aspect ratio information, and the observation angle information contained in the 3D control information to the control means 27, and the process proceeds to step S17.
  • In step S17, the decoded image data is input to the display image creating means 26.
  • The display image creating means 26 receives the number of viewpoints in the horizontal direction, the number of viewpoints in the vertical direction, and the image aspect ratio information from the control means 27.
  • the display image creation means 26 converts the decoded image data using the information on the number of viewpoints, and creates display image data.
  • Here, the number of horizontal viewpoints is 2, the number of vertical viewpoints is 1, and the image aspect ratio information is 1.
  • FIG. 13 is a diagram for explaining a method of creating display image data from decoded image data.
  • FIG. 13 (a) shows decoded image data in which the number of viewpoints in the horizontal direction is 2, the number of viewpoints in the vertical direction is 1, and the image aspect ratio information is 1.
  • the left half of the decoded image data is image data for the left eye, and the right half is image data for the right eye.
  • That is, the images of these viewpoints are arranged horizontally in the order of the viewpoints.
  • The display image creating means 26 interprets this structure from the number of horizontal viewpoints and the number of vertical viewpoints, and divides the image data of each viewpoint into vertically long strips. As shown in FIG. 13(b), the strips of the left and right images are rearranged alternately in viewpoint order to create the display image data, which is output to the display means 28, and the process proceeds to step S18.
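The strip rearrangement described above can be sketched as follows. This is a minimal illustration assuming 1-pixel-wide strips and a side-by-side input; the actual strip width depends on the parallax barrier pitch:

```python
import numpy as np

def interleave_strips(decoded):
    # Split the side-by-side decoded image (left half = left eye,
    # right half = right eye) into 1-pixel-wide vertical strips and
    # alternate them L, R, L, R, ... for a parallax-barrier display.
    h, w = decoded.shape[:2]
    left, right = decoded[:, : w // 2], decoded[:, w // 2 :]
    out = np.empty_like(decoded)
    out[:, 0::2] = left   # even columns carry left-eye strips
    out[:, 1::2] = right  # odd columns carry right-eye strips
    return out

# Toy example: left half all 1s, right half all 2s.
decoded = np.hstack([np.full((4, 4), 1), np.full((4, 4), 2)])
print(interleave_strips(decoded)[0])  # [1 2 1 2 1 2 1 2]
```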
  • In step S18, the display image data is input from the display image creating means 26, and the observation angle information is input from the control means 27, to the display means 28.
  • The display means 28 is composed of a display and a parallax barrier and, as shown in FIG., displays the image data in 3D, and the process proceeds to determination step S19.
  • the display means 28 may display the input observation angle information as a numerical value on the display of the display means 28.
  • In this way, the observer can easily know the appropriate observation angle when viewing the center of the display of the display means 28 and, as a result, can perform stereoscopic viewing from the correct direction, avoiding the distortion that occurs when observing from an angle different from the assumed one.
  • the observation angle information is displayed as a numerical value on the display surface.
  • Alternatively, a horizontal slit or a lenticular sheet may be provided on the front surface of the display means 28, and a specific image pattern that can be seen only when the observer views from the appropriate observation angle may be displayed. Thereby, the observer can know the appropriate observation position more easily.
  • a plurality of directional backlights or a backlight capable of switching the angle is prepared, and a backlight switching means for switching these backlights is installed in the 3D image reproduction means 4. Then, the backlight switching means may be controlled by the control means 27 so that light is emitted only in the direction indicated by the observation angle information. By doing so, the user can know an appropriate observation position more easily.
  • In determination step S19, it is determined whether or not the playback process of the 3D image playback means 4 is to be ended. If it is determined that the playback process is to be ended, the process proceeds to step S20 and the playback process of the 3D image playback means 4 ends; if not, the process returns to step S13 and the 3D image playback process continues.
  • The factors for determining in determination step S19 that playback is to be ended are the same as for a normal playback device, for example a user's interruption operation, an interruption of power supply such as the battery running out, a failure such as disconnection, or an accident such as the input of corrupted data.
  • the 3D image playback means 4 ends the playback process.
  • the 3D image playback means 4 can play back 3D image data for stereoscopic display.
  • A display, a stand for supporting the display, and a movable means for changing the angle of the display surface may also be added. The movable means may be composed of a motor or the like so that the control means 27 can automatically change the angle of the display surface according to the observation angle information.
  • Here, when the observation angle information is 90 degrees the display surface is vertical, and when the observation angle information is 0 degrees the display surface is horizontal. That is, when the observation angle information is A (0 ≤ A ≤ 90) degrees, the display surface may be controlled by tilting its upper part by (90 - A) degrees from the vertical state. By automatically changing the angle of the display surface according to the observation angle information in this way, the user can observe from the appropriate observation angle without any operation, so that extremely simple, high-quality stereoscopic display is possible.
  • Here the display surface is tilted (90 - A) degrees from the vertical state; alternatively, when the observation angle information is not 90 degrees, the display surface may be made horizontal and the observation angle information displayed. Further, the movable means may be provided with a mechanism that moves up and down in addition to rotating, so that the display surface is automatically lowered when it is horizontal. By doing so, the user can observe from the appropriate observation angle and position.
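The tilt control described above reduces to a small calculation. A sketch, assuming the observation angle A is measured between the line of sight and the display surface and the observer's line of sight is level (the function name is hypothetical):

```python
def display_tilt_from_observation_angle(a_deg):
    # Observation angle information A (0 <= A <= 90) is the angle
    # between the line of sight and the display surface; tilting the
    # top of the display (90 - A) degrees back from vertical lets a
    # level line of sight meet the surface at angle A.
    if not 0 <= a_deg <= 90:
        raise ValueError("observation angle must lie in [0, 90] degrees")
    return 90 - a_deg

print(display_tilt_from_observation_angle(90))  # 0  (display vertical)
print(display_tilt_from_observation_angle(30))  # 60 (tilted well back)
```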
  • As described above, the observation angle information is recorded, transmitted, and reproduced in the header area of the image data, so that the management and handling of the data become very simple.
  • In the above description, the reference line used to define the shooting angle is a line parallel to the horizontal line, but this need not be the case.
  • The user may add an offset angle η from the outside to the shooting angle information output by the shooting angle measurement means 11 inside the 3D image input means 2.
  • The absolute value of the offset angle η indicates the angle between the horizontal line and the new reference line changed by adding the offset angle. If the value of η is negative, the part in front of the new reference line comes closer to the camera; if the value of η is positive, it moves away from it. Here, the value of η is limited so that the angle between the new reference line and the viewing direction of the camera is a value between 0 and 90 degrees.
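The constraint on the offset angle can be sketched as a simple check. This is illustrative only (the function name is hypothetical, and the strict open interval follows the "between 0 and 90 degrees" wording above):

```python
def apply_offset_angle(shooting_angle, eta):
    # Add the user-supplied offset angle eta to the measured shooting
    # angle; the angle between the new reference line and the camera
    # viewing direction must stay between 0 and 90 degrees.
    new_angle = shooting_angle + eta
    if not 0 < new_angle < 90:
        raise ValueError("offset pushes the shooting angle outside (0, 90)")
    return new_angle

print(apply_offset_angle(30.0, 15.0))   # 45.0
print(apply_offset_angle(30.0, -10.0))  # 20.0
```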
  • FIG. 14 and FIG. 15 are diagrams for explaining a new reference line and an imaging angle when an offset angle is added to the imaging angle.
  • FIG. 14(a) shows the change in the shooting angle when η is a negative value.
  • FIG. 14(b) shows the change in the shooting angle when η is a positive value.
  • FIG. 15 shows only the camera 12, the camera viewing direction 13, the new reference line 29, and the shooting angle α′1.
  • FIG. 15 (a) corresponds to FIG. 14 (a)
  • FIG. 15 (b) corresponds to FIG. 14 (b).
  • FIG. 15(a) and FIG. 15(b) are the same as FIG. 8(a) except that the shooting angle is α′1 instead of α1. Accordingly, the operations in the 3D image recording means 3 and the 3D image reproducing means 4 are no different from those described above, and their description is omitted.
  • The 3D image recording means 3 may also allow the user to input the offset angle from the outside.
  • In this case, the shooting angle information is updated by the control means 22 of the 3D image recording means 3, as in the case of the 3D image input means 2; the observation angle is obtained based on the updated shooting angle information, and the 3D image data is recorded.
  • In this way, by setting an offset angle to the shooting angle when shooting or when recording 3D image data, the user can freely change the observation angle, which would otherwise be uniquely determined from the shooting angle at the time of shooting.
  • the degree of freedom when shooting is also increased.
  • For example, when shooting from a very shallow angle, the observation angle at the time of display also becomes small if no offset angle is added. If the observation angle is extremely small, the image cannot be displayed except on a display with a correspondingly narrow viewing angle, which is very difficult for the user to observe. Since the observation angle can be set freely by adding an offset angle as described above, it is possible to prevent the creation of 3D image data that is difficult or impossible to observe on a given display.
  • The offset angle η at this time is recorded in the 3D image data as offset angle information by the 3D image recording means 3, and the 3D image reproducing means 4 may use the offset angle information during reproduction to play back the 3D image data and display it in 3D.
  • FIG. 16 is a diagram showing the structure of 3D image data when the offset angle information at this time is recorded.
  • the offset angle information may be recorded in the 3D control information in the header 5 like the number of viewpoints and the observation angle information.
  • The internal configuration is the same as in FIG. 11, and the operation of each means is also as explained for FIG. 11, except for the 3D control information analysis means 25, the control means 27, and the display means 28. Therefore, the description of the separation means 23, the decoding means 24, and the display image creating means 26 is omitted, and only the operations of the 3D control information analysis means 25, the control means 27, and the display means 28 are explained.
  • In addition, the 3D control information analysis means 25 analyzes the offset angle information from the 3D control information and outputs it to the control means 27.
  • The control means 27 may cause the display means 28 to display a message notifying the observer to tilt the display surface by the offset angle so that the horizontal plane in the reproduced stereoscopic image becomes parallel to the actual horizontal plane.
  • Alternatively, the 3D image playback means 4 may be provided with a movable means for tilting the display surface, and the display surface may be tilted automatically by the offset angle.
  • FIG. 17 is a diagram showing a state of observation when the display surface is tilted at this time.
  • Let the 3D image data to be played back by the 3D image playback means 4 be 3D image data created by shooting a cube 109 placed on the reference plane 108 as shown in FIG. 20 and adding an offset angle η.
  • The observer tilts the display surface by the offset angle η from the line 30 parallel to the actual horizontal line, so that observation is performed with the angle between the observer's line-of-sight direction 31 and the display surface equal to the angle α′1.
  • When 3D image data created by adding an offset angle in the 3D image recording means 3 is reproduced by the 3D image reproduction means 4 without tilting the display surface, the horizontal plane in the displayed stereoscopic image is tilted by the offset angle from the actual horizontal plane.
  • By tilting the display surface as described above, the observer can make the reference plane 108 parallel to the line 30, which is parallel to the horizontal line.
  • As a result, the cube 109 on the reference plane 108 can be observed in the same arrangement as the actual one.
  • In this way, the observer may be notified to tilt the display by the offset angle, or the display may be tilted automatically, so that the horizontal plane in the stereoscopic image becomes an actual horizontal plane.
  • By doing so, the stereoscopic image is displayed in the same arrangement as the reality at the time of shooting. That is, since the reference plane in the stereoscopic image displayed by the 3D image reproduction means 4 is parallel to the actual horizontal line, the observer can observe a very realistic image.
  • When there are a plurality of viewpoints in the vertical direction, the shooting angle of the camera differs for each, so the shooting angle information may be obtained and recorded for each of the viewpoints in the vertical direction.
  • the stereoscopic image recording / reproducing system of the present invention is not limited to the above-described embodiment, and various changes can be made without departing from the scope of the present invention.
  • photographing is performed from an oblique direction with respect to the reference plane, and the angle formed by the reference plane at the time of photographing and the line-of-sight direction of the camera is recorded as the observation angle in the header of the photographed image data.
  • image data for stereoscopic viewing from an oblique direction is created.
  • the observation angle of the header is presented to the observer, so that the observer can perform a stereoscopic view from the correct direction.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

This invention concerns a 3D image recording/reproduction system that allows a user to perform stereoscopic viewing from the correct direction when viewing a 3D image. A 3D image recording means (3) calculates observation angle data from shooting angle data, the shooting angle being defined by the reference surface at the time of capture and the viewing direction of a camera, and records it as control data in a header (5) of the captured image data. When the image data is played back, a 3D image reproduction means (4) reads the observation angle data from the header (5) of the image data and presents the appropriate observation angle to the user via a display means (28).
PCT/JP2006/317531 2005-09-07 2006-09-05 Système d’enregistrement/reproduction d’image 3d WO2007029686A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007534423A JP4619412B2 (ja) 2005-09-07 2006-09-05 立体画像記録再生システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-259682 2005-09-07
JP2005259682 2005-09-07

Publications (1)

Publication Number Publication Date
WO2007029686A1 true WO2007029686A1 (fr) 2007-03-15

Family

ID=37835806

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/317531 WO2007029686A1 (fr) 2005-09-07 2006-09-05 Système d’enregistrement/reproduction d’image 3d

Country Status (2)

Country Link
JP (1) JP4619412B2 (fr)
WO (1) WO2007029686A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008310696A (ja) * 2007-06-15 2008-12-25 Fujifilm Corp 撮像装置、立体画像再生装置及び立体画像再生プログラム
JP2011530706A (ja) * 2008-08-12 2011-12-22 アイイーイー インターナショナル エレクトロニクス アンド エンジニアリング エス.エイ. 3d−tofカメラ装置及びそのための位置・向き較正方法
WO2018116580A1 (fr) * 2016-12-19 2018-06-28 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
CN110059670A (zh) * 2019-04-29 2019-07-26 杭州雅智医疗技术有限公司 人体头面部、肢体活动角度及体姿非接触测量方法及设备
JP2020520032A (ja) * 2016-04-08 2020-07-02 マックス メディア グループ, エルエルシーMaxx Media Group,Llc 電子ディスプレイの前方または上に投影されて見える仮想3次元画像を作製するためのシステム、方法、およびソフトウェア

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07298302A (ja) * 1994-04-22 1995-11-10 Canon Inc 複眼撮像−表示システム
JPH08339043A (ja) * 1995-06-12 1996-12-24 Minolta Co Ltd 映像表示装置
JPH10234057A (ja) * 1997-02-17 1998-09-02 Canon Inc 立体映像装置及びこれを含むコンピュータシステム

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3667620B2 (ja) * 2000-10-16 2005-07-06 株式会社アイ・オー・データ機器 ステレオ画像撮影アダプタ、ステレオ画像撮影用カメラ、および、ステレオ画像処理装置
JP4397217B2 (ja) * 2002-11-12 2010-01-13 株式会社バンダイナムコゲームス 画像生成システム、画像生成方法、プログラム及び情報記憶媒体
JP3579683B2 (ja) * 2002-11-12 2004-10-20 株式会社ナムコ 立体視用印刷物の製造方法、立体視用印刷物

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07298302A (ja) * 1994-04-22 1995-11-10 Canon Inc 複眼撮像−表示システム
JPH08339043A (ja) * 1995-06-12 1996-12-24 Minolta Co Ltd 映像表示装置
JPH10234057A (ja) * 1997-02-17 1998-09-02 Canon Inc 立体映像装置及びこれを含むコンピュータシステム

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008310696A (ja) * 2007-06-15 2008-12-25 Fujifilm Corp 撮像装置、立体画像再生装置及び立体画像再生プログラム
JP2011530706A (ja) * 2008-08-12 2011-12-22 アイイーイー インターナショナル エレクトロニクス アンド エンジニアリング エス.エイ. 3d−tofカメラ装置及びそのための位置・向き較正方法
JP2020520032A (ja) * 2016-04-08 2020-07-02 マックス メディア グループ, エルエルシーMaxx Media Group,Llc 電子ディスプレイの前方または上に投影されて見える仮想3次元画像を作製するためのシステム、方法、およびソフトウェア
KR102402381B1 (ko) * 2016-12-19 2022-05-27 소니그룹주식회사 정보 처리 장치, 정보 처리 방법, 및 프로그램
CN110073660A (zh) * 2016-12-19 2019-07-30 索尼公司 信息处理设备、信息处理方法和程序
KR20190096976A (ko) * 2016-12-19 2019-08-20 소니 주식회사 정보 처리 장치, 정보 처리 방법, 및 프로그램
JPWO2018116580A1 (ja) * 2016-12-19 2019-11-07 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
US11106050B2 (en) 2016-12-19 2021-08-31 Sony Corporation Information processing apparatus, and information processing method
WO2018116580A1 (fr) * 2016-12-19 2018-06-28 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP7099326B2 (ja) 2016-12-19 2022-07-12 ソニーグループ株式会社 情報処理装置、情報処理方法、及びプログラム
US11924402B2 (en) 2016-12-19 2024-03-05 Sony Group Corporation Information processing apparatus and information processing method
CN110059670A (zh) * 2019-04-29 2019-07-26 杭州雅智医疗技术有限公司 人体头面部、肢体活动角度及体姿非接触测量方法及设备
CN110059670B (zh) * 2019-04-29 2024-03-26 杭州雅智医疗技术有限公司 人体头面部、肢体活动角度及体姿非接触测量方法及设备

Also Published As

Publication number Publication date
JPWO2007029686A1 (ja) 2009-03-19
JP4619412B2 (ja) 2011-01-26

Similar Documents

Publication Publication Date Title
US7349006B2 (en) Image processing apparatus and method, recording medium, and program
US8218855B2 (en) Method and apparatus for receiving multiview camera parameters for stereoscopic image, and method and apparatus for transmitting multiview camera parameters for stereoscopic image
US9544498B2 (en) Method for forming images
JP4476905B2 (ja) 立体表示画像データの構造、立体表示画像データの記録方法、表示再生方法、記録プログラム、および表示再生プログラム
US20090284584A1 (en) Image processing device
JP2002077943A (ja) 画像取り扱い装置
US20090244258A1 (en) Stereoscopic display apparatus, stereoscopic display method, and program
JP2010078768A (ja) 立体映像撮影装置および立体映像撮影システム
JP5420075B2 (ja) 立体画像再生装置、その視差調整方法、視差調整プログラム、及び撮影装置
US20110242273A1 (en) Image processing apparatus, multi-eye digital camera, and program
JP4619412B2 (ja) 立体画像記録再生システム
US20110193937A1 (en) Image processing apparatus and method, and image producing apparatus, method and program
JP4975256B2 (ja) 立体映像呈示装置
EP2566166B1 (fr) Dispositif d'imagerie en trois dimensions
JP2005130310A (ja) 立体視画像処理装置
US5874987A (en) Method for recording stereoscopic images and device for the same
JP4657066B2 (ja) 立体表示装置
US20130272677A1 (en) Image file generation device, image file reproduction device, and image file generation method
JPH0715748A (ja) 画像記録再生装置
JP2005130312A (ja) 立体視画像処理装置,コンピュータプログラム,および視差補正方法
JP2011119825A (ja) 映像処理装置および映像処理方法
JP2001052192A (ja) 撮影表示システム、立体画像表示方法及び記憶媒体
JP2006128899A (ja) 撮像装置
WO2012117460A1 (fr) Dispositif de lecture vidéo 3d
JP2002300608A (ja) 立体画像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase

Ref document number: 2007534423

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06797437

Country of ref document: EP

Kind code of ref document: A1