US9087406B2 - Automated stereoscopic computer-animation techniques for determining scaled stereo parameters - Google Patents
- Publication number
- US9087406B2 (application US13/802,692, US201313802692A)
- Authority
- US
- United States
- Prior art keywords
- value
- parallax
- computer
- scaled
- focal length
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H04N13/0007—
-
- H04N13/0022—
-
- H04N13/0203—
-
- H04N13/0239—
-
- H04N13/0246—
-
- H04N13/0275—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- The present disclosure relates generally to generating optimized stereo settings for computer animation, and more specifically to calculating scaled bounded-parallax constraints for a computer-generated object in view of a pair of stereoscopic cameras within a computer-generated scene.
- Cinematographic-quality computer animation has evolved to produce increasingly realistic and engaging visual effects.
- One way that this is accomplished is through the use of stereoscopic filming techniques that simulate human binocular vision by presenting slightly different viewpoints of a scene to a viewer's left and right eye.
- This technique, also known colloquially as “3D,” can be used to enhance the illusion of depth perception and make objects in a computer-generated scene appear to extend outward from a two-dimensional screen.
- Each eye views the world from a slightly different perspective.
- The difference in the view from each eye, also called parallax, is caused, in part, by the spatial separation between the eyes.
- The amount of parallax is greater for objects that are closer to the viewer than for objects that are farther from the viewer.
- The brain is able to combine the different views from each eye and use the parallax between views to perceive the relative depth of real-world objects.
- Computer-animation stereoscopic filming techniques take advantage of the brain's ability to judge depth through parallax by presenting separate images to each eye. Each image depicts a computer-generated object from a slightly different viewpoint. The distance between the left and right images displayed on a screen (parallax) indicates the relative depth of the displayed computer-generated object. Parallax can be positive or negative depending on whether the computer-generated object appears to be behind the screen (positive parallax) or if it appears to be in front of the screen (negative parallax).
- A film maker (e.g., a director or stereographer) may be able to adjust scene parameters (stereoscopic parameters) that determine the camera position, camera separation, camera convergence, and focal length of the lens to increase or decrease the stereo effect (the perceived depth of a computer-generated object in a computer-generated scene).
- Another traditional solution is to provide the director or stereographer with direct control over the scene parameters for each scene in a film.
- This approach also has drawbacks in that it may be difficult to fine-tune all of the scene parameters to achieve the desired amount of stereo effect. Too little stereo effect and the objects in the scene will appear flat. Too much stereo effect and the objects may appear distorted, or the scene may become uncomfortable to view. Additionally, because this approach relies on manual input, the stereo effect may be inconsistent throughout the film sequence, especially when stereo adjustments are applied to a particular scene but not to others.
- Scaled-parallax constraints are determined for the placement of a pair of stereoscopic cameras within a computer-generated scene.
- The focal length of the pair of stereoscopic cameras is obtained.
- A lower-bound value and an upper-bound value for a range of focal lengths are also obtained.
- A set of bounded-parallax constraints, including a near-parallax value and a far-parallax value, is also obtained. If the focal length is less than the lower-bound value, then a scaled near-parallax value is calculated based on the near-parallax value and a lower-scale factor that is based on the lower-bound value.
- A scaled far-parallax value is also calculated based on the far-parallax value and the lower-scale factor. If the focal length is greater than the upper-bound value, then the scaled near-parallax value is calculated based on the near-parallax value and an upper-scale factor that is based on the upper-bound value. The scaled far-parallax value is also calculated based on the far-parallax value and the upper-scale factor. If the focal length is greater than or equal to the lower-bound value and less than or equal to the upper-bound value, then the scaled near-parallax value is set as the near-parallax value, and the scaled far-parallax value is set as the far-parallax value. The scaled near-parallax value and scaled far-parallax value are stored as the scaled-parallax constraints.
- The lower-scale factor is computed as the ratio of the lower-bound value to the focal length.
- The upper-scale factor is computed as the ratio of the upper-bound value to the focal length.
- Each camera of the pair of stereoscopic cameras is positioned relative to the other based on the scaled-parallax constraints, and a stereoscopic image of the computer-generated scene is created with the pair of stereoscopic cameras.
- The stereoscopic image is stored in computer memory.
- A camera separation value and a convergence value may be calculated for the pair of stereoscopic cameras based on the scaled near-parallax and scaled far-parallax values. These values may also be stored in computer memory. Each camera of the pair of stereoscopic cameras is positioned relative to the other within the computer-generated scene based on the camera separation value and the convergence value.
- FIG. 1 depicts a stereoscopically filmed, computer-generated scene.
- FIGS. 2A and 2B depict exemplary configurations for stereoscopically filming a computer-generated scene.
- FIG. 2C depicts an exemplary configuration for displaying a stereoscopically filmed scene.
- FIG. 3 depicts an exemplary process for determining a set of scene parameters based on a selected lens focal length.
- FIG. 4 depicts an exemplary process for determining a set of scene parameters using baseline stereo settings.
- FIG. 5 depicts an exemplary computer-animated scene filmed by a pair of stereoscopic cameras.
- FIG. 6 depicts an exemplary computer system.
- FIG. 1 depicts a stereoscopically filmed, computer-generated scene.
- The scene depicts two animated characters viewed in profile. For purposes of this discussion, each animated character is treated as a single computer-generated object.
- The image depicted in FIG. 1 is a composite of two views of the computer-generated scene: one view from a left camera and one view from a right camera.
- The left and right camera views can be used to produce a stereoscopic image of the computer-generated scene.
- The cameras used to produce the left and right views are offset by a distance that corresponds to an estimated distance between the viewer's eyes (approximately 65 mm).
- The image in FIG. 1 appears slightly blurred because the animated characters (exemplary computer-generated objects) are viewed from the slightly different positions of the left and right cameras.
- To produce the stereo effect, the left-camera view is presented to the viewer's left eye in isolation and the right-camera view is presented to the viewer's right eye in isolation.
- This can be achieved using a number of techniques that are known in the art, including, for example, use of stereoscopic glasses. Using these known techniques, the left-camera view is separately presented to the left eye using polarized or color-coded light that corresponds to a polarized or color-coded left lens of the stereoscopic glasses. Similarly, the right-camera view is separately presented to the right eye using polarized or color-coded light that is distinguishable from the left-camera view.
- The viewer is able to mentally and visually combine the left-camera and right-camera views into a composite image that includes a certain degree of parallax for one or more computer-generated objects.
- The greater the parallax, the closer or farther the computer-generated object appears to the viewer (with respect to the display screen).
- A film maker can use this stereo effect to make computer-generated objects appear to have depth even though they are displayed on what is essentially a two-dimensional display screen.
- The computer-generated scene can be animated using traditional computer-animation techniques, and the scene can be stereoscopically filmed.
- The resulting stereoscopic film sequence comprises a series of image frames, each image frame representing the computer-generated scene at a point in time.
- The computer-generated scene can be made to depict a live-action scene appearing to have depth due to the stereoscopic effect of the filming technique.
- FIGS. 2A and 2B depict exemplary optical configurations of a stereoscopically filmed computer-generated scene in camera space.
- The configurations include a left camera (202, 212) and a right camera (204, 214) that are capable of viewing a point (210, 220) on an object in a computer-generated scene.
- FIGS. 2A and 2B depict alternative configurations for positioning the cameras when filming the computer-generated scene.
- FIG. 2A depicts a converged camera configuration with the cameras 202 and 204 pointed inward at a convergence angle and converging along a curved convergence surface.
- FIG. 2B depicts an alternative configuration with cameras 212 and 214 pointed in a parallel direction and having sensors (216, 218) offset from the center of their respective lenses by a distance h.
- The parallel cameras 212 and 214 converge along a convergence plane.
- Either of the camera configurations shown in FIG. 2A or 2B can be used to stereoscopically film a computer-generated scene.
- The left and right cameras (202, 204) each record a different image of the computer-generated scene, which includes point 210.
- The left camera 202 records an image of the point 210 at left-image location (Slx, Sly) using the left camera sensor 206.
- The right camera 204 records an image of the point 210 at right-image location (Srx, Sry) using the right camera sensor 208.
- The difference between the left-image location (Slx, Sly) and the right-image location (Srx, Sry) indicates the amount of parallax for point 210.
- In FIG. 2B, the left and right cameras (212, 214) each record a different image of the point 220 at left-image location (Slx, Sly) for left sensor 216 and right-image location (Srx, Sry) for right sensor 218.
- FIGS. 2A and 2B also depict several scene parameters that have an impact on how computer-generated objects or points in the computer-generated scene will be perceived by the viewer.
- The three-dimensional scene coordinate (Cx, Cy, Cz) describes the location of the point 210 within the computer-generated scene.
- Convergence distance c is the distance from the lenses to the convergence surface or convergence plane.
- The convergence surface/plane corresponds to the location of points that will have zero parallax between the left and right images. Also, points located further away from the convergence surface/plane will have greater parallax than points that are closer to the convergence surface/plane.
- The camera separation t represents the distance between optical nodes of the left and right cameras, and may also have an impact on the amount of parallax.
- The left and right cameras also have sensor width Wc and a focal length f from the sensor to the lens.
- FIG. 2C depicts an exemplary configuration of a stereoscopically filmed computer-generated scene in viewer space.
- Viewer space represents how a stereoscopically filmed, computer-generated scene may be perceived by a modeled viewer located a specified distance from a modeled screen.
- The modeled viewer has an inter-ocular distance e and is positioned a distance Vz from the modeled screen having a screen width Ws.
- FIG. 2C depicts how the left and right views, each presented to the modeled viewer's left and right eye respectively, result in eye convergence that makes points appear to lie out of the plane of the screen.
- FIG. 2C depicts perceived point 310 that appears to be behind the screen plane, and perceived point 320 that appears to be in front of the screen plane.
- Perceived point 310 is represented by left-camera image 312 and right-camera image 314. Because the left-camera image 312 is to the left of right-camera image 314, the perceived point 310 is said to have positive parallax and will appear to the viewer to have a depth that is greater than the distance from the viewer to the screen Vz. In other words, to the viewer, the perceived point 310 will appear to exist behind the screen plane.
- Perceived point 320 is represented by left-camera image 322 and right-camera image 324. Because the left-camera image 322 is to the right of right-camera image 324, the perceived point 320 is said to have negative parallax and will appear to the viewer to have a depth that is less than the distance from the viewer to the screen Vz. In other words, to the viewer, the perceived point 320 will appear to exist in front of the screen plane.
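- As a minimal illustration of this sign convention (not part of the patent; the function and coordinate convention are assumptions), the following Python sketch classifies a perceived point from the screen-space x-coordinates of its left- and right-camera images, with x increasing to the right.

```python
def classify_parallax(left_image_x: float, right_image_x: float) -> str:
    """Classify screen parallax for one perceived point (illustrative sketch).

    Per the description above: if the left-camera image lies to the left of the
    right-camera image, the parallax is positive and the point appears behind
    the screen plane; if it lies to the right, the parallax is negative and the
    point appears in front of the screen plane.
    """
    parallax = right_image_x - left_image_x  # signed screen parallax (assumed definition)
    if parallax > 0:
        return "positive parallax: point appears behind the screen plane"
    if parallax < 0:
        return "negative parallax: point appears in front of the screen plane"
    return "zero parallax: point appears at the screen plane"


# Example: the left image at x = 0.10 and the right image at x = 0.12 on the
# screen give positive parallax, so the point appears behind the screen plane.
print(classify_parallax(0.10, 0.12))
```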
- A set of bounded-parallax constraints typically includes a far-parallax value, a near-parallax value, a near distance, a far distance, and a focal length.
- The far-parallax value is the maximum positive parallax for the computer-generated scene and is typically expressed in terms of pixels or a percentage of screen width.
- The near-parallax value is the minimum negative parallax for the computer-generated scene and is also typically expressed in terms of pixels or a percentage of screen width.
- The near distance and far distance are the near and far limits of where computer-generated objects may be placed within the computer-generated scene.
- Focal length is the focal length of the pair of stereoscopic cameras and is depicted as f in FIGS. 2A and 2B, above.
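- For illustration only, such a set of constraints can be collected into a simple record. The Python sketch below is not from the patent; the field names and the choice of pixels as the parallax unit are assumptions.

```python
from dataclasses import dataclass


@dataclass
class BoundedParallaxConstraints:
    """Illustrative container for the bounded-parallax constraints described above."""
    near_parallax: float  # negative-parallax limit (e.g., in pixels or % of screen width)
    far_parallax: float   # maximum positive parallax (e.g., in pixels or % of screen width)
    near_distance: float  # near limit for placing computer-generated objects
    far_distance: float   # far limit for placing computer-generated objects
    focal_length: float   # focal length f of the pair of stereoscopic cameras (mm)


# Example: a hypothetical constraint set for a 28 mm lens.
constraints = BoundedParallaxConstraints(
    near_parallax=-10.0, far_parallax=28.0,
    near_distance=0.4, far_distance=100000.0, focal_length=28.0)
```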
- The amount of stereo effect perceived by the viewer can be controlled by manipulating the bounded-parallax constraints or the other scene parameters discussed above with respect to FIGS. 2A and 2B.
- The scene parameters (e.g., the bounded-parallax constraints) and the optimal parameter settings may also vary over the course of a computer-animated film sequence.
- One solution is to manually adjust the scene parameters to suit the requirements for each scene.
- This approach typically requires the direct involvement of a skilled director or stereographer to ensure that the settings are appropriate. Even then, it may be difficult to maintain consistency across scenes in a film or across films produced by the same studio.
- A stereo configuration can be specified with respect to a set of bounded-parallax constraints, including a far-parallax value, a near-parallax value, a near distance, a far distance, and a focal length.
- The quality of the stereo effect for a scene depends, in part, on the relationship between the bounded-parallax constraints.
- A pair of near- and far-parallax values produces a satisfactory stereo effect only for a limited range of lens focal lengths. That is, for lenses having a focal length outside of this range, the near- and far-parallax values may produce a stereo effect that appears distorted to the viewer. For example, using a lens having a focal length that is too large or too small may result in scene geometry that appears unnaturally flat or stretched when stereoscopically viewed.
- FIG. 3 depicts an exemplary process 1200 for determining scaled-parallax constraints based on a predetermined lens palette.
- Process 1200 can be used to boost (amplify) or crush (attenuate) the amount of stereo effect for a particular scene based on the focal length of the lens as compared to the predetermined lens palette.
- A focal length of the lens is obtained.
- The focal length is typically that of the lens of one of the pair of stereoscopic cameras used to stereoscopically film a computer-generated scene.
- The focal length of the lens determines, in part, the field of view of the pair of stereoscopic cameras and can be selected to provide a particular view of the computer-generated scene.
- The focal length is selected by the film maker by selecting a virtual lens.
- The focal length may also be selected implicitly by fixing the field of view for a particular shot.
- The film maker typically selects a lens having a focal length ranging from 14 mm to 200 mm. The shorter the selected focal length, the wider the field of view. Conversely, the longer the selected focal length, the narrower the field of view, increasing the zoom.
- An upper-bound and a lower-bound value for the lens palette are obtained.
- In one typical lens palette, the lower-bound value is approximately 22 mm and the upper-bound value is approximately 28 mm.
- Another typical lens palette is defined by a lower-bound value of approximately 24 mm and an upper-bound value of approximately 32 mm.
- The near-parallax and far-parallax values are obtained.
- The near-parallax and far-parallax values may be obtained from manual settings or obtained from another automated process.
- The near-parallax and far-parallax values may be manually set by a director or stereographer to produce a desired stereo effect.
- The near-parallax and far-parallax values can also be calculated based on sets of predetermined stereo-setting values.
- The near-parallax and far-parallax values may also be obtained from an automated process for determining baseline stereo settings, as described below with respect to process 1100.
- The values are obtained and typically stored, at least temporarily, in computer memory.
- The prime lens ratio is set to 1.0 if the focal length of the lens is greater than or equal to the lower-bound value and less than or equal to the upper-bound value. That is, if the focal length of the lens falls within the given lens palette, the prime lens ratio is 1.0. If the focal length is less than the lower-bound value, the prime lens ratio is instead computed per Equation 1, below (PLR=LB/f); if the focal length is greater than the upper-bound value, it is computed per Equation 2, below (PLR=UB/f).
- A scaled near-parallax value and a scaled far-parallax value are calculated based on the prime lens ratio.
- The scaled near-parallax value is calculated as the product of the near-parallax value obtained in operation 1206 and the prime lens ratio calculated in operations 1208, 1210, or 1212.
- The scaled far-parallax value is calculated as the product of the far-parallax value obtained in operation 1206 and the prime lens ratio calculated above.
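- The scaling logic described above can be summarized in a short sketch. The Python below is illustrative only (the function and variable names are not from the patent); it follows Equations 1 and 2 and the three focal-length cases just described.

```python
def scale_parallax_for_lens(focal_length: float,
                            lower_bound: float, upper_bound: float,
                            near_parallax: float, far_parallax: float):
    """Scale the bounded-parallax values by the prime lens ratio (illustrative sketch).

    PLR = LB / f when f < LB (Equation 1), PLR = UB / f when f > UB (Equation 2),
    and PLR = 1.0 when the focal length falls within the lens palette.
    """
    if focal_length < lower_bound:
        prime_lens_ratio = lower_bound / focal_length   # Equation 1: boost for short lenses
    elif focal_length > upper_bound:
        prime_lens_ratio = upper_bound / focal_length   # Equation 2: crush for long lenses
    else:
        prime_lens_ratio = 1.0                          # focal length is within the lens palette
    scaled_near_parallax = near_parallax * prime_lens_ratio
    scaled_far_parallax = far_parallax * prime_lens_ratio
    return scaled_near_parallax, scaled_far_parallax


# Example: a 35 mm lens against a 22-28 mm palette gives PLR = 28/35 = 0.8,
# so near/far parallax values of -10 and 20 pixels become -8 and 16 pixels.
print(scale_parallax_for_lens(35.0, 22.0, 28.0, near_parallax=-10.0, far_parallax=20.0))
```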
- The scaled near-parallax value and scaled far-parallax value are stored in computer memory.
- The scaled near-parallax value and scaled far-parallax value may be stored, for example, on a non-transitory computer-readable storage medium, such as a computer storage disk.
- Other computer storage examples are provided below and discussed with respect to FIG. 6 .
- The scaled near-parallax value and scaled far-parallax value can also be used to calculate other stereoscopic parameters for the computer-generated scene. For example, a camera separation value and a convergence value can be calculated for the pair of cameras based on the scaled far-parallax value and the scaled near-parallax value. Equation 3, below, depicts an exemplary technique for calculating a camera separation value t using far-parallax and near-parallax values.
- Wc is the camera sensor width
- Rc is the resolution of the camera sensor
- nd is the minimum scene depth (near distance)
- fs is the far-parallax (far shift)
- fd is the maximum scene depth (far distance)
- ns is the near-parallax (near shift).
- Equation 4 depicts an exemplary technique for calculating a convergence distance value c using far-parallax and near-parallax values.
- The camera separation value and the convergence distance value may also be stored, for example, on a non-transitory computer-readable storage medium, such as a computer storage disk.
- Other computer storage examples are provided below and discussed with respect to FIG. 6 .
- The camera separation value t and the convergence distance value c may be used to position the pair of stereoscopic cameras in the computer-generated scene.
- FIG. 2A depicts a converged camera configuration with the cameras 202 and 204 separated by a distance t and pointed inward at a convergence angle, converging along a curved convergence surface that is a convergence distance c from the cameras.
- FIG. 2B depicts an alternative configuration with cameras 212 and 214 also separated by a distance t and pointed in a parallel direction.
- The camera sensors (216, 218) are offset from the center of their respective lenses by a distance h, and the parallel cameras 212 and 214 converge along a convergence plane that is a convergence distance c from the cameras.
- The convergence principles shown in FIGS. 2A and 2B can also be combined to produce a convergence configuration with the cameras both pointed inward at a convergence angle and having sensors offset by a distance h. Equations 5 and 6, below, demonstrate the relationship between the convergence value c and parameters that directly specify the position of the pair of stereoscopic cameras.
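- The bodies of Equations 5 and 6 are not reproduced in this text. As a hedged illustration only, the sketch below uses the standard geometric relations for the two configurations of FIGS. 2A and 2B (toe-in angle from t and c, and sensor offset h from f, t, and c); these may differ from the patent's exact formulation.

```python
import math


def toe_in_angle_degrees(camera_separation_t: float, convergence_distance_c: float) -> float:
    """Converged configuration (FIG. 2A): inward rotation of each camera.

    Standard relation assumed here (Equation 5 is not reproduced in this text):
    tan(angle) = (t / 2) / c.
    """
    return math.degrees(math.atan2(camera_separation_t / 2.0, convergence_distance_c))


def sensor_offset_h(camera_separation_t: float, convergence_distance_c: float,
                    focal_length_f: float) -> float:
    """Parallel configuration (FIG. 2B): horizontal sensor offset h for each camera.

    Standard relation assumed here (Equation 6 is not reproduced in this text):
    h = f * t / (2 * c), with all quantities in consistent length units.
    """
    return focal_length_f * camera_separation_t / (2.0 * convergence_distance_c)


# Example: cameras separated by 65 mm converging at 3 m with a 28 mm lens.
print(toe_in_angle_degrees(0.065, 3.0))    # about 0.62 degrees of toe-in per camera
print(sensor_offset_h(0.065, 3.0, 0.028))  # about 0.0003 m (0.3 mm) of sensor offset
```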
- A stereoscopic image of the computer-generated scene can be captured by the camera sensors.
- The image may be saved as a single image, or associated with a frame or time entry in an animated film sequence.
- A series of stereoscopic images are captured as the computer-generated objects in the computer-generated scene are manipulated to produce a computer animation sequence.
- The bounded-parallax constraints calculated using process 1200 may remain the same throughout a given computer animation sequence. However, in some cases, the bounded-parallax constraints or the placement for a pair of stereoscopic cameras may change one or more times during the computer animation sequence. For example, the focal length may change due to changes in the settings of a zoom lens or in the field of view of the cameras. As a result, the scaled bounded-parallax constraints may be re-calculated to account for the changes in the focal length.
- The stereoscopic image or images can be displayed to a viewer using known stereoscopic techniques to produce a scene appearing to have depth in and out of the screen.
- The stereoscopic image may be displayed to a viewer who is viewing the image through stereo glasses.
- Process 1200 can be combined with other automated processes for producing or refining scene parameters for a stereoscopically filmed scene.
- Process 1200 can be combined with process 1100, discussed below with respect to FIG. 4, for obtaining baseline bounded-parallax constraints.
- Process 1100 may be used, for example, to obtain the bounded-parallax constraints in operation 1206 of process 1200.
- U.S. Provisional Application No. 61/678,568 describes other exemplary processes for calculating scaled parallax constraints, creative control of parallax constraints, scripted parallax constraints, and other parameters that can be combined with process 1200 , described above.
- Application No. 61/678,568 is incorporated by reference herein in its entirety as an example of other automated processes that may be combined with process 1200. Combining multiple processes may produce stereo settings that result in an optimal stereo effect and are more consistent across a film sequence, as compared to traditional manual stereo-setting techniques.
- FIG. 4 depicts a flow chart of an exemplary process 1100 for determining the bounded-parallax constraints for the placement for a pair of stereoscopic cameras in a computer-generated scene using baseline stereo settings.
- Process 1100 can be used to determine acceptable stereo settings for a particular computer-generated scene based on one or more tables of baseline stereo settings.
- The one or more tables of baseline stereo settings are typically formulated in advance and include groupings of stereo parameters (stereo-setting entries) that are known to produce an acceptable stereo effect for a particular scene layout.
- The one or more tables of baseline stereo settings are manually created by a stereographer or director having skill in configuring stereo settings for a scene.
- A computer-generated scene includes at least one computer-generated object, which is in view of at least one camera of a pair of stereoscopic cameras.
- FIG. 5 depicts an exemplary computer-generated scene 400 with two animated characters (450, 452) (exemplary computer-generated objects) in view of a pair of stereoscopic cameras (402, 404).
- Each camera has a camera sensor (406, 408) positioned (centered or offset) with respect to a lens having a focal length f.
- The field of view of the stereoscopic cameras is determined, in part, by the focal length f of the camera lenses and defines the visual boundaries of the computer-generated scene.
- The minimum scene depth is calculated.
- The minimum scene depth is based on the distance from the cameras to the nearest point of interest in a computer-generated scene.
- The nearest point of interest may be the point on the computer-generated object that is closest to the camera.
- FIG. 5 depicts an exemplary configuration for a computer-generated scene 400 having two computer-animated characters (450, 452) positioned with respect to cameras (402, 404).
- Distance 420 represents the distance from the pair of cameras (402, 404) to the nearest point 455 on the nearest animated character in the scene 400.
- Distance 420 is measured from the midpoint between the pair of cameras (402, 404) to a point 455 on the nearest animated character 450.
- Other points related to the location of the pair of cameras (402, 404), including, for example, the location of the camera sensors (406, 408) or the location of the lenses, could also be used as reference points for the distance to the nearest point 455.
- The nearest point of interest in a computer-generated scene can be determined by scanning or sampling the computer-generated scene over a projected area associated with a middle portion of the camera sensor.
- The projected area associated with a middle portion of the sensor can be determined based on the field of view of the pair of stereoscopic cameras.
- The pair of camera sensors (406, 408) is associated with a primary projected area 410 defined, in part, by the field of view of the pair of cameras (402, 404).
- The primary projected area 410 roughly corresponds to the complete image that will be captured by the cameras and presented to the viewer.
- The primary projected area 410 typically includes transient objects like brush, leaves, and ground effects that may change over the film sequence.
- Selecting a nearest point of interest based on the location of these transient objects may produce a minimum scene depth that changes rapidly and results in a jumpy stereo effect.
- These transient objects are also typically located near the periphery of the primary projected area 410 and are not likely to be the subject of the viewer's attention.
- The transient objects may include objects on the ground or foliage that surrounds the primary subjects in the scene.
- A second projected area associated with the middle portion of the camera sensor can be used to determine the minimum scene depth.
- A secondary projected area 411 associated with a middle portion of the pair of camera sensors is defined.
- The secondary projected area 411 typically includes the subject of the viewer's attention and excludes many of the transient objects included in the periphery of the primary projected area 410.
- The size of the secondary projected area 411 can be determined in order to produce a minimum scene depth that will be consistent across the film sequence and will also correspond to the subject of the viewer's attention.
- In the example depicted in FIG. 5, the size of the secondary projected area 411 is approximately 2/3 of the field of view of the pair of cameras (or approximately 2/3 of the size of the primary projected area 410). In other cases, the secondary projected area 411 may be, for example, approximately 3/4, 1/2, 1/4, or any other fractional size of the field of view of the pair of cameras.
- The secondary projected area 411 may be scanned or sampled for the nearest point of interest.
- A depth array of depth pixels is defined over the secondary projected area 411.
- Each depth pixel is associated with a depth value representing, for example, a distance from the pair of cameras to the nearest intersecting object in the scene as measured along an imaginary line that originates at one or more camera sensor pixels and passes through the corresponding depth pixel in the depth array. If an intersecting object is not present for the depth pixel, the depth value may be empty or zero.
- A scanning algorithm may be implemented that finds the lowest non-zero or non-empty depth value in the depth array and stores that value as the nearest point of interest.
- Alternatively, a sampling algorithm may be implemented that selects a subset of the depth pixels in the depth array, finds the lowest non-zero or non-empty depth value, and stores that value as the nearest point of interest.
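- A minimal sketch of the scan and sample operations just described (illustrative only; the function name and stride-based sampling are assumptions, not the patent's implementation):

```python
from typing import Optional, Sequence


def minimum_scene_depth(depth_array: Sequence[Sequence[Optional[float]]],
                        sample_stride: int = 1) -> Optional[float]:
    """Return the nearest point of interest from a depth array defined over the
    secondary projected area.

    Each depth pixel holds the distance from the pair of cameras to the nearest
    intersecting object along that pixel's ray; empty pixels are None or 0.
    A stride of 1 scans every depth pixel; a larger stride samples a subset.
    """
    nearest = None
    for row in depth_array[::sample_stride]:
        for depth in row[::sample_stride]:
            if depth:  # skip empty (None) and zero depth values
                if nearest is None or depth < nearest:
                    nearest = depth
    return nearest


# Example: a small depth array with some empty pixels; the minimum scene depth is 7.5.
depths = [[None, 12.0, 9.0, None],
          [0.0, 8.2, 7.5, 11.0],
          [None, 10.1, 9.4, None]]
print(minimum_scene_depth(depths))  # 7.5
```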
- One or more computer-generated objects may be tagged as important for the scene.
- An animated character may be manually tagged by the film maker as important because the animated character is the subject of the scene or film sequence.
- The nearest point on the nearest tagged computer-generated object with respect to the pair of cameras is used as the nearest point of interest.
- A near-parallax value (or near shift ns) is calculated based on a set of baseline stereo settings.
- The near-parallax value typically represents the maximum negative parallax between left and right views of a computer-generated object in the scene.
- The set of baseline stereo settings includes multiple stereo-setting entries of setting parameter values that are known to produce a satisfactory stereo effect. Specifically, each stereo-setting entry specifies a recommended scene depth, a recommended focal length, and a recommended near-parallax value.
- The multiple stereo-setting entries may be determined in advance and stored in a database or series of tables. As previously mentioned, the stereo-setting entries may be manually created in advance by a stereographer or director having skill in configuring stereo settings for a scene.
- The near-parallax value is calculated by selecting a stereo-setting entry having a recommended scene depth that corresponds to the minimum scene depth (determined in operation 1102) and having a recommended focal length that corresponds to the focal length of the pair of cameras.
- The near-parallax value is calculated based on the recommended near-parallax value of the selected stereo-setting entry.
- The set of baseline stereo settings is stored to facilitate the selection of a stereo-setting entry given two of the three recommended values.
- Pairs of recommended near-parallax values and associated recommended scene depths are stored in a table of stereo-setting entries. Multiple tables of stereo-setting entries are created, each table associated with a recommended focal length. Using this storage configuration, a table associated with a recommended focal length can be selected based on the focal length of the pair of cameras. Within the selected table, a stereo-setting entry having a recommended scene depth that corresponds to the minimum scene depth (determined in operation 1102) can be selected. The near-parallax value can then be determined based on the recommended near-parallax value of the selected stereo-setting entry.
- In some cases, the selected table will not have a recommended scene depth that exactly matches the minimum scene depth.
- In that case, two or more stereo-setting entries may be selected and the near-parallax value can be determined by interpolating between two or more parameter values associated with the selected stereo-setting entries.
- Recommended parameter values from multiple focal-length tables can also be used to interpolate the near-parallax value.
- The set of baseline stereo settings includes recommended focal lengths ranging from 14 mm to 200 mm and recommended scene depths ranging from 0.4 to 100,000 length units.
- ns = stereoLUT(nd, f), (7) where nd is the minimum scene depth (determined in operation 1102), f is the focal length of the pair of cameras, and ns is the near-parallax value (or near shift).
- A far-parallax value is calculated based on the focal length of the lenses of the pair of cameras.
- The far-parallax value typically represents the maximum positive parallax between left and right views of a computer-generated object in the scene.
- The far-parallax value is based on the focal length of the pair of cameras. Equation 8, below, depicts an exemplary relationship between the far-parallax value and focal length.
- fs = K*f, (8) where fs is the far-parallax value (or far shift), f is the focal length of the pair of cameras, and K is a scalar value.
- In the present embodiment, K is 1.0, resulting in a far-parallax value that equals the focal length of the pair of cameras.
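- The two operations above can be sketched as follows. The Python below is illustrative only: the table contents, names, and the linear-interpolation scheme are assumptions, not the patent's data; only the relationships ns = stereoLUT(nd, f) and fs = K*f come from the description.

```python
import bisect

# Hypothetical baseline stereo settings: one table per recommended focal length (mm),
# each mapping a recommended scene depth (length units) to a recommended
# near-parallax value (pixels). The numbers below are placeholders.
BASELINE_STEREO_SETTINGS = {
    24.0: [(0.4, -2.0), (5.0, -12.0), (50.0, -30.0)],
    35.0: [(0.4, -3.0), (5.0, -16.0), (50.0, -40.0)],
}


def near_parallax_from_lut(min_scene_depth: float, focal_length: float) -> float:
    """Equation 7, ns = stereoLUT(nd, f): select the table for the camera focal
    length and linearly interpolate between scene-depth entries (illustrative)."""
    table = BASELINE_STEREO_SETTINGS[focal_length]  # exact focal-length match assumed here
    depths = [depth for depth, _ in table]
    i = bisect.bisect_left(depths, min_scene_depth)
    if i == 0:
        return table[0][1]                           # below the table: clamp to first entry
    if i == len(table):
        return table[-1][1]                          # above the table: clamp to last entry
    (d0, ns0), (d1, ns1) = table[i - 1], table[i]
    weight = (min_scene_depth - d0) / (d1 - d0)      # linear interpolation weight
    return ns0 + weight * (ns1 - ns0)


def far_parallax(focal_length: float, k: float = 1.0) -> float:
    """Equation 8, fs = K * f, with K = 1.0 in the described embodiment."""
    return k * focal_length


# Example: a 10-unit minimum scene depth with a 35 mm lens.
print(near_parallax_from_lut(min_scene_depth=10.0, focal_length=35.0))  # about -18.7
print(far_parallax(35.0))                                               # 35.0
```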
- The near-parallax value and far-parallax value are stored.
- The values are typically stored and associated with the other bounded-parallax constraints (e.g., near distance, far distance, and focal length) that specify the stereo settings for an image or frame in a film sequence.
- The values may be stored, for example, on a non-transitory computer-readable storage medium, such as a computer storage disk. Other computer storage examples are provided below and discussed with respect to FIG. 6.
- The near-parallax and far-parallax values can be used to calculate other stereoscopic parameters for the computer-generated scene.
- A camera separation value and a convergence value can be calculated for the pair of cameras based on the far-parallax value and the near-parallax value.
- An exemplary calculation of the camera separation value and convergence value is provided with respect to Equations 3 and 4, above.
- The camera separation value and the convergence distance value may also be stored, for example, on a non-transitory computer-readable storage medium, such as a computer storage disk. Other computer storage examples are provided below and discussed with respect to FIG. 6.
- The camera separation value t and the convergence distance value c may be used to position the pair of stereoscopic cameras in the computer-generated scene.
- FIG. 2A depicts a converged camera configuration with the cameras 202 and 204 separated by a distance t and pointed inward at a convergence angle, converging along a curved convergence surface that is a convergence distance c from the cameras.
- FIG. 2B depicts an alternative configuration with cameras 212 and 214 also separated by a distance t and pointed in a parallel direction.
- The camera sensors (216, 218) are offset from the center of their respective lenses by a distance h, and the parallel cameras 212 and 214 converge along a convergence plane that is a convergence distance c from the cameras.
- The convergence principles shown in FIGS. 2A and 2B can also be combined to produce a convergence configuration with the cameras both pointed inward at a convergence angle and having sensors offset by a distance h. Equations 5 and 6, described above, demonstrate the relationship between the convergence value c and parameters that directly specify the position of the pair of stereoscopic cameras.
- The pair of stereoscopic cameras can be positioned within a computer-generated scene.
- A stereoscopic image of the computer-generated scene can be captured by the camera sensors.
- The image may be saved as a single image, or associated with a frame or time entry in an animated film sequence.
- A series of stereoscopic images are captured as the computer-generated objects in the computer-generated scene are manipulated to produce a computer animation sequence.
- The bounded-parallax constraints calculated using process 1100 may remain the same throughout a given computer animation sequence. However, in some cases, the bounded-parallax constraints or the placement for a pair of stereoscopic cameras may change one or more times during the computer animation sequence. For example, if the objects in the computer-generated scene move toward or away from the pair of stereoscopic cameras, the distance to the nearest point in the scene may change (e.g., point 455 in FIG. 5). In some cases, it may be beneficial to recalculate the bounded-parallax constraints based on an updated distance to the nearest point by repeating process 1100. Similarly, bounded-parallax constraints may be re-calculated for a changing focal length. For example, the focal length may change due to changes in the settings of a zoom lens.
- The stereoscopic image or images can be displayed to a viewer using known stereoscopic techniques to produce a scene appearing to have depth in and out of the screen.
- The stereoscopic image may be displayed to a viewer who is viewing the image through stereo glasses.
- Process 1100 can be combined with other automated processes for producing or refining scene parameters for a stereoscopically filmed scene.
- Process 1100 can be combined with process 1200, discussed above with respect to FIG. 3.
- The near-parallax and far-parallax values (or other bounded-parallax constraints) can be used in operation 1206 of process 1200, as discussed above.
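- Putting the two processes together, a per-shot flow might look like the following illustrative sketch. The function names refer to the earlier sketches in this text; none of this code is from the patent, and the default lens palette is only an example.

```python
def bounded_parallax_for_shot(depth_array, focal_length, lens_palette=(22.0, 28.0)):
    """Illustrative end-to-end flow: process 1100 supplies baseline near- and
    far-parallax values, and process 1200 scales them by the prime lens ratio."""
    nd = minimum_scene_depth(depth_array)            # minimum scene depth (operation 1102)
    ns = near_parallax_from_lut(nd, focal_length)    # baseline near-parallax from the LUT
    fs = far_parallax(focal_length)                  # baseline far-parallax, fs = K * f
    lower_bound, upper_bound = lens_palette          # predetermined lens palette bounds
    return scale_parallax_for_lens(focal_length, lower_bound, upper_bound, ns, fs)
```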
- U.S. Provisional Application No. 61/678,568 describes other exemplary processes for calculating scaled parallax constraints, creative control of parallax constraints, scripted parallax constraints, and other parameters that can be combined with process 1100 , described above. Combining multiple processes may produce stereo settings that result in an optimal stereo effect and are more consistent across a film sequence, as compared to traditional manual stereo-setting techniques.
- FIG. 6 depicts an exemplary computer system 2000 configured to perform any one of the above-described processes.
- Computer system 2000 may be a general-purpose computer including, for example, a processor, memory, storage, and input/output devices (e.g., monitor, keyboard, disk drive, Internet connection, etc.).
- Computer system 2000 may include circuitry or other specialized hardware for carrying out some or all aspects of the processes.
- Computer system 2000 may be configured as a system that includes one or more units, each of which is configured to carry out some aspects of the processes either in software, in hardware, or in some combination thereof.
- The computation of the bounded-parallax constraints in accordance with the processes described above may be performed on parallel computer processors or on separate computer systems.
- FIG. 6 depicts a computer system 2000 with a number of standard components that may be used to perform the above-described processes.
- The main system 2002 includes a motherboard 2004 having an input/output (“I/O”) section 2006, one or more central processing units (“CPU”) 2008, and a memory section 2010, which may have a flash memory card 2012 related to it.
- The I/O section 2006 is connected to a display 2024, a keyboard 2014, a disk storage unit 2016, and a media drive unit 2018.
- The media drive unit 2018 can read a computer-readable medium 2020, which typically contains computer-readable instructions 2022 and data.
- At least some values based on the results of the above-described processes can be saved for subsequent use.
- The outputs of the system, including the bounded-parallax constraints, can be saved directly in memory 2010 (e.g., RAM (Random Access Memory)) or another form of storage, such as disk storage 2016.
- Values derived from the bounded-parallax constraints, such as camera positions or images of the computer-generated scene, can also be saved directly in memory.
- The above-described processes may be used to define the bounded-parallax constraints for a computer-generated scene.
- Using these constraints, a user can compose and stereoscopically film a computer-generated scene to produce a stereoscopic image that does not require excessive convergence or divergence of the viewer's eyes.
- This stereoscopic image may be visualized as a still image or as part of a film sequence.
- The stereoscopic image may be stored in memory 2010 or disk storage 2016, or viewed on a computer display 2024.
- A non-transitory computer-readable medium can be used to store (e.g., tangibly embody) one or more computer programs for performing any one of the above-described processes by means of a computer.
- The computer program may be written, for example, in a general-purpose programming language (e.g., Pascal, C, C++) or some specialized application-specific language.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Processing Or Creating Images (AREA)
Description
PLR=LB/f, (1)
where PLR is the prime lens ratio, LB is the lower-bound value, and f is the focal length of the lens.
PLR=UB/f, (2)
where PLR is the prime lens ratio, UB is the upper-bound value, and f is the focal length of the lens.
where Wc is the camera sensor width, Rc is the resolution of the camera sensor, nd is the minimum scene depth (near distance), fs is the far-parallax (far shift), fd is the maximum scene depth (far distance), and ns is the near-parallax (near shift).
where f is the focal length of the pair of cameras, t is the camera separation value, and c is the convergence distance value. Thus, using the camera separation value t and the convergence distance value c, the pair of stereoscopic cameras can be positioned within a computer-generated scene.
ns=stereoLUT(nd,f), (7)
where nd is the minimum scene depth (determined in operation 1102), f is the focal length of the pair of cameras, and ns is the near-parallax value (or near shift).
fs=K*f, (8)
where fs is the far-parallax value (or far shift), f is the focal length of the pair of cameras, and K is a scalar value. In the present embodiment, K is 1.0, resulting in a far-parallax value that equals the focal length of the pair of cameras.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/802,692 US9087406B2 (en) | 2012-08-01 | 2013-03-13 | Automated stereoscopic computer-animation techniques for determining scaled stereo parameters |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261678568P | 2012-08-01 | 2012-08-01 | |
US13/802,692 US9087406B2 (en) | 2012-08-01 | 2013-03-13 | Automated stereoscopic computer-animation techniques for determining scaled stereo parameters |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140036038A1 US20140036038A1 (en) | 2014-02-06 |
US9087406B2 true US9087406B2 (en) | 2015-07-21 |
Family
ID=50025022
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/802,671 Active 2037-04-23 US10719967B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for placing masking window objects in a computer-generated scene for stereoscopic computer-animation |
US13/802,661 Active 2033-11-23 US9129436B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for smoothing scripted stereo curves for stereoscopic computer animation |
US13/802,692 Active 2033-11-05 US9087406B2 (en) | 2012-08-01 | 2013-03-13 | Automated stereoscopic computer-animation techniques for determining scaled stereo parameters |
US13/802,706 Active 2033-11-14 US9070222B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for automating stereo settings for stereoscopic computer animation |
US13/802,632 Active 2033-11-21 US9076262B2 (en) | 2012-08-01 | 2013-03-13 | Scripted stereo curves for stereoscopic computer animation |
US13/802,714 Active 2034-06-01 US9443338B2 (en) | 2012-08-01 | 2013-03-14 | Techniques for producing baseline stereo parameters for stereoscopic computer animation |
US13/802,716 Active 2034-10-19 US9582918B2 (en) | 2012-08-01 | 2013-03-14 | Techniques for producing creative stereo parameters for stereoscopic computer animation |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/802,671 Active 2037-04-23 US10719967B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for placing masking window objects in a computer-generated scene for stereoscopic computer-animation |
US13/802,661 Active 2033-11-23 US9129436B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for smoothing scripted stereo curves for stereoscopic computer animation |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/802,706 Active 2033-11-14 US9070222B2 (en) | 2012-08-01 | 2013-03-13 | Techniques for automating stereo settings for stereoscopic computer animation |
US13/802,632 Active 2033-11-21 US9076262B2 (en) | 2012-08-01 | 2013-03-13 | Scripted stereo curves for stereoscopic computer animation |
US13/802,714 Active 2034-06-01 US9443338B2 (en) | 2012-08-01 | 2013-03-14 | Techniques for producing baseline stereo parameters for stereoscopic computer animation |
US13/802,716 Active 2034-10-19 US9582918B2 (en) | 2012-08-01 | 2013-03-14 | Techniques for producing creative stereo parameters for stereoscopic computer animation |
Country Status (1)
Country | Link |
---|---|
US (7) | US10719967B2 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10719967B2 (en) | 2012-08-01 | 2020-07-21 | Dreamworks Animation L.L.C. | Techniques for placing masking window objects in a computer-generated scene for stereoscopic computer-animation |
US9161020B2 (en) * | 2013-04-26 | 2015-10-13 | B12-Vision Co., Ltd. | 3D video shooting control system, 3D video shooting control method and program |
CN104023221B (en) * | 2014-06-23 | 2016-04-13 | 深圳超多维光电子有限公司 | Stereo image parallax control method and device |
SG11201706709RA (en) * | 2015-02-20 | 2017-09-28 | Bungy New Zealand Ltd | Object movement control apparatus and method |
CA2978665A1 (en) * | 2015-03-17 | 2016-09-22 | Blue Sky Studios, Inc. | Methods, systems and tools for 3d animation |
US10828125B2 (en) * | 2015-11-03 | 2020-11-10 | Synaptive Medical (Barbados) Inc. | Dual zoom and dual field-of-view microscope |
CN105898280A (en) * | 2015-12-28 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | Stereoscopic film source play optimization method and device |
CN106251403B (en) * | 2016-06-12 | 2018-02-16 | 深圳超多维光电子有限公司 | A kind of methods, devices and systems of virtual three-dimensional Scene realization |
CN106375749B (en) * | 2016-09-12 | 2018-06-29 | 北京邮电大学 | A kind of disparity adjustment method and device |
GB201616413D0 (en) * | 2016-09-28 | 2016-11-09 | International Business Machines Corporation | Monitoring network addresses and managing data transfer |
EP3576407A1 (en) * | 2018-05-31 | 2019-12-04 | Nokia Technologies Oy | Stereoscopic content |
US11538214B2 (en) * | 2020-11-09 | 2022-12-27 | Meta Platforms Technologies, Llc | Systems and methods for displaying stereoscopic rendered image data captured from multiple perspectives |
Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020122585A1 (en) | 2000-06-12 | 2002-09-05 | Swift David C. | Electronic stereoscopic media delivery system |
US6496598B1 (en) | 1997-09-02 | 2002-12-17 | Dynamic Digital Depth Research Pty. Ltd. | Image processing method and apparatus |
US6512892B1 (en) | 1999-09-15 | 2003-01-28 | Sharp Kabushiki Kaisha | 3D camera |
US20030020708A1 (en) | 2001-07-23 | 2003-01-30 | Peter-Andre Redert | Image processing unit for and method of generating a first output image and a second output image and image display apparatus provided with such an image processing unit |
US6516099B1 (en) | 1997-08-05 | 2003-02-04 | Canon Kabushiki Kaisha | Image processing apparatus |
US6798406B1 (en) | 1999-09-15 | 2004-09-28 | Sharp Kabushiki Kaisha | Stereo images with comfortable perceived depth |
US7027659B1 (en) | 1998-05-20 | 2006-04-11 | Texas Instruments Incorporated | Method and apparatus for generating video images |
US20060098896A1 (en) | 2004-08-11 | 2006-05-11 | Acushnet Company | Apparatus and method for scanning an object |
US20070192722A1 (en) | 2006-02-10 | 2007-08-16 | Fujifilm Corporation | Window display system and window display method |
US20100039502A1 (en) | 2008-08-14 | 2010-02-18 | Real D | Stereoscopic depth mapping |
US20110169825A1 (en) * | 2008-09-30 | 2011-07-14 | Fujifilm Corporation | Three-dimensional display apparatus, method, and program |
US20110292045A1 (en) | 2009-02-05 | 2011-12-01 | Fujifilm Corporation | Three-dimensional image output device and three-dimensional image output method |
US20120056984A1 (en) | 2010-09-03 | 2012-03-08 | Samsung Electronics Co., Ltd. | Method and apparatus for converting 2-dimensional image into 3-dimensional image by adjusting depth of the 3-dimensional image |
US20120250152A1 (en) | 2011-03-31 | 2012-10-04 | Honeywell International Inc. | Variable focus stereoscopic display system and method |
US20120262543A1 (en) | 2011-04-13 | 2012-10-18 | Chunghwa Picture Tubes, Ltd. | Method for generating disparity map of stereo video |
US20120320048A1 (en) | 2010-03-05 | 2012-12-20 | Panasonic Corporation | 3d imaging device and 3d imaging method |
US20130002666A1 (en) | 2011-02-04 | 2013-01-03 | Kazuhiro Mihara | Display device for displaying video, eyewear device for assisting in viewing video, video system with display device and eyewear device, and control method of video system |
US20130100254A1 (en) | 2010-08-31 | 2013-04-25 | Panasonic Corporation | Image capture device and image processing method |
US20130101263A1 (en) | 2010-08-31 | 2013-04-25 | Panasonic Corporation | Image capture device, player, system, and image processing method |
US20130120529A1 (en) | 2010-07-28 | 2013-05-16 | Yutaka Nio | Video signal processing device and video signal processing method |
US20130128992A1 (en) | 2010-09-17 | 2013-05-23 | Viswanathan Swaminathan | Methods and Apparatus for Preparation of Casual Stereoscopic Video |
US20130187910A1 (en) | 2012-01-25 | 2013-07-25 | Lumenco, Llc | Conversion of a digital stereo image into multiple views with parallax for 3d viewing without glasses |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020163482A1 (en) | 1998-04-20 | 2002-11-07 | Alan Sullivan | Multi-planar volumetric display system including optical elements made from liquid crystal having polymer stabilized cholesteric textures |
US7512262B2 (en) | 2005-02-25 | 2009-03-31 | Microsoft Corporation | Stereo-based image processing |
US7884823B2 (en) | 2007-06-12 | 2011-02-08 | Microsoft Corporation | Three dimensional rendering of display information using viewer eye coordinates |
US8228327B2 (en) | 2008-02-29 | 2012-07-24 | Disney Enterprises, Inc. | Non-linear depth rendering of stereoscopic animated images |
WO2011123177A1 (en) * | 2010-04-01 | 2011-10-06 | Thomson Licensing | Method and system of using floating window in three-dimensional (3d) presentation |
EP2395765B1 (en) | 2010-06-14 | 2016-08-24 | Nintendo Co., Ltd. | Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method |
JPWO2012023168A1 (en) | 2010-08-19 | 2013-10-28 | パナソニック株式会社 | Stereoscopic imaging device and stereoscopic imaging method |
JP2012257105A (en) | 2011-06-09 | 2012-12-27 | Olympus Corp | Stereoscopic image obtaining apparatus |
US10719967B2 (en) | 2012-08-01 | 2020-07-21 | Dreamworks Animation L.L.C. | Techniques for placing masking window objects in a computer-generated scene for stereoscopic computer-animation |
2013
- 2013-03-13 US US13/802,671 patent/US10719967B2/en active Active
- 2013-03-13 US US13/802,661 patent/US9129436B2/en active Active
- 2013-03-13 US US13/802,692 patent/US9087406B2/en active Active
- 2013-03-13 US US13/802,706 patent/US9070222B2/en active Active
- 2013-03-13 US US13/802,632 patent/US9076262B2/en active Active
- 2013-03-14 US US13/802,714 patent/US9443338B2/en active Active
- 2013-03-14 US US13/802,716 patent/US9582918B2/en active Active
Patent Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6516099B1 (en) | 1997-08-05 | 2003-02-04 | Canon Kabushiki Kaisha | Image processing apparatus |
US6496598B1 (en) | 1997-09-02 | 2002-12-17 | Dynamic Digital Depth Research Pty. Ltd. | Image processing method and apparatus |
US7027659B1 (en) | 1998-05-20 | 2006-04-11 | Texas Instruments Incorporated | Method and apparatus for generating video images |
US6512892B1 (en) | 1999-09-15 | 2003-01-28 | Sharp Kabushiki Kaisha | 3D camera |
US6798406B1 (en) | 1999-09-15 | 2004-09-28 | Sharp Kabushiki Kaisha | Stereo images with comfortable perceived depth |
US20020122585A1 (en) | 2000-06-12 | 2002-09-05 | Swift David C. | Electronic stereoscopic media delivery system |
US20030020708A1 (en) | 2001-07-23 | 2003-01-30 | Peter-Andre Redert | Image processing unit for and method of generating a first output image and a second output image and image display apparatus provided with such an image processing unit |
US20060098896A1 (en) | 2004-08-11 | 2006-05-11 | Acushnet Company | Apparatus and method for scanning an object |
US20070192722A1 (en) | 2006-02-10 | 2007-08-16 | Fujifilm Corporation | Window display system and window display method |
US20100039502A1 (en) | 2008-08-14 | 2010-02-18 | Real D | Stereoscopic depth mapping |
US20110169825A1 (en) * | 2008-09-30 | 2011-07-14 | Fujifilm Corporation | Three-dimensional display apparatus, method, and program |
US20110292045A1 (en) | 2009-02-05 | 2011-12-01 | Fujifilm Corporation | Three-dimensional image output device and three-dimensional image output method |
US20120320048A1 (en) | 2010-03-05 | 2012-12-20 | Panasonic Corporation | 3D imaging device and 3D imaging method |
US20130120529A1 (en) | 2010-07-28 | 2013-05-16 | Yutaka Nio | Video signal processing device and video signal processing method |
US20130100254A1 (en) | 2010-08-31 | 2013-04-25 | Panasonic Corporation | Image capture device and image processing method |
US20130101263A1 (en) | 2010-08-31 | 2013-04-25 | Panasonic Corporation | Image capture device, player, system, and image processing method |
US20120056984A1 (en) | 2010-09-03 | 2012-03-08 | Samsung Electronics Co., Ltd. | Method and apparatus for converting 2-dimensional image into 3-dimensional image by adjusting depth of the 3-dimensional image |
US20130128992A1 (en) | 2010-09-17 | 2013-05-23 | Viswanathan Swaminathan | Methods and Apparatus for Preparation of Casual Stereoscopic Video |
US20130002666A1 (en) | 2011-02-04 | 2013-01-03 | Kazuhiro Mihara | Display device for displaying video, eyewear device for assisting in viewing video, video system with display device and eyewear device, and control method of video system |
US20120250152A1 (en) | 2011-03-31 | 2012-10-04 | Honeywell International Inc. | Variable focus stereoscopic display system and method |
US20120262543A1 (en) | 2011-04-13 | 2012-10-18 | Chunghwa Picture Tubes, Ltd. | Method for generating disparity map of stereo video |
US20130187910A1 (en) | 2012-01-25 | 2013-07-25 | Lumenco, LLC | Conversion of a digital stereo image into multiple views with parallax for 3D viewing without glasses |
Non-Patent Citations (17)
Title |
---|
"Stereoscopic Filmmaking Whitepaper: The Business and Technology of Stereoscopic Filmmaking", Autodesk Inc., 2008, 8 pages. |
Adobe Community Help, "Understanding Stereoscopic 3D in After Effects", available online at <https://helpx.adobe.com/after-effects/kb/stereoscopic-3d-effects.html#main-3D-depth-cues-in-After-Effects->, retrieved on Feb. 4, 2015, Jul. 12, 2011, 11 pages. |
Engle, Rob, "Beowulf 3D: A Case Study", Proc. of SPIE-IS&T Electronic Imaging, SPIE vol. 6803, 2008, pp. 68030R-1-68030R-9. |
Fehn, Christoph, "3D-TV Using Depth-Image-Based Rendering (DIBR)", Proc. of VIIP, vol. 3, 2003, 6 pages. |
Kim et al., "Depth Adjustment for Stereoscopic Image Using Visual Fatigue Prediction and Depth-Based View Synthesis", Dept. of Electrical and Electronics Eng., IEEE, 2010, pp. 956-961. |
Lang et al., "Nonlinear Disparity Mapping for Stereoscopic 3D", ACM Transactions on Graphics, vol. 29, No. 4, Article 75, Jul. 2010, pp. 75:1-75:10. |
Lipton, Lenny, "Digital Stereoscopic Cinema: The 21st Century", Proc. of SPIE-IS&T Electronic Imaging, SPIE vol. 6803, 2008, pp. 68030W-1-68030W-7. |
Neuman, Robert, "Bolt 3D: A Case Study", SPIE-IS&T, vol. 7237, 2009, pp. 72370E-1-72370E-10. |
Non-Final Office Action received for U.S. Appl. No. 13/802,661, mailed on Jan. 23, 2015, 34 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/802,714, mailed on Mar. 11, 2015, 22 pages. |
Notice of Allowance received for U.S. Appl. No. 13/802,632, mailed on Jan. 21, 2015, 10 pages. |
Notice of Allowance received for U.S. Appl. No. 13/802,706, mailed on Feb. 23, 2015, 23 pages. |
Notice of Allowance received for U.S. Appl. No. 13/845,701, mailed on Oct. 7, 2014, 10 pages. |
Song et al., "A Stereoscopic OpenGL-Based Interactive Plug-in Framework for Maya and Beyond", ACM, VRCAI 2009, Yokohama, Dec. 14-15, 2009, pp. 363-368. |
Sun et al., "Evaluating Methods for Controlling Depth Perception in Stereoscopic Cinematography", Proceedings of SPIE, vol. 7237, 2009, 12 pages. |
Ware et al., "Dynamic Adjustment of Stereo Display Parameters", IEEE Transactions on Systems, Man and Cybernetics-Part A: Systems and Humans, vol. 28, No. 1, Jan. 1998, 31 pages. |
Zilly et al., "The Stereoscopic Analyzer: An Image-Based Assistance Tool for Stereo Shooting and 3D Production", Proceedings of 2010 IEEE 17th International Conference on Image Processing, Sep. 26-29, 2010, pp. 4029-4032. |
Also Published As
Publication number | Publication date |
---|---|
US20140035917A1 (en) | 2014-02-06 |
US20140036038A1 (en) | 2014-02-06 |
US9443338B2 (en) | 2016-09-13 |
US20140036039A1 (en) | 2014-02-06 |
US20140036037A1 (en) | 2014-02-06 |
US9070222B2 (en) | 2015-06-30 |
US10719967B2 (en) | 2020-07-21 |
US9582918B2 (en) | 2017-02-28 |
US9129436B2 (en) | 2015-09-08 |
US20140036036A1 (en) | 2014-02-06 |
US9076262B2 (en) | 2015-07-07 |
US20140035903A1 (en) | 2014-02-06 |
US20140035918A1 (en) | 2014-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9087406B2 (en) | Automated stereoscopic computer-animation techniques for determining scaled stereo parameters | |
US7983477B2 (en) | Method and apparatus for generating a stereoscopic image | |
US8711204B2 (en) | Stereoscopic editing for video production, post-production and display adaptation | |
US20130057644A1 (en) | Synthesizing views based on image domain warping | |
US9754379B2 (en) | Method and system for determining parameters of an off-axis virtual camera | |
JP6810873B2 (en) | Systems, methods, and software for creating virtual 3D images that appear projected in front of or above an electronic display. | |
US9338426B2 (en) | Three-dimensional image processing apparatus, three-dimensional imaging apparatus, and three-dimensional image processing method | |
Liu et al. | 3d cinematography principles and their applications to stereoscopic media processing | |
CA2540538C (en) | Stereoscopic imaging | |
US9165393B1 (en) | Measuring stereoscopic quality in a three-dimensional computer-generated scene | |
CN108287609B (en) | Image drawing method for AR glasses | |
US10110876B1 (en) | System and method for displaying images in 3-D stereo | |
CN114637391A (en) | VR content processing method and equipment based on light field | |
US8952958B1 (en) | Stereoscopic computer-animation techniques based on perceptual constraints | |
OA19355A (en) | System, method and software for producing virtual three dimensional images that appear to project forward of or above an electronic display | |
NZ757902B2 (en) | System, method and software for producing virtual three dimensional images that appear to project forward of or above an electronic display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DREAMWORKS ANIMATION LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCNALLY, PHILIP;LOW, MATTHEW;SIGNING DATES FROM 20130321 TO 20130404;REEL/FRAME:030181/0588 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS
Free format text: SECURITY INTEREST;ASSIGNORS:PACIFIC DATA IMAGES L.L.C.;DREAMWORKS ANIMATION L.L.C.;REEL/FRAME:035343/0829
Effective date: 20150330 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: DREAMWORKS ANIMATION L.L.C., CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041193/0144
Effective date: 20160920
Owner name: PACIFIC DATA IMAGES L.L.C., CALIFORNIA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041193/0144
Effective date: 20160920 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8 |