OA19355A - System, method and software for producing virtual three dimensional images that appear to project forward of or above an electronic display - Google Patents


Info

Publication number
OA19355A
OA19355A (application OA1201900394)
Authority
OA
OAPI
Prior art keywords
virtual object
virtual
reference plane
viewpoints
stereoscopic
Prior art date
Application number
OA1201900394
Inventor
Richard S FREEMAN
Scott A HOLLINGER
Original Assignee
Maxx Media Group, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maxx Media Group, LLC filed Critical Maxx Media Group, LLC
Publication of OA19355A publication Critical patent/OA19355A/en

Abstract

A system, method and software for producing a virtual scene (10) to be viewed on an electronic display (12). A virtual reference plane (24) is defined. The reference plane (24) has peripheral boundaries (27, 28, 29, 30). A virtual object (20) is positioned above the reference plane (24) in the virtual scene (10). Stereoscopic camera viewpoints (25, 26) are calculated that enable the virtual object (20) to be imaged with the reference plane (24) within the peripheral boundaries (27, 28, 29, 30) of the reference plane (24). The virtual object (20) is digitally altered before and/or after being stereoscopically imaged. The altering includes bending, tapering or tilting a portion of the virtual object (20), and/or tilting a portion of the reference plane (24). A common set of boundaries is set for a superimposed image to create a final image (48).

Description

Although the present invention system, method and software can be embodied in many ways, the embodiment illustrated shows the system, method and software being used to simulate an image of a dinosaur. This embodiment is selected for the purposes of description and explanation. The dinosaur is intended to represent any object, real or imaginary, that can be imaged and presented through the system. However, the illustrated embodiment is purely exemplary and should not be considered a limitation when interpreting the scope of the appended claims.
Referring to Fig. 1, it will be understood that the present invention is used to produce a virtual scene 10 on a display 12 of an electronic device 14. The virtual scene 10 appears to a person viewing the virtual scene 10 to have features that are three-dimensional. Furthermore, the virtual scene 10 has elements that appear to the viewer to extend above the plane of the display 12. If the electronic device 14 has a traditional LED or LCD display, the virtual scene 10 will have to be viewed with 3D glasses in order to observe the three-dimensional effects in the virtual scene 10. If the electronic device 14 has an auto-stereoscopic display, then the three-dimensional effects in the virtual scene 10 can be observed without the need of specialized glasses.
The virtual scene 10 displayed on the electronic device 14 can be a static image or a video. Furthermore, the virtual scene 10 can be part of a video game or a movie. Regardless of the context in which the virtual scene 10 is presented, the user must download or otherwise input the image, mobile application, game, movie or other such prepared software file 16 that contains the virtual scene 10 into the electronic device 14.
The prepared software file 16 is created by a graphic artist, game designer or similar content producer. The content producer creates the virtual scene 10 in the prepared software file 16 using graphic modeling software 18 run on the computer system 19 of the content producer. As will be described, the graphic modeling software 18 requires the use of two stereoscopic images. If the virtual scene 10 contains virtual objects, the virtual objects are imaged with virtual cameras. If the virtual scene 10 contains real objects, the real objects can be imaged with a stereoscopic set of real cameras 17.
Referring to Fig. 2 and Fig. 3 in conjunction with Fig. 1, an exemplary virtual scene 10 is shown that was created using the graphic modeling software 18. The virtual scene 10 contains a primary object 20. In the shown example, the primary object 20 is a dinosaur 22. However, it will be understood that any object can be modeled in the virtual scene 10. The virtual scene 10 has a reference plane 24. The reference plane 24 can be any plane in the virtual scene 10 above, in front of, and/or below which objects are to appear. In the shown embodiment, the reference plane 24 is oriented with the ground upon which the dinosaur 22 stands. The reference plane 24 of the virtual scene 10, when displayed on an electronic display 12, is going to be oriented along the plane of the electronic display 12. As such, when the virtual scene 10 is viewed, any object imaged above the reference plane 24 will project forward and appear to extend out in front of the display 12 or above the display 12, depending on the orientation of the display 12. Conversely, any object imaged below the reference plane 24 will appear to be rearwardly projected and will appear below or behind the virtual zero-parallax reference plane when the virtual scene 10 is viewed.
If the virtual scene 10 is to be printed, then the reference plane 24 is selected by the content producer. The reference plane 24 is typically selected to correspond with the plane of the paper upon which the virtual scene 10 is printed. However, other reference planes can be selected.
Although a real object can be imaged with real cameras 17 to produce digital stereoscopic and/or auto-stereoscopic images, this technique is not used as the example. In the example provided, the object to be imaged is a virtual object that is generated by the graphic modeling software 18 that is run on the computer system 19 of the content producer. As such, by way of example, it will be assumed that the primary object 20 being created is a virtual object set in the virtual scene 10 and imaged with virtual cameras. However, it will be understood that the same techniques to be described herein can be used to create stereoscopic and/or auto-stereoscopic images of a real object by imaging a real object with real cameras 17.
Stereoscopic views are taken of the virtual scene 10. The stereoscopic views are taken from a virtual left camera viewpoint 25 and a virtual right camera viewpoint 26. The distance D1 between the virtual camera viewpoints 25, 26 and the angle of elevation A1 of the virtual camera viewpoints 25, 26 are dependent upon the scope of the virtual scene 10. The virtual scene 10 is being created to be shown on an electronic display 12. Most electronic displays are rectangular in shape, having a width that is between 50% and 80% of the length. Accordingly, the virtual scene 10 is created within boundaries that make the virtual scene 10 appropriate in size and scale for a typical electronic display 12. The boundaries include a front boundary 27, a rear boundary 28, and two side boundaries 29, 30. Any virtual scene 10 that is to be displayed on the electronic display 12 must exist within the boundaries 27, 28, 29, 30 in order to be seen.
A rear image boundary 28 is set for the virtual scene 10. All of the objects to be imaged in the virtual scene 10 are to appear forward of the rear image boundary 28. The primary object 20 has a height H1. The virtual camera viewpoints 25, 26 are set to a second height H2. The second height H2 is a function of the object height H1 and the rear image boundary 28. The second height H2 of the virtual camera viewpoints 25, 26 is high enough so that the top of the primary object 20, as viewed from the virtual camera viewpoints 25, 26, does not extend above the rear image boundary 28. The elevation angle of the virtual camera viewpoints 25, 26 and the convergence angle of the camera viewpoints 25, 26 have a direct mathematical relationship that depends upon the scene boundaries 27, 28, 29, 30 and the height H1 of the primary object 20.
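The constraint that the top of the object, seen from the camera viewpoints, must not rise past the rear image boundary can be sketched as a small similar-triangles calculation. The coordinate layout and function name below are assumptions for illustration, not formulas given in the patent:

```python
def min_camera_height(h1, x_obj, x_rear):
    """Smallest camera height H2 such that the sight line over the top of
    an object of height h1, standing at horizontal distance x_obj from the
    camera, meets the reference plane at or before the rear boundary at
    distance x_rear.

    By similar triangles, the ray from the camera at (0, H2) through the
    object top at (x_obj, h1) hits the ground at x = x_obj * H2 / (H2 - h1);
    requiring that intercept to stay within x_rear gives the bound below.
    """
    if x_rear <= x_obj:
        raise ValueError("rear boundary must lie beyond the object")
    return h1 * x_rear / (x_rear - x_obj)
```

For example, a 2-unit-tall object standing 3 units from the camera, with the rear boundary 6 units out, needs the camera at least 4 units high. Pushing the rear boundary farther back lowers the required camera height, which mirrors the direct relationship the text describes between the elevation angle, the scene boundaries and the object height H1.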
The virtual camera viewpoints 25, 26 have parallax angles so that the lines of sight from the virtual camera viewpoints 25, 26 intersect at the reference plane 24. That is, the two virtual camera viewpoints 25, 26 achieve zero parallax at the reference plane 24. The convergence point P is preferably selected to correspond to a point near the bottom and rear of the primary object 20, should the primary object 20 be resting on the reference plane 24. For example, in the shown embodiment, the reference plane 24 corresponds to the ground upon which the dinosaur 22 stands. The virtual camera viewpoints 25, 26 are directed to the ground just below the rear of the dinosaur's body. However, if the virtual scene were that of an airplane flying through clouds, then the reference plane could be well below the position of the airplane. In this scenario, the virtual camera viewpoints 25, 26 would be directed to the reference plane 24 below where the virtual airplane appears to fly. The angles of the virtual camera viewpoints 25, 26 are adjusted on a frame-by-frame basis as the primary object 20 moves relative to the reference plane 24.
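The zero-parallax condition can be illustrated by computing the toe-in angle each camera needs so that the two lines of sight cross at the convergence point P on the reference plane. This is a sketch under assumed geometry (two cameras on a horizontal baseline of width D1 at height H2, with P on the ground ahead of the baseline midpoint); the patent does not state this formula explicitly:

```python
import math

def toe_in_angle_deg(d1, h2, ground_dist):
    """Toe-in angle, in degrees, for each of two virtual cameras separated
    by interaxial distance d1 at height h2, so that their lines of sight
    converge (zero parallax) at a point P on the reference plane located
    ground_dist in front of the midpoint of the camera baseline."""
    to_p = math.hypot(ground_dist, h2)        # baseline midpoint to P
    return math.degrees(math.atan2(d1 / 2.0, to_p))
```

Recomputing this angle every frame as the object, and hence the chosen point P, moves reproduces the frame-by-frame adjustment described above.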
Referring to Fig. 4 in conjunction with Fig. 3, it can be explained that the virtual scene 10 is not merely imaged from the camera viewpoints 25, 26. Rather, before and/or after the imaging of the virtual scene 10, the virtual scene 10 is digitally manipulated in various manners that are beneficial to the stereoscopic images that will be obtained. The digital manipulations include, but are not limited to:
i. tilt manipulations of the reference plane of the virtual scene;
ii. tilt manipulations of the primary and secondary objects in the virtual scene;
iii. bend manipulations of objects in the virtual scene; and
iv. taper manipulations of objects in the virtual scene.
The manipulations that are used depend upon the details of the objects to be imaged in the virtual scene 10.
Fig. 4 illustrates two of the possible tilt manipulations that can be used. In a first tilt manipulation, the reference plane 24 can be tilted toward or away from the virtual camera viewpoints 25, 26. The preferred tilt angle T2 is generally between 1 degree and 20 degrees from the horizontal, depending upon the final perceived height of the primary object 20. In a second tilt manipulation, the primary object 20 can be tilted toward or away from the virtual camera viewpoints 25, 26. The preferred tilt angle T1 is generally between 1 degree and 20 degrees from the horizontal, depending upon the final perceived height of the primary object 20. The tilt angle T1 of the primary object 20 is independent of the tilt angle T2 of the reference plane 24 and of other elements in the virtual scene 10.
Using the camera viewpoint convergence point P under the primary object 20 as a fulcrum point, the reference plane 24 can be digitally manipulated to tilt forward or backward. The tilt angle T2 of the reference plane 24 is independent of the tilt angle T1 of the primary object 20. The tilting of the reference plane 24 changes the position of the rear image boundary 28 relative to the perceived position of the primary object 20. This enables the height of the primary object 20 to be increased proportionately within the confines of the mathematical relationship.
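As a sketch of this tilt manipulation (the coordinate conventions are my assumption), the reference plane can be represented as a set of (depth, height) vertices rotated about the convergence point P used as the fulcrum:

```python
import math

def tilt_about_fulcrum(points, fulcrum, tilt_deg):
    """Rotate (depth, height) vertices about the fulcrum (the convergence
    point P) by tilt_deg degrees; positive angles raise the far edge of
    the plane, shifting the rear image boundary relative to the perceived
    position of the object resting at the fulcrum."""
    t = math.radians(tilt_deg)
    fx, fy = fulcrum
    tilted = []
    for x, y in points:
        dx, dy = x - fx, y - fy
        tilted.append((fx + dx * math.cos(t) - dy * math.sin(t),
                       fy + dx * math.sin(t) + dy * math.cos(t)))
    return tilted
```

Because the rotation is centered on P, the object's contact point with the plane is preserved while the rear edge of the plane moves, which is what lets the object's apparent height grow without violating the rear-boundary constraint.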
Referring to Fig. 5, a preferred bend manipulation is shown. In Fig. 5, the primary object 20B is shown as a rectangle, rather than a dinosaur, for ease of explanation. A bend in the complex shape of a dinosaur would be difficult to perceive. A bend point B1 is selected along the height of the primary object 20B. The bend point B1 is between 1/3 and 2/3 the overall height of the primary object 20B. The primary object 20B is also divided into three regions 31, 33, 35 along its height. In the first region 31, the primary object 20B is not manipulated. In the second region 33, no manipulation occurs until the bend point B1. Any portion of the primary object 20B above the bend point B1 and within the second region 33 is digitally tilted by a first angle AA1. In the third region 35, the primary object 20B is tilted at a second angle AA2, which is steeper than the first angle AA1. The first angle AA1 and the second angle AA2 are measured in relation to an imaginary vertical plane that is parallel to the vertical plane in which the virtual camera viewpoints 25, 26 are set. The result is that the virtual scene 10 can be made larger and taller without extending above the rear image boundary 28. When viewed from the virtual camera viewpoints 25, 26, the primary object 20B appears taller and has a more pronounced forward or vertical projection.
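A minimal vertex-level sketch of this bend manipulation follows; the axis names and the exact shear form are assumptions for illustration. Below the bend point a vertex is untouched; between the bend point and the third region it leans back at angle AA1; above that it leans at the steeper angle AA2:

```python
import math

def bend_vertex(x, y, bend1, bend2, aa1_deg, aa2_deg):
    """Shear a vertex (x = toward the cameras, y = height) backward above
    the bend points: by aa1_deg between bend1 and bend2, and by the
    steeper aa2_deg above bend2, leaving everything below bend1 alone."""
    if y <= bend1:
        return x, y                          # first region: unchanged
    s1 = math.tan(math.radians(aa1_deg))
    if y <= bend2:
        return x - (y - bend1) * s1, y       # second region: angle AA1
    s2 = math.tan(math.radians(aa2_deg))
    # third region: the AA1 lean accumulated through the second region,
    # plus the steeper AA2 lean above bend2
    return x - (bend2 - bend1) * s1 - (y - bend2) * s2, y
```

Applied to every vertex of the model, this keeps the object's full height in the scene while its rendered top stays below the rear image boundary.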
Referring to Fig. 6, a preferred taper manipulation is explained. Again, the primary object 20B is shown as a representative rectangle, rather than a dinosaur, for ease of explanation. The primary object 20B is divided into two regions 37, 39 along its height. In the first region 37, the primary object 20B is not manipulated. In the second region 39, the primary object 20B is reduced in size using a taper, from front to back, of an angle AA3 of between 1 degree and 25 degrees. The point where the taper begins is positioned between 1/3 and 2/3 up the height of the primary object 20B. The result is that the virtual scene 10 can be made wider without extending beyond the side image boundaries 29, 30. When viewed, the primary object 20B appears taller and has a more pronounced forward or vertical projection.
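The taper manipulation can likewise be sketched per vertex (axes and names again assumed): above the taper start, the lateral coordinate is pulled toward the centerline at roughly angle AA3, narrowing the upper region while the lower region is untouched.

```python
import math

def taper_vertex(x, y, taper_start, aa3_deg):
    """Narrow a vertex's lateral coordinate x toward the centerline
    (x = 0) above taper_start, so the object's width shrinks with height
    at an angle of roughly aa3_deg; the lower region is left unchanged."""
    if y <= taper_start:
        return x, y
    shrink = (y - taper_start) * math.tan(math.radians(aa3_deg))
    magnitude = max(abs(x) - shrink, 0.0)   # never cross the centerline
    return math.copysign(magnitude, x), y
```

Shrinking only the upper region lets the base of the object fill the width between the side boundaries while its top still fits inside them in both stereoscopic views.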
Once the virtual scene 10 is digitally adjusted in one or more of the manners described, an altered virtual scene is created. Alternatively, the virtual scene can be imaged prior to any digital adjustments and the digital adjustments can be performed after the two images are combined into a stereoscopic or auto-stereoscopic image.
Referring to Fig. 7 and Fig. 8 in conjunction with Fig. 2, it can be seen that the two images 40, 42 are stereoscopic, with one being the left camera image 40 (Fig. 7) and one being the right camera image 42 (Fig. 8). Each stereoscopic image 40, 42 has a fading perspective due to the angle of the camera viewpoints. This causes the front image boundary 27 to appear to be wider than the rear image boundary 28.
Referring to Fig. 9, a top view of one of the stereoscopic images 40, 42 from Fig. 7 or Fig. 8 is shown. Although only one of the stereoscopic images is shown, it will be understood that the described process is performed on both of the stereoscopic images. Thus, the reference numbers 40, 42 of both stereoscopic images are used to indicate that the processes affect both.
Temporary reference guides are superimposed upon the stereoscopic images 40, 42. The reference guides include a set of inner guidelines 44 and a set of outer guidelines 46. The inner guidelines 44 are parallel lines that extend from the rear image boundary 28 to the front image boundary 27. The inner guidelines 44 begin at the points P2 where the stereoscopic images 40, 42 meet the rear boundary line 28. The outer guidelines 46 are also parallel lines that extend from the rear image boundary 28 to the front image boundary 27. The position of the outer guidelines 46 depends upon the dimensions of the electronic display 12 upon which the virtual scene 10 is to be displayed. The width between the outer guidelines 46 corresponds to the pixel width of the electronic display to be used.
Referring to Fig. 10 in conjunction with Fig. 9, it can be seen that the stereoscopic images 40, 42 are digitally altered to fit within the parameters of the outer guidelines 46. As such, the stereoscopic images 40, 42 are widened toward the rear image boundary 28 and compressed toward the front image boundary 27. This creates corrected stereoscopic images 40A, 42A. The inner guidelines 44 remain on the corrected stereoscopic images 40A, 42A.
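One way to read this correction (my sketch; the patent gives no coordinates) is as a row-by-row horizontal rescale that maps the trapezoidal image, wider at the front boundary than at the rear, onto the rectangle formed by the outer guidelines:

```python
def correct_x(x, y, y_front, y_rear, w_front, w_rear, w_out):
    """Rescale the x coordinate (measured from the image centerline) of a
    point in row y, assuming the image's row width shrinks linearly from
    w_front at the front boundary to w_rear at the rear boundary, so that
    every row fills the outer-guideline width w_out: rear rows are
    widened, front rows compressed."""
    t = (y - y_front) / (y_rear - y_front)    # 0 at front row, 1 at rear
    w_row = w_front + t * (w_rear - w_front)  # current width of this row
    return x * (w_out / w_row)
```

Running this over both stereoscopic images yields the corrected images 40A, 42A, with the inner guidelines transformed along with the pixels so they remain available for the alignment step.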
Referring to Fig. 11 in conjunction with Fig. 10, the corrected left and right stereoscopic images 40A, 42A are superimposed. The inner guidelines 44 from both corrected stereoscopic images 40A, 42A are aligned. Once alignment is achieved, the inner guidelines 44 are removed. This creates a final image 48. Depending upon how the final image 48 is to be viewed, the corrected stereoscopic images 40A, 42A can be colored in red or blue, or the corrected images 40A, 42A can be oppositely polarized. In this manner, when the final image 48 is viewed using 3D glasses or on an auto-stereoscopic display, the final image 48 will appear to be three-dimensional.
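For the red/blue viewing case, the superimposition can be sketched as an anaglyph merge. The channel assignment below is one common red/cyan convention, not something the patent specifies beyond coloring the two corrected images differently:

```python
def anaglyph_pixel(left_rgb, right_rgb):
    """Merge corresponding pixels of the corrected left and right images
    into a single anaglyph pixel: the red channel comes from the left
    image, green and blue from the right, so red/cyan glasses route each
    eye its own view."""
    left_r, _, _ = left_rgb
    _, right_g, right_b = right_rgb
    return (left_r, right_g, right_b)
```

Applying this to every aligned pixel pair of the corrected images 40A, 42A produces a final image in the spirit of image 48; polarized or auto-stereoscopic output would instead keep the two views separate and interleave them at display time.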
Referring to Fig. 12 in view of all earlier figures, the software methodology for the overall system can now be summarized. As is indicated in Block 50, a content producer creates a virtual scene 10 that includes one or more objects 20 that are to appear as 3D objects in the virtual scene 10. See prior description of Fig. 1 and Fig. 2. The content producer also selects a reference plane 24 for the virtual scene 10. See Block 52. Using the reference plane 24 and the selected objects 20, the content producer can determine the boundaries of the virtual scene 10. See Block 54.
Knowing the boundaries of the virtual scene 10 and the reference plane 24, the content producer sets the angle and height of the virtual camera viewpoints 25, 26 or of the real stereoscopic cameras 17. The camera viewpoints are set so that the lines of sight for the stereoscopic cameras achieve zero parallax at the reference plane 24. See Block 56. Also see prior description of Fig. 3.
As is indicated by Blocks 58, 60 and 62, the virtual scene 10 is digitally altered using tilt manipulations, bend manipulations and taper manipulations. See prior description of Fig. 4, Fig. 5 and Fig. 6. Two stereoscopic images 40, 42 are then obtained for the virtual scene. See Block 64.
Also see prior description of Fig. 7 and Fig. 8. The stereoscopic images 40, 42 are then corrected to fit the border guidelines of the virtual scene 10. See Block 66. Also see prior description of Fig. 9 and Fig. 10. Lastly, the corrected stereoscopic images are superimposed. See Block 68. Also see prior description of Fig. 11. The result is a final image 48 that will appear to extend above, or in front of, the display 12 when viewed by a user.
It will be understood that the embodiment of the present invention that is illustrated and described is merely exemplary and that a person skilled in the art can make many variations to that embodiment. All such embodiments are intended to be included within the scope of the present invention as defined by the appended claims.

Claims (19)

1. A method of producing a virtual scene to be viewed on a display, wherein said virtual scene contains a virtual object that appears to be three dimensional when viewed on said display, said method comprising the steps of:
defining a virtual reference plane having peripheral boundaries that include a front boundary, a rear boundary, and side boundaries;
setting said virtual object on said virtual reference plane;
determining stereoscopic camera viewpoints that enable said virtual object to be imaged with said reference plane within said peripheral boundaries of said reference plane;
altering said virtual object by virtually bending a portion of said virtual object to create an altered virtual object;
imaging said altered virtual object from said stereoscopic camera viewpoints, wherein imaging said altered virtual object from said stereoscopic camera viewpoints creates a first image and a second image;
superimposing said first image and said second image to create a superimposed image;
defining a common set of boundaries for said superimposed image to create a final image; and
displaying said final image on said display, wherein said final image appears, at least in part, to extend out of said display.
2. The method according to Claim 1, wherein said display has a screen plane and said final image is displayed with said reference plane oriented relative to said screen plane.
3. The method according to Claim 1, wherein said stereoscopic camera viewpoints are in a common plane and said virtual object is altered by virtually bending said portion of said virtual object as viewed from said common plane.
4. The method according to Claim 1, wherein altering said virtual object by virtually bending a portion of said virtual object includes selecting a first bend point at a first elevation on said virtual object and bending said virtual object only above said first bend point.
5. The method according to Claim 4, wherein said virtual object has a perceived height and said first elevation of said first bend point is between 1/3 and 2/3 of said perceived height.
6. The method according to Claim 1, wherein altering said virtual object by virtually bending a portion of said virtual object includes selecting a first bend point at a first elevation on said virtual object and a second bend point at a second elevation on said virtual object, wherein said virtual object is bent by a first angle above said first bend point and by a second angle above said second bend point.
7. The method according to Claim 6, wherein said second angle is greater than said first angle.
8. The method according to Claim 1, further including altering said virtual object by virtually tapering at least part of said virtual object away from said stereoscopic camera viewpoints.
9. The method according to Claim 1, further including altering said virtual object by virtually tilting at least part of said reference plane as viewed from said stereoscopic camera viewpoints.
10. The method according to Claim 1, further including altering said virtual object by virtually tilting at least part of said virtual object as viewed from said stereoscopic camera viewpoints.
11. A method of producing a virtual scene to be viewed on a display, wherein said virtual scene contains a virtual object that appears, at least in part, to be three dimensional and to extend out of the display when viewed on said display, said method comprising the steps of:
defining a virtual reference plane having peripheral boundaries that include a front boundary, a rear boundary, and side boundaries;
setting said virtual object on said virtual reference plane;
determining stereoscopic camera viewpoints that enable said virtual object to be imaged with said reference plane within said peripheral boundaries of said reference plane;
altering said virtual object by virtually tilting at least a portion of said reference plane to create an altered virtual object;
imaging said altered virtual object from said stereoscopic camera viewpoints, wherein imaging said altered virtual object from said stereoscopic camera viewpoints creates a first image and a second image;
superimposing said first image and said second image to create a superimposed image; and
defining a common set of boundaries for said superimposed image to create a final image.
12. The method according to Claim 11, wherein said display has a screen plane and said final image is displayed with said reference plane oriented relative to said screen plane.
13. The method according to Claim 11, wherein said stereoscopic camera viewpoints are in a common plane and altering said virtual object by virtually tilting at least a portion of said reference plane includes tilting said reference plane proximate said rear boundary as viewed from said stereoscopic camera viewpoints.
14. The method according to Claim 11, wherein said stereoscopic camera viewpoints are in a common plane and altering said virtual object by virtually tilting at least a portion of said reference plane includes tilting said virtual object in reference to said stereoscopic camera viewpoints.
15. The method according to Claim 11, further including altering said virtual object by virtually bending a portion of said virtual object with reference to said common plane.
16. The method according to Claim 11, further including altering said virtual object by virtually tapering at least part of said virtual object away from said stereoscopic camera viewpoints.
17. A method of producing a virtual scene to be viewed on a display, wherein said virtual scene contains a virtual object that appears, at least in part, to be three dimensional and to extend out of said display when viewed on said display, said method comprising the steps of:
defining a virtual reference plane having peripheral boundaries;
setting said virtual object on said virtual reference plane;
determining stereoscopic camera viewpoints that enable said virtual object to be imaged with said reference plane within said peripheral boundaries of said reference plane;
altering said virtual object by virtually tapering a portion of said virtual object as viewed from said stereoscopic camera viewpoints to create an altered virtual object;
imaging said altered virtual object from said stereoscopic viewpoints, wherein imaging said altered virtual object from said stereoscopic viewpoints creates a first image and a second image;
superimposing said first image and said second image to create a superimposed image; and
defining a common set of boundaries for said superimposed image to create a final image.
18. The method according to Claim 17, further including showing said final image on a display, wherein said display has a screen plane and said final image is displayed with said reference plane oriented relative to said screen plane.
19. The method according to Claim 17, wherein altering said virtual object by virtually tapering a portion of said virtual object includes selecting a first point at a first elevation on said virtual object and tapering said virtual object only above said first point.
20. The method according to Claim 17, wherein said virtual object has a perceived height and said first elevation of said first point is between 1/3 and 2/3 of said perceived height.
OA1201900394 2017-04-06 2018-04-05 System, method and software for producing virtual three dimensional images that appear to project forward of or above an electronic display OA19355A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/481,447 2017-04-06

Publications (1)

Publication Number Publication Date
OA19355A true OA19355A (en) 2020-06-29

