US20140362175A1 - Apparatus and method for producing images for stereoscopic viewing - Google Patents
- Publication number
- US20140362175A1 (U.S. application Ser. No. 14/465,757)
- Authority
- US
- United States
- Prior art keywords
- view
- image
- pane
- cradle
- field
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/18—Stereoscopic photography by simultaneous viewing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
- G03B35/10—Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Abstract
Disclosed are methods and apparatus for producing images for stereoscopic, three-dimensional viewing generated with existing mobile graphical devices, such as cameras, video recorders, PDAs, telephones, and other devices, such as the iPhone™. Two reflectors, which may be mirrors or prisms, are positioned to separate the field of view of a camera lens on the mobile graphical device into a direct field of view and an offset field of view and to record and display the separated fields in a two-paned format on the mobile graphical device. Eye lenses are used to separately view the split images. An integrated housing and cradle receive the mobile graphical device and are coupled to the reflectors and the eye lenses. An area of each pane may be provided to display control icons for three-dimensional viewing, and a control area may be provided out of view on a touch screen for user control.
Description
- This application claims priority pursuant to 35 U.S.C. 119(e) to U.S. Provisional Application No. 61/386,346, filed on Sep. 24, 2010 and U.S. Provisional Application No. 61/317,035, filed on Mar. 24, 2010, both of which are incorporated herein by reference in their entirety.
- The present invention relates generally to producing images for stereoscopic viewing, and more particularly to an apparatus and methods for separating the fields of view of a camera lens into a direct field of view and an offset field of view and for producing images from each field of view side by side on a mobile graphical device, and further for viewing the produced images with an integrated stereoscopic viewer as a single three-dimensional image.
- Since the invention of the stereoscope by Sir Charles Wheatstone in 1838, people have been entertained by viewing two-dimensional media as three-dimensional. One strategy used to accomplish stereoscopy, known as stereopsis, involves creating two views of a single image, one view offset from the other, and providing each view separately to an observer's eyes. Systems or devices that use stereopsis to create photography or video for three-dimensional viewing involve image-splitting using mirrors or other reflectors and camera lenses to produce offset images that may be viewed through eye lenses that separate the images to each eye. Different configurations of mirrors and lenses have been used to split images with varying results.
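The offset-view principle behind stereopsis can be illustrated in a few lines of Python. This is only a toy sketch: the `make_stereo_pair` helper and its wrap-around pixel shift are illustrative assumptions, whereas the apparatus described below produces the offset optically with reflectors.

```python
def make_stereo_pair(scene, disparity):
    # Shift each row of the scene in opposite horizontal directions,
    # producing a left view and a right view offset from one another.
    # Presenting each view to only one eye yields a stereoscopic image.
    left = [row[disparity:] + row[:disparity] for row in scene]
    right = [row[-disparity:] + row[:-disparity] for row in scene]
    return left, right

scene = [[0, 1, 2, 3],
         [4, 5, 6, 7]]
left_view, right_view = make_stereo_pair(scene, 1)
```

After the call, `left_view` holds the scene shifted one pixel leftward and `right_view` one pixel rightward, the minimal ingredients of a stereo pair.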
- Systems using one reflector, such as a mirror, one lens, and one two-dimensional camera involve placing a mirror nearly perpendicular, but slightly inward to, the field of view of a camera so that the camera captures mirrored images, one slightly offset from the other. In order to view the split images stereoscopically, one of the images must be reversed, which requires some form of post-processing. Prior art systems using two mirrors, one lens and one camera have involved using two mirrors hinged together at their edges and slightly angled with respect to each other and aiming the camera at the hinged edges. Similar systems have also involved unhinging the mirrors and separating them slightly. These types of two-mirror systems were used starting in the late nineteenth century, but are seldom used today. They are awkward to use because the camera is aimed at right angles to the subject and the images are reversed left to right. Other systems, using three or four reflectors, such as mirrors or prisms, have been used, but are more complicated to manufacture and maintain. A discussion of these systems may be found at http://www.lhup.edu/~dsimanek/3d/3dgallery16.htm.
- Several devices in the prior art are known to provide or use an integrated viewer with left and right eye lenses, or viewing lenses, to stereoscopically view images provided on film or transmitted from an outside source, such as the internet or some other network. U.S. Pat. No. 2,313,562, issued to P. Mainardi, et al., for “Stereoscope” discloses “a stereoscope in which the viewing lenses have approximately the same focal length as the lens or lenses with which the pictures were taken”, which uses a housing with a front window on one side and two oculars on the other side, a means for symmetrically holding a film with a split image, and reflectors for rotating the images for proper viewing. This device requires that split images are provided in opposing axial orientations, for example, head-to-head or foot-to-foot. United States Patent Application number US 2006/0055773, by Kutka, for “Device and Method for Stereoscopic Reproduction of Picture Information on a Screen” discloses a housing permanently attached to a screen, viewing lenses mounted to the housing separated by a distance approximating the distance between an observer's eyes, and optical means for allowing an observer to separately view left and right images for three-dimensional viewing. This device does not have a camera and does not generate the split image. The images must originate from a network or other outside source. United States Patent Application number US 2007/0222856, by Amaru, for “Portable Device for Viewing an Image and Associated Production Method” discloses a housing with a display and two viewer openings, an optical unit with a lens or reflector arrangement that magnifies or sharpens images, a memory for storing externally transmitted images, the possibility of viewing received images three-dimensionally, and a location detection means to facilitate receiving images relevant to a viewer's surroundings. This device does not produce images and must receive them.
United States Patent Application number US 2010/0277575, by Ismael, et al., for “Method and Apparatus for Providing a 3D Image via a Media Device” discloses a frame for holding a handheld media display device that is displaying a right input image and a left input image and an arrangement of prisms and lenses that present a three-dimensional image to a viewer. This device does not create images and requires that a split image is provided for three-dimensional viewing.
- Another series of devices or systems in the prior art involve using reflectors for creating and manipulating filtered images, multiple fields of view, or split, offset images. U.S. Pat. No. 4,009,951, issued to Ihms, for “Apparatus for Stereoscopic Photography” discloses a setup using a conventional device, such as a 35 mm SLR camera, whereby reflectors and filters are attached to the camera lens to capture images that may later be viewed as three-dimensional. This device does not capture or create offset images as described earlier. U.S. Pat. No. 4,288,819, issued to Williams, for “Multi-Field Imaging Device” discloses a “multi-field imaging device . . . for directing light from a first optical field and a separated second optical field into a video camera used for security surveillance thereby reproducing a split-screen image with the video camera. The multi-field imaging device can be aligned whereby light from the second optical field is directed directly onto one side of an image plane. A pair of mirrors are adjusted so that light from the first optical field is reflected by the mirrors to the image plane in a juxtaposed relation with the light from the second optical field.” This device provides for multiple fields of view on a single screen, but is otherwise not relevant to stereoscopic viewing of images. U.S. Pat. No. 5,856,888, issued to Ross, et al., for “Split Beam Optical Character Reader” discloses an “optical system for an optical character reader in which a camera, such as a TV camera, reads an image field in a document, includes at least one pair of mirrors which shift half of the image both laterally and vertically to convert a relatively long image . . . into a rectangular image with a much lower aspect ratio.” This optical system reshapes the image field to allow for simplified and lower cost optical character recognition. While this device splits images, it does not do so for three-dimensional viewing. U.S. Pat. No. 6,603,876, issued to Matsuo, et al., for “Stereoscopic Picture Obtaining Device” discloses a device that obtains two pictures from different locations of viewpoint, rotates the pictures using dove prisms, combines the pictures, and condenses the pictures for stereoscopic viewing while preserving the aspect ratio of the original pictures. This device is relatively complicated and relatively expensive to produce and is not portable.
- While the prior art reveals devices and systems for splitting and manipulating images for stereoscopic viewing, as well as portable devices for viewing three-dimensional images, a portable, handheld device or system for producing split, offset images or video for immediate stereoscopic viewing and enjoyment is unknown in the prior art. Accordingly, it would be desirable to develop and provide a self-contained device that uses a camera, a simplified reflector arrangement, and viewing lenses to produce and enjoy three-dimensional visual media. The inventions discussed in connection with the described embodiments address these and other deficiencies of the prior art.
- The features and advantages of the present inventions will be explained in or apparent from the following description of the preferred embodiment considered together with the accompanying drawings.
- The present inventions address the deficiencies of the prior art of three-dimensional visual media. Specifically, improvements are in the area of stereoscopic viewing of offset images on mobile graphical devices, such as iPhones™ or other similar devices.
- The field of view of a camera on a mobile graphical device is split into two portions with one portion revealing a direct field of view and the other portion having a field of view revealing reflections off of two reflectors, such as mirrors or prisms, that are positioned to reflect an offset view of the direct field of view from the first portion of the field of view. This arrangement may be coupled to a cradle or housing that receives the mobile graphical device and is coupled to the reflectors and to a pair of eye lenses for viewing split images as three-dimensional visual media. Thus, unlike the prior art, a user of a mobile graphical device, such as an iPhone™, will be able to use a two-dimensional camera on a handheld device to create and view three-dimensional photographs and videos in real time using a portable container. Embodiments of this device are unknown in the prior art.
- If the mobile graphical device has a touch screen on the display, each split image may be augmented with an area for displaying control icons, and the screen may further have an area for controlling the icons, which will appear to a viewer as three-dimensional. For example, a viewer may see an icon as a three-dimensional camera and may take a picture by touching the area next to the icon. This interface is designed so that the three-dimensional image will not be interfered with by a user's finger. Embodiments of this type of three-dimensional interface are also unknown in the prior art.
- Described embodiments of the inventions provide an apparatus for producing images for stereoscopic viewing of visual media on a mobile graphical device. The mobile graphical device has a camera lens with a field of view split into two portions, a first portion and a second portion. The first portion of the field of view is unblocked and the second portion of the field of view faces a first reflector, which may be a mirror or a prism. The first reflector is juxtaposed relative to the camera and positioned at an angle to the plane crossing the circumference of the camera lens. A second reflector, which may also be a mirror or a prism, is juxtaposed relative to the first reflector and positioned so that it reflects an offset view of the unblocked first portion of the field of view to the first reflector and into the second portion of the field of view. A sensor, which is part of the camera in the described embodiments, is positioned to receive light reflected into each field of view. The sensor also outputs data defining the images produced by the received reflections. A first memory is used to store data defining the image, or reflection, received through the first portion of the field of view, and a second memory is used to store data defining the image, or reflection, received through the second portion of the field of view.
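The split of the sensor's output into a first memory and a second memory can be sketched as follows. This is a minimal illustration, assuming the two portions of the field of view land side by side on the sensor; `split_sensor_frame` and the half-and-half layout are assumptions for illustration, not details taken from the patent.

```python
def split_sensor_frame(frame):
    # The single sensor frame contains both portions of the field of
    # view: here one half is taken as the direct view and the other
    # half as the reflector-offset view. Each half is copied into its
    # own buffer, analogous to the first memory and the second memory.
    width = len(frame[0])
    first_memory = [row[:width // 2] for row in frame]    # direct view
    second_memory = [row[width // 2:] for row in frame]   # offset view
    return first_memory, second_memory

frame = [[1, 2, 3, 4],
         [5, 6, 7, 8]]
first_memory, second_memory = split_sensor_frame(frame)
```

Keeping the two halves in separate buffers lets each pane of the display be driven independently from its own memory, as the embodiments below describe.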
- In certain embodiments, the mobile graphical device has a display with a two-paned image. The first pane shows an image defined by the data in the first memory as a direct view of the first portion of the field of view. The second pane, which is adjacent to the first pane, shows an image defined by the data in the second memory as an offset view of the first portion of the field of view as reflected through the second portion of the field of view. Also in certain embodiments, a first eye lens is positioned for viewing only the first pane and a second eye lens is positioned for viewing only the second pane. A viewer who views the images in the first pane and the second pane through the first eye lens and the second eye lens will see a single, three-dimensional image.
- In further embodiments, where the display of the mobile graphical device is a touch screen, areas may be placed on the display bordering the first pane and the second pane for producing stereoscopic, three-dimensional icons for user control. A first control icon area borders the first pane and a second control icon area borders the second pane. The control icon areas are placed on corresponding sides of the first pane and the second pane so that when viewed by an end user through the first eye lens and the second eye lens, the user will see a single three dimensional image and a single three dimensional icon in an area bordering the image. A described embodiment has a user control area adjacent to the three-dimensional icons and outside the area viewable through the eye lenses that provide user control functions when touched by an end user. For example, the end user may see a three-dimensional icon of a camera and be able to touch an area in the user control area near the icon to take a picture.
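The icon-placement and touch-handling logic described above can be sketched in a few lines. All names, coordinates, and the two example actions below are hypothetical; the sketch only shows the two ideas the embodiment relies on: drawing the same icon at corresponding positions in both panes so the copies fuse into one three-dimensional icon, and handling touches in a control area that lies outside the region visible through the eye lenses.

```python
# Hypothetical layout in pixels; not dimensions from the patent.
PANE_W = 240          # width of each display pane

def icon_positions(x, y):
    # Place the icon at matching local coordinates in the control icon
    # area bordering each pane; seen through the two eye lenses, the
    # two copies fuse into a single three-dimensional icon.
    return (x, y), (PANE_W + x, y)

def control_hit(touch_x, targets):
    # The user control area is outside the viewable region, so a touch
    # there triggers the adjacent icon's action without the user's
    # finger intruding on the three-dimensional image.
    for action, (x0, x1) in targets.items():
        if x0 <= touch_x < x1:
            return action
    return None

targets = {"shutter": (0, 80), "gallery": (80, 160)}
```

For example, a touch at x = 40 in the control strip would map to the hypothetical "shutter" action, i.e., taking a picture next to the camera icon.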
- Certain embodiments may have a cradle configured to receive the mobile graphical device, the sensor, the first memory and the second memory where the cradle is coupled to the first reflector, the second reflector, the first eye lens and the second eye lens. A housing may further be coupled to the cradle, the first reflector, the second reflector, the first eye lens and the second eye lens. Various embodiments allow for different arrangements of the cradle and/or the housing, such as arrangements where the area between the mobile graphical device and the eye lenses is open and arrangements where the area between the mobile graphical device and the eye lenses is enclosed.
- Also in described embodiments, the camera lens, the first reflector, the second reflector, the first eye lens and the second eye lens are positioned so that the first portion of the field of view and the second portion of the field of view simulate the view of a two-eyed human observer.
- Certain embodiments are designed for receiving a mobile graphical device into a cradle where the cradle has a surface with an aperture exposing a field of view as described above. Similarly, the first reflector and the second reflector are positioned as described above, except they are positioned relative to the aperture, and a first eye lens and a second eye lens are coupled to the cradle. Also, a fastener allows one to secure a mobile graphical device to the cradle. Embodiments are described wherein the cradle is configured to receive a mobile graphical device so that a camera lens in the mobile graphical device is positioned within the aperture of the cradle.
- The inventions will now be more particularly described by way of example with reference to the accompanying drawings. Novel features believed characteristic of the inventions are set forth in the claims. The inventions themselves, as well as the preferred mode of use, further objectives, and advantages thereof, are best understood by reference to the following detailed description of the embodiment in conjunction with the accompanying drawings, in which:
-
FIG. 1A shows a top plan view of the camera lens side of a mobile graphical device with a two reflector image splitting arrangement. -
FIG. 1B shows a side plan view of the camera lens side of a mobile graphical device with a two reflector image splitting arrangement. -
FIG. 1C shows a side plan view of a first reflector. -
FIG. 1D shows a top plan view of a first reflector. -
FIG. 1E shows a side plan view of a second reflector. -
FIG. 1F shows a top plan view of a second reflector. -
FIG. 2A shows a perspective view of the arrangement of a sensor, a camera lens, a first reflector and a second reflector splitting the field of view towards an object from the lens side. -
FIG. 2B shows the stereo camera image produced by the setup of FIG. 2A and the respective memory used to store data for each image. -
FIG. 3A shows a side view of the optical paths of the split fields of view with a prism used for the first reflector and the second reflector. -
FIG. 3B shows a top view of the optical paths of the offset field of view with a prism used for the first reflector and the second reflector. -
FIG. 4A shows a perspective view of the arrangement of a camera lens of a two-dimensional camera mounted on a mobile graphical device, a prism and a directing mirror splitting the field of view from the image side. -
FIG. 4B shows a top view of the arrangement of a two dimensional camera in the mobile graphical device, a prism and a directing mirror splitting the field of view. -
FIG. 5 shows a plan view of the display side of a mobile graphical device. -
FIG. 6A shows a double image and a user control GUI for using a camera as it appears in a two-dimensional, double-paned display of the mobile graphical device. -
FIG. 6B shows a single image and a user control GUI for using a camera as it appears in a three-dimensional, single pane on the display of the mobile graphical device. -
FIG. 6C shows a double image and a user control GUI for viewing photographs as it appears in a two-dimensional, double-paned display of the mobile graphical device. -
FIG. 6D shows a single image and a user control GUI for viewing photographs as it appears in a three-dimensional, single pane on the display of the mobile graphical device. -
FIG. 7A shows a perspective view of the display of a mobile graphical device with split images, user control icons, and a user control area as it appears in a two-dimensional view without using eye lenses. -
FIG. 7B shows a perspective view of the display of a mobile graphical device with an image, user control icons, and a user control area as it appears in a three-dimensional view using eye lenses. -
FIG. 8 shows a perspective view of how the GUI may be controlled with zebra strips. -
FIG. 9 shows a perspective view of the top of a viewer apparatus with a housing and a cradle and lenses coupled thereto. -
FIG. 10 shows a perspective view of the bottom of the viewer apparatus. -
FIG. 11 shows a front elevational view of the viewer apparatus. -
FIG. 12 shows a rear elevational view of the viewer apparatus. -
FIG. 13 shows a top elevational view of the viewer apparatus. -
FIG. 14 shows a bottom elevational view of the viewer apparatus. -
FIG. 15 shows a left side view of the viewer apparatus. -
FIG. 16 shows a right side view of the viewer apparatus. -
FIG. 17 shows a right side view of the viewer apparatus housing and cradle marked for the cross-sectional view of the following figure. -
FIG. 18 shows a cross-sectional view of the viewer apparatus housing and cradle. -
FIG. 19 shows a right side view of the viewer apparatus housing and cradle as balanced on a supporting surface for viewing. -
FIG. 20A shows a perspective view of a sample cradle used for receiving a mobile graphical device. -
FIG. 20B shows a perspective view of a mobile graphical device. -
FIG. 20C shows a perspective view of a housing. -
FIG. 21 shows a perspective view of how a sample cradle attaches to a housing. -
FIG. 22A shows a perspective view of the spring controlled mechanism of the spring latch used to attach the cradle to the housing. -
FIG. 22B shows a perspective view of how a user moves the spring controlled mechanism of the spring latch. -
FIG. 23 shows a perspective view of the cradle coupled to the housing and a mobile graphical device fastened to the cradle.
- The described embodiments reveal an apparatus and methods for producing images for stereoscopic viewing on mobile graphical devices, such as an iPhone™ or other similar telephones or communications devices, PDAs, cameras, video recorders, an iTouch™, an iPod, or other related devices. One embodiment of an apparatus for producing stereoscopic images comprises a mobile graphical device with a camera lens. The camera lens has a field of view wherein the field of view is limited to a first portion of the field of view and a second portion of the field of view. A first reflector, which may be a mirror or a prism, is juxtaposed relative to the camera so that the first reflector is outside of the first portion of the field of view, leaving the first portion of the field of view unobstructed by any reflectors and focused on what an end user wishes to see in three-dimensions. The second portion of the field of view comprises the first reflector, with the first reflector at an angle to the plane crossing the circumference of the camera lens. A second reflector is juxtaposed relative to the first reflector so that the second reflector reflects what is in the first portion of the field of view from an offset position. Everything that is reflected off of the second reflector is reflected off of the first reflector and into the camera lens. Thus, the camera lens receives a direct view of what is in the first portion of the field of view and an offset view of what is in the first portion of the field of view as reflected off of the two reflectors in the second portion of the field of view. A sensor receives the light reflected through both portions of the field of view and outputs data defining the received reflections.
A first memory and a second memory are used for storing the data defining the reflections received through the first portion of the field of view and the second portion of the field of view respectively. The mobile graphical device in the described embodiments comprises a display with a two-paned image. The first pane shows an image defined by the data in the first memory as a direct view of the first portion of the field of view. The second pane is adjacent to the first pane and shows an image defined by the data in the second memory as an offset view of the first portion of the field of view as reflected through the second portion of the field of view. A first eye lens and a second eye lens, which may be in the form of glasses or a binocular-like arrangement, may be added to the apparatus and positioned so that only the first pane may be viewed through the first lens and only the second pane may be viewed through the second lens.
- The apparatus may further comprise a user interface where the display is a touch screen that has a first control icon area bordering the first pane and a second control icon area bordering corresponding sides of the second pane so that, when viewed by an end user through the first eye lens and the second eye lens, the first pane and the second pane appear as a single three-dimensional image and the first control icon area and the second control icon area appear as a single set of three-dimensional icons. Additionally, a user control area adjacent to the three-dimensional icons and outside the area viewable through the first eye lens and the second eye lens may be used to provide user control functions when touched by an end user. Embodiments using the interface described in this paragraph may or may not include a camera and may or may not include a first reflector and a second reflector.
- Other embodiments are defined without the mobile graphical device wherein an apparatus for creating and viewing a split field of view for stereoscopic viewing comprises the first reflector, the second reflector, the first eye lens and the second eye lens, as already described. These embodiments further have a cradle with a fastener for securing a mobile graphical device and a surface including an aperture exposing a field of view. The field of view is limited to a first portion of the field of view and a second portion of the field of view and the cradle is configured to receive a mobile graphical device so that a camera lens in the mobile graphical device is positioned within the aperture of the cradle. The display on the mobile graphical device displays a two-paned image towards the first eye lens and the second eye lens with one pane being a direct view through the first portion of the field of view as seen through the first eye lens and the other pane being an offset view of the first portion of the field of view through the second portion of the field of view as seen through the second eye lens.
- The embodiments described thus far may be coupled together in various ways with the cradle configured to receive the mobile graphical device, the sensor, the first memory and the second memory wherein the cradle is coupled to the first reflector, the second reflector, the first eye lens and the second eye lens. Additionally, the embodiments described thus far may be configured together in various ways with the apparatus further comprising a housing coupled to the cradle, the first reflector, the second reflector, the first eye lens and the second eye lens. The use of the cradle and the housing should not be construed to limit the possible configurations of the cradle and the housing, including configurations where the cradle is part of the housing, where there is no housing, or where various openings in the housing allow access to the touch screen for user control.
- All embodiments described herein, where a two-paned display is used and the first eye lens and the second eye lens are used, are configured to simulate the view of a two-eyed human observer, whether or not a camera is used, whether or not reflectors are used, and whether or not a three-dimensional user interface is used.
-
FIG. 1A shows a top plan view of the camera lens 12 side of a mobile graphical device 10 with a two reflector image splitting arrangement. The mobile graphical device 10 may be an iPhone™ and may have a camera lens 12 having a diameter of 5.33 mm placed 10.75 mm from the left side of the mobile graphical device 10 and 10.2 mm from the top of the mobile graphical device 10. The first reflector 14 is shown as a trapezoidal-shaped mirror blocking half of the camera lens 12 and dividing the field of view of the camera lens 12. The second reflector 16 is shown as a square mirror set away from the first reflector 14. FIG. 1B shows a side plan view of the camera lens 12 side of a mobile graphical device 10 with a two reflector image splitting arrangement. The camera lens 12 is shown in relation to the first reflector 14 and the second reflector 16. The first reflector 14 is placed 6.0 mm in front of the camera lens 12 and positioned so that it evenly splits the field of view from the camera lens 12 into a first portion of the field of view 18 and, with the second reflector 16, into a second portion of the field of view 20. The second reflector 16 is attached to the mobile graphical device 10, 5.0 mm from the horizontal line crossing the center of the camera lens 12. The back side of the first reflector 14 is set 49.5° from the horizontal plane bisecting the camera lens 12, and the front side of the second reflector 16 is set 39.7° from the horizontal plane 5.0 mm below the horizontal plane bisecting the camera lens 12. This placement of the reflectors ensures that the first portion of the field of view 18 and the second portion of the field of view 20 are convergent and not parallel. FIG. 1B also shows an embodiment with the silvered side of the first reflector 15 and the silvered side of the second reflector 17 on the back of the respective reflectors. In embodiments where the material protecting the silvered side of the reflector may cause unacceptable refraction, unacceptable degradation in light intensity or any other unacceptable characteristic caused by the material, the silvered side of the first reflector 15 and the silvered side of the second reflector 17 may be placed on the front of the first reflector 14 and the second reflector 16 respectively to create first surface mirrors. The dashed lines in FIG. 1B show the general direction of the first portion of the field of view 18 and the second portion of the field of view 20. -
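The convergence of the two portions of the field of view follows from a standard optics fact: two successive mirror reflections rotate a ray by twice the angle between the mirror lines, here 2 × (49.5° − 39.7°) = 19.6°, so the offset axis cannot be parallel to the direct axis. The sketch below checks this numerically by applying the law of reflection twice; the sign conventions and mirror orientations are assumptions, since only the two angles are taken from the description of FIG. 1B.

```python
import math

def reflect(d, theta_deg):
    # Reflect direction vector d across a mirror line inclined at
    # theta_deg to the horizontal (law of reflection: d' = d - 2(d.n)n).
    t = math.radians(theta_deg)
    n = (-math.sin(t), math.cos(t))          # unit normal to the mirror
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

FIRST_REFLECTOR_DEG = 49.5    # back side of first reflector 14
SECOND_REFLECTOR_DEG = 39.7   # front side of second reflector 16

direct_axis = (1.0, 0.0)      # direct view through the first portion
offset_axis = reflect(reflect(direct_axis, FIRST_REFLECTOR_DEG),
                      SECOND_REFLECTOR_DEG)

# Two reflections compose to a rotation by 2 * (49.5 - 39.7) = 19.6 deg,
# so the offset axis converges with, rather than parallels, the direct axis.
angle_deg = math.degrees(math.atan2(offset_axis[1], offset_axis[0]))
```

The computed angle between the two axes is 19.6° in magnitude regardless of the orientation conventions chosen, confirming that the cited reflector angles yield convergent rather than parallel fields of view.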
FIG. 1C shows a side plan view of a first reflector 14. In this figure, the total thickness of the first reflector 14, including the silvered side of the first reflector 15 and the protective material, which may be glass, acrylic, or another transparent material, is 3.0 mm. FIG. 1D shows a top plan view of a first reflector 14, which is shown as trapezoidal with a length of 20.0 mm, a narrow end measuring 10.0 mm, and a wide end measuring 20.0 mm. FIG. 1E shows a side plan view of a second reflector 16. In this figure, the total thickness of the second reflector 16, including the silvered side of the second reflector 17 and the protective material, which may be glass, acrylic, or another transparent material, is also 3.0 mm. FIG. 1F shows a top plan view of a second reflector 16, which is shown as square with a length and a width of 23.5 mm each.
FIG. 2A shows a perspective view of the arrangement of a sensor 22, a camera lens 12, a first reflector 14, and a second reflector 16 splitting the field of view towards a subject of the field of view 24 from the camera lens 12 side. The dashed lines show the first portion of the field of view 18 and the second portion of the field of view 20 as seen through the camera lens 12, with the left image center 26 and the right image center 28 positioned accordingly and crossing the subject of the field of view 24. This figure shows the entire field of view bisected into the first portion of the field of view 18 and the second portion of the field of view 20, where the first portion of the field of view 18 is an unimpeded, direct view of the subject of the field of view 24 and where the second portion of the field of view 20 is an offset view of the first portion of the field of view 18 as reflected off of the first reflector 14 and the second reflector 16. The sensor 22 is shown positioned behind the camera lens 12 so that it senses the combination of the first portion of the field of view 18 and the second portion of the field of view 20. FIG. 2B shows the stereo camera image 30 produced by the setup of FIG. 2A and the respective memory used to store data for each image. The left image 32 represents the direct view through the first portion of the field of view 18, which is presented based on the data stored in the first memory 31. The right image 34 represents the offset view of the first portion of the field of view 18 as seen through the second portion of the field of view 20, which is presented based on the data stored in the second memory 33.
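Because the single sensor captures both portions side by side, the left/right bookkeeping reduces to splitting one frame down the middle. A minimal sketch of that split, assuming the reflector bisects the frame vertically (the frame is modeled as a plain list of pixel rows; names are illustrative):

```python
def split_stereo_frame(frame):
    """Split one captured frame (a list of pixel rows) into the direct
    (left) half and the mirror-offset (right) half, mirroring the
    first-memory/second-memory arrangement described for the sensor."""
    width = len(frame[0])
    left_image = [row[: width // 2] for row in frame]   # first portion: direct view
    right_image = [row[width // 2 :] for row in frame]  # second portion: offset view
    return left_image, right_image

# Example: a dummy 4-row by 8-column frame of zero-valued pixels.
frame = [[0] * 8 for _ in range(4)]
left, right = split_stereo_frame(frame)
# Each half keeps all 4 rows but only 4 of the 8 columns.
```

In the described apparatus the two halves would then be stored in the first memory 31 and second memory 33 and presented as the left image 32 and right image 34.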
FIG. 3A shows a side view of the optical paths of the split fields of view with a prism 36 used for the first reflector 14 and the second reflector 16. The mobile graphical device 10 is shown with the display side facing downward and the camera lens 12 facing upward. In this figure, the first reflector 14 and the second reflector 16 are not shown as mirrors but as interior surfaces of a prism 36. The dashed lines show how light is reflected and represent the first portion of the field of view 18 and the second portion of the field of view 20. The prism 36 is attached to the mobile graphical device 10 using a prism mount 38 that does not interfere with the fields of view. As in the embodiments using mirrors, the field of view through the camera lens 12 is split with a prism 36 in the same manner. FIG. 3B shows a top view of the optical paths of the offset, second portion of the field of view 20 with a prism 36 used for the first reflector 14 and the second reflector 16.
FIG. 4A shows a perspective view of the arrangement of a camera lens 12 of a two-dimensional camera 42 mounted on a mobile graphical device 10, a prism 36, and a directing mirror 40 splitting the field of view from the image side. The field of view is shown split into the first portion of the field of view 18 and the second portion of the field of view 20, which are directed to producing a left image 32 and a right image 34, where the first portion of the field of view 18 and the second portion of the field of view 20 are ultimately convergent so that the left image 32 represents a direct view and the right image 34 is an offset view of the left image 32. FIG. 4B shows a top view of the arrangement of a two-dimensional camera 42 in the mobile graphical device 10, a prism 36, and a directing mirror 40 splitting the field of view. The first portion of the field of view 18 is shown as a direct view of the left image 32, and the second portion of the field of view 20 is shown as a right image 34, which is an offset view of the left image 32.
FIG. 5 shows a plan view of the display side of a mobile graphical device 10. The display 43 is shown divided into a first pane 44 and a second pane 46 with a pane divider 50 between the first pane 44 and the second pane 46. In this embodiment, the top and the right side of the display 43 are bordered by a user control area 48 that can control user functions when touched by an end user. In the shown embodiment, the display 43 is 2.94 inches along the top and 1.92 inches along the side. The user control area 48 is 2.94 inches along the top, 1.92 inches along the side, and 0.14 inches thick. The first pane 44 and the second pane 46 are each 1.77 inches high and 1.32 inches wide, while the pane divider is 0.6 inches wide. The first pane 44 shows an image representing the first portion of the field of view 18, and the second pane 46 shows an image representing the second portion of the field of view 20. In this example, the first control icon area 54 and the second control icon area 56 will be within the first pane 44 and the second pane 46, respectively. The end user will be able to touch the user control area to activate functions associated with the three-dimensional icons.
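The two-panes-plus-central-divider arrangement can be expressed as simple rectangle arithmetic. A minimal layout helper, sketched in pixels rather than the inch dimensions given above (all pixel values are illustrative assumptions, not taken from the patent):

```python
def pane_layout(display_w_px, display_h_px, divider_w_px):
    """Split a display of display_w_px x display_h_px into a first pane,
    a central pane divider, and a second pane, returned as
    (x, y, width, height) rectangles in left-to-right order."""
    pane_w = (display_w_px - divider_w_px) // 2
    first_pane = (0, 0, pane_w, display_h_px)
    divider = (pane_w, 0, divider_w_px, display_h_px)
    second_pane = (pane_w + divider_w_px, 0, pane_w, display_h_px)
    return first_pane, divider, second_pane

# Example: a hypothetical 960x640-pixel display with a 60-pixel divider.
first, div, second = pane_layout(960, 640, 60)
# first == (0, 0, 450, 640), div == (450, 0, 60, 640),
# second == (510, 0, 450, 640)
```

Each eye lens of the viewer then maps onto one of the two equal pane rectangles, with the divider region hidden behind the view divider.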
FIG. 6A shows a double image and a user control GUI for using a camera as it appears in a two-dimensional, double-paned display 43 of the mobile graphical device 10. The first pane 44 shows a left image 32 and the second pane 46 shows a right image 34. The first control icon area 54 and the second control icon area 56 show icons that may be used in this user interface. For example, from left to right in both panes, the icons may be used to select this particular user interface, to go back to a previous user interface, to choose the type of file to be created as a camera file, and to take a picture. While a user is setting up to take a picture and viewing a live scene, the first pane 44 contains a first live image view 52 and the second pane 46 contains a second offset live image view 53. FIG. 6B shows a single image and a user control GUI for using a camera as it appears in a three-dimensional, single pane on the display 43 of the mobile graphical device 10. In the view shown in FIG. 6B, which is a three-dimensional view of the split image, the user sees the first live image view 52 in a single pane. Additionally, the icons in the control icon area will also appear three-dimensional.
FIG. 6C shows a double image and a user control GUI for viewing photographs as it appears in a two-dimensional, double-paned display 43 of the mobile graphical device 10. The first pane 44 shows a left image 32 and the second pane 46 shows a right image 34. The first control icon area 54 and the second control icon area 56 show icons that may be used in this user interface. For example, from left to right in both panes, the icons may be used to select this particular user interface, to go back to a previous user interface, to choose the type of file to be viewed as a photo file, and to move up and down the list of photo categories. While a user is using the interface to view photographs, the first pane 44 contains a first photo library list 58 and the second pane 46 contains a second photo library list 60. FIG. 6D shows a single image and a user control GUI for viewing photographs as it appears in a three-dimensional, single pane on the display of the mobile graphical device 10. In the view shown in FIG. 6D, which is a three-dimensional view of the split image, the user sees the first photo library list 58 in a single pane. Additionally, the icons in the control icon area will also appear three-dimensional.
FIG. 7A shows a perspective view of the display 43 of a mobile graphical device 10 with split images, user control icons, and a user control area 48 as it appears in a two-dimensional view without using eye lenses. The first pane 44 shows a left image 32, which in this case is a square. The second pane 46 shows a right image 34, which in this case is also a square. The first pane 44 also contains the first control icon area 54, and the second pane 46 contains the second control icon area 56. This figure shows a mobile graphical device 10, such as an iPhone™, as it appears in a two-dimensional mode with no attachments to provide three-dimensional viewing. FIG. 7B shows a perspective view of the display 43 of a mobile graphical device 10 with a three-dimensional image 62, three-dimensional user control icons 64, and a user control area 48 as it appears in a three-dimensional view using glasses 65 with a first eye lens 66 and a second eye lens 68. In this figure, a cradle 70 is configured to receive the mobile graphical device 10 and any accessories related to the mobile graphical device 10, such as a camera and its associated sensors and memory.

With reference to the
display 43 of the mobile graphical device 10, the three-dimensional image 62 is facilitated with user control icons 64 at user control area 48 for three-dimensional viewing and navigation. User navigation with associated screens of the user control GUI is achieved, e.g., as virtual rooms or various game environments such that the user is allowed to turn or rotate with head movements, gestures, and the like via the mobile graphical device 10 in any direction to interact, participate in game play, or observe and select different menu items where the user looks. Additionally, users may make tap entries on-screen to make appropriate game selections. Screen buttons are placed on the bottom-left or bottom-right portions of the screen in user control area 48, and as discussed further below, a plurality of user thumb or finger openings positioned at the bottom of the viewer apparatus housing 76 allow for advantageous relative positions where users can make screen selections during game play.

To this end, the primary interactions are typically directed to the lower portion of the screen to minimize visual disruption and smearing of the screen. Virtual interactive buttons and controls may be limited, spaced, or sized for easy targeting to alleviate difficulty in placing fingers/thumbs when viewing the stereoscopic imagery. Where appropriate, swipe up/down and other gestures inherent to the mobile
graphical device 10 can be enabled as a secondary or redundant mode of interaction. User games and Explore modes use the navigation and control GUI to allow the user to enter into a 360° scene and choose from available objects such as floating icons and the like. For example, a trivia game may present subjects for multiple-choice, text-based trivia questions. The available subjects in the scene allow the user to aim the headset at the correct answer. Other skill games may utilize the three-dimensional stereoscopic platform to test user skill and balance as the user navigates, e.g., a rolling ball through a whimsical abstract virtual environment, avoiding hazards and traps while navigating to the finish line before a timer runs out.
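Aiming the headset at an answer reduces to picking the scene object whose direction best matches the device's forward vector. A minimal sketch, assuming unit direction vectors are available from the device's orientation sensors (the function and target names are hypothetical, not part of the described apparatus):

```python
def pick_aimed_target(forward, targets):
    """Return the name of the target whose unit direction vector has the
    largest dot product with the headset's forward vector, i.e. the
    smallest angular distance from where the user is looking."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(targets, key=lambda name: dot(forward, targets[name]))

# Example: three answer icons floating around the user in a 360-degree scene.
targets = {
    "answer_a": (1.0, 0.0, 0.0),
    "answer_b": (0.0, 0.0, 1.0),
    "answer_c": (-1.0, 0.0, 0.0),
}
looking = (0.966, 0.0, 0.259)  # headset facing roughly toward answer_a
selected = pick_aimed_target(looking, targets)
```

A dwell timer or a tap through the thumb holes described below could then confirm the aimed selection.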
FIG. 8 shows a perspective view of how the GUI may be controlled with zebra strips 72, such as ZEBRA™ elastomeric electronic connectors from Fujipoly, although other similar products may be used. For example, as described thus far, a mobile graphical device 10 having a display 43 with a first pane 44 showing a left image 32 and a second pane 46 showing a right image 34, where the first pane 44 has a first control icon area 54 and the second pane 46 has a second control icon area 56, may use the zebra strips 72 instead of a user control area 48 to allow the user to select icon functionality while keeping the user's fingers out of the view of the display 43. Thus, the display 43 as seen by the user through the first eye lens 66 and the second eye lens 68 will appear unblocked by fingers as a three-dimensional image with three-dimensional icons as the image in the viewer's brain 74.
FIG. 9 shows a perspective view of the top of a viewer apparatus with a housing 76 and a cradle 70 and lenses coupled thereto. The first reflector 14 and the second reflector 16 are shown coupled to the combination of the cradle 70 and the housing 76. Between the first reflector 14 and the second reflector 16 is an aperture 78 in the cradle 70 and/or the housing 76 that may align with a camera. Also shown in this figure is a base support 80 for when a user wishes to balance the viewer apparatus on a support surface for viewing through the opposite end. FIG. 10 shows a perspective view of the bottom of the viewer apparatus. The housing 76, the cradle 70, the first reflector 14, the second reflector 16, and the base support 80 are shown from a viewpoint opposite that shown in FIG. 9. Also shown in FIG. 10 is the mobile graphical device 10 as it is received by the cradle 70, as well as a first thumb hole 82 and a second thumb hole 84 that allow a user to control user interfaces as previously described. Holes may be placed elsewhere in the housing for finger control if so desired. A cradle latch 86 is further shown in this figure. The cradle latch 86 may be used to lock and release the cradle 70 to and from the housing 76 so that the mobile graphical device 10 may be inserted into and removed from the apparatus.
FIG. 11 shows a front elevational view of the viewer apparatus. This view shows what a user sees when putting the viewer apparatus to the user's eyes. The viewer sees the front of the housing 76, which has the first eye lens 66, the second eye lens 68, and a nose slot 88 so that the user may place the viewer apparatus unimpeded to the user's eyes. FIG. 12 shows a rear elevational view of the viewer apparatus. This view shows the viewer apparatus facing away from the viewer as the viewer looks into the apparatus. The cradle 70 is shown holding the mobile graphical device 10, and the tops of the first reflector 14 and the second reflector 16 are also shown along with the base support 80.
FIG. 13 shows a top elevational view of the viewer apparatus showing the relative positions of the cradle 70, the housing 76, and the base support 80 from this angle. Also shown is a reflector mount 90 that holds the reflectors in place. FIG. 14 shows a bottom elevational view of the viewer apparatus showing the relative positions of the previously described cradle 70, housing 76, base support 80, first thumb hole 82, second thumb hole 84, cradle latch 86, and nose slot 88.
FIG. 15 shows a left side view of the viewer apparatus, and FIG. 16 shows a right side view of the viewer apparatus. Both of these figures show the relative positions, from both sides, of the cradle 70, the housing 76, the base support 80, and the reflector mount 90. FIG. 16 further shows another angle of the mobile graphical device 10 positioned in the cradle 70.
FIG. 17 shows a right side view of the viewer apparatus housing 76 and cradle 70 marked for the cross-sectional view of the following figure. As in FIG. 16, this figure shows the relative positions from the right side of the cradle 70, the housing 76, the base support 80, the reflector mount 90, and the mobile graphical device 10 positioned in the cradle 70. This figure also shows a cross-sectional cut line 92 to indicate the location of the cross-sectional view shown in FIG. 18.
FIG. 18 shows a cross-sectional view of the viewer apparatus housing 76 and cradle 70. From this angle, the relative positions of the cradle 70, the housing 76, the base support 80, the first eye lens 66, and the second eye lens 68 are shown. Also shown is a view divider 94 that separates the inside of the housing 76 so that a viewer, when looking through the first eye lens 66, will only see the left image 32 in the first pane 44 and, when looking through the second eye lens 68, will only see the right image 34 in the second pane 46.
FIG. 19 shows a right side view of the viewer apparatus housing 76 and cradle 70 as balanced on a supporting surface 96 for viewing, with the base support 80 supporting the apparatus.
FIG. 20A shows a perspective view of a sample cradle 70 used for receiving a mobile graphical device 10. The cradle latch 86 is shown on top and is used to latch the cradle 70 to the housing 76. A pair of cradle snaps 98 are shown and are for snapping into a latch on the housing 76 so that the cradle 70 may pivot and enclose the mobile graphical device 10 within the cradle 70 and the housing 76. FIG. 20B shows a perspective view of a mobile graphical device 10 oriented to be received by the cradle 70. The cradle 70 in FIG. 20A is sized to act as a fastener for securing the mobile graphical device 10. FIG. 20C shows a perspective view of a housing 76. A spring latch 100 is shown that receives the cradle snap 98 shown in FIG. 20A and allows the cradle 70 to pivot into the housing 76 and snap closed. A housing snap 102 secures the cradle 70 to the housing 76 by coupling with the cradle latch 86 shown in FIG. 20A.
FIG. 21 shows a perspective view of how a sample cradle 70 attaches to a housing 76. The cradle 70 has a pair of cradle snaps 98 that attach to a spring latch 100. The spring latch 100 comprises a spring 104 and a pair of spring close tabs 106 that can be compressed to move a pair of dowels 108 into the body of the spring latch. The cradle snaps 98 have holes that can be positioned to receive the dowels 108 when the spring close tabs 106 are released, thus attaching the cradle 70 to the housing 76. FIG. 22A shows a perspective view of the spring-controlled mechanism of the spring latch 100 used to attach the cradle 70 to the housing 76, and FIG. 22B shows a perspective view of how a user moves the spring-controlled mechanism of the spring latch 100. The spring 104 is compressed by pressing the spring close tabs 106 inward with a thumb and forefinger.
FIG. 23 shows a perspective view of the cradle 70 coupled to the housing 76 and a mobile graphical device 10 fastened to the cradle 70. A coupled assembly 110 is shown so that the cradle 70 is pivotably secured to the housing 76. The display 43 on the mobile graphical device 10 is positioned so that when the cradle 70 is pivoted closed, the display 43 will face the eye lenses and be separated by the view divider 94. The first thumb hole 82 and the second thumb hole 84 are shown to be within reach of the display 43 touch screen when the apparatus is closed.

Use of the embodiments described above may be extended so that several devices may be coupled through a network. In this scenario, users may share their fields of view and three-dimensional views of their respective surroundings. The described embodiments may also be adapted for mobile graphical devices that do not have a camera by replacing the reflectors with a dual camera accessory that is configured to capture a direct field of view and an offset field of view as described above. This type of arrangement may also be configurable with the viewing apparatus described above.
One problem that may exist in the described embodiments is that light concentrated through the eye lenses into the viewing apparatus may damage the display of the mobile graphical device. This problem may be remedied by using a solar control film such as LLumar R50 on the display, which will prevent degradation of an LCD or other type of display.
While the present inventions have been illustrated by a description of various embodiments, and while these embodiments have been set forth in considerable detail, it is intended that the scope of the inventions be defined by the appended claims. It will be appreciated by those skilled in the art that modifications to the foregoing preferred embodiments may be made in various aspects. It is deemed that the spirit and scope of the inventions encompass such variations to the preferred embodiments as would be apparent to one of ordinary skill in the art familiar with the teachings of the present application.
Claims (23)
1-20. (canceled)
21. An apparatus adapted to receive and hold an electronic device which includes a camera and a display, the apparatus comprising:
a. a first optical system for forming two images of an object having spaced apart perspectives in a substantially stereoscopic relationship to each other and delivering the two images to the camera in a side-by-side relationship; and
b. a second optical system including a pair of lenses, each lens for receiving a portion of an image from the display and presenting a respective portion of the image to one of a user's eyes for forming a stereoscopic view.
22. The apparatus of claim 21 wherein the first optical system comprises two portions.
23. The apparatus according to claim 22 wherein a first portion of the optical system comprises free space for providing an unaltered first image to substantially a first half of the camera.
24. The apparatus according to claim 23 wherein a second portion of the optical system comprises a pair of mirrors.
25. The apparatus according to claim 24 wherein a first mirror is positioned to receive a second image having a spaced apart perspective of the unaltered first image and for providing that second image to a second mirror which is positioned to provide that second image to substantially a second half of the camera.
26. The apparatus according to claim 25 wherein the first mirror is positioned at an angle of 49.5 degrees from an axis of the camera and the second mirror is positioned at an angle of 39.7 degrees from the axis.
27. The apparatus according to claim 22 wherein a second portion of the optical system comprises a prism.
28. The apparatus according to claim 27 wherein a first internal face of the prism is positioned to receive a second image having a spaced apart perspective of the unaltered first image and for providing that second image to a second internal face which is positioned to provide that second image to substantially a second half of the camera.
29. The apparatus of claim 21 wherein the second optical system comprises glasses.
30. The apparatus of claim 21 wherein the second optical system is formed as binoculars.
31. The apparatus of claim 21 wherein the second optical system comprises a solar protection film.
32. The apparatus of claim 21 wherein the display is a touch screen and the second optical system includes an aperture for allowing access to the user for entering commands on the touch screen.
33. An apparatus comprising:
a. a housing including a first end and a second end, wherein the housing includes a pair of eye lenses located at the first end;
b. a cradle coupled with the housing at the second end and sized to receive a mobile graphic device, wherein the cradle includes:
i. an aperture aligned with a camera lens of the mobile graphic device when the mobile graphic device is received in the cradle; and
ii. a first fixed reflector coupled with the cradle and positioned to divide the aperture into two portions such that light directly enters through a first portion of the aperture; and
iii. a second fixed reflector coupled with the cradle and set apart from the first reflector such that light indirectly enters through a second portion of the aperture by reflecting initially off the second fixed reflector and subsequently off the first fixed reflector.
34. An apparatus comprising:
a. a housing including a first end and a second end, wherein the housing includes a pair of eye lenses located at the first end; and
b. a cradle coupled with the housing at the second end and sized to receive a mobile graphic device, wherein the cradle includes:
i. an aperture aligned with a camera lens of the mobile graphic device when the mobile graphic device is received in the cradle;
ii. a first fixed reflector coupled with the cradle and positioned to divide a field of view of the camera lens into two portions, wherein a first portion of the field of view is an unimpeded direct view of a subject of the field of view; and
iii. a second fixed reflector coupled with the cradle and set apart from the first reflector, wherein a second portion of the field of view is an offset view of the first portion as reflected off the second fixed reflector and subsequently off the first fixed reflector.
35. A non-transitory computer readable medium storing instructions that, when executed by a computing device, cause the computing device to perform a method, the method comprising:
a. receiving a live image taken through a camera lens, wherein a first portion of the live image is of an unimpeded direct view of a subject of a field of view of the camera lens and a second portion of the live image is of an offset view of the first portion;
b. dividing a display into a first pane and a second pane with a pane divider between the first pane and a second pane; and
c. displaying the first portion of the live image in the first pane and the second portion of the live image in the second pane.
36. The non-transitory computer readable medium storing instructions according to claim 35 further comprising forming a GUI portion on the display.
37. The non-transitory computer readable medium storing instructions according to claim 36 wherein a user can control an image on the display.
38. A non-transitory computer readable medium storing instructions that, when executed by a computing device, cause the computing device to perform a method, the method comprising:
a. storing an image in a memory, wherein a first portion of the image is of an unimpeded direct view of a subject of a field of view of the camera lens and a second portion of the image is of an offset view of the first portion;
b. dividing a display into a first pane and a second pane with a pane divider between the first pane and a second pane; and
c. displaying the first portion of the image in the first pane and the second portion of the image in the second pane.
39. The non-transitory computer readable medium storing instructions according to claim 38 further comprising forming a GUI portion on the display.
40. The non-transitory computer readable medium storing instructions according to claim 39 wherein a user can control an image on the display.
41. The non-transitory computer readable medium storing instructions according to claim 38 further comprising controlling the image based upon movement of a user's head in concert with the computing device.
42. The non-transitory computer readable medium storing instructions according to claim 38 further comprising controlling a GUI by aiming a headset that holds the computing device at a virtual object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/465,757 US20140362175A1 (en) | 2010-03-24 | 2014-08-21 | Apparatus and method for producing images for stereoscopic viewing |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US31703510P | 2010-03-24 | 2010-03-24 | |
US38634610P | 2010-09-24 | 2010-09-24 | |
US13/017,157 US8908015B2 (en) | 2010-03-24 | 2011-01-31 | Apparatus and method for producing images for stereoscopic viewing |
US14/465,757 US20140362175A1 (en) | 2010-03-24 | 2014-08-21 | Apparatus and method for producing images for stereoscopic viewing |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/017,157 Continuation US8908015B2 (en) | 2010-03-24 | 2011-01-31 | Apparatus and method for producing images for stereoscopic viewing |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140362175A1 true US20140362175A1 (en) | 2014-12-11 |
Family
ID=44673545
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/017,157 Expired - Fee Related US8908015B2 (en) | 2010-03-24 | 2011-01-31 | Apparatus and method for producing images for stereoscopic viewing |
US14/465,757 Abandoned US20140362175A1 (en) | 2010-03-24 | 2014-08-21 | Apparatus and method for producing images for stereoscopic viewing |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/017,157 Expired - Fee Related US8908015B2 (en) | 2010-03-24 | 2011-01-31 | Apparatus and method for producing images for stereoscopic viewing |
Country Status (3)
Country | Link |
---|---|
US (2) | US8908015B2 (en) |
EP (1) | EP2550642A4 (en) |
WO (1) | WO2011119459A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016139497A1 (en) * | 2015-03-05 | 2016-09-09 | Intellisense Zrt. | Optical attachment for mobile display devices |
US20160349791A1 (en) * | 2015-05-27 | 2016-12-01 | Google Inc. | Virtual reality headset and device case |
US20170257618A1 (en) * | 2016-03-03 | 2017-09-07 | Disney Enterprises, Inc. | Converting a monocular camera into a binocular stereo camera |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI371681B (en) * | 2009-09-18 | 2012-09-01 | Primax Electronics Ltd | Notebook computer with multi-image capture function |
US20130222363A1 (en) * | 2012-02-23 | 2013-08-29 | Htc Corporation | Stereoscopic imaging system and method thereof |
JP5910485B2 (en) * | 2012-03-16 | 2016-04-27 | 株式会社リコー | Imaging system |
US9075572B2 (en) * | 2012-05-02 | 2015-07-07 | Google Technology Holdings LLC | Media enhancement dock |
US9690111B2 (en) | 2012-07-03 | 2017-06-27 | Not Flat Photos, Llc | Collapsible stereoscopic viewer |
US9473760B2 (en) * | 2012-08-08 | 2016-10-18 | Makerbot Industries, Llc | Displays for three-dimensional printers |
US9154677B2 (en) * | 2012-09-20 | 2015-10-06 | Apple Inc. | Camera accessory for angled camera viewing |
GB2499102B (en) * | 2013-01-11 | 2013-12-25 | Mvr Global Ltd | Head-mounted display device |
US20150103146A1 (en) * | 2013-10-16 | 2015-04-16 | Qualcomm Incorporated | Conversion of at least one non-stereo camera into a stereo camera |
US9615081B2 (en) | 2013-10-28 | 2017-04-04 | Lateral Reality Kft. | Method and multi-camera portable device for producing stereo images |
EP2866446B1 (en) * | 2013-10-28 | 2016-07-06 | Lateral Reality Kft. | Method and multi-camera portable device for producing stereo images |
KR20150081765A (en) * | 2014-01-06 | 2015-07-15 | 삼성전자주식회사 | Outputting Method For Screen data And Electronic Device supporting the same |
US20150256817A1 (en) * | 2014-05-27 | 2015-09-10 | Albert John Hofeldt | Imaging Adapter and Method of recording and playback |
US9551873B2 (en) * | 2014-05-30 | 2017-01-24 | Sony Interactive Entertainment America Llc | Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content |
US9420075B2 (en) * | 2014-07-16 | 2016-08-16 | DODOcase, Inc. | Virtual reality viewer and input mechanism |
US9910504B2 (en) * | 2014-08-21 | 2018-03-06 | Samsung Electronics Co., Ltd. | Sensor based UI in HMD incorporating light turning element |
JP6410548B2 (en) * | 2014-10-10 | 2018-10-24 | 興和株式会社 | Stereoscopic device and program |
DE102015100680B4 (en) * | 2015-01-19 | 2016-10-13 | Carl Zeiss Ag | Methods and devices for environmental representation |
US9804393B1 (en) | 2015-02-09 | 2017-10-31 | Google Inc. | Virtual reality headset |
US10209769B2 (en) | 2015-05-27 | 2019-02-19 | Google Llc | Virtual reality headset |
US9857595B2 (en) | 2015-07-31 | 2018-01-02 | Google Llc | Integrated mobile device shipping container and virtual reality headset |
US10139637B2 (en) | 2015-07-31 | 2018-11-27 | Google Llc | Integrated mobile device packaging and virtual reality headset |
USD792398S1 (en) | 2015-07-31 | 2017-07-18 | Google Inc. | Smartphone packaging and virtual reality headset |
USD853231S1 (en) | 2016-02-24 | 2019-07-09 | Google Llc | Combined smartphone package and virtual reality headset |
EP3217355A1 (en) * | 2016-03-07 | 2017-09-13 | Lateral Reality Kft. | Methods and computer program products for calibrating stereo imaging systems by using a planar mirror |
ES1161684Y (en) * | 2016-04-20 | 2016-10-21 | Mayordomo Juan Antonio Martinez | Portable electronic device |
JP6992094B2 (en) | 2017-09-11 | 2022-01-13 | グーグル エルエルシー | Switchable virtual reality and augmented reality devices |
GB2569325B (en) | 2017-12-13 | 2020-05-06 | Imperial Innovations Ltd | Ear examination apparatus |
US11526014B2 (en) | 2019-07-16 | 2022-12-13 | Texas Instruments Incorporated | Near eye display projector |
US11624908B2 (en) | 2019-12-28 | 2023-04-11 | Lenovo (Singapore) Pte. Ltd. | Optical assembly |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030072570A1 (en) * | 2001-10-12 | 2003-04-17 | Pentax Corporation | Stereo camera and automatic convergence adjusting device |
US20050237517A1 (en) * | 2004-04-27 | 2005-10-27 | Santa Barbara Infrared, Inc. | Optical alignment method and system |
US20060274218A1 (en) * | 2005-03-15 | 2006-12-07 | Jiuzhi Xue | Windows with electrically controllable transmission and reflection |
US20090046141A1 (en) * | 2005-09-13 | 2009-02-19 | Konami Digital Entertainment Co., Ltd. | Stereoscopic spectacles |
US20100013910A1 (en) * | 2008-07-21 | 2010-01-21 | Vivid Medical | Stereo viewer |
US20100225744A1 (en) * | 2009-03-09 | 2010-09-09 | Masaomi Tomizawa | Shooting apparatus and shooting control method |
US20130016181A1 (en) * | 2010-03-30 | 2013-01-17 | Social Animal Inc. | System and method for capturing and displaying cinema quality panoramic images |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4009951A (en) * | 1973-11-05 | 1977-03-01 | Ihms James E | Apparatus for stereoscopic photography |
US4288819A (en) * | 1979-05-10 | 1981-09-08 | Williams Robert T | Multi-field imaging device |
GB2129994A (en) * | 1982-10-29 | 1984-05-23 | Video Technology Limited | Apparatus for displaying a three-dimensional image |
US6417969B1 (en) | 1988-07-01 | 2002-07-09 | Deluca Michael | Multiple viewer headset display apparatus and method with second person icon display |
US20010015753A1 (en) | 2000-01-13 | 2001-08-23 | Myers Kenneth J. | Split image stereoscopic system and method |
JP4049977B2 (en) | 2000-09-01 | 2008-02-20 | パイオニア株式会社 | Communication terminal device and lens adapter used for communication terminal device |
EP1410621A1 (en) | 2001-06-28 | 2004-04-21 | Omnivee Inc. | Method and apparatus for control and processing of video images |
GB2385428A (en) * | 2002-01-17 | 2003-08-20 | Zoran Perisic | Apparatus for creating or viewing three dimensional photography |
JP4297653B2 (en) | 2002-07-04 | 2009-07-15 | シャープ株式会社 | Information equipment with stereoscopic image display function |
KR100554991B1 (en) | 2002-09-17 | 2006-02-24 | 샤프 가부시키가이샤 | Electronics with two and three dimensional display functions |
JP3973525B2 (en) | 2002-09-24 | 2007-09-12 | シャープ株式会社 | Electronic device having 2D (2D) and 3D (3D) display functions |
GB0307077D0 (en) | 2003-03-27 | 2003-04-30 | Univ Strathclyde | A stereoscopic display |
KR20070001157A (en) | 2004-02-07 | 2007-01-03 | 패트릭 로만 아마루 | Portable device for viewing image and associated production method |
DE102004010369A1 (en) | 2004-03-03 | 2005-11-24 | Siemens Ag | Apparatus and method for the stereoscopic reproduction of image information on a screen |
KR100677569B1 (en) | 2004-12-13 | 2007-02-02 | 삼성전자주식회사 | 3D image display apparatus |
KR101112735B1 (en) | 2005-04-08 | 2012-03-13 | 삼성전자주식회사 | 3D display apparatus using hybrid tracking system |
KR100649523B1 (en) | 2005-06-30 | 2006-11-27 | 삼성에스디아이 주식회사 | Stereoscopic image display device |
KR100728115B1 (en) | 2005-11-04 | 2007-06-13 | 삼성에스디아이 주식회사 | Three-dimensional display device and driving method thereof |
EP1962175A4 (en) | 2005-12-14 | 2012-09-12 | Yappa Corp | Image display device |
ES2771676T3 (en) | 2005-12-20 | 2020-07-06 | Koninklijke Philips Nv | Autostereoscopic display device |
GB2436409A (en) * | 2006-03-20 | 2007-09-26 | Sharp Kk | Camera with reflector for forming images on different sensor portions |
US20070252953A1 (en) | 2006-04-27 | 2007-11-01 | Robert Metzger | Crosstalk reduced stereoscopic viewing apparatus |
JP4706638B2 (en) | 2006-09-29 | 2011-06-22 | セイコーエプソン株式会社 | Display device, image processing method, and electronic apparatus |
JP4669482B2 (en) | 2006-09-29 | 2011-04-13 | セイコーエプソン株式会社 | Display device, image processing method, and electronic apparatus |
US7978239B2 (en) | 2007-03-01 | 2011-07-12 | Eastman Kodak Company | Digital camera using multiple image sensors to provide improved temporal sampling |
US20110115751A1 (en) * | 2009-11-19 | 2011-05-19 | Sony Ericsson Mobile Communications Ab | Hand-held input device, system comprising the input device and an electronic device and method for controlling the same |
- 2011
  - 2011-01-31 US US13/017,157 patent/US8908015B2/en not_active Expired - Fee Related
  - 2011-03-21 WO PCT/US2011/029135 patent/WO2011119459A1/en active Application Filing
  - 2011-03-21 EP EP11759965.4A patent/EP2550642A4/en not_active Withdrawn
- 2014
  - 2014-08-21 US US14/465,757 patent/US20140362175A1/en not_active Abandoned
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016139497A1 (en) * | 2015-03-05 | 2016-09-09 | Intellisense Zrt. | Optical attachment for mobile display devices |
US20160349791A1 (en) * | 2015-05-27 | 2016-12-01 | Google Inc. | Virtual reality headset and device case |
US20170257618A1 (en) * | 2016-03-03 | 2017-09-07 | Disney Enterprises, Inc. | Converting a monocular camera into a binocular stereo camera |
US10455214B2 (en) * | 2016-03-03 | 2019-10-22 | Disney Enterprises, Inc. | Converting a monocular camera into a binocular stereo camera |
US11178380B2 (en) | 2016-03-03 | 2021-11-16 | Disney Enterprises, Inc. | Converting a monocular camera into a binocular stereo camera |
Also Published As
Publication number | Publication date |
---|---|
EP2550642A1 (en) | 2013-01-30 |
US8908015B2 (en) | 2014-12-09 |
EP2550642A4 (en) | 2016-09-14 |
US20120026298A1 (en) | 2012-02-02 |
WO2011119459A1 (en) | 2011-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8908015B2 (en) | Apparatus and method for producing images for stereoscopic viewing | |
JP3231329U (en) | Multipurpose mobile device case / cover integrated with camera system for 3D and / or 2D high quality video recording, photography, selfie recording and non-electric 3D / multi-video and static frame viewers | |
JP5172972B2 (en) | 3D image display device | |
US9423827B2 (en) | Head mounted display for viewing three dimensional images | |
US20170017088A1 (en) | Head Mounted Display With Lens | |
KR101728845B1 (en) | Display device, packaging box, and packaging device | |
SG186947A1 (en) | Variable three-dimensional camera assembly for still photography | |
JP4421673B2 (en) | 3D image display device | |
TWI585464B (en) | Expansion display device and expansion display system | |
JP5817407B2 (en) | Display device | |
US10067352B2 (en) | 3D image generating lens tool | |
US11029520B2 (en) | Head mounted display with lens | |
JP3197950U (en) | Image observation frame | |
JP2001312018A (en) | Two-in-a-set image and stereo camera for obtaining the image | |
TWI542193B (en) | An external device that provides stereo photography and display | |
KR101573267B1 (en) | Method and apparatus for controlling the interaction between packing box and electronic display device | |
KR101652579B1 (en) | Method and apparatus for displaying electronic display device | |
TWI632802B (en) | Three-dimensional image pick-up and display system | |
JP2615363B2 (en) | 3D image device | |
JPS63280216A (en) | Stereoscopic display device | |
JP2000347133A (en) | Stereo camera | |
JP2003222803A (en) | Binocular apparatus | |
WO2012129770A1 (en) | A stereo picture/screen viewing device | |
TWM438643U (en) | Attach-on three dimensional image frame of electronic device | |
JP2011203411A (en) | Stereoscopic image-viewing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |