WO2005067318A2 - Multi-dimensional imaging apparatus, systems, and methods - Google Patents
- Publication number
- WO2005067318A2 (PCT/US2004/043252)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- separating
- lens
- facet
- eye
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/06—Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
Definitions
- Various embodiments described herein relate generally to image processing, including apparatus, systems, and methods used to record and project multi-dimensional images.
- Cylindrical panoramas may be constructed using a single rotating camera. As the camera is rotated, images may be captured at defined increments until the desired panoramic field of view has been traversed. Vertical strips may then be extracted from the center of each image, and the strips can be placed next to one another to form a single uninterrupted cylindrical panoramic image.
- This process can be extended to create cylindrical stereoscopic (e.g., three-dimensional) panoramic images. For example, two cameras can be mounted, one next to the other, separated by a defined distance. The cameras may then be rotated in unison about a point halfway between them. Each camera can be used to create a separate cylindrical panorama using concatenated vertical image slices, as described above.
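The strip-extraction step described above can be sketched in a few lines. This is a simplified illustration only; the frame shapes, number of rotation increments, and `strip_width` are arbitrary choices, not values from the disclosure:

```python
import numpy as np

def panorama_from_rotation(frames, strip_width=2):
    """Concatenate the center vertical strip of each frame, captured at
    successive rotation increments, into one cylindrical panorama."""
    strips = []
    for f in frames:
        c = f.shape[1] // 2  # center column of this frame
        strips.append(f[:, c - strip_width // 2 : c + (strip_width + 1) // 2])
    return np.concatenate(strips, axis=1)

# Five frames captured at five rotation increments (synthetic data)
frames = [np.full((4, 10), i) for i in range(5)]
pano = panorama_from_rotation(frames, strip_width=2)
```

For a stereoscopic panorama, the same routine would simply be run once per camera, producing one left-eye and one right-eye cylinder.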
- FIG. 1 is a top view of a lens refracting right eye rays according to various embodiments;
- FIG. 2 is a top view of a lens refracting left eye rays according to various embodiments;
- FIG. 3 is a top view of a lens and apparatus according to various embodiments;
- FIG. 4 is a top view of an apparatus according to various embodiments;
- FIG. 5 is a top view of an apparatus and a system according to various embodiments;
- FIG. 6 is a perspective view of a system according to various embodiments;
- FIGs. 7A-7E illustrate portions of a stereoscopic panorama creation process according to various embodiments;
- FIG. 8 illustrates several fields of view relating to a lens according to various embodiments;
- FIG. 9 is a top view of lens surface point ray angles relating to a lens according to various embodiments;
- FIG. 10 is a top view of eye ray angles relating to a lens according to various embodiments;
- FIG. 11 is a top view of lens facet orientation angles relating to a lens according to various embodiments;
- FIG. 12 is a top view of additional lens facet orientation angles relating to a lens according to various embodiments;
- FIG. 13 is a top view of a multi-viewpoint lens according to various embodiments;
- FIG. 14 is a top view of a multi-viewpoint image capture apparatus according to various embodiments;
- FIG. 15 is a top view of a multiple-image projection system according to various embodiments;
- FIGs. 16A and 16B are flow charts illustrating several methods according to various embodiments;
- FIG. 17 is a block diagram of several articles according to various embodiments.
- the quality of the stereoscopic effect created using two cameras may be governed by the distance separating the centers of the camera lenses.
- if the lenses are separated by an amount approximating the average human inter-ocular distance (i.e., about 6.4 centimeters, or the average distance between the pupils of the left and right eyes), the stereoscopic effect may accurately mimic human vision. If the cameras are placed closer together, the three-dimensional depth of the captured scene may diminish. If they are placed farther apart, the three-dimensional depth may increase.
- many stereoscopic camera systems use a camera or lens separation of about 6.4 centimeters.
- a camera system capable of capturing a moving cylindrical stereoscopic image (e.g., video) in real time
- a mechanism that allows a video camera (or other image capture device) of arbitrary size to capture alternating left and right eye rays from outside of the center inter-ocular circle may be needed.
- the cylindrical field of view may be divided into smaller pieces, each covered by an individual image capture device.
- a lens and an apparatus may be constructed to interlace them.
- this interlacing is a simple horizontal alternation of left eye rays and right eye rays.
- This effect can be achieved using a lens specifically designed to refract left and right eye rays in an unusual way.
- This lens may be designed to encompass the entire surface area of a cylinder surrounding a multi-camera assembly.
- the radial symmetry of a multi-camera assembly helps simplify the lens design process.
- the cylindrical surface can be separated into several identical portions, or segments. The area of the cylindrical surface corresponding to a single video camera can thus be isolated, and the isolated lens segment can be designed in relation to its corresponding video camera.
- each lens or lens segment may be designed to refract various incoming light rays, corresponding to the left and right eye viewing rays, into its respective video camera. Since the left and right eye rays pass through the cylindrical lens surface in a non-symmetrical way, a uniform lens surface may not properly accomplish such refraction.
- FIG. 1 is a top view of a lens 100 refracting right eye rays 102 according to various embodiments
- FIG. 2 is a top view of a lens 200 refracting left eye rays 202 according to various embodiments.
- a faceted lens 100, 200 has an outer surface 104, 204 (i.e., the faceted surface) designed to refract right eye rays 102 and left eye rays 202 onto the image acquisition plane 106, 206 of a video camera 110, 210, or other image capture device.
- individual vertical lens facets 112, 212 are used to refract individual vertical spans of eye rays 114, 214 into individual vertical lines of pixels in the image acquisition plane 106, 206 captured by the video camera 110, 210.
- FIG. 3 is a top view of a lens 300 and apparatus 316 according to various embodiments of the invention.
- the individual lens facets 312 for left and right eye rays are alternated along the outer surface 304 of the lens 300 in order to capture both left eye rays and right eye rays at substantially the same time.
- the rays can be refracted onto the image acquisition plane 306 of the video camera 310, or other image capture device.
- the use of an interlaced, faceted lens 300 allows the video camera 310 (or other image capture device) to capture a sequence of vertically interlaced images. Since this vertical interlacing pattern remains constant throughout the entire video sequence, the left and right eye imagery can be isolated and separated in real time. The uniformly radial, tangential nature of the captured left and right eye rays allows several of these lens-camera apparatus to be placed next to one another to extend the cylindrical field of view of the overall device. Thus, it is the combination apparatus 316, comprising the lens 300 and the video camera 310, or other image capture device, that may be replicated a number of times to provide a panoramic, stereoscopic image capture system.
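Because the column interlacing pattern is fixed, separating the left and right eye imagery reduces to selecting alternating pixel columns. A minimal sketch follows; the even/odd parity assignment is an assumption about the facet layout, not something specified above:

```python
import numpy as np

def deinterlace(frame):
    """Split a column-interlaced frame into left- and right-eye images.
    Assumes even-numbered columns carry right eye rays and odd-numbered
    columns carry left eye rays (parity depends on the actual facet layout)."""
    right = frame[:, 0::2]  # even-numbered pixel columns
    left = frame[:, 1::2]   # odd-numbered pixel columns
    return left, right

frame = np.arange(12).reshape(3, 4)  # tiny synthetic interlaced frame
left, right = deinterlace(frame)
```

Since the slicing is constant per frame, this operation can run at video rate on each captured image.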
- the term "panoramic” means an image, either monoscopic or stereoscopic, having a field of view of from about 60 degrees up to about 360 degrees.
- FIG. 4 is a top view of an apparatus 416 according to various embodiments.
- the apparatus 416, which may be similar to or identical to the apparatus 316, is shown, along with the relevant inter-ocular distance D.
- the apparatus 416 may include a lens 400 having a plurality of interleaved separating facets 412 including a first separating facet 422 to refract left eye rays 424 and a second separating facet 426 to refract right eye rays 428.
- the apparatus 416 may also include an image acquisition plane 406 (perhaps as part of an image capture device 430, such as a frame-grabber, digital video camera, or some other device) to receive a refracted left eye ray 432 from the first separating facet 422, and to receive a refracted right eye ray 434 from the second separating facet 426.
- FIG. 5 is a top view of an apparatus 516 and a system 536 according to various embodiments.
- the apparatus 516 may include a first lens 538 and a first image acquisition plane 540 as shown in FIG. 4 with respect to apparatus 416.
- the apparatus 516 may also include a second lens 542 and image acquisition plane 544.
- the first lens 538 and first image acquisition plane 540 may be similar to or identical to the lens 400 and image acquisition plane 406 shown in FIG. 4.
- the second lens 542 and second image acquisition plane 544 may also be similar to or identical to the lens 400 and image acquisition plane 406 shown in FIG. 4, such that the second lens 542 may have a second plurality of interleaved separating facets (not shown in FIG. 5), including a third separating facet and a fourth separating facet.
- the second image acquisition plane 544 may be used to receive a second refracted left eye ray from the third separating facet, and to receive a second refracted right eye ray from the fourth separating facet, as described with respect to the apparatus 416 depicted in FIG. 4.
- the first lens 538 may have a first inner radius 546 defining a portion 548 of a cylindrical section 550
- the second lens 542 may have a second inner radius 552 located approximately on a cylinder 554 defined by the portion 548 of the cylindrical section 550.
- the lenses 400, 500 may include an inner radius 546 defining a portion 548 of a cylindrical section 550, as well as an outer radius 551 along which are approximately located a plurality of separating facets 512.
- the plurality of facets 512 may include a plurality of left eye ray separating facets interleaved with a plurality of right eye ray separating facets (see FIG. 4, elements 412, 422, and 426).
- FIG. 6 is a perspective view of a system 636 according to various embodiments. Referring now to FIGs. 5 and 6, it can be seen that a system 536, 636 may include a plurality of lenses 500, 600.
- the lenses 500, 600 may be similar to or identical to the lens 400 shown in FIG. 4, having a plurality of interleaved facets 412, 512.
- the system 536, 636 may also include a plurality of image acquisition planes 506 (not shown in FIG. 6) to receive refracted left eye rays from first separating facets in the lenses 500, 600, and to receive refracted right eye rays from second separating facets in the lenses 500, 600.
- the system 536, 636 may include a memory 556 (not shown in FIG. 6) to receive image data 558 (not shown in FIG. 6) from the plurality of image acquisition planes 506.
- the image data 558 may include information to construct a stereoscopic image, including a panoramic stereoscopic image.
- the image data 558 may include a separated left eye image and a separated right eye image.
- the system 536, 636 may also include a processor 560 coupled to the memory 556 to join the separated left eye image and the separated right eye image (e.g. see elements 770, 772 of FIG. 7).
- the resulting extracted left and right eye imagery can also be placed next to each other in real time to create uniform, seamless left and right eye panoramic imagery (see elements 774, 776 of FIG. 7).
- FIGs. 7A-7E illustrate portions of a stereoscopic panorama creation process according to various embodiments. This process permits real-time capture of 360-degree, cylindrical stereoscopic video imagery.
- a single apparatus 716, including a lens 700 (similar to or identical to the lens 400 shown in FIG. 4) and an image capture device 730 (similar to or identical to the image capture device 430 of FIG. 4), is shown being used to capture an image of various objects 762.
- FIG. 7B depicts an approximation of an interlaced image 764 captured by the image capture device 730 via the faceted lens 700 (e.g., constructed from a plurality of captured left eye rays and a plurality of captured right eye rays).
- FIG. 7C shows de-interlaced left and right eye image strips 766, 768 constructed from the interlaced image 764.
- FIG. 7D shows concatenated left and right image sections 770, 772, or separated left and right eye images, constructed from the de-interlaced left and right eye image strips 766, 768, respectively.
- FIG. 7E shows left and right eye panoramic images 774, 776, respectively, obtained by joining together a number of left and right image sections obtained from adjoining apparatus 716, including left and right image sections 770, 772, arranged in a manner similar to or identical to that of the apparatus 516 in FIG. 5.
- a stereoscopic, panoramic (e.g., up to 360 degrees) view of the objects 762 can be recreated.
- FIG. 8 illustrates several fields of view relating to a lens 800 according to various embodiments.
- Lens 800 may be similar to or identical to the lens 400 shown in FIG. 4.
- a faceted lens 800 that performs the refraction to achieve the desired stereoscopic effect can be described mathematically based on certain specified physical values. These values include the inter-ocular distance (D) that provides the desired stereoscopic effect, the index of refraction (n) of the material to be used to create the faceted lens, the distance from the center of eye point rotation to the image capture device (r_c), the effective horizontal field of view of the image capture device (fov_c), and the effective horizontal field of view of the apparatus's faceted lens section (fov_l).
- the distance D may be a selected inter-ocular distance, which can be any desired distance, but which is most useful when selected to be approximately 4 centimeters to approximately 8 centimeters.
- the subsequent mathematical process assumes an x-y coordinate system, having an origin O at the center of eye point rotation. All angular measurements are in degrees.
- FIG. 9 is a top view of lens surface point ray angles relating to a lens 900 according to various embodiments.
- Lens 900 may be similar to or identical to the lens 400 shown in FIG. 4.
- the lens facet properties corresponding to a particular point on the lens surface 974 are dependent on the location of that point (P_i) and the angle of the ray 976 from the image capture device 930 to that point (θ_c).
- the apparatus 916 (which may be similar to or identical to the apparatus 416 shown in FIG. 4) can be designed such that the lens surface area corresponding to the field of view of the image capture device (fov_c) matches the lens surface area corresponding to the field of view of the faceted lens section (fov_l) (see FIG. 8).
- a ray 978 from the center of eye point rotation O may intersect the lens surface at the same point (P_i).
- the angle (θ_i) of that ray 978 can be calculated from the camera ray angle (θ_c) and the fields of view fov_c and fov_l.
- This ray angle (θ_i) allows calculation of the lens surface intersection point (P_i = (P_ix, P_iy) in x-y coordinates).
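A sketch of one plausible reading of this step. It assumes the linear angular mapping θ_i = θ_c · (fov_l / fov_c) implied by matching the two fields of view over the same lens surface area, and a lens surface lying on a circle of radius r_l about the origin O; both are assumptions, not verbatim formulas from the disclosure:

```python
import math

def lens_surface_point(theta_c, fov_c, fov_l, r_l):
    """Map a camera ray angle theta_c (degrees, from the x-axis) to the
    lens-section ray angle theta_i and the lens surface intersection
    point P_i = (P_ix, P_iy), with the surface on a circle of radius r_l."""
    theta_i = theta_c * (fov_l / fov_c)  # assumed linear angular mapping
    t = math.radians(theta_i)
    return theta_i, (r_l * math.cos(t), r_l * math.sin(t))

# Example: camera ray at 15 deg, fov_c = 60 deg, fov_l = 40 deg, r_l = 30
theta_i, (px, py) = lens_surface_point(15.0, 60.0, 40.0, 30.0)
```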
- FIG. 10 is a top view of eye ray angles relating to a lens 1000 according to various embodiments.
- Lens 1000 may be similar to or identical to lens 400 shown in FIG. 4.
- the lens facet residing at the lens surface intersection point (P_i) should preferably be oriented to capture either one of the desired left eye rays 1080 or right eye rays 1082, tangential to the circular path of eye rotation 1084.
- FIG. 11 is a top view of lens facet orientation angles relating to a lens 1100 according to various embodiments.
- FIG. 12 is a top view of additional lens facet orientation angles relating to a lens 1200 according to various embodiments.
- Lenses 1100, 1200 may be similar to or identical to the lens 400 shown in FIG. 4.
- referring to FIGs. 10, 11, and 12, it can be seen that the two calculated points of tangency (P_t1 and P_t2), when viewed in conjunction with the lens surface intersection point (P_i), may correspond to the desired left eye ray and right eye ray that pass through the lens surface at that point.
- the angle formed between each eye ray and the x-axis (θ_RE and θ_LE, respectively) is useful in calculating the refraction properties of the current lens surface facet for each eye ray. These angles can be calculated from the coordinates of the corresponding tangent points and the lens surface intersection point.
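The tangency construction is ordinary circle geometry: from an external point P_i there are exactly two tangent lines to the circle of eye rotation. A minimal sketch, assuming the circle is centered at the origin O with diameter D:

```python
import math

def eye_ray_angles(p, D):
    """Angles (degrees, relative to the x-axis) of the two rays through
    lens surface point p that are tangent to the eye-rotation circle of
    diameter D centered at the origin; one candidate corresponds to the
    left eye ray and the other to the right eye ray."""
    R = D / 2.0
    px, py = p
    d = math.hypot(px, py)
    assert d > R, "lens surface point must lie outside the eye circle"
    alpha = math.atan2(py, px)
    off = math.acos(R / d)  # angle from the direction of p to each tangent point
    angles = []
    for phi in (alpha + off, alpha - off):
        tx, ty = R * math.cos(phi), R * math.sin(phi)  # tangent point on circle
        angles.append(math.degrees(math.atan2(py - ty, px - tx)))
    return angles

# Lens surface point on the x-axis, 10 units out; eye circle diameter 2
a1, a2 = eye_ray_angles((10.0, 0.0), 2.0)
```

By symmetry, for a point on the x-axis the two candidate eye-ray angles are equal and opposite.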
- the final facet properties may be calculated for the current lens position, taking into account the index of refraction n.
- the current facet may be chosen to perform refraction that will capture either the left eye ray (θ_LE) or the right eye ray (θ_RE).
- in order to perform the desired refraction, the lens facet must be oriented such that the incoming eye ray (θ_LE or θ_RE) is refracted to match the current camera ray (θ_c).
- the lens facet orientation (θ_RS or θ_LS) can be calculated from the required deviation between the eye ray and the camera ray, using Snell's law and the index of refraction n.
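One way to realize this calculation is the single-surface deviation relation that follows from Snell's law: if a facet must bend a ray by δ at one air-to-lens interface, the ray inside the material makes an angle θ_r with the facet normal where tan θ_r = sin δ / (n − cos δ). The sketch below assumes δ is the camera-ray angle minus the eye-ray angle (the quantities called θ_R and θ_L in the surrounding text); it is a sketch of the physics, not the disclosure's verbatim formula:

```python
import math

def facet_normal_angle(delta_deg, n):
    """Angle (degrees) between the refracted (camera-side) ray and the
    facet normal needed to deviate a ray by delta_deg at a single
    air-to-lens interface of refractive index n.
    Derived from Snell's law: sin(theta_r + delta) = n * sin(theta_r),
    which rearranges to tan(theta_r) = sin(delta) / (n - cos(delta))."""
    d = math.radians(delta_deg)
    return math.degrees(math.atan2(math.sin(d), n - math.cos(d)))

# Thin-prism sanity check: for small delta, theta_r is close to delta / (n - 1)
theta_r = facet_normal_angle(10.0, 1.5)
```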
- a lens 1000, 1100, 1200 may include an outer radius r_l having a separating facet, such that r_l is approximately determined by
- r_c, which comprises a distance from a center of rotation to an image acquisition plane,
- fov_c, which comprises an effective horizontal field of view for the image acquisition plane, and
- fov_l, which comprises an effective horizontal field of view spanned by the lens.
- a lens 400, 500, 600, 700, 800, 900, 1000, 1100, 1200 may include one or more separating facets having a facet orientation selected from one of θ_RS, where
- θ_R is approximately equal to an image capture device ray angle minus a selected eye ray angle, and θ_LS, where
- θ_L is approximately equal to the image capture device ray angle minus another selected eye ray angle.
- any number of image acquisition planes may be located at a radial distance r c from an origin point located at a center of a selected inter-ocular distance (e.g., an inter-ocular distance of approximately 4 cm to 8 cm).
- an outer radius of the lens r_l may correspond to a distance at which a field of view of the associated image acquisition plane overlaps a field of view of the lens.
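This overlap condition can be read geometrically: r_l is the distance from O at which the edge ray of the camera's field of view meets the edge ray of the lens section's field of view. The sketch below works out that reading with the law of sines, assuming the camera sits on the lens axis at distance r_c from O; this is an assumed interpretation, and the disclosure's own formula for r_l is not reproduced here:

```python
import math

def lens_outer_radius(r_c, fov_c, fov_l):
    """Distance from the origin O at which the half-angle edge ray of the
    camera FOV (camera at (r_c, 0), axis along +x) intersects the edge
    ray of the lens-section FOV measured from O (law of sines)."""
    a = math.radians(fov_l / 2.0)  # lens-section half angle, seen from O
    b = math.radians(fov_c / 2.0)  # camera half angle, seen from the camera
    assert b > a, "camera FOV must exceed the lens-section FOV it covers"
    return r_c * math.sin(b) / math.sin(b - a)

# Example: camera 10 units from O, fov_c = 60 deg, fov_l = 40 deg
r_l = lens_outer_radius(10.0, 60.0, 40.0)
```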
- FIG. 13 is a top view of a multi-viewpoint lens 1300 according to various embodiments.
- the lens 1300 may be similar to or identical to lens 400 shown in FIG. 4.
- the lens facet residing at the lens surface intersection point (P_i) should preferably be oriented to capture one of the desired left eye rays,
- one of the right eye rays 1382, and/or an additional eye ray 1386 tangential to a first circular path of eye rotation 1384 (having a diameter approximately equal to the inter-ocular distance D_1) or to a second circular path of eye rotation 1388 (having a diameter approximately equal to the inter-ocular distance D_2) and passing through the lens surface intersection point (P_i).
- any number of additional viewpoints may be accommodated by altering the inter-ocular distance (e.g., selecting D_2 instead of D_1), forming a new circular path of eye rotation (e.g., having a center of rotation at O_2 instead of O_1), and finding a new point of tangency (e.g., P_t3 instead of P_t1) on the circular path.
- the lens 1300 may include a plurality of separating facets, such as left eye separating facets, right eye separating facets, and one or more additional eye ray separating facets (perhaps corresponding to multiple additional viewpoints).
- An example of using the formulas shown above for such a multi-faceted lens includes a lens 1300 having a first separating facet with a facet orientation of θ_RS, where
- θ_R is approximately equal to the image capture device ray angle minus a selected first eye ray angle; a second separating facet with a facet orientation of θ_LS, where
- θ_L is approximately equal to the image capture device ray angle minus a second selected eye ray angle; and a third separating facet having a facet orientation for which
- the corresponding deviation angle is approximately equal to the image capture device ray angle minus a third selected eye ray angle.
- the lens 1300 may form a portion of a multi-viewpoint image capture device, or a multi-image projection system.
- FIG. 14 is a top view of a multi-viewpoint image capture apparatus 1416 according to various embodiments.
- a lens 1400 can be provided that enables a single device to capture two or more distinct images simultaneously.
- a single image capture device equipped with a lens similar to that described in FIGs. 4, 10, or 13, can be placed in a room to capture a first image near a first wall, a second near another wall, and a third in between the first and second walls.
- Such an image capture device is shown in FIG. 14.
- the apparatus 1416, which may be similar to the apparatus 416, is shown along with the relevant inter-ocular distance D.
- the apparatus 1416 may include a lens 1400 having a plurality of interleaved separating facets 1412 including a first separating facet 1422 to refract left eye rays 1424 and a second separating facet 1426 to refract right eye rays 1428.
- the left eye rays may be grouped as rays received from a first image
- the right eye rays may be grouped as rays received from a second image.
- the apparatus 1416 may also include an image acquisition plane 1406 (perhaps as part of an image capture device 1430, such as a frame- grabber, digital video camera, or some other device) to receive a refracted left eye ray 1432 from the first separating facet 1422, and to receive a refracted right eye ray 1434 from the second separating facet 1426.
- Additional separating facets can be included in the lens 1400, as described with respect to the lens 1300 in FIG. 13, so that additional eye rays associated with other viewpoints (e.g., the third viewpoint associated with the point of tangency P_t3 in FIG. 13) can be captured.
- the apparatus 1416 may include a lens having a first plurality of interleaved separating facets including a first separating facet to refract left eye rays and a second separating facet to refract right eye rays, and an image acquisition plane to receive a first refracted left eye ray from the first separating facet, and to receive a first refracted right eye ray from the second separating facet.
- the lens may include one or more additional eye ray separating facets interleaved with the first separating facet and the second separating facet.
- the first separating facet may correspond to a first viewpoint
- the second separating facet may correspond to a second viewpoint
- one ofthe additional eye ray separating facets may correspond to a third viewpoint.
- the image acquisition plane may be located at a radial distance r c from a first origin point located at the center of a first inter-ocular distance.
- Additional separating facets included in the lens may correspond to a second inter-ocular distance and be interleaved with the first and second separating facets.
- FIG. 15 is a top view of a multiple-image projection system 1516 according to various embodiments.
- Much of the prior discussion has focused on the use of lenses and image capture devices capable of capturing imagery from two or more viewpoints (e.g., P_t1, P_t2, and P_t3 in FIG. 13, and FIG. 14). This concept can be reversed and applied to the projection of images.
- a lens can be provided that enables a single projector to display two or more distinct video presentations simultaneously.
- the apparatus 1516 may include a lens 1500 having a plurality of interleaved separating facets 1512 including a first separating facet 1522 to refract left eye rays 1524 and a second separating facet 1526 to refract right eye rays 1528.
- the left eye rays may be grouped to form a first projected image I_1
- the right eye rays may be grouped to form a second projected image I_2.
- the apparatus 1516 may also include an image projection plane 1506
- the image projection plane 1506 may be located at a radial distance r_c from an origin point located at a center of a first inter-ocular distance (e.g., D_1 in FIG. 13).
- the lens 1500 may include one or more additional eye ray separating facets (not shown for clarity, but perhaps interleaved with the first separating facet 1522 and the second separating facet 1526), wherein the first separating facet corresponds to a first viewpoint, wherein the second separating facet corresponds to a second viewpoint, and wherein the additional eye ray separating facet corresponds to a third viewpoint and a second inter-ocular distance (e.g., D 2 in FIG. 13).
- Such modules may include hardware circuitry, and/or one or more processors and/or memory circuits, software program modules, including objects and collections of objects, and/or firmware, and combinations thereof, as desired by the architect of the lens 100, 200, 300, 400, 500, 538, 542, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1500, apparatus 316, 416, 516, 716, 1416, 1516 and systems 536, 636, and as appropriate for particular implementations of various embodiments.
- the lens, apparatus, and systems of various embodiments can be used in applications other than panoramic cameras, and thus, various embodiments are not to be so limited.
- the illustrations of the lens 100, 200, 300, 400, 500, 538, 542, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1500, apparatus 316, 416, 516, 716, 1416, 1516 and system 536, 636 are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein.
- Applications that may include the novel lens, apparatus, and systems of various embodiments include frame grabbers, cameras, binoculars, telescopes, and microscopes.
- FIGs. 16A and 16B are flow charts illustrating several methods according to various embodiments. Some of the methods to be described may be derived from the process illustrated in FIG. 7.
- a method 1611 may (optionally) begin at block 1615 with receiving a plurality of left eye rays through one of a first plurality of separating facets of a lens at an image acquisition plane, and receiving a plurality of right eye rays through one of a second plurality of separating facets of the lens at the image acquisition plane.
- the first plurality of separating facets may be interleaved with the second plurality of separating facets, as shown in FIG. 4.
- the method 1611 may continue with acquiring data from the image acquisition plane to construct a separated left eye image, and acquiring data from the image acquisition plane to construct a separated right eye image at block 1619.
- the method 1611 may further include joining the separated left eye image to provide a joined left eye image, and joining the separated right eye image to provide a joined right eye image at block 1627, as well as combining the joined left eye image and the joined right eye image to provide a stereoscopic image at block 1627.
- the method may also include combining the joined left eye image and the joined right eye image to provide a 360 degree (or some lesser amount of degrees), panoramic stereoscopic image at block 1631.
- an outer radius of the lens may correspond to a distance at which a field of view of the image acquisition plane overlaps a field of view of the lens.
- the method 1611 may also include repeatedly acquiring data from the image acquisition plane to construct a separated left eye image, repeatedly acquiring data from the image acquisition plane to construct a separated right eye image, and processing the separated left eye image and the separated right eye image to provide a moving stereoscopic image at block 1623.
- the method 1611 may further include repeatedly acquiring data from the image acquisition plane to construct a separated left eye image, repeatedly acquiring data from the image acquisition plane to construct a separated right eye image, and processing the separated left eye image and the separated right eye image to provide a moving 360 degree (or some lesser number of degrees), panoramic stereoscopic image at block 1623.
- a method of projecting multiple images is illustrated in FIG. 16B.
- a method 1641 may include projecting a plurality of left eye rays through one of a first plurality of separating facets of a lens from an image projection plane at block 1645.
- the method 1641 may also include projecting a plurality of right eye rays through one of a second plurality of separating facets ofthe lens from the image projection plane at block 1649.
- the first plurality of separating facets may be interleaved with the second plurality of separating facets, and the plurality of left eye rays may comprise a portion of a separated left eye image, while the plurality of right eye rays may comprise a portion of a separated right eye image.
- the outer radius of the lens may correspond to a distance at which the field of view of the image projection plane overlaps the field of view of the lens.
- the terms "information" and "data" may be used interchangeably.
- Information including parameters, commands, operands, and other data, can be sent and received in the form of one or more carrier waves.
- One of ordinary skill in the art will understand the manner in which a software program can be launched from a computer-readable medium in a computer-based system to execute the functions defined in the software program.
- One of ordinary skill in the art will further understand the various programming languages that may be employed to create one or more software programs designed to implement and perform the methods disclosed herein.
- the programs may be structured in an object-oriented format using an object-oriented language such as Java, Smalltalk, or C++.
- FIG. 17 is a block diagram of an article 1785 according to various embodiments, such as a computer, a memory system, a magnetic or optical disk, some other storage device, and/or any type of electronic device or system.
- the article 1785 may comprise a processor 1787 coupled to a machine-accessible medium such as a memory 1789 (e.g., a memory including an electrical, optical, or electromagnetic conductor) having associated information 1791 (e.g., computer program instructions or other data), which when accessed, results in a machine (e.g., the processor 1787) performing such actions as receiving a plurality of left eye rays through one of a first plurality of separating facets of a lens at an image acquisition plane, and receiving a plurality of right eye rays through one of a second plurality of separating facets of the lens at the image acquisition plane.
- Still further activities may include projecting a plurality of left eye rays through one of a first plurality of separating facets of a lens from an image projection plane, and projecting a plurality of right eye rays through one of a second plurality of separating facets of the lens from the image projection plane.
- the plurality of left eye rays may comprise a portion of a separated left eye image.
- the plurality of right eye rays may comprise a portion of a separated right eye image.
- Implementing the lenses, apparatus, systems, and methods described herein may provide a mechanism for re-creating panoramic (up to 360 degrees), stereoscopic images in real time.
- a single lens may be used in place of multiple lenses.
- Such a mechanism may improve the quality of imaging in three dimensions at reduced cost and increased efficiency.
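The acquisition-side steps above (receiving interleaved left/right eye rays at an image acquisition plane, then joining each eye's separated image) can be sketched in code. This is an illustrative sketch only, not the patent's implementation: it assumes, hypothetically, that the interleaved separating facets map to alternating column bands of the captured frame, and the `band_width` parameter and row-of-pixels representation are assumptions introduced here for clarity.

```python
def separate_and_join(frame, band_width=8):
    """Split a facet-interleaved frame into joined left/right eye images.

    `frame` is a list of rows (each row a list of pixel values) whose
    columns alternate in bands directed by the interleaved facets:
    left-eye band, right-eye band, left-eye band, and so on. The band
    layout is a hypothetical assumption for illustration.
    """
    left, right = [], []
    for row in frame:
        left_row, right_row = [], []
        # Walk the row in band-sized steps, alternating eyes.
        for i, start in enumerate(range(0, len(row), band_width)):
            band = row[start:start + band_width]
            (left_row if i % 2 == 0 else right_row).extend(band)
        left.append(left_row)
        right.append(right_row)
    return left, right
```

Calling this repeatedly on successive frames, as in the method 1611, would yield the separated image streams that are then processed into a moving stereoscopic image.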
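The projection-side method 1641 runs the same idea in reverse: separated left and right eye images are driven out through the interleaved facets from an image projection plane. Under the same hypothetical column-band assumption (not taken from the patent), rebuilding the interleaved frame for the projection plane might look like this; `band_width` is again an illustrative parameter.

```python
def interleave_for_projection(left, right, band_width=8):
    """Rebuild a facet-interleaved frame for the image projection plane
    from separated left/right eye images (the inverse of separation).

    `left` and `right` are lists of pixel rows of equal length; bands
    from each eye's row are laid down alternately, matching the assumed
    left/right facet interleaving.
    """
    frame = []
    for lrow, rrow in zip(left, right):
        row = []
        for start in range(0, len(lrow), band_width):
            row.extend(lrow[start:start + band_width])   # left-eye band
            row.extend(rrow[start:start + band_width])   # right-eye band
        frame.append(row)
    return frame
```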
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002551636A CA2551636A1 (en) | 2003-12-26 | 2004-12-22 | Multi-dimensional imaging apparatus, systems, and methods |
AU2004313174A AU2004313174A1 (en) | 2003-12-26 | 2004-12-22 | Multi-dimensional imaging apparatus, systems, and methods |
JP2006547355A JP2007517264A (en) | 2003-12-26 | 2004-12-22 | Multidimensional imaging apparatus, system and method |
EP04815342A EP1702475A2 (en) | 2003-12-26 | 2004-12-22 | Multi-dimensional imaging apparatus, systems, and methods |
CN2004800420407A CN1922892B (en) | 2003-12-26 | 2004-12-22 | Multi-dimensional imaging apparatus, systems, and methods |
IL176521A IL176521A0 (en) | 2003-12-26 | 2006-06-22 | Multi-dimensional imaging apparatus, systems, and methods |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US53244703P | 2003-12-26 | 2003-12-26 | |
US60/532,447 | 2003-12-26 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2005067318A2 true WO2005067318A2 (en) | 2005-07-21 |
WO2005067318A9 WO2005067318A9 (en) | 2005-11-17 |
WO2005067318A3 WO2005067318A3 (en) | 2006-01-05 |
Family
ID=34748798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2004/043252 WO2005067318A2 (en) | 2003-12-26 | 2004-12-22 | Multi-dimensional imaging apparatus, systems, and methods |
Country Status (9)
Country | Link |
---|---|
US (2) | US7347555B2 (en) |
EP (1) | EP1702475A2 (en) |
JP (1) | JP2007517264A (en) |
KR (1) | KR20070007059A (en) |
CN (1) | CN1922892B (en) |
AU (1) | AU2004313174A1 (en) |
CA (1) | CA2551636A1 (en) |
IL (1) | IL176521A0 (en) |
WO (1) | WO2005067318A2 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006107934A1 (en) * | 2005-04-04 | 2006-10-12 | Micoy Corporation | Multi-dimensional imaging |
US7347555B2 (en) | 2003-12-26 | 2008-03-25 | Micoy Corporation | Multi-dimensional imaging apparatus, systems, and methods |
US7429997B2 (en) | 2000-11-29 | 2008-09-30 | Micoy Corporation | System and method for spherical stereoscopic photographing |
US20080298674A1 (en) * | 2007-05-29 | 2008-12-04 | Image Masters Inc. | Stereoscopic Panoramic imaging system |
US7982777B2 (en) | 2005-04-07 | 2011-07-19 | Axis Engineering Technologies, Inc. | Stereoscopic wide field of view imaging system |
WO2012056437A1 (en) | 2010-10-29 | 2012-05-03 | École Polytechnique Fédérale De Lausanne (Epfl) | Omnidirectional sensor array system |
US8334895B2 (en) | 2005-05-13 | 2012-12-18 | Micoy Corporation | Image capture and processing using converging rays |
CN104079917A (en) * | 2014-07-14 | 2014-10-01 | 中国地质大学(武汉) | 360-degree panorama stereoscopic camera |
WO2017100760A1 (en) * | 2015-12-11 | 2017-06-15 | Google Inc. | Lampshade for stereo 360 capture |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4625957B2 (en) * | 2005-10-07 | 2011-02-02 | 国立大学法人 東京大学 | All-around stereo image capture device |
GB2473636A (en) * | 2009-09-18 | 2011-03-23 | Sharp Kk | Multiple view display comprising lenticular lens having regions corresponding to two different centres of curvature |
US8836848B2 (en) * | 2010-01-26 | 2014-09-16 | Southwest Research Institute | Vision system |
US8743199B2 (en) * | 2010-03-09 | 2014-06-03 | Physical Optics Corporation | Omnidirectional imaging optics with 360°-seamless telescopic resolution |
US8942964B2 (en) | 2010-06-08 | 2015-01-27 | Southwest Research Institute | Optical state estimation and simulation environment for unmanned aerial vehicles |
US8730396B2 (en) * | 2010-06-23 | 2014-05-20 | MindTree Limited | Capturing events of interest by spatio-temporal video analysis |
US20120154518A1 (en) * | 2010-12-17 | 2012-06-21 | Microsoft Corporation | System for capturing panoramic stereoscopic video |
US8466406B2 (en) | 2011-05-12 | 2013-06-18 | Southwest Research Institute | Wide-angle laser signal sensor having a 360 degree field of view in a horizontal plane and a positive 90 degree field of view in a vertical plane |
US9503638B1 (en) * | 2013-02-04 | 2016-11-22 | UtopiaCompression Corporation | High-resolution single-viewpoint panoramic camera and method of obtaining high-resolution panoramic images with a single viewpoint |
US9291886B2 (en) | 2013-08-30 | 2016-03-22 | American Tack & Hardware Co., Inc. | Animated projection system |
US9185391B1 (en) | 2014-06-17 | 2015-11-10 | Actality, Inc. | Adjustable parallax distance, wide field of view, stereoscopic imaging system |
US20180063507A1 (en) * | 2015-04-09 | 2018-03-01 | Philippe Cho Van Lieu | Apparatus for recording stereoscopic panoramic photography |
US20160349600A1 (en) * | 2015-05-26 | 2016-12-01 | Gopro, Inc. | Multi Camera Mount |
CN107316273B (en) * | 2016-04-27 | 2021-05-18 | 深圳看到科技有限公司 | Panoramic image acquisition device and acquisition method |
WO2018193713A1 (en) * | 2017-04-21 | 2018-10-25 | ソニー株式会社 | Imaging device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10030196A1 (en) * | 2000-06-22 | 2002-01-10 | 4D Vision Gmbh | Method and arrangement for recording multiple views of a scene or an object |
WO2002044808A2 (en) * | 2000-11-29 | 2002-06-06 | Rvc Llc | System and method for spherical stereoscopic photographing |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3953869A (en) | 1974-09-24 | 1976-04-27 | Dimensional Development Corporation | Stereoscopic photography apparatus |
NL181823C (en) | 1974-09-24 | 1987-11-02 | Nimslo Technology Ltd | METHOD FOR OBTAINING AN IMAGE REGISTRATION OF A SPATIAL OBJECT FIELD FOR THE PRODUCTION OF A STEREOSCOPIC IMAGE |
US4475798A (en) | 1977-12-27 | 1984-10-09 | The Three Dimensional Photography Corporation | Camera for stereoscopic photography |
US4214821A (en) | 1978-10-30 | 1980-07-29 | Termes Richard A | Total environment photographic mount and photograph |
JP2515101B2 (en) * | 1986-06-27 | 1996-07-10 | ヤマハ株式会社 | Video and audio space recording / playback method |
US4927262A (en) * | 1989-04-26 | 1990-05-22 | Hughes Aircraft Company | Multiple pupil projection display |
US5023725A (en) | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5130794A (en) | 1990-03-29 | 1992-07-14 | Ritchey Kurtis J | Panoramic display system |
US6002430A (en) | 1994-01-31 | 1999-12-14 | Interactive Pictures Corporation | Method and apparatus for simultaneous capture of a spherical image |
ATE158129T1 (en) | 1992-03-23 | 1997-09-15 | Canon Kk | MULTIPLE LENS IMAGE RECORDING DEVICE AND MISREGISTRATION CORRECTION |
US5650813A (en) | 1992-11-20 | 1997-07-22 | Picker International, Inc. | Panoramic time delay and integration video camera system |
JP2883265B2 (en) | 1993-09-24 | 1999-04-19 | キヤノン株式会社 | Image processing device |
US5905593A (en) * | 1995-11-16 | 1999-05-18 | 3-D Image Technology | Method and apparatus of producing 3D video by correcting the effects of video monitor on lenticular layer |
US5562572A (en) | 1995-03-10 | 1996-10-08 | Carmein; David E. E. | Omni-directional treadmill |
US5980256A (en) | 1993-10-29 | 1999-11-09 | Carmein; David E. E. | Virtual reality system with enhanced sensory apparatus |
JPH07140569A (en) | 1993-11-16 | 1995-06-02 | Shunichi Kiwada | Stereoscopic image photographing method, prism used therefor and stereoscopic image photographing device |
US6236748B1 (en) | 1994-08-02 | 2001-05-22 | Canon Kabushiki Kaisha | Compound eye image pickup device utilizing plural image sensors and plural lenses |
US5703961A (en) | 1994-12-29 | 1997-12-30 | Worldscape L.L.C. | Image transformation and synthesis methods |
US5657073A (en) | 1995-06-01 | 1997-08-12 | Panoramic Viewing Systems, Inc. | Seamless multi-camera panoramic imaging with distortion correction and selectable field of view |
US6031540A (en) | 1995-11-02 | 2000-02-29 | Imove Inc. | Method and apparatus for simulating movement in multidimensional space with polygonal projections from subhemispherical imagery |
US6141034A (en) | 1995-12-15 | 2000-10-31 | Immersive Media Co. | Immersive imaging method and apparatus |
JP3832895B2 (en) | 1996-05-28 | 2006-10-11 | キヤノン株式会社 | Image composition apparatus and image composition system |
CN1224512A (en) * | 1996-06-03 | 1999-07-28 | 赫尔曼D·米姆斯 | Method and apparatus for three-dimensional photography |
EP0902913A1 (en) | 1996-06-03 | 1999-03-24 | Mims, Herman D. | Method and apparatus for three-dimensional photography |
US6075905A (en) | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
US5721585A (en) | 1996-08-08 | 1998-02-24 | Keast; Jeffrey D. | Digital video panoramic image capture and display system |
US6552744B2 (en) | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
JPH11133484A (en) | 1997-10-29 | 1999-05-21 | Canon Inc | Double eye image pickup device |
US6034716A (en) | 1997-12-18 | 2000-03-07 | Whiting; Joshua B. | Panoramic digital camera system |
US6522325B1 (en) | 1998-04-02 | 2003-02-18 | Kewazinga Corp. | Navigable telepresence method and system utilizing an array of cameras |
AU4184399A (en) | 1998-05-13 | 1999-11-29 | Infinite Pictures Inc. | Panoramic movies which simulate movement through multidimensional space |
US6323858B1 (en) | 1998-05-13 | 2001-11-27 | Imove Inc. | System for digitally capturing and recording panoramic movies |
US6665003B1 (en) | 1998-09-17 | 2003-12-16 | Issum Research Development Company Of The Hebrew University Of Jerusalem | System and method for generating and displaying panoramic images and movies |
US6469710B1 (en) | 1998-09-25 | 2002-10-22 | Microsoft Corporation | Inverse texture mapping using weighted pyramid blending |
US6023588A (en) | 1998-09-28 | 2000-02-08 | Eastman Kodak Company | Method and apparatus for capturing panoramic images with range data |
US6169858B1 (en) | 1999-07-26 | 2001-01-02 | Eastman Kodak Company | Panoramic image capture aid |
US6639596B1 (en) | 1999-09-20 | 2003-10-28 | Microsoft Corporation | Stereo reconstruction from multiperspective panoramas |
ES2227200T3 (en) * | 2000-05-19 | 2005-04-01 | Tibor Balogh | METHOD AND APPLIANCE TO SUBMIT 3D IMAGES. |
US6559846B1 (en) | 2000-07-07 | 2003-05-06 | Microsoft Corporation | System and process for viewing panoramic video |
US6946077B2 (en) * | 2000-11-30 | 2005-09-20 | Envirotrol, Inc. | pH stable activated carbon |
US6947059B2 (en) | 2001-08-10 | 2005-09-20 | Micoy Corporation | Stereoscopic panoramic image capture device |
US7180663B2 (en) * | 2002-06-19 | 2007-02-20 | Robert Bruce Collender | 3D motion picture theatre |
US20040001138A1 (en) * | 2002-06-27 | 2004-01-01 | Weerashinghe W.A. Chaminda P. | Stereoscopic panoramic video generation system |
US6755532B1 (en) * | 2003-03-20 | 2004-06-29 | Eastman Kodak Company | Method and apparatus for monocentric projection of an image |
US7150531B2 (en) * | 2003-08-26 | 2006-12-19 | The Regents Of The University Of California | Autostereoscopic projection viewer |
AU2004313174A1 (en) | 2003-12-26 | 2005-07-21 | Micoy Corporation | Multi-dimensional imaging apparatus, systems, and methods |
US7796152B2 (en) * | 2005-04-04 | 2010-09-14 | Micoy Corporation | Multi-dimensional imaging |
-
2004
- 2004-12-22 AU AU2004313174A patent/AU2004313174A1/en not_active Abandoned
- 2004-12-22 KR KR1020067014987A patent/KR20070007059A/en not_active Application Discontinuation
- 2004-12-22 EP EP04815342A patent/EP1702475A2/en not_active Withdrawn
- 2004-12-22 CA CA002551636A patent/CA2551636A1/en not_active Abandoned
- 2004-12-22 CN CN2004800420407A patent/CN1922892B/en not_active Expired - Fee Related
- 2004-12-22 WO PCT/US2004/043252 patent/WO2005067318A2/en active Application Filing
- 2004-12-22 JP JP2006547355A patent/JP2007517264A/en active Pending
- 2004-12-22 US US11/020,787 patent/US7347555B2/en active Active
-
2006
- 2006-06-22 IL IL176521A patent/IL176521A0/en unknown
-
2008
- 2008-03-24 US US12/054,370 patent/US7553023B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10030196A1 (en) * | 2000-06-22 | 2002-01-10 | 4D Vision Gmbh | Method and arrangement for recording multiple views of a scene or an object |
WO2002044808A2 (en) * | 2000-11-29 | 2002-06-06 | Rvc Llc | System and method for spherical stereoscopic photographing |
Non-Patent Citations (1)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 1995, no. 09, 31 October 1995 (1995-10-31) & JP 07 140569 A (SHUNICHI KIWADA), 2 June 1995 (1995-06-02) * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7429997B2 (en) | 2000-11-29 | 2008-09-30 | Micoy Corporation | System and method for spherical stereoscopic photographing |
US7347555B2 (en) | 2003-12-26 | 2008-03-25 | Micoy Corporation | Multi-dimensional imaging apparatus, systems, and methods |
WO2006107934A1 (en) * | 2005-04-04 | 2006-10-12 | Micoy Corporation | Multi-dimensional imaging |
US7796152B2 (en) | 2005-04-04 | 2010-09-14 | Micoy Corporation | Multi-dimensional imaging |
US7982777B2 (en) | 2005-04-07 | 2011-07-19 | Axis Engineering Technologies, Inc. | Stereoscopic wide field of view imaging system |
US8004558B2 (en) * | 2005-04-07 | 2011-08-23 | Axis Engineering Technologies, Inc. | Stereoscopic wide field of view imaging system |
US8885024B2 (en) | 2005-05-13 | 2014-11-11 | Micoy Corporation | Stereo imagers and projectors, and method |
US8890940B2 (en) | 2005-05-13 | 2014-11-18 | Micoy Corporation | Stereo image capture and processing |
US8334895B2 (en) | 2005-05-13 | 2012-12-18 | Micoy Corporation | Image capture and processing using converging rays |
US20080298674A1 (en) * | 2007-05-29 | 2008-12-04 | Image Masters Inc. | Stereoscopic Panoramic imaging system |
WO2012056437A1 (en) | 2010-10-29 | 2012-05-03 | École Polytechnique Fédérale De Lausanne (Epfl) | Omnidirectional sensor array system |
US10362225B2 (en) | 2010-10-29 | 2019-07-23 | Ecole Polytechnique Federale De Lausanne (Epfl) | Omnidirectional sensor array system |
CN104079917A (en) * | 2014-07-14 | 2014-10-01 | 中国地质大学(武汉) | 360-degree panorama stereoscopic camera |
WO2017100760A1 (en) * | 2015-12-11 | 2017-06-15 | Google Inc. | Lampshade for stereo 360 capture |
Also Published As
Publication number | Publication date |
---|---|
WO2005067318A3 (en) | 2006-01-05 |
US20050141089A1 (en) | 2005-06-30 |
US7553023B2 (en) | 2009-06-30 |
CN1922892A (en) | 2007-02-28 |
WO2005067318A9 (en) | 2005-11-17 |
AU2004313174A1 (en) | 2005-07-21 |
IL176521A0 (en) | 2006-10-05 |
CN1922892B (en) | 2012-08-15 |
CA2551636A1 (en) | 2005-07-21 |
JP2007517264A (en) | 2007-06-28 |
KR20070007059A (en) | 2007-01-12 |
US20080192344A1 (en) | 2008-08-14 |
US7347555B2 (en) | 2008-03-25 |
EP1702475A2 (en) | 2006-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7553023B2 (en) | Multi-dimensional imaging apparatus, methods, and systems | |
US7796152B2 (en) | Multi-dimensional imaging | |
US7872665B2 (en) | Image capture and processing | |
Tan et al. | Multiview panoramic cameras using mirror pyramids | |
US6795109B2 (en) | Stereo panoramic camera arrangements for recording panoramic images useful in a stereo panoramic image pair | |
US9503638B1 (en) | High-resolution single-viewpoint panoramic camera and method of obtaining high-resolution panoramic images with a single viewpoint | |
US10057487B1 (en) | Panoramic imaging systems based on normal-lens cameras | |
KR102176963B1 (en) | System and method for capturing horizontal parallax stereo panorama | |
WO2003054625A1 (en) | A panoramic stereoscopic imaging method and apparatus | |
Hua et al. | A high-resolution panoramic camera | |
Tan et al. | Multiview panoramic cameras using a mirror pyramid | |
CN112351358B (en) | 360-degree free three-dimensional type three-dimensional display sound box based on face detection | |
US20190235214A1 (en) | Spherical camera lens system, camera system and lens assembly | |
Vanijja et al. | Omni-directional stereoscopic images from one omni-directional camera | |
Hua et al. | Design analysis of a high-resolution panoramic camera using conventional imagers and a mirror pyramid | |
Pritch et al. | Optics for omnistereo imaging | |
JP2003532914A (en) | Stereoscopic panoramic camera arrangement for recording useful panoramic images in stereoscopic panoramic image pairs | |
JP2001258050A (en) | Stereoscopic video imaging device | |
Lin et al. | Single-row superposition-type spherical compound-like eye for pan-tilt motion recovery | |
Endo et al. | A method for panoramic stereo image acquisition | |
Vanijja et al. | Omni-directional binocular stereoscopic images from one omni-directional camera | |
Lin et al. | Pan-Tilt Motion Estimation Using Superposition-Type Spherical Compound-Like Eye |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
COP | Corrected version of pamphlet |
Free format text: PAGES 1/18-18/18, DRAWINGS, REPLACED BY CORRECT PAGES 1/17-17/17 |
|
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 176521 Country of ref document: IL |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2551636 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006547355 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004313174 Country of ref document: AU Ref document number: 1020067014987 Country of ref document: KR Ref document number: 4281/DELNP/2006 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004815342 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2004313174 Country of ref document: AU Date of ref document: 20041222 Kind code of ref document: A |
|
WWP | Wipo information: published in national office |
Ref document number: 2004313174 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200480042040.7 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2004815342 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067014987 Country of ref document: KR |