WO2011161973A1 - Omnidirectional Imaging System
- Publication number
- WO2011161973A1 (PCT/JP2011/003615)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- mirror
- point
- hyperboloid
- mirrors
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/06—Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B17/00—Systems with reflecting surfaces, with or without refracting elements
- G02B17/02—Catoptric systems, e.g. image erecting and reversing system
- G02B17/06—Catoptric systems, e.g. image erecting and reversing system using mirrors only, i.e. having only one curved mirror
- G02B17/0605—Catoptric systems, e.g. image erecting and reversing system using mirrors only, i.e. having only one curved mirror using two curved mirrors
- G02B17/061—Catoptric systems, e.g. image erecting and reversing system using mirrors only, i.e. having only one curved mirror using two curved mirrors on-axis systems with at least one of the mirrors having a central aperture
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/06—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe involving anamorphosis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/58—Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present invention relates to an omnidirectional imaging system, and more particularly to an omnidirectional imaging system capable of blind spot complementation by a plurality of omnidirectional images.
- FIG. 29 shows a conventional omnidirectional imaging system described in Patent Document 1 mentioned above.
- the main mirror 301 reflects light incident from the direction of the surrounding horizontal 360 degrees.
- the secondary mirror 302 further reflects the light reflected by the primary mirror 301.
- the light reflected by the secondary mirror 302 forms an image on the imaging surface 304 through the principal point of the light receiving lens system 303.
- the main mirror 301 and the sub mirror 302 are covered with a transparent tube 305. As described above, according to the optical system of FIG. 29, an object in the direction of 360 degrees around the periphery can be photographed by one light receiving lens system.
- FIG. 30 shows a conventional omnidirectional imaging system described in Patent Document 2.
- an omnidirectional mirror 401 is composed of a hemispherical mirror 402a and a plurality of spherical mirrors 402b.
- the hemispherical mirror 402a reflects light incident from the direction of the surrounding horizontal 360 degrees, and the plurality of spherical mirrors 402b also reflect light incident from the surrounding horizontal 360 degree direction.
- the imaging device 403 captures the light reflected by the hemispherical mirror 402a and the plurality of spherical mirrors 402b.
- FIG. 31 shows a conventional omnidirectional imaging system described in Patent Document 3.
- a camera 510 has a fisheye lens and a rotating mirror. The camera 510 comprises a portion in which light forms an image directly on one part of the imaging device 514 through the fisheye lens and the lens optical system 512, and a portion in which light reflected by the rotating mirror forms an image on another part of the imaging device 514 through the fisheye lens and the lens optical system 512.
- distance information to the subject is obtained from the portion imaged directly through the fisheye lens and the portion reflected by the rotating mirror and then imaged through the fisheye lens.
- Japanese Patent No. 3523783; JP 2005-234224 A; JP 2006-220603 A
- the system of Patent Document 1 has the problem that a blind spot area hidden by the sub mirror exists.
- this blind spot area corresponds to a portion directly below the ceiling, and is a very important part as an entire surrounding image.
- in the system of Patent Document 2, although positioning is possible, the camera is located on the subject side of the primary and secondary mirrors, so the camera itself creates blind spots in the primary and secondary mirrors. In particular, this blind spot area corresponds to the portion directly below the ceiling, which is a very important part of the entire surrounding image. Furthermore, since the secondary mirror is located on the subject side of the primary mirror, the secondary mirror creates a blind spot area in the primary mirror, narrowing the imaging range of the primary mirror.
- in the system of Patent Document 3, there is no blind spot directly below when the system is attached to a ceiling, but the rotating mirror, which reflects light in order to acquire an image from a viewpoint different from that of the fisheye lens, has the problem of narrowing the imaging range of the fisheye lens.
- An object of the present invention is to solve these conventional problems and to provide an omnidirectional imaging system capable of acquiring an image in which the blind spot area is eliminated while securing a high viewing angle.
- An omnidirectional imaging system according to the present invention includes a main mirror comprising a hyperboloid mirror, sub mirrors comprising a plurality of hyperboloid mirrors disposed around the main mirror, and a camera for capturing the image reflected by the main mirror and the images reflected by the plurality of sub mirrors, wherein the outer focal point of the hyperboloid of the main mirror and the outer focal points of the hyperboloids of the plurality of sub mirrors substantially coincide with each other.
- the camera is disposed such that the viewpoint of the camera, which is the entrance pupil position of the lens attached to the camera, substantially coincides with the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors.
- With this configuration, the camera can directly capture the images reflected by the main mirror and by each of the plurality of sub mirrors, and a wider viewing angle can be secured than with, for example, a conventional omnidirectional imaging apparatus using a fisheye lens.
- Furthermore, since the image corresponding to the blind spot area that the camera creates on the central axis of the main mirror can be acquired from the images reflected by the plurality of sub mirrors, the blind spot area can be eliminated.
- The present invention can be realized not only as such an omnidirectional imaging system, but also as an omnidirectional imaging method whose steps are the characteristic means included in the omnidirectional imaging system, or as a program that causes a computer to execute those characteristic steps. It goes without saying that such a program can be distributed via a recording medium such as a compact disc read only memory (CD-ROM) or via a transmission medium such as the Internet.
- the present invention can be realized as a semiconductor integrated circuit (LSI) that implements some or all of the functions of such an omnidirectional imaging system.
- According to the present invention, a high viewing angle can be secured by the main mirror formed of a hyperboloid mirror. Furthermore, since the outer focal point of the hyperboloid of the main mirror and the outer focal points of the hyperboloids of the sub mirrors substantially coincide with each other, it is possible to obtain an image without a blind spot area.
- an omnidirectional imaging system capable of acquiring an image without blind spots while securing a high viewing angle is thus provided; the importance of the omnidirectional imaging system as an image acquisition means is increasing today.
- the practical value of the present invention is therefore extremely high.
- FIG. 1 is a diagram showing the optical system configuration of the omnidirectional imaging system in Embodiments 1 and 2 of the present invention; (a) shows the optical system configuration of the omnidirectional imaging system in Embodiments 1 and 2 of the present invention, and
- (b) is a perspective view of the optical system configuration of the omnidirectional imaging system according to Embodiments 1 and 2 of the present invention.
- FIG. 2 is a diagram showing focal positions of the primary mirror and the secondary mirror in the first and second embodiments of the present invention.
- FIG. 3 is a diagram showing the positional relationship between the main mirror and one of the plurality of sub mirrors in the first and second embodiments of the present invention.
- FIG. 4 is an image processing configuration diagram of a blind spot complementing unit according to Embodiment 1 of the present invention.
- FIG. 5 is a first schematic diagram of an example of complementary processing to be added to an image captured by the omnidirectional imaging system according to Embodiment 1 of the present invention.
- FIG. 6 is a second schematic diagram of an example of complementary processing to be added to an image captured by the omnidirectional imaging system according to Embodiment 1 of the present invention.
- FIG. 7 is an image processing configuration diagram of a positioning unit in Embodiment 2 of the present invention.
- FIG. 8 is a conceptual diagram of positioning vector calculation in Embodiment 2 of the present invention.
- FIG. 9 is a schematic configuration diagram of an optical system of an omnidirectional imaging system according to a third embodiment of the present invention.
- FIG. 10 is a diagram showing the positional relationship between the primary mirror and one of the secondary mirrors in the third embodiment of the present invention.
- FIG. 11 is a first schematic diagram of an example of complementary processing to be added to an image captured by an omnidirectional imaging system according to a third embodiment of the present invention.
- FIG. 12 is a second schematic diagram of an example of the complementary processing to be added to the image captured by the omnidirectional imaging system according to the third embodiment of the present invention.
- FIG. 13 is a third schematic diagram of an example of the complementing process to be added to the image captured by the omnidirectional imaging system in the third embodiment of the present invention.
- FIG. 14 is a schematic configuration diagram of an optical system of an omnidirectional imaging system according to a fourth embodiment of the present invention.
- FIG. 15 is a first schematic diagram of an example of complementary processing to be added to an image captured by an omnidirectional imaging system according to a fourth embodiment of the present invention.
- FIG. 16 is a second schematic diagram of an example of the complementary processing to be added to the image captured by the omnidirectional imaging system according to the fourth embodiment of the present invention.
- FIG. 17 is a third schematic diagram of an example of complementary processing to be added to an image captured by the omnidirectional imaging system according to the fourth embodiment of the present invention.
- FIG. 18 is a schematic diagram of an optical system of an omnidirectional imaging system according to a fifth embodiment of the present invention.
- FIG. 19 is an image diagram of an image captured by the omnidirectional imaging system in the fifth embodiment of the present invention.
- FIG. 20 is a diagram for explaining the complementation processing by the omnidirectional imaging system in the fifth embodiment of the present invention; (a) is an image diagram of an image captured by the omnidirectional imaging system in the fifth embodiment of the present invention, and (b) is an image diagram of an example of the complementation processing applied to an image captured by the omnidirectional imaging system in the fifth embodiment of the present invention.
- FIG. 21 is an external view of an imaging system configuration of an omnidirectional imaging system according to a sixth embodiment of the present invention.
- FIG. 22 is a pictorial image view of an omnidirectional imaging system according to a sixth embodiment of the present invention.
- FIG. 23 is a diagram of the image processing configuration of the blind spot complementing unit in the sixth embodiment of the present invention.
- FIG. 24 is a diagram for explaining the complementation processing by the omnidirectional imaging system in the sixth embodiment of the present invention; (a) is an image diagram of an image captured by the omnidirectional imaging system in the sixth embodiment of the present invention, and (b) is an image diagram of an example of the complementation processing applied to an image captured by the omnidirectional imaging system in the sixth embodiment of the present invention.
- FIG. 25 is an image diagram showing the relationship between the angle of incident light of equidistant projection and the imaging point in the photographed image.
- FIG. 26 is a diagram of the image processing configuration of the positioning unit in the sixth embodiment of the present invention.
- FIG. 27 is a conceptual diagram of positioning vector calculation according to Embodiment 6 of the present invention.
- FIG. 28 is a block diagram showing a hardware configuration of a computer system for realizing the omnidirectional imaging system according to Embodiments 1 to 6 of the present invention.
- FIG. 29 is a structural diagram of a conventional omnidirectional imaging system described in Patent Document 1.
- FIG. 30 is a structural diagram of a conventional omnidirectional imaging system described in Patent Document 2.
- FIG. 31 is a structural diagram of a conventional omnidirectional imaging system described in Patent Document 3.
- An omnidirectional imaging system according to the present invention includes a main mirror comprising a hyperboloid mirror, sub mirrors comprising a plurality of hyperboloid mirrors disposed around the main mirror, and a camera for capturing the image reflected by the main mirror and the images reflected by the plurality of sub mirrors, wherein the outer focal point of the hyperboloid of the main mirror and the outer focal points of the hyperboloids of the plurality of sub mirrors substantially coincide with each other.
- the camera is disposed such that the viewpoint of the camera, which is the entrance pupil position of the lens attached to the camera, substantially coincides with the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors.
- With this configuration, the camera can directly capture the images reflected by the main mirror and by each of the plurality of sub mirrors, and a wider viewing angle can be secured than with, for example, a conventional omnidirectional imaging apparatus using a fisheye lens.
- Furthermore, since the image corresponding to the blind spot area that the camera creates on the central axis of the main mirror can be acquired from the images reflected by the plurality of sub mirrors, the blind spot area can be eliminated.
- The omnidirectional imaging system may further include a substantially planar mirror disposed between the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors and the inner focal points of those hyperboloids.
- In this case, the camera is disposed such that the viewpoint of the camera is located at the position symmetrical, with respect to the substantially planar mirror, to the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors, and outside light is reflected by the main mirror and the plurality of sub mirrors and further reflected by the substantially planar mirror to be incident on the camera.
- With this configuration, the omnidirectional imaging system can be further miniaturized.
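As a small geometric sketch (not taken from the patent itself), the folded configuration can be illustrated by reflecting the common outer focal point across the plane of the folding mirror: the physical camera viewpoint sits at that mirror image. The plane position and normal below are placeholder values.

```python
def mirror_point(p, q, n):
    """Reflect point p across the plane through point q with unit normal n.

    Illustrates how a planar folding mirror relocates the camera viewpoint:
    the camera is placed at the mirror image of the common outer focal point.
    """
    # signed distance from p to the plane
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, q, n))
    return tuple(pi - 2.0 * d * ni for pi, ni in zip(p, n))

# toy example: folding plane z = 1 (normal along +z), outer focus at the origin
focus = (0.0, 0.0, 0.0)
cam = mirror_point(focus, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# cam is the symmetrical position of the focus with respect to the plane
```

The symmetry guarantees that a ray reflected by the planar mirror toward the camera behaves as if it had traveled straight to the outer focal point, so the single-viewpoint geometry is preserved while the physical height of the system shrinks.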
- In a coordinate system in which the X-Z plane includes the central axis of the main mirror and the central axis of the first sub mirror, which is one of the plurality of sub mirrors, the central axis of the main mirror coincides with the Z axis,
- and the common outer focal point of the hyperboloids of the main mirror and the first sub mirror coincides with the origin of the X-Z plane,
- the position where the sub mirror is disposed may be limited to positions where the sub mirror is not reflected in the main mirror.
- With this configuration, the effective field of view of the main mirror can be secured more widely.
- In a coordinate system in which the X-Z plane includes the central axis of the main mirror and the central axis of the second sub mirror, which is one of the plurality of sub mirrors, and the central axis of the main mirror coincides with the Z axis,
- the angle between the central axis of the main mirror and the central axis of the second sub mirror is taken as φ, and the point on the main mirror at which its outer radius is largest is considered,
- and the second sub mirror has its vertex at position T(x_T, z_T).
- Under these conditions, the arrangement position of the sub mirror can be limited to positions from which a part of the blind spot area in the image reflected by the main mirror is reliably reflected.
- With this configuration, the sub mirrors can reliably complement the blind spot area of the main mirror.
- The omnidirectional imaging system may further include a blind spot complementing unit that generates a complementary composite image, which is an image in which the blind spot area in the image reflected by the main mirror is complemented by the images reflected by the plurality of sub mirrors.
- With this configuration, the omnidirectional imaging system can generate a complementary composite image that complements the blind spot area in the image reflected by the main mirror by combining the images reflected by the plurality of sub mirrors.
- The blind spot complementing unit may generate an arbitrary viewpoint image having its viewpoint at the inner focal point of the main mirror using the images reflected by the plurality of sub mirrors, and may generate the complementary composite image using the arbitrary viewpoint image.
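As a loose illustration of the complementation step (not the patent's actual processing), the composite can be sketched as masking out the blind spot region of the main-mirror image and filling it with pixels from a sub-mirror image that has already been remapped into the main image's coordinates. The images, mask, and remapping here are hypothetical placeholders:

```python
import numpy as np

def complement_blind_spot(main_img, sub_img_remapped, blind_mask):
    """Fill the blind spot region of the main-mirror image.

    main_img:         H x W x 3 image reflected by the main mirror
    sub_img_remapped: H x W x 3 sub-mirror image, assumed already warped
                      into the main image's coordinate system
    blind_mask:       H x W boolean array, True inside the blind spot
    """
    out = main_img.copy()
    out[blind_mask] = sub_img_remapped[blind_mask]  # copy only masked pixels
    return out

# toy example: 4x4 "images", blind spot in the centre 2x2 block
main = np.zeros((4, 4, 3), dtype=np.uint8)          # main image (all black)
sub = np.full((4, 4, 3), 255, dtype=np.uint8)       # sub image (all white)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
composite = complement_blind_spot(main, sub, mask)
```

In the actual system the remapping would come from the known hyperboloid geometry (the arbitrary viewpoint image generation described above); here it is simply assumed as an input.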
- The omnidirectional imaging system may further include a positioning unit that, for a designated point on the image reflected by the main mirror, calculates by image matching a corresponding point, which is a point on the image reflected by any one of the plurality of sub mirrors and corresponds to the same subject as the designated point, and measures the position of the subject from the coordinates of the designated point and the coordinates of the corresponding point.
- With this configuration, the positioning unit can determine the point on the main mirror and the point on the sub mirror on which the light from the designated subject is incident. The positioning unit can therefore measure the position of the subject in three-dimensional space by so-called triangulation.
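The triangulation mentioned above can be sketched numerically. This is a minimal sketch under the assumption that the single viewpoints (ray origins) and the viewing directions have already been recovered from the mirror geometry and the matched image points; all names are illustrative:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Return the midpoint of the shortest segment between two rays.

    p1, p2: 3-vectors, ray origins (e.g. the single viewpoints associated
            with the main mirror and one sub mirror)
    d1, d2: 3-vectors, viewing directions recovered from the designated
            point and its matched corresponding point
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimising |(p1 + t1*d1) - (p2 + t2*d2)|
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b               # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# toy check: both rays aimed exactly at a known subject point
target = np.array([1.0, 2.0, 3.0])
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
x = triangulate(p1, target - p1, p2, target - p2)
```

With noisy matches the two rays do not intersect exactly, which is why the midpoint of their common perpendicular is used rather than a true intersection.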
- Alternatively, the positioning unit may, for a designated point on the image reflected by any one of the plurality of sub mirrors, calculate by image matching a corresponding point, which is a point on the image reflected by the main mirror and corresponds to the same subject as the designated point, and measure the position of the subject from the coordinates of the designated point and the coordinates of the corresponding point.
- The positioning unit may also, for a designated point on the image reflected by any one of the plurality of sub mirrors, calculate by image matching a corresponding point, which is a point on the image reflected by any one of the other sub mirrors and corresponds to the same subject, and measure the position of the subject from the coordinates of the designated point and the coordinates of the corresponding point.
- The omnidirectional imaging system may include a blind spot complementing unit that generates a complementary composite image, which is an image in which the blind spot area in the image reflected by the main mirror is complemented by the images reflected by at least some of the plurality of sub mirrors.
- The blind spot complementing unit may generate the complementary composite image by generating an arbitrary viewpoint image having its viewpoint at the inner focal point of the main mirror using the images reflected by at least some of the plurality of sub mirrors.
- The blind spot complementing unit may record the correspondence between the images reflected by the sub mirrors used when generating the complementary composite image and the areas in the complementary composite image. The positioning unit may then calculate by image matching a first point, which is a point on the image reflected by the first sub mirror corresponding to the area in the complementary composite image that includes the point designated on the complementary composite image and which corresponds to the subject of the designated point, and a second point, which is a point corresponding to the same subject, and may measure the position of the subject from the first point and the second point.
- An omnidirectional imaging system according to another aspect of the present invention includes a plurality of hyperboloid mirrors and a camera for capturing the images reflected by the plurality of hyperboloid mirrors, wherein the plurality of hyperboloid mirrors are disposed such that the outer focal points of their hyperboloids substantially coincide with each other, and the camera is disposed such that the viewpoint of the camera, which is the entrance pupil position of the lens attached to the camera, substantially coincides with the common outer focal point.
- With this configuration, even without a main mirror, the omnidirectional imaging system can output an image in which the blind spot area is eliminated while securing a high viewing angle, using the overlapping portions of the images reflected by the plurality of hyperboloid mirrors.
- The omnidirectional imaging system may further include a substantially planar mirror disposed between the outer focal points of the plurality of hyperboloid mirrors, which substantially coincide with the viewpoint of the camera, and the inner focal points of the plurality of hyperboloid mirrors.
- In this case, the camera is disposed such that the viewpoint of the camera is located at the position symmetrical, with respect to the substantially planar mirror, to the outer focal points of the plurality of hyperboloid mirrors, and outside light is reflected by each of the plurality of hyperboloid mirrors and further reflected by the substantially planar mirror to be incident on the camera.
- The blind spot complementing unit may generate a complementary composite image, which is an image having no blind spot area, by combining the images reflected by at least some of the plurality of hyperboloid mirrors.
- The blind spot complementing unit may generate the complementary composite image by complementing the blind spot area on the image reflected by any one of the plurality of hyperboloid mirrors using the images reflected by at least some of the other hyperboloid mirrors.
- The blind spot complementing unit may generate, using the images reflected by at least some of the plurality of hyperboloid mirrors, an arbitrary viewpoint image having its viewpoint at the single viewpoint position of the hyperboloid mirror to be complemented, and may generate the complementary composite image using the arbitrary viewpoint image.
- The omnidirectional imaging system may further include a positioning unit that, for a designated point on the image reflected by any one of the plurality of hyperboloid mirrors, calculates by image matching a corresponding point corresponding to the same subject, and measures the position of the subject from the coordinates of the designated point and the coordinates of the corresponding point.
- A blind spot complementing unit may be provided that generates a complementary composite image, which is an image having no blind spot area, by combining the images reflected by at least some of the plurality of hyperboloid mirrors.
- The blind spot complementing unit may generate an arbitrary viewpoint image using the images reflected by at least some of the plurality of hyperboloid mirrors, and may generate the complementary composite image using the arbitrary viewpoint image.
- The blind spot complementing unit may record the correspondence between the images reflected by the hyperboloid mirrors used when generating the complementary composite image and the areas in the complementary composite image. The positioning unit may then calculate by image matching a first point, which is a point on the image reflected by the first hyperboloid mirror corresponding to the area in the complementary composite image that includes the point designated on the complementary composite image and which corresponds to the subject of the designated point, and a second point, which is a point on the image reflected by any one hyperboloid mirror other than the first hyperboloid mirror and which corresponds to the same subject, and may measure the position of the subject from the first point and the second point.
- The blind spot complementing unit may complement the blind spot area on the image reflected by any one of the plurality of hyperboloid mirrors using the images reflected by at least some of the other hyperboloid mirrors.
- The blind spot complementing unit may generate, using the images reflected by at least some of the plurality of hyperboloid mirrors, an arbitrary viewpoint image having its viewpoint at the single viewpoint position of the hyperboloid mirror to be complemented, and may generate the complementary composite image using the arbitrary viewpoint image.
- The blind spot complementing unit may record the correspondence between the images reflected by the hyperboloid mirrors used when generating the complementary composite image and the areas in the complementary composite image. The positioning unit may then calculate by image matching a first point, which is a point on the image reflected by the first hyperboloid mirror corresponding to the area in the complementary composite image that includes the point designated on the complementary composite image and which corresponds to the subject of the designated point, and a second point, which is a point on the image reflected by any one hyperboloid mirror other than the first hyperboloid mirror and which corresponds to the same subject, and may measure the position of the subject from the first point and the second point.
- An omnidirectional imaging system according to another aspect of the present invention includes a plurality of omnidirectional imaging devices for capturing omnidirectional images, and a blind spot complementing unit that generates a complementary composite image, which is an image having no blind spot area, by combining the images captured by at least some of the plurality of omnidirectional imaging devices.
- the omnidirectional imaging system can output an image in which the blind area is eliminated while securing a high viewing angle regardless of the configuration of the optical system.
- The omnidirectional imaging system may further include a positioning unit that, for a designated point on the image captured by any one of the plurality of omnidirectional imaging devices, calculates by image matching a corresponding point, which is a point on the image captured by any one of the other omnidirectional imaging devices and corresponds to the same subject, and measures the position of the subject from the coordinates of the designated point and the coordinates of the corresponding point.
- The blind spot complementing unit may record the correspondence between the images captured by the omnidirectional imaging devices used when generating the complementary composite image and the areas in the complementary composite image. The positioning unit may then calculate by image matching a first point, which is a point on the image captured by the first omnidirectional imaging device corresponding to the area in the complementary composite image that includes the point designated on the complementary composite image and which corresponds to the subject of the designated point, and a second point, which is a point captured by another omnidirectional imaging device and corresponds to the same subject, and may measure the position of the subject from the first point and the second point.
- FIG. 1 is a configuration diagram of an optical system of an omnidirectional imaging system according to a first embodiment of the present invention. More specifically, FIG. 1A is a side view of an omnidirectional imaging system. FIG. 1B is a view of the omnidirectional imaging system as viewed obliquely from above.
- the camera 104 captures an image viewed from the viewpoint 105.
- the hyperboloid shaped main mirror 101 reflects light from the subject.
- the light reflected by the main mirror 101 is reflected by the substantially plane mirror 103 and imaged as image information by the camera 104.
- the camera 104 itself is reflected in the substantially plane mirror 103 and thus creates a blind spot area.
- the hyperboloid shaped sub mirror 102 reflects light from the subject.
- a plurality of sub mirrors 102 are disposed so as to surround the main mirror 101 in accordance with an arrangement condition described later.
- the light reflected by the sub mirror 102 is reflected by the substantially plane mirror 103 and captured by the camera 104.
- FIG. 2 is a diagram showing focal positions of the primary mirror and the secondary mirror in the first embodiment of the present invention. In FIG. 2, the reflection by the substantially plane mirror is omitted.
- FIG. 2A is a diagram showing the focal position of the main mirror, and shows the case where the outer focal point 801a of the main mirror 803a is the origin and the central axis of the main mirror is superimposed on the Z axis. In this case, the equation for expressing the shape of the main mirror is
- the coordinates of the inner focal point 802a of the main mirror 803a are (0, 0, 2c m ).
- a m , b m , and c m are coefficients of the main mirror hyperboloid.
- the maximum radius effective for image acquisition is r_m, and this is the main mirror effective radius 804a.
- a set of points on the primary mirror having a radius coinciding with the primary mirror effective radius 804a is taken as the primary mirror effective edge (a circle).
- FIG. 2(b) is a diagram showing the focal position of the sub mirror, with the outer focal point 801b of the secondary mirror 803b at the origin and the central axis of the secondary mirror superimposed on the Z axis. In this case, the equation expressing the shape of the secondary mirror is (Z − c_s1)² / a_s1² − (X² + Y²) / b_s1² = 1, where c_s1 = √(a_s1² + b_s1²).
- the coordinates of the inner focal point 802b of the secondary mirror 803b are (0, 0, 2c_s1).
- the coordinates of the secondary mirror vertex 806 are (0, 0, c_s1 + a_s1).
- a_s1, b_s1, and c_s1 are coefficients of the sub-mirror hyperboloid.
- the maximum radius effective for image acquisition is r_s1, which is the secondary mirror effective radius 804b.
- a set of points on the secondary mirror having a radius coinciding with the secondary mirror effective radius 804b is taken as the secondary mirror effective edge (a circle). The coordinates of a point 805b on the secondary mirror effective edge, which is the intersection of the secondary mirror effective edge and the XZ plane in the region X < 0, are denoted (−r_s1, 0, z_s1).
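The geometry above can be sketched in code. The following is an illustration (not part of the patent) that computes the focal points and vertex of a two-sheet hyperboloid mirror from its coefficients a and b, using the standard relation c = √(a² + b²), which is consistent with the coordinates (0, 0, 2c) and (0, 0, c + a) stated above; the function name is an assumption.

```python
import math

def hyperboloid_points(a, b):
    """Return (outer_focus, inner_focus, vertex) of a two-sheet hyperboloid
    mirror with coefficients a and b, outer focus at the origin, and the
    central axis on the Z axis (the configuration of FIG. 2)."""
    c = math.sqrt(a * a + b * b)       # half the distance between the foci
    outer_focus = (0.0, 0.0, 0.0)      # camera-side focus, at the origin
    inner_focus = (0.0, 0.0, 2.0 * c)  # (0, 0, 2c_m) / (0, 0, 2c_s1)
    vertex = (0.0, 0.0, c + a)         # (0, 0, c_s1 + a_s1) for the sub mirror
    return outer_focus, inner_focus, vertex
```

For example, with a = 3 and b = 4 the half focal distance is c = 5, so the inner focus sits at (0, 0, 10) and the vertex at (0, 0, 8).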
- FIG. 3 is a diagram showing the positional relationship between the main mirror and one of the plurality of sub mirrors in Embodiment 1 of the present invention, expressed in a coordinate system in which the plane including the focal points of the main mirror and the sub mirror is the XZ plane. In FIG. 3, the reflection by the substantially plane mirror is omitted.
- a hyperbolic mirror is known to have a single viewpoint: light incident from an object toward the inner focal point of the mirror is reflected toward the outer focal point.
- both the primary mirror and the secondary mirror can maintain the single viewpoint property by aligning their respective outer focal points and placing the camera's viewpoint at that position.
- the primary mirror / secondary mirror outer focal point 901 is a common outer focal point of the primary mirror 903a and the secondary mirror 903b, which is placed at the origin of the XZ plane.
- the central axis of the main mirror 903a overlaps the Z axis, and the central axis of the sub mirror 903b is inclined from the Z axis by the sub mirror inclination angle 907.
- the inner limit straight line 910 is a straight line connecting the outer focal points 901 of the primary and secondary mirrors and the primary mirror effective edge point 905a.
- the inner limit angle 908, denoted θ1, is the angle between the inner limit straight line 910 and the primary mirror central axis (the Z axis). Denoting the coordinates of the secondary mirror vertex 906 (806 in FIG. 2(b)) as (x_T, 0, z_T), when x_T > z_T tan θ1 holds, the secondary mirror vertex 906 lies outside the inner limit straight line 910. Therefore, light from the subject in the downward direction is reflected by the sub mirror 903b, is incident on the main mirror / sub mirror outer focal point 901, and forms an image, whereby a part of the blind area of the main mirror image can be complemented by the sub mirror image.
- the lower limit straight line 909 is the straight line connecting the primary mirror inner focal point 902a and the primary mirror effective edge point 905a. When the secondary mirror vertex 906 is located above the lower limit straight line 909, the secondary mirror does not obstruct the effective field of view of the primary mirror.
- if the secondary mirror apex 906 is placed in the gray (shaded) area in FIG. 3, the secondary mirror can be placed without reducing the effective field of view of the primary mirror.
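The two placement conditions (outside the inner limit straight line 910, above the lower limit straight line 909) can be sketched as a check on a candidate vertex. The function name, arguments, and sign conventions (mirrors in the region Z > 0, edge point on the X > 0 side) are assumptions for illustration only.

```python
def submirror_vertex_ok(x_t, z_t, r_m, z_edge, c_m):
    """Check whether a secondary mirror vertex at (x_t, 0, z_t) lies in the
    admissible (shaded) region of FIG. 3. Coordinates as in FIG. 3: common
    outer focal point at the origin, primary mirror inner focal point at
    (0, 0, 2*c_m), primary mirror effective edge point at (r_m, 0, z_edge)."""
    # Outside the inner limit line through the origin and the edge point:
    # the vertex must subtend a larger angle from the Z axis than theta_1.
    outside_inner = x_t * z_edge > z_t * r_m
    # Above the lower limit line through (0, 0, 2*c_m) and the edge point.
    z_on_lower_line = 2.0 * c_m + (z_edge - 2.0 * c_m) * (x_t / r_m)
    above_lower = z_t > z_on_lower_line
    return outside_inner and above_lower
```

For instance, with c_m = 5 (inner focus at Z = 10) and edge point (4, 0, 6), a vertex at (6, 0, 5) satisfies both conditions, while (2, 0, 5) falls inside the inner limit line.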
- FIG. 4 is an image processing configuration diagram of a blind spot complementing unit according to Embodiment 1 of the present invention.
- an imaging optical system 201 is the imaging optical system shown in FIG. 1, and imaging of an object is performed in a camera 202 (corresponding to the camera 104 in FIG. 1).
- the obtained omnidirectional image 210 is input to the blind spot complementing unit 205.
- the blind spot complementing unit 205 includes an image combining unit 203 and a synthesis parameter computing unit 204; it receives the omnidirectional image 210 and outputs a blind spot complement image. More specifically, the image combining unit 203 generates a complementary composite image, which is an image corresponding to the blind spot area and used to complement it.
- the image combining unit 203 combines the complementary composite image, adjusted in position and size by the complementing process described later, so that it is superimposed on the blind spot area of the omnidirectional image 210 input to the blind spot complementing unit 205.
- the blind spot complemented omnidirectional image is generated and output as a blind spot complement image.
- the processing of combining the complementary composite image with the image having the blind spot area on the image is collectively referred to as “generation of the blind spot complementary image”.
- the synthesis parameters used for the complementing process are calculated by the synthesis parameter computing unit 204 from calibration data representing the structure and characteristics of the optical system and from instruction information specifying the conditions to be matched (e.g., the height or the distance from the camera to be matched, the camera viewpoint position, etc.). In accordance with the synthesis parameters calculated by the synthesis parameter computing unit 204, the image combining unit 203 performs the complement synthesis processing of the blind spot and outputs the result as a blind spot complement image.
- the synthesis parameter is, for example, a set of pairs, each consisting of a coordinate value of a point in the primary mirror area and the coordinate value of the corresponding point in the secondary mirror area.
- the image combining unit 203 identifies a partial area, specified by a plurality of coordinate values included in the synthesis parameter, in the image (also referred to as the main mirror image) generated when the light reflected by the main mirror is imaged by the camera 202.
- the partial area of this main mirror image is a part of the blind spot area.
- the image combining unit 203 identifies, in the image generated when the light reflected by the sub mirror is imaged by the camera 202 (hereinafter also referred to as the sub mirror image), a partial region corresponding to the partial area of the main mirror image.
- a partial region in the secondary mirror image is identified by a plurality of coordinate values included in the synthesis parameter.
- the image combining unit 203 enlarges, reduces, and rotates the partial region of the secondary mirror image so that it matches the corresponding partial area of the primary mirror image specified by the synthesis parameter. Thereafter, the image combining unit 203 performs image processing to superimpose the partial region of the sub mirror image on the corresponding partial region of the main mirror image.
- the image combining unit 203 or the combined parameter computing unit 204 can also perform the complementing process by determining the inside of the polygon formed by the representative points by linear interpolation.
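The scale/rotate alignment of a sub mirror partial region onto the corresponding main mirror partial region, with linear interpolation inside the polygon formed by the representative points, amounts to applying an affine transform estimated from corresponding points. The following NumPy sketch (illustrative only; not the patent's exact computation) estimates the 2×3 affine matrix from three corresponding point pairs:

```python
import numpy as np

def affine_from_points(src, dst):
    """Estimate the 2x3 affine transform mapping three source points
    (representative points of a sub mirror region) onto three destination
    points (the corresponding points in the blind spot area)."""
    src = np.asarray(src, float)            # shape (3, 2)
    dst = np.asarray(dst, float)            # shape (3, 2)
    A = np.hstack([src, np.ones((3, 1))])   # homogeneous source coordinates
    # Solve A @ M.T = dst for the 2x3 affine matrix M.
    M = np.linalg.solve(A, dst).T
    return M
```

For example, mapping (0,0), (1,0), (0,1) onto the same triangle shifted by (2, 3) yields the pure-translation matrix [[1, 0, 2], [0, 1, 3]].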
- the omnidirectional imaging system according to the present embodiment may include a positioning unit 1208 described in detail in the second embodiment.
- the omnidirectional imaging system according to the present embodiment can generate the blind spot complementary image without the positioning unit 1208.
- by providing the positioning unit 1208, it becomes possible to position the subject in parallel with the generation of the blind spot complementary image.
- FIG. 5 and FIG. 6 are image diagrams of an example of the complementing process to be added to the image captured by the omnidirectional imaging system shown in FIG.
- the blind spot complementing unit 205 complements the image of the blind spot area 601 by cutting out and attaching a part of each of the sub mirror image areas 602 to 605 to the blind spot area 601 of the main mirror image area 600.
- the partial image area 606 of the secondary mirror image area 602 is associated with the part 606 ′ of the corresponding blind area 601.
- the partial image area 607 of the secondary mirror image area 603 is associated with the partial area 607 ′ of the corresponding blind area 601.
- the partial image area 608 of the secondary mirror image area 604 is associated with the part 608 ′ of the corresponding blind area 601.
- the partial image area 609 of the secondary mirror image area 605 is associated with the part 609 ′ of the corresponding blind area 601.
- the blind spot complementing unit 205 complements the blind spot area 601 by the above association.
- the blind spot complementing unit 205 cuts out and pastes a part of each of the sub mirror image areas 702 and 704 (among the sub mirror image areas 702 to 705) onto the blind spot area 701 of the main mirror image area 700, thereby complementing the image of the blind spot area 701. Specifically, the partial image area 711 of the secondary mirror image area 702 is associated with the corresponding partial area 711′ of the blind spot area 701. Further, the partial image area 712 of the secondary mirror image area 704 is associated with the corresponding part 712′ of the blind spot area 701. The blind spot complementing unit 205 complements the blind spot area 701 by the above association.
- FIGS. 5 and 6, which are image diagrams of the complementing process, show examples of pasting areas of about 1/4 and about 1/2, but the process is not limited to these. For example, various combinations are conceivable, such as vertically dividing the example of FIG. 6 in two and synthesizing either the upper or the lower half from the sub mirror image areas 703 and 705.
- although the image output is an omnidirectional image in the above description, it may be output in another format, such as a partially cut-out image.
- Non-Patent Document 1 discloses a technique for generating an intermediate arbitrary viewpoint image by segmentation-based stereo processing. According to this method, it is possible to generate an arbitrary viewpoint image from images captured from a plurality of viewpoints (inside focal points of a plurality of sub mirrors). Therefore, since it is possible to complement from an arbitrary viewpoint, it is possible to synthesize an image of a blind spot part with the viewpoint at the inner focal point of the main mirror in the present embodiment.
- FIG. 7 is an image processing configuration diagram of the positioning unit 1208 in the second embodiment of the present invention.
- the positioning unit 1208 is a processing unit for positioning a subject captured by the imaging optical system 1201.
- an imaging optical system 1201 is the imaging optical system shown in FIG. Imaging is performed in the camera 1202 and an omnidirectional image 1210 is output.
- the matching source image generation unit 1203 and the matching target image generation unit 1204 use the obtained omnidirectional image 1210 for image matching according to the matching image information (that is, the matching source image information 1212 and the matching target image information 1211) Generate an image of The images to be used for image matching generated by the matching source image generation unit 1203 and the matching target image generation unit 1204 will be referred to as a matching source image and a matching target image, respectively.
- the matching image information calculation unit 1205 generates the matching image information from the provided subject designation information and calibration data, according to a matching image information calculation method described later.
- the image matching unit 1206 searches a region having the highest degree of match with the matching source image from among the regions included in the matching target image, and outputs coordinates for specifying the region in the matching target image.
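Such a search can be sketched with a brute-force normalized cross-correlation; the patent does not specify the degree-of-match metric, so NCC and the function below are assumptions for illustration.

```python
import numpy as np

def best_match(target, template):
    """Brute-force normalized cross-correlation search: return the
    top-left (x, y) of the region of the matching target image with the
    highest degree of match with the template (matching source image)."""
    th, tw = template.shape
    H, W = target.shape
    t = template - template.mean()
    t_norm = (t * t).sum()
    best_score, best_xy = -np.inf, (0, 0)
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            w = target[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum() * t_norm)
            if denom == 0:
                continue  # flat window: correlation undefined
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy
```

In practice an implementation would restrict the search to the epipolar-rectified matching images described below rather than scanning the full image.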
- the positioning operation unit 1207 performs inverse conversion on the coordinates in the matching object image output from the image matching unit 1206 with reference to the matching object image information 1211 output from the matching image information operation unit 1205. By this inverse transformation, the positioning operation unit 1207 obtains corresponding point coordinates that are coordinates of a point in the omnidirectional image 1210 and that are coordinates of a point corresponding to a point in the matching target image.
- the positioning operation unit 1207 performs inverse conversion on the coordinates in the matching source image output by the image matching unit 1206 with reference to the matching source image information 1212 output by the matching image information operation unit 1205.
- the positioning operation unit 1207 obtains matching source coordinates that are coordinates of points in the omnidirectional image 1210 and that are coordinates of points corresponding to the points in the matching source image.
- the positioning operation unit 1207 performs reflection calculations on the plane mirror, the main mirror, and the sub mirror from these two coordinates (the matching source coordinates and the corresponding point coordinates), according to calibration data and a positioning operation method described later, and calculates two vectors directed toward the subject.
- the positioning operation unit 1207 specifies the three-dimensional position of the subject according to the principle of triangulation based on these two vectors, and outputs it as a positioning result. Details of the positioning operation unit 1207 will be described later.
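The triangulation step can be sketched with the generic midpoint method for two (possibly skew) rays; this is a standard formulation, not necessarily the patent's exact computation.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Midpoint triangulation: given two rays p_i + t_i * d_i directed
    toward the subject, return the 3D point closest to both rays."""
    p1, d1, p2, d2 = (np.asarray(v, float) for v in (p1, d1, p2, d2))
    # Solve for t1, t2 minimizing |(p1 + t1*d1) - (p2 + t2*d2)|^2.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    # Midpoint of the closest points on the two rays.
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```

When the two rays actually intersect (as for the two straight lines through the subject 1311), the midpoint coincides with the intersection point.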
- the omnidirectional imaging system according to the present embodiment may include the blind spot complementing unit 205 according to the first embodiment.
- the omnidirectional imaging system according to the present embodiment can measure the subject without providing the blind spot complementing unit 205. However, by providing the blind spot complementing unit 205, the blind spot complementary image can be obtained in parallel with the positioning of the subject. It will be able to output.
- FIG. 8 is a conceptual diagram for explaining the positioning vector calculation in the second embodiment of the present invention.
- the main mirror 1301, the sub mirror 1302, the substantially plane mirror 1303, and the viewpoint 1305 are the same as the main mirror 101, the sub mirror 102, the substantially plane mirror 103, and the viewpoint 105 shown in FIG. 1, respectively.
- the case where an object designation point is given on the main mirror will be described as an example.
- it is assumed that the viewpoint 1305 of the camera (the entrance pupil position of the lens attached to the camera), the imaging surface 1306, and the coordinates in the omnidirectional image of the subject designation point 1307 on the image photographed on the imaging surface 1306 are given as designation information.
- the substantially plane mirror middle point 1308 can be obtained as the intersection of the extension line from the viewpoint 1305 through the subject designation point 1307 and the substantially plane mirror 1303. Since the light from the subject is incident along the ray reflected at the substantially plane mirror middle point 1308, a straight line is determined by reflection calculation, and the middle point 1309 on the main mirror is determined as the intersection of this straight line and the main mirror. Because of the single viewpoint property, the reflection at the main mirror lies on the straight line connecting the middle point 1309 on the main mirror and the main mirror inner focal point 1310, so the subject 1311 is present on this straight line.
- assuming the subject is at infinity, this straight line becomes the vector of the light incident on the secondary mirror.
- a secondary mirror intermediate point 1313 is obtained as a point of intersection of the secondary mirror 1302 with the straight line corresponding to the incident light from infinity.
- the reflection at the secondary mirror midpoint 1313 is determined as a straight line toward the outer focal point of the secondary mirror 1302 due to the single viewpoint property.
- an approximately planar mirror middle point 1314 is obtained.
- the light reflected by the approximately planar mirror middle point 1314 passes through the camera viewpoint 1305 coincident with the point at which the outer focal point of the sub mirror 1302 is reflected by the approximately plane mirror 1303 and enters the imaging surface 1306. Therefore, an infinitely distant point imaging point 1315 is obtained as an intersection point of the light reflected by the approximately planar mirror middle point 1314 and the imaging surface 1306.
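Each of the reflection calculations above uses the standard mirror-reflection formula r = d − 2(d·n)n for a direction d and mirror unit normal n; a small illustrative sketch:

```python
import numpy as np

def reflect(d, n):
    """Reflect ray direction d off a mirror surface with normal n,
    as in the reflection calculations at the plane and curved mirrors."""
    d = np.asarray(d, float)
    n = np.asarray(n, float)
    n = n / np.linalg.norm(n)        # ensure a unit normal
    return d - 2.0 * (d @ n) * n
```

For a curved mirror the same formula applies with n taken as the surface normal at the reflection point (e.g., the middle point 1309 on the main mirror).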
- a plane including two viewpoints and an object constituting a stereo image is called an epipolar plane.
- the primary mirror inner focal point 1310, the secondary mirror inner focal point 1312, and the middle point 1309 on the primary mirror all lie on the epipolar plane.
- the normal vector of the epipolar plane, which is required to generate the images for matching, is obtained as the cross product of the vector from the primary mirror inner focal point 1310 to the middle point 1309 on the primary mirror and the vector from the primary mirror inner focal point 1310 to the secondary mirror inner focal point 1312.
- the normal vector of the epipolar plane, the image coordinates of the infinity point imaging point 1315, and the image scale ratio of the primary mirror to the secondary mirror are output to the matching target image generation unit 1204 as the matching target image information 1211 in FIG. Further, the normal vector of the epipolar plane, the image coordinates of the subject designation point 1307, and the like are output to the matching source image generation unit 1203 as the matching source image information 1212 in FIG.
- the corresponding point for the subject designation point 1307, obtained by inversely converting the coordinates in the matching target image found by image matching back into the omnidirectional image 1210, is the omnidirectional image corresponding point 1316 in FIG. 8.
- the substantially plane mirror middle point 1317 can be obtained as the intersection of the extension line from the viewpoint 1305 through the omnidirectional image corresponding point 1316 and the substantially plane mirror 1303. Since the light from the subject is incident along the ray reflected at the substantially plane mirror middle point 1317, a straight line is determined by reflection calculation, and the middle point 1318 on the secondary mirror is determined as the intersection of this straight line and the secondary mirror.
- the reflection at the secondary mirror is a straight line connecting the secondary mirror upper middle point 1318 and the secondary mirror inner focal point 1312 due to the single viewpoint property, so the subject 1311 is present on the straight line.
- the subject 1311 is also on the straight line connecting the main mirror inner focal point 1310 and the middle point 1309 on the main mirror, so the three-dimensional coordinates of the subject 1311 are obtained as the intersection of these two straight lines passing through the subject 1311.
- although the processing of the positioning unit 1208 has been described for the case where the subject designation point is given on the primary mirror, even when the subject designation point is given on one secondary mirror, positioning can be performed by the same process between that secondary mirror and another secondary mirror.
- by providing the omnidirectional imaging system according to the present embodiment with the blind spot complementing unit 205 according to the first embodiment, a system can be realized in which positioning can be performed for the omnidirectional image region subjected to blind spot complementation while maintaining a wide viewing angle.
- consider the case where the subject designation point is given within the blind spot complementation area.
- the designated subject designation point is associated with the corresponding point on the sub mirror by the reverse operation of the blind spot complementing process. Therefore, positioning is possible even when the subject designation point is given within the blind spot complementation area, by means of the sub-mirror-to-sub-mirror positioning process.
- FIG. 9 is a schematic configuration diagram of an optical system of an omnidirectional imaging system according to a third embodiment of the present invention.
- FIG. 9 shows a plan view and a front view in association with each other at the top and bottom.
- elements whose reference numerals have the same last two digits as those already described are the same, and their description is omitted.
- the sub mirror 1002 is disposed so as not to overlap the main mirror 1001 in plan view. With such an arrangement, the range of the subject reflected by the sub mirror is expanded, and the degree of freedom of blind spot complementation is expanded.
- FIG. 10 is a diagram showing the positional relationship between the primary mirror and one of the plurality of secondary mirrors in Embodiment 3 of the present invention, expressed, as in FIG. 3, in a coordinate system in which the plane including the focal points of the primary and secondary mirrors is the XZ plane.
- the reflection by the substantially plane mirror is omitted.
- elements whose reference numerals have the same last two digits as those already described are the same, and their description is omitted.
- the secondary mirror tilt angle 1107 is larger than the secondary mirror tilt angle 907 in FIG. 3, and the omnidirectional image reflected by the entire secondary mirror is captured by the camera. This broadens the degree of freedom of blind spot complementation as described later.
- as the secondary mirror inclination angle 1107 is increased, the range that can be photographed by the secondary mirror expands, but in the captured image the imaging size of the primary mirror becomes relatively small in order to fit the secondary mirror into the image. As a result, the area in which no effective image is displayed increases. Therefore, it is practical to set, as a limit, the outer reference straight line 1111 inclined by the outer reference angle 1112, and to design the position of the secondary mirror apex 1106 within that range.
- a secondary mirror angle 1114, which is the angle subtended by the secondary mirror, is introduced. In FIG. 10, it is the angle formed by the straight line connecting the sub mirror inner focal point 1102b and the main mirror / sub mirror outer focal point 1101 and the straight line connecting the sub mirror effective edge point 1105b and the main mirror / sub mirror outer focal point 1101; this angle 1114 is denoted θ2. When the design condition is satisfied, the secondary mirror apex 1106 lies inside the outer reference straight line 1111, and the loss of effective imaging area can be kept within a practical range.
- the range of the coefficient k is preferably 0 or more and about 2 to 3 or less.
- the omnidirectional imaging system according to the present embodiment includes the blind spot complementing unit 205.
- the blind spot complementing unit 205 complements the image of the blind spot area 1401 by cutting out and pasting a part of each of the sub mirror image areas 1402 to 1405 onto the blind spot area 1401 of the main mirror image area 1400.
- the blind spot complementing unit 205 associates the partial image area 1406 of the secondary mirror image area 1402 with the corresponding partial area 1406′ of the blind spot area 1401.
- the partial image area 1407 of the secondary mirror image area 1403 is associated with the partial area 1407 ′ of the corresponding blind area 1401.
- the partial image area 1408 of the secondary mirror image area 1404 is associated with the partial area 1408 ′ of the corresponding blind area 1401.
- the partial image area 1409 of the secondary mirror image area 1405 is associated with the partial area 1409 ′ of the corresponding blind area 1401.
- the blind spot complementing unit 205 complements the blind spot area 1401 by performing such correspondence.
- as in the complementing process example of FIG. 6 in the first embodiment of the present invention, the blind spot complementing unit 205 complements the image of the blind spot area 1501 by cutting out and pasting a part of each of the sub mirror image areas 1502 and 1504 (among the sub mirror image areas 1502 to 1505) onto the blind spot area 1501 of the main mirror image area 1500.
- the blind spot complementing unit 205 associates the partial image area 1511 of the sub mirror image area 1502 with the corresponding partial blind area 1511 ′ of the blind area 1501.
- the partial image area 1512 of the secondary mirror image area 1504 is associated with the corresponding part 1512 ′ of the blind spot area 1501.
- the blind spot complementing unit 205 complements the blind spot area 1501 by performing such correspondence.
- FIG. 13 shows an example in which blind spot complementation is possible even using an image of the inner portion of the secondary mirror close to the primary mirror.
- the blind spot complementing unit 205 complements the image of the blind spot area 1601 by cutting out and attaching a part of each of the sub mirror image areas 1602 to 1605 to the blind spot area 1601 of the main mirror image area 1600.
- FIG. 13 is similar to FIG. 11, but the correspondence between the blind spot area 1601 and the sub mirrors is different.
- the blind spot complementing unit 205 associates the partial image area 1606 of the sub mirror image area 1602 with the corresponding partial area 1606′ of the blind spot area 1601.
- the partial image area 1607 of the secondary mirror image area 1603 is associated with the partial area 1607 ′ of the corresponding blind area 1601.
- the partial image area 1608 of the secondary mirror image area 1604 is associated with the partial area 1608 ′ of the corresponding blind area 1601.
- the partial image area 1609 of the secondary mirror image area 1605 is associated with the partial area 1609 ′ of the corresponding blind area 1601.
- the blind spot complementing unit 205 complements the blind spot area 1601 by performing such correspondence.
- FIG. 14 is a schematic configuration diagram of an optical system of an omnidirectional imaging system according to a fourth embodiment of the present invention.
- FIG. 14 shows the plan view and the front view in association with each other at the top and bottom.
- elements whose reference numerals have the same last two digits as those already described are the same, and their description is omitted.
- in FIG. 14 there are six sub mirrors 1702, all of which are arranged so as not to overlap the main mirror in plan view. With such an arrangement, the range of the subject reflected by the sub mirrors is expanded, and the degree of freedom of blind spot complementation is expanded.
- the omnidirectional imaging system according to the present embodiment includes the blind spot complementing unit 205.
- the blind spot complementing unit 205 complements the image of the blind spot area 1801 by cutting out and pasting a part of each of the sub mirror image areas 1802 to 1807 onto the blind spot area 1801 of the main mirror image area 1800.
- the blind spot complementing unit 205 associates the partial image area 1808 of the sub mirror image area 1802 with the corresponding partial area 1808′ of the blind spot area 1801.
- the partial image area 1809 of the sub-mirror image area 1803 is associated with the corresponding part 1809 ′ of the blind spot area 1801.
- the partial image area 1810 of the secondary mirror image area 1804 is associated with the part 1810 ′ of the corresponding blind area 1801.
- the partial image area 1811 of the secondary mirror image area 1805 is associated with the part 1811 ′ of the corresponding blind area 1801.
- the partial image area 1812 of the secondary mirror image area 1806 is associated with the part 1812 ′ of the corresponding blind area 1801.
- the partial image area 1813 of the sub-mirror image area 1807 is associated with the corresponding part 1813 ′ of the blind spot area 1801.
- the blind spot complementing unit 205 complements the blind spot area 1801 by performing such correspondence.
- FIG. 16 shows a complementing process using three of the six submirrors.
- the blind spot complementing unit 205 cuts out and pastes a part of each of the sub mirror image areas 1903, 1905, and 1907 (among the sub mirror image areas 1902 to 1907) onto the blind spot area 1901 of the main mirror image area 1900, thereby complementing the image of the blind spot area 1901.
- the blind spot complementing unit 205 associates the partial image area 1909 of the secondary mirror image area 1903 with the corresponding partial area 1909′ of the blind spot area 1901.
- the partial image area 1911 of the secondary mirror image area 1905 is associated with the corresponding part 1911 ′ of the blind spot area 1901.
- the partial image area 1913 of the secondary mirror image area 1907 is associated with the corresponding partial area 1913 ′ of the blind spot area 1901.
- the blind spot complementing unit 205 complements the blind spot area 1901 by performing such correspondence.
- FIG. 17 shows an example in which blind spot complementation is possible even using an image of the inner portion of the secondary mirror close to the primary mirror.
- the blind spot complementing unit 205 complements the image of the blind spot area 2001 by cutting out and attaching a part of each of the sub mirror image areas 2002 to 2007 to the blind spot area 2001 of the main mirror image area 2000.
- FIG. 17 is similar to FIG. 15, but the correspondence between the blind area 2001 and the sub mirror is different.
- the blind spot complementing unit 205 associates the partial image area 2008 of the sub mirror image area 2002 with the corresponding partial area 2008′ of the blind spot area 2001.
- the partial image area 2009 of the sub-mirror image area 2003 is associated with the part 2009 'of the corresponding blind area 2001.
- the partial image area 2010 of the secondary mirror image area 2004 is associated with the part 2010 'of the corresponding blind area 2001.
- the partial image area 2011 of the auxiliary mirror image area 2005 is associated with the corresponding part 2011 'of the blind spot area 2001.
- the partial image area 2012 of the secondary mirror image area 2006 is associated with the part 2012 'of the corresponding blind area 2001.
- the partial image area 2013 of the sub-mirror image area 2007 is associated with the part 2013 'of the corresponding blind area 2001.
- the blind spot complementing unit 205 complements the blind spot area 2001 by performing such correspondence.
- FIG. 18 is a schematic diagram of an optical system of an omnidirectional imaging system according to a fifth embodiment of the present invention.
- a camera 2104 captures an omnidirectional image reflected by a plurality of hyperboloid mirrors 2102 and further reflected by an approximately plane mirror 2103.
- this corresponds to the situation where the blind area of the main mirror has become so large that it covers the entire main mirror. Since the entire main mirror is a blind spot, the main mirror itself becomes unnecessary.
- the outer focal points of the hyperboloids of each of the hyperboloid mirrors are made to coincide by tilting the central axes of the hyperboloid mirrors (corresponding to the submirrors) inward in the direction in which the apexes of the hyperboloid mirrors face each other.
- FIG. 19 is an image view of a photographed image by the omnidirectional photographing system in the fifth embodiment of the present invention.
- an image 2206 of the camera itself is captured at the center of the captured image 2201, and the images reflected by the hyperboloid mirrors are captured as hyperboloid mirror image areas 2202 to 2205.
- FIG. 20 is an image diagram of a complementing process example to be added to the image captured by the omnidirectional imaging system in the fifth embodiment of the present invention shown in FIG.
- to generate an omnidirectional image in which the blind spots are complemented, a blind spot complement image 2311 shown in FIG. 20(b) is generated separately. Since each hyperboloid mirror mutually reflects the images of the other hyperboloid mirrors, blind spot areas exist in parts of each mirror image.
- to obtain a blind spot complement image 2311 free of blind spots, the blind spot complementing unit 205 cuts out parts of the hyperboloid mirror image areas 2302 to 2305 in the captured image 2301, avoiding these blind spots, and pastes them together to generate the complement image 2311.
- the blind spot complementing unit 205 associates the partial image area 2307 of the hyperboloid mirror image area 2302 with the corresponding part 2307' of the blind spot complement image 2311. Likewise, the partial image area 2308 of the hyperboloid mirror image area 2303 is associated with the corresponding part 2308', the partial image area 2309 of the hyperboloid mirror image area 2304 with the corresponding part 2309', and the partial image area 2310 of the hyperboloid mirror image area 2305 with the corresponding part 2310'. By establishing these correspondences, the blind spot complementing unit 205 synthesizes the blind spot complement image 2311.
- a synthesis parameter is, for example, a pair consisting of a coordinate value on the blind spot complement image 2311 and the coordinate value in the captured image 2301 corresponding to that point.
- although the composition shown in FIG. 20 uses a 1/4 area from each hyperboloid mirror, in the fifth embodiment the hyperboloid mirror images overlap considerably, so various other area selection methods, such as using a 1/2 or 1/3 area, are also possible.
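As a sketch of how such coordinate-pair synthesis parameters might be applied, the fragment below fills each pixel of a complement image by looking up the captured-image coordinates recorded for that point. This is an illustrative assumption, not the patent's implementation; the function name and the `(row, col)` array layout are hypothetical.

```python
import numpy as np

def apply_synthesis_params(captured, src_coords):
    """Fill a blind-spot complement image by copying, for each output
    pixel, the captured-image pixel named in the synthesis parameters.

    captured   : (H, W) or (H, W, C) captured-image array
    src_coords : (h, w, 2) integer array; src_coords[i, j] = (row, col)
                 in `captured` that supplies output pixel (i, j)
    """
    rows = src_coords[..., 0]
    cols = src_coords[..., 1]
    # NumPy advanced indexing gathers all source pixels at once.
    return captured[rows, cols]

# Toy example: a 2x2 output assembled from the four corners of the source.
captured = np.arange(16).reshape(4, 4)
params = np.array([[[0, 0], [0, 3]],
                   [[3, 0], [3, 3]]])
out = apply_synthesis_params(captured, params)
```

In practice each mirror image area would contribute its own block of source coordinates, precomputed once from the calibration data and reused for every frame.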
- FIG. 21 is an external view of an imaging system configuration of an omnidirectional imaging system according to a sixth embodiment of the present invention.
- omnidirectional images are captured by each of four omnidirectional imaging devices, namely four cameras 2411 to 2414 combined with four fisheye lenses 2401 to 2404.
- FIG. 22 is an image view of the images captured by the omnidirectional imaging system according to the sixth embodiment of the present invention. The video captured by the omnidirectional imaging system according to the present embodiment is visualized as shown in FIG. 22: specifically, the video captured by the fisheye lens 2401 and the camera 2411 of FIG. 21 is captured as an omnidirectional image area 2511 in the captured image 2501.
- an image captured by the fisheye lens 2402 and the camera 2412 in FIG. 21 is captured as an omnidirectional image area 2512 in the captured image 2502.
- the image captured by the fisheye lens 2403 and the camera 2413 in FIG. 21 is captured as an omnidirectional image area 2513 in the captured image 2503.
- an image captured by the fisheye lens 2404 and the camera 2414 in FIG. 21 is captured as an omnidirectional image area 2514 in the captured image 2504. In each of the omnidirectional image areas 2511 to 2514, the three fisheye lenses other than the one that captured the image are reflected, forming blind spots in the omnidirectional image.
- since each blind spot portion is captured by another fisheye lens, a complementary composite image, that is, a single image without blind spot areas, can be generated by combining the blind-spot-free portions of the captured images 2501 to 2504 taken by these four fisheye-lens/camera pairs.
- in the present embodiment, an image having no blind spot area means an image that contains no blind spot area lacking image information.
- FIG. 23 is a diagram of the image processing configuration of the blind spot complementing unit in the sixth embodiment of the present invention.
- FIG. 23 corresponds to the imaging optical system and camera of FIG. 4 in the first embodiment of the present invention, quadrupled; reference numerals whose lower two digits (ignoring the letters a to d) are the same as in FIG. 4 denote corresponding elements.
- the four imaging optical systems and cameras are distinguished by appending a to d to the reference numerals.
- in the first embodiment, a single omnidirectional image 210 contained the omnidirectional images of both the main mirror and the plurality of sub mirrors; here, by contrast, one omnidirectional image 2610a to 2610d is obtained from each omnidirectional imaging device, that is, from each imaging optical system (fisheye lens) and camera pair.
- the image combining unit 2603 applies the complementing process described later and outputs a blind spot complement image.
- the blind spot complementing unit 2605 includes an image combining unit 2603 and a synthesis parameter computing unit 2604.
- the blind spot complementing unit 2605 receives a camera image and outputs a blind spot complementary image.
- the synthesis parameters used for the complementing process are calculated by the synthesis parameter calculation unit 2604 from calibration data representing the structure and characteristics of the optical system and from synthesis instruction information specifying the conditions to be matched (e.g., the height or the distance from the camera at which images are to be aligned, the camera viewpoint position, etc.).
- using these parameters, the image combining unit 2603 performs the blind spot complementing and synthesis process and outputs the result as a blind spot complement image.
- each synthesis parameter is, for example, a set consisting of a coordinate value on the blind spot complement image, the number of the omnidirectional image corresponding to that point, and the coordinate value in that omnidirectional image.
- the image combining unit 2603 or the synthesis parameter calculation unit 2604 may use only a set of representative points as synthesis parameters and obtain the interior of the polygons they form by linear interpolation when performing the complementing process, as in the omnidirectional imaging system of the first embodiment.
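The representative-point scheme can be sketched as follows: only a sparse set of corresponding coordinates is stored, and the interior of the polygon (here, for simplicity, an axis-aligned rectangle) is filled by linear interpolation. The function name and layout are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def interp_remap(corner_src, shape):
    """Densify a sparse synthesis-parameter set by bilinear interpolation.

    corner_src : source (row, col) coordinates at the four corners of an
                 output rectangle, ordered (top-left, top-right,
                 bottom-left, bottom-right)
    shape      : (h, w) of the output rectangle
    Returns an (h, w, 2) array of interpolated source coordinates.
    """
    h, w = shape
    tl, tr, bl, br = [np.asarray(c, float) for c in corner_src]
    v = np.linspace(0.0, 1.0, h)[:, None, None]   # vertical blend factor
    u = np.linspace(0.0, 1.0, w)[None, :, None]   # horizontal blend factor
    top = (1 - u) * tl + u * tr                   # interpolate along top edge
    bottom = (1 - u) * bl + u * br                # interpolate along bottom edge
    return (1 - v) * top + v * bottom             # blend the two edges

# 3x3 output patch whose corners map to a 10x10 source square.
coords = interp_remap([(0, 0), (0, 10), (10, 0), (10, 10)], (3, 3))
```

Storing only corner correspondences and interpolating the rest keeps the synthesis parameter table small while still giving a dense per-pixel remap.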
- image processing, such as adjusting the average overall luminance of each omnidirectional image 2610a to 2610d, may also be applied to enhance the quality of the combined image.
- FIG. 24 is an image diagram of a complementing process example to be added to the image captured by the omnidirectional imaging system according to the sixth embodiment of the present invention shown in FIG.
- the captured images shown in FIG. 24(a) can be regarded as the captured image shown in FIG. 20(a) of the fifth embodiment of the present invention divided into four.
- a blind spot complement image 2731, shown separately in FIG. 24(b), is synthesized. Blind spot areas exist in parts of each image because the other fisheye cameras are mutually reflected in it. The blind spot complementing unit 2605 cuts out and pastes parts of the omnidirectional image areas 2711 to 2714 in the captured images 2701 to 2704 to generate the blind spot complement image 2731.
- the blind spot complementing unit 2605 associates the partial image region 2721 of the omnidirectional image region 2711 with the portion 2721 'of the corresponding blind spot complementary image 2731.
- the partial image area 2722 of the omnidirectional image area 2712 is associated with the part 2722 'of the corresponding blind spot complementary image 2731.
- the partial image area 2723 of the omnidirectional image area 2713 is associated with the part 2723 'of the corresponding blind spot complementary image 2731.
- the partial image area 2724 of the omnidirectional image area 2714 is associated with the part 2724 'of the corresponding blind spot complementary image 2731.
- by establishing these correspondences, the blind spot complementing unit 2605 synthesizes the blind spot complement image 2731.
- although the composite image shown in FIG. 24 uses a 1/4 area from each fisheye camera image, in the sixth embodiment the fisheye camera images overlap considerably, so various area selection methods for the synthesis are conceivable. For example, one of the omnidirectional image areas 2711 to 2714 may be selected as a base, and only its blind spots may be filled in from the other omnidirectional image areas.
- FIG. 25 is an image diagram showing the relationship between the angle of incident light of equidistant projection and the imaging point in the photographed image.
- a fisheye lens 2901, a virtual viewpoint 2902 of the fisheye lens 2901, a virtual spherical surface 2903 centered on the virtual viewpoint 2902, incident light rays 2904 to 2906 arriving from various angles, a captured image 2907, and projection points 2908 to 2910 corresponding to the incident light rays 2904 to 2906 are shown.
- a virtual spherical surface 2903 is assumed centering on a virtual viewpoint 2902 of the fisheye lens 2901.
- light incident toward the virtual viewpoint 2902 from outside this spherical surface is projected by the fisheye lens 2901; incident light 2904 arriving from directly in front, at an angle of 0 degrees with the optical axis, is projected to the projection point 2908 at the center of the captured image 2907.
- incident light 2905 incident from a direction forming an angle of 90 degrees with the optical axis is projected to a projection point 2909 which is separated from the projection point 2908 at the center of the photographed image 2907 by a fixed distance.
- Light incident at an intermediate angle between the two incident lights is projected on the captured image 2907 at a position proportional to the incident angle.
- incident light 2906 incident from a direction forming an angle of 45 degrees with the optical axis is projected to a projection point 2910 which is a midpoint between the projection point 2908 and the projection point 2909.
- the virtual viewpoint 2902 has been described as one point, but more precisely, the virtual viewpoint 2902 moves a small distance on the optical axis according to the incident angle.
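The equidistant projection just described, in which the image radius grows in proportion to the incidence angle so that a 45-degree ray lands at the midpoint between the 0-degree and 90-degree projection points, can be sketched as follows. The function name and parameters are illustrative assumptions.

```python
import math

def equidistant_project(theta_deg, phi_deg, center, r90):
    """Map an incident ray (theta from the optical axis, phi around it)
    to image coordinates under the equidistant projection.

    r90 : image radius in pixels at which a 90-degree ray lands;
          a ray at angle theta lands at radius r90 * theta / 90.
    """
    r = r90 * theta_deg / 90.0
    cx, cy = center
    x = cx + r * math.cos(math.radians(phi_deg))
    y = cy + r * math.sin(math.radians(phi_deg))
    return x, y
```

With the image center at (100, 100) and a 90-degree radius of 100 pixels, a 45-degree ray at direction angle 0 lands at (150, 100), exactly halfway between the center point and the 90-degree projection point, matching the relationship shown in FIG. 25.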
- FIG. 26 is an image processing configuration diagram of the positioning unit 2808 in the sixth embodiment of the present invention.
- FIG. 26 corresponds to the imaging optical system and camera of FIG. 7 in the second embodiment of the present invention, quadrupled; reference numerals whose lower two digits (ignoring the letters a to d) are the same as in FIG. 7 denote corresponding elements.
- the four imaging optical systems and cameras are distinguished by appending a to d to the reference numerals.
- in the second embodiment, a single omnidirectional image 1210 contained the omnidirectional images of the main mirror and the plurality of sub mirrors; in FIG. 26, by contrast, an omnidirectional image 2810a to 2810d is obtained from each omnidirectional imaging device (i.e., each imaging optical system (fisheye lens) and camera pair) and input to the matching source image generation unit 2803 and the matching target image generation unit 2804.
- the matching image information calculation unit 2805 calculates the matching image information, by the matching image information calculation method described later, according to the given subject specification information and calibration data (including the three-dimensional position and orientation of each fisheye camera).
- the image matching unit 2806 searches the matching target image for a portion having the highest degree of match with the matching source image, and obtains and outputs coordinates in the matching target image.
- the positioning operation unit 2807 applies, to the coordinates in the matching target image output from the image matching unit 2806, an inverse transformation using the matching target image information 2811 output from the matching image information calculation unit 2805, to find the corresponding point coordinates in the omnidirectional images 2810a to 2810d. As in the second embodiment, the positioning operation unit 2807 also obtains the matching source coordinates from the matching source image information 2812. From these two sets of coordinates (the matching source coordinates and the corresponding point coordinates), the positioning operation unit 2807 calculates, according to the calibration data, two vectors directed toward the subject by the positioning operation method described later. Based on these two vectors, the positioning operation unit 2807 specifies the three-dimensional position of the subject according to the principle of triangulation and outputs it as the positioning result.
- FIG. 27 is a conceptual diagram for explaining positioning vector calculation in the sixth embodiment of the present invention.
- FIG. 27 shows two fisheye lenses 3001a and 3001b among the plurality of fisheye lenses, the virtual viewpoints 3002a and 3002b of the fisheye lenses 3001a and 3001b, and the virtual spherical surfaces 3003a and 3003b described with reference to FIG. 25. Also shown are the captured images 3004a and 3004b of the fisheye lenses 3001a and 3001b, a subject designation point 3005, an infinity point imaging point 3006, a virtual spherical intersection 3007, an omnidirectional image corresponding point 3008, and a subject 3010.
- the three-dimensional coordinate system (X, Y, Z) of the fisheye lens is taken with the optical axis of the lens as the Z axis and the X and Y axes in a plane perpendicular to the optical axis; the x and y axes of the image coordinate system (x, y) are parallel to the X and Y axes of the three-dimensional coordinate system of the fisheye lens.
- the direction angle φ with respect to the reference direction (for example, the x-axis direction) is obtained from the image coordinates of the subject designation point 3005, and the vector passing through the subject 3010 is then determined by θ, φ, and the virtual viewpoint 3002a.
- the infinity point imaging point 3006 for the fisheye lens 3001b is the point on the captured image 3004b at which θ and φ coincide with those of the subject designation point 3005; when fisheye lenses with identical characteristics are arranged facing the same direction, these coordinates coincide with the coordinates of the subject designation point 3005, so the image coordinates of the infinity point imaging point 3006 are also (x_t, y_t).
- the epipolar plane is the plane containing the virtual viewpoint 3002a, the virtual viewpoint 3002b, and the subject 3010.
- since the three-dimensional coordinates of the subject 3010 cannot yet be calculated, the epipolar plane can be taken as the plane containing an arbitrary point on the straight line connecting the virtual viewpoint 3002a and the subject 3010, for example the virtual spherical intersection 3007, which is the intersection of that line with the virtual spherical surface.
- the normal vector of the epipolar plane is required to generate the matching images; it is obtained as the outer (cross) product of the vector from the virtual viewpoint 3002a to the virtual spherical intersection 3007 and the vector from the virtual viewpoint 3002a to the virtual viewpoint 3002b.
- the normal vector of the epipolar plane, the image coordinates of the infinity point imaging point 3006, the number of the fisheye camera containing the infinity point imaging point 3006, and the like are output to the matching target image generation unit 2804 as the matching target image information 2811 in FIG. 26. Likewise, the normal vector of the epipolar plane, the image coordinates of the subject designation point 3005, and the like are output to the matching source image generation unit 2803 as the matching source image information 2812 in FIG. 26.
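The cross-product construction of the epipolar-plane normal described above can be sketched as follows; the function name is an illustrative assumption.

```python
import numpy as np

def epipolar_normal(viewpoint_a, viewpoint_b, sphere_point):
    """Unit normal of the epipolar plane spanned by the two virtual
    viewpoints and a point on the viewing ray (e.g. the virtual-sphere
    intersection), computed as the cross product of the two in-plane
    vectors rooted at viewpoint_a."""
    va = np.asarray(sphere_point, float) - np.asarray(viewpoint_a, float)
    vb = np.asarray(viewpoint_b, float) - np.asarray(viewpoint_a, float)
    n = np.cross(va, vb)
    return n / np.linalg.norm(n)
```

Any point on the ray from the first viewpoint toward the subject yields the same plane, which is why the (unknown-depth) virtual spherical intersection suffices.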
- the omnidirectional image corresponding point 3008 in FIG. 27 is the point obtained by mapping the coordinates in the matching target image, determined by image matching against the subject designation point 3005, back into the captured image 3004b that images all directions.
- the direction of incident light can be determined from the image coordinates of the omnidirectional image corresponding point 3008, and the subject 3010 is present on a straight line in the above direction passing through the virtual viewpoint 3002b.
- thus, the positioning operation unit 2807 can obtain the three-dimensional coordinates of the subject 3010 as the intersection of the two straight lines that pass through the subject 3010.
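The triangulation step can be sketched as below. Since two measured rays rarely intersect exactly, a common practical choice, assumed here rather than stated in the source, is to take the midpoint of the closest points on the two rays.

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Estimate a 3-D point from two viewing rays by finding the pair of
    closest points on the rays and returning their midpoint."""
    oa, ob = np.asarray(origin_a, float), np.asarray(origin_b, float)
    da = np.asarray(dir_a, float) / np.linalg.norm(dir_a)
    db = np.asarray(dir_b, float) / np.linalg.norm(dir_b)
    # Solve for ray parameters t, s minimizing |oa + t*da - (ob + s*db)|.
    c = db @ da                      # cosine between the ray directions
    w = oa - ob
    denom = 1.0 - c * c              # zero only for parallel rays
    t = (c * (w @ db) - (w @ da)) / denom
    s = ((w @ db) - c * (w @ da)) / denom
    return (oa + t * da + ob + s * db) / 2.0
```

With the two virtual viewpoints as origins and the incident-light directions recovered from the image coordinates, this returns the positioning result; for exactly intersecting rays the midpoint coincides with the intersection.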
- FIG. 28 is a block diagram showing a hardware configuration of a computer system that implements a blind spot complementing unit, a positioning unit, and the like included in the omnidirectional imaging system.
- the blind spot complementing unit, positioning unit, and the like provided in the omnidirectional imaging system are implemented by a computer 34, a keyboard 36 and a mouse 38 for giving instructions to the computer 34, a display 32 for presenting information such as the computation results of the computer 34, a CD-ROM (Compact Disc-Read Only Memory) device 40 for reading a program to be executed by the computer 34, and a communication modem (not shown).
- a program implementing the processes performed by the blind spot complementing unit and the positioning unit of the omnidirectional imaging system is stored on the CD-ROM 42, which is a computer-readable medium, and is read by the CD-ROM device 40. Alternatively, it is read by the communication modem 52 through a computer network.
- the computer 34 includes a central processing unit (CPU) 44, a read only memory (ROM) 46, a random access memory (RAM) 48, a hard disk 50, a communication modem 52, and a bus 54.
- the CPU 44 executes the program read via the CD-ROM device 40 or the communication modem 52.
- the ROM 46 stores programs and data necessary for the operation of the computer 34.
- the RAM 48 stores data such as parameters at the time of program execution.
- the hard disk 50 stores programs, data, and the like.
- the communication modem 52 communicates with other computers via a computer network.
- the bus 54 interconnects the CPU 44, the ROM 46, the RAM 48, the hard disk 50, the communication modem 52, the display 32, the keyboard 36, the mouse 38, and the CD-ROM device 40.
- the system LSI is an ultra-multifunction LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- IC card or module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- the IC card or module may include the above-described ultra-multifunctional LSI.
- the IC card or module achieves its functions by the microprocessor operating according to the computer program. This IC card or this module may be tamper resistant.
- the present invention may be the method described above.
- the present invention may also be a computer program that implements these methods on a computer, or a digital signal comprising the computer program.
- the present invention may also be a computer-readable recording medium on which the computer program or the digital signal is recorded, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc (registered trademark)), a memory card such as a USB memory or an SD card, or a semiconductor memory. Further, the present invention may be the digital signal recorded on these recording media.
- the computer program or the digital signal may be transmitted via a telecommunication line, a wireless or wired communication line, a network represented by the Internet, data broadcasting, and the like.
- the present invention may be a computer system comprising a microprocessor and a memory, wherein the memory stores the computer program, and the microprocessor operates according to the computer program.
- the omnidirectional imaging system according to the present invention enables the acquisition of images without blind spots and the positioning of subjects while securing a wide viewing angle, and is therefore useful as an omnidirectional camera for surveillance applications and the like. It can also be applied to applications such as detecting human flow lines.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Lenses (AREA)
- Stereoscopic And Panoramic Photography (AREA)
Abstract
Description
When the X-axis-Z-axis plane contains the central axis of the main mirror and the central axis of a first sub mirror, which is one of the plurality of sub mirrors, the central axis of the main mirror coincides with the Z axis, and the outer focal points of the hyperboloids of the main mirror and the first sub mirror coincide with the origin of the X-axis-Z-axis plane,
among the points on the outer periphery of the main mirror, the point at which the outer diameter is maximum is
When the X-axis-Z-axis plane contains the central axis of the main mirror and the central axis of a second sub mirror, which is one of the plurality of sub mirrors, the central axis of the main mirror coincides with the Z axis, and the outer focal points of the hyperboloids of the main mirror and the second sub mirror coincide with the origin of the X-axis-Z-axis plane,
the angle between the central axis of the main mirror and the central axis of the second sub mirror is θ, and among the points on the outer periphery of the main mirror, the point at which the outer diameter is maximum
FIG. 1 is a configuration diagram of the optical system of the omnidirectional imaging system according to the first embodiment of the present invention. More specifically, FIG. 1(a) is a side view of the omnidirectional imaging system, and FIG. 1(b) is an oblique top view of the omnidirectional imaging system.
FIG. 7 is an image processing configuration diagram of the positioning unit 1208 in the second embodiment of the present invention.
FIG. 9 is a schematic configuration diagram of the optical system of the omnidirectional imaging system according to the third embodiment of the present invention. FIG. 9 shows a plan view and a front view in vertical correspondence. Components identical to those in FIG. 1 of the first embodiment of the present invention are given the same last two digits, and their description is omitted.
FIG. 14 is a schematic configuration diagram of the optical system of the omnidirectional imaging system according to the fourth embodiment of the present invention. FIG. 14 shows a plan view and a front view in vertical correspondence. Components identical to those in FIG. 1 of the first embodiment of the present invention are given the same last two digits, and their description is omitted.
FIG. 18 is a configuration diagram of the optical system of the omnidirectional imaging system according to the fifth embodiment of the present invention.
FIG. 21 is an external view of the imaging system configuration of the omnidirectional imaging system according to the sixth embodiment of the present invention. In FIG. 21, omnidirectional images are captured by each of four omnidirectional imaging devices, namely four cameras 2411 to 2414 combined with four fisheye lenses 2401 to 2404. FIG. 22 is an image view of the images captured by the omnidirectional imaging system in the sixth embodiment of the present invention. That is, the video captured by the omnidirectional imaging system according to the present embodiment is visualized as shown in FIG. 22. Specifically, the video captured by the fisheye lens 2401 and the camera 2411 of FIG. 21 is captured as an omnidirectional image area 2511 in the captured image 2501. The video captured by the fisheye lens 2402 and the camera 2412 of FIG. 21 is captured as an omnidirectional image area 2512 in the captured image 2502. The video captured by the fisheye lens 2403 and the camera 2413 of FIG. 21 is captured as an omnidirectional image area 2513 in the captured image 2503. The video captured by the fisheye lens 2404 and the camera 2414 of FIG. 21 is captured as an omnidirectional image area 2514 in the captured image 2504. In each of these omnidirectional image areas 2511 to 2514, the three fisheye lenses other than the one that captured the image are reflected, forming blind spots in the omnidirectional image. Since each blind spot portion is captured by the fisheye lens reflected there, a complementary composite image, that is, a single image without blind spot areas, can be generated by compositing the blind-spot-free portions of the captured images 2501 to 2504 taken by these four fisheye-lens/camera pairs.
34 Computer
36 Keyboard
38 Mouse
40 CD-ROM device
42 CD-ROM
44 CPU
46 ROM
48 RAM
50 Hard disk
52 Communication modem
54 Bus
101, 301, 803a, 903a, 1001, 1301, 1701 Main mirror
102, 302, 803b, 903b, 1002, 1302, 1702 Sub mirror
103, 1003, 1303, 1703, 2103 Approximately plane mirror
104, 202, 510, 1004, 1202, 1704, 2104, 2411 to 2414, 2602a to 2602d, 2802a to 2802d Camera
105, 1305 Viewpoint
201, 1201, 2601a to 2601d, 2801a to 2801d Imaging optical system
203, 2603 Image synthesis unit
204, 2604 Synthesis parameter calculation unit
205, 2605 Blind spot complementing unit
210, 1210, 2610a to 2610d, 2810a to 2810d Omnidirectional image
303 Light-receiving lens system
304 Imaging surface
305 Transparent tube
401 Omnidirectional mirror
402a Hemispherical mirror
402b Spherical mirror
403 Imaging device
512 Lens optical system
514 Imaging element
600, 700, 1400, 1500, 1600, 1800, 1900, 2000 Main mirror image area
601, 701 Blind spot area
602 to 605, 702 to 705, 1402 to 1405, 1502 to 1505, 1602 to 1605, 1802 to 1807, 1902 to 1907, 2002 to 2007 Sub mirror image area
606 Partial image area of sub mirror image area 602
606' Part of blind spot area 601 (corresponding to partial image area 606)
607 Partial image area of sub mirror image area 603
607' Part of blind spot area 601 (corresponding to partial image area 607)
608 Partial image area of sub mirror image area 604
608' Part of blind spot area 601 (corresponding to partial image area 608)
609 Partial image area of sub mirror image area 605
609' Part of blind spot area 601 (corresponding to partial image area 609)
711 Partial image area of sub mirror image area 702
711' Part of blind spot area 701 (corresponding to partial image area 711)
712 Partial image area of sub mirror image area 704
712' Part of blind spot area 701 (corresponding to partial image area 712)
801a Outer focal point of main mirror 803a
801b Outer focal point of sub mirror 803b
802a Inner focal point of main mirror 803a
802b Inner focal point of sub mirror 803b
804a Main mirror effective radius
804b Sub mirror effective radius
805a, 905a Main mirror effective edge point
805b, 1105b Sub mirror effective edge point
806, 906, 1106 Sub mirror vertex
901, 1101 Outer focal point of main and sub mirrors
902a, 1310 Main mirror inner focal point
907, 1107 Sub mirror tilt angle
908, 1108 Inner limit angle
909 Lower limit straight line
910 Inner limit straight line
1102b, 1312 Sub mirror inner focal point
1111 Outer reference straight line
1112 Outer reference angle
1114 Sub mirror angle
1203, 2803 Matching source image generation unit
1204, 2804 Matching target image generation unit
1205, 2805 Matching image information calculation unit
1206, 2806 Image matching unit
1207, 2807 Positioning operation unit
1208, 2808 Positioning unit
1211, 2811 Matching target image information
1212, 2812 Matching source image information
1306 Imaging surface of camera
1307, 3005 Subject designation point
1308 Intermediate point A on approximately plane mirror
1309 Intermediate point on main mirror
1311, 3010 Subject
1313 Sub mirror intermediate point A
1314 Intermediate point B on approximately plane mirror
1315, 3006 Infinity point imaging point
1316, 3008 Corresponding point in omnidirectional image
1317 Intermediate point C on approximately plane mirror
1318 Intermediate point B on sub mirror
1401 Blind spot area of main mirror image area 1400
1406 Partial image area of sub mirror image area 1402
1406' Part of blind spot area 1401 (corresponding to partial image area 1406)
1407 Partial image area of sub mirror image area 1403
1407' Part of blind spot area 1401 (corresponding to partial image area 1407)
1408 Partial image area of sub mirror image area 1404
1408' Part of blind spot area 1401 (corresponding to partial image area 1408)
1409 Partial image area of sub mirror image area 1405
1409' Part of blind spot area 1401 (corresponding to partial image area 1409)
1501 Blind spot area of main mirror image area 1500
1511 Partial image area of sub mirror image area 1502
1511' Part of blind spot area 1501 (corresponding to partial image area 1511)
1512 Partial image area of sub mirror image area 1504
1512' Part of blind spot area 1501 (corresponding to partial image area 1512)
1601 Blind spot area of main mirror image area 1600
1606 Partial image area of sub mirror image area 1602
1606' Part of blind spot area 1601 (corresponding to partial image area 1606)
1607 Partial image area of sub mirror image area 1603
1607' Part of blind spot area 1601 (corresponding to partial image area 1607)
1608 Partial image area of sub mirror image area 1604
1608' Part of blind spot area 1601 (corresponding to partial image area 1608)
1609 Partial image area of sub mirror image area 1605
1609' Part of blind spot area 1601 (corresponding to partial image area 1609)
1801 Blind spot area of main mirror image area 1800
1808 Partial image area of sub mirror image area 1802
1808' Part of blind spot area 1801 (corresponding to partial image area 1808)
1809 Partial image area of sub mirror image area 1803
1809' Part of blind spot area 1801 (corresponding to partial image area 1809)
1810 Partial image area of sub mirror image area 1804
1810' Part of blind spot area 1801 (corresponding to partial image area 1810)
1811 Partial image area of sub mirror image area 1805
1811' Part of blind spot area 1801 (corresponding to partial image area 1811)
1812 Partial image area of sub mirror image area 1806
1812' Part of blind spot area 1801 (corresponding to partial image area 1812)
1813 Partial image area of sub mirror image area 1807
1813' Part of blind spot area 1801 (corresponding to partial image area 1813)
1901 Blind spot area of main mirror image area 1900
1909 Partial image area of sub mirror image area 1903
1909' Part of blind spot area 1901 (corresponding to partial image area 1909)
1911 Partial image area of sub mirror image area 1905
1911' Part of blind spot area 1901 (corresponding to partial image area 1911)
1913 Partial image area of sub mirror image area 1907
1913' Part of blind spot area 1901 (corresponding to partial image area 1913)
2001 Blind spot area of main mirror image area 2000
2008 Partial image area of sub mirror image area 2002
2008' Part of blind spot area 2001 (corresponding to partial image area 2008)
2009 Partial image area of sub mirror image area 2003
2009' Part of blind spot area 2001 (corresponding to partial image area 2009)
2010 Partial image area of sub mirror image area 2004
2010' Part of blind spot area 2001 (corresponding to partial image area 2010)
2011 Partial image area of sub mirror image area 2005
2011' Part of blind spot area 2001 (corresponding to partial image area 2011)
2012 Partial image area of sub mirror image area 2006
2012' Part of blind spot area 2001 (corresponding to partial image area 2012)
2013 Partial image area of sub mirror image area 2007
2013' Part of blind spot area 2001 (corresponding to partial image area 2013)
2102 Plurality of hyperboloid mirrors
2201, 2301, 2701 to 2704, 2907 Captured image
2202 to 2205, 2302 to 2305 Hyperboloid mirror image area
2206 Image of the camera itself
2307 Partial image area of hyperboloid mirror image area 2302
2307' Part of blind spot complement image 2311 (corresponding to partial image area 2307)
2308 Partial image area of hyperboloid mirror image area 2303
2308' Part of blind spot complement image 2311 (corresponding to partial image area 2308)
2309 Partial image area of hyperboloid mirror image area 2304
2309' Part of blind spot complement image 2311 (corresponding to partial image area 2309)
2310 Partial image area of hyperboloid mirror image area 2305
2310' Part of blind spot complement image 2311 (corresponding to partial image area 2310)
2311, 2731 Blind spot complement image
2401 to 2404, 2901, α1 Fisheye lens
2501 to 2504 Captured image taken with the fisheye lens 2401 and the camera 2411
2511 to 2514, 2711 to 2714 Omnidirectional image area
2721 Partial image area of omnidirectional image area 2711
2721' Part of blind spot complement image 2731 (corresponding to partial image area 2721)
2722 Partial image area of omnidirectional image area 2712
2722' Part of blind spot complement image 2731 (corresponding to partial image area 2722)
2723 Partial image area of omnidirectional image area 2713
2723' Part of blind spot complement image 2731 (corresponding to partial image area 2723)
2724 Partial image area of omnidirectional image area 2714
2724' Part of blind spot complement image 2731 (corresponding to partial image area 2724)
2902 Virtual viewpoint of fisheye lens
2903 Virtual spherical surface
2904 to 2906 Incident light
2908 Projection point corresponding to incident light 2904
2909 Projection point corresponding to incident light 2905
2910 Projection point corresponding to incident light 2906
3001a Fisheye lens
3001b Fisheye lens
3002a Virtual viewpoint of fisheye lens
3002b Virtual viewpoint of fisheye lens
3003a Virtual spherical surface of fisheye lens
3003b Virtual spherical surface of fisheye lens
3004a Image captured by fisheye lens
3004b Image captured by fisheye lens
3007 Intersection on virtual spherical surface
δ Mirror of revolution
Claims (27)
- A main mirror formed of a hyperboloid mirror;
a plurality of sub mirrors, each formed of a hyperboloid mirror, arranged around the main mirror; and
a camera that captures the image reflected by the main mirror and the images reflected by the plurality of sub mirrors,
wherein the outer focal point of the hyperboloid of the main mirror and the outer focal points of the hyperboloids of the plurality of sub mirrors substantially coincide, and
the camera is arranged such that the viewpoint of the camera, which is the entrance pupil position of a lens attached to the camera, substantially coincides with the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors.
An omnidirectional imaging system. - The system further comprising a substantially planar mirror of approximately plane shape, arranged between the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors and the inner focal points of the hyperboloids of the main mirror and the plurality of sub mirrors,
wherein the camera is arranged such that the viewpoint of the camera is located at a position symmetrical to the outer focal points of the hyperboloids of the main mirror and the plurality of sub mirrors with respect to the substantially planar mirror, and
external light is reflected by the main mirror or the plurality of sub mirrors and further reflected by the substantially planar mirror so as to enter the camera.
The omnidirectional imaging system according to claim 1. - Where the hyperboloid equation representing the shape of the hyperboloid of the main mirror is
the X-axis-Z-axis plane contains the central axis of the main mirror and the central axis of a second sub mirror, which is one of the plurality of sub mirrors, the central axis of the main mirror coincides with the Z axis, and the outer focal points of the hyperboloids of the main mirror and the second sub mirror coincide with the origin of the X-axis-Z-axis plane,
the angle between the central axis of the main mirror and the central axis of the second sub mirror is θ, and among the points on the outer periphery of the main mirror, the point at which the outer diameter is maximum
The omnidirectional imaging system according to any one of claims 1 to 3. - The system further comprising a blind spot complementing unit that generates a complementary composite image, which is an image for complementing a blind spot area in the image reflected by the main mirror, from the images reflected by the plurality of sub mirrors.
The omnidirectional imaging system according to any one of claims 1 to 4. - The blind spot complementing unit generates, using the images reflected by the plurality of sub mirrors, an arbitrary viewpoint image whose viewpoint is placed at the inner focal point of the main mirror, and generates the complementary composite image from the arbitrary viewpoint image.
The omnidirectional imaging system according to claim 5. - The system further comprising a positioning unit that, for a specified point on the image reflected by the main mirror, calculates by image matching a corresponding point, which is a point corresponding to the subject at the specified point and is a point on the image reflected by any one of the plurality of sub mirrors, and measures the position of the subject from the coordinates of the specified point and the coordinates of the corresponding point.
The omnidirectional imaging system according to any one of claims 1 to 4. - The system further comprising a positioning unit that, for a specified point on the image reflected by any one of the plurality of sub mirrors, calculates by image matching a corresponding point, which is a point corresponding to the subject at the specified point and is a point on the image reflected by the main mirror, and measures the position of the subject from the coordinates of the specified point and the coordinates of the corresponding point.
The omnidirectional imaging system according to any one of claims 1 to 4. - The system further comprising a positioning unit that, for a specified point on the image reflected by any one of the plurality of sub mirrors, calculates by image matching a corresponding point, which is a point corresponding to the subject at the specified point and is a point on the image reflected by any one of the sub mirrors other than that sub mirror, and measures the position of the subject from the coordinates of the specified point and the coordinates of the corresponding point.
The omnidirectional imaging system according to any one of claims 1 to 4. - The system further comprising a blind spot complementing unit that generates a complementary composite image, which is an image for complementing a blind spot area in the image reflected by the main mirror, from images reflected by at least some of the plurality of sub mirrors.
The omnidirectional imaging system according to any one of claims 7 to 9. - The blind spot complementing unit generates the complementary composite image for complementing the blind spot area by generating, using images reflected by at least some of the plurality of sub mirrors, an arbitrary viewpoint image whose viewpoint is placed at the inner focal point of the main mirror.
The omnidirectional imaging system according to claim 10. - The blind spot complementing unit records the correspondence between the images reflected by the sub mirrors used in generating the complementary composite image and the areas in the complementary composite image, and
the positioning unit calculates by image matching a first point, which is a point on the image reflected by a first sub mirror corresponding to the area in the complementary composite image that contains a point specified on the complementary composite image, and is a point corresponding to the subject at the point specified on the complementary composite image, and a second point, which is a point on the image reflected by any one of the plurality of sub mirrors other than the first sub mirror and is a point corresponding to the subject, and measures the position of the subject from the first point and the second point.
The omnidirectional imaging system according to claim 5 or claim 6. - A plurality of hyperboloid mirrors; and
a camera that captures the images reflected by the plurality of hyperboloid mirrors,
wherein the plurality of hyperboloid mirrors are arranged with their central axes tilted such that the outer focal points of the respective hyperboloid mirrors substantially coincide, and
the camera is arranged such that the viewpoint of the camera, which is the entrance pupil position of a lens attached to the camera, substantially coincides with the outer focal points of the plurality of hyperboloid mirrors.
An omnidirectional imaging system. - The system further comprising a substantially planar mirror of approximately plane shape, arranged between the outer focal points of the plurality of hyperboloid mirrors, which substantially coincide with the viewpoint of the camera, and the inner focal points of the plurality of hyperboloid mirrors,
wherein the camera is arranged such that the viewpoint of the camera is located at a position symmetrical to the outer focal points of the plurality of hyperboloid mirrors with respect to the substantially planar mirror, and
external light is reflected by each of the plurality of hyperboloid mirrors and further reflected by the substantially planar mirror so as to enter the camera.
The omnidirectional imaging system according to claim 13. - The system further comprising a blind spot complementing unit that generates a complementary composite image, which is an image without blind spot areas, by compositing images reflected by at least some of the plurality of hyperboloid mirrors.
The omnidirectional imaging system according to claim 13 or claim 14. - The blind spot complementing unit generates the complementary composite image by complementing a blind spot area on the image reflected by any one of the plurality of hyperboloid mirrors using images reflected by at least some of the other hyperboloid mirrors.
The omnidirectional imaging system according to claim 15. - The blind spot complementing unit generates, using images reflected by at least some of the plurality of hyperboloid mirrors, an arbitrary viewpoint image whose viewpoint is placed at the single viewpoint position of the hyperboloid mirror to be complemented, and generates the complementary composite image using the arbitrary viewpoint image.
The omnidirectional imaging system according to claim 15 or claim 16. - The system further comprising a positioning unit that, for a specified point on the image reflected by any one of the plurality of hyperboloid mirrors, calculates by image matching a corresponding point, which is a point corresponding to the subject at the specified point and is a point on the image reflected by any one of the hyperboloid mirrors other than that hyperboloid mirror, and measures the position of the subject from the coordinates of the specified point and the coordinates of the corresponding point.
The omnidirectional imaging system according to claim 13 or claim 14. - The system further comprising a blind spot complementing unit that generates a complementary composite image, which is an image without blind spot areas, by compositing images reflected by at least some of the plurality of hyperboloid mirrors.
The omnidirectional imaging system according to claim 18. - The blind spot complementing unit generates, using images reflected by at least some of the plurality of hyperboloid mirrors, an arbitrary viewpoint image at the viewpoint of the omnidirectional image, and generates the complementary composite image using the arbitrary viewpoint image.
The omnidirectional imaging system according to claim 19. - The blind spot complementing unit records the correspondence between the images reflected by the hyperboloid mirrors used in generating the complementary composite image and the areas in the complementary composite image, and
the positioning unit calculates by image matching a first point, which is a point on the image reflected by a first hyperboloid mirror corresponding to the area in the complementary composite image that contains a point specified on the complementary composite image, and is a point corresponding to the subject at the point specified on the complementary composite image, and a second point, which is a point on the image reflected by any one of the plurality of hyperboloid mirrors other than the first hyperboloid mirror and is a point corresponding to the subject, and measures the position of the subject from the first point and the second point.
The omnidirectional imaging system according to claim 19 or claim 20. - The blind spot complementing unit complements a blind spot area on the image reflected by any one of the plurality of hyperboloid mirrors using images reflected by at least some of the other hyperboloid mirrors.
The omnidirectional imaging system according to claim 19. - The blind spot complementing unit generates, using images reflected by at least some of the plurality of hyperboloid mirrors, an arbitrary viewpoint image whose viewpoint is placed at the single viewpoint position of the hyperboloid mirror to be complemented, and generates the complementary composite image using the arbitrary viewpoint image.
The omnidirectional imaging system according to claim 22. - The blind spot complementing unit records the correspondence between the images reflected by the hyperboloid mirrors used in generating the complementary composite image and the areas in the complementary composite image, and
the positioning unit calculates by image matching a first point, which is a point on the image reflected by a first hyperboloid mirror corresponding to the area in the complementary composite image that contains a point specified on the complementary composite image, and is a point corresponding to the subject at the point specified on the complementary composite image, and a second point, which is a point on the image reflected by any one of the plurality of hyperboloid mirrors other than the first hyperboloid mirror and is a point corresponding to the subject, and measures the position of the subject from the first point and the second point.
The omnidirectional imaging system according to claim 22 or claim 23. - A plurality of omnidirectional imaging devices that capture omnidirectional images; and
a blind spot complementing unit that generates a complementary composite image, which is an image without blind spot areas, by compositing images captured by at least some of the plurality of omnidirectional imaging devices.
An omnidirectional imaging system. - The system further comprising a positioning unit that, for a specified point on the image captured by any one of the plurality of omnidirectional imaging devices, calculates by image matching a corresponding point, which is a point corresponding to the subject at the specified point and is a point on the image captured by any one of the omnidirectional imaging devices other than that omnidirectional imaging device, and measures the position of the subject from the coordinates of the specified point and the coordinates of the corresponding point.
The omnidirectional imaging system according to claim 25. - The blind spot complementing unit records the correspondence between the images captured by the omnidirectional imaging devices used in generating the complementary composite image and the areas in the complementary composite image, and
the positioning unit calculates by image matching a first point, which is a point on the image captured by a first omnidirectional imaging device corresponding to the area in the complementary composite image that contains a point specified on the complementary composite image, and is a point corresponding to the subject at the point specified on the complementary composite image, and a second point, which is a point on the image captured by any one of the plurality of omnidirectional imaging devices other than the first omnidirectional imaging device and is a point corresponding to the subject, and measures the position of the subject from the first point and the second point.
The omnidirectional imaging system according to claim 26.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/391,667 US9244258B2 (en) | 2010-06-24 | 2011-06-24 | Omnidirectional imaging system |
JP2011553188A JP5728393B2 (ja) | 2010-06-24 | 2011-06-24 | 全方位撮影システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-144297 | 2010-06-24 | ||
JP2010144297 | 2010-06-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011161973A1 true WO2011161973A1 (ja) | 2011-12-29 |
Family
ID=45371178
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/003615 WO2011161973A1 (ja) | 2010-06-24 | 2011-06-24 | 全方位撮影システム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9244258B2 (ja) |
JP (1) | JP5728393B2 (ja) |
WO (1) | WO2011161973A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014073262A1 (ja) * | 2012-11-07 | 2014-05-15 | シャープ株式会社 | Imaging element position detection device |
JP5951793B2 (ja) * | 2012-11-07 | 2016-07-13 | シャープ株式会社 | Imaging element position detection device |
JP2014095808A (ja) * | 2012-11-09 | 2014-05-22 | Nintendo Co Ltd | Image generation method, image display method, image generation program, image generation system, and image display device |
CN107024828A (zh) * | 2017-03-29 | 2017-08-08 | 深圳市未来媒体技术研究院 | Camera device with shared optical center, panoramic seamless stitching assembly and method |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6065474B2 (ja) * | 2012-09-11 | 2017-01-25 | 株式会社リコー | Imaging control device, imaging control method, and program |
CN104469167B (zh) * | 2014-12-26 | 2017-10-13 | 小米科技有限责任公司 | Autofocus method and device |
US10043237B2 (en) * | 2015-08-12 | 2018-08-07 | Gopro, Inc. | Equatorial stitching of hemispherical images in a spherical image capture system |
GB201610786D0 (en) * | 2016-06-21 | 2016-08-03 | Observant Tech Ltd | Image capturing apparatus and image reconstruction method of reducing a blind spot |
US20180089822A1 (en) * | 2016-09-15 | 2018-03-29 | Pervacio Inc. | Automated diagnostics for device display and camera |
US10748333B2 (en) | 2017-03-23 | 2020-08-18 | Nvidia Corporation | Finite aperture omni-directional stereo light transport |
US10623727B1 (en) * | 2019-04-16 | 2020-04-14 | Waymo Llc | Calibration systems usable for distortion characterization in cameras |
CN111665517B (zh) * | 2020-05-29 | 2022-11-18 | 同济大学 | Density-statistics-based denoising method and device for single-photon laser altimetry data |
KR102585785B1 (ko) * | 2021-05-31 | 2023-10-13 | 한국기계연구원 | Multilateration system based on absolute distance measurement and multilateration method using the same |
EP4124014A1 (en) * | 2021-07-20 | 2023-01-25 | Spiideo AB | Devices and methods for wide field of view image capture |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006330735A (ja) * | 2005-05-26 | 2006-12-07 | Korea Advanced Inst Of Sci Technol | Omnidirectional binocular-vision image acquisition device using a single camera |
JP2008537157A (ja) * | 2005-04-18 | 2008-09-11 | シャープ株式会社 | Panoramic three-dimensional adapter for optical instruments and combination of the panoramic three-dimensional adapter with an optical instrument |
WO2009057409A1 (ja) * | 2007-10-30 | 2009-05-07 | Olympus Corporation | Endoscope apparatus |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3523783B2 (ja) | 1998-05-14 | 2004-04-26 | 康史 八木 | Omnidirectional viewing angle sensor |
JP4554954B2 (ja) | 2004-02-19 | 2010-09-29 | 康史 八木 | Omnidirectional imaging system |
JP2006220603A (ja) | 2005-02-14 | 2006-08-24 | Tateyama Machine Kk | Imaging device |
JP4801654B2 (ja) * | 2007-11-30 | 2011-10-26 | パナソニック株式会社 | Composite image generation device |
JP2010181826A (ja) * | 2009-02-09 | 2010-08-19 | パナソニック株式会社 | Stereoscopic image forming device |
US20110214082A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Projection triggering through an external marker in an augmented reality eyepiece |
2011
- 2011-06-24 WO PCT/JP2011/003615 patent/WO2011161973A1/ja active Application Filing
- 2011-06-24 US US13/391,667 patent/US9244258B2/en active Active
- 2011-06-24 JP JP2011553188A patent/JP5728393B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2011161973A1 (ja) | 2013-08-19 |
JP5728393B2 (ja) | 2015-06-03 |
US9244258B2 (en) | 2016-01-26 |
US20120147183A1 (en) | 2012-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011161973A1 (ja) | Omnidirectional imaging system | |
JP4825980B2 (ja) | Calibration method for fisheye cameras | |
JP2019097178A (ja) | Imaging device, image processing device, and method | |
US7420750B2 (en) | Catadioptric single camera systems having radial epipolar geometry and methods and means thereof | |
JP4825971B2 (ja) | Distance calculation device, distance calculation method, structure analysis device, and structure analysis method | |
WO2014208230A1 (ja) | Coordinate calculation device and method, and image processing device and method | |
EP3765815B1 (en) | Imaging device, image processing apparatus, and image processing method | |
JP2006330735A (ja) | Omnidirectional binocular-vision image acquisition device using a single camera | |
JP2018189636A (ja) | Imaging device, image processing method, and program | |
JP6674643B2 (ja) | Image processing device and image processing method | |
US10255664B2 (en) | Image processing device and method | |
JP2017208606A (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP4679293B2 (ja) | Vehicle-mounted panoramic camera system | |
JP5169787B2 (ja) | Image conversion device and image conversion method | |
JP2010176325A (ja) | Arbitrary-viewpoint image generation device and arbitrary-viewpoint image generation method | |
WO2016175043A1 (ja) | Image processing device and image processing method | |
JPH1195344A (ja) | Omnidirectional stereo image capturing device | |
JP6674644B2 (ja) | Image processing device and image processing method | |
JP2011087319A (ja) | Vehicle-mounted panoramic camera system | |
WO2017117039A1 (en) | Omnidirectional catadioptric lens with odd aspheric contour or multi-lens | |
US20190230264A1 (en) | Image capturing apparatus and image capturing method | |
JP2005275789A (ja) | Three-dimensional structure extraction method | |
JP6684454B2 (ja) | Image processing device and image processing method | |
JP2014116867A (ja) | Stereoscopic display system, stereoscopic image generation device, and stereoscopic image generation program | |
WO2023135842A1 (ja) | Distance measuring device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 2011553188; Country of ref document: JP |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11797861; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 13391667; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11797861; Country of ref document: EP; Kind code of ref document: A1 |