WO2001068540A2 - Imaging apparatus - Google Patents

Imaging apparatus

Info

Publication number
WO2001068540A2
WO2001068540A2 (application PCT/GB2001/001115)
Authority
WO
WIPO (PCT)
Prior art keywords
means
image
light
reflecting
apparatus according
Prior art date
Application number
PCT/GB2001/001115
Other languages
French (fr)
Other versions
WO2001068540A3 (en)
Inventor
Lee Scott Friend
Original Assignee
Lee Scott Friend
Priority date
Filing date
Publication date
Priority to GB0006396.6 priority Critical
Priority to GB0006396A priority patent/GB2360413A/en
Priority to GBGB0018017.4A priority patent/GB0018017D0/en
Priority to GB0018017.4 priority
Priority to GB0019850A priority patent/GB0019850D0/en
Priority to GB0019850.7 priority
Priority to GB0021433.8 priority
Priority to GB0021433A priority patent/GB0021433D0/en
Priority to GB0023786A priority patent/GB0023786D0/en
Priority to GB0023786.7 priority
Priority to GB0028094A priority patent/GB0028094D0/en
Priority to GB0028094.1 priority
Application filed by Lee Scott Friend filed Critical Lee Scott Friend
Priority claimed from AU7264701A external-priority patent/AU7264701A/en
Publication of WO2001068540A2 publication Critical patent/WO2001068540A2/en
Publication of WO2001068540A3 publication Critical patent/WO2001068540A3/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/06Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/17Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/06Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe involving anamorphosis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/2251Constructional details
    • H04N5/2254Mounting of optical parts, e.g. lenses, shutters, filters or optical parts peculiar to the presence or use of an electronic image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/2259Means for changing the camera field of view without moving the camera body, e.g. nutating or panning optics or image-sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23238Control of image capture or reproduction to achieve a very large field of view, e.g. panorama
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0088Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image

Abstract

Imaging apparatus comprising two imaging units (12) for producing images of respective scenes, each imaging unit comprising optical means for gathering light over a wide sector and directing it to image sensing means (14). The optical means of the imaging units can be back-to-back so as to encompass the image sensing means (14) and each unit can include a convex reflector (16) for reflecting light from a panoramic scene onto a planar reflector (18) which reflects light through a port or ports (24) in the convex reflector (16) onto a CCD array (14) within the convex reflector. Other reflecting systems include convex, concave and planar reflecting surfaces. Also disclosed are arrangements for reducing backworkings in the back-to-back fisheye lenses; a CCD with a curved image sensing surface; light shielding means; a method of generating position signals for navigating round a panoramic scene; and a device including a segmental field of view and an annular field of view which are combined to provide a panospherical image; the image output is processed to form a composite image in a flat plane.

Description

IMAGING APPARATUS

This invention relates to imaging apparatus, in particular to apparatus for producing an image of a panoramic or omnidirectional scene. The invention can be used, for example, to provide a relatively simple and cost-effective surveillance system for monitoring a large zone from a single vantage point. However, it can also be used wherever there is a need to collect image input information over a wide sector, especially over 180 or 360 deg., in any of three orthogonal planes, e.g. by means of imaging devices such as a video camera or CCD array (or in some cases by film). For example, a large zone can be monitored on a single screen using only one stationary imaging apparatus. Conventional surveillance systems require many video cameras to be distributed throughout the zone to be monitored, and these are often panned and tilted to observe the whole scene. Alternatively, the invention can be applied in broadcast or transmission systems, such as television. It can also be mounted on a land, sea or air vehicle to provide an all-round field of view. The resultant images may be transmitted over the Internet, a wide or local area network, or via satellite, cable or terrestrial digital or analogue video systems. The invention can also be applied in a 360 by 360 deg. interactive film or video system.

US-A-5760826 discloses an omnidirectional imaging apparatus comprising paraboloid reflectors arranged either singly for reflecting an image of a hemispherical scene onto a camera, or doubly, i.e. back-to-back, to reflect a spherical image onto a camera. However, the disadvantages are that the cameras are located "outside" the paraboloid reflectors, i.e. spaced apart from and looking at each reflector, and they are widely spaced apart since they are arranged on the longitudinal axis passing through each reflector. This makes the system difficult to install and unsuited for locations where an unobtrusive, compact and enclosed unit is needed. Attempts to make a portable unit result in having a camera supported on an extended post passing through the axis of the paraboloid mirror, the camera being located remotely from the mirror so as not to block the "overhead" sector of the panoramic view. This results in a cumbersome elongated structure, especially when two paraboloid mirrors are joined back-to-back and respective cameras are mounted on long posts which extend in opposite directions away from the mirrors.

One aspect of the invention solves the latter problem by providing imaging apparatus which comprises at least two imaging units, each unit including optical means with a wide field of view, the apparatus also including image sensing means for receiving an image from the optical means so as to provide a corresponding output; the imaging units being adjacently positioned (a) so as to encompass either the image sensing means, or image diverting means, which diverts images to the image sensing means, and (b) so that the output can be used to form a single composite image in the same plane.

The apparatus would normally include processing means for processing the output of the image sensing means so as to form a single composite image in the same plane, and such processing means can be of known construction and operation (as explained herein). The fields of view of the respective optical means are generally directed adjacently so that the images captured by each have a common adjacent boundary, e.g. the equatorial boundary in back-to-back hemispheres, and/or the adjoining edges of planar facets, whereby a single continuous image of the whole, or part, of the panoramic scene can be formed on said image plane.

An advantage of the invention is that it can make the structure of the omnidirectional imaging apparatus more compact and robust. In some embodiments of the invention, for example where back to back convex reflectors are used, the imaging units or reflectors form a shell, which houses the image sensing means, such as a CCD, within an "inner region". In another embodiment, for example where back to back fisheye lens systems are used, each fisheye lens is effectively separated into two parts, whereby a part of each fisheye lens system is outside the inner region encompassed by the back to back units; the image deflecting means is in the "inner region"; and light is reflected onto the other part of each lens system outside the inner region. An advantage is that the length (along the common lens axis) between the outermost regions of the back to back fisheye lens systems is greatly reduced. (Fisheye lens systems generally have many different lenses arranged in a sequence in the system. These are known as the "backworkings", and they can be lengthy compared to ordinary lenses, so back-to-back units are cumbersome.)

Preferably, the imaging units are arranged back-to-back on a common axis of symmetry, whereby the imaging units provide a housing for the image sensing means, and hence the advantages of protecting the image sensing means and of a more robust structure, whilst also providing for 360 x 360 omnidirectional viewing.

Where lenses are used to gather light from reflecting surfaces, these can be telecentric, such as telecentric video lenses (including the L52-271 and others marketed by Edmund Industrial Optics) which provide constant magnification over a defined working distance range and help to reduce or eliminate perspective distortion and magnification error in observing 3D objects.

Some embodiments of the invention use reflecting systems to capture images from a panoramic scene, each imaging unit including first and second reflecting means, the first being convex for reflecting an image from a panoramic scene onto the second which, in turn, reflects the image onto the image sensing means. For example, the first reflecting means can be hemispherical, parabolic, hyperbolic, ellipsoidal, or of a polygonal type where the polygon includes a plurality of planar or curved reflecting facets surrounding a central axis through the convex reflecting means. These can be whole or frusto-convex (truncated) sections. The second reflector can be planar, concave or convex and any of the hemispherical, parabolic, hyperbolic, ellipsoidal shapes. Where the first and second reflecting means have a curvature, these are in confocal relationships. A particularly useful reflective system is disclosed in US 4012126 (Rosendahl).

In one example (a) of a polygonal structure, a convex pyramidal reflecting system has an aperture or light port in its apex which is aligned with the centre of the base of the pyramid on the axis of symmetry. It also has a planar reflector for reflecting light, from each facet or side of the pyramid, through the light port onto the image sensing means. In another example (b), each of the facets of a convex pyramidal reflecting system has an aperture, preferably at a mid-point, and sensing means located beneath the aperture so as to view the reflection of the respective planar facet in the planar mirror. In another example (c), a system for gathering light is located beneath an aperture in the facet and the light is conducted, via a light guide, such as an optic fibre, to the sensing means. A plurality of such light guides can be connected to a manifold for composing respective parts of the panoramic scene (i.e. the images in the facets) directly onto the image plane of a camera or CCD. Appropriate lenses can be used for focussing as may be necessary. In a further example (d), a system for gathering light is located directly opposite the facet and this light is conducted, via a light guide, to the sensing means.

A plurality of such light guides and a manifold for composing respective parts of the panoramic scene (i.e. the images in the facets) directly onto the image plane can be used as before. In yet another example (e), the convex reflector comprises a (first) set of facets, which are sides of a polygonal convex reflector having an axis of symmetry, where the facets symmetrically surround the central axis so as to reflect respective parts of the panoramic scene, the second reflecting means includes a second set of facets for reflecting light through a light port in the apex of the convex reflector, and the second set of facets are arranged to reflect only the light which is incident on them from respective facets in the first set, whereby the image sensing means separately and respectively receive light from those parts of the panoramic scene reflected in the first set of facets (of the convex reflector).

In any of these examples, light from the panoramic scene is reflected by the planar facets of the convex reflector (i) either onto the planar reflector, whereby each sensing means views a respective part of the panoramic scene seen by the corresponding facet, (ii) or onto the sensing means, such as a CCD, or onto a light gathering system, such as an optic fibre leading to a CCD. This means that only the light from a particular facet will be incident on a respective part of a common imaging surface (CCD), or on a respective imaging surface of one of several CCDs. This arrangement is important for composing the composite image on the image sensing means, because it is not generally possible to image the reflection in all facets by just looking down on a pyramid (looking at the reflection in all the facets) with one camera. Light baffles may be arranged between adjacent reflective facets, which baffles extend radially outwardly in line with the corners of the pyramid so as not to obscure the scene, for further separation of adjacent sectors (e.g. to avoid spurious reflections). Any such examples can be used as a single imaging unit for 180 x 360 viewing, or with two back-to-back units for 360 x 360 viewing.
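By way of illustration only, and not as part of the original disclosure, the following minimal Python sketch shows how the facet sub-images might be cut from a common imaging surface and butted together into a continuous strip; the facet regions, frame size and ordering of the facets around the pyramid are assumed values.

    import numpy as np

    def compose_facet_panorama(frame, facet_regions):
        """Cut the sub-image seen via each pyramid facet out of one CCD frame
        and butt them together side by side to form a 360 deg. strip.

        frame         : HxWx3 array from the common image sensing means (CCD)
        facet_regions : list of (top, left, height, width) rectangles, one per
                        facet, ordered so adjacent entries view adjacent sectors
        """
        target_h = min(r[2] for r in facet_regions)      # common strip height
        strips = []
        for top, left, h, w in facet_regions:
            sub = frame[top:top + h, left:left + w]
            # planar facets introduce no spherical distortion, so a simple
            # crop (and optional row resampling) is enough before seaming
            if h != target_h:
                rows = np.linspace(0, h - 1, target_h).astype(int)
                sub = sub[rows]
            strips.append(sub)
        return np.hstack(strips)   # adjacent sectors meet at a common boundary

    # hypothetical usage: a 6-facet pyramid imaged on one 1024x1280 sensor
    frame = np.zeros((1024, 1280, 3), dtype=np.uint8)
    regions = [(400, 60 + i * 200, 200, 180) for i in range(6)]
    panorama = compose_facet_panorama(frame, regions)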

The first and second reflecting surfaces can be independently or mutually adjustable to enable the panoramic scene to be composed correctly. For example, the distance between the back-to-back units is adjustable so as to adjust the spacing of the first or convex reflectors, as well as the distance between the first (convex) and second (planar) reflector in each unit. The first and/or second reflectors can also be mounted so that each or both are independently, or simultaneously, pivotable about an axis, tiltable, or bodily displaceable. The image sensing means (camera) can also have zoom, and/or be mounted in gimbals. All of these adjustments can be respectively and selectively used for focussing, scaling and image positioning, whereby the required part, or the whole panorama, of the scene can be correctly imaged. These adjustments can be manual or automatic to adjust for different scenes or applications.

Where parts of a panoramic scene are imaged by planar facets (such as the sides of a pyramidal structure), the reflecting system is adjusted so that each adjacent facet sees respective adjacent sectors of the panoramic scene, whereby parts of the scene can be joined or seamed together in the final composite image. When viewing distant objects, it is better to use more rather than fewer facets, e.g. between 6-10 facets in each unit or pyramid (for 180 x 360 viewing), because these provide for better overlap of images at the corners of the pyramid, i.e. between adjacent facets, for joining images of adjacent sectors. It is surprising what can be seen in the adjacent facets of a pyramid with, say, 6 sides, since the panoramic image spans the boundary (or corner) between the facets without losing too much image information. The relative positions of the reflecting surfaces and the focussing of the image sensing means are adjusted so that the overlap between adjacent facet images is optimum for the scene under observation. The angle of slope of the facets is generally 45 deg., but it can be different or made variable.

A 4-sided pyramidal structure in one imaging unit (or an 8-sided structure with back-to-back units forming an octahedron) is not suitable for viewing objects at a distance, but it can be used with near objects. For example, it can be used in pipe inspection, where the inner pipe wall is close to the reflecting surfaces and the apex of the pyramid (or the apices of the pyramids) travels along the longitudinal axis of the pipe. With back-to-back units, the forward looking sides of the pyramid would then reflect images of the nearby inner pipe wall for optical inspection, while the rearwardly facing sides can image the same areas a little later, e.g. for inspection by IR, UV or X-ray. Pyramids with more sides can be used for pipes of bigger diameter. Generally speaking, convex parabolic reflectors and fisheye lenses are simple to use, but introduce spherical distortions which must be compensated by, for example, transformations of digital signals which effectively develop the spherical images into cylindrical images, which are then unrolled into adjacent flat sectors that form a continuous image on the same plane.

This can involve bulky storage and processing of image data signals to map panoramic spherical image information onto a flat plane. Such mapping is known in the art, for example, from "Digital Image Warping" by George Wolberg, IEEE Computer Society Press, ISBN 0-8186-8944-7; "Space Image Processing" by Sanchez and Canton, ISBN 0-8493-3113-7; and "Making environment maps from fisheye photographs" by Ken Turkowski, July 1999, which refers to a paper published in 1986 by Ned Greene relating to environmental mapping ("Environment Mapping and Other Applications of World Projections", IEEE Computer Graphics and Applications, 1986, vol. 6, no. 11, pp. 21-29).

Such solutions rely heavily on complex processing of digital video signals to compensate for optical spherical distortions which are severe at the image boundary.
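Purely as an illustration of the kind of polar-to-Cartesian remapping referred to above (developing the circular image from a convex reflector or fisheye lens into a flat strip), the sketch below uses an assumed sensor size, image-circle centre and output resolution; it is not taken from the specification.

    import numpy as np

    def unwrap_circular_image(img, centre, r_min, r_max, out_w=1440, out_h=360):
        """Map an annular/circular image (polar co-ordinates about `centre`)
        onto a rectangular strip (Cartesian co-ordinates). Columns of the
        output correspond to azimuth 0..360 deg.; rows correspond to radius
        r_min..r_max in the source image."""
        cy, cx = centre
        theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
        radius = np.linspace(r_min, r_max, out_h)
        rr, tt = np.meshgrid(radius, theta, indexing="ij")
        src_y = (cy + rr * np.sin(tt)).astype(int)
        src_x = (cx + rr * np.cos(tt)).astype(int)
        src_y = np.clip(src_y, 0, img.shape[0] - 1)   # nearest-neighbour lookup
        src_x = np.clip(src_x, 0, img.shape[1] - 1)
        return img[src_y, src_x]

    # hypothetical usage: 1024x1024 sensor, image circle centred in the frame
    img = np.zeros((1024, 1024, 3), dtype=np.uint8)
    strip = unwrap_circular_image(img, centre=(512, 512), r_min=100, r_max=500)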

Another known technique is disclosed in US 6002430 (McCall & Martin), which mentions back-to-back lenses for capturing "hemispherical images" from a scene, "storing" these images (by which is meant either digitally storing signals or capturing images on film), and using a "converter" which identifies, joins and smoothes the edges (or seams) of the "hemispherical images". Such a "converter" can employ "automatic image processing" using "digital processing by a computer where images are altered automatically" for combining the two images together. Such prior art techniques can be used with embodiments of the invention which employ e.g. fisheye lenses and paraboloid reflectors. However, even with the rapid data rates that can be achieved on current state-of-the-art computers, such processing can significantly add to the cost and complexity of the imaging apparatus. Moreover, with increasing demands for even higher resolution CCDs, for example when using CCDs such as Philips or Loran Fairchild chips, where say a pair of 18 megabyte chips could image something in the order of 20 x 10^6 pixels, and where a frame rate of 10 frames/second would generate data in the order of 400 megabytes/second, the comparatively slow execution of complex mapping routines would not generate image information at a fast enough rate without using exceptionally fast and expensive computers.
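The data-rate figure quoted above can be checked with simple arithmetic; the short sketch below assumes two bytes per pixel, which is what makes 20 x 10^6 pixels at 10 frames per second come out at roughly 400 megabytes per second.

    pixels_per_frame = 20e6        # e.g. a pair of high resolution CCD chips
    frames_per_second = 10
    bytes_per_pixel = 2            # assumed, e.g. 16-bit samples

    data_rate = pixels_per_frame * frames_per_second * bytes_per_pixel
    print(f"{data_rate / 1e6:.0f} MB/s")   # -> 400 MB/s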

In the embodiments of the present invention which use, for example, planar reflecting facets in a pyramidal convex reflector, the amount of processing of digital imaging signals is substantially reduced, because no similar complex techniques are required for the amount of image warping needed with fisheye lenses, since parts of the image are reflected in flat reflecting surfaces, which do not introduce spherical and other distortions usually associated with curved reflecting surfaces. This is a particular advantage over parabolic reflecting and fisheye lens systems. If some slight distortion is introduced, e.g. due to perspective, or due to wide angle lenses close to the facets, this can be compensated far more easily than by spherical image transformation techniques.

In other words, curved reflecting surfaces offer lower optical quality than planar reflecting surfaces.

The imaging units each have respective image sensing means or use a common image sensor, which can be, for example, a video camera, a CCD imaging system with appropriate lenses, or any other means for producing an output. This could be film in some cases, but typically an optical image of the scene is focussed onto a device which generates an analogue image of the scene which is then converted into digital signals that are processed in order to carry out any necessary warping or image transformation for composing the final image onto a flat viewing plane. The imaging unit or units which capture the image of the scene optically can be linked to the image sensing means either by connection (e.g. conductors), or by transmission/reception systems operating by RF, IR, Bluetooth or Firewire, which enable (e.g.) image signals to be transmitted to a remote point where they are processed for displaying the composite image. This facilitates installation, since the optical viewing parts of the equipment can be positioned on, or moved around, a site without concern for cabling to a remote viewing station.

With general regard to the use of lenses, any suitable prescriptions or groupings can be used for standard, telecentric, fisheye, supplementary or relay lens systems as will be known to those skilled in the art. The same comments apply to the shapes of spherical surfaces, such as hyperbolic, ellipsoidal, parabolic, concave, convex and to confocal relationships between focussing reflectors that are on the same optical paths. Also, the shapes, sizes and spacing of optical elements, including the angles of planar pyramidal sides can be suitably designed according to known optical principles, because the illustrations herein are only schematic and are not to scale, and further reference can be made to works such as "Modern Optical Engineering" referenced herein for further details of construction of the components used in particular arrangements.

Where means are provided for moving the second reflector relative to the first or convex reflector, this can be used to adjust the size of the scene imaged by the imaging unit. For example, by moving the second reflector closer to the first reflector, the size of the scene imaged by the imaging unit is reduced. It can also be used to adjust magnification, for example, to focus on a particular part of the scene, for example, to aid identification of a person appearing in the scene.

The imaging apparatus can be used for surveillance, since the omnidirectional optics can be located on a mast to observe a panoramic scene. It can be protected by armoured glass to prevent physical damage and it can also include protective means to shut down, or to shield, the sensitive areas of CCDs in the event of attempts to disable the apparatus, for example, by shining laser beams, or other high powered beams, onto the mirror system with the aim of burning out the camera. Such protective means can include, for example, overlaid transparent panels which normally admit light to the CCD arrays but which are responsive to a detection signal so as to transmit red (R), green (G) and blue (B) light respectively. When all three RGB "filters" are turned on, no significant light will then reach the CCD arrays. Other kinds of light filters, or defocussing, or image blurring can be used to similar effect. Alternatively, a Kerr cell can be used which responds to a detection signal so as to go dark or change its polarisation properties. The sensitive areas will also be protected if the image is blurred or defocused. An independent sensor can be included for detecting incident light above a given brightness threshold, so that the CCD is protected against damage while the threshold is exceeded. The sensor then removes or deactivates the shield when the brightness of the incident light falls below the threshold. Detection signals can alternatively, or additionally, be generated by causing the processing means to respond when a group of pixel values exceeds a threshold brightness so as to generate the camera shutdown or protection signal.
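A minimal sketch of the detection logic described above, in which the processing means raises a protection signal when a group of pixel values exceeds a brightness threshold; the threshold, the minimum group size and the single-channel brightness frame are illustrative assumptions.

    import numpy as np

    def protection_signal(frame, threshold=250, min_pixels=50):
        """Return True when enough pixels exceed the brightness threshold,
        indicating e.g. an incident laser beam; the caller would then darken
        the protective panels (or Kerr cell) until the signal clears.
        `frame` is assumed to be a single-channel brightness image."""
        bright = frame >= threshold
        return int(bright.sum()) >= min_pixels

    def update_shield(frame, shield_active):
        """Activate the shield while the threshold is exceeded and remove it
        when the brightness of the incident light falls below the threshold."""
        over = protection_signal(frame)
        if over and not shield_active:
            return True    # activate RGB filter panels / Kerr cell
        if not over and shield_active:
            return False   # brightness has fallen: deactivate the shield
        return shield_active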

In other examples of the invention, instead of using a "reflecting" system, each imaging unit includes a wide angle "lens" system for focussing an omnidirectional image from a scene onto the image sensing means. For example, each imaging unit includes a fisheye lens system. In the latter case, the second reflector and image focusing system are unnecessary. A further advantage is that the back-to-back lens systems have no blind spots on their central axes, which are otherwise present in some reflecting systems where light is reflected from an upper (or lower) mirror into the space within the convex reflector.

The image sensing means, such as a solid state CCD, can have a spherically curved image receiving surface, which is shaped optically (e.g. concave or convex) to reduce the spherical distortion introduced by curved reflectors or, e.g. fisheye lenses. For example, the light can be directed more at the normal than at an angle to the sensing surface. This facilitates transforming data relating to the spherical image into data relating to an image in a flat plane. Also the shape of the CCD can be circular to correspond with the optical output of the lens (or convex reflector). This maximises the resolution output and avoids processing regions of a rectangular CCD on which a circular image is otherwise focused. This makes more efficient use of pixels on the image surface.

When using back-to-back convex reflectors or wide angle lenses with means for diverting radiation from an intermediate zone between them, the sensing means (CCD) can be remote from the volume enclosed by back-to-back reflectors or lenses, so that any heat generated by the sensing means can be more easily dissipated. The means for diverting radiation from the intermediate zone between back-to-back reflectors or wide angle lenses can be any optical device that can turn radiation through a sufficient angle so that it emerges from between reflectors or lenses and is thereby received by the sensing means which is remotely positioned from the optical means. Suitable diverters are mirrors and refractors.

Whichever type of imaging unit is used, it can have a transparent housing or solid optic, preferably shaped symmetrically with respect to the convex reflector to avoid astigmatism, coma and other optical distortions (but it could be cylindrical), to provide environmental protection. Two symmetrical imaging units back to back protect the enclosed image sensing means. The housing can be rotated to drive off rain by centrifugal force, or it can have any suitable wipers which rotate around the housing, or about which the housing rotates. Also, heated air systems can be used to pressurise the housing to exclude dust and to maintain the optics at the same temperature.

According to another aspect of the invention, an imaging method comprises the steps of:

providing an omnidirectional image of a scene; deriving image signals from said image;

transforming the image signals into image signal data having three degrees of spatial orientation;

generating position related signals referenced to an artificial horizon; and

using the position related signals to select the image signal data so as to generate a display of at least a part of the image of the scene, so that pitch, roll and yaw of the sensor are transformed into equivalent movements of the displayed image.

This method provides the advantage that movement of the position sensor will cause the relevant part of a panoramic scene to be tracked and imaged, and this tracking will automatically continue to match the position related signals. For example, if the position sensor were mounted on headgear, simply turning the head and/or tilting the head forward and back or from side to side will provide the pitch, roll and yaw related signals which cause the corresponding part of the panoramic scene to be displayed.
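As a sketch only, the following shows how pitch and yaw signals from such a position sensor might select a viewport from a stored, flattened (equirectangular) panorama; the panorama layout, field of view and sensor interface are assumptions rather than details from the specification.

    import numpy as np

    def select_viewport(panorama, yaw_deg, pitch_deg, fov_h=90, fov_v=60):
        """Crop the part of a 360x180 deg. equirectangular panorama that the
        wearer is looking at. Yaw pans around the scene, pitch tilts up/down;
        roll could additionally be applied by rotating the returned crop."""
        h, w = panorama.shape[:2]
        # centre of view in pixel co-ordinates
        cx = (yaw_deg % 360.0) / 360.0 * w
        cy = (90.0 - pitch_deg) / 180.0 * h
        half_w = int(fov_h / 360.0 * w / 2)
        half_h = int(fov_v / 180.0 * h / 2)
        cols = np.arange(int(cx) - half_w, int(cx) + half_w) % w   # wrap in yaw
        rows = np.clip(np.arange(int(cy) - half_h, int(cy) + half_h), 0, h - 1)
        return panorama[np.ix_(rows, cols)]

    # hypothetical usage with head-tracker readings yaw=30 deg., pitch=-10 deg.
    panorama = np.zeros((1024, 2048, 3), dtype=np.uint8)
    view = select_viewport(panorama, yaw_deg=30.0, pitch_deg=-10.0)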

In a preferred embodiment, the digital signals relating to omnidirectional images can be stored in memory so that they are available around the whole omnidirectional scene, whereby the stored signals of any part of the scene are immediately available for processing to remove any image distortions, and to reproduce a display of that part.

Generally speaking, whilst the imaging units are preferably arranged to provide hemispherical, or panospherical, fields of view of the surrounding scene, an "all around" view may not, in practice, include parts of the scene which are severely distorted, or obscured by supporting structure, or which correspond to blind spots in the mirror structures described above. In some cases, peripheral boundaries of the mirror or lens system can introduce maximum distortion which may not be entirely flattened into a single continuous image, but this image could be flattened acceptably enough, or ignored, where the rest of the field of view is of major importance. Some compromise in image quality may be required, for example, with the plane mirror arrangements (such as the octahedral housing), but these are less expensive than the curved mirrors and fisheye lenses.

Generally speaking, the image sensing means produces a digital video signal of the real image produced by the wide angle optical system and this signal is processed to remove or to substantially reduce distortion caused by the reflecting or lens systems. The image processing means can be adapted to map image data in polar co-ordinates from the sensing means (180x360), or each sensing means (360x360), onto adjacent portions of a Cartesian co-ordinate system, so that the displayed images of the one or two hemispherical scenes are continuous. The image data from two back to back units can be displayed in upper and lower portions respectively of a display with a suitable rectangular aspect ratio. Alternatively, images from different parts of the panoramic or omnidirectional scene (such as the reflections in a plurality of facets in a pyramidal structure) can be routed through respective channels of a video multiplexing device, which enables independent cropping and scaling of the individual images, so that each can be tailored to fit into the correct locations of the composite image and so that adjacent parts of the scene are seamed correctly. The processing means can also include means for interpolating the image data to create effects of enlargement, reduction, zoom, pan, tilt, and changes in aspect ratio. Preferably, the video outputs from two or more sensing means which capture respective hemispherical images are mapped simultaneously to form a composite image, because this avoids delay in storing sequences of video signals (e.g. representing pan, tilt and rotational segments of a scene) which need to be separately processed and subsequently assembled. Moreover, in an omnidirectional optical system, everything is static because no movement is required to produce the "all around" 360 x 360 image. In contrast, some prior art systems use cameras which are moved or rotated to capture omnidirectional image information.
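A simple sketch of the composition step for back-to-back units, assuming each sensing means has already been unwrapped into a flat 360 deg. strip (for example with the unwrapping routine sketched earlier); whether the lower strip needs to be flipped depends on the optical arrangement, so the flip shown here is an assumption about one possible layout.

    import numpy as np

    def compose_spherical_view(upper_strip, lower_strip):
        """Place the two hemispherical strips in the upper and lower portions
        of one display frame so the equatorial boundary is continuous."""
        # the lower unit typically views the scene inverted relative to the
        # upper unit, so flip it vertically before stacking at the shared
        # equator; depending on the optics a horizontal flip may also be
        # needed so that azimuths line up across the boundary
        lower = np.flipud(lower_strip)
        if upper_strip.shape[1] != lower.shape[1]:
            raise ValueError("strips must be scaled to a common width first")
        return np.vstack([upper_strip, lower])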

Although particularly suitable for imaging using visible light, the apparatus may be adapted to image using infra red, ultraviolet or any other form of electromagnetic radiation. In the latter method, operation can be selected in either a multicast mode, where the whole of the omnidirectional image data is processed and is available for viewing, or a selected zone mode, where only different parts of the scene are viewed at different times and the processing will then track the movement around the scene.

In any of the embodiments, shielding means can be included for selectively shielding the imaging unit(s) from one or more predetermined wavelengths of electromagnetic radiation. Preferably, the processing means is adapted to control said shielding means according to the output of the image sensing means. Preferably, the shielding means comprises means arranged before, or extending over, the exterior surfaces of the imaging unit(s) and responsive to input signals for causing local changes in shade or colour. The means preferably comprises TOLEDs, OLEDs, FOLEDs or TLCDs controlled by the input signals to provide local changes in shade or colour, defocus, or blur.

The present invention also provides image projection apparatus comprising at least two image projecting units, each unit including optical means for receiving a respective image from image projecting means and outputting the image in wide angle form, the images output from the respective optical means being directed adjacently so as to form a single composite image when displayed on a substantially spherical screen.

Each unit may comprise a lens system for refracting an image received from the image projecting means on to the screen. The image projecting units may be adjacently positioned so as to at least partially encompass the image projecting means. The image projecting units may be arranged back-to-back, with a common axis of symmetry, so that each unit outputs an image of a respective part of a scene. The image projecting means may be arranged to output images substantially perpendicular to the axis of symmetry of the image projecting units, the apparatus comprising means for reflecting the images output from the image projecting means on to the optical means.

Another aspect of the invention provides omnidirectional imaging apparatus for viewing a panoramic scene and for displaying an image of the scene in an image plane; the apparatus comprising optical means for receiving radiation from the scene and for directing it onto a focal plane; sensing means responsive to radiation incident on the focal plane to generate imaging signals; processing means for providing data in which values of the imaging signals, for radiation from different elements of the panoramic scene, are related to corresponding positions of incidence of the radiation on the optical means; the processing means also producing drive signals when the values of the imaging signals differ from a threshold value; and shielding means arranged before the optical means for selectively passing or blocking some or all of the incident radiation; the shielding means being responsive to the drive signals so as to pass or to block the incident radiation for which the values of the respective imaging signals differ from the threshold.

In the latter device, the shielding means can include a screen or array of LEDs, such as TOLEDs, OLEDs, FOLEDs or TLCDs, which can be selectively activated, for example, by energising orthogonal electrodes in an electrode matrix, in order to activate selected zones of the screen or array, either to pass, or to block, incident radiation. Such a screen or array may, for example, be normally transparent to radiation, whereby selected areas can be activated by drive signals applied to the electrodes in order to cause local changes in optical properties, such as shade or colour, which affect the opacity of the screen or array, or the focus, at that location. An independent sensor and/or the processing means can be made to detect radiation by responding to imaging signals having values greatly exceeding a normal working threshold, so as to rapidly shut down part of the screen or array to prevent such radiation from reaching and thereby damaging the sensing means. This could be the case where, for example, a beam of laser light is directed at the apparatus to cause damage and the shielding means blocks light of this frequency where it is incident from the panoramic scene. This would prevent (say) a CCD sensor from being damaged by over-exposure. The change in the optical properties could, in such a case, introduce a filtering effect to cut out light of the laser wavelength. Alternatively, in a case where radiation is more generally incident on the apparatus, which could cause damage, the processing means can activate the shielding means so as to shut out most of the incident radiation, apart from radiation incident on a selected part of the apparatus, for example, which is coming from a minor part of the panoramic scene for which continued observation is necessary, whilst blocking out background radiation. Clearly, the activation of the screen or array will depend on the application and radiation can either be blocked or passed as may be required over a small incident area, or a large one, or different areas may be activated or not.

Reference is also made to US 4848890 (Horn), which shows a form of sensor control. Also, in embodiments of the invention, the shielding means can be controlled in the manner of an automatic iris, e.g. by switching on and off selected zones of the screen, so that they form a pattern corresponding to the size of the iris, which would be adjusted to compensate for the brightness of the scene, and/or flare, etc. The iris size could be controlled by an external sensor, or by TTL (through the lens) metering, as known in the art, or by using the digital signals to generate something, such as a histogram, from which brightness levels can be calculated. A TTL sensor can be built into a lens system, or in a mirror or prism, in embodiments where these optical components are in the light path to the image sensing means.
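The histogram-based brightness estimate mentioned above might look something like the following sketch; the target brightness, step size and the mapping from "iris level" to activated screen zones are illustrative assumptions.

    import numpy as np

    def estimate_brightness(frame, bins=256):
        """Mean brightness derived from a histogram of the digital signals."""
        hist, edges = np.histogram(frame, bins=bins, range=(0, 255))
        centres = (edges[:-1] + edges[1:]) / 2
        return float((hist * centres).sum() / max(hist.sum(), 1))

    def adjust_iris(level, frame, target=120.0, step=0.05):
        """Open or close the 'iris' (fraction of screen zones left transparent)
        to steer the measured brightness towards the target value."""
        brightness = estimate_brightness(frame)
        if brightness > target:
            level = max(0.1, level - step)   # close down: activate more zones
        elif brightness < target:
            level = min(1.0, level + step)   # open up: deactivate zones
        return level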

Generally, in the prior art the task of capturing panospherical or omnidirectional images is often performed by specially designed fisheye lenses, but these have the disadvantages of lengthy backworkings and expense. Various aspects of the invention, described above, offer solutions to this problem by using convex reflectors, some of which have planar reflecting facets. Whilst these provide a more cost-effective solution and avoid spherical distortions, thereby reducing the amount of processing required with fisheye lenses, there is still the problem of using a single imaging unit in the attempt to capture as much of the hemisphere as possible (or the sphere with back-to-back units). For example, with fisheye lenses, the spherical distortion gets worse the nearer the circumferential edge is approached. Hence, the quality of the image will degrade the more that it is derived from the edge portions. Moreover, there will need to be some compromise in designing the lens so that there is not too much distortion in the more central regions of the lens. It would therefore be desirable if the lens could be designed to cope with a smaller field of view, as long as the rest of the panorama could be imaged.

Another aspect of the invention seeks to solve this problem by providing omnidirectional imaging apparatus comprising first optical means having an annular field of view, second optical means having a field of view extending over a segment of a hemisphere, and means for optically combining light received from both fields of view for the purpose of generating an image signal output, the first and second optical means being co-axially disposed so that their combined fields of view are hemispherical.

By suitably arranging the optical geometry of the first and second optical means, so that they each capture optimum images in the respective annular and segmental fields of view and by optically combining this image information, a composite image of the scene can be provided in which optical distortions are avoided or substantially eliminated.

In either 180 x 360 deg, or 360 x 360 deg viewing, the (or each) hemispherical field of view is effectively split into two segments, or sub-fields, the annular spherical field and the segmental spherical field, which are parts of the same hemispherical field of view.

An important advantage of this arrangement is that the optical means used to gather images in one or both sub-fields can be of the kind that introduces little or no optical distortion. By contrast, in the prior art, back-to-back fisheye lens configurations produce spherical and barrel distortion, plus chromatic aberrations; in a flat image plane which displays a panoramic view as a composite montage, the central image section is the most important for surveillance, but it is often the most distorted. Fisheye lenses also have severe problems in the secondary spectrum of lateral colour, causing chromatic variation or distortion, due to the extremely wide angles of the optical system.

On the other hand, the invention seeks to provide image gathering, especially in 360 x 360 deg., whilst preserving integrity of at least the central section and thereby allowing limited distortion only in peripheral or top or bottom areas. However, in some cases, distortions can be reduced to a negligible value. In particular, in one embodiment, an ordinary high quality lens can be used to capture images over the central hemispherical segment, thus avoiding the use of fisheye lenses (which image the whole hemisphere). Alternatively, where very high quality imaging is required, extremely high quality specular reflective surfaces can be used (in place of a central lens), including parabolic and hyperbolic mirror combinations (similar to Cassegrain objectives) to avoid the distortions caused by glass lenses. Hence, the invention can be embodied to avoid or reduce spherical or barrel distortion and to substantially reduce spherical lens aberrations, distortions and chromatic aberrations (experienced with fisheye lenses or distorted viewpoints from multiple lens configurations). This is particularly advantageous with high resolution surveillance applications employing high resolution CCDs available from manufacturers such as Philips and Loran Fairchild. A further advantage of some embodiments is that they use reflectors which act also as shields to prevent lens flare. This would be the case with, for example, apparatus used in daylight surveillance when the sun can cause flare due to a fisheye lens which is directed at the sky segment. In some embodiments of the invention, an upper planar, concave or convex reflector acts as a shield to prevent flare.

In one embodiment, the annular field (provided by the first optical means) includes a convex reflector which reflects light from the scene onto a planar (or concave) reflecting surface, which in turn directs light through an aperture in the convex reflector. This field of view can capture images from (say) the sides of a panoramic scene where the omnidirectional apparatus is suspended from a high point in a surveillance zone. The segmental field (provided by the second optical means) can include an ordinary wide angle or other lens (thereby avoiding fisheye lens distortion) located in an aperture in the planar (or concave) reflecting surface. This captures images from a segment of the hemisphere above (or below) the scene. This lens also directs incident light through the same aperture in the apex of the convex reflector. Thus, light passing through this aperture, incident on both the first and second optical means, can be received by an imaging surface, either directly, or via an optical system which reflects or refracts or conducts the combined incident light from both fields of view to a more convenient location (at which an imaging device is located). According to another embodiment, the hemispherical segmental field of view (in the second optical means) is provided by high quality concave and convex reflecting surfaces arranged, for example, as a parabolic/hyperbolic primary/secondary mirror combination to remove optical distortions. This mirror combination can replace a lens in the previous embodiment, or can include a simpler lens of high quality for improving the focusing. This segmental field of view can also be captured by a convex/convex reflecting surface combination, the surfaces being parabolic and having a curvature to provide suitable imaging.

According to a further embodiment, the first optical means comprises a first convex reflector for reflecting light onto a planar (or concave) reflecting surface, which in turn reflects light through an aperture or apertures in the convex reflector to provide the annular field of view (as in the latter embodiments). However, the segmental field of view (provided by the second optical means) employs a second reflecting surface (which can be planar or concave or convex, e.g. parabolic) which reflects light onto a second convex reflector, which in turn reflects light through the aperture(s) in the first convex reflector. This device is of simpler construction and may not offer as much reduction in optical distortion, but it is useful in reducing or eliminating lens flare.

Another embodiment uses an imaging device which faces (downwardly) a convex reflector to capture images in the annular segment, and another imaging device which faces (upwardly) to capture images in the upper segment; these imaging devices can be inexpensive back to back CCD cameras.

The convex reflectors used in providing the annular field of view can be any of those described above in connection with the back-to-back convex arrangements, including those having first and second reflecting means, the first being convex for reflecting an image from a panoramic scene onto the second which, in turn, reflects the image onto the image sensing means. For example, the first reflecting means can be hemispherical, parabolic, hyperbolic, ellipsoidal, or of a polygonal type where the polygon includes a plurality of planar or curved reflecting facets surrounding a central axis through the convex reflecting means. These can be whole or frusto-convex (truncated) sections. The second reflector can be planar, concave or convex and any of the hemispherical, parabolic, hyperbolic, ellipsoidal shapes. Where the first and second reflecting means have a curvature, these are in confocal relationships. Convex reflectors having inclined flat sides in a polyhedral structure, such as the six to eight sided pyramids, for 180 x 360 viewing, or each back to back for 360 x 360 viewing, are advantageous (as explained above), because curved reflectors offer lower optical quality compared to plane mirrors. For example, with a parabolic convex reflector used in surveillance, the equatorial images are more important (because they view a zone on the land) than those at the poles (which view the sky). However, these equatorial regions are less well imaged in the sides of the parabolic reflector due to spherical distortion. This aspect of the invention enables the image quality in the equatorial regions to be improved, e.g. because planar facets can be used to reflect the side images onto the imaging means and a simple low quality lens can be used to image the polar regions.

By suitably arranging the optical geometry of the first and second optical means, which can include selecting the size and curvature of reflecting surfaces, the size, power and properties of lenses, the relative dispositions of the optical components including spacing therebetween and aperture width, images from both the annular and spherical segments of the hemispherical field can be captured and optically combined to provide a composite optical image of the scene. This geometry can be designed, or made adjustable, as explained above in connection with other aspects of the invention, including suitable mechanical drives for adjusting the relative positions of lenses, reflecting surfaces and optical co-axial alignment, both to provide a common boundary between the image from the annular field and the image from the segmented spherical field, as well as to compose the respective images on the viewing plane. These seams may be made invisible, or they can be left visible. Instead of making adjustments of the image optically on the imaging surface, image processing techniques can alternatively be used which can recognise a common boundary in providing a composite image, either with or without seams. Suitable available software and programming techniques for achieving the required composite image are well known to those skilled in the art.
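As a rough sketch of the image-processing alternative mentioned at the end of the preceding paragraph, the following simply joins a flattened image of the central (segmental) field to the unwrapped annular field at an assumed common boundary; the band sizes, projections and the absence of seam blending are all assumptions.

    import numpy as np

    def compose_hemisphere(segment_band, annular_band):
        """Join the image of the central (polar) segment and the unwrapped
        annular (equatorial) field at their common boundary to give one
        flattened hemispherical view. Both bands are assumed to have already
        been remapped to matching projections and a common width."""
        if segment_band.shape[1] != annular_band.shape[1]:
            raise ValueError("both bands must be scaled to a common width")
        # with the geometry correctly adjusted the seam needs no blending;
        # otherwise a short cross-fade over a few rows could hide it
        return np.vstack([segment_band, annular_band])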

The imaging units described above can be used with means for attaching them to a camera to provide 180x360 or 360x360 imaging, e.g. wherein convex reflectors are arranged back to back and a laterally extending member projecting through the convex reflectors provides an optical and mechanical camera attachment.

Embodiments of the invention will now be described with reference to the accompanying drawings, in which: -

Figure 1 is a schematic drawing of an embodiment of an imaging apparatus;

Figure 2 illustrates the upper and lower hemispherical fields of view of back to back convex reflectors (or fisheye lenses) and how these are developed into a cylindrical field of view which is then unrolled into a flat plane;

Figures 3a and 3b illustrate respectively the view from each convex reflector (in a flat display plane) and the corresponding developed upper and lower fields of view in a flat plane after mapping the image signal data from polar to cartesian co-ordinates;

Figure 4 is a mapping diagram for the images of Figs 3a and 3b;

Figure 5 is a schematic drawing of another embodiment of imaging apparatus;

Figure 6 is a schematic drawing of a stereoscopic arrangement of the imaging apparatus;

Figures 7 and 8 are sectional and perspective views of imaging units which employ convex pyramidal reflectors (suitable for pipe inspection);

Figure 9 is a section of a stereoscopic version of Fig. 7; Figure 10 is a perspective view of another pyramidal convex reflector arrangement with more sides;

Figures 11 and 12 are schematic diagrams illustrating mapping for a back-to-back pyramidal structure with eight sides;

Figure. 13 illustrates a back-to-back fisheye lens arrangement having CCDs with curved sensing surfaces;

Figure 14 is a perspective view of an embodiment in which a motion sensor provides position related signals for navigating around a panoramic scene on a display screen;

Figures 15 and 16 are perspective and sectional views of another pyramidal arrangement with upper and lower reflecting systems;

Figures 17 and 18 show a modification with curved sides;

Figure 19 shows a further pyramidal embodiment, and Figure 20 shows a back-to-back arrangement of the same embodiment;

Figure 21 shows a further modification of the embodiment of Figure 19, which includes a wide angle lens for viewing an upper segment of a hemisphere, where the pyramidal reflector views an annular segment of the hemisphere;

Figure 22 shows a back-to-back arrangement of Figure 21;

Figures 23a-23c are mapping diagrams for the embodiment of Figure 22;

Figure 24 shows a similar embodiment but using optical fibres, Figure 25 illustrating a manifold; Figure 26 shows a modification using optic fibres;

Figure 27 shows back-to-back fisheye lenses with CCDs in between;

Figure 28 shows a modification where each fisheye lens is effectively separated into two parts to reduce the length of backworkings;

Figure 29 shows a variation with a prism for a reflector;

Figure 30 shows the arrangement of Figure 28 in a housing, and Figure 30a shows a similar arrangement for 180x360° viewing, e.g. for mounting on a ceiling;

Figures 31 and 32 schematically illustrate an arrangement for dealing with lens flare;

Figures 33-35 show other back-to-back fisheye lens arrangements;

Figures 36a-36c show another mapping diagram;

Figure 37 shows a projector arrangement;

Figures 38-42 show further embodiments of arrangements with an upper segmental field of view and a lower annular field of view (in each hemisphere); and

Figures 43 and 44 show 360x360° and 180x360° arrangements with inner planar pyramidal reflectors and outer parabolic convex reflectors.

With reference to Figure 1 one embodiment of an imaging apparatus 10 comprises two imaging units 12, 12'. Each imaging unit 12 comprises an image sensor 14, such as a CCD or camera, a convex parabolic reflector 16 and a circular, substantially planar reflector 18. The apparatus is housed in a transparent housing 19, but the transparent housing is optional and can be dispensed with. Alternatively, a solid optic can be employed, e.g. which is made of transparent plastics extending between the surfaces of the parabolic and planar reflectors (16, 18). The reflecting surfaces can be formed on polished end surfaces of such a plastics body. The housing (or outer surface of a solid optic) is preferably symmetrical with the shape of the convex reflector so as to avoid optical distortion of the light incident from the scene.

The camera 14 and the planar reflector 18 are both positioned along the axis of symmetry 20 of the parabolic reflector 16, the surface of the planar reflector being substantially orthogonal to the axis of symmetry 20. As shown in Figure 1, the CCD array or camera 14 is situated within the parabolic profile of the parabolic reflector 16, so as to provide a relatively compact structure for each imaging unit and so that the parabolic reflector 16 can provide a relatively secure housing for the camera 14. Use can also be made of telecentric optics for reducing perspective distortion.

The convex parabolic reflector 16 is positioned to reflect an image of a scene on to the planar reflector 18. The planar reflector 18 in turn reflects the reflected image through an aperture 22 formed in the parabolic reflector 16 to the CCD or camera 14. A lens 24 is provided between the CCD or camera 14 and planar reflector 18, for example, in the aperture 22 formed in the parabolic reflector 16, to focus the reflected image reflected by the planar reflector 18 on to the CCD or camera 14. The lens may be provided with any suitable means for blocking light which has not been reflected by the planar reflector.

The planar reflector 18 is mounted to the lens 24 of the camera by a shaft 26 extending along the axis of symmetry 20. A motorised gear system, such as a rack and pinion arrangement (not shown), may be provided to move the shaft 26 along the axis 20 in order to vary the optical distance between the parabolic reflector 16 and the planar reflector 18. For example, decreasing the distance between the two reflectors will have the effect of reducing the size of the image reflected by the planar reflector 18, and concomitantly reducing the size of the scene imaged by the imaging unit, thus enabling a user of the apparatus to "focus in" on a particular feature of the imaged scene. The planar reflector can be replaced by a concave reflector in confocal relationship with the convex reflector to reduce optical distortion. The reflecting surfaces can be movable or adjustable, as mentioned above, to assist in composing the final image on the viewing plane.

In the apparatus shown in Figure 1, the two units 12 are arranged back-to-back, with the two parabolic reflectors 16, 16' sharing a common axis of symmetry 20. The parabolic reflectors 16, 16' may be joined, as indicated at 28, to provide a seal to prevent ingress of water or debris, and to prevent any other form of access, into the housing 30 thus defined between the parabolic reflectors 16, 16'.

The CCDs or cameras 14, 14' have image signal outputs connected to image processing apparatus (not shown), such as a suitably programmed computer, for processing the image signals for display on a display device (not shown). These processing and display techniques are well known in the art. The image processing apparatus transforms each of the outputs from the imaging units into image signal data. Figure 2 illustrates an example of the respective spherical image signal data 40, 40' produced from the image signals output from the cameras 14, 14' respectively when the apparatus is utilised as part of a surveillance system. Each CCD or camera 14, 14' views a hemisphere of the scene in the respective parabolic reflector 16, 16' and produces analog imaging signals which are converted into digital signals representing a substantially circular image 40, 40' of each hemisphere 50, 50' of a scene. As shown in Figure 3a, a portion 42, 42' of each of the circular images 40, 40' is masked by the planar reflector 18, 18', the size of these "blind spots" 42, 42' depending on the size of the planar reflectors 18, 18' and the distance between each planar reflector 18, 18' and the associated parabolic reflector 16, 16'.

As will be readily appreciated, it is difficult for a viewer of a display device displaying the circular images 40, 40' to identify easily particular features of the scene. Therefore, the image processing apparatus performs a mapping operation on each image signal data for transformation of the data into a cartesian co-ordinate system for output to the display device. This mapping operation is illustrated in Figure 3.

Comparing Figure 2a with Figure 4, in the mapping operation, each of the circular images is notionally divided into an array of pixels (the grid overlaying each of the circular images 40, 40' should be ignored for the present purposes) in a polar co-ordinate system. Each of these pixels is then mapped, using look-up tables stored in the image processing apparatus which also compensate for distortion resulting from the mapping, into a cartesian co-ordinate system for display in a rectangular display 44. In this mapping technique, as shown in Figure 4, four circular sectors in each hemisphere of view, numbered 1-4 and 5-8 in the upper and lower hemispheres respectively, map to rectangular areas 1-4 and 5-8. The technique is particularly suitable for mapping of the circular images 40, 40' into adjacent portions, such as respective upper and lower portions, of a 16:9 rectangular video display. Figures 3a and 3b illustrate three-dimensionally the result of the mapping operation, in which two substantially hemispherical, or "omni-spherical", images 50, 50' are transformed into a continuous cylindrical "pano-spherical" image 60. This is unrolled into a flat plane as shown in Figure 2b and is the resultant transformation of the displayed circular images 40, 40', providing a continuous two-dimensional image of the two scenes imaged by the imaging units. The resultant rectangular image 70 enables the user of the apparatus to more readily identify particular features of the scene. It also enables the user to magnify selected features of the displayed scene and to pan the scene in any desired manner by suitable digital processing. The image 70 may also be transmitted as a flat panoramic image and interpreted by a client, or remote computer software or driver, as a navigable scene which may be cropped, enlarged or rotated in real time in response to control inputs from a remote user.
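Purely as an illustration of such a polar-to-cartesian transformation, the sketch below (Python, not part of the patent) unwraps a circular image into a rectangular panoramic strip. It uses nearest-neighbour sampling where a practical system would use the precomputed look-up tables and distortion compensation described above; the function name, image sizes and radii are illustrative assumptions.

```python
# Minimal sketch of the polar-to-cartesian "unwrapping" described above.
# Assumes the circular image from one imaging unit is held in a NumPy array;
# all names and parameters here are illustrative, not taken from the patent.
import numpy as np

def unwrap_circular_image(img, centre, r_inner, r_outer, out_w=1024, out_h=256):
    """Map a circular (annular) image into a rectangular panorama.

    For each output pixel (column = azimuth, row = radius) the corresponding
    polar coordinate in the source image is computed and sampled by
    nearest-neighbour lookup.
    """
    cx, cy = centre
    cols = np.arange(out_w)
    rows = np.arange(out_h)
    theta = 2.0 * np.pi * cols / out_w                     # azimuth, 0..2*pi
    radius = r_inner + (r_outer - r_inner) * rows / out_h  # radial distance

    # Build the sampling grid (out_h x out_w) of source coordinates.
    xs = (cx + np.outer(radius, np.cos(theta))).round().astype(int)
    ys = (cy + np.outer(radius, np.sin(theta))).round().astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]

# Example: unwrap a synthetic 512x512 circular image into a 1024x256 strip.
circular = np.random.randint(0, 255, (512, 512), dtype=np.uint8)
panorama = unwrap_circular_image(circular, centre=(256, 256), r_inner=60, r_outer=250)
print(panorama.shape)  # (256, 1024)
```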

As the user zooms in on smaller portions of the scene, the granularity of the viewed image will increase. Thus, the apparatus may be programmed to control movement of the planar reflectors relative to the parabolic reflectors in step with the magnification of the image by the user, so as to compensate for this loss of resolution. As shown in Figure 5, the planar reflectors 18, 18' may be attached to the inner surface of the transparent housing 19, thereby avoiding attachment of the planar reflectors to the parabolic reflectors via shafts 26.

The reflectors 18, 18' may be shaped and arranged in confocal relationship (e.g. concave/convex) to assist in reducing optical distortion. Each image sensing means 14, 14' may comprise a CCD array with a convex or concave image receiving surface to reduce optical distortion. A polarising filter may be located between each camera 14 and reflector 18 to reduce the incidence of reflections from the housing 19 on the cameras.

The single lenses 24, 24' and respective cameras 14, 14' can be replaced by double lenses 24a, 24b and 24c, 24d and respective CCD devices 14a, 14b and 14c, 14d, separated by light shields 25, 25', as shown in Fig. 6, for stereoscopic (or stereographic) omnidirectional imaging (see below).

Figure 5 also illustrates that housing 19, having a wall thickness 19a, can be replaced by a solid transparent optic 19b, which extends between the boundary 19c and the reflecting surfaces 16 and 18 (e.g. made of clear plastic, crown optical glass, or germanium).

Fig. 6 shows an arrangement similar to that of Fig. 5, except that two lenses 24a, 24b and 24'a, 24'b are positioned in the openings at the apex of each triangular arrangement 16a, 16'a, and that respective imaging devices 14a, 14b and 14'a, 14'b receive the light from the respective lens. Such an arrangement can be used, for example, where simultaneous infrared and video images are sensed of the same scene, or where stereo video images are produced. The imaging devices can be cameras, but are preferably CCD arrays. This arrangement is also suitable for stereoscopic (or stereographic) omnidirectional imaging where the lenses and imaging devices receive light from two vantage points for the purpose of creating a stereoscopic image, by known means (such as those described in "Advanced Photography" by Michael Langford, ISBN 0-240-51029-1, or the "Autostereoscopic Computer Displays" made by manufacturers such as Philips or Sharp). Hence, it becomes possible to view any part of a panoramic scene in 3D.
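As an illustration of one known way of superimposing the two vantage-point images, the sketch below (Python, illustrative only and not taken from the patent or the cited references) forms a red/cyan anaglyph from aligned left and right panoramas, so that any part of the scene can be viewed in 3D with suitable glasses.

```python
# Minimal sketch of superimposing two vantage-point images into a single
# red/cyan anaglyph. Array names are illustrative; both inputs are assumed
# to be aligned RGB panoramas of the same size.
import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Take the red channel from the left view and green/blue from the right."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]
    return out

# Example with two dummy 480x640 RGB frames.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.zeros((480, 640, 3), dtype=np.uint8)
print(anaglyph(left, right).shape)  # (480, 640, 3)
```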

Fig. 7 shows another embodiment of the invention where each imaging unit includes a convex pyramidal arrangement with four spherically triangular planar mirrors as the sides 16a (instead of using a convex spherically parabolic reflector 16). Sides 16a are, in fact, quadrilaterals and reference will be made, in general, to facets of a polygonal structure. The most simplified embodiment of a convex pyramidal structure has 4 sides and this will be described to illustrate the principles, because pyramidal structures with more sides are preferred for viewing distant objects (the further away the object, the more sides or facets of the pyramid are required, and a preferred number of sides is six to eight in each pyramidal structure, i.e. twelve to sixteen in back-to-back arrangements). The arrangement shown in Figure 7 has a practical use for objects which would be near to the reflecting surfaces. For example, it can be used in pipe inspection, where the back-to-back imaging units travel down the pipe in the direction of the longitudinal axis through the lenses 24, 24'. As in the previous embodiment, a substantially planar reflector 18 and image sensor 14 are positioned along the axis of symmetry 20 of the reflecting arrangement, the surface of the planar reflector 18 being substantially orthogonal to the axis of symmetry 20. Fig. 8 schematically shows two such imaging units 12, in perspective, in a back-to-back (360 x 360) arrangement, the lenses being omitted for clarity. The sides or facets 16a form an octahedron, open at each end 16b, 16'b (so as to receive a respective lens or lenses 24, 24'). The imaging devices 14, 14' can be cameras or charge coupled devices, CCDs, which are mounted with respect to the lens 24, 24' in the manner described above. The same or a similar motorised gear system can also be used to vary the optical distance between each pyramidal reflecting arrangement 16a, 16'a and its respective planar reflector 18, 18'.

Fig. 9 shows an octahedral planar mirror arrangement for stereoscopic viewing, with two sets of lenses 24a, 24b, and image sensing means 14a, 14a', 14b, 14b'. This arrangement is similar to that of Fig. 6 and it can be used for the same purposes. Fig. 10 shows a further embodiment where the four-sided pyramidal planar mirror arrangement (4 facets) is replaced by a six-sided pyramidal planar mirror arrangement (6 facets) 16d, 16d'. The construction and operation are otherwise similar to those shown in Figs. 6 and 8.

The mapping operation for the embodiments of Figs. 6-10 is similar, in principle, to those already explained above, but as it takes account of the octahedral and dodecahedral shapes (8 and 12 facets) instead of the parabolic surfaces, it can significantly reduce the amount of processing required for transforming (otherwise) spherical images (from paraboloids) into flat images which need to be composed on a flat image plane with little or no spherical distortion. For example, Fig. 11 shows the octahedral planar faces designated as regions 1 to 8 in polar co-ordinates (Figures 12a and 12b) which are mapped onto cartesian co-ordinate flat planes as shown in Fig. 12c. As shown in Fig. 12c, the horizontal rows represent a 360° horizontal view, whereas the columns represent the 360° (+/-) vertical view. The mapping operation can be carried out with the aid of co-ordinate look-up tables whereby the resultant transformation of the displayed images in the planar facets reconstructs the required composite visual image. Any suitable known technique can be used for processing the digital signals so that the images in the facets are cropped, scaled and stitched together to provide the required panoramic image in the viewing plane (suitable techniques are, for example, those known as Apple's QuickTime VR and those disclosed by McCutcheon in US 6,141,034). Whilst planar facets may not provide the same field of view as the parabolic reflectors, they are of far simpler construction and will provide images of visually acceptable quality over a sufficient field of view of interest, so that much less processing is required for image transformation, at much less cost. Image processing techniques can be used to refine the pixel information so that the final composite image is according to requirement.
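By way of illustration of the cropping, scaling and stitching step, the sketch below (Python, illustrative only) concatenates facet images that are assumed to have already been rectified and scaled to a common height, trimming a simple fixed overlap at each join; the names and the overlap value are assumptions for the example.

```python
# Minimal sketch of stitching N facet images side by side into one panoramic
# strip. Assumes each facet image has already been cropped and scaled to the
# same height; overlap handling is a simple crop. All names and the overlap
# parameter are illustrative assumptions, not values from the patent.
import numpy as np

def stitch_facets(facet_images, overlap_px=0):
    """Concatenate facet images horizontally, dropping any shared overlap."""
    trimmed = []
    for i, facet in enumerate(facet_images):
        # Drop the overlap from the left edge of all but the first facet.
        start = overlap_px if i > 0 else 0
        trimmed.append(facet[:, start:])
    return np.hstack(trimmed)

# Example: eight 200x256 facet views stitched into one 360-degree strip.
facets = [np.zeros((256, 200, 3), dtype=np.uint8) for _ in range(8)]
panorama = stitch_facets(facets, overlap_px=10)
print(panorama.shape)  # (256, 1530, 3) -> 200 + 7*190 columns
```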

CCDs of 4:3 proportion, 16:9 proportion or a split 16:9 proportion can be used, but other combinations are also possible. The resultant output or display can then be, for example, of 4:3 or 16:9 ratio. However, any aspect ratio can be used in accordance with user requirements. Whilst the above example has been described with a pyramidal structure having four sides, more sides are preferred, especially for viewing distant objects. For example, 6-8 sides or facets in each imaging unit (12-16 in the back-to-back arrangement) provide a useful field of view and acceptable images. Clearly, as the number of facets is increased, the pyramidal structure becomes more circular and this improves the panoramic view of the scene. Although the facets are inclined at, say, 45° (and they can be inclined at other angles), it is surprising what can be seen in the reflections from adjacent facets, and adjustments can be made so that there is sufficient overlap of images in adjacent facets to enable parts of the digital image of the scene to be seamed or stitched together to provide the continuous panoramic image. The mapping of N facets onto N regions of the composite image will follow the same principles as those explained herein.

The arrangement shown in Figs. 6-9 can be contained in a transparent spherical, cylindrical, tubular or triangular enclosure and the enclosure, preferably cylindrical or spherical, can be mounted on shafts which are driven by a motor in order to spin the enclosure to remove, e.g., rain or dust, or other pollutants, by centrifugal action.

If necessary, sections or portions of the convex mirror arrangements, e.g. an area or selected areas on the surface of the parabolic convex mirror, or the surface(s) of the planar mirrors in the sides of the polyhedral convex arrangements, can be obscured, defaced, removed or otherwise treated to mask that part of the scene from which light is reflected, hence blocking the view of the imaging sensor (CCD) to prohibited viewing zones. This allows for privacy in surveillance applications and can provide a satisfactory solution to the problem of controlling restricted or limited viewing of an omnidirectional scene, where the mirrors would normally be inaccessible once mounted for surveillance. (Otherwise, in the case of inhibiting or suppressing the processing of parts of the image information to block parts of the view, the processing of all the image information could still be continued illegally, i.e. without authorisation.)

Whilst the imaging information from the sensors 14, 14' is processed to provide the composite image, means may also be provided to operate each image sensor independently of the other. Each imaging unit can also have one or more image sensors to provide stereographic/stereoscopic three-dimensional video imaging, to implement dual infrared and standard video image gathering, and/or to provide for high and low resolution CCD combinations.

The planar reflector 18, 18' also need not be planar, since it can have a curvature which is useful for a particular application. Where the imaging units are contained in a transparent enclosure, polarising filters can also be used to remove reflections from the transparent material of the enclosure. The planar reflecting facets can be square, circular, triangular or octagonal in shape.

Fig. 13 shows another embodiment which is based on the same general principle of providing an image sensing device within a light collecting arrangement for producing a 360° horizontal/vertical composite image. However, this embodiment employs lenses instead of parabolic or triangular planar reflecting surfaces and also employs a special CCD arrangement. More particularly, Fig. 13 shows an omnidirectional imaging apparatus including two imaging units 12, 12', each unit having a wide angle convex "Fisheye" lens 31, 31' and a secondary lens system 32, 32' (which can be concave or convex) arranged on the same optical axis. The inwardly and oppositely facing concave surfaces of lenses 32, 32' substantially surround a CCD array 33, 33' with oppositely facing paraboloid (or otherwise spherically shaped) imaging surfaces. This is a purpose-made imaging device. The exact shape of the image sensing surface of the CCD will depend on the nature of the optical lens or reflecting arrangement which is used to capture the image of the scene, since this shape is intended to avoid or reduce the amount of processing of the digital signals which would otherwise be required for making transformations or warping, in order to correct for spherical distortion. Moreover, the use of a curved imaging surface makes more efficient use of the pixel area, since no pixels are wasted in an arrangement where the optical image capturing device directs or focuses an appropriate image on the curved surface. In the case where the CCD array is flat and rectangular, some areas (and therefore their pixels) are redundant and hence wasted.

The mapping operation and formation of composite images will also be generally similar, but this arrangement, being spherical, solves problems of mapping which occur with planar surfaces that need to be transformed into a single image plane.

Figure 14 shows a user wearing headgear 40 to which is attached a device 41 (of known construction) containing gyros and accelerometers that detect movement of the head and generate corresponding position related signals, referenced to an artificial horizon, and send them to processing means 42. Device 41 could be described as a "3D mouse", since a "2D mouse" provides position data in x/y co-ordinates in a flat plane, and device 41 adds the third axis z or third degree of freedom. Further details of such a Head Mounted Device (HMD) can be seen in US-A-5645077, US-A-5807284 and US-A-5440326, the contents of which are included by cross-reference. An omnidirectional imaging unit 43 positioned at the scene of interest (not illustrated) provides a panospherical image of the scene and derives image signals from the optical image on the CCD arrays (not shown). The processing means 42 is programmed to transform these image signals into image signal data having three degrees of spatial orientation, i.e. in the panospherical image, which are stored for later reference. The processing means is also programmed to use the position related signals from the device 41 to select the stored image signal data so as to generate a display of at least a part of the image of the scene, so that pitch, roll and yaw of the headgear, and hence of the head, are transformed into equivalent movements of the displayed image on the viewing device 44. The screen 45 depicts a circular disc 46 representing the panoramic image of device 43. It also depicts a rectangular region which is an enlarged part of the scene shown in two dimensions, on a flat plane. This part will roll, pitch and yaw to follow the user's head movements, hence allowing all segments of the image to be available for viewing. The scene may be static or a moving image (e.g. a video recording).

The HMD device can be used by an observer to view any part of a panoramic scene, by simply moving the head. Processing of the digital signals can be carried out by known techniques, for example, where the whole panoramic or omnidirectional image is first captured as a frame of information, which is then processed and buffered to provide data relating to any part of the scene. If the user wishes to look at a particular part of the scene, the buffered information can then be processed accordingly. This can help in reducing processor tasking, as well as assisting with bandwidth requirements, and can also allow for enhanced resolution from a particular sector or zone under investigation. Some form of switching device can be included for automatically switching from the full panoramic view to the view of a limited sector or quadrant so that the image data of the part under observation is ready for immediate display. The apparatus can be constructed to operate either in a multicast mode, i.e. where the whole of the panospherical image data is processed to enable many users to observe (interactively) different parts of the scene at the same time, or in a single observer mode, e.g. where one user will only observe different parts of the scene at different times and the processing will track the movement around the scene.
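As an illustrative sketch (Python, not from the patent) of how the position related signals can select the stored image signal data, the function below crops a viewport from an equirectangular panoramic frame according to yaw and pitch angles; the field-of-view values, array layout and names are assumptions for the example.

```python
# Minimal sketch of using head-tracker yaw/pitch signals to select a viewport
# from a stored panoramic frame. The panorama is assumed to be an
# equirectangular array (rows = pitch, columns = yaw); names are illustrative.
import numpy as np

def viewport(panorama, yaw_deg, pitch_deg, fov_h_deg=60, fov_v_deg=40):
    """Crop the region of the panorama centred on the viewing direction."""
    h, w = panorama.shape[:2]
    # Convert angles to pixel coordinates of the viewport centre.
    cx = int((yaw_deg % 360.0) / 360.0 * w)
    cy = int((np.clip(pitch_deg, -90.0, 90.0) + 90.0) / 180.0 * h)
    half_w = int(fov_h_deg / 360.0 * w / 2)
    half_h = int(fov_v_deg / 180.0 * h / 2)
    # Wrap horizontally (the scene is continuous in azimuth), clamp vertically.
    xs = np.arange(cx - half_w, cx + half_w) % w
    ys = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    return panorama[np.ix_(ys, xs)]

# Example: extract the view for a head turned 45 deg right and 10 deg up.
pano = np.zeros((1024, 2048, 3), dtype=np.uint8)
view = viewport(pano, yaw_deg=45.0, pitch_deg=10.0)
print(view.shape)
```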

Referring to Figs. 15 and 16, these schematically illustrate a lower pyramidal (4 sided) structure 50 which is designed to reflect light from its outer sides or facets 50a, 50b onto the inner sides or facets 51a, 51b of a corresponding (4 sided) upper pyramidal structure 51. (While 4 sided pyramids will be described to simplify the description and drawings, they are most suitable for imaging close up objects, for example, the interior walls of pipes in a pipe inspection tool, and other structures are described which have more sides or facets and are better suited to imaging distant objects. The pyramidal structures are located one above the other and their vertical spacing will depend on the application. The upper segment of the hemispherical scene will be obscured by the upper structures, but this may be of no consequence, say in the case of using 6-8 sided pyramidal structures 51 for distant objects, for example, where the sky is not imaged in surveillance equipment for, say, a car park.) The inner sides 51a, 51b each reflect incident light onto an opposite inner side of a downwardly facing pyramid 52, which in turn directs light downwardly onto the sides 14a, 14b of a CCD sensing structure. (The structures 50 and 51 are slices of a regular pointed pyramid, whereas pyramid 52 is regular.) Structures 50 and 51 are identical apart from the fact that in structure 50 the outwardly pointing sides or facets 50a, 50b... are reflective, whereas in structure 51 the inner, downwardly facing sides or facets 51a, 51b... are reflective. While pyramid 52 differs in shape, the lengths of the triangular sides of 50, 51 and 52 are the same and all sides are inclined at 45 degrees to the vertical. Similarly, the sides of the CCD structure 14a, 14b... are of complementary size and shape as shown in the drawing. However, instead of having a CCD structure with sloping sides, the CCD can extend in the horizontal plane and be either made up of triangular CCDs, or a single planar CCD with its surface sub-divided into triangular sectors. The reason for using sloping sides for the CCD structure is to compensate for any trapezoidal distortion caused by the inclination of the reflecting surfaces, with respect to the part of the panoramic scene which is reflected and which is also imaged on the corresponding CCD or sector of a CCD. This significantly reduces the amount of processing required for transforming image signals to remove any distortion introduced by the optical system in casting images onto surfaces which are not designed to provide the usual image plane.

Generally, if planar reflecting facets are used to capture images from parts of the panoramic scene, these planar surfaces do not themselves introduce any distortion (compared with the severe spherical distortion which can be introduced by peripheral edge portions of a fisheye lens). However, if the image plane is flat, but at an angle to the reflecting surface, this can introduce trapezoidal distortion. This is not by any means as serious a problem as dealing with the spherical edge distortion in fisheye lenses. Therefore, even with a CCD extending in the horizontal plane for imaging sectors of the panoramic scene reflected from the facets, the amount of processing required to remove just trapezoidal distortion is minimal.
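For illustration of how little processing such a trapezoidal (keystone) correction needs, the sketch below (Python with OpenCV, illustrative only) warps the trapezoid seen by an inclined sensor back to a rectangle using a standard four-point perspective transform; the corner coordinates and output size are assumptions for the example.

```python
# Minimal sketch of removing trapezoidal (keystone) distortion by mapping the
# four corners of the trapezoid seen by the sensor back to a rectangle, using
# OpenCV's standard perspective-warp routines. The corner coordinates below
# are illustrative assumptions, not values from the patent.
import cv2
import numpy as np

def correct_keystone(img, trapezoid_corners, out_w=400, out_h=300):
    """Warp the trapezoidal facet image into an undistorted rectangle."""
    src = np.float32(trapezoid_corners)           # TL, TR, BR, BL in the image
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(img, homography, (out_w, out_h))

# Example: a facet imaged at 45 degrees appears wider at the bottom than the top.
facet = np.zeros((300, 400, 3), dtype=np.uint8)
corners = [[60, 20], [340, 20], [390, 280], [10, 280]]
flat = correct_keystone(facet, corners)
print(flat.shape)  # (300, 400, 3)
```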

The design of the structure shown in Figs. 15 and 16 is such that only light reflected by the face or facet 50a is reflected by the optical system onto the respective side or sector of the CCD structure. Therefore, the side or sector 14a, for example, can only see light which is derived from that part of the panoramic scene which is first reflected in the facet 50a, the same applying to the other faces and CCDs or parts of CCDs. The drawing has been simplified, for ease of understanding, because better results will be obtained by using a convex polygonal structure having more than four sides (like a many-sided pyramid), since this will have more facets for reflecting respective parts of the panoramic scene. For example, pyramidal structures 50 and 51 can be replaced by bands having say eight or twelve facets, each band symmetrically encircling the central axis through structures 50, 51, 52 and 14. In this case, in order to turn the incident light through 180 degrees after reflection from the facet (50a, 50b...), the pyramid 52 would be replaced with a pointed structure having a corresponding number of facets to provide the same effect as that shown with the four sided pyramid structures. The greater the number of facets, the more parts of the panoramic scene will be reflected and these parts will conform more closely to the continuous spherical image which can be obtained with, for example, hemispherical or parabolic convex reflectors or fisheye lenses. Moreover, a greater number of facets facilitates reflecting parts of the panoramic scene with an overlap at adjacent sides of the facets. This overlap assists in aligning the reflected images from each of the facets in order to provide continuous seams between the parts of the panoramic scene.

In the modification shown in Figs. 17 and 18, the facets are curved, as seen in the perspective view of one facet. However, the overall effect is equivalent to that of flat reflecting facets, since the spherical distortion introduced by the lower convex facet is compensated by the curvature of the upper concave facet.

Fig. 19 shows a modification of the pyramidal type of structure where the inclined outer reflecting faces or facets 60 of a convex regular pyramid 61 reflect light incident from the panoramic scene onto the upper reflector 18, which in turn reflects light downwardly through apertures 62 in the inclined faces or facets. Located beneath each aperture is a camera or CCD sensing system 63 which captures that part of the panoramic image reflected by the respective facet 60 and part of reflector 18. The aperture 62 can be open, but in order to hide the camera lens from view, the aperture may, for example, be semi-silvered so that it is partly transparent; it can also be clearly transparent. Fig. 20 shows a back-to-back arrangement based on the embodiment of Fig. 19, i.e. with two six-sided mirrors. Similar parts are given the same reference numerals.

Whilst Figs. 19 and 20 illustrate respectively six and twelve sided reflective structures, the number of sides or facets can be increased to provide a more continuous panoramic view.

Fig. 21 shows an arrangement based on that of Fig. 19, but with the addition of a wide angle lens system 120 arranged in an aperture 121 in the centre of the upper planar reflector 122 (similar to reflector 18 in Fig. 19). This lens system 120 provides a view of a segment of a hemisphere, whereas the reflecting arrangement 60, 122 provides an annular field of view. The upper segment is imaged in CCD 123, whereas the lens or camera arrangements/CCDs 63 are as shown in Fig. 19, and image the scene in the annular field of view. An important advantage is that the lens system 120 can be selected so that it has a wide field of view but does not introduce optical distortions or aberrations over the field of interest, because the annular field of view of the reflecting system 60, 122 can provide an image of the annular segment of the panoramic scene with little or no distortion.

Fig. 22 shows a back-to-back arrangement based on Fig. 21, for 360x360 viewing.

Figs. 23a-23c are mapping diagrams for the structure shown in Fig. 22. In this case, the images in the lenses 120a, 120b have been divided into equal sectors 1A-1F and 2A-2F respectively, whereas the sides of the pyramidal structure are designated 3-8 and 9-14 respectively, in the polar co-ordinate diagrams of Figs. 23a, 23b. These map to the cartesian co-ordinates shown in Fig. 23c.

The lens arrangement 120 in the apex of the pyramidal structure can also be incorporated in the other arrangements described below (which have been simplified to show the pyramidal reflecting arrangement only). Fig. 24 is based on the embodiment of Fig. 20 but, instead of cameras 63 located just below the apertures 62, optical systems 64 are provided which collect incident light and channel it through optical guides 65 to a manifold system 66 situated above a CCD 67. This embodiment is useful for composing images directly onto CCD 67 as a montage of respective parts of the scene reflected by the facets 60. Instead of using a single CCD, individual CCDs can be located at the outlets of the optical guides 65.

Fig. 26 shows a variation on Fig. 21 where the reflective facets 60 capture parts of the panoramic scene and lens systems 68 receive incident light from the reflecting surfaces 60 which is then conducted via optical guides 65 to manifold 66 over CCD 67.

Fig. 25 shows a detail of the manifold 66 where some light guides 65a enter vertically and others 65b are curved because they are directed upwardly from lower reflective surfaces in a back-to-back assembly of convex reflectors. In this case, four optical guides montage the image from an eight-sided structure.

The advantage of using optical guides, such as optic fibres, is that it is easier to construct the imaging unit, because a plurality of images can be conducted from respective facets through a smaller aperture (62) or 70 (Fig. 20) in the systems described. Suitable fibre optic image conduits are available from Edmund Industrial Optics and these are manufactured with standard and high resolution, and are made of fused glass optical fibre rods with round and polished faces. Images are transmitted from one polished face to the other and the rods can be bent under heat to the required orientation. The same company provides fibre optic tapers and face plates, where a taper is a coherent fibre optic plate for transmitting either a magnified or reduced image from its input to its output surface, and a fibre optic face plate is a coherent fibre optic plate that precisely transmits an image from its input surface to its output surface. These can be used, for example, to transfer high resolution images in CRT/LCD displays and CCD coupling.

Fig. 27 shows another embodiment of omnidirectional imaging apparatus which employs lenses in imaging units 12, 12', each unit having a convex Fisheye, wide angle or panoramic lens part 100, 100' and a secondary (concave or convex) lens part 102, 102' arranged on the same optical axis. The outwardly facing Fisheye lens systems 12 and 12' encompass CCD array 104, comprising a pair of back-to-back planar CCD devices 106, 106'. The imaging units and CCD array may be mounted so that the optical axis has any desired orientation, for example vertical, horizontal or otherwise. An advantage provided over previously described embodiments employing mirrors in the imaging units is that the lenses 100 and 102 have no blind spots and provide self focussing. They can also be less prone to vibration than the mirrors, improving performance. By providing the CCD devices 106, 106' between the imaging units 12, 12', the CCD devices do not obscure the scenes being imaged by the imaging units. By using two CCD devices rather than one to collect the image data, the superior resolution of the resultant image allows for cropping and scaling of the image in a navigable or interactive video. This is desirable as Fisheye lenses have such a wide field of view that mid-range objects appear distant in the resultant image. To restore a normal view, the user would zoom or scale the video image, causing granularity in the image. The superior resolution provided by two CCD devices rather than one ameliorates this problem.

A problem of the arrangement shown in Fig. 27 is that the backworkings of each fisheye lens are on the same optical axis, making the arrangement very long and cumbersome. This problem is solved in the embodiment shown in Fig. 28. In this case, each fisheye lens has been effectively separated into two parts 130a, 130b and 130c, 130d, arranged on optical axes which turn through 90 degrees. The outer parts of these lenses 130a, 130c encompass an inner region 131 in which a double sided reflector 132 is provided (or a prism for the same purpose). This reflects light, for example, from part 130a to part 130b, which focuses it on CCD 133a to provide the upper panospherical image. The same applies to the parts 130c, 130d and the CCD 133b.

Fig. 29 shows a similar arrangement for 360x180 viewing, but where a prism 134 is used. Fig. 30 shows a preferred form of housing 135 for the arrangement shown in Fig. 28.

Fig. 31 shows an embodiment in which the Fisheye lenses 100, 100' are covered with an active cover or coating 108, 108' which acts to substantially prevent "lens flare". Lens flare occurs when a lens is pointed towards the sun or other bright artificial light source, which creates a "flare" over the resultant image that can obscure a significant proportion of the image. As will be appreciated, if the imaging apparatus is used for external surveillance, depending on the situation of the apparatus, lens flare due to the sun may be a common occurrence. Further, flare may be induced deliberately, for example by a hand-held laser device, in order to "burn out" the CCD array 104.

Since the apparatus is used to provide an omnidirectional image, use of conventional lens shades and flags to prevent lens flare is not suitable. Suitable shields 108, 108' may be formed from TOLED (Transparent Organic Light Emitting Diode, as described in US patent no. 5,703,436), FOLED (Flexible Organic Light Emitting Diode) or standard TLCD (Transparent Liquid Crystal Display) in the form of a cover, dome, coating or shield arranged before, or extending over, the convex outer surfaces of the lenses 100, 100' and connected to the image processing apparatus for processing the image signals output from the CCD array.

The shields are responsive to input signals for causing local changes in shade or colour of the shields so as to selectively shield the imaging units 12, 12'. This is shown in more detail in the perspective view (Fig. 32) of the surface of shield 108 (108'), where an element 109 has been activated to block lens flare due to (say) the position of the sun with respect to the shield.

As a flare at the extreme end of the spectrum strikes one, or both, of the lenses 100, 100', it is transmitted from the CCD array 104 to the image processing apparatus where its presence is detected from histogram tables stored in the processing apparatus. The location of the flare is determined from an image map stored in the look-up tables. The processing apparatus then forwards pixel co-ordinates of the flare to a graphics card which transmits an appropriate signal to the coating 108, 108' to produce a "mask" of 100% RGB (red/green/blue) in the portion of the coating through which the flare passes to strike the lens 100, thereby effectively blacking out the flare. This also reduces, or removes, possible refraction around the lenses.
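Purely as an illustrative sketch (Python, not the patent's implementation), the function below detects near-saturated pixels in a sensor frame and converts their positions into the element coordinates of the shield to be driven opaque; the saturation threshold, shield grid size and simple block mapping are assumptions for the example.

```python
# Minimal sketch of the flare-masking step: saturated pixels are detected in
# the sensor image and their coordinates are converted into the element
# coordinates of the shield to be blacked out. The threshold, grid size and
# mapping are illustrative assumptions, not values from the patent.
import numpy as np

def flare_mask(sensor_img, shield_shape=(32, 32), threshold=250):
    """Return a boolean mask of shield elements covering saturated regions."""
    flare_pixels = sensor_img >= threshold          # near-saturated pixels
    sh, sw = shield_shape
    ih, iw = sensor_img.shape
    mask = np.zeros(shield_shape, dtype=bool)
    # Activate any shield element whose corresponding sensor region contains flare.
    for row in range(sh):
        for col in range(sw):
            block = flare_pixels[row * ih // sh:(row + 1) * ih // sh,
                                 col * iw // sw:(col + 1) * iw // sw]
            mask[row, col] = block.any()
    return mask

# Example: a bright spot near the top-left of a 480x640 frame.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[40:60, 80:100] = 255
print(np.argwhere(flare_mask(frame)))  # shield elements to drive opaque
```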

In the embodiment shown in Fig. 31, the CCD array 104 comprises a pair of back to back planar CCD devices which, in combination with the two imaging units 12, 12', makes the apparatus relatively long. A significant gap between the two Fisheye lenses 100, 100' results in a blind spot around the middle of the imaging field. In order to reduce the height of the apparatus, in the embodiment shown in Fig. 33 a double sided planar, or non-planar, reflector 110 or prism is disposed between the lenses 102, 102' at approximately 45° to the optical axis to reflect the images on to the CCD devices 106, 106' arranged with surfaces substantially parallel to the optical axis and to the side of the apparatus. This enables the imaging units 12, 12' to be brought closer together, reducing the height of the apparatus and reducing the size of, or completely removing, the blind spot around the middle of the composite image. In the preferred arrangement, the imaging units 12, 12' are as close together as possible. Furthermore, as the CCD devices 106, 106' are no longer arranged back to back, air cooling of the CCD devices is improved, and as the CCD devices have a common focus and shared angle of incidence, any difference in the view point of the CCD devices is removed.

This arrangement enables further imaging lenses 112, 112' to be inserted between the reflector 110 and the CCD devices 106, 106' to reduce image distortion without further increasing the height of the apparatus. The use of wide angle lenses produces heavy optical distortion, requiring significant transforming by the image processing apparatus. Indeed, distortion may be such that close-up objects cannot be accurately imaged. Therefore, in the embodiment shown in Fig. 34, each imaging unit includes a further imaging lens 114, 114' to reduce image distortion. As shown in Figs. 34 and 35, the reflector 110 may have convex, concave or otherwise shaped reflecting surfaces to resolve optical distortion prior to image reception by the CCD devices 106, 106'.

The mapping operation for the back to back fisheye embodiments is similar, in principle, to that already explained above in connection with Fig. 3a. For example, as shown in Figs. 36a-36c, the circular images produced by each imaging unit are divided into regions 1-8 and these are mapped onto cartesian co-ordinate planes as shown in Fig. 36a (which receives the upper image) and Fig. 36b (which receives the lower image). These cartesian co-ordinate planes are then further mapped onto a single cartesian co-ordinate image plane, shown in Fig. 36c, where the regions 1-8 are assembled adjacent one another so as to reproduce a visually acceptable omnidirectional image. As shown in Fig. 36c, the horizontal rows represent a 360° horizontal view, whereas the columns represent the 360° (+/-) vertical view. The mapping operation can be carried out with the aid of co-ordinate look-up tables whereby the resultant transformation of the displayed images reconstructs the required composite visual image.

In all of the above embodiments, the imaging units are used to provide images of respective scenes which are received by image sensors and output to image processing apparatus to form a single composite image of the respective images. In an alternative embodiment shown in Fig. 37, which is based on the embodiment shown in Fig. 33, the CCD devices 106, 106' are replaced by projectors 120, 120', each in the form of a quartz bulb 122, 122' and a transparent LCD active matrix display, or TOLED, 124, for projecting respective halves of an omnidirectional scene via the imaging units 12, 12' on to a spherical display surrounding the viewer(s), thus providing the viewer(s) with an omnidirectional viewing experience of live or recorded material. The display screen may be a conventional white screen covered with a high gain material. The viewer(s) would be situated at the centre of the display, accessed by flush fitting doors coated with display material. Cooling for the projectors may be provided by any suitable means, for example a fan collar located around the outside of the apparatus and propelled by an externally mounted motor.

Figs. 38-42 show alternative embodiments in which a lens arrangement 150 is located in an aperture 151 in an upper reflector 18 for providing an image of an upper segment, and a convex reflector 16 co-operates with reflector 18 to provide an annular field of view. Similar components have been identified by similar reference numerals and hence have not been explained in detail. However, the following description identifies different components which are generally found in the upper (and lower) region of each embodiment. As these are symmetrical, only the upper unit has been described.

In Fig. 38, convex reflector 151 reflects light from the scene onto a smaller convex reflector 152, which in turn reflects light through the lens system 150 onto CCD 153.

In Fig. 39, the lens system 154 images the upper (lower) hemisphere directly without the need for any other reflecting arrangements.

Fig. 40 shows a planar reflector 155 which reflects light from the scene onto the small convex reflector 152 which in turn reflects light onto lens 150.

In Fig. 41, concave reflector 156 reflects onto convex reflector 151, which in turn reflects onto lens 150.

In Fig. 42, convex reflector 157 reflects light onto concave reflector 158 which reflects light to lens 150.

Generally, where concave and convex reflecting surfaces are arranged to co-operate, they can be high quality reflecting surfaces, including parabolic and hyperbolic primary and secondary mirror combinations which remove optical distortions. Also, the convex and concave surfaces can form a reflecting system similar to that known as a Cassegrain objective, using a parabolic concave primary mirror with a hyperbolic convex secondary mirror, as described for example in "Modern Optical Engineering: The Design of Optical Systems", Warren J. Smith, pages 476-478. Also, any reflecting surfaces used in any of the embodiments described herein can be specular, including vacuum deposited materials, to achieve high resolution and reflectivity without the chromatic aberrations in glass lenses.

Fig. 43 shows another embodiment with split fields of view for 360x360° viewing. In this case, instead of using back to back parabolic reflectors for the inner annular fields of view, a back to back planar/pyramidal reflecting arrangement (18, 60) is used instead. This is similar to the arrangements shown in Fig. 20 and hence the construction and operation are the same. Also, in this case, the upper and lower segments of the scene are imaged in respective convex parabolic and planar (or concave) reflecting systems (16, 18), which are similar to that shown in Fig. 5 (except that they are now separated by the back to back pyramidal arrangements; otherwise the construction and operation of these outer, spaced parabolic/planar reflecting systems is the same). Fig. 44 shows the same kind of arrangement, but for 180x360° viewing.
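For the Cassegrain-type primary/secondary combination mentioned above, the effective focal length of the objective follows the standard Gaussian two-element combination formula; the short sketch below (Python, with purely illustrative values not taken from the patent) evaluates it.

```python
# Minimal sketch of the standard two-element combination formula,
# 1/f = 1/f1 + 1/f2 - d/(f1*f2), relating the primary and secondary focal
# lengths and their separation to the effective focal length of the objective.
# The numerical values below are illustrative assumptions only.
def effective_focal_length(f_primary, f_secondary, separation):
    return 1.0 / (1.0 / f_primary + 1.0 / f_secondary
                  - separation / (f_primary * f_secondary))

# Example: 200 mm concave primary, -50 mm convex secondary, 160 mm apart
# gives an effective focal length of 1000 mm (a 5x extension by the secondary).
print(round(effective_focal_length(200.0, -50.0, 160.0), 1))
```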

The various reflecting systems described herein can be combined in any way to provide the annular and segmental fields of view (in each 180x360° imaging unit) and the embodiments described with reference to the drawings are only examples of the invention. Any embodiment can have different geometries, optical components can be independently or mutually movable, and the lens system, or the shapes and dispositions of the curved and planar reflectors, can be varied so that the respective part or parts of a hemispherical or spherical scene are correctly focussed and composed on the image plane. The scene will vary in accordance with the application and it may include distant objects, as in the case of surveillance, or near objects (as in the case of pipe inspection), or something in an intermediate field, such as the interior of a room or a vehicle. Also, where vertical arrangements have been described for viewing upper and lower fields, these can be turned on their sides for horizontal viewing of the left and right hand sides of a scene. Therefore, any changes, modifications or adjustments that are necessary in order to compose the scene correctly will be understood by those skilled in the art. Various features are described above in connection with preferred embodiments of the invention and those skilled in the art will know that they can be used in various other combinations than those shown and described. This also applies to features illustrated in the drawings and included in the claims. Therefore, all such features can provide basis for other claimed combinations in this application, or any divisions.

Claims

1. Imaging apparatus comprising at least two imaging units, each unit including optical means with a wide field of view, the apparatus also including image sensing means for receiving an image from the optical means so as to provide a corresponding output; the imaging units being adjacently positioned (a) so as to encompass either the image sensing means, or image diverting means, which diverts images to the image sensing means, and (b) so that the output can be used to form a single composite image in the same plane.
2. Apparatus according to Claim 1, wherein each imaging unit includes first and second reflecting means, the first reflecting means being convex for reflecting an image from a panoramic scene onto the second reflecting means, the second reflecting means being arranged to reflect the image onto the image sensing means.
3. Apparatus according to Claim 2, wherein the first or convex reflecting means is hemi-spherical, parabolic, hyperbolic, ellipsoidal, or of a polygonal type where the polygon includes a plurality of planar or curved facets surrounding a central axis through the convex reflecting means.
4. Apparatus according to either Claim 2 or 3, wherein the second reflecting means is planar, concave or convex (hemispherical, parabolic, hyperbolic, ellipsoidal).
5. Apparatus according to Claim 4, wherein the first and second reflecting means have curved surfaces which are in confocal relationship or are catadioptric, or form Cassegrain objectives.
6. Apparatus according to any preceding claim, including a telecentric lens for reducing distortions in light incident on the image sensing means.
7. Apparatus according to any of Claims 2-6, wherein the first convex reflecting means has one or more apertures or light ports for receiving light reflected from the second reflecting means, whereby incident light passes to image sensing means.
8. Apparatus according to Claim 7, wherein the first or convex reflecting means is of the polygonal type, including a plurality of facets surrounding a central axis through the convex reflecting means, and wherein either a central aperture is provided in the apex of the convex reflector, or apertures are provided at the midpoint of each facet.
9. Apparatus according to Claim 7, wherein a system for gathering light is located beneath the or each aperture, and the system includes a light guide, such as an optic fibre, for conducting light to the sensing means.
10. Apparatus according to Claim 9, wherein a light gathering system is provided beneath each aperture in each facet and a plurality of light guides are connected to a manifold for composing respective parts of a panoramic scene (imaged in the respective facets) directly onto the image plane of the image sensing means.
11. Apparatus according to Claim 7, wherein a light gathering system is located directly opposite each respective facet in the convex reflector, the light being conducted, via a light guide, to the sensing means.
12. Apparatus according to Claim 7, wherein the convex reflector comprises a first set of facets, which are the sides of the polygonal convex reflector having an axis of symmetry, the second reflecting means including a second set of facets for reflecting light through a light port at the apex of the convex reflector, the second set of facets being arranged to reflect light which is incident on them only from the respective facets in the first set, whereby the image sensing means separately and respectively receives light from those parts of the panoramic scene reflected in the first set of facets of the convex reflector.
13. Apparatus according to any of Claims 2-12, wherein the first and second reflecting means are mutually adjustable to enable the panoramic scene to be composed correctly on the image sensing means.
14. Apparatus according to Claim 13, including means for manually or automatically adjusting the distance between back to back imaging units, the distance between the first and second reflectors in each imaging unit, or for independently or simultaneously pivoting either the first or second reflector about an axis, or tilting the same, or displacing the same with respect to one another.
15. Apparatus according to any preceding claim, wherein the image sensing means is provided with zoom and/or is mounted in gimbals for making image adjustments.
16. Apparatus according to Claim 1, wherein parts of the panoramic scene are imaged by planar facets in a reflecting system, the reflecting system being adjustable so that each adjacent facet sees respective adjacent sectors of the panoramic scene, whereby parts of the scene can be joined together in the composite image.
17. Apparatus according to Claim 1, wherein each imaging unit includes a wide angle lens system, the lens system encompassing the image sensing means which is provided with a spherically curved image receiving surface that is shaped to reduce spherical distortions introduced by the wide angle lens system.
18. Apparatus according to Claim 1, wherein each imaging unit includes a wide angle lens system, which encompasses the image diverting means, each wide angle lens system being separated into two parts, a first part being located so as to gather light from the panoramic scene whereby it is incident on the image diverting means, and the other part being located so as to receive light from the image diverting means, whereby the length of the back workings in the wide angle lens system is reduced on the longitudinal axis through the first parts of each lens system and the image diverting means.
19. Imaging apparatus for producing stereoscopic images of a panoramic scene, the apparatus including a reflecting system for receiving light from said scene; first and second lens systems located in the same plane, the lens systems having respective optical axes which are spaced apart in said plane and symmetrical with an axis through a central axis of said reflecting system, and image sensing means for receiving focussed stereoscopic images of said scene from said lens systems; said stereoscopic images being formed by light with different optical properties, such as different colours or polarisations, whereby stereoscopic images derived from the image sensing means can be superimposed to give a 3D effect, whereby all or any part of the panoramic scene can be viewed in 3D.
20. Apparatus according to Claim 19, wherein the reflecting system includes a convex reflector.
21. Apparatus according to Claim 20, further including a second reflector, planar or curved, for reflecting light from the convex reflector through a light transmitting port in the apex of the convex reflector.
22. Apparatus according to any preceding claim, comprising shielding means for selectively shielding the imaging units from one or more predetermined wavelengths of electromagnetic radiation.
23. Apparatus according to Claim 22, including means adapted to control said shielding means according to the output of the image sensing means and which acts to cause a local change in the shielding means, or an overall change, or a change in selected regions in order to act as an iris, or any combination of the same.
24. Apparatus according to Claim 22 or 23, wherein said shielding means comprises means arranged before, or extending over, the exterior surfaces of the imaging units and responsive to input signals for causing local changes in shade, colour, or focus.
25. Apparatus according to Claim 24, wherein the shielding means comprises TOLEDS, OLEDs, FOLEDs or TLCDS controlled by the input signals to provide local changes in shade or colour.
26. Omnidirectional imaging apparatus for viewing a panoramic scene and for displaying an image of the scene in an image plane; the device comprising: optical means for receiving radiation from the scene and for directing it onto a focal plane; sensing means responsive to radiation incident on the focal plane to generate imaging signals; processing means for providing data in which values of the imaging signals, for radiation from different elements of the panoramic scene, are related to corresponding positions of incidence of the radiation on the optical means; the processing means also producing drive signals when the values of the imaging signals differ from a threshold value; and shielding means arranged before the optical means for selectively passing or blocking some or all of the incident radiation; the shielding means being responsive to the drive signals so as to pass or to block the incident radiation for which the values of the respective imaging signals differ from the threshold.
27. Apparatus according to Claim 26, wherein the shielding means comprises screen means, an electrode structure for activating selected elements of the screen means so that said elements either pass or block the incident radiation.
28. Apparatus according to Claim 27, wherein the screen means comprises TOLEDS, OLEDs, FOLEDs or TLCDS and wherein the electrode structure is responsive to the drive signals to provide local changes in the optical properties of the screen means in order to pass or to block the incident radiation.
29. Apparatus according to any of Claims 26-28, wherein the processing means: (a) detects imaging signals having values which differ from the threshold by an extent that could damage the sensing means; and
(b) provides the drive signals necessary to activate automatically a minor part of the shielding means for blocking the damaging radiation to prevent it being incident on the optical means; whilst
(c) enabling non-damaging radiation to pass through the shielding means to be incident on the sensing means to maintain panoramic scene viewing.
30. Apparatus according to any of Claims 27-29, wherein the processing means:
(a) detects imaging signals having values which differ from the threshold by an extent that could damage the sensing means; and
(b) provides the drive signals necessary to activate automatically a major part of the shielding means for blocking the incident radiation to prevent it being received by the optical means; whilst
(c) enabling non-damaging radiation to pass through a selected non-activated minor part of the shielding means to maintain restricted viewing of the corresponding part of the scene.
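Purely as an illustration of the threshold logic of Claims 29 and 30 (the threshold value, argument names and boolean-mask representation are assumptions), the processing means could derive the drive signals as follows: pixels whose values exceed a damage threshold either drive blocking of just the offending region (Claim 29) or drive blocking of the whole shield except one chosen viewing window (Claim 30).

```python
# Illustrative only, with hypothetical names: the thresholding of
# Claims 29 and 30.
import numpy as np

DAMAGE_THRESHOLD = 250  # assumed 8-bit saturation level

def drive_mask(image_gray: np.ndarray, protect_whole: bool = False,
               keep_window: tuple[slice, slice] | None = None) -> np.ndarray:
    """Return True where the shielding means should block incident radiation."""
    hot = image_gray >= DAMAGE_THRESHOLD          # potentially damaging pixels
    if not protect_whole:
        return hot                                 # Claim 29: block only hot spots
    mask = np.ones_like(image_gray, dtype=bool)    # Claim 30: block everything...
    if keep_window is not None:
        mask[keep_window] = False                  # ...except one viewing window
    return mask
```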
31. Omnidirectional imaging apparatus comprising first optical means having an annular field of view, second optical means having a field of view extending over a segment of a hemisphere and means for optically combining light received from both fields of view for the purpose of generating a video signal output, the first and second optical means being co-axially disposed so that their combined fields of view are hemispherical.
32. Apparatus according to Claim 31, wherein the first optical means comprises a first convex reflector and a planar, concave or second convex reflector for reflecting light through an aperture or apertures in the first convex reflector; and wherein the second optical means comprises an aperture or lens having the field of view extending over the segment of the hemisphere.
33. Apparatus according to Claim 31, wherein the aperture or lens is centrally located in the first planar, concave or second convex reflector so as to direct light from the segmental field through the aperture in the apex of the first convex reflector; the light from the annular field also being directed through the aperture in said apex.
34. Apparatus according to Claim 31, wherein the lens is part of an imaging device which captures image information directly from the segmental field; the light from the annular field being received by another imaging device.
35. Apparatus according to Claim 31, wherein the first optical means comprises a first convex reflector and a first planar, concave or second convex reflector for providing the annular field of view; and wherein the second optical means comprises a concave reflector and a third convex reflector for reflecting light from a scene to provide the segmental field of view.
36. Apparatus according to Claim 31, including an aperture or lens for directing light from the segmental field through the aperture in the apex of the convex reflector together with the light from the annular field.
37. Apparatus according to Claim 31, wherein the first optical means comprises a convex reflector and a first lens or imaging device for receiving light reflected directly from the convex reflector so as to provide the annular field of view, and wherein the second optical means comprises a second lens or imaging device having the field of view extending over a segment of a hemisphere.
38. Apparatus according to Claim 31, wherein the first optical means comprises a convex polygonal structure including a plurality of reflecting facets surrounding a central axis and a second reflector opposite said convex reflector for reflecting light through respective apertures in said facets, whereby part of the scene in the annular field of view is reflected by the corresponding facet and is incident on image sensing means coupled to the aperture; and wherein the second optical means comprises either a lens system located in the second reflector and on said central axis, said lens having its optical axis aligned with said central axis, or a parabolic convex reflector and a second reflector opposite the convex reflector for reflecting light into an aperture at the apex of the convex reflector, whereby light from the segmental field of view is received by the or other image sensing means.
39. Apparatus according to Claim 31 wherein the first and second optical means respectively comprise a convex polygonal structure including a plurality of reflecting facets surrounding a central axis and a second reflector opposite said convex reflector for reflecting light through respective apertures in said facets, whereby part of the scene reflected in the corresponding facet is incident on image sensing means.
40. Apparatus according to Claim 31, wherein the convex reflector, which is part of the first optical means for providing the annular field of view, has an aperture at its apex for admitting light from both said fields of view, the optical combining means being located at or below the aperture to receive the light incident from both fields of view.
41. Apparatus according to Claim 31, wherein the second optical means comprises a concave reflecting surface for receiving light from a scene and reflecting it on to a convex reflecting surface which in turn reflects light into the optical combining means.
42. Apparatus according to Claim 31, wherein the second optical means comprises a convex reflecting surface for receiving light from a scene and reflecting it onto a concave reflecting surface which in turn reflects light into the optical combining means.
43. Apparatus according to Claim 41 or 42, wherein the concave reflecting surface and the convex reflecting surface of the second optical means are confocal parabolic and hyperbolic structures.
44. Apparatus according to any of Claims 31-43, wherein the optical combining means comprises any one of a prismatic device, light guide means, reflecting means, or any combinations thereof, for directing light from the aperture in the apex of the convex reflector to an imaging surface of a device for converting optical images into image signals.
45. Apparatus according to any of Claims 32-44, wherein the convex reflector which is part of the first optical means comprises planar reflecting surfaces which are inclined so as to form the bases of a regular polyhedron.
46. Apparatus according to Claim 45, wherein the planar reflecting surfaces of the polyhedron are shaped and/or dimensioned to suit the aspect ratio of an imaging device on which the light from both said fields of view is incident.
47. Apparatus according to Claim 41, wherein the first optical means comprises a first imaging device and a convex reflector which together provide the annular field of view, and wherein the second optical means comprises a second imaging device having the field of view extending over a segment of a hemisphere.
48. Apparatus according to any of Claims 32-45, including means for attaching the apparatus to a camera to provide omnidirectional imaging.
49. Apparatus according to Claim 48, wherein convex reflectors are arranged back to back and a laterally extending member provides the camera attachment.
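For context on how the annular field of view of Claims 31 to 49 is typically handled downstream (this sketch is not taken from the patent; the centre coordinates, radii, output size and nearest-neighbour sampling are assumptions), the ring-shaped image that a convex reflector projects onto the sensor can be unwrapped into a rectangular panorama by a polar-to-rectangular mapping before it is combined with the central segmental view:

```python
# Non-authoritative sketch: unwrapping the annular (ring-shaped) image
# produced by a convex reflector into a rectangular panorama.
import numpy as np

def unwrap_annulus(ring: np.ndarray, cx: float, cy: float,
                   r_inner: float, r_outer: float,
                   out_h: int = 200, out_w: int = 1024) -> np.ndarray:
    """Map the annular image to a panorama by nearest-neighbour sampling."""
    pano = np.zeros((out_h, out_w) + ring.shape[2:], dtype=ring.dtype)
    for y in range(out_h):
        # each output row samples one radius between the inner and outer rings
        r = r_inner + (r_outer - r_inner) * y / (out_h - 1)
        for x in range(out_w):
            theta = 2.0 * np.pi * x / out_w
            sx = int(round(cx + r * np.cos(theta)))
            sy = int(round(cy + r * np.sin(theta)))
            if 0 <= sy < ring.shape[0] and 0 <= sx < ring.shape[1]:
                pano[y, x] = ring[sy, sx]
    return pano
```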
50. An imaging method comprising the steps of: providing an omnidirectional image of a scene; deriving image signals from said image; transforming the image signals into signal data having three degrees of spatial orientation; generating position related signals referenced to an artificial horizon; and using the position related signals to select the image signal data so as to generate a display of at least a part of the image of the scene, so that pitch, roll and yaw of the sensor are transformed into equivalent movements of the displayed image.
51. An imaging method according to Claim 50, wherein movement of the position sensor causes a relevant part of the scene to be tracked and imaged continuously.
52. An imaging method according to Claim 50 or 51, in which operation is in either a multicast mode, where the whole of the omnidirectional image data is processed, or a selected zone mode, where only part of the scene is viewed at any one time and the processing tracks movement around the scene as different parts are viewed.
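As a hedged sketch of the method of Claim 50 (the equirectangular source format, field-of-view values and function name are assumptions, and roll is approximated by the nearest 90-degree rotation to keep the example dependency-free), pitch, roll and yaw from a position sensor can select and orient the displayed window of the omnidirectional image so that sensor movement becomes equivalent movement of the view:

```python
# Hedged sketch of the view-selection step of Claim 50.
import numpy as np

def select_view(pano: np.ndarray, yaw_deg: float, pitch_deg: float,
                roll_deg: float, fov_deg: float = 60.0) -> np.ndarray:
    """Extract a viewing window from an equirectangular panorama (H x W [x C])."""
    h, w = pano.shape[:2]
    win_w = int(w * fov_deg / 360.0)
    win_h = int(h * fov_deg / 180.0)
    # yaw picks the horizontal centre; pitch offsets it from the artificial horizon
    cx = int((yaw_deg % 360.0) / 360.0 * w)
    cy = int((90.0 - pitch_deg) / 180.0 * h)
    cols = np.arange(cx - win_w // 2, cx + win_w // 2) % w   # wrap around in azimuth
    rows = np.clip(np.arange(cy - win_h // 2, cy + win_h // 2), 0, h - 1)
    view = pano[np.ix_(rows, cols)]
    # roll approximated here by the nearest 90-degree in-plane rotation
    return np.rot90(view, k=int(round(roll_deg / 90.0)) % 4)
```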
PCT/GB2001/001115 2000-03-16 2001-03-14 Imaging apparatus WO2001068540A2 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
GB0006396.6 2000-03-16
GB0006396A GB2360413A (en) 2000-03-16 2000-03-16 Wide angle parabolic imaging and image mapping apparatus
GBGB0018017.4A GB0018017D0 (en) 2000-03-16 2000-07-21 Imaging apparatus
GB0018017.4 2000-07-21
GB0019850.7 2000-08-11
GB0019850A GB0019850D0 (en) 2000-03-16 2000-08-11 Imaging apparatus
GB0021433.8 2000-08-31
GB0021433A GB0021433D0 (en) 2000-08-31 2000-08-31 Omnidirectional imaging apparatus
GB0023786A GB0023786D0 (en) 2000-09-28 2000-09-28 Omnidirectional imaging apparatus
GB0023786.7 2000-09-28
GB0028094.1 2000-11-17
GB0028094A GB0028094D0 (en) 2000-08-31 2000-11-17 Omnidirectional imaging attachment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AU40828/01A AU4082801A (en) 2000-03-16 2001-03-14 Imaging apparatus
AU7264701A AU7264701A (en) 2000-07-21 2001-07-19 Stereoscopic omnidirectional imaging devices
PCT/GB2001/003251 WO2002008817A2 (en) 2000-07-21 2001-07-19 Stereoscopic omnidirectional imaging devices

Publications (2)

Publication Number Publication Date
WO2001068540A2 true WO2001068540A2 (en) 2001-09-20
WO2001068540A3 WO2001068540A3 (en) 2002-05-16

Family

ID=27546590

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2001/001115 WO2001068540A2 (en) 2000-03-16 2001-03-14 Imaging apparatus

Country Status (2)

Country Link
AU (1) AU4082801A (en)
WO (1) WO2001068540A2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4395093A (en) * 1981-05-21 1983-07-26 The United States Of America As Represented By The Secretary Of The Navy Lens system for panoramic imagery
US5115266A (en) * 1989-11-08 1992-05-19 Troje Gerald J Optical system for recording or projecting a panoramic image
US5563650A (en) * 1992-11-24 1996-10-08 Geeris Holding Nederland B.V. Method and device for producing panoramic images, and a method and device for consulting panoramic images
US5760826A (en) * 1996-05-10 1998-06-02 The Trustees Of Columbia University Omnidirectional imaging apparatus

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2385840A (en) * 2001-12-04 2003-09-03 Lee Scott Friend Airborne surveillance vehicle
WO2004051340A1 (en) * 2002-12-05 2004-06-17 Daimlerchrysler Ag Panoramic objective and panoramic camera
WO2004066013A1 (en) * 2003-01-17 2004-08-05 Daimlerchrysler Ag Catadioptric camera
FR2861525A1 (en) * 2003-10-24 2005-04-29 Winlight System Finance Wide angle image capturing device for use in, e.g., airplane, has unit selecting light beam that is representative of region of interest of image, and movable digital camera capturing selected beam only
US7087011B2 (en) 2003-12-30 2006-08-08 Gi View Ltd. Gastrointestinal system with traction member
US10080481B2 (en) 2005-02-10 2018-09-25 G.I. View Ltd. Advancement techniques for gastrointestinal tool with guiding element
US9241614B2 (en) 2005-08-01 2016-01-26 G.I. View Ltd. Tools for use in esophagus
EP1838086A1 (en) 2006-03-23 2007-09-26 Samsung Electronics Co.,Ltd. Omni-directional stereo camera and method of controlling thereof
US7877007B2 (en) 2006-03-23 2011-01-25 Samsung Electronics Co., Ltd. Omni-directional stereo camera and method of controlling thereof
FR2902592A1 (en) * 2006-06-12 2007-12-21 Tulip Vision Sarl Panoramic video surveillance system for room, has reflecting mirror returning light rays reflected by dome-shaped mirror towards camera via opening, where reflecting mirror has converging lens at level of orifice on optical axis of camera
US8876730B2 (en) 2007-01-17 2014-11-04 G. I. View Ltd. Diagnostic or treatment tool for colonoscopy
US8106936B2 (en) 2007-03-16 2012-01-31 Kollmorgen Corporation Panoramic video imaging and display system
EP2130084A4 (en) * 2007-03-16 2010-06-30 Kollmorgen Corp System for panoramic image processing
EP2130084A1 (en) * 2007-03-16 2009-12-09 Kollmorgen Corporation System for panoramic image processing
JP2010521879A (en) * 2007-03-16 2010-06-24 コールモージェン・コーポレーション Panoramic image processing system
EP2562578A3 (en) * 2007-03-16 2013-06-26 Kollmorgen Corporation System for panoramic image processing
US10226600B2 (en) 2008-07-30 2019-03-12 G.I. View Ltd. System and method for enhanced maneuverability
EP2353044A1 (en) * 2008-10-02 2011-08-10 Yepp Australia Pty Ltd Imaging system
CN107007988A (en) * 2008-10-02 2017-08-04 Yepp澳大利亚有限公司 System for shooting images in rowing race, and training method
EP2353044A4 (en) * 2008-10-02 2012-05-16 Yepp Australia Pty Ltd Imaging system
AU2015203745B2 (en) * 2008-10-02 2017-03-30 Yepp Australia Pty Ltd Imaging system
CN102227668A (en) * 2008-10-02 2011-10-26 Yepp澳大利亚有限公司 Imaging system
EP2182718A1 (en) * 2008-10-29 2010-05-05 Weistech Technology Co., Ltd. Multi-lens image sensor module
US8702620B2 (en) 2008-11-03 2014-04-22 G.I. View Ltd. Remote pressure sensing system and method thereof
EP2184632A3 (en) * 2008-11-07 2010-07-07 Otus Technologies Limited Panoramic camera
EP2184632A2 (en) * 2008-11-07 2010-05-12 Otus Technologies Limited Panoramic camera
US9794518B2 (en) 2010-10-21 2017-10-17 Sensormatic Electronics, LLC Method and system for converting privacy zone planar images to their corresponding pan/tilt coordinates
EP2737354A4 (en) * 2011-07-25 2015-02-25 Ricoh Co Ltd Wide-angle lens and imaging device
US9019342B2 (en) * 2011-07-25 2015-04-28 Ricoh Company, Ltd. Wide-angle lens and imaging device
WO2013015431A1 (en) 2011-07-25 2013-01-31 Ricoh Company, Ltd. Wide-angle lens and imaging device
US9453991B2 (en) 2011-07-25 2016-09-27 Ricoh Company, Ltd. Wide-angle lens and imaging device
US20140132709A1 (en) * 2011-07-25 2014-05-15 Hiroyuki Satoh Wide-angle lens and imaging device
US9739983B2 (en) 2011-08-31 2017-08-22 Ricoh Company, Ltd. Imaging optical system, imaging device and imaging system
EP2573605A3 (en) * 2011-08-31 2013-04-24 Ricoh Company, Ltd. Imaging optical system, imaging device and imaging system
US9110273B2 (en) 2011-08-31 2015-08-18 Ricoh Company, Ltd. Imaging optical system, imaging device and imaging system
US10295797B2 (en) 2011-08-31 2019-05-21 Ricoh Company, Ltd. Imaging optical system, imaging device and imaging system
US10151905B2 (en) 2012-09-11 2018-12-11 Ricoh Company, Ltd. Image capture system and imaging optical system
JP2014056048A (en) * 2012-09-11 2014-03-27 Ricoh Co Ltd Entire celestial sphere-type imaging system and imaging optical system
US9013544B2 (en) 2012-09-11 2015-04-21 Ricoh Company, Ltd. Image capture system and imaging optical system
US9798117B2 (en) 2012-09-11 2017-10-24 Ricoh Company, Ltd. Image capture system and imaging optical system
US9413955B2 (en) 2012-09-11 2016-08-09 Ricoh Company, Ltd. Image capture system and imaging optical system
JP2015034995A (en) * 2014-09-30 2015-02-19 株式会社リコー Optical system and image capturing system
US10178374B2 (en) * 2015-04-03 2019-01-08 Microsoft Technology Licensing, Llc Depth imaging of a surrounding environment
WO2017116328A1 (en) * 2015-12-30 2017-07-06 Yasar Universitesi 360° shooting device
JP2017058684A (en) * 2016-10-06 2017-03-23 株式会社リコー Optical system and image-capturing system
CN106950791A (en) * 2017-02-22 2017-07-14 奇鋐科技股份有限公司 Panoramic imaging apparatus
EP3480779A1 (en) * 2017-11-01 2019-05-08 Volvo Car Corporation Method and system for handling images

Also Published As

Publication number Publication date
WO2001068540A3 (en) 2002-05-16
AU4082801A (en) 2001-09-24

Similar Documents

Publication Publication Date Title
US3505465A (en) Panoramic television viewing system
US8482595B2 (en) Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof
US9602700B2 (en) Method and system of simultaneously displaying multiple views for video surveillance
JP4643583B2 (en) Display apparatus and an imaging apparatus
US7965314B1 (en) Foveal camera systems and methods
CA2247612C (en) Panoramic viewing system with offset virtual optical centers
EP0982946B1 (en) Compact high resolution panoramic viewing system
EP1765014B1 (en) Surveillance camera apparatus and surveillance camera system
AU714042B2 (en) System and method for wide angle imaging
US5023725A (en) Method and apparatus for dodecahedral imaging system
JP3485261B2 (en) System and method for electronic imaging and processing the hemispherical field of view
EP2008445B1 (en) Improved plenoptic camera
US7306383B2 (en) Compound dome window for a surveillance camera
US7837330B2 (en) Panoramic three-dimensional adapter for an optical instrument and a combination of such an adapter and such an optical instrument
US8248515B2 (en) Variable imaging arrangements and methods therefor
US8305425B2 (en) Solid-state panoramic image capture apparatus
US8648958B2 (en) Variable imaging arrangements and methods therefor
US10162184B2 (en) Wide-field of view (FOV) imaging devices with active foveation capability
JP4247113B2 (en) Method for imaging a panoramic image by rectangular image sensor
US5497188A (en) Method for virtualizing an environment
EP0989436A2 (en) Stereoscopic panoramic viewing system
CN103026700B (en) An image capture device and method
EP2284814A1 (en) Systems and methods for night time surveillance
ES2714278T3 (en) Multi-camera system using folded optics
US7161615B2 (en) System and method for tracking objects and obscuring fields of view under video surveillance

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC (EPO FORM 1205A DATED 26.11.02)

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP