WO2012038601A1 - Optical system - Google Patents

Optical system

Info

Publication number
WO2012038601A1
WO2012038601A1 (PCT/FI2011/050812)
Authority
WO
WIPO (PCT)
Prior art keywords
optical
omnidirectional
component
vehicle
information
Application number
PCT/FI2011/050812
Other languages
French (fr)
Inventor
Mauri Aikio
Sergiu Nedevschi
Original Assignee
Teknologian Tutkimuskeskus Vtt
Priority date
2010-09-22
Filing date
2011-09-21
Publication date
2012-03-29
Application filed by Teknologian Tutkimuskeskus Vtt
Publication of WO2012038601A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K31/00Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K31/0008Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00Systems with reflecting surfaces, with or without refracting elements
    • G02B17/08Catadioptric systems
    • G02B17/0856Catadioptric systems comprising a refractive element with a reflective surface, the reflection taking place inside the element, e.g. Mangin mirrors
    • G02B17/086Catadioptric systems comprising a refractive element with a reflective surface, the reflection taking place inside the element, e.g. Mangin mirrors wherein the system is made of a single block of optical material, e.g. solid catadioptric systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/06Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe involving anamorphosis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/107Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping
    • B60W30/12Lane keeping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/30Sensors
    • B60Y2400/301Sensors for position or displacement
    • B60Y2400/3015Optical cameras
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/06Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Optics & Photonics (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

A first camera and a second camera are integrated together in a common case. Each camera comprises a detecting component and an omnidirectional optical component which directs optical radiation from the environment towards the detecting component by four optical surfaces, each omnidirectional optical component having a first side, a second side and a circumferential side. The detecting components transform optical images formed through the omnidirectional optical components on the detecting components into electrical signals and feed the electrical signals to an image processing unit. The image processing unit forms stereoscopic information on the environment on the basis of the electrical signals from the different detecting components. A user interface presents information to a user on the basis of the stereoscopic information.

Description

Optical system
Field
The exemplary and non-limiting embodiments of this invention relate generally to an optical system for forming wide field of view stereoscopic images.
Background
Stereoscopic images may be formed by two cameras set apart from each other. To obtain a stereoscopic image with a wide field of view, each of the two cameras should have optics suitable for the purpose. Although an omnidirectional stereoscopic image would be desirable, current fisheye cameras cannot have a field of view wider than about 120° to 160°. Many applications require the entire 360° to be observable, which means that at least three stereoscopic camera systems have to be used or the cameras have to rotate.
A plurality of cameras or a rotating system is large and complicated, produces a lot of data, and requires a lot of image processing capacity.
Brief description
According to an aspect of the present invention, there is provided an apparatus as specified in claim 1.
According to another aspect of the present invention, there is provided a method as specified in claim 10.
Preferred embodiments of the invention are disclosed in the dependent claims.
The invention provides advantages. The optical system is simple and may be made compact. It provides image data with high efficiency, and the data is easy to process.
List of drawings
Embodiments of the present invention are described below, by way of example only, with reference to the accompanying drawings, in which
Figure 1 shows the general architecture of the optical system;
Figure 2 shows a camera;
Figure 3 presents the optical surfaces of the omnidirectional optical component;
Figure 4 presents two omnidirectional optical components having second sides facing each other;
Figure 5 shows the optical system placed in a vehicle;
Figure 6 shows a vehicle with the optical system, actuator and a controller; and
Figure 7 presents a flow chart of the method.
Detailed description of some embodiments
Exemplary embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Although the specification may refer to "an", "one", or "some" embodiment(s) in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments. Therefore, all words and expressions should be interpreted broadly and they are intended to illustrate, not to restrict, the embodiment.
A general architecture of an optical system of the apparatus is illustrated in Figure 1 which is a simplified representation and shows only some elements and functional entities. The implementation of the apparatus may differ from what is shown.
The apparatus may comprise a first camera 100 and a second camera 102, an image processing unit 106 and a user interface 108. The user interface 108 may also comprise or be coupled to a control means similar to the one illustrated in Figure 6 by reference number 602. The control means may include a brake pedal and/or a steering wheel, or the control means may be automatic. The cameras may be integrated together and they may reside in a common case 104. The case may be made of plastic or metal, for example. The case 104 may have an optical window 110 for imaging. The cameras 100, 102 may have an at least nearly similar structure and they may operate at least nearly identically.
Figure 2 shows a camera. A camera 100, 102 may comprise a detecting component 222, which may comprise a matrix of pixels, and an omnidirectional optical component 220, which may direct optical radiation coming from the environment of the omnidirectional optical component 220 towards the detecting component 222. In general, a pixel is an element in a matrix of elements which may be parts of a detecting element or a screen. Between the omnidirectional optical component 220 and the detecting component 222 there may be an imaging lens system 223 which may be used to further correct aberrations in the image. The imaging lens 223 may be aspheric. The detecting component 222 may be a CCD cell (Charge Coupled Device) or a CMOS cell (Complementary Metal Oxide Semiconductor), for instance.
The optical radiation is a band of electromagnetic radiation from about 10 nm to about 50,000 nm, i.e. it comprises the bands of ultraviolet, visible and infrared light. However, even a single wavelength or any band in said range can be called optical radiation.
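As a small illustration of that band decomposition, the sketch below uses the conventional 380 nm and 780 nm visible-light limits, which are our assumption rather than values given in the patent:

```python
def classify_wavelength(nm: float) -> str:
    """Place a wavelength within the 10 nm to 50,000 nm band called optical radiation.

    The 380/780 nm visible-light limits are conventional values, not from the patent.
    """
    if not 10.0 <= nm <= 50000.0:
        return "outside the optical band"
    if nm < 380.0:
        return "ultraviolet"
    if nm <= 780.0:
        return "visible"
    return "infrared"

print(classify_wavelength(550.0))  # "visible"
```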
The omnidirectional optical component 220 may be firmly attached to the frame and/or case 104 of the optical system. In that way, the omnidirectional optical component 220 may protect the detecting component 222 and other parts of the optical system from dust and dirt, and hence the optical system does not necessarily need any external protection.
The omnidirectional optical component 220 may generally have a shape resembling a disc which has a first side 226, a second side 228 and a circumferential surface 230 between the first side 226 and the second side 228. The shape of the disc may be understood topologically and hence the disc may be curved and/or stretched. The shape may resemble a disc curved into a form resembling the letter V, for example. An optical axis 232 of the optical radiation directed towards the detecting component 222 may go through a center 236 of the omnidirectional optical component 220.
The cameras 100, 102 may be integrated together in the direction of the optical axes 232, 234 such that an optical axis 232 of one omnidirectional optical component 220 may pass through another omnidirectional optical component 218. Since the cameras 100, 102 may be identical, the components in the different cameras 100, 102 have been numbered using the same numbers.
Figure 3 presents the optical surfaces of the omnidirectional optical component. The omnidirectional optical component 220 may have four optical surfaces 300, 302, 304 and 306. A receiving surface 300 on the circumferential surface 230 and an outputting surface 302 in the middle of the second side 228 may have a concave curvature for refracting optical radiation. The two remaining optical surfaces 304, 306 may reflect optical radiation inside the omnidirectional component 220. A first reflecting surface 304 may be on the first side 226 in the middle of the omnidirectional optical component 220. The first reflecting surface 304 may have a convex curvature, and its reflection may be based on specular reflection. A second reflecting surface 306 may be placed on the second side 228 around the outputting surface 302.
The omnidirectional optical components 218, 220 may be identical, with identical sides and optical surfaces. The omnidirectional optical component 218, 220 may be made of plastic or glass; its diameter may range from less than one centimetre to a few centimetres, and its thickness from a few millimetres to a couple of centimetres, for example. The potential for a small size enables an extraordinarily compact size for the whole optical system.
When beams of optical radiation approach the omnidirectional optical component 218, 220, they may pass through the receiving surface 300, which may diverge the beams, the divergence taking place in the direction of the optical axis 232, 234 due to the concave curvature of the surface. The direction of the optical axis 232, 234 may be considered the same as the direction of a normal of a flat disc corresponding to the omnidirectional optical component 218, 220. The angular aperture α in which the omnidirectional optical component 220 may receive optical radiation may be about 20°. More generally, the angular aperture may be 10° to 60°, for instance. In the transverse direction with respect to the optical axis 232, 234, the beams converge due to the convex curvature of the surface 300, since the surface 300 forms the circumference of a disc.
When the beams of optical radiation have entered the omnidirectional optical component 218, 220, they may reflect from the second reflecting surface 306 towards the first reflecting surface 304 inside the omnidirectional optical component 218, 220. The beams of optical radiation reflected from the second reflecting surface 306 may diverge in the transverse direction with respect to the optical axis 232, 234 thus at least partly compensating the convergence in the receiving surface 300. Similarly, the reflection from the second reflecting surface 306 diverges the beams, the diverging taking place in the direction of the optical axis 232, 234 due to the convex curvature of the second reflecting surface 306.
The first reflecting surface 304 diverges the beams due to the convex curvature of the surface. The beams of optical radiation reflected from the first reflecting surface 304 may travel toward the outputting surface 302, which may pass them towards the detecting component 222. The outputting surface 302 may diverge the beams in the direction of the optical axis 232, 234 due to the concave curvature of the surface. The concave receiving surface 300 and/or the convex first reflecting surface 304 may have a freeform shape. Freeform means a form or shape deviating from the spherical or even aspheric curvature that has been conventional in optics. The freeform shape may be asymmetrical.
The detecting components 222 of the cameras 100, 102 may transform optical images formed through the omnidirectional optical components 218, 220 on the detecting components 222 into electrical signals, and they may feed the electrical signals to the image processing unit 106. The cameras 100, 102 may form still images of the environment at a regular rate. The rate may be, for example, one image per second. Alternatively, the cameras 100, 102 may form a video image of the environment, the rate of which may be a conventional video rate. The frame rate of the video images may be from a few images per second to thousands of images per second, for example. The frame rate of the video images is not limited to that, however.
The image processing unit 106 may form stereoscopic information on the environment on the basis of the electrical signals from the different detecting components 222. The stereoscopic information may also be called 3D (three-dimensional) information, and the potential images thus formed may be called stereoscopic images or 3D images. Because there is a physical distance L between the omnidirectional optical components 218, 220, the images of the different cameras 100, 102 are taken at different angles with respect to an object in the environment. The difference in angle may result in a shift of objects with respect to each other in the images of the different cameras 100, 102, and such deviations between the images of the different cameras 100, 102 may be used to form stereoscopic information on the environment.
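As a minimal sketch of that geometry, assuming a rectified pinhole-camera model (the patent itself leaves the reconstruction method open), the shift of an object between the two images can be converted into a distance; all names and values below are illustrative:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Distance to an object from its pixel shift between the two cameras.

    disparity_px: shift of the object between the two images, in pixels
    focal_px:     effective focal length of the optics in pixels (assumed known from calibration)
    baseline_m:   physical distance L between the omnidirectional optical components, in metres
    """
    if disparity_px <= 0.0:
        raise ValueError("no parallax: object at infinity or matching failed")
    return focal_px * baseline_m / disparity_px


# Example: a 5 cm baseline and a 700 px focal length give 3.5 m for a 10 px shift.
print(depth_from_disparity(10.0, 700.0, 0.05))
```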
The user interface 108 may comprise a screen for presenting information on the basis of the stereoscopic information to a user. The stereoscopic information may comprise information on the environment in three dimensions.
Figure 4 presents an embodiment where the integration of the cameras 100, 102 together may be realized by making the second side 228 of one omnidirectional optical component 220 face the second side 228 of another omnidirectional optical component 218. In general, the cameras 100, 102 may be integrated together such that the first sides 226, the second sides 228, or the first side 226 and the second side 228 of the omnidirectional optical components 218, 220 face each other. Figure 5 presents an embodiment where the stereo camera 500 explained above may be placed in a vehicle 502. The cameras may be placed on the roof of the vehicle, for instance. The cameras may be on top of each other such that the cameras 100, 102 have a vertical displacement, which differs from the usual horizontal displacement. The vertical displacement makes true omnidirectional imaging possible since the cameras do not see each other. Instead of calling the displacement vertical, it may be expressed that the cameras 100, 102 are physically separated in the direction of the optical axis 232, 234.
The image processing unit 106 may form, on the basis of the stereoscopic information, at least one of the following pieces of information: a distance D between the vehicle 502 and an object 504 in the environment, a velocity V1 of the vehicle 502, and a velocity V2 of the object 504. The user interface 108 may present the information. The distance D may be expressed in a scalar form or in a vector form giving information on the direction between the stereo camera 500 and the object 504. Similarly, the velocities V1, V2 may include a speed in scalar form and/or a direction of the speed. That is, the velocities V1, V2 may be formed and presented in a vector form or in a scalar form.
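The patent does not specify how these quantities are computed; one plausible sketch (our assumption, including the frame interval and coordinate conventions) differences two successive stereoscopic position fixes expressed in the vehicle's coordinate frame:

```python
import numpy as np

def relative_velocity(d_prev: np.ndarray, d_curr: np.ndarray, dt_s: float) -> np.ndarray:
    """Velocity of the object relative to the vehicle, from two distance vectors
    taken dt_s seconds apart (both expressed in the vehicle's coordinate frame)."""
    return (d_curr - d_prev) / dt_s

d_prev = np.array([10.0, 0.0, 0.0])             # object 10 m ahead at one instant
d_curr = np.array([9.0, 0.2, 0.0])              # its position 0.1 s later
v_rel = relative_velocity(d_prev, d_curr, 0.1)  # vector form: [-10.0, 2.0, 0.0] m/s
speed = float(np.linalg.norm(v_rel))            # scalar form: about 10.2 m/s
```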
The image processing unit 106 may estimate, on the basis of the stereoscopic information, a future state of at least one of the following: a distance Df between the vehicle 502 and an object 504 in the environment, a velocity V1f of the vehicle 502, and a velocity V2f of the object 504. The user interface 108 may present the information.
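A simple future-state estimate (an assumption on our part; the patent names no estimator) extrapolates the current distance vector with a constant relative velocity, continuing the sketch above:

```python
import numpy as np

def predict_distance(d_curr: np.ndarray, v_rel: np.ndarray, horizon_s: float) -> np.ndarray:
    """Estimated future distance vector Df, assuming constant relative velocity."""
    return d_curr + v_rel * horizon_s

# Continuing the previous example: estimated position of the object 0.5 s ahead.
df = predict_distance(np.array([9.0, 0.2, 0.0]), np.array([-10.0, 2.0, 0.0]), 0.5)
# df == [4.0, 1.2, 0.0]
```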
Figure 6 presents an embodiment where the vehicle 502 comprises at least one actuator 600 for controlling the movement of the vehicle 502, and a controller 602. The controller 602 may receive the stereoscopic information from the image processing unit 106 and control the at least one actuator 600 on the basis of the stereoscopic information. The actuator 600 may be a braking system, for example. Alternatively or additionally, the actuator 600 may be a steering system or the like. Hence, if there is a danger that a person 504 will be run over by the vehicle 502, the controller 602 may command the brakes to stop the vehicle 502 before contact between the person 504 and the vehicle 502, without input from the user. The communication between the stereo camera 500, the image processing unit 106, the controller 602 and the actuator 600 may be performed through a wire or wirelessly.
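The patent leaves the braking decision logic open; a common and purely illustrative choice, assumed here, is to brake when the estimated time to collision (TTC) drops below a threshold:

```python
def should_brake(distance_m: float, closing_speed_ms: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """True when the estimated time to collision falls below the threshold.

    closing_speed_ms is positive when the object and the vehicle approach each
    other; the 1.5 s default threshold is an illustrative assumption.
    """
    if closing_speed_ms <= 0.0:
        return False  # object not approaching, no intervention needed
    return distance_m / closing_speed_ms < ttc_threshold_s

# Example: a pedestrian 6 m away closing at 5 m/s gives a 1.2 s TTC -> brake.
print(should_brake(6.0, 5.0))  # True
```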
The image processing unit 106 may form stereoscopic information on the basis of images from different cameras by a suitable computer program. Such a computer program may be based on a semi-global matching (SGM) method or some modification thereof, for instance.
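For illustration, OpenCV's StereoSGBM is a widely available semi-global matching variant; the sketch below assumes the panoramic views have already been unwrapped and rectified, and the file names and parameter values are generic placeholders rather than anything from the patent:

```python
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical file names for
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # the two rectified views

matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,  # disparity search range; must be divisible by 16
    blockSize=5,        # matching window size
)
# compute() returns fixed-point disparities with 4 fractional bits, hence /16
disparity = matcher.compute(left, right).astype("float32") / 16.0
```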
Figure 7 presents a flow chart of the method. In step 700, optical radiation is directed towards detecting components 222. The directing takes place in two cameras 100, 102, each having an omnidirectional optical component 218, 220 of the shape of a disc with a first side 226, a second side 228 and a circumferential side 230 between the first side 226 and the second side 228. The optical radiation comes from the environment through the omnidirectional optical components 218, 220, the cameras 100, 102 being integrated together such that the first sides 226, the second sides 228, or the first side 226 and the second side 228 of the omnidirectional optical components 218, 220 face each other. The directing is performed by four optical surfaces 300, 302, 304, 306, of which a receiving surface 300 on the circumferential side 230 and an outputting surface 302 in the middle of the second side 228 have a concave curvature for refracting optical radiation, while the two remaining optical surfaces are configured to reflect optical radiation inside the omnidirectional component 218, 220.
In step 702, optical images formed through the omnidirectional optical components 218, 220 on the detecting components 222 are transformed into electrical signals by the detecting components 222 of the cameras 100, 102.
In step 704, the electrical signals are fed from the detecting components 222 to the image processing unit 106.
In step 706, stereoscopic information on the environment is formed on the basis of the electrical signals from the different detecting components 222 by the image processing unit 106.
In step 708, information is presented on the basis of the stereoscopic information to a user by the user interface 108.
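Taken together, steps 700 to 708 amount to a capture-match-present loop; the sketch below is ours, with placeholder objects standing in for the cameras, the image processing unit 106 and the user interface 108:

```python
def run_pipeline(camera_pair, image_processing_unit, user_interface):
    """Illustrative main loop for the method of Figure 7; every object here is a
    placeholder, since the patent defines the method only at the level of steps."""
    while True:
        image_a, image_b = camera_pair.capture()                     # steps 700-704
        stereo_info = image_processing_unit.match(image_a, image_b)  # step 706
        user_interface.present(stereo_info)                          # step 708
```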
The image processing unit 106 and the controller 602 may comprise a processor and memory. The memory may include volatile and/or non-volatile memory and typically stores content, data, or the like. For example, the memory may store computer program code such as software applications or operating systems, information, data, content, or the like for the processor to perform steps associated with operation of the apparatus in accordance with embodiments. The memory may be, for example, random access memory (RAM), a hard drive, or other fixed data memory or storage device. Further, the memory, or part of it, may be removable memory detachably connected to the apparatus.
The techniques described herein may be implemented by various means. The software codes may be stored in any suitable processor/computer-readable data storage medium(s) or memory unit(s) or article(s) of manufacture and executed by one or more processors/computers. The data storage medium or the memory unit may be implemented within the processor/computer or external to the processor/computer, in which case it can be communicatively coupled to the processor/computer via various means as is known in the art.
The presented solution may be used as a part of a surveillance system inside or outside.
The presented solution may be (a part of) a system helping the driver of a vehicle, such as a car, to notice obstacles in the road, find a way through obstacles, determine where the road or a drivable way is, detect a lane in the absence of lane markings, and so on. The optical system may alarm the user about a danger or problems in the environment.
The optical system may be placed or moved in a pipe or a duct for inspecting the pipe or the duct.
It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the examples described above but may vary within the scope of the claims.

Claims

1. An apparatus comprising: a first camera and a second camera integrated together in a common case, an image processing unit and a user interface;
each camera comprising a detecting component and an omnidirectional optical component for directing optical radiation coming from environment of the omnidirectional optical component towards the detecting component;
each omnidirectional optical component having a shape of a disc with a first side, a second side and a circumferential side between the first side and the second side being integrated together such that the first sides, the second sides or the first side and the second side of the omnidirectional optical components face each other;
each omnidirectional optical component having four optical surfaces of which a receiving surface in the circumference side and an outputting surface in the middle of the second side having a concave curvature for refracting optical radiation, and two remaining optical surfaces being configured to reflect optical radiation inside the omnidirectional component;
detecting components of the cameras being configured to transform optical images formed through the omnidirectional optical components on the detecting components into electrical signals and feed the electrical signals to the image processing unit;
the image processing unit being configured to form stereoscopic information on the environment on the basis of the electrical signals from differ- ent detecting components; and
the user interface being configured to present information on the basis of the stereoscopic information.
2. The apparatus of claim 1, wherein a first reflecting surface of the optical surfaces on the first side in the middle of the omnidirectional optical component having a convex curvature, and a second reflecting surface of the optical surfaces being placed on the second side around the outputting surface.
3. The apparatus of claim 1, wherein the concave receiving surface and/or the first convex reflecting surface having freeform shapes.
4. The apparatus of claim 1, wherein an optical axis of one omnidirectional optical component being configured to pass through a center of another omnidirectional optical component.
5. The apparatus of claim 1, wherein the cameras are configured to form video image of the environment.
6. The apparatus of claim 1, wherein the second side of one omnidirectional optical component being configured to face the second side of another omnidirectional optical component.
7. A vehicle, the vehicle comprising the apparatus of claim 1, the image processing unit being configured to form on the basis of the stereoscopic information at least one of the following information: a distance between the vehicle and an object in the environment, a speed of the vehicle, a speed of the object, and the user interface being configured to present the information.
8. The vehicle of claim 7, wherein the image processing unit being configured to estimate on the basis of the stereoscopic information a future state of at least one of the following: a distance between the vehicle and an object in the environment, a speed of the vehicle, a speed of the object, and the user interface being configured to present the information.
9. The vehicle of claim 7, wherein the vehicle comprising at least one actuator for controlling the movement of the vehicle, and a controller, the controller being configured to receive the stereoscopic information from the image processing unit and control the at least one actuator on the basis of the stereoscopic information.
10. A method comprising: directing,
- in two cameras each having an omnidirectional optical component of a shape of a disc with a first side, a second side and a circumferential side between the first side and the second side,
- optical radiation from environment through the omnidirectional optical components, the cameras being integrated together such that the first sides, the second sides or the first side and the second side of the omnidirectional optical components face each other,
- towards detecting components,
- by four optical surfaces of which a receiving surface in the circumference side and an outputting surface in the middle of the second side having a concave curvature for refracting optical radiation, and two remaining optical surfaces being configured to reflect optical radiation inside the omnidirectional component;
transforming, by the detecting components of the cameras, optical images formed through the omnidirectional optical components on the detecting components into electrical signals;
feeding the electrical signals from the detecting components to the image processing unit;
forming, by the image processing unit, stereoscopic information on the environment on the basis of the electrical signals from different detecting components; and
presenting, by the user interface, information on the basis of the stereoscopic information.
PCT/FI2011/050812 2010-09-22 2011-09-21 Optical system WO2012038601A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20105977A FI20105977A0 (en) 2010-09-22 2010-09-22 Optical system
FI20105977 2010-09-22

Publications (1)

Publication Number Publication Date
WO2012038601A1 (en) 2012-03-29

Family

ID: 42829707

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2011/050812 WO2012038601A1 (en) 2010-09-22 2011-09-21 Optical system

Country Status (2)

Country Link
FI (1) FI20105977A0 (en)
WO (1) WO2012038601A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2835973A1 (en) 2013-08-06 2015-02-11 Sick Ag 3D camera and method for capturing of three-dimensional image data
US10789730B2 (en) 2016-03-18 2020-09-29 Teknologian Tutkimuskeskus Vtt Oy Method and apparatus for monitoring a position
CN112219086A (en) * 2018-09-18 2021-01-12 株式会社日立制作所 Stereo camera, vehicle-mounted lamp assembly and stereo camera system
EP3819671A1 (en) 2019-11-07 2021-05-12 Sick Ag Optoelectronic sensor and method for detecting objects

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004042428A2 (en) * 2002-11-04 2004-05-21 O.D.F. Optronics Ltd. Omni-directional imaging and illumination assembly
JP2009015253A (en) * 2007-07-09 2009-01-22 Olympus Corp Optical device, optical system with the same and endoscope using the same
CN101414054A (en) * 2008-11-21 2009-04-22 浙江大学 Device and method for implementing stereo imaging by overall view ring belt imaging lens
EP2059058A2 (en) * 2007-11-09 2009-05-13 Honeywell International Inc. Stereo camera having 360 degree field of view
KR20090095761A (en) * 2008-03-06 2009-09-10 엘지전자 주식회사 System and Method for Robot Vision employing Panoramic Stereo Camera
EP2172798A1 (en) * 2007-07-09 2010-04-07 Olympus Corp. Optical element, optical system equipped with same and endoscope using same
US20100110565A1 (en) * 2007-07-09 2010-05-06 Takayoshi Togino Optical element, optical system having the same and endoscope using the same

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004042428A2 (en) * 2002-11-04 2004-05-21 O.D.F. Optronics Ltd. Omni-directional imaging and illumination assembly
JP2009015253A (en) * 2007-07-09 2009-01-22 Olympus Corp Optical device, optical system with the same and endoscope using the same
EP2172798A1 (en) * 2007-07-09 2010-04-07 Olympus Corp. Optical element, optical system equipped with same and endoscope using same
US20100110565A1 (en) * 2007-07-09 2010-05-06 Takayoshi Togino Optical element, optical system having the same and endoscope using the same
EP2059058A2 (en) * 2007-11-09 2009-05-13 Honeywell International Inc. Stereo camera having 360 degree field of view
KR20090095761A (en) * 2008-03-06 2009-09-10 엘지전자 주식회사 System and Method for Robot Vision employing Panoramic Stereo Camera
CN101414054A (en) * 2008-11-21 2009-04-22 浙江大学 Device and method for implementing stereo imaging by overall view ring belt imaging lens

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2835973A1 (en) 2013-08-06 2015-02-11 Sick Ag 3D camera and method for capturing of three-dimensional image data
US10789730B2 (en) 2016-03-18 2020-09-29 Teknologian Tutkimuskeskus Vtt Oy Method and apparatus for monitoring a position
CN112219086A (en) * 2018-09-18 2021-01-12 株式会社日立制作所 Stereo camera, vehicle-mounted lamp assembly and stereo camera system
CN112219086B (en) * 2018-09-18 2022-05-06 株式会社日立制作所 Stereo camera, vehicle-mounted lamp assembly and stereo camera system
EP3819671A1 (en) 2019-11-07 2021-05-12 Sick Ag Optoelectronic sensor and method for detecting objects

Also Published As

Publication number Publication date
FI20105977A0 (en) 2010-09-22

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11826465

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11826465

Country of ref document: EP

Kind code of ref document: A1