WO2012051751A1 - Optical module comprising monochromatic image sensors, system comprising optical module and method of manufacturing optical module - Google Patents
- Publication number
- WO2012051751A1 (PCT/CN2010/077872)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14625—Optical elements or arrangements associated with the device
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/0006—Arrays
- G02B3/0012—Arrays characterised by the manufacturing method
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/0006—Arrays
- G02B3/0037—Arrays characterized by the distribution or form of lenses
- G02B3/0056—Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/41—Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
Definitions
- the present invention relates to optical modules for capturing images, such as used in mobile telephones.
- Optical modules generally comprise at least an image sensor and a lens for forming on the sensor an image of an object (or object image) located in front of the module.
- Optical modules for telephones must generally be capable of being produced in very high volumes, in the order of several hundred thousand per day, at a cost as low as possible.
- a standard structure of a color image sensor comprises a plurality of sensor pixels, each sensitive to a given light wavelength.
- the pixels are arranged in a matrix, for example along a pattern known as the Bayer pattern, as shown in Figure 1, in which one line is composed of alternately blue and green pixels.
- the successive line is composed of red and green pixels, such that a subset of 2x2 pixels is always composed of two green pixels disposed along one diagonal and a red and a blue pixel disposed along the other diagonal, as shown in Figure 1.
- a “red” pixel is for example a pixel made sensitive to the red wavelength by applying on top of it a filter that has a maximum of transparency in the range of wavelengths considered as red, and which cuts off the other wavelengths. Similar considerations apply to a “blue” pixel and a “green” pixel.
- Red, Blue and Green are called primary colors, as their combination in different proportions in theory allows any color perceived by the human eye to be reproduced.
- other sets of primary colors, such as Yellow, Magenta, and Cyan, can also be used.
- the accuracy of the recombination of the colors depends on the accuracy of the filtering of the incident light into the primary colors, and to overcome the limitations coming from inaccuracy of the filtering, image sensors using more than three primary colors have been proposed.
- a lens is provided to form an image on the matrix of pixels.
- the matrix of pixels must be contained within the field of view (FOV) of the lens.
- the pixels of each pixel subset of the Bayer pattern are read to recompose the color sensed by each pixel subset, thus allowing the color image formed on the matrix to be recomposed.
- the FOV is given by the relation: FOV = 2.arctan(D/(2.F)), where D is the diagonal of the pixel matrix and F the focal distance of the lens.
- at a constant FOV, the focal distance F is proportional to D. It follows that the more pixels a sensor matrix comprises, the larger the diagonal D of the matrix is, and the larger the focal distance F must be for a module comprising the sensor.
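The scaling above can be illustrated numerically; the following is an editorial sketch (all numeric values are illustrative, not from the patent) of how the required focal distance grows with the sensor diagonal at constant FOV:

```python
import math

def fov_degrees(d_diag_mm, f_mm):
    """Full field of view, in degrees, for sensor diagonal D and focal distance F."""
    return math.degrees(2.0 * math.atan(d_diag_mm / (2.0 * f_mm)))

def focal_for_fov(d_diag_mm, fov_deg):
    """Focal distance needed to keep a given FOV: F is proportional to D."""
    return d_diag_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Halving the sensor diagonal halves the required focal distance at constant FOV.
f_full = focal_for_fov(6.0, 65.0)   # polychromatic sensor of diagonal D = 6 mm
f_half = focal_for_fov(3.0, 65.0)   # monochromatic sub-sensor of diagonal D/2
```

This proportionality is what motivates replacing one large polychromatic sensor by a cluster of smaller monochromatic sub-sensors.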
- US patent 5,276,538 and US application No. 2006/0066922 disclose arrays of microlenses disposed over the pixels of sensor arrays to increase the amount of light sent to each pixel or light sensing device of the array.
- the microlenses concentrate the light of the portion of the image formed on the microlenses' array above the corresponding pixels to enhance the light sensing efficiency of the pixels.

Summary of the invention
- the present invention relates to an optical module with at least three monochromatic image sensors, each sensitive to a different wavelength, where one lens is coupled to each image sensor and each lens is provided to form an identical image onto the image sensor to which it is coupled.
- An embodiment of the invention provides for an optical module comprising: at least three monochromatic image sensors sensitive each to a different wavelength; and as many lenses as there are image sensors, each lens being coupled to a distinct image sensor; wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
- the image sensors are coplanar, the lenses are coplanar, and the on-axis radii, the thicknesses and the refractive indexes of the lenses are chosen such that the lenses have the same Back Focal Length.
- the lenses are formed on a common transparent plate.
- the lenses and the plate are made of glass.
- the lenses are formed in recesses of the transparent plate.
- each monochromatic image sensor comprises an array of pixels sensitive to a monochromatic light.
- the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and one sensor sensitive to green light.
- the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and two sensors sensitive to green light.
- Another embodiment of the invention provides for a system comprising: an optical module according to any of the preceding embodiments; and means for aligning the images sensed by each of the monochromatic image sensors.
- the means for aligning the images comprise: means for recognizing a predetermined pattern in the images of a predetermined pattern object on each of the monochromatic image sensors; and means for determining on which portion of each monochromatic image sensor the same predetermined pattern is formed.
- the means for aligning the images sensed by each of the monochromatic image sensors comprise means for storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed to correspond to the same predetermined image portion in the other image sensors.
- Another embodiment of the invention provides for a method of manufacturing an optical module comprising: providing at least three distinct monochromatic image sensors sensitive each to a different wavelength; and coupling a distinct lens to each image sensor, wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
- An embodiment of the invention further comprises providing coplanar image sensors and providing lenses formed on a common transparent plate.
- An embodiment of the invention further comprises: forming on each sensor the image of a predetermined pattern object; determining on which portions of each monochromatic image sensor the same predetermined patterns from the predetermined pattern object are formed; and determining and storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed to correspond to the same predetermined image portion in the other image sensors.
- forming on each sensor the image of a predetermined pattern object comprises: using a predetermined pattern object having edges parallel with the edges of the pixels of the sensors; and arranging the pattern object at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
- Figure 1 illustrates a Bayer pattern for an array of sensor pixels.
- Figure 2 is a projection view of a sensor of a module according to an embodiment of the invention.
- Figure 3 is a projection view of a lens array provided for being used with the sensor of figure 2.
- Figure 4 is a projection view showing three elements of a module according to an embodiment of the invention, including the sensor of figure 2 and the array of lenses of figure 3.
- Figure 5A is a projection view of a glass plate suitable for manufacturing the array of lenses of figure 3.
- Figure 5B is a cross section of the glass plate of Figure 5A.
- Figures 6A to 6E illustrate steps of manufacturing an embodiment of an array of lenses as shown on Figure 3 using the glass plate of figures 5A-5B.
- Figures 7A to 7C illustrate steps of manufacturing another embodiment of an array of lenses as shown on Figure 3.
- Figure 8 is a cross-section of one lens of an embodiment of an array of lenses as shown on Figure 3.
- Figure 9 is a cross-section of one lens of another embodiment of an array of lenses as shown on Figure 3.
- the focal distance F of an optical module is proportional to the diagonal D of the sensor of the module.
- the present invention provides, in order to reduce the focal distance F, for replacing a polychromatic sensor having a diagonal D by a cluster of smaller monochromatic sensors, or sub-sensors, having smaller diagonals.
- the present invention provides for replacing a polychromatic sensor having RGB pixels arranged along a Bayer pattern and having a diagonal D by four monochromatic sensors (one with red pixels, one with blue pixels and two with green pixels) each having a diagonal of about D/2, as shown in Figure 2.
- the images formed on the sub-sensors are read and then processed to recompose the image.
- the image can be recomposed either in an intermediate memory, called DVI hereafter, or in real time when the image is displayed on a display device, such as an LCD screen or a CRT tube.
- the present invention provides for superposing exactly, i.e. with an accuracy better than the dimension of an elementary pixel, each of the monochromatic images given by each of the sub-sensors.
- the superposition of the monochromatic images is preferably done by identifying the positions of a number of characteristic features on the image, such as edges, or remarkable pre-determined shapes, and using these positions to superpose the monochromatic images, for example stored in a memory.
- the present invention requires that all the monochromatic images have the same sharpness, or in other words that an edge on one of the monochromatic images is also an edge on the other monochromatic images. This implies not only that the focal lengths of the lenses should be the same, but also that the various aberrations, geometric and chromatic, which can affect the image should be as identical as possible.
- the monochromatic images of remarkable pre-determined shapes in each sub-sensor differ only by their position, so once the positions of the pre-determined shapes in the monochromatic images are identified, translations that allow superposing the monochromatic images can be determined.
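As a minimal sketch of this step (the sub-sensor names and detected coordinates below are illustrative assumptions, not values from the patent), the translations follow directly once the pattern positions are known:

```python
# Detected (row, col) pixel coordinates of the same predetermined pattern on
# each monochromatic sub-sensor -- illustrative values, not from the patent.
detected = {
    "green1": (120, 118),   # chosen as the reference sub-sensor
    "red":    (123, 115),
    "blue":   (118, 121),
    "green2": (121, 119),
}

def translations(positions, reference):
    """Integer-pixel translations that superpose each image onto the reference."""
    ref_r, ref_c = positions[reference]
    return {name: (ref_r - r, ref_c - c)
            for name, (r, c) in positions.items() if name != reference}

offsets = translations(detected, "green1")
# offsets["red"] == (-3, 3): shift the red image by -3 rows and +3 columns.
```

Because the offsets are small integers, only a few bytes per sub-sensor need to be stored.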
- the operation of determining the translations of the monochromatic images can be made only once, for example when testing the module after the sensor and the lenses have been assembled together, and the data defining these translations can be stored into a non-volatile memory (generally called NVM).
- because the translations can be defined in numbers of pixels rather than in absolute values, the data that define the translations do not require a large memory.
- the number of additional rows and columns can be reduced, effectively reducing the usable size of the sub-sensors.
- the number of additional rows and columns can also be increased where the assembly precision of the module introduces larger image offset than plus or minus 10 pixels.
- an embodiment of the invention provides for detecting (with means internal or external to the module) which portions of each sub-sensor receive corresponding portions of the images formed on the sub-sensors.
- the module can then comprise means for storing the information of how an address, corresponding to a predetermined image portion in one of the sub-sensors, must be changed for corresponding to the same predetermined image portion in the other sub-sensors.
- the module can comprise electronic means for automatically correcting a single reading address sent to the module into appropriate reading addresses in each sub-sensor. This allows sending a single reading address to the module.
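A minimal sketch of such address correction, assuming the per-sub-sensor offsets have already been measured and stored (all names and values here are illustrative, not from the patent):

```python
# Per-module offsets measured at test time and stored in NVM; the reference
# sub-sensor needs no entry. Names and values are illustrative.
NVM_OFFSETS = {"red": (-3, 3), "blue": (2, -3), "green2": (-1, -1)}

def remap(address, sensor):
    """Correct a (row, col) reading address, given in reference sub-sensor
    coordinates, into the actual address inside `sensor`."""
    if sensor == "green1":              # reference sub-sensor: unchanged
        return address
    dr, dc = NVM_OFFSETS[sensor]
    return (address[0] - dr, address[1] - dc)

# The same logical address then reads the matching pixel from every sub-sensor:
# remap((120, 118), "red") -> (123, 115)
```

The controller can thus accept one logical address and fan it out to the four sub-sensor read circuits.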
- An example of the remarkable predetermined pattern object or shapes can comprise four black L-shaped objects located close to each of the four corners of a rectangle, the monochromatic image of which is formed on each of the sub-sensors.
- the predetermined pattern object has edges parallel with the edges of the pixels of the sensors; and the pattern object is arranged at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
- each lens is calculated to correctly focus a primary color on an appropriate corresponding sub-sensor.
- the distance of each of the lenses to the surface of the sensors is identical.
- a low cost module assembly is achieved by providing a simple assembly scheme involving a single adjustment of the position of an array of lenses with respect to an array of sub-sensors.
- An imaging device intended for high-volume manufacturing is usually "reflowable", which means that it can withstand the temperature necessary to solder in one single operation all the components on the main Printed Circuit Board (PCB).
- the lenses able to withstand such temperatures are made either of glass or of some optical grade of thermoformed resin such as Epoxy.
- using Epoxy to make a lens provides a quite inexpensive way of producing large volumes of lenses able to withstand the reflow temperature.
- however, the optical characteristics of such resins have not, so far, allowed attaining the optical performance that can be obtained with glass, which is the preferred material for making lenses of superior quality.
- the present invention provides for positioning an array of N glass lenses (with N at least equal to three) over a sensor composed of N sub-sensors, each of these sub-sensors being characteristic of a primary color, and each corresponding lens having a focal length determined to provide a precise focus of the light color component onto the corresponding sub-sensor.
- an additional array of lenses can be stacked above a first array of lenses.
- the first array of lenses can be composed of plano-convex lenses, the convex surfaces facing the object side.
- the second array of lenses can be composed of plano-concave lenses, the concave surfaces facing the image side. If necessary, more arrays can be added; however, for mobile phone applications, two layers of arrays are generally enough.
- other schemes can be used according to embodiments of the present invention, for example with the three primary colors Magenta, Yellow and Cyan, or with more than three primary colors.
- Figure 4 shows a module according to a preferred embodiment of the invention, using one red sub-sensor, one blue sub-sensor and two green sub-sensors, as illustrated in figure 2; and an array of lenses as illustrated in Figure 3.
- a lens holder 40 having four cavities 42 is arranged on a sensor 44 divided into four sub-sensors 46.
- the axis of each of the four cavities 42 is centered on the center of each sub-sensor 46.
- An array of lenses 48 comprising four lenses 50 is arranged above the lens holder 40 so that the optical axis of each of the four lenses 50 goes through the center of each corresponding sub-sensor 46.
- Figure 3 illustrates an array of four lenses according to an embodiment of the present invention.
- the lenses each comprise a first half-lens, called N1, N2, N3 and N4, or in a generic way Nx.
- Each first half-lens is made of glass with a refractive index noted NxBlue for blue light, NxGreen for green light and NxRed for red light. In the same way, the Abbe numbers are noted Vdx.
- the glass lenses are formed by molding the first half-lenses, in glasses having low transition temperatures noted Tgx, onto a common plane-parallel glass plate that forms the second half-lenses of the lenses.
- the common glass plate has refractive indexes NcommonRed, NcommonGreen, NcommonBlue, an Abbe number Vdcommon, and a high transition temperature, noted Tgo, preferably higher by around 100 degrees Celsius than the highest of the Tgx.
- the optical axes of the four half-lenses Nx intersect the glass plate in four points, noted Ox.
- the four half-lenses are plano-convex or plano-concave, as the interface with the glass plate is planar.
- the interface may also be given a spherical or aspherical shape. To do so, before molding the half-lenses on the plate, optical surfaces centered on Ox are formed by molding the glass plate, as shown in Figure 5A.
- Figure 5A shows a glass plate 52 having four recesses 54 for providing non-planar interfaces.
- Figure 5B shows a cross section of plate 52 across the optical axes 56 of two recesses 54.
- the optical axes 56 of the recesses 54 in the glass plate 52 are aligned with the optical axes of the half-lenses, typically within 2 microns.
- Figures 6A to 6E illustrate a sequence of operations for manufacturing an array of lenses as shown in Figure 3.
- Figure 6A shows a cross-section of the glass plate 52 placed into a tool comprising a mold plate 60 above the top surface of plate 52, the mold plate having cylindrical holes 62 axially aligned with the locations where the lenses are to be formed. Mold cores 64, having bottom surfaces shaped as the desired half-lens interfaces, are introduced into the cylindrical holes 62.
- Figure 6B shows the cores 64 being pressed onto the glass plate 52 while the temperature of the plate is maintained above Tgo, thus forming recesses 66 axially aligned with the cylindrical holes 62.
- the plate 52 is thereafter cooled down.
- Tgo can be in the range of 650 to 720 degrees Celsius.
- the molding temperature can be in the range of 710-780 degrees Celsius.
- the molding temperature can be reached in approximately 30-40 seconds; the glass is then cooled down after pressing of the cores.
- the cooling time is of the same order of magnitude as the heating time, but in order to suppress the stress induced in the glass by fast cooling, a subsequent annealing can be performed, consisting of heating the glass again to just below Tgo and then cooling it slowly, at around 100 degrees Celsius per hour.
- this annealing is done at the end of the process, which means after the operation described hereafter in relation with Figure 6E, so as not to immobilize the mold.
- Figure 6C shows the tool 60 where the cores 64 have been removed and where glass balls 68 of different, appropriate glasses, have been placed in the recesses 66.
- alternatively, glass gobs (not shown) can be dropped in these locations. If this process option is chosen, which improves the alignment of the optical axes, the temperature at which the glass gobs are fluid must be lower than the Tg of the glass plate. This is made possible by a Tgo for the glass plate above 650 degrees Celsius.
- the Tgx of the glass balls 68 or glass gobs must be lower than the Tgo of the glass plate.
- Figure 6D shows mold cores 70 having bottom surfaces shaped as the desired top surfaces of the top half-lenses, introduced in the cylindrical holes 62 of mold plate 60 and thus axially aligned with cylindrical holes 62.
- the temperature of the ensemble is raised above the highest Tgx, but below Tgo, and the cores 70 are pressed onto the glass balls 68 or glass gobs.
- the pressure on the cores 70 gives the top half-lenses the desired shape; then the temperature of the ensemble is decreased in a process similar to that described above.
- Figure 6E shows a cross-section of the finished array of lenses, where the glass balls 68 of appropriate glass materials have been shaped into the first half-lenses of each lens, and where the portions of the glass plate 52 below the first half-lenses form the second half-lenses of each lens.
- the lens array is then preferably annealed, as described above. It is of course possible to use a process such as described above to make an array of a larger number of lenses, intended either to be subsequently divided into arrays of N lenses, or to be used as is on a corresponding array of sensors composed of sub-sensors. In such an application, an entire wafer of imaging sensors is covered with an array of lenses of the same dimensions, with appropriate mechanical spacers between the wafer and the array of lenses.
- Figures 7A-7C illustrate a process such as illustrated in Figures 6A-6E, but where a flat interface is desired between the first and second half lenses. In such case, the recesses 66 are not formed.
- Figures 7A-7C illustrate, mutatis mutandis, the same features as Figures 6C-6E.
- one lens must focus the light on the red sub-sensor, one lens must focus the light on the blue sub-sensor, and two lenses must focus the light on the green sub-sensors.
- all lenses have almost identical optical performances and characteristics: very close geometric aberrations, very close chromatic aberrations, and identical Back Focal Lengths (BFL).
- Figure 8 illustrates a single lens 80 of a lens array according to an embodiment of the present invention, having a flat interface 82 between its first 84 and second 86 half lenses.
- the first half lens 84 is realized in a material with a refractive index Na.
- the curvature on axis is equal to 1/R.
- the distance from the apex of the curved surface to the interface 82 with the second half lens 86, in the glass plate 52 (made of a material of refractive index Nb), is D.
- the thickness of the glass plate 52 is T.
- the lenses (first and second half lenses) are immersed in a medium of refractive index Nc.
- the Back Focal Length BFL is the distance between the image side of the glass plate and the image focal point of the system.
- the focal length of the lens is given by the formula F = R.Nc/(Na-Nc) [1], where Nc is the refractive index of the surrounding medium.
- two conditions must be respected in an array of lenses:
- a common focal length is chosen for the module using equation [1]; the focal length is chosen in view of the dimensions of the sub-sensors, as well as the desired resolution and Field of View of the lenses.
- This glass must have a high Tg, preferably above 650 degrees Celsius.
- the values of the refractive indexes Nb for each color are fixed, referred here after as NcommonGreen, NcommonRed, NcommonBlue.
- BFL = R.Nc/(Na-Nc) - D.Nc/Na - T.Nc/Nb
- in air (Nc = 1), for the green lens: BFL = Rgreen/(N2Green-1) - D/N2Green - T/NcommonGreen
- N1Red is the refractive index in the red of the glass chosen for the lens corresponding to the red sensor
- N2Green is the refractive index in the green of the glass chosen for the lens corresponding to the green sensor
- N3Blue is the refractive index in the blue of the glass chosen for the lens corresponding to the blue sensor
- T is the thickness of the glass plate onto which the four lenses are molded.
- NcommonRed, NcommonGreen, NcommonBlue are the refractive indexes of the glass chosen for the glass plate respectively in the red, green and blue.
- the remaining variable is the thickness D of the first half-lens.
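A minimal numerical sketch of this step, in air (Nc = 1) and with illustrative refractive indexes rather than the exact catalogue values: given the common focal length F and plate thickness T, the BFL relation is inverted for the thickness D of each first half-lens so that all color channels share the same BFL.

```python
# Illustrative sketch in air (Nc = 1): solve BFL = F - D/Na - T/Nb for D so
# that every color channel shares the same back focal length. The refractive
# indexes below are placeholders, not the exact Schott catalogue values.
F = 2.5    # common focal length of each lens, mm (F = R/(Na - 1))
T = 0.3    # thickness of the common glass plate, mm

channels = {   # color: (Na: half-lens index, Nb: plate index, at that color)
    "red":   (1.52, 1.873),
    "green": (1.53, 1.881),
    "blue":  (1.54, 1.895),
}

target_bfl = 1.2   # mm, chosen for the module (illustrative)

def half_lens_thickness(na, nb, bfl):
    """Invert BFL = F - D/Na - T/Nb for the first half-lens thickness D."""
    return na * (F - T / nb - bfl)

def back_focal_length(na, nb, d):
    return F - d / na - T / nb

D = {c: half_lens_thickness(na, nb, target_bfl) for c, (na, nb) in channels.items()}
# back_focal_length(na, nb, D[c]) now equals target_bfl for every channel.
```

Each channel thus gets its own half-lens thickness D while the image planes of the four lenses stay coplanar.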
- the glass chosen for the glass plate is a glass known under the commercial name Schott N-LASF31A, with a high Tg of 719 degrees Celsius.
- the refractive indexes in the normalized red helium Nr, yellow helium Nd, and blue mercury Ng are:
- Nr = 1.873
- the thickness T of the plate is 0.3 mm
- the Focal length of the lens (including first and second half-lenses) is 2.5 mm
- the thickness D of the first half-lens is around 1 mm
- the glass chosen for the red sub-sensor first half-lens is known under the commercial name Schott N-PK51, with a Tg of 487 degrees Celsius and with a refractive index in the red:
- the glass chosen for the blue sub-sensor first half-lens is the same glass, known under the commercial name Schott N-PK51, as for the red sensor.
- Figure 9 illustrates a single lens 90 of a lens array according to the present invention, having a non-flat interface 92 between its first 94 and second 96 half lenses.
- the compound lens 90 is characterized by the radius of its top optical surface, the radius of the interface between the first and second half lenses, the refractive indexes of the glasses of the first and second half lenses, and the thickness T on axis of the glass plate, as shown in Figure 9.
- the present invention was described in relation with an optical module having four monochromatic image sensors, coupled to four lenses provided each to form an image on a corresponding sensor.
- the person skilled in the art will understand without difficulty that the present invention can comprise any number of monochromatic sensors each coupled to a lens, or even any number of bi-chromatic sensors each coupled to a lens, wherein the lenses would be optimized for the wavelengths sensed by the bi-chromatic sensors.
- the language "identical image" of an object means an image having the same magnification and the same sharpness as the image to which it is identical.
- each monochromatic image sensor preferably comprises the same number of pixels.
- the present invention has been described in relation with monochromatic sensors having the same size, coupled to lenses provided for forming identical images on each sensor.
- an embodiment of the invention also provides for having monochromatic sensors of different size, for example having pixel sizes different by predetermined ratios, coupled to lenses provided for forming on the sensors images of sizes differing by the same ratios.
- the sub-sensors of monochromatic sensors may comprise additional pixels for correcting the position of the images formed on the sub-sensors.
- choosing one of the at least three sub-sensors as a reference sub-sensor means that only the position of the images formed on the other sub-sensors must be adjusted, whereby no additional correction pixel need be provided in the reference sub-sensor.
- the pattern object used for alignment of the images comprises edges parallel with the edges of the pixels of the sensor (i.e. perpendicular edges for pixels having perpendicular edges) and the pattern object is located at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
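Under a thin-lens approximation (an editorial sketch with illustrative values, not taken from the patent), the object distance that makes the pattern image span an integer number of pixels can be computed as:

```python
# Thin-lens sketch: choose the object distance so that the image of the
# pattern object spans exactly n pixels. F, PIXEL and the pattern size are
# illustrative values, not taken from the patent.
F = 2.5e-3        # focal length, m
PIXEL = 1.75e-6   # pixel pitch, m

def object_distance(pattern_size_m, n_pixels):
    """Distance d giving image_size = n_pixels * PIXEL.

    Thin-lens magnification: m = F / (d - F), so
    d = F * (1 + pattern_size / (n_pixels * PIXEL)).
    """
    return F * (1.0 + pattern_size_m / (n_pixels * PIXEL))

d = object_distance(0.10, 200)    # a 10 cm pattern imaged onto 200 pixels
```

Placing the pattern at this distance makes the detected pattern edges fall on pixel boundaries, so the image offsets are whole numbers of pixels.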
- the module can provide means for aligning the images by automatically correcting a single reading address, sent to the module and corresponding to a predetermined image portion, into the actual addresses of the portions of each image sensor that receive the predetermined image portion.
- the module can be provided for determining and storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors (such as the reference sub-sensor), must be changed for corresponding to the same predetermined image portion in the other image sensors/sub-sensors.
Abstract
An optical module comprises: at least three monochromatic image sensors (46), each sensitive to a different wavelength; and as many lenses (30, 50) as image sensors (46), each lens (30, 50) being coupled to a distinct image sensor (46); wherein each lens (30, 50) is provided to form an identical image of an object located in front of the optical module onto the image sensor (46) to which it is coupled. A system comprising the optical module and a method of manufacturing the optical module are also provided.
Description
OPTICAL MODULE COMPRISING MONOCHROMATIC IMAGE SENSORS,
SYSTEM COMPRISING OPTICAL MODULE AND METHOD OF MANUFACTURING OPTICAL MODULE
Domain of the invention
The present invention relates to optical modules for capturing images, such as those used in mobile telephones. An optical module generally comprises at least an image sensor and a lens for forming on the sensor an image of an object (or object image) located in front of the module. Optical modules for telephones must generally be capable of being produced in very high volumes, on the order of several hundred thousand units per day, at a cost as low as possible.
Background of the invention
Because it is technically difficult to produce a light sensor pixel that satisfactorily senses color, a standard structure of a color image sensor comprises a plurality of sensor pixels, each sensitive to a given light wavelength. The pixels are arranged in a matrix, for example along a pattern known as the Bayer pattern, as shown in Figure 1, in which one line is composed of alternating blue and green pixels. The next line is composed of red and green pixels, such that a subset of 2x2 pixels is always composed of two green pixels disposed along one diagonal and a red and a blue pixel disposed along the other diagonal.
A "red" pixel is, for example, a pixel made sensitive to red wavelengths by applying on top of it a filter that has a maximum of transparency in the range of wavelengths considered as red, and which cuts off the other wavelengths. Similar considerations apply to a "blue" pixel and a "green" pixel.
Red, Blue and Green are called primary colors, as their combination in different proportions allows, in theory, any color perceived by the human eye to be reproduced. However, other primary color sets exist, such as Yellow, Magenta and Cyan. The accuracy of the recombination of the colors depends on the accuracy of the filtering of the incident light into the primary colors, and to overcome the limitations stemming from inaccuracy of this filtering, image sensors built with more than three primary colors have been proposed.
The description hereafter relates only to the case of a matrix of pixels arranged along the Bayer pattern with Red, Green and Blue as primary colors; but all reasoning and conclusions can be extended to other choices of primary colors, including the choice of more than three primary colors.
In a standard optical module for forming an image with an image sensor having a matrix of pixels, a lens is provided to form an image on the matrix of pixels. In order to achieve this result, the matrix of pixels must be contained within the field of view (FOV) of the lens. The pixels of each pixel subset of the Bayer pattern are read to recompose the color sensed by each pixel subset, thus allowing the color image formed on the matrix to be recomposed.
For an optical module with a sensor whose diagonal has a dimension D, and with a lens having a focal length F, the FOV is given by the relation:
F = D / [2·tan(FOV/2)]   [1]
So, for a given Field of View FOV, which is around 54 degrees for standard lenses, around 65-90 degrees for wide-angle lenses, and above that for so-called "fisheye" lenses, the focal distance F is proportional to D. It follows that the more pixels a sensor matrix comprises, the larger the diagonal D of the matrix, and the larger the focal distance F must be for a module comprising the sensor.
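As a sketch (Python, with illustrative dimensions only), equation [1] also shows why splitting a sensor into sub-sensors of half the diagonal halves the required focal length:

```python
import math

def focal_length(diagonal, fov_deg):
    """Focal length from equation [1]: F = D / (2 * tan(FOV / 2))."""
    return diagonal / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Illustrative values: a 4 mm diagonal sensor with a 54 degree FOV.
full = focal_length(4.0, 54.0)

# Sub-sensors of half the diagonal need half the focal length, which
# is the module-height reduction exploited by the present invention.
half = focal_length(2.0, 54.0)
```

For the 4 mm example, F comes out at about 3.93 mm, and halving D halves it.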
However, the trend in mobile phones is toward thinner and thinner devices as well as modules having sensors with more and more pixels. There is accordingly a need for an optical module having a short focal distance while having a sensor comprising a large number of pixels.
US patent 5,276,538 and US application No. 2006/0066922 disclose arrays of microlenses disposed over the pixels of sensor arrays to increase the amount of light sent to each pixel or light sensing device of the array. The microlenses concentrate the light of the portion of the image formed on the microlenses' array above the corresponding pixels to enhance the light sensing efficiency of the pixels.
Summary of the invention
The present invention relates to an optical module with at least three monochromatic image sensors, each sensitive to a different wavelength, where one lens is coupled to each image sensor, and each lens is provided to form an identical image onto the image sensor to which it is coupled.
An embodiment of the invention provides for an optical module comprising: at least three monochromatic image sensors, each sensitive to a different wavelength; and as many lenses as there are image sensors, each lens being coupled to a distinct image sensor; wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
According to an embodiment of the invention, the image sensors are coplanar, the lenses are coplanar, and the on-axis radii, the thicknesses and the refractive indices of the lenses are chosen such that the lenses have the same Back Focal Length (BFL).
According to an embodiment of the invention, the lenses are formed on a common transparent plate.
According to an embodiment of the invention, the lenses and the plate are made of glass.
According to an embodiment of the invention, the lenses are formed in recesses of the transparent plate.
According to an embodiment of the invention, each monochromatic image sensor comprises an array of pixels sensitive to a monochromatic light.
According to an embodiment of the invention, the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and one sensor sensitive to green light.
According to an embodiment of the invention, the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and two sensors sensitive to green light.
Another embodiment of the invention provides for a system comprising: an optical module according to any of the preceding embodiments; and means for
aligning the images sensed by each of the monochromatic image sensors.
According to an embodiment of the invention, the means for aligning the images comprise: means for recognizing a predetermined pattern in the images of a predetermined pattern object on each of the monochromatic image sensors; and means for determining on which portions of each monochromatic image sensor are formed the same predetermined pattern.
According to an embodiment of the invention, the means for aligning the images sensed by each of the monochromatic image sensors comprise means for storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed for corresponding to the same predetermined image portion in the other image sensors.
Another embodiment of the invention provides for a method of manufacturing an optical module comprising: providing at least three distinct monochromatic image sensors sensitive each to a different wavelength; and coupling a distinct lens to each image sensor, wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
An embodiment of the invention further comprises providing coplanar image sensors and providing lenses formed on a common transparent plate.
An embodiment of the invention further comprises: forming on each sensor the image of a predetermined pattern object; determining on which portions of each monochromatic image sensor are formed same predetermined patterns from the predetermined pattern object; and determining and storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed for corresponding to the same predetermined image portion in the other image sensors.
According to an embodiment of the invention, forming on each sensor the image of a predetermined pattern object comprises: using a predetermined pattern object having edges parallel with the edges of the pixels of the sensors; and arranging the pattern object at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer
numbers of pixels of the sensors.
Brief description of the drawings
Figure 1 illustrates a Bayer pattern for an array of sensor pixels.
Figure 2 is a projection view of a sensor of a module according to an embodiment of the invention.
Figure 3 is a projection view of a lens array provided for being used with the sensor of figure 2.
Figure 4 is a projection view showing three elements of a module according to an embodiment of the invention, including the sensor of figure 2 and the array of lenses of figure 3.
Figure 5A is a projection view of a glass plate suitable for manufacturing the array of lenses of figure 3.
Figure 5B is a cross section of the glass plate of Figure 5A.
Figures 6A to 6E illustrate steps of manufacturing an embodiment of an array of lenses as shown on Figure 3 using the glass plate of figures 5A-B.
Figures 7A to 7C illustrate steps of manufacturing another embodiment of an array of lenses as shown on Figure 3.
Figure 8 is a cross-section of one lens of an embodiment of an array of lenses as shown on Figure 3.
Figure 9 is a cross-section of one lens of another embodiment of an array of lenses as shown on Figure 3.
Detailed description of the invention
As outlined above, the focal distance F of an optical module is proportional to the diagonal D of the sensor of the module. The present invention provides, in order to reduce the focal distance F, for replacing a polychromatic sensor having a diagonal D by a cluster of smaller monochromatic sensors, or sub-sensors, having smaller diagonals. For example, the present invention provides for replacing a polychromatic sensor having RGB pixels arranged along a Bayer pattern and having a diagonal D by four monochromatic sensors (one with red pixels, one with blue pixels and two with green pixels) each having a diagonal of about D/2, as shown in Figure 2. According to an embodiment of the invention, the images formed on the sub-sensors are read and then processed to recompose the image. The image can be recomposed either in an intermediate memory, called DVI hereafter, or in real time when the image is displayed onto a display device, such as an LCD screen or a CRT tube.
The present invention provides for superposing exactly, i.e. with an accuracy better than the dimension of an elementary pixel, each of the monochromatic images given by each of the sub-sensors. The superposition of the monochromatic images is preferably done by identifying the positions of a number of characteristic features on the image, such as edges, or remarkable pre-determined shapes, and using these positions to superpose the monochromatic images, for example stored in a memory.
The present invention requires that all the monochromatic images have the same sharpness, or in other terms that an edge on one of the monochromatic images is also an edge on the other monochromatic images. This implies that not only the focal length of the lenses should be the same, but also that the various aberrations, geometric and chromatic, which can affect the image should be as identical as possible.
When the conditions on the aberrations mentioned hereabove are met, the monochromatic images of remarkable pre-determined shapes in each sub-sensor differ only by their position, so once the positions of the pre-determined shapes in the monochromatic images are identified, the translations that allow superposing the monochromatic images can be determined. The operation of determining the translations of the monochromatic images can be performed only once, for example when testing the module after the sensor and the lenses have been assembled together, and the data defining these translations can be stored in a non-volatile memory (generally called NVM). As the translations can be defined in numbers of pixels rather than in absolute values, the data that define the translations do not require a large memory.
To give an order of magnitude, assume that each sub-sensor is composed of N lines of P pixels (for a VGA sub-sensor, N=640 and P=480). If the pixels in each sub-sensor are identified by their position along a line and by the position of the line, then a translation is defined by two adjustment numbers, one smaller than N and the other smaller than P. The inventors have noted that adjustment numbers of about plus or minus 10 pixels for N and P allow aligning the monochromatic images obtained with VGA sub-sensors having N=640 and P=480. Accordingly, an embodiment of the invention provides for adding 20 rows and 20 columns of pixels to monochromatic sub-sensors having N=640 and P=480 used in a module according to the present invention. Alternatively, the number of additional rows and columns can be reduced, effectively reducing the usable size of the sub-sensors. The number of additional rows and columns can also be increased where the assembly precision of the module introduces larger image offsets than plus or minus 10 pixels.
Assuming for example that each translation is a positive or negative value limited to 10 pixels, which is a proportion of the total number of lines or rows of the sensor large enough to cover a translation stemming from manufacturing tolerances, each translation can be coded on 4+1 = 5 bits.
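The 5-bit coding of a translation described above can be sketched as follows (a hypothetical packing, one sign bit plus a four-bit magnitude; the text does not specify the actual NVM layout):

```python
def encode_translation(t):
    """Pack a signed translation (|t| <= 15, covering the plus or
    minus 10 pixel range noted above) into 5 bits: 1 sign bit
    followed by a 4-bit magnitude."""
    assert -15 <= t <= 15
    return ((1 if t < 0 else 0) << 4) | abs(t)

def decode_translation(b):
    """Recover the signed translation from its 5-bit code."""
    magnitude = b & 0x0F
    return -magnitude if (b >> 4) else magnitude

# Hypothetical stored offsets for three non-reference sub-sensors:
# 2 translations x 5 bits each = 10 bits per sub-sensor.
codes = [encode_translation(t) for t in (-10, 0, 7)]
```

Each code fits in a single 5-bit NVM field, so a full alignment record for three non-reference sub-sensors occupies only 30 bits.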
Present semiconductor manufacturing techniques allow embedding in a standard sensor chip enough NVM cells to contain these numbers without impacting the cost of making the chip. This is an important consideration to have in mind, as cost is of primary importance, in particular in the mobile phone industry, which is an important application of the optical modules described in this invention.
Alternatively, an embodiment of the invention provides for detecting (with means internal or external to the module) which portions of each sub-sensor receive corresponding portions of the images formed on the sub-sensors. The module can then comprise means for storing the information of how an address, corresponding to a predetermined image portion in one of the sub-sensors, must be changed for corresponding to the same predetermined image portion in the other sub-sensors. According to an embodiment of the invention, the module can comprise electronic means for automatically correcting a single reading address
sent to the module into appropriate reading addresses in each sub-sensor. This allows sending a single reading address to the module.
An example of the remarkable predetermined pattern object or shapes can comprise four black L-shaped objects located close to each of the four corners of a rectangle, the monochromatic image of which is formed onto each of the sub-sensors. Preferably, the predetermined pattern object has edges parallel with the edges of the pixels of the sensors, and the pattern object is arranged at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
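As an illustration of how such a pattern allows deriving the per-sub-sensor translations, the sketch below (plain Python, with a hypothetical miniature L-shaped pattern and tiny sub-sensor images) locates the pattern on a reference sub-sensor and on another sub-sensor and returns the translation between them; a production module would use a faster, possibly hardware-based, matcher:

```python
def locate(image, pattern):
    """Return the (row, col) where `pattern` best matches `image`
    (brute-force correlation over binary lists of lists)."""
    ph, pw = len(pattern), len(pattern[0])
    best_score, best_pos = -1, (0, 0)
    for y in range(len(image) - ph + 1):
        for x in range(len(image[0]) - pw + 1):
            score = sum(image[y + j][x + i] * pattern[j][i]
                        for j in range(ph) for i in range(pw))
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

def translation(reference, other, pattern):
    """Translation (dy, dx) that superposes `other` onto `reference`."""
    ry, rx = locate(reference, pattern)
    oy, ox = locate(other, pattern)
    return (ry - oy, rx - ox)

# Hypothetical 2x2 "L" corner mark, imaged on two 12x12 sub-sensors
# that are offset by (2, 1) pixels due to assembly tolerances.
L_SHAPE = [[1, 0],
           [1, 1]]

def blank(h, w):
    return [[0] * w for _ in range(h)]

def stamp(img, y, x):
    for j in range(2):
        for i in range(2):
            img[y + j][x + i] |= L_SHAPE[j][i]
    return img

ref = stamp(blank(12, 12), 3, 4)     # reference sub-sensor image
green = stamp(blank(12, 12), 5, 5)   # offset sub-sensor image
dy, dx = translation(ref, green, L_SHAPE)
```

Here `dy, dx` comes out as (-2, -1): shifting the second image by that amount superposes its pattern on the reference.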
As shown in Figure 3, the formation of the monochromatic images on four sub-sensors requires four lenses 30. According to the invention, each lens is calculated to correctly focus a primary color on the appropriate corresponding sub-sensor. Preferably, the distance of each of the lenses to the surface of the sensors is identical.
According to the present invention, replacing a single large multi-chromatic sensor by multiple mono-chromatic sub-sensors allows reducing the height of the camera module without substantially increasing the overall cost of the optical module. According to an embodiment of the invention, a low cost module assembly is achieved by providing a simple assembly scheme involving a single adjustment of the position of an array of lenses with respect to an array of sub-sensors.
An imaging device intended for high-volume manufacturing is usually "reflowable", which means that it can withstand the temperature necessary to solder, in one single operation, all the components on the main Printed Circuit Board (PCB). In the present state of the art, the lenses able to withstand such temperatures are made either of glass or of some optical grade of thermoformed resin such as epoxy. Using epoxy to make a lens provides a quite inexpensive way of producing large volumes of lenses able to withstand the reflow temperature. However, the optical characteristics of such resins have not so far allowed attaining the optical performance that can be obtained when using glass, which is the preferred material for making lenses of superior quality.
Preferably, the present invention provides for positioning an array of N glass lenses (with N at least equal to three) over a sensor composed of N sub-sensors, each of these sub-sensors being characteristic of a primary color, and each corresponding lens having a focal length determined to provide a precise focus of the light color component onto the corresponding sub-sensor.
According to an embodiment of the invention, if the desired optical qualities of the lenses require more than two optical surfaces, then an additional array of lenses can be stacked above a first array of lenses. For example, the first array of lenses can be composed of plano-convex lenses, the convex surface facing the object side, and the second array of lenses can be composed of plano-concave lenses, the concave surface facing the image side. If necessary, more arrays can be added. However, for mobile phone applications, two stacked arrays are generally enough.
As outlined previously, a commonly used choice of primary colors comprises Red, Green and Blue, with N=4: two Green sub-sensors, one Red and one Blue. However, other schemes can be used according to embodiments of the present invention, for example with three different primary colors (Magenta, Yellow, Cyan), or with more than three primary colors.
Figure 4 shows a module according to a preferred embodiment of the invention, using one red sub-sensor, one blue sub-sensor and two green sub-sensors, as illustrated in figure 2; and an array of lenses as illustrated in Figure 3.
A lens holder 40 having four cavities 42 is arranged on a sensor 44 divided into four sub-sensors 46. The axis of each of the four cavities 42 is centered on the center of each sub-sensor 46.
An array of lenses 48 comprising four lenses 50 is arranged above the lens holder 40 so that the optical axis of each of the four lenses 50 goes through the center of each corresponding sub-sensor 46.
As outlined above, Figure 3 illustrates an array of four lenses according to an embodiment of the present invention. The lenses each comprise a first half-lens, called N1, N2, N3 and N4, or in a generic way Nx. Each first half-lens is made of glass with a refractive index noted NxBlue for blue light, NxGreen for green light and NxRed for red light. In the same way, the Abbe numbers are noted Vdx.
According to an embodiment of the invention, the glass lenses are formed by molding the first half-lenses, in glasses having low transition temperatures noted Tgx, onto a common plane-parallel glass plate that forms the second half-lenses of the lenses. The common glass plate has refractive indices NcommonRed, NcommonGreen, NcommonBlue, an Abbe number Vdcommon, and a high transition temperature, noted Tgo, preferably higher by around 100 degrees Celsius than the highest of the Tgx. The optical axes of the four half-lenses Nx intersect the glass plate at four points, noted Ox.
In an embodiment of the invention, the four half-lenses are plano-convex or plano-concave, as the interface with the glass plate is planar. According to another embodiment of the invention, the interface may also be given a spherical or aspherical shape. To do so, before molding the half-lenses on the plate, optical surfaces centered on Ox are formed by molding the glass plate, as shown in Figure 5A.
Figure 5A shows a glass plate 52 having four recesses 54 for providing non-planar interfaces. Figure 5B shows a cross section of plate 52 across the optical axes 56 of two recesses 54. Preferably, the optical axes 56 of the recesses 54 in the glass plate 52 are aligned with the optical axes of the half-lenses, typically within 2 microns.
Figures 6A to 6E illustrate a sequence of operations for manufacturing an array of lenses as shown in Figure 3.
Figure 6A shows a cross-section of the glass plate 52 placed into a tool comprising a mold plate 60 above the top surface of plate 52, the mold plate having cylindrical holes 62 axially aligned with the locations where the lenses are to be formed. Mold cores 64, having bottom surfaces shaped as the desired half-lens interfaces, are introduced in the cylindrical holes 62.
Figure 6B shows the cores 64 being pressed onto the glass plate 52 while the temperature of the plate is maintained above Tgo, thus forming recesses 66 axially aligned with the cylindrical holes 62. The plate 52 is thereafter cooled down. Typically, Tgo can be in the range of 650 to 720 degrees Celsius, and the molding temperature can be in the range of 710 to 780 degrees Celsius. The molding temperature can be reached in approximately 30-40 seconds, and the plate is cooled down after pressing of the cores. The cooling time is of the same order of magnitude as the heating time, but in order to relieve the stress induced in the glass by fast cooling, a subsequent annealing can be performed, consisting of reheating the glass to below Tgo and cooling it slowly, at around 100 degrees Celsius per hour. Advantageously, this annealing is done at the end of the process, that is, after the operation described hereafter in relation with figure 6E, so as not to immobilize the mold.
Figure 6C shows the tool 60, where the cores 64 have been removed and where glass balls 68 of different, appropriate glasses have been placed in the recesses 66. Alternatively, glass gobs (not shown) can be dropped in these locations. If this process option, which improves the alignment of the optical axes, is chosen, the temperature at which the glass gobs are fluid must be lower than the Tg of the glass plate. This is made possible with a Tgo for the glass plate above 650 degrees Celsius.
In any case, the Tgx of the glass balls 68 or glass gobs must be lower than the Tgo of the glass plate.
Figure 6D shows mold cores 70, having bottom surfaces shaped as the desired top surfaces of the top half-lenses, introduced in the cylindrical holes 62 of mold plate 60 and thus axially aligned with the cylindrical holes 62. The temperature of the ensemble is raised above the highest Tgx, but below Tgo, and the cores 70 are pressed on the glass balls 68 or glass gobs. The pressure on the cores 70 gives the top half-lenses the desired shape; then the temperature of the ensemble is decreased in a process similar to what is described hereabove.
Figure 6E shows a cross-section of the finished array of lenses, where the glass balls 68 of appropriate glass materials have been shaped into the first half-lenses of each lens, and where the portion of the glass-plate 52 below the first half-lenses form the second half-lenses of each lens. The lens array is then preferably annealed, as described above.
It is of course possible to use a process such as described above to make an array of a larger number of lenses, intended either to be subsequently divided into arrays of N lenses, or to be used as is on a corresponding array of sensors composed of sub-sensors. In such an application, an entire wafer of imaging sensors is covered with an array of lenses of the same dimensions, with appropriate mechanical spacers between the wafer and the array of lenses.
Figures 7A-7C illustrate a process such as illustrated in Figures 6A-6E, but where a flat interface is desired between the first and second half lenses. In such case, the recesses 66 are not formed. Figures 7A-7C illustrate, mutatis mutandis, the same features as Figures 6C-6E.
In an array of lenses according to an embodiment of the present invention, such as illustrated in Figure 3, one lens must focus the light on the red sub-sensor, one lens must focus the light on the blue sub-sensor, and two lenses must focus on the green sub-sensors. As outlined previously, it is preferable that all lenses have almost identical optical performance and characteristics: very close geometric aberrations, very close chromatic aberrations, and identical BFLs.
Figure 8 illustrates a single lens 80 of a lens array according to an embodiment of the present invention, having a flat interface 82 between its first 84 and second 86 half lenses.
The first half-lens 84 is made of a material with a refractive index Na.
The on-axis curvature is equal to 1/R. The distance from the apex of the curved surface to the interface 82 with the second half-lens 86, in the glass plate 52 (made of a material of refractive index Nb), is D. The thickness of the glass plate 52 is T.
The lenses (first and second half-lenses) are immersed in a medium of refractive index Nc.
The back focal length of a lens comprising the first and second half-lenses is given by the expression:
BFL = R·Nc/(Na - Nc) - D·Nc/Na - T·Nc/Nb   [2]
The Back Focal Length BFL is the distance between the image side of the glass plate and the image focal point of the system.
Further, the focal length of the lens is given by the formula:
F = R·Nc/(Na - Nc)   [3]
Usually, the surrounding medium is air, so that Nc = 1.
(One can refer to the book "Modern Optical Engineering" by Warren J. Smith, McGraw-Hill, Chapter 2, for the definition of the Back Focal Length and the way to compute it.)
According to an embodiment of the invention, two conditions must be respected in an array of lenses:
a) The BFL must be the same for all lenses
b) A common focal length is chosen for the module; using equation [1], the focal length is chosen in view of the dimensions of the sub-sensors, as well as the desired resolution and Field of View of the lenses.
As the focal length depends only on R and on the refractive indexes Na and Nc, there is some flexibility to adjust the focal length of each lens, keeping in mind that the value of R must be very close for all lenses in order to keep the geometric aberrations quite similar for all the lenses. According to an embodiment of the invention, it is considered that a variation of R of 5% between the lenses does not lead to significant differences in the geometric aberrations.
Referring to the lens illustrated in Figure 8, the procedure to determine the desired characteristics of the array of lenses will be:
1) Determine the type of glass used for the common plate. This glass must have a high Tg, preferably above 650 deg C.
2) Determine the R for the first half-lens(es) corresponding to one of the primary colors. The choice of the type of glass is done considering the constraints on Tg, as seen previously, and other considerations such as cost.
3) Determine then the radii and refractive indices for the first half-lenses corresponding to the other primary colors, to satisfy equations [2] and [3], taking into account that the radii must be within a range of around 5% (and preferably 1%).
Once the glass for the common plate is chosen, the values of the refractive index Nb for each color are fixed, referred to hereafter as NcommonGreen, NcommonRed, NcommonBlue.
There are then, for each of the 3 lenses, 3 unknowns: R, D and N, or 9 unknowns altogether, with 3 equations of the type:
BFL = R·Nc/(Na - Nc) - D·Nc/Na - T·Nc/Nb
where Nc = 1 (the lenses being immersed in air), and BFL and T are the same for the 3 lenses.
One has then, for the Red, Green and Blue lenses, the following equations:
BFL = Rred/(N1Red - 1) - D/N1Red - T/NcommonRed
BFL = Rgreen/(N2Green - 1) - D/N2Green - T/NcommonGreen
BFL = Rblue/(N3Blue - 1) - D/N3Blue - T/NcommonBlue
Where :
N1Red is the refractive index in the red of the glass chosen for the lens corresponding to the red sensor;
N2Green is the refractive index in the green of the glass chosen for the lens corresponding to the green sensor;
N3Blue is the refractive index in the blue of the glass chosen for the lens corresponding to the blue sensor;
T is the thickness of the glass plate onto which the four lenses are molded; and
NcommonRed, NcommonGreen, NcommonBlue are the refractive indexes of the glass chosen for the glass plate respectively in the red, green and blue.
One can remark that the first term R·Nc/(Na - Nc) of each equation is the focal length of the corresponding lens, which is chosen to be identical for all lenses; this means that once this focal length has been chosen for one of the primary colors, as said above, the value of this term is known.
In order then to adjust the BFL for each lens, the remaining variable is the thickness D of the first half-lens.
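With Nc = 1 and the first term of equation [2] fixed at the common focal length F, the equation rearranges to D = Na·(F - BFL - T/Nb). A minimal sketch of this adjustment step, using the green-channel values from the worked example below:

```python
def half_lens_thickness(focal, bfl, plate_t, n_lens, n_plate):
    """Solve equation [2] (Nc = 1, R/(Na - 1) = F) for the first
    half-lens thickness D that yields the target back focal length."""
    return n_lens * (focal - bfl - plate_t / n_plate)

# Green channel, values from the text: F = 2.5 mm, BFL = 1.68 mm,
# plate thickness T = 0.3 mm, lens glass index 1.497 (NPK52A green),
# plate glass index 1.883 (N-LASF31A green).
d_green = half_lens_thickness(2.5, 1.68, 0.3, 1.497, 1.883)
```

The result is close to 1 mm, consistent with the D values quoted in the worked examples.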
According to a preferred embodiment of the invention, for an array of lenses such as illustrated in Figure 8, the glass chosen for the glass plate is a glass known under the commercial name Schott N-LASF31A, with a high Tg of 719 degree Celsius.
The refractive indices at the normalized red helium line Nr, yellow helium line Nd, and blue mercury line Ng are:
Nd = 1.883
Nr = 1.873
Ng = 1.910
The thickness T of the plate is 0.3 mm.
The focal length of the lens (including first and second half-lenses) is 2.5 mm.
The thickness D of the first half-lens is around 1 mm.
According to a preferred embodiment of the invention, the glass chosen for the red sub-sensor first half-lens is known under the commercial name Schott NPK51, with a Tg of 487 degrees Celsius and with a refractive index in the red:
Nr = 1.525
F = 2.5 mm
Rred = 2.5 x (1.525 - 1) = 1.312 mm
With D = 1 mm for the first half-lens thickness, the BFL is fully determined:
BFL = 2.5 - 1/1.525 - 0.3/1.873 = 1.68 mm
According to a preferred embodiment of the invention, the glass chosen for the green sub-sensor first half-lens is known under the commercial name Schott NPK52A, with a Tg of 467 degrees Celsius and a refractive index in the green Nd = 1.497. With the same focal length of 2.5 mm as above,
Rgreen = 2.5 x (1.497 - 1) = 1.24 mm, which is within the range of +/- 5% compared to the radius of the first half-lens for the red sensor.
With the same BFL of 1.68 mm, the same F of 2.5 mm, and a refractive index in the green of 1.883 for the glass plate, one obtains:
1.68 = 2.5 - D/1.497 - 0.3/1.883
which immediately gives D = 0.989 mm.
According to a preferred embodiment of the invention, the glass chosen for the blue sub-sensor first half-lens is the glass known under the commercial name Schott NPK51, as for the Red sensor.
The refractive index in the blue is Ng = 1.537.
With a similar calculation as above, and keeping F = 2.5 mm, one obtains:
Rblue = 2.5 x (1.537 - 1) = 1.34 mm, well within the desired range,
and D = 1.01 mm.
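The worked values above can be cross-checked numerically; this sketch (values copied from the text) recomputes the three on-axis radii from equation [3] and verifies that they stay within about 5% of their mean, as required to keep the geometric aberrations of the lenses comparable:

```python
F = 2.5  # mm, common focal length chosen for the module

# On-axis radii from F = R / (Na - 1), using the quoted indices:
# NPK51 red 1.525, NPK52A green 1.497, NPK51 blue 1.537.
radii = {
    "red":   F * (1.525 - 1),
    "green": F * (1.497 - 1),
    "blue":  F * (1.537 - 1),
}

mean_r = sum(radii.values()) / len(radii)
# Check that every radius lies within about 5% of the mean radius.
spread_ok = all(abs(r - mean_r) / mean_r < 0.05 for r in radii.values())
```

The red and blue radii reproduce the 1.312 mm and 1.34 mm figures given above, and the 5% spread constraint holds for all three channels.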
According to a preferred embodiment of the invention, one can thus choose lenses that have the same BFL and give a perfect focus on the Red, Green and Blue sub-sensors, by determining for each of them the on-axis radius and the thickness of the half-lenses, while respecting the constraints on radii and focal length.
Figure 9 illustrates a single lens 90 of a lens array according to the present invention, having a non-flat interface 92 between its first 94 and second 96 half-lenses. The compound lens 90 is characterized by the radius of its top optical surface, the radius of the interface between the first and second half-lenses, the refractive indices of the glasses of the first and second half-lenses, and the thickness T on axis of the glass plate, as shown in Fig. 9.
The formulas giving the focal length and the BFL are more complex, but the principle of the calculation is the same as in the previous case.
One can choose to simplify the problem by taking the same radius for the interface of all the lenses; one then has the same number of equations and variables as in the previous case, and can follow the same methodology.
If this simplification is not made, one has three more variables, which gives a larger number of possible solutions. However, as seen above, the use of a plano-convex lens solves the large majority of practical cases while leading to simple calculations.
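The compound-lens formulas are not written out in the source; one standard way to evaluate them is a paraxial ray-transfer (ABCD) computation over the four surfaces of Fig. 9 plus the plate. The sketch below is an illustration under the usual paraxial assumptions (function names are ours), validated against the plano-convex case already computed above:

```python
import math

# Paraxial ABCD sketch for the compound lens of Fig. 9: top surface R1
# (air -> n1), curved interface R2 (n1 -> n2), flat contact with the plate
# (n2 -> n_plate), then the plate.  Ray state is (height y, n * angle).

def refraction(n_in, n_out, radius):
    power = (n_out - n_in) / radius          # surface power; 0 if radius is inf
    return [[1.0, 0.0], [-power, 1.0]]

def translation(thickness, n):
    return [[1.0, thickness / n], [0.0, 1.0]]

def matmul(m2, m1):
    return [[m2[0][0]*m1[0][0] + m2[0][1]*m1[1][0],
             m2[0][0]*m1[0][1] + m2[0][1]*m1[1][1]],
            [m2[1][0]*m1[0][0] + m2[1][1]*m1[1][0],
             m2[1][0]*m1[0][1] + m2[1][1]*m1[1][1]]]

def efl_and_bfl(R1, R2, D1, D2, T, n1, n2, n_plate):
    """Effective and back focal lengths of half-lens pair + plate, exit in air."""
    m = refraction(1.0, n1, R1)
    for step in (translation(D1, n1),
                 refraction(n1, n2, R2),
                 translation(D2, n2),
                 refraction(n2, n_plate, math.inf),   # flat contact with plate
                 translation(T, n_plate),
                 refraction(n_plate, 1.0, math.inf)): # flat exit face
        m = matmul(step, m)
    A, C = m[0][0], m[1][0]
    return -1.0 / C, -A / C   # EFL, BFL (axis crossing of a parallel ray)

# Sanity check: with a flat interface (R2 = inf) and n1 == n2, this must
# reduce to the plano-convex case above: F = R1/(n-1) = 2.5 mm, BFL ~ 1.68 mm.
efl, bfl = efl_and_bfl(1.3125, math.inf, 0.5, 0.5, 0.3, 1.525, 1.525, 1.883)
print(efl, bfl)
```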
The present invention was described in relation with an optical module having four monochromatic image sensors, coupled to four lenses each provided to form an image on a corresponding sensor. However, the person skilled in the art will understand without difficulty that the present invention can comprise any number of monochromatic sensors each coupled to a lens, or even any number of bi-chromatic sensors each coupled to a lens, wherein the lenses would be optimized for the wavelengths sensed by the bi-chromatic sensors.
In the present description, the language "identical image" of an object means an image having the same magnification and the same sharpness as the image to which it is said to be identical.
According to an embodiment of the present invention, each monochromatic image sensor preferably comprises the same number of pixels.
The present invention has been described in relation with monochromatic sensors having the same size, coupled to lenses provided for forming identical images on each sensor. However, an embodiment of the invention also provides for having monochromatic sensors of different size, for example having pixel sizes different by predetermined ratios, coupled to lenses provided for forming on the sensors images of sizes differing by the same ratios.
Also, as outlined above the sub-sensors of monochromatic sensors may comprise additional pixels for correcting the position of the images formed on the sub-sensors. According to an embodiment, choosing one of the at least three sub-sensors as a reference sub-sensor means that only the position of the images formed on the other sub-sensors must be adjusted, whereby no additional correction pixel need be provided in the reference sub-sensor.
Preferably, the pattern object used for alignment of the images comprises edges parallel with the edges of the pixels of the sensors (i.e. perpendicular edges for pixels having perpendicular edges), and the pattern object is located at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
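Under a simple thin-lens assumption (not detailed in the source), the object distance achieving an integer-pixel image size can be estimated from the required magnification; all names and example values below are illustrative:

```python
# Thin-lens sketch (illustrative, not from the source): place the pattern
# object so that its image spans an exact integer number of pixels.
# Required magnification m = image_size / object_size; from the thin-lens
# equation 1/o + 1/i = 1/f with i = m*o, the object distance is
# o = f * (1 + 1/m).

def object_distance(f, object_size, n_pixels, pixel_pitch):
    image_size = n_pixels * pixel_pitch   # target spans an integer pixel count
    m = image_size / object_size          # required magnification
    return f * (1.0 + 1.0 / m)

# Example: f = 2.5 mm lens, a 10 mm pattern edge imaged onto exactly
# 100 pixels of 2 um pitch (0.002 mm): image is 0.2 mm, m = 0.02,
# so o = 2.5 * 51 = 127.5 mm.
print(object_distance(2.5, 10.0, 100, 0.002))
```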
According to an embodiment of the present invention, once the information of which portions of each monochromatic image sensor receive identical images from the lenses has been determined, for example by determining on which portions of each monochromatic image sensor are formed same predetermined patterns from a predetermined pattern object, the module can provide means for
aligning the images by automatically correcting a single reading address, sent to the module and corresponding to a predetermined image portion, into the actual addresses of the portions of each image sensor that receive the predetermined image portion.
Alternatively, the module can be provided for determining and storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors (such as the reference sub-sensor), must be changed for corresponding to the same predetermined image portion in the other image sensors/sub-sensors.
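The stored address correction described above can be pictured as a per-sub-sensor offset table (a hypothetical sketch; the names and offset values are invented for illustration):

```python
# Sketch of the address-correction idea: offsets relative to the reference
# sub-sensor, measured once with the pattern object, turn one logical read
# address into the physical pixel address on each monochromatic sub-sensor.

# Calibrated (row, column) offsets in pixels, relative to the reference.
OFFSETS = {
    "red":   (0, 0),    # reference sub-sensor: no correction needed
    "green": (1, -2),   # example measured image translations
    "blue":  (-1, 3),
}

def physical_address(sensor, row, col):
    """Map a logical (row, col) address to the sensor's own pixel address."""
    dr, dc = OFFSETS[sensor]
    return row + dr, col + dc

# Reading the same logical pixel (10, 10) from each sub-sensor:
for s in ("red", "green", "blue"):
    print(s, physical_address(s, 10, 10))
```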
Claims
1. An optical module comprising:
at least three monochromatic image sensors sensitive each to a different wavelength; and
as many lenses as there are image sensors, each lens being coupled to a distinct image sensor;
wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
2. The optical module of claim 1, wherein the image sensors are coplanar, the lenses are coplanar, and wherein the radiuses on axis, the thickness and the refractive indexes of the lenses are chosen such that the lenses have a same Back Focal Length.
3. The optical module of claim 2, wherein the lenses are formed on a common transparent plate.
4. The optical module of claim 3, wherein the lenses and the plate are made of glass.
5. The optical module of claim 3 or 4, wherein the lenses are formed in recesses of the transparent plate.
6. The optical module of any of the preceding claims, wherein each monochromatic image sensor comprises an array of pixels sensitive to a monochromatic light.
7. The optical module of any of the preceding claims, wherein the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and one sensor sensitive to green light.
8. The optical module of any of claims 1-6, wherein the at least three image sensors comprise one sensor sensitive to red light, one sensor sensitive to blue light, and two sensors sensitive to green light.
9. A system comprising:
an optical module according to any of the preceding claims; and
means for aligning the images sensed by each of the monochromatic image sensors.
10. The system of claim 9, wherein the means for aligning the images comprise:
means for recognizing a predetermined pattern in the images of a predetermined pattern object on each of the monochromatic image sensors; and means for determining on which portions of each monochromatic image sensor are formed the same predetermined pattern.
11. The system of claim 9 or 10, wherein the means for aligning the images sensed by each of the monochromatic image sensors comprise means for storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed for corresponding to the same predetermined image portion in the other image sensors.
12. A method of manufacturing an optical module comprising:
providing at least three distinct monochromatic image sensors sensitive each to a different wavelength; and
coupling a distinct lens to each image sensor, wherein each lens is provided to form an identical image of an object located in front of the module onto the image sensor to which it is coupled.
13. The method of claim 12, comprising providing coplanar image sensors and providing lenses formed on a common transparent plate.
14. The method of any of claims 12-13, further comprising:
forming on each sensor the image of a predetermined pattern object;
determining on which portions of each monochromatic image sensor are formed same predetermined patterns from the predetermined pattern object; and determining and storing the information of how an address, corresponding to a predetermined image portion in one of the image sensors, must be changed for corresponding to the same predetermined image portion in the other image sensors.
15. The method of claim 14, wherein forming on each sensor the image of a predetermined pattern object comprises:
using a predetermined pattern object having edges parallel with the edges of the pixels of the sensors; and
arranging the pattern object at a distance from the optical module such that the dimensions of the image of the pattern object on the sensors are equal to integer numbers of pixels of the sensors.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2010/077872 WO2012051751A1 (en) | 2010-10-19 | 2010-10-19 | Optical module comprising monochromatic image sensors, system comprising optical module and method of manufacturing optical module |
TW100120529A TW201218778A (en) | 2010-10-19 | 2011-06-13 | Optical module comprising monochromatic image sensors |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012051751A1 true WO2012051751A1 (en) | 2012-04-26 |
Family
ID=45974618
Country Status (2)
Country | Link |
---|---|
TW (1) | TW201218778A (en) |
WO (1) | WO2012051751A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4060739A4 (en) * | 2019-11-13 | 2023-02-08 | Sony Semiconductor Solutions Corporation | Imaging device and electronic instrument |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI486567B (en) * | 2012-05-21 | 2015-06-01 | Himax Tech Ltd | Testing method of optical sensing module group |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1544990A (en) * | 2003-11-12 | 2004-11-10 | 浙江大学 | Imaging method as Dichronic imaging by four lens |
CN101213830A (en) * | 2004-12-15 | 2008-07-02 | 美光科技公司 | Multi-lens imaging systems and methods |
CN101395926A (en) * | 2006-03-06 | 2009-03-25 | 美光科技公司 | Fused multi-array color image sensor |
CN101437168A (en) * | 2007-11-12 | 2009-05-20 | 索尼株式会社 | Image pickup apparatus |
CN101681916A (en) * | 2007-05-08 | 2010-03-24 | 美光科技公司 | Microlenses formed on array of greater lenses to adjust for shifted photodiode positions within group of pixels |
US20100097491A1 (en) * | 2008-10-21 | 2010-04-22 | Stmicroelectronics S.R.L. | Compound camera sensor and related method of processing digital images |
Also Published As
Publication number | Publication date |
---|---|
TW201218778A (en) | 2012-05-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10858539; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 10858539; Country of ref document: EP; Kind code of ref document: A1 |