WO2022094215A1 - A camera system with offset image sensors - Google Patents

A camera system with offset image sensors

Info

Publication number
WO2022094215A1
Authority
WO
WIPO (PCT)
Prior art keywords
image sensor
image
offset
filter array
sensor
Prior art date
Application number
PCT/US2021/057256
Other languages
French (fr)
Inventor
Michael Dominik Steiner
Efrain O. MORALES CORREA
Original Assignee
Arthrex, Inc.
Priority date
2020-10-30
Filing date
2021-10-29
Publication date
2022-05-05
Application filed by Arthrex, Inc. filed Critical Arthrex, Inc.
Publication of WO2022094215A1 publication Critical patent/WO2022094215A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements

Abstract

A camera system having a first filter array image sensor; and a second filter array image sensor; wherein the first and second filter array image sensors are positioned at an offset to each other.

Description

A CAMERA SYSTEM WITH OFFSET IMAGE SENSORS
BACKGROUND
[0001] The present disclosure relates to devices used in endoscopic surgery and, more particularly, to an improved endoscopic camera system utilizing multiple image sensors.
[0002] There is a need for endoscopic camera systems having higher spatial resolution and perceived pixel resolution.
SUMMARY
[0003] A camera system according to an implementation has a first filter array image sensor; and a second filter array image sensor; wherein the first and second filter array image sensors are positioned at an offset to each other. The second filter array image sensor may be positioned at an offset of at least one-half pixel to the first filter array image sensor. The second filter array image sensor may be positioned at a one pixel diagonal offset to the first filter array image sensor. The second filter array image sensor may also be positioned at a half pixel diagonal offset to the first filter array image sensor. The second filter array image sensor may also be positioned at a one pixel horizontal offset to the first filter array image sensor. The second filter array image sensor may also be positioned at a one pixel vertical offset to the first filter array image sensor.
[0004] The system may also have a third filter array image sensor; and wherein each filter array image sensor is positioned at an offset to the other filter array image sensors. The first filter array image sensor may have a horizontal offset relative to the third filter array image sensor and the second filter array image sensor may have a vertical offset to the third filter array image sensor.
[0005] In an implementation, a camera system has a plurality of image sensors containing pixels with a plurality of wavelength response types arranged in a regular pattern wherein at least one wavelength response type is adjacent in at least one direction to a different wavelength response type. At least two image sensors are positioned at an offset to each other. Each offset is of a distance corresponding to at least one half pixel.
[0006] A second image sensor may be positioned at a diagonal offset to a first image sensor. A second image sensor may be positioned at a one pixel diagonal offset to a first image sensor. A second image sensor may be positioned at a half pixel diagonal offset to a first image sensor. A second image sensor may be positioned at a horizontal offset to a first image sensor. A second image sensor may be positioned at a one pixel horizontal offset to a first image sensor. A second image sensor may be positioned at a vertical offset to a first image sensor. A second image sensor may be positioned at a one pixel vertical offset to the first image sensor.
[0007] In an implementation, the system has three image sensors; and wherein each image sensor is positioned at an offset to the other image sensors. A first image sensor may have a horizontal offset relative to a third image sensor and a second image sensor may have a vertical offset to the third image sensor. In an implementation, the system has two image sensors; and the image sensors are Bayer sensors. In an implementation, the system has three image sensors; and the image sensors are Bayer sensors. In an implementation, the system has three image sensors; and two of the image sensors are Bayer sensors and one of the image sensors is a near infra-red sensor.
[0008] According to an implementation, a method for processing images from a camera system with offset image sensors has the steps of: receiving a first image from a first image sensor; receiving a second image from a second image sensor, the second image sensor being offset from the first image sensor; combining the first image and the second image; interpolating missing pixels from surrounding pixels; and displaying the combined image with interpolated pixels. The method may also have the steps of: receiving at least one additional image from an additional sensor, the at least one additional sensor being offset from at least one of the first sensor and the second sensor; and combining the at least one additional image with the first image and the second image. The method may also have the step of averaging pixel values received from multiple sensors.
[0009] These and other features are described below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The features, aspects and advantages of the present invention will become better understood with regard to the following description, appended claims and accompanying figures wherein:
[0011] FIG. 1 is a schematic diagram of a camera system having two image sensors according to an implementation;
[0012] FIG. 2 is a schematic diagram of a camera system having three image sensors according to an implementation;
[0013] FIG. 3 is a schematic diagram of an image sensor pixel array according to an implementation;
[0014] FIG. 4 is a schematic diagram showing an image sensor according to an implementation;
[0015] FIG. 5 is a schematic diagram showing two Bayer image sensors offset with a one pixel diagonal shift according to an implementation;
[0016] FIG. 6 is a schematic diagram showing two Bayer image sensors offset with a half pixel diagonal shift according to an implementation;
[0017] FIG. 7 is a schematic diagram showing two Bayer image sensors offset with a one pixel vertical shift according to an implementation;
[0018] FIG. 8 is a schematic diagram showing three Bayer image sensors offset according to an implementation; and
[0019] FIG. 9 is a flowchart showing a method for processing images from a camera system with offset image sensors according to an implementation.
DETAILED DESCRIPTION
[0020] In the following description of the preferred implementations, reference is made to the accompanying drawings which show by way of illustration specific implementations in which the invention may be practiced. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. It is to be understood that other implementations may be utilized, and structural and functional changes may be made without departing from the scope of this disclosure.
[0021] A camera system according to an implementation of the present invention may utilize a plurality of image sensors. With reference to Fig. 1, a camera system 10 according to an implementation of the present invention may have a first sensor 12 and a second sensor 14. A beam splitter 16 may direct incoming light to the first and second sensors 12, 14. As used herein, the term “beam splitter” includes, but is not limited to, a beam splitter, prism and mirror. One or more lenses 18 may focus incoming light onto the beam splitter 16.
[0022] With reference to Fig. 2, a camera system 20 according to another implementation of the present invention may have three image sensors and two beam splitters. The first beam splitter 16 directs light to the first image sensor 12 and to a second beam splitter 22. The second beam splitter 22 directs light to the second image sensor 14 and to a third image sensor 24. Alternatively, a single beam splitter may be used to split light to the first, second and third image sensors 12, 14 and 24.
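By way of illustration only (this is not part of the disclosure), the arrangement of Figs. 1 and 2 can be described in software as a list of sensors, each carrying its color filter type and its spatial offset in pixel units. The class name, field names and the particular offset values below are assumptions chosen to mirror the example offsets discussed later for Fig. 8:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorConfig:
    """Illustrative description of one image sensor behind the beam splitter(s)."""
    name: str
    cfa: str                        # e.g. "bayer_rggb" or "nir"
    offset_px: Tuple[float, float]  # (horizontal, vertical) shift in pixel units

# Hypothetical three-sensor arrangement as in Fig. 2: the third sensor is the
# reference, the first is shifted horizontally and the second vertically.
three_sensor_system: List[SensorConfig] = [
    SensorConfig("sensor_1", "bayer_rggb", (1.0, 0.0)),
    SensorConfig("sensor_2", "bayer_rggb", (0.0, 1.0)),
    SensorConfig("sensor_3", "bayer_rggb", (0.0, 0.0)),
]
```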
[0023] At least some of the image sensors according to an implementation are filter array image sensors, such as, for example and without limitation, Bayer sensors. As used herein, the term “Bayer sensor” refers to an image sensor that has a color filter array of RGB filters on a square grid of photosensors as described in U.S. Patent No. 3,971,065 to Bayer, issued on July 20, 1976, the entire contents of which are incorporated herein by reference. Although the filter pattern is typically 50% green, 25% red and 25% blue, usable image sensors may have different amounts of each color. The color filters may have many different orientations.
[0024] Additionally, the image sensors may have filters other than RGB. The image sensors may have pixels with different wavelength response types arranged in a regular pattern where at least one wavelength response type is adjacent in at least one direction to a different wavelength response type. This allows for image sensors with, for example and without limitation, a pattern with red, green, blue and near infra-red pixels. Additionally, this allows for image sensors with, for example and without limitation, a pattern with red, two types of green, and blue pixels.
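For reference, a minimal NumPy sketch of the standard RGGB Bayer layout described above (an illustration under common conventions, not code from the disclosure); it tiles a 2 x 2 pattern so that 50% of the sites are green and 25% each are red and blue:

```python
import numpy as np

def bayer_mask(rows: int, cols: int) -> np.ndarray:
    """Return an array of 'R', 'G', 'B' labels tiled in the RGGB Bayer layout."""
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    # Tile generously, then crop to the requested sensor size.
    return np.tile(tile, (rows // 2 + 1, cols // 2 + 1))[:rows, :cols]

# Example: a 4 x 4 sensor patch, as in the schematic of Fig. 3
print(bayer_mask(4, 4))
```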
[0025] Using multiple Bayer sensors to image a given scene offers the ability to achieve higher spatial resolution depending on a physical offset between the sensors. For endoscopic camera applications, a combination of aperture, pixel pitch (size), and a horizontal or vertical sensor spatial offset can allow for better perceived resolution. As illustrated below, an orientation of the spatial offset of the sensors may be varied to obtain different characteristics.
[0026] Fig. 3 shows a schematic diagram of an image sensor pixel array according to an implementation. The sensor array shown has a 4 x 4 array of 16 pixels. However, the schematic diagram is scaled to show half pixels. The areas not considered in making a picture are shown in hashed lines.
[0027] Fig. 4 shows the red and green pixel locations of an image sensor according to an implementation. The locations would be the same for multiple image sensors with no offset. In Fig. 4, the red pixels are labeled with an R and the green pixels are labeled with Gr for those green pixels located on the same rows as red pixels and Gb for those green pixels located on the same rows as blue pixels.
[0028] Fig. 5 shows a system of two Bayer sensors offset by one (1) pixel diagonally according to an implementation. In Fig. 5, the red pixels of the first Bayer sensor are labeled with an R. The red pixels of the second Bayer sensor are labeled with an r. As seen in Fig. 5, despite the offset, the greens of each sensor are co-located. By comparing Fig. 5 against Fig. 4, it can be seen that the red (and blue) sampling improves over the use of a single Bayer sensor. Missing red pixels can be bilinearly interpolated using the north, south, east, west pixels. For example, for location 23: R̂23 = (R13 + r22 + r24 + R33)/4
Missing blue and green pixels can be similarly bilinearly interpolated.
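To illustrate paragraph [0028], the following Python/NumPy sketch places the red samples of two diagonally offset Bayer sensors on a common grid and fills each remaining location with the average of its available north, south, east and west neighbors. It is a hedged illustration only: the data structures, function name and toy indexing are assumptions, not the patented implementation.

```python
import numpy as np

def combine_red_diagonal(red1, red2, rows, cols):
    """Hypothetical sketch: merge red samples from two Bayer sensors that are
    offset by one pixel diagonally, then bilinearly interpolate the still-missing
    locations from their N/S/E/W neighbors.

    red1, red2: dicts mapping (row, col) -> measured red value.
    """
    grid = np.full((rows, cols), np.nan)
    for (r, c), v in red1.items():   # first sensor: R samples
        grid[r, c] = v
    for (r, c), v in red2.items():   # second sensor: r samples (shifted by one pixel diagonally)
        grid[r, c] = v

    out = grid.copy()
    for r in range(rows):
        for c in range(cols):
            if np.isnan(grid[r, c]):
                # Average the available north, south, east, west neighbors,
                # e.g. R̂23 = (R13 + r22 + r24 + R33)/4 in the text above.
                neighbors = [grid[rr, cc]
                             for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                             if 0 <= rr < rows and 0 <= cc < cols and not np.isnan(grid[rr, cc])]
                if neighbors:
                    out[r, c] = np.mean(neighbors)
    return out
```

With a one pixel diagonal shift the two sensors' red samples interleave, so at a 1-based location such as (2, 3) the north/south neighbors come from one sensor and the east/west neighbors from the other, which reproduces the averaging formula above.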
[0029] Fig. 6 shows a system of two Bayer sensors offset by one half (1/2) pixel diagonally according to an implementation. In Fig. 6, the red pixels of the first Bayer sensor are labeled with an R. The red pixels of the second Bayer sensor are labeled with an r. The green pixels of the first Bayer sensor are labeled with Gr1 and Gb1. The green pixels of the second Bayer sensor are labeled with Gr2 and Gb2. As seen in Fig. 6, the red sampling improves over the use of a single Bayer sensor, but not as much as in the implementation shown in Fig. 5, because there are rows and columns that do not have any red pixels. The green sampling improves along a diagonal, but green sampling in the horizontal and vertical directions stays the same. Bilinear interpolation for each missing red pixel can be done with the 8 nearest neighbors of the missing pixel. Similar bilinear interpolation can be done for blue and green missing pixels.
[0030] Fig. 7 shows a system of two Bayer sensors offset by one (1) pixel vertically according to an implementation. In Fig. 7, the red pixels of the first Bayer sensor are labeled with an R. The red pixels of the second Bayer sensor are labeled with an r. The green pixels of the first Bayer sensor are labeled with Gr1 and Gb1. The green pixels of the second Bayer sensor are labeled with Gr2 and Gb2. As seen in Fig. 7, red (and blue) vertical sampling improves, but horizontal red (and blue) sampling remains the same. Green sampling improves in all directions, but the sampling in one of the diagonal directions is less than in the implementation shown in Fig. 6. Bilinear interpolation for each missing red pixel can be done with the 6 nearest neighbors of the missing pixel. Similar bilinear interpolation can be done for the missing blue pixels. There is no interpolation needed for the green pixels.
[0031] In an additional implementation, as shown in Fig. 8, three Bayer sensors may be used to further improve spatial resolutions, such as by using the structure illustrated in Fig. 2. In Fig. 8, the red pixels of the first Bayer sensor are labeled with an R1. The red pixels of the second Bayer sensor are labeled with an R2. The red pixels of the third Bayer sensor are labeled with an R3. The green pixels of the first Bayer sensor are labeled with Gr1 and Gb1. The green pixels of the second Bayer sensor are labeled with Gr2 and Gb2. The green pixels of the third Bayer sensor are labeled with Gr3 and Gb3.
[0032] As shown in Fig. 8, the first Bayer sensor may have a horizontal shift (for example, one pixel) relative to a third Bayer sensor and a second Bayer sensor may have a vertical shift (for example, one pixel) relative to the third Bayer sensor. This implementation allows for the green channel to have the same sampling advantage as shown in Fig. 7, along with reduced noise (because there are two green pixels that can be combined at half of the green sample locations). However, this implementation has an additional advantage that the red and blue resolutions are improved in both the horizontal and vertical directions as compared to the systems in Figs. 6 and 7. Bilinear interpolation for each missing red pixel can be done with its 8 nearest neighbors. Missing blue pixels can be similarly bilinearly interpolated. No interpolation is needed for the green pixels (although the Bayer 2 and Bayer 3 green pixels fall on the same locations and can be averaged together to lower the noise).
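The noise benefit of the co-located green samples mentioned in paragraph [0032] can be sketched as a simple average of the two green planes. This is an illustration under the assumption of independent sensor noise (averaging two independent measurements of the same site reduces the noise standard deviation by roughly a factor of √2); the array names are placeholders:

```python
import numpy as np

def average_colocated_greens(green_sensor2: np.ndarray, green_sensor3: np.ndarray) -> np.ndarray:
    """Average the green samples of the second and third Bayer sensors, which
    fall on the same locations in the Fig. 8 arrangement, to lower noise."""
    return 0.5 * (green_sensor2 + green_sensor3)
```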
[0033] In additional implementations, light may be split, such as by one or more additional beam splitters, to further image sensors. The additional sensors may have different dynamic ranges, different wavelength responses, geometric properties, further shifted locations or different orientations.
[0034] With reference to Fig. 9, this disclosure according to an implementation is also directed to a method for processing images from a camera system with offset image sensors. A first image is received from a first image sensor, step 100. A second image is received from a second image sensor offset to the first sensor, step 102. The first and second images are combined, step 104. Missing pixels in each color are interpolated from the surrounding pixels, step 106. The combined image with the interpolated pixels is then displayed, step 108. In an implementation, the image sensors are Bayer sensors.
[0035] The method may also have the steps of receiving at least one additional image from an additional sensor and combining the at least one additional image with the first image and the second image. Additionally, the method may have the step of averaging pixel values received from multiple sensors. In an implementation, the method is performed in an image processing system. The image processing system may include one or more processors and field programmable gate arrays (FPGAs).
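A high-level sketch of the processing flow of Fig. 9 (steps 100 through 108) might look like the following. The helper names and callables are placeholders, not part of the disclosure; the actual combination and interpolation depend on the chosen sensor offset, as discussed for Figs. 5 through 8:

```python
def process_offset_images(images, offsets_px, combine, interpolate, display):
    """Illustrative pipeline: receive images from offset sensors (steps 100/102),
    combine them on a common grid (step 104), interpolate missing color samples
    (step 106) and display the result (step 108).

    images      : list of raw sensor frames (e.g. NumPy arrays)
    offsets_px  : per-sensor (horizontal, vertical) offsets in pixel units
    combine     : callable mapping (images, offsets_px) -> sparse color planes
    interpolate : callable filling the missing samples of each color plane
    display     : callable consuming the finished frame
    """
    sparse_planes = combine(images, offsets_px)   # step 104
    full_frame = interpolate(sparse_planes)       # step 106
    display(full_frame)                           # step 108
    return full_frame
```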
[0036] The use of multiple Bayer sensors at an offset to each other is highly advantageous because it offers the ability to achieve higher spatial resolution. For endoscopic camera applications, the combination of aperture, pixel size and a spatial offset can allow for better perceived resolution.
[0037] This invention disclosure outlines a spatial offset for image sensors that optimizes for signal sensitivity and resolution and effectively overcomes the disadvantages associated with the prior art. However, it will be apparent that variations and modifications of the disclosed implementations may be made without departing from the principles of the invention. The presentation of the implementations herein is offered by way of example only and not limitation, with a true scope and spirit of the invention being indicated by the following claims.

Claims

WHAT IS CLAIMED IS:
1. A camera system comprising: a first filter array image sensor; and a second filter array image sensor; wherein the first and second filter array image sensors are positioned at an offset to each other.
2. The system of claim 1 wherein the second filter array image sensor is positioned at an offset of at least one-half pixel to the first filter array image sensor.
3. The system of claim 1 wherein the second filter array image sensor is positioned at a one pixel diagonal offset to the first filter array image sensor.
4. The system of claim 1 wherein the second filter array image sensor is positioned at a half pixel diagonal offset to the first filter array image sensor.
5. The system of claim 1 wherein the second filter array image sensor is positioned at a one pixel horizontal offset to the first filter array image sensor.
6. The system of claim 1 wherein the second filter array image sensor is positioned at a one pixel vertical offset to the first filter array image sensor.
7. The system of claim 1 further comprising a third filter array image sensor; and wherein each filter array image sensor is positioned at an offset to the other filter array image sensors.
8. The system of claim 7 wherein the first filter array image sensor has a horizontal offset relative to the third filter array image sensor and the second color filter array image sensor has a vertical offset to the third filter array image sensor.
9. A camera system comprising: a plurality of image sensors containing pixels with a plurality of wavelength response types arranged in a regular pattern wherein at least one wavelength response type is adjacent in at least one direction to a different wavelength response type; wherein at least two image sensors are positioned at an offset to each other; wherein each offset is of a distance corresponding to at least one half pixel.
10. The system of claim 9 wherein a second image sensor is positioned at a diagonal offset to a first image sensor.
11. The system of claim 9 wherein a second image sensor is positioned at a one pixel diagonal offset to a first image sensor.
12. The system of claim 9 wherein a second image sensor is positioned at a half pixel diagonal offset to a first image sensor.
13. The system of claim 9 wherein a second image sensor is positioned at a horizontal offset to a first image sensor.
14. The system of claim 9 wherein a second image sensor is positioned at a one pixel horizontal offset to a first image sensor.
15. The system of claim 9 wherein a second image sensor is positioned at a vertical offset to the first image sensor.
16. The system of claim 9 wherein a second image sensor is positioned at a one pixel vertical offset to a first image sensor.
17. The system of claim 9 comprising three image sensors; and wherein each image sensor is positioned at an offset to the other image sensors.
18. The system of claim 17 wherein a first image sensor has a horizontal offset relative to a third image sensor and a second image sensor has a vertical offset to the third image sensor.
19. The system of claim 18 comprising two image sensors; and wherein the image sensors are Bayer sensors.
20. The system of claim 18 comprising three image sensors; and wherein the image sensors are Bayer sensors.
21. The system of claim 18 comprising three image sensors; and wherein two of the image sensors are Bayer sensors and one of the image sensors is a near infra-red sensor.
22. A method for processing images from a camera system with offset image sensors comprising the steps of: receiving a first image from a first image sensor; receiving a second image from a second image sensor, the second image sensor being offset from the first image sensor; combining the first image and the second image; interpolating missing pixels from surrounding pixels; and displaying the combined image with interpolated pixels.
23. The method of claim 22 further comprising the steps of: receiving at least one additional image from an additional sensor, the at least one additional sensor being offset from at least one of the first sensor and the second sensor; and combining the at least one additional image with the first image and the second image.
24. The method of claim 22 further comprising the step of averaging pixel values received from multiple sensors.
PCT/US2021/057256 2020-10-30 2021-10-29 A camera system with offset image sensors WO2022094215A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063107709P 2020-10-30 2020-10-30
US63/107,709 2020-10-30

Publications (1)

Publication Number Publication Date
WO2022094215A1 (en) 2022-05-05

Family

ID=78819634

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/057256 WO2022094215A1 (en) 2020-10-30 2021-10-29 A camera system with offset image sensors

Country Status (1)

Country Link
WO (1) WO2022094215A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3971065A (en) 1975-03-05 1976-07-20 Eastman Kodak Company Color imaging array
EP0135300A2 (en) * 1983-07-21 1985-03-27 Victor Company Of Japan, Limited Color television camera with two or more solid-state imaging devices
US6529640B1 (en) * 1998-06-09 2003-03-04 Nikon Corporation Image processing apparatus
US20120105690A1 (en) * 2010-11-03 2012-05-03 Sony Corporation Camera system and imaging method using multiple lens and aperture units
US20180278857A1 (en) * 2017-03-23 2018-09-27 JVC Kenwood Corporation Imaging device and imaging method

Similar Documents

Publication Publication Date Title
USRE48697E1 (en) High resolution thin multi-aperture imaging systems
KR0146260B1 (en) Solid state image pick-up apparatus
US7834927B2 (en) Apparatus and method for producing video signals
US20060233439A1 (en) Method and apparatus for processing a Bayer-pattern color digital image signal
EP2728545B1 (en) Image processing method and device based on bayer format
CN102292975A (en) Solid state imaging element, camera system and method for driving solid state imaging element
EP1974549A2 (en) Method and apparatus for producing bayer color mosaic interpolation for imagers
US20030151685A1 (en) Digital video camera having only two CCDs
TWI338511B (en)
CN106067935A (en) Image pick-up device, image picking system and signal processing method
WO2022094215A1 (en) A camera system with offset image sensors
EP0199738A1 (en) Single-chip solid-state color image sensor and camera incorporating such a sensor
EP2680590B1 (en) Color image pick-up element
US9906744B2 (en) Image sensor having phase difference detection pixels for focus detection, and image pick-up apparatus including the image sensor
JP2815497B2 (en) Color video camera
JP3079839B2 (en) Solid-state imaging device
JP3767367B2 (en) Imaging device
JPH0795595A (en) Color image pickup device
JP2001231052A (en) Method for processing output signal from solid-state image pickup element and camera using it
US11742365B2 (en) High dynamic range image sensor having reduced crosstalk and jaggy
JP3551670B2 (en) Electronic still camera
JPH06350904A (en) Camera
JP3406674B2 (en) Two-chip imaging device
JP3515585B2 (en) Two-chip imaging device
JPH0974571A (en) Image pickup device and image signal processor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21816230

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21816230

Country of ref document: EP

Kind code of ref document: A1