WO2017127844A1 - Imaging system optimized for nanosatellites with multiple cameras and image stabilization and pixel shifting - Google Patents


Info

Publication number
WO2017127844A1
WO2017127844A1 (PCT/US2017/014636)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging system
satellite
camera
sensor array
satellite imaging
Prior art date
Application number
PCT/US2017/014636
Other languages
French (fr)
Inventor
David D. SQUIRES
Peter Mrdjen
Robert MACHINSKI
Jolyon D. THURGOOD
Brij AGRAWAL
Greg DEFOUW
Jeffrey WEDMORE
Matthew SORGENFREI
Original Assignee
Hera Systems, Inc.
Priority date
Filing date
Publication date
Application filed by Hera Systems, Inc. filed Critical Hera Systems, Inc.
Publication of WO2017127844A1 publication Critical patent/WO2017127844A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 Optical elements other than lenses
    • G02B 5/08 Mirrors
    • G02B 5/09 Multifaceted or polygonal mirrors, e.g. polygonal scanning mirrors; Fresnel mirrors
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/02 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors
    • G02B 23/06 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors having a focussing action, e.g. parabolic mirror
    • G02B 26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/18 Mountings, adjusting means, or light-tight connections, for optical elements for prisms; for mirrors
    • G02B 7/182 Mountings, adjusting means, or light-tight connections, for optical elements for mirrors

Definitions

  • the present disclosure generally relates to satellite-based imaging systems, and more generally to nanosatellites that are orbiting vehicles, airborne payloads, or other imaging systems with highly constrained size requirements, for capturing high-resolution images.
  • the costs to launch a rocket into Earth orbit can run into the millions of dollars. As a rocket can carry multiple satellites and other equipment, the cost of the launch can be allocated among the different payloads. Consequently, smaller satellites might incur smaller costs to get into orbit.
  • the measure of a size of a satellite could relate to its mass, its volume, its height, width and depth, as well as its shape. As for shape, it might be that the cost of getting equipment onto a launch bay is a function of the envelope of the satellite.
  • nanosatellites are often deployed, especially where the desired functionality fits in a nanosatellite form factor and where a constellation of satellites is needed.
  • the term "nanosatellite" often refers to an artificial satellite with a wet mass between 1 and 10 kg, but it should be understood that features might be present in satellites outside that range.
  • a constellation of smaller satellites might be more useful than one large satellite for the same or similar construction and launch budget.
  • the result is usually that a rocket payload comprises many more independent vehicles.
  • rocket logistics often dictate that satellites be rectangular prisms or other shapes that are space-filling. For example, some nanosatellites are generally cube shaped.
  • typically, these satellites include propulsion, solar panels for on-board electrical power generation, power management systems, thermal control systems, attitude control systems, computer processors, storage, and communications capabilities.
  • Some satellites are used for imaging and might include a telescope assembly for light gathering and a camera assembly for converting gathered light into electronic data, which can then be processed on-board and/or communicated to another satellite or a ground station.
  • for imaging purposes, a satellite might have to address significant size and operational constraints, such as the resolution of images and the spectra covered. The light available to a satellite might be limited by the amount of time available for image capture.
  • a satellite imaging system used in a satellite has a telescope section including a first reflector that is substantially square and sized to fit into a substantially square aperture of a satellite body, a second reflector, positioned to reflect light reflected from the first reflector, and a lens set including one or more lenses positioned in an optical path of the telescope section.
  • a sensor array is positioned to receive light from the telescope section when light is received through the substantially square aperture, where the sensor array is substantially square.
  • the satellite imaging system may have a second reflector that is substantially square and/or constructed to counteract image distortions as might occur due to being substantially square.
  • the second reflector might be round.
  • the lens set might be square or round.
  • the satellite imaging system might include baffles that are substantially square or substantially round.
  • the elements are substantially square when they have the form factor of a square, adjusted for structural components needed to provide structural stability to the elements or to other components of a satellite containing the satellite imaging system.
  • the first reflector, the second reflector, and/or one or more lenses of the set of lenses can include thermally matched materials in that the thermally matched materials will limit distortion of a final image over a predetermined set of operating conditions.
  • the first reflector, the second reflector, and/or one or more lenses of the set of lenses might be movable by motorized or deformable positioners to perform dynamic compensation of positional error and/or compensation for mechanical variations.
  • the satellite imaging system may also include an imaging system with an optical path optimized for nanosatellites.
  • a satellite imaging system used in a satellite has a telescope section arranged to receive incoming light along an optical path, a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges including one or more wavelength ranges within a visible spectrum, a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges including one or more wavelength ranges outside the visible spectrum, and a dichroic beam splitter in the optical path, where light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera.
  • the dichroic beam splitter might be a whole-spectrum beam splitter, where either the first sensor array or the second sensor array is a panchromatic sensor array, and the other sensor array (second or first array) is a non-panchromatic sensor array providing pixelated, color-filtered images, and where outputs of the panchromatic sensor array are usable to enhance the pixelated, color-filtered images.
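For illustration, the following is a minimal sketch of one way outputs of a panchromatic sensor array might enhance pixelated, color-filtered images (a Brovey-style ratio pan-sharpening method, chosen here only as an example; the disclosure does not specify a particular enhancement algorithm, and all names are hypothetical):

```python
import numpy as np

def pan_sharpen(pan, rgb):
    """Brovey-style sketch: scale each color band by the ratio of the
    panchromatic intensity to an intensity derived from the color bands.
    `pan` is a 2D float array; `rgb` is an (H, W, 3) float array already
    upsampled and registered to the panchromatic resolution."""
    intensity = rgb.mean(axis=2)               # luminance proxy from the color bands
    ratio = pan / np.maximum(intensity, 1e-6)  # guard against divide-by-zero
    return rgb * ratio[..., np.newaxis]        # inject pan spatial detail into each band
```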
  • the satellite imaging system might include electronic bandwidth controls for controlling and/or modifying a passband defined by the dichroic beam splitter, where the first set of wavelength ranges and/or the second set of wavelength ranges can be controlled and/or modified.
  • a third camera and a second beam splitter might be provided in the optical path, where at least a portion of the incoming light is directed to the third camera.
  • Electronic field of view controls might be used for controlling and/or modifying a telescope field of view.
  • a satellite imaging system used in a satellite has a telescope section arranged to receive incoming light along an optical path, a camera having a sensor array positioned in the optical path, where the sensor array shifts relative to an optical axis of the camera while capturing an image to improve the resolution thereof, and an attitude determination and control system that stabilizes the satellite while capturing an image.
  • the satellite imaging system might include an active disturbance cancellation system, where a combination of mechanisms, sensors, and computer algorithms reduces the effect of disturbances while capturing an image, and a high-rate data readout and recording system, where multiple images can be collected within a shortened time period to limit the effect of vibration upon image quality.
  • the satellite imaging system may also include image stabilization and pixel shifting for a nanosatellite imaging system.
  • FIG. 1 is a front isometric view of an imaging nanosatellite.
  • FIG. 2 is a rear isometric view of an imaging nanosatellite.
  • FIG. 3 is a front planar view of a telescope section of an imaging nanosatellite.
  • FIG. 4 is an isometric view of an imaging nanosatellite in a deployed mode with communications antennas deployed.
  • FIG. 5 is an illustration of a rotation control assembly.
  • FIG. 6 is an illustration of an example imaging system with arrangement of multiple cameras and a square aperture telescope section.
  • FIG. 7 is a schematic illustration of light paths of the imaging system of FIG. 6.
  • FIG. 8 is a top, cross-sectional view of components of the imaging system of FIG. 6.
  • FIG. 9 is a top, cross-sectional view of components of the imaging system of FIG. 6, illustrating light paths.
  • FIG. 10 is a cut-away view of an optical barrel section; FIG. 10(a) is an angled view; FIG. 10(b) is a straight-on side view.
  • FIG. 11 is a cut-away view of a telescope section showing a square secondary mirror baffle and an optical barrel section.
  • FIG. 12 is a chart of spectral bands.
  • FIG. 13 illustrates examples of color filters used on pixel arrays.
  • FIG. 14 illustrates an example of increased resolution from the use of subpixel shifting.
  • Techniques described and suggested herein include an imaging satellite with a square primary reflector; other variations include other noncircular shapes.
  • An imaging system of an imaging satellite will have an aperture, optical components and light sensors.
  • the aperture is where the light enters the imaging system. That light is optically processed by reflectors and other components so that the light generally falls onto a light sensor.
  • a common sensor is a sensor array comprising a two-dimensional (2D) array of pixels. Each pixel electrically responds to light impinging on the pixel.
  • the imaging system also includes electronics to measure a pixel's response and record image data based on the responses.
  • An example sensor array might be a 1024 by 1024 pixel array (1 megapixel) or a 5,120 by 5,120 pixel array (25 megapixels), but could also be a non-square array, such as a rectangle or some other shape.
  • the imaging system also includes a processor that interacts with the electronics to, for example, process the image data recorded by the electronics or image data sent directly to the processor from the electronics without recording.
  • the processor might compress the image data, perform some calculations, and/or modify the image data.
  • the processor might also coordinate with a communications system to transmit some of the image data to other satellites and/or ground stations.
  • the image data can be in the form of a single 2D array of values or multiple 2D arrays of values.
  • the image data might also include a time component, such as where not all of the pixels were read at the same instant in time and the image data might include a sequence of readings, such as might be used to form a video sequence.
  • in communicating image data, or even when not communicating image data, the processor might direct the communications system to send non-image data, such as satellite performance data, settings, and other data.
  • Some data stored by the processor might include mission data.
  • Mission data might specify, for one or more missions, which images to capture and when and other details of image capture.
  • the processor might also control other aspects of the satellite, such as how to position and reposition the satellite during, before or after an imaging event.
  • the processor might implement various functions and provide equivalent structure to physical components by virtue of program code that is stored in program code memory and is executed by the processor as part of the operation of the processor.
  • the particular operation(s) of the processor might be determined by the particular construction of the processor and/or the program code in program memory that the processor is designed to execute.
  • the optical components might include reflectors (mirrors), lenses, prisms, splitters, collimators, filters, obscurers and the like. Some of these components can be movable, possibly under control of the processor via motors, actuators, etc. that are controllable by the processor. Some of the obscurers are part of the optical components, while others are unrelated to the optical path. For example, where a primary reflector is used with a secondary reflector, the secondary reflector and struts that hold the secondary reflector in place might be obscurers in the optical path.
  • the optical path might also be obscured by hinges or other necessary parts of the satellite that cannot be, or cannot easily be, removed from the optical path from the aperture to the sensor array(s).
  • a shroud or baffle might be used in or around the aperture to block undesired light from entering the aperture.
  • the primary reflector and other elements of the optical path should have a shape optimized to the shape of the satellite and have surfaces as large as possible.
  • the first mirror and other elements of the telescope section in the optical path can have a square cross-section. This increases the light collecting area, and also allows the shape of the image plane formed by these optics to more nearly match the square or rectangular shape of commercially available sensor arrays used in the camera section of an imaging system.
  • Another advantage is that with a square aperture, the aperture diameter (diagonal dimension) is larger and thus the resolution of the telescope is improved.
  • the aperture is square, or at least substantially square.
  • the aperture is square with chamfered corners. Such an aperture might be found on a satellite that uses chamfered corners.
  • the aperture can have a larger area than a corresponding inscribed circle, thereby increasing the amount of light collected and also causing a larger percentage of collected light to fall on pixels, per unit time, in a sensor array than if the profile of the light were circular.
  • the chamfering can be used to reduce diffraction artifacts and point spread function distortion.
  • each other optical component is shaped commensurate with the aperture shape.
  • the secondary reflector can be square, lenses can be square, etc.
  • some of the components in the path can be round so as to simplify construction, while still taking in all available light.
  • the result is greater use of light energy both in collecting light and in applying light to the sensor.
  • a further advantage is the increase in aperture that is represented by the diagonal dimension of a square aperture as compared to the diametric dimension of a circular aperture designed to fit in the same spacecraft.
  • the diagonal dimension increases the sampling frequency of the telescope, and the square shape of the aperture provides a point spread function (PSF) that has a squarish central spot that is narrower than the Airy disk provided by a circular aperture.
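As a concrete check of this geometric advantage, the short sketch below compares a square aperture to the largest circle fitting the same satellite cross-section (the 20 cm side length is an assumed value for illustration, not taken from the disclosure):

```python
import math

side = 0.20  # m, assumed square-aperture side for a nanosatellite cross-section

circle_area = math.pi * (side / 2) ** 2  # largest circular aperture in the same envelope
square_area = side ** 2                  # square aperture filling the envelope

print(square_area / circle_area)  # 4/pi, about 1.27x more light-collecting area
print(math.sqrt(2))               # about 1.41x larger aperture dimension on the diagonal
```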
  • conventionally, the other elements in the telescope have also used a round shape, although the sensing array may be rectangular, resulting in the loss of a portion of the gathered light. This is similar to the situation in a standard digital camera, where the sensor array of charge-coupled devices (CCDs) or complementary metal-oxide-semiconductor (CMOS) devices is rectangular (corresponding to the shape of a photograph), while the lens and other optics are round for convenience, resulting in a loss of a crescent-shaped region at each of the image's edges.
  • a square or rectangular dispenser compartment is provided to encapsulate and fit multiple satellites in a single launch dispenser.
  • the individual nanosatellites are no longer round, but typically will have a square cross-section. Accordingly, to optimize telescope performance, it is more beneficial to make a mirror with increased surface area to fit within the square satellite. Maximizing the first reflector's size to capture more light and increase aperture dimensions while still conforming to the shape of the nanosatellite can be obtained by use of a square first reflector, perhaps with rounded or chamfered corners if needed due to the satellite's internal construction or optical diffraction characteristics. To compensate for the increased mass of a square mirror with larger area, material is machined away from the back of the mirror in a process called "light-weighting" which results in a stiff but light weight structure. The mass difference relative to a circular mirror is very manageable.
  • because the resultant image from the first reflector will be square and the image sensing array is square or rectangular, it is not necessary to use conventional round elements for a secondary reflector, lenses, and other elements in the optical path.
  • a square secondary reflector may be used while any internal corrective lenses may be circular.
  • the reflected light beam remains square in shape until it arrives at the sensor.
  • using a square secondary reflector, baffles, lenses (if desired) and other elements of the optical path will reduce weight while still transmitting all of the light gathered at the primary reflector to the sensor array. This can require use of some unusually shaped elements, since lenses, for example, are usually round.
  • the non-round elements should be aligned in rotation about the central optical axis, since any square optical or baffle elements will lack rotational symmetry about this axis; in some cases, though, that might not be a concern.
  • image rotation devices (e.g., prisms) might also be included in the optical path. The diagonal of the aperture provides higher resolution than the width or height; image rotation would take advantage of this by enabling alignment of the higher-resolution diagonal dimension of the optics with the higher-resolution width dimension of the sensor array, should one be present.
  • the exemplary embodiments described here are based on a satellite with a square cross-section, as this is a typical configuration for the packing of nanosatellites as a payload into a launch vehicle, but can be extended to other configurations that lack the conventional rotational symmetry.
  • rectangular or hexagonal cross-sections also lend themselves to dense packing.
  • the shape of the mirror would again be determined by the shape of the satellite in order to maximize the available light gathering, while minimizing the mass of the elements along the optical path by using the same shape for them.
  • Techniques described and suggested herein include an imaging satellite having an imaging system that provides for separate cameras, such as separate lensing, separate filtering, and separate sensor arrays, possibly also including processing logic to combine outputs of the separate sensor arrays in various manners to improve over what could be done with a single sensor array.
  • a camera might be formed of a set of zero or more focusing elements, a set of zero or more light deflection elements, a set of zero or more filtering elements, and a sensor array that captures incident light.
  • the sensor array might comprise a plurality of light sensitive pixels in a two-dimensional array (2D) of pixels.
  • the individual pixel sensors might be charge coupled devices (CCDs), complementary metal-oxide semiconductor (CMOS) type, microbolometer arrays, or other sensor elements.
  • a sensor array might have all of its pixels sensitive to the same range of light wavelengths, or it might have a varying pattern of sensitivities over the array.
  • the array will need to accommodate sensors for each of the red, green, blue, and yellow color wavelength ranges, which will reduce the resolution of each wavelength range by half (doubling the size of the smallest detectable object in green light). If additional wavelength ranges are to be sensed with the same array, such as into the infra-red or the ultra-violet, this will further reduce the resolution of the individual wavelength ranges.
  • multiple cameras are used, such as where incoming light from a telescope section of a satellite may go through a dichroic beam splitter, with the standard visible spectrum going to a first camera and wavelengths outside of the standard visible spectrum, such as in the infrared or coastal blue range, being sent to a second camera, allowing image data from multiple wavelength ranges to be captured simultaneously.
  • the image data from the different wavelengths of two (or more) cameras can then be selectively recombined.
  • the balancing of weight against utilization of wavelength bands is an important concern for a satellite imaging system.
  • the satellite imaging system described herein performs this balancing by using multiple cameras having sensor arrays sensitive to different wavelength ranges, in order to improve the spectral resolution of the multi-camera system and make use of the full sensitive spectrum of the sensors. The individual cameras can each be sensitive to more than one wavelength range, in order to save on the mass of the satellite while utilizing the full available spectrum of light and providing other benefits.
  • the incoming image can be exposed to at least two cameras, with each of the cameras getting a distinct set of one or more wavelength bands, which can be sensed in one or more narrower filtered wavelength bands that can be captured simultaneously.
  • the images of the selected wavelength ranges from each of the cameras are then aligned and combined to form a remerged image having color components from more than one of the cameras.
  • the selection and combination of wavelength ranges from the different cameras can be done on the satellite, done terrestrially, or some combination of these.
  • the incoming image can be split using a double-dichroic beam splitter.
  • there are two cameras, with the first camera receiving the visible light wavelengths, while the second camera gets the red edge (RE), Near Infrared 1 (N1) and Near Infrared 2 (N2) bands, and possibly also wavelengths below the range of standard visible wavelength sensors, such as Coastal Blue (CB).
  • Use of one camera for the visible range can have the advantage that data from the visible range is commonly wanted in applications and that such sensors are more readily available. This also allows for use of differing resolution levels, if desired, where the visible image can have a higher resolution that can be overlaid with data from selected ones of the additional wavelengths that are of interest for a particular application.
  • available light from a light path can be partitioned into two (or more) separate subpaths for use by camera sensors with different spectral sensitivities. While a given sensor array might not be sensitive to a particular range of wavelengths, the light in that range of wavelengths is not wasted, as it can be directed to a different sensor array that is sensitive to it.
  • other wavebands can also be used, such as shortwave infrared (SWIR) and longwave infrared (LWIR); other implementations can combine hyperspectral imagers in these wavebands with a visible waveband imager.
  • Techniques described and suggested herein include an imaging satellite with an example satellite imaging system having a camera, a pixel shifting mechanism, thermally stable imaging payload, and high-stability attitude determination and control system (ADCS) to improve image resolution.
  • the satellite incorporates a number of design elements which, in combination, provide the stable result required for good imaging.
  • the resolution of a satellite imaging system is dependent upon the amount of light that the system's telescope is able to gather, the diffraction limited resolution that the optical design is able to provide, the sensitivity and pixel arrangement of the sensor, and the stability of the imaging system (i.e. the relative motion between the platform and target).
  • the resolution of the camera depends on the density and arrangement of the individual pixel sensors on the sensor array (typically known as the pitch, or distance, between pixels). Pixel pitch establishes what is known as the spatial sampling rate of the sensor.
  • the individual pixel sensors such as a charge coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) type sensor, can only be packed with a certain density into a square two-dimensional array without obstructing one another. If the pixels of the array are to have elements that are sensitive in more than one wavelength band, this will further decrease the resolution within each band.
  • the array will need to accommodate sensors for each of the red, green, and blue color ranges (such as in a Bayer filter mosaic), which will reduce (worsen) the resolution of the green range by half and the red and blue ranges by a factor of four relative to the maximum un-aliased diffraction-limited and sensor sampling rate limited resolution. If additional wavelength ranges are to be sensed with the same filter array, such as into the near infra-red or the near ultra-violet, this will further reduce the resolution of the individual wavelength ranges.
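To make this sampling loss concrete, the sketch below counts the fraction of pixels that sample each band under a standard 2x2 Bayer (RGGB) tile; an RGBY mosaic, as discussed above, would instead give each of its four bands one quarter of the pixels (the sensor size is an assumed value):

```python
import numpy as np

tile = np.array([["R", "G"],
                 ["G", "B"]])  # one Bayer RGGB tile, repeated across the sensor

pixels = 1024 * 1024  # assumed sensor size for illustration
for band in ("R", "G", "B"):
    share = (tile == band).sum() / tile.size
    print(band, share, int(pixels * share))  # G: half the samples; R and B: a quarter each
```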
  • although a second camera adds weight and volume, the additional filter array that comes with it is very beneficial for optimizing the use of light energy (radiometric performance), wavelengths (spectral resolution), and retention of spatial (sampling) resolution.
  • the added weight and volume are manageable in the described satellite system.
  • the satellite imaging system described here balances mass, volume, and imaging performance by using multiple cameras having sensor arrays sensitive to different wavelength ranges, in order to improve sampling resolution, but where the individual cameras can each be sensitive to more than one wavelength range, in order to save on the mass of the satellite.
  • the incoming image can be exposed to at least two cameras, with each of the cameras getting a distinct set of one or more wavelength bands that can be captured simultaneously.
  • One of the cameras is pixel-shifted in two axes to increase collected information and reduce imaging artifacts, giving an apparent spatial and spectral resolution improvement.
  • the images of the selected wavelength ranges from each of the cameras are then aligned and combined to form a remerged image having color components from more than one of the cameras.
  • the selection and combination of wavelength ranges from the different cameras can be done on the satellite, done terrestrially, or some combination of these.
  • FIG. 1 is a front isometric view for an example of an imaging nanosatellite.
  • the satellite is a rectangular box shape, with a square cross-section, that allows a number of such satellites to be stacked compactly as the payload of a launch vehicle.
  • the square telescope provides for optimal use of the satellite cross-section, thereby allowing an increase in aperture area and a diagonal aperture dimension beyond what a circular aperture would provide within the same cross-section. This provides a correspondingly increased light collecting ability and a larger effective aperture dimension on the diagonal of the mirror.
  • the satellite 102 is shown with a deployable fore-baffle 104 extended to aid with stray light control, but which can be retracted to keep the satellite compact.
  • At the rear is the camera system, which is partially visible in the view of FIG. 1.
  • the camera system is more visible in the rear isometric view of FIG. 2.
  • a dual camera system with a dichroic beam splitter in the optical pathway is shown, but the arrangement of the optimized optical path can also be used with a more conventional single sensing array arrangement or with additional beam splitters and cameras.
  • Ground Sample Distance (GSD) is essentially the distance between pixel centers as they would appear if the pixel outlines were projected on the ground, namely the boundary of the area sensed by each pixel on the Earth's (or another target's) surface. The distance between these square area centers is the GSD.
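The GSD follows from simple projective geometry, as in the sketch below; the pixel pitch matches the 4.5 micron figure mentioned later in this disclosure, while the focal length and altitude are illustrative assumptions:

```python
def ground_sample_distance(pixel_pitch_m, focal_length_m, altitude_m):
    # Similar triangles: a pixel of pitch p behind a focal length f views a
    # ground patch of width p * (altitude / f) when looking straight down.
    return altitude_m * pixel_pitch_m / focal_length_m

print(ground_sample_distance(4.5e-6, 1.2, 500e3))  # ~1.9 m GSD for the assumed values
```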
  • FIG. 3 is a front planar view of a telescope section of an imaging system of a satellite 302.
  • FIG. 3 shows a primary reflector 304, a secondary reflector 308 and struts 306(1)-(4) for supporting secondary reflector 308.
  • Struts 306 can be attached to satellite 302 in the plane of secondary reflector 308 and/or attached more to the rear, such as further back on or through primary reflector 304.
  • secondary reflector 308 is square and occupies around 20% of the aperture. In other variations, the percentage is higher or lower.
  • secondary reflector 308 is round in other variations, but square secondary reflectors might be preferred when the area of the round secondary reflector that is outside an inscribed square is not illuminated by light from the primary reflector, as that area outside the inscribed square might obscure light from entering the aperture in the first place.
  • FIG. 4 is an isometric view of an imaging nanosatellite 402 in a deployed mode with communications antennas deployed.
  • FIG. 5 is an illustration of a rotation control assembly 502 including reaction wheels 504(1)-(4) and torque rods 506(1)-(3).
  • the reaction wheels may be used for rotational control of the satellite. Because of their orientation, slowing down and speeding up the reaction wheels can impart a torque about the center of mass of the satellite. By slowing down (or speeding up) followed by speeding up (or slowing down), the reaction wheels can cause the satellite to start rotating, rotate to a position and stop at that position.
  • a processor can compute the speed changes needed on each of the reaction wheels to maintain a "stare mode" view of a fixed point on the Earth even as the satellite moves along its orbit, thus allowing the collection of large amounts of light from a source.
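The principle at work is single-axis momentum exchange between a wheel and the satellite body, as in this sketch (both inertia values are illustrative assumptions, not figures from the disclosure):

```python
I_BODY = 0.5    # kg*m^2, assumed satellite inertia about the slew axis
I_WHEEL = 2e-4  # kg*m^2, assumed reaction wheel inertia

def body_rate_change(wheel_rate_change_rad_s):
    """Total angular momentum is conserved, so accelerating the wheel one
    way rotates the satellite body the other way."""
    return -I_WHEEL * wheel_rate_change_rad_s / I_BODY

print(body_rate_change(1000.0))  # wheel spun up by 1000 rad/s -> -0.4 rad/s body rate
```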
  • the torque rods can all be identical, but do not need to be.
  • the three torque rods can be mutually orthogonal, but in some implementations, there could be more than three, for redundancy and/or additional torque.
  • the reaction wheels might hit a point of saturation.
  • the three orthogonal torque rods can add torque as they pass through the Earth's magnetic field to help with desaturation.
  • the torque rods and reaction wheels are mounted on an integrated rotation control assembly, leading to a more compact design.
  • FIG. 6 is an illustration of an example dual-camera imaging system with arrangement of multiple cameras and a square aperture telescope section.
  • the imaging system includes a square aperture telescope section 602, a beam splitter 604, a mirror 610, a first camera 608 and a second camera 606.
  • FIG. 7 is a schematic illustration of light paths of the imaging system of FIG. 6.
  • FIG. 7 schematically illustrates the splitting of the incident light for the two cameras.
  • the full spectrum incoming radiation is incident upon a cube beam splitter that includes a dichroic filter that separates out the visual portion of the spectrum (R, G, B, Y) from the wavelengths outside of the RGBY space (CB, RE, N1, N2).
  • Other arrangements can be used, such as putting CB in the same sensing array as (G, B, Y) and putting red in with the longer wavelengths (RE, N1, N2).
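A toy routing table for the arrangement described above is sketched below; the band center wavelengths are rough, commonly cited values for such bands, and the dichroic cutoffs are assumptions, none of them taken from the disclosure:

```python
# Hypothetical band centers (nm) for the two cameras' sensing arrays.
CAMERA_1_BANDS = {"blue": 480, "green": 545, "yellow": 605, "red": 660}
CAMERA_2_BANDS = {"coastal_blue": 430, "red_edge": 720, "nir1": 780, "nir2": 865}

def route(wavelength_nm, lo=450.0, hi=700.0):
    """Idealized dichroic: the visible passband goes to camera 1 and
    everything outside it goes to camera 2."""
    return 1 if lo <= wavelength_nm <= hi else 2

for name, wl in {**CAMERA_1_BANDS, **CAMERA_2_BANDS}.items():
    print(name, "-> camera", route(wl))
```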
  • both cameras are shown to be of the same spatial (sampling) resolution, but, for example, it may be desirable for the camera for the visible range to have a relatively higher sampling resolution.
  • Other wavelength ranges can similarly be incorporated by using corresponding array sensors; and more cameras can be incorporated by further splitting of the incoming radiation.
  • FIG. 8 is a top, cross-sectional view of components of the imaging system of FIG. 6, including a primary reflector 902, a secondary reflector 903, and a set of lenses 905.
  • An air gap might be provided so that vibrations, in orbit and/or during launch, do not easily transfer from the telescope section to the cameras.
  • FIG. 9 is a top, cross-sectional view of components of the imaging system of FIG. 6, illustrating light paths.
  • the incoming light is incident on a primary reflector 902.
  • the primary reflector 902 has increased hyperbolic curvature relative to most commercial telescopes.
  • Primary reflector 902 reflects the incident light onto secondary reflector 903, also with increased curvature, which in turn reflects the light through the set of lenses 905 and on to the sensors, where the paths of a number of rays are shown.
  • the inner primary baffle and secondary baffle are also designed to be square to minimize mass.
  • the square shape of the secondary baffle also allows more light energy per unit time to arrive at the primary mirror than a traditional circular baffle would. The latter further enhances signal to noise ratio (SNR) of the telescope.
  • in this example, the sensor section includes two separate cameras.
  • the optical path includes a dichroic splitter to separate out different wavelength ranges used by the sensor arrays after filtering of wavelengths by the Color Filter Arrays (CFAs) for two cameras, which in this example has one camera for the standard visible spectrum that uses an RGBY colorspace sensor array and another camera for wavelengths on either or both sides of the visible, such as bands known as Coastal Blue (near UV), Red Edge, and near infrared (NIR). More generally, other sensor arrays can be used, with sensitivity to bands well beyond those discussed here; and, more cameras with different filter and sensor arrays, or a single camera with a different filter and sensor array can be used after lenses 905.
  • the system can have the ability to modify the telescope field of view such that the beam splitter (or beam splitters) and multiple cameras can enable imaging of wider target areas.
  • the RGB camera has a sensor array sensitive to the visual spectrum and the second camera has a sensor array that is sensitive to wavelengths on one or both sides of the visual spectrum.
  • the use of a dichroic beam splitter allows for each of the cameras to receive more or less all of the light of their respective wavelength ranges. Additionally, this helps to keep the undesired bands from leaking through color filter arrays (CFAs) on each sensor to some degree, providing better signal-to-noise results.
  • the two optical subpaths from the beam splitter surface to each of the sensor arrays in cameras 606 and 608 are different lengths.
  • the optical subpaths are the same length. As illustrated in FIG. 9, this may involve a set of mirrors and/or lenses to maintain compactness.
  • if the subpaths are the same length, that may make image correction and coregistration easier; with the distortions due to other optical elements and features, the distortions are equalized. This can be important in applications where images from more than one camera are then recombined into a single image or data structure representing light capture for many different spectral wavelengths.
  • one or more additional beam splitters can be added along with one or more additional cameras.
  • a second beam splitter can be added in the path to one of the shown cameras to provide the incident light (or some range of it) to a third camera. This can be used to increase spatial (sampling) resolution, the wavelengths that are sensed (spectral resolution), or some combination of these.
  • the beam splitter (or one or more of multiple beam splitters) may be a whole-spectrum beam splitter, so that a panchromatic sensor can be used on one of the cameras to enhance the images on another camera that uses pixelated color filtered images.
  • FIG. 10 is a cut-away view of an optical barrel section; FIG. 10(a) is an angled view. In the background is the primary mirror.
  • the cut-away image of a central baffle shows a square fore-section with a circular barrel section passing through the primary mirror. Light enters the square opening from the left, and passes out of the baffle section to the right where the remaining optics and cameras are positioned.
  • FIG. 10(b) is a straight-on side view of the same cut-away.
  • the baffle that passes through the primary mirror may be square, but if it is round at the point where it passes through the primary mirror and square above that (closer to the secondary mirror), this can improve the handling of stray light by restricting the size of the open end of the baffle.
  • FIG. 11 is a cut-away view of a telescope section showing a square secondary mirror baffle and an optical barrel section. This shows the relative positioning of the square secondary mirror baffle and the square internal baffle.
  • FIG. 12 illustrates one example of a correspondence between the two cameras and the wavelengths to which their sensor arrays respond.
  • the array for the first camera is for the visual range using the RGBY (red, green, blue, yellow) colorspace.
  • the second camera is sensitive to Coastal Blue (CB) with a wavelength below the visible and Red Edge (RE), Near Infrared 1 (N1) and Near Infrared 2 (N2) at wavelengths above the visible range.
  • this multi-camera arrangement uses full-frame imaging cameras capable of using global-shutter-mode imaging. This allows the full array of all included wavelengths to be captured in images simultaneously. Consequently, the different images at different sensor locations do not suffer from the sort of time lags that can affect images when the different wavelengths or areas of the image are not captured at the same time, such as can occur when using a push broom scanner or rolling-shutter imaging mode, for example, for obtaining the image data.
  • Post-capture processing registers the pixels from the different cameras and from the different color filters.
  • This post-capture processing might be performed by processing circuitry (e.g., a processor, memory and program instructions) located at the camera, elsewhere in the satellite, or at a ground station. Registration is desirable when, for example, a single one-pixel wide white light source on Earth is being captured, effectively as a point source. In that case, the pixel arrays from the two cameras might show that point source in different locations in the pixel array due to the optics used, or due to the differences in capture rate or capture time between the cameras (a small asynchronicity in the capture times can result in a considerable image shift given the resolution and the velocity of the satellite).
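One common way to estimate the relative offset between frames from the two cameras is phase correlation, sketched below with NumPy; the disclosure does not mandate a specific registration method, and subpixel refinement is omitted here:

```python
import numpy as np

def estimate_shift(img_a, img_b):
    """Estimate the whole-pixel translation between two same-size frames
    from the peak of their normalized cross-power spectrum."""
    cross = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Fold peaks past the array midpoint back to negative shifts.
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape))
```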
  • the processing might be based on pixelated color filtered image data.
  • the realignment process is simplified in the implementation example using full-frame imaging sensors that perform simultaneous imaging.
  • the accumulated pixel data can be combined on the satellite, sent back uncombined, or in some intermediate state. Although sending back all of the data requires more transmission time, this provides greater flexibility in that many additional combinations or views can be generated for different user needs. For example, one user may want relatively low resolution coastal blue data overlaid on an RGB image, while another may only be interested in red and longer wavelengths, but at the highest recorded resolution.
  • FIG. 13 illustrates examples of color filters used on pixel arrays. These filters would act on the incident light that has already been split by the beam splitter, as the filters would only see, for the most part, light in the spectra designated for that pixel array.
  • FIG. 14 illustrates an example of increased resolution (more accurately, increased discernible spatial content, derived from what is known as de-aliasing) from the use of subpixel shifting which reduces errors in the sampled image known as aliasing errors.
  • the ability to provide high quality images is improved by mechanically stabilizing the imaging system, both as this improves the images from the individual cameras and makes combining the images from the different cameras of a multi- camera system easier.
  • a terrestrial imaging system can be mounted to an object for steadying, but for a satellite operating in a weightless environment, the situation is more complicated.
  • the achieved resolution is limited by the pixel density of the sensor array used by the camera, with this resolution being further diluted when a filter array is used to divide the spectrum of light into multiple wavelength bands distributed to a pattern of pixels.
  • One way to improve the effective density of the sensor pixel array is through use of a motor mechanism that can move the sensor array by some subpixel distance, so that the sensor captures multiple images, effectively providing images at subpixel resolution.
  • the motor mechanism might move the sensor array one half of a pixel width left/right and/or one half of a pixel width up/down, resulting in four different positions.
  • in another example, the motor mechanism moves the sensor array in steps of one third of a pixel width along each axis, visiting the offsets (0, 0), (0, 1/3), (0, 2/3), (1/3, 0), (1/3, 1/3), (1/3, 2/3), (2/3, 0), (2/3, 1/3), and (2/3, 2/3), resulting in nine different positions.
  • This provides a sensor array with the equivalent of four or nine times the number of pixels by combining the shifted images.
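A minimal sketch of the combining step for the four-position case follows; real processing would also register the frames and correct for optical blur, and the names here are hypothetical:

```python
import numpy as np

def combine_2x2_shifted(frames):
    """Interleave four captures taken at (0, 0), (0, 1/2), (1/2, 0), and
    (1/2, 1/2) pixel offsets into a grid sampled twice as densely per axis.
    `frames[dy][dx]` is the capture at offset (dy/2, dx/2) pixel."""
    h, w = frames[0][0].shape
    out = np.empty((2 * h, 2 * w), dtype=frames[0][0].dtype)
    for dy in (0, 1):
        for dx in (0, 1):
            out[dy::2, dx::2] = frames[dy][dx]
    return out  # e.g., four 25 MP captures interleave into a 100 MP sample grid
```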
  • each of the four colors is represented in one fourth of the array's pixels.
  • the effective number of pixels is quadrupled, so that a 25-megapixel (MP) array provides the resolution of a 100 MP array.
  • a 3x3 sub-pixel shifting would allow a 25-megapixel (MP) array to provide the resolution of a 225 MP array.
  • the visual spectrum camera is represented as larger than the extended spectrum camera to allow incorporation of the pixel shifting mechanism for higher resolution of the visible image.
  • the satellite can incorporate a number of design elements that can provide the stable result required for good imaging. These design elements can allow the satellite to point at a target for longer periods, which provides benefits to both signal-to-noise ratio (SNR) and apparent image resolution through use of a pixel-shifting super-resolution technique. Stabilization can be enabled by a combination of a high-performance, 3-axis attitude determination and control system (ADCS), which is optimized to reduce disturbances and errors imparted to the image pointing function, and design features that reduce the amount of time during which the imager should be stable.
  • modifications to the programmed motion of the pixel shifter can be made to compensate for the disturbance caused by the pixel shifter itself.
  • the torque applied to the spacecraft by the pixel shift causes slight movements of the telescope pointing that are predictable and correctable using fine adjustments or profiled smoothing of the commanded pixel-shifting motion.
  • Other improvements include use of a second pixel shifting mechanism with the same mass and motion performance, rotated 180 degrees, such that it can cancel or modify the disturbances caused by the first pixel shifting mechanism. Either or both of these mechanisms may also be used in combination with disturbance sensors and software to cancel out any predictable disturbances from other sources.
  • the countering mass that counters the movement of a pixel sensor array by subpixel distances might be the other pixel array in another camera.
  • the movement can be timed appropriately if both sensor arrays are being used at the same time.
  • other masses could be used.
  • the mass might be equal to the mass of the moving sensor array, with both having the same distance from the center of mass of the satellite.
  • software can be used to adjust between the movements of masses at different distances, to adjust for equal moment of inertia.
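The matching condition such software might enforce is sketched below: the product of mass, lever arm, and stroke is equalized so the two motions impart equal and opposite angular impulse about the center of mass (illustrative numbers, not the disclosure's control law):

```python
def counter_stroke(m_sensor, d_sensor, stroke_sensor, m_counter, d_counter):
    # Equal and opposite angular impulse requires m * d * stroke to match,
    # assuming the two motions follow the same timing profile.
    return (m_sensor * d_sensor * stroke_sensor) / (m_counter * d_counter)

# An equal countering mass at twice the lever arm needs only half the stroke:
print(counter_stroke(0.05, 0.10, 1.5e-6, 0.05, 0.20))  # 7.5e-07 m
```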
  • the spacing between pixels is around 4.5 microns, so a movement of a third of a pixel is 1.5 microns. Movements as fine as 1 micron, or finer, might be used.
  • the sensor array might be moved using semiconductor elements that expand/contract upon the application of electrical potential. Piezoelectric elements might provide such movement in the ranges of microns.
  • a processor, on board or ground-based, could process images to determine how much movement is occurring, allowing for calibration of the movement mechanism.
  • the satellite can apply an inertial measurement unit with a very precise 3-axis rate sensor and use ultra-smooth bearings in the reaction wheels to cancel angular motion. It can also point a star tracker at such an angle as to provide a stable external reference and ensure optimum pointing accuracy when included in the control system's Kalman filtering algorithm.
  • the attitude control system operates at a high sampling rate to limit the effect of signal quantization, and sensor noise is suppressed.
  • Gains applied to the control algorithm are modified using a control-mode dependent gain schedule to ensure low jitter amplitudes during pixel-shifted imaging periods.
  • Damping devices and vibration isolators can be added to reduce and isolate the effects of vibration, keeping them away from the telescope's pointing behavior.
  • a notch filter algorithm can be applied to suppress the control system from responding to or exciting resonant modes in the satellite structure or vibration isolation devices, thereby suppressing instabilities.
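For illustration, a discrete notch filter of the kind such an algorithm could use can be designed with SciPy's iirnotch; the sampling rate, resonance frequency, and quality factor below are assumptions, not values from the disclosure:

```python
from scipy import signal

FS = 200.0  # Hz, assumed attitude-control sampling rate
F0 = 45.0   # Hz, assumed structural resonance to avoid responding to or exciting
Q = 10.0    # quality factor; higher Q gives a narrower notch

# Second-order IIR notch coefficients; filtering the control signal with
# (b, a) removes loop gain at F0 while leaving other frequencies intact.
b, a = signal.iirnotch(F0, Q, fs=FS)
```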
  • the satellite can include a processor or processing system that stores and executes program code comprising a set of program steps to ensure that reaction wheels operate at mean rates that impart minimum resonant excitation to the satellite and control system.
  • the reaction wheels can be balanced in advance using laser ablation. When the reaction wheels are more balanced, that reduces the vibrations that spinning reaction wheels impart to the satellite and therefore to the imaging systems, telescope orientation, etc. In this approach, imbalances are measured, perhaps by lasers or other methods, and then portions of the reaction wheels are ablated by a laser to bring the reaction wheels more into balance.
  • the satellite can also reduce imaging sensitivity to angular rates by shortening the time period during which stable pointing must be maintained. This can be done by use of the highest available data rate for sensor readout in a pixel shifting camera, so that a high-rate data readout and recording system can be used to collect multiple images within a shortened time period to limit the effect of vibration upon image quality.
  • sensor readout time may be the dominant factor in the sum of things that add to the pixel shifting time period.
  • cameras can use multiple high rate output channels to provide an image readout rate that halves the time period during which the multiple pixel shifted images are collected and transferred to storage.
  • the use of buffer memories installed in the camera electronics can shorten the readout time requirement.
  • the resulting shortened time period means that the imaging system is less affected by angular rate effects since the area imaged remains acceptably stable during the shortened period of imaging.
  • the movement can also be counter-balanced to add to stability.
  • a mechanism similar to the pixel shifter, but with equal and opposite motion can be added to counteract the force imparted by the pixel shifting mechanism.
  • the image can also be stabilized by coordinating the operation of the satellite's thrusters with the camera operation. When a request for the satellite to capture an image is received or scheduled, the status of the satellite's thrusters is checked. If the requested imaging requires that the thrusters be stopped, this can be done. It may be the case that the satellite needs to allow the thrusters to operate while imaging takes place. Accordingly, the thrusters would need to operate with very low thrust disturbance.
  • the satellite should have an attitude control system capable of providing suppression of angular rates sufficient to enable imaging while thrusters are operating. Then, to enable the imaging function to proceed, the satellite evaluates a three-axis angular rate sensor to verify that rates are low enough for imaging events to proceed.
  • there are at least two levels of angular rate fidelity of concern: 1) an angular rate low enough to eliminate "pixel smear" for routine single-frame imaging, and 2) a lower angular rate to ensure that pixel-shifted imaging can be effectively performed.
  • active damping motion compensation can be added as previously described.
  • a satellite imaging system comprising:
  • a telescope section comprising:
  • a first reflector that is substantially square and sized to fit into a substantially square aperture of a satellite body;
  • a second reflector positioned to reflect light reflected from the first reflector; and
  • a lens set comprising one or more lenses positioned in an optical path of the telescope section; and
  • a sensor array positioned to receive light from the telescope section when light is received through the substantially square aperture, wherein the sensor array is substantially square.
  • a satellite imaging system comprising:
  • a telescope section arranged to receive incoming light along an optical path
  • a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges comprising one or more wavelength ranges within a visible spectrum;
  • a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges comprising one or more wavelength ranges outside the visible spectrum; and
  • a dichroic beam splitter in the optical path, whereby light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera.
  • the dichroic beam splitter is a whole-spectrum beam splitter
  • one of the first sensor array and the second sensor array is a panchromatic sensor array
  • the other of the first sensor array and the second sensor array is a non-panchromatic sensor array providing pixelated, color-filtered images
  • outputs of the panchromatic sensor array are usable to enhance the pixelated, color-filtered images.
  • a third camera; and
  • a second beam splitter in the optical path, whereby at least a portion of the incoming light is directed to the third camera.
  • a satellite imaging system comprising:
  • a telescope section arranged to receive incoming light
  • a camera having a sensor array that shifts relative to an optical axis of the camera while capturing an image to improve the resolution thereof;
  • the techniques described herein are implemented by one or more generalized computing systems programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
  • processing might be performed by a processor that accesses instructions in a program memory and controls communication and processing information.
  • a processing system might include random access memory (RAM) or another dynamic storage device for storing information and instructions to be executed by the processor, including temporary variables or other intermediate information used during execution of those instructions.
  • Such instructions when stored in non-transitory storage media accessible to the processor, render the processing system into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • the processing system might also include a read only memory (ROM) or other static storage device for storing static information and instructions for the processor.
  • the processing system may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which causes or programs the processing system to be a special-purpose machine.
  • hard-wired circuitry may be used in place of or in combination with software instructions.
  • storage media refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Storage media is distinct from, but may be used in conjunction with, transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a processor bus.
  • Transmission media can also take the form of radio waves or light waves. Communication can be two- way data communication coupling to a ground station or another satellite.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Studio Devices (AREA)

Abstract

To maximize the light capturing and imaging resolution capability of an imaging satellite while minimizing weight, the primary reflector and other elements of the optical path have a shape optimized to the shape of the satellite. For a nanosatellite with a square cross-section, the first mirror and other elements of the telescope section in the optical path have a square cross-section, as does the sensor array of the camera section. The satellite imaging system uses multiple cameras. For example, the incoming light from a telescope section of the satellite goes through a dichroic beam splitter, with the standard visible spectrum going to a first camera and wavelengths outside of the standard visible spectrum, such as in the infrared or coastal blue range, going to a second camera, allowing image data from multiple wavelength ranges to be captured simultaneously. The image data from the different wavelengths of the two cameras can then be selectively recombined.

Description

Imaging System Optimized for Nanosatellites with Multiple Cameras and Image Stabilization and Pixel Shifting
FIELD OF THE INVENTION
[0001] The present disclosure generally relates to satellite-based imaging systems and more generally to nanosatellites that are orbiting vehicles, airborne payloads, or other imaging systems with highly constrained size requirements and for capturing high-resolution images.
CROSS-REFERENCES TO PRIORITY AND RELATED APPLICATIONS
[0002] This application claims priority from:
[0003] 1) U.S. Provisional Patent Application No. 62/286,234 filed January 22, 2016 entitled "Imaging System with an Optical Path and Telescope Shape Optimized for Nanosatellites" naming Squires et al. ("Squires I");
[0004] 2) U.S. Provisional Patent Application No. 62/286,225 filed January 22, 2016 entitled "Multi-Camera Imaging System for Nanosatellites" naming Mrdjen et al. ("Mrdjen"); and
[0005] 3) U.S. Provisional Patent Application No. 62/286,229 filed January 22, 2016 entitled "Image Stabilization and Pixel Shifting for a Nanosatellite Imaging System" naming Squires et al. ("Squires II").
[0006] The entire disclosure(s) of application(s)/patent(s) recited above is(are) hereby incorporated by reference, as if set forth in full in this document, for all purposes.
[0007] All patents, patent applications, articles, other publications, documents and things referenced herein are hereby incorporated herein by this reference in their entirety for all purposes. To the extent of any inconsistency or conflict in the definition or use of terms between any of the incorporated publications, documents or things and the present application, those of the present application shall prevail.
BACKGROUND
[0008] The costs to launch a rocket into Earth orbit can run into the millions of dollars. As a rocket can carry multiple satellites and other equipment, the cost of the launch can be allocated among the different payloads. Consequently, smaller satellites might incur smaller costs to get into orbit. The measure of a size of a satellite could relate to its mass, its volume, its height, width and depth, as well as its shape. As for shape, it might be that the cost of getting equipment onto a launch bay is a function of the envelope of the satellite.
[0009] In view of these considerations, nanosatellites are often deployed, especially where the desired functionality fits in a nanosatellite form factor and where a constellation of satellites is needed. The term "nanosatellite" often refers to an artificial satellite with a wet mass between 1 and 10 kg, but it should be understood that features might be present in satellites outside that range. A constellation of smaller satellites might be more useful than one large satellite for the same or similar construction and launch budget. However, the result is usually that a rocket payload comprises many more independent vehicles.
[0010] To accommodate a large number of independent satellites, rocket logistics often dictate that the satellites be rectangular prisms or other shapes that are space-filling. For example, some nanosatellites are generally cube shaped. Typically, these satellites include propulsion, solar panels for on-board electrical power generation, power management systems, thermal control systems, attitude control systems, computer processors, storage, and communications capabilities. Some satellites are used for imaging and might include a telescope assembly for light gathering and a camera assembly for converting gathered light into electronic data, which can then be processed on-board and/or communicated to another satellite or a ground station.
[0011] For a celestial imaging system that has missions to capture images of the Sun, the Moon, stars and other astronomical objects, the particular orbit might not matter. However, for Earth-observing satellites, closer is better. Of course, there are limits to how low an orbit can be and still be viable. As a result, such a satellite performs as a terrestrial long distance imaging system, and faces a number of challenges. One is the distance between the satellite and the target of an imaging process. Another is that the satellite is not anchored, so internal movements can cause rotations of the satellite. Also, the satellite is moving at a high speed in order to maintain its orbit, which means the satellite is not stationary with respect to the target. The terrestrial long distance imaging system also has to deal with the conditions of operating in space and the stress of launch.
[0012] For imaging purposes, a satellite might have to address significant size and operational constraints, such as the resolution of images and spectra covered. The light available to a satellite might be limited by the amount of time available for image capture.
[0013] Resolution and other factors can limit the usefulness and clarity of images taken from a satellite.
[0014] Consequently, there are a number of areas in which satellite imaging systems can benefit from improvement.
SUMMARY
[0015] A satellite imaging system used in a satellite has a telescope section including a first reflector that is substantially square and sized to fit into a substantially square aperture of a satellite body, a second reflector, positioned to reflect light reflected from the first reflector, and a lens set including one or more lenses positioned in an optical path of the telescope section. A sensor array is positioned to receive light from the telescope section when light is received through the substantially square aperture, where the sensor array is substantially square.
[0016] The satellite imaging system may have a second reflector that is substantially square and/or constructed to counteract image distortions as might occur due to being substantially square. The second reflector might be round. The lens set might be square or round.
[0017] The satellite imaging system might include baffles that are substantially square or substantially round. The elements are substantially square when the elements have a form factor of a square adjusted for structural components needed to provide structural stability to the elements or other components of a satellite containing the satellite imaging system. The first reflector, the second reflector, and/or one or more lenses of the set of lenses can include thermally matched materials in that the thermally matched materials will limit distortion of a final image over a predetermined set of operating conditions. The first reflector, the second reflector, and/or one or more lenses of the set of lenses might be movable by motorized or deformable positioners to perform dynamic compensation of positional error and/or compensation for mechanical variations. The satellite imaging system may also include an imaging system with optical path optimized for nanosatellites.
[0018] A satellite imaging system used in a satellite has a telescope section arranged to receive incoming light along an optical path, a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges including one or more wavelength ranges within a visible spectrum, a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges including one or more wavelength ranges outside the visible spectrum, and a dichroic beam splitter in the optical path, where light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera.
[0019] The dichroic beam splitter might be a whole-spectrum beam splitter, where either the first sensor array or the second sensor array is a panchromatic sensor array, and the other sensor array (second or first array) is a non-panchromatic sensor array providing pixelated, color-filtered images, and where outputs of the panchromatic sensor array are usable to enhance the pixelated, color-filtered images. The satellite imaging system might include electronic bandwidth controls for controlling and/or modifying a passband defined by the dichroic beam splitter, so that the first set of wavelength ranges and/or the second set of wavelength ranges can be controlled and/or modified. A third camera and a second beam splitter might be provided in the optical path, where at least a portion of the incoming light is directed to the third camera. Electronic field of view controls might be used for controlling and/or modifying a telescope field of view.
[0020] A satellite imaging system used in a satellite has a telescope section arranged to receive incoming light along an optical path, a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges including one or more wavelength ranges within a visible spectrum, a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges including one or more wavelength ranges outside the visible spectrum, where the second sensor array shifts relative to an optical axis of the camera while capturing an image to improve the resolution thereof, a dichroic beam splitter in the optical path, and an attitude determination and control system that stabilizes the satellite while capturing an image.
[0021] The satellite imaging system might include an active disturbance cancellation system, where light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera, where a combination of mechanisms, sensors, and computer algorithms reduces the effect of disturbances while capturing an image, and a high-rate data readout and recording system, where multiple images can be collected within a shortened time period to limit the effect of vibration upon image quality. The satellite imaging system may also include image stabilization and pixel shifting for a nanosatellite imaging system.
[0022] Various aspects, advantages, features and embodiments are included in the following description of exemplary examples thereof, which description should be taken in conjunction with the accompanying drawings. The following detailed description together with the accompanying drawings will provide a better understanding of the nature and advantages of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Various embodiments in accordance with the present disclosure will be described with reference to the drawings, in which:
[0024] FIG. 1 is a front isometric view of an imaging nanosatellite.
[0025] FIG. 2 is a rear isometric view of an imaging nanosatellite.
[0026] FIG. 3 is a front planar view of a telescope section of an imaging nanosatellite.
[0027] FIG. 4 is an isometric view of an imaging nanosatellite in a deployed mode with communications antennas deployed.
[0028] FIG. 5 is an illustration of a rotation control assembly.
[0029] FIG. 6 is an illustration of an example imaging system with arrangement of multiple cameras and a square aperture telescope section.
[0030] FIG. 7 is a schematic illustration of light paths of the imaging system of FIG. 6.
[0031] FIG. 8 is a top, cross-sectional view of components of the imaging system of FIG. 6.
[0032] FIG. 9 is a top, cross-sectional view of components of the imaging system of FIG. 6, illustrating light paths.
[0033] FIG. 10 is a cut-away view of an optical barrel section; FIG. 10(a) is an angled view; FIG. 10(b) is a straight-on side view.
[0034] FIG. 11 is a cut-away view of a telescope section showing a square secondary mirror baffle and an optical barrel section.
[0035] FIG. 12 is a chart of spectral bands.
[0036] FIG. 13 illustrates examples of color filters used on pixel arrays.
[0037] FIG. 14 illustrates an example of increased resolution from the use of subpixel shifting.
DETAILED DESCRIPTION
[0038] In the following description, various embodiments will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the embodiments may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.
Square Telescope
[0039] Techniques described and suggested herein include an imaging satellite with a square primary reflector and other variations include other noncircular shapes.
[0040] An imaging system of an imaging satellite will have an aperture, optical components and light sensors. The aperture is where the light enters the imaging system. That light is optically processed by reflectors and other components so that the light generally falls onto a light sensor. A common sensor is a sensor array comprising a two-dimensional (2D) array of pixels. Each pixel electrically responds to light impinging on the pixel. The imaging system also includes electronics to measure a pixel's response and record image data based on the responses. An example sensor array might be a 1024 by 1024 pixel array (1 megapixel) or a 5,120 by 5,120 pixel array (25 megapixel), but could also be a non-square array, such as a rectangle or some other shape.
[0041] The imaging system also includes a processor that interacts with the electronics to, for example, process the image data recorded by the electronics or image data sent directly to the processor from the electronics without recording. The processor might compress the image data, perform some calculations, and/or modify the image data. The processor might also coordinate with a communications system to transmit some of the image data to other satellites and/or ground stations. The image data can be in the form of a single 2D array of values or multiple 2D arrays of values. The image data might also include a time component, such as where not all of the pixels were read at the same instant in time and the image data might include a sequence of readings, such as might be used to form a video sequence.
[0042] In communicating image data, or even when not communicating image data, the processor might direct the communications system to send non-image data, such as satellite performance data, settings and other data. Some data stored by the processor might include mission data. Mission data might specify, for one or more missions, which images to capture and when and other details of image capture. The processor might also control other aspects of the satellite, such as how to position and reposition the satellite during, before or after an imaging event. As should be understood, the processor might implement various functions and provide equivalent structure to physical components by virtue of program code that is stored in program code memory and is executed by the processor as part of the operation of the processor. The particular operation(s) of the processor might be determined by the particular construction of the processor and/or the program code in program memory that the processor is designed to execute.
[0043] The optical components might include reflectors (mirrors), lenses, prisms, splitters, collimators, filters, obscurers and the like. Some of these components can be movable, possibly under control of the processor via motors, actuators, etc. that are controllable by the processor. Some of the obscurers are part of the optical components, while others are unrelated to the optical path. For example, where a primary reflector is used with a secondary reflector, the secondary reflector and struts that hold the secondary reflector in place might be obscurers in the optical path. The optical path might also be obscured by hinges or other necessary parts of the satellite that cannot be, or cannot easily be, removed from the optical path from the aperture to the sensor array(s). A shroud or baffle might be used in or around the aperture to block undesired light from entering the aperture.
[0044] To maximize the light capturing and imaging resolution capability of an imaging satellite while minimizing weight, the primary reflector and other elements of the optical path should have a shape optimized to the shape of the satellite and have surfaces as large as possible. For a nanosatellite with a square cross-section, the first mirror and other elements of the telescope section in the optical path can have a square cross-section. This increases the light collecting area, and also allows the shape of the image plane formed by these optics to more nearly match the square or rectangular shape of commercially available sensor arrays used in the camera section of an imaging system. Another advantage is that with a square aperture, the aperture diameter (diagonal dimension) is larger and thus the resolution of the telescope is improved.
[0045] In an example imaging system described herein, the aperture is square, or at least substantially square. In some embodiments, the aperture is square with chamfered corners. Such an aperture might be found on a satellite that uses chamfered corners. Regardless of the details, the aperture can have a larger area than a corresponding inscribed circle, thereby increasing the amount of light collected and also causing a larger percentage of collected light to fall on pixels, per unit time, in a sensor array than if the profile of the light were circular.
[0046] In some aspects, the chamfering can be used to reduce diffraction artifacts and point spread function distortion.
[0047] In some embodiments, each other optical component is shaped commensurate with the aperture shape. For example, where the aperture is square, the secondary reflector can be square, lenses can be square, etc. Alternatively, some of the components in the path can be round so as to simplify construction, while still taking in all available light.
[0048] The result is greater use of light energy both in collecting light and in applying light to the sensor. A further advantage is the increase in aperture that is represented by the diagonal dimension of a square aperture as compared to the diametric dimension of a circular aperture designed to fit in the same spacecraft. The diagonal dimension increases the sampling frequency of the telescope, and the square shape of the aperture provides a point spread function (PSF) that has a squarish central spot that is narrower than the Airy disk provided by a circular aperture. The result is higher resolution performance.
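As an editorial illustration of the geometry behind these statements (the formulas are standard optics; no specific dimensions are taken from this disclosure), consider a square aperture of side D inscribed in the same cross-section as a circular aperture of diameter D:

$$\frac{A_{\text{square}}}{A_{\text{circle}}} = \frac{D^2}{\pi D^2/4} = \frac{4}{\pi} \approx 1.27, \qquad D_{\text{diag}} = \sqrt{2}\,D \approx 1.41\,D.$$

Since the diffraction-limited angular resolution of a circular aperture scales as θ ≈ 1.22 λ/D (the Rayleigh criterion), the roughly 41% longer diagonal extent of the square aperture correspondingly narrows the point spread function along the diagonal, consistent with the higher resolution performance described above.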
[0049] While telescope resolution benefits from increased aperture size, the resolution and image quality performance of a satellite imaging system is also, in part, dependent upon the amount of light that it is able to gather. Additional light can improve signal-to-noise ratios and dynamic ranges of images collected, which generally results in better images, with improved crispness and better contrast.
[0050] For reflecting type telescopes, improving aperture size and light collection depends, in large part, upon the unobstructed area of the primary mirror of the telescope. The useful area of the mirror is limited by the size of the satellite, and the size of obstructions (e.g., from the secondary mirror, support structures, and light baffles) depends mainly on the focal length and structural requirements of the telescope design. Conventionally, such telescopes have used round mirrors for their first reflector and imaging satellites have had round cross sections, conforming to the mirror shape as well as the shape of the launch vehicle. A round telescope is also often used in a square cross-section satellite. The other elements in the telescope (such as secondary mirrors, light baffles, and lenses) also have used a conventional round shape, although the sensing array may be rectangular, resulting in the loss of a portion of the gathered light. This is similar to the situation in a standard digital camera, where the sensor array of charge coupled devices (CCDs) or complementary metal-oxide semiconductor (CMOS) devices is rectangular (corresponding to the shape of a photograph), while the lens and other optics are round for convenience, resulting in a loss of a crescent shaped region at each of the image's edges.
[0051] For many available nanosatellite launches, a square or rectangular dispenser compartment is provided to encapsulate and fit multiple satellites in a single launch dispenser. For these dispensers, the individual nanosatellites are no longer round, but typically will have a square cross-section. Accordingly, to optimize telescope performance, it is more beneficial to make a mirror with increased surface area to fit within the square satellite. The first reflector's size can be maximized, to capture more light and increase aperture dimensions while still conforming to the shape of the nanosatellite, by using a square first reflector, perhaps with rounded or chamfered corners if needed due to the satellite's internal construction or optical diffraction characteristics. To compensate for the increased mass of a square mirror with larger area, material is machined away from the back of the mirror in a process called "light-weighting," which results in a stiff but lightweight structure. The mass difference relative to a circular mirror is very manageable.
[0052] As the resultant image from the first reflector will be square, and the image sensing array is square or rectangular, it is not necessary to use conventional round elements for a secondary reflector, lenses, and other elements in the optical path. To optimize light gathering while minimizing cost, a square secondary reflector may be used while any internal corrective lenses may be circular. The reflected light beam remains square in shape until it arrives at the sensor. As minimizing the mass of the nanosatellite is important, using a square secondary reflector, baffles, lenses (if desired) and other elements of the optical path will reduce weight while still transmitting all of the light gathered at the primary reflector to the sensor array. This can require use of some unusually shaped elements, since lenses, for example, are usually round. The non-round elements should be aligned in rotation about the central optical axis as any square optical or baffle elements will lack rotational symmetry about this axis, but in some cases that might not be a concern.
[0053] For optical alignment of square optical elements (mirrors, lenses, prisms, and baffles), this can be done during construction and fixed in place, but in some embodiments, image rotation devices (e.g. prisms) take advantage of the beneficial asymmetry of the square aperture's point spread function. The diagonal of the aperture provides higher resolution than the width or height. Image rotation would take advantage of this by enabling alignment of the higher resolution diagonal dimension of the optics with the higher resolution width dimension of the sensor array, should one be present.
[0054] The exemplary embodiments described here are based on a satellite with a square cross-section, as this is a typical configuration for the packing of nanosatellites as a payload into a launch vehicle, but can be extended to other configurations that lack the conventional rotational symmetry. For example, rectangular or hexagonal cross-sections also lend themselves to dense packing. In these other shapes, the mirror shape would again be determined by the shape of the satellite in order to maximize the available light gathering, while minimizing the mass of the elements along the optical path by using the same shape for these.
Multi-Imaging Camera
[0055] Techniques described and suggested herein include an imaging satellite having an imaging system that provides for separate cameras, such as separate lensing, separate filtering, and separate sensor arrays, possibly also including processing logic to combine outputs of the separate sensor arrays in various manners to improve over what could be done with a single sensor array.
[0056] A camera might be formed of a set of zero or more focusing elements, a set of zero or more light deflection elements, a set of zero or more filtering elements, and a sensor array that captures incident light. The sensor array might comprise a plurality of light sensitive pixels in a two-dimensional (2D) array of pixels. The individual pixel sensors might be charge coupled devices (CCDs), complementary metal-oxide semiconductor (CMOS) type, microbolometer arrays, or other sensor elements. A sensor array might have all of its pixels sensitive to the same range of light wavelengths, or it might have a varying pattern of sensitivities over the array. For example, for a sensor array using an RGBY colorspace for the visual spectrum, the array will need to accommodate sensors for each of the red, green, blue, and yellow color wavelength ranges, which will reduce the resolution of each wavelength range by half (doubling the size of the smallest detectable object in green light). If additional wavelength ranges are to be sensed with the same array, such as into the infra-red or the ultra-violet, this will further reduce the resolution of the individual wavelength ranges.
[0057] In an example satellite imaging system, multiple cameras are used, such as where incoming light from a telescope section of a satellite may go through a dichroic beam splitter, with the standard visible spectrum going to a first camera and wavelengths outside of the standard visible spectrum, such as in the infrared or coastal blue range, being sent to a second camera, allowing image data from multiple wavelength ranges to be captured simultaneously. The image data from the different wavelengths of two (or more) cameras can then be selectively recombined. In a more general case, there is a first range of wavelengths and a second range of wavelengths.
[0058] As adding additional cameras and sensors to detect individual wavelength bands can cause an undesirable increase in the weight of a satellite, the balancing of weight against utilization of wavelength bands (also known as spectral resolution) is an important concern for a satellite imaging system. The satellite imaging system described herein performs this balancing by using multiple cameras having sensor arrays sensitive to different wavelength ranges, improving the spectral resolution of the multi-camera system while making use of the full sensitive spectrum of the sensors: the individual cameras can each be sensitive to more than one wavelength range, which saves satellite mass while utilizing the full available spectrum of light and provides other benefits.
[0059] For example, the incoming image can be exposed to at least two cameras, with each of the cameras getting a distinct set of one or more wavelength bands, which can be sensed in one or more narrower filtered wavelength bands captured simultaneously. The images of the selected wavelength ranges from each of the cameras are then aligned and combined to form a remerged image having color components from more than one of the cameras. Depending on the embodiment, the selection and combination of wavelength ranges from the different cameras can be done on the satellite, done terrestrially, or some combination of these.
[0060] The incoming image can be split using a double-dichroic beam splitter. In one embodiment, there are two cameras with the first camera receiving the visible light wavelengths, while the second camera gets the red edge (RE), Near Infrared 1 (Nl) and Near Infrared 2 (N2), and possibly also wavelengths below the range of standard visible wavelength sensors, such as Coastal Blue (CB). Use of one camera for the visible range can have the advantage that data from the visible range is commonly wanted in applications and that such sensors are more readily available. This also allows for use of differing resolution levels, if desired, where the visible image can have a higher resolution that can be overlaid with data from selected ones of the additional wavelengths that are of interest for a particular application. In general, using a dichroic beam splitter, available light from a light path can be partitioned into two (or more) separate subpaths for use by camera sensors with different spectral sensitivities. While a given sensor array might not be sensitive to a particular range of wavelengths, the light in that range of wavelengths is not wasted, as it can be directed to a different sensor array that is sensitive to it.
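As a minimal illustration of the selective recombination described above (a sketch only: the band names, array layout, and function are assumptions for this example, not the disclosed implementation, and the planes are assumed to be already co-registered):

```python
# Sketch: selectively recombine co-registered band planes captured by two
# cameras behind a dichroic splitter into a single multi-band data cube.
import numpy as np

def remerge(visible: dict, extended: dict, bands: list) -> np.ndarray:
    """Stack the requested band planes (e.g., ["R", "G", "B", "N1"]).

    `visible` holds RGBY planes from the first camera; `extended` holds
    CB/RE/N1/N2 planes from the second camera. Each plane is a 2D array
    of identical shape.
    """
    planes = []
    for name in bands:
        source = visible if name in visible else extended
        planes.append(source[name].astype(np.float32))
    return np.stack(planes, axis=-1)  # shape: H x W x len(bands)

# Example use: overlay Near Infrared 1 on the visible RGB image.
# cube = remerge(camera1_planes, camera2_planes, ["R", "G", "B", "N1"])
```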
[0061] Some implementations of this design will make it possible to include cameras with shortwave infrared (SWIR) and longwave infrared (LWIR) sensors. Sensors in these wavelength bands make it possible to collect mineralogical and thermal images and see through smoke and cloud cover. Other implementations can combine hyperspectral imagers in these wavebands with a visible waveband imager.
Subpixel Imaging
[0062] Techniques described and suggested herein include an imaging satellite with an example satellite imaging system having a camera, a pixel shifting mechanism, thermally stable imaging payload, and high-stability attitude determination and control system (ADCS) to improve image resolution. To address the overall stabilization requirement, the satellite incorporates a number of design elements which, in combination, provide the stable result required for good imaging.
[0063] The resolution of a satellite imaging system is dependent upon the amount of light that the system's telescope is able to gather, the diffraction limited resolution that the optical design is able to provide, the sensitivity and pixel arrangement of the sensor, and the stability of the imaging system (i.e. the relative motion between the platform and target). For a digital camera, the resolution of the camera depends on the density and arrangement of the individual pixel sensors on the sensor array (typically known as the pitch, or distance, between pixels). Pixel pitch establishes what is known as the spatial sampling rate of the sensor. The individual pixel sensors, such as a charge coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) type sensor, can only be packed with a certain density into a square two-dimensional array without obstructing one another. If the pixels of the array are to have elements that are sensitive in more than one wavelength band, this will further decrease the resolution within each band.
[0064] For example, for a sensor array using an RGB colorspace for the visual spectrum, the array will need to accommodate sensors for each of the red, green, and blue color ranges (such as in a Bayer filter mosaic), which will reduce (worsen) the resolution of the green range by half and the red and blue ranges by a factor of four relative to the maximum un-aliased diffraction-limited and sensor sampling rate limited resolution. If additional wavelength ranges are to be sensed with the same filter array, such as into the near infra-red or the near ultra-violet, this will further reduce the resolution of the individual wavelength ranges. Although an additional camera adds weight to the satellite, the additional filter array that comes with it is very beneficial for optimizing the use of light energy (radiometric performance), wavelengths (spectral resolution) and retention of spatial (sampling) resolution. The added weight and volume are manageable in the described satellite system.
[0065] The satellite imaging system described here balances mass, volume, and imaging performance by using multiple cameras having sensor arrays sensitive to different wavelength ranges, in order to improve sampling resolution, but where the individual cameras can each be sensitive to more than one wavelength range, in order to save on the mass of the satellite. For example, the incoming image can be exposed to at least two cameras, with each of the cameras getting a distinct set of one or more wavelength bands that can be captured simultaneously. One of the cameras is pixel-shifted in two axes to increase collected information and reduce imaging artifacts, giving an apparent spatial and spectral resolution improvement. The images of the selected wavelength ranges from each of the cameras are then aligned and combined to form a remerged image having color components from more than one of the cameras. Depending on the embodiment, the selection and combination of wavelength ranges from the different cameras can be done on the satellite, done terrestrially, or some combination of these.
Description of the Figures
[0066] FIG. 1 is a front isometric view for an example of an imaging nanosatellite. The satellite is a rectangular box shape, with a square cross-section, that allows a number of such satellites to be stacked compactly as the payload of a launch vehicle. The square telescope provides for optimal use of the satellite cross-section, thereby allowing an increase in aperture area and a diagonal aperture dimension beyond what a circular aperture would provide within the same cross-section. This provides a correspondingly increased light collecting ability and a larger effective aperture dimension on the diagonal of the mirror. The satellite 102 is shown with a deployable fore-baffle 104 extended to aid with stray light control, but which can be retracted to keep the satellite compact. At the rear is the camera system, which is partially visible in the view of FIG. 1.
[0067] The camera system is more visible in the rear isometric view of FIG. 2. In this example, a dual camera system with a dichroic beam splitter in the optical pathway is shown, but the arrangement of the optimized optical path can also be used with a more conventional single sensing array arrangement or with additional beam splitters and cameras.
[0068] This arrangement provides for an increase in aperture, improvement in optical resolution, reduction in achievable Ground Sample Distance (GSD), and an increase in light collection. Ground Sample Distance is essentially the distance between pixel centers as they would appear if the pixel outlines were projected on the ground, namely the boundary of the area sensed by each pixel on the Earth's (or another target's) surface. The distance between these square area centers is the GSD.
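GSD follows from similar triangles between the focal plane and the ground; the relation below is standard remote-sensing geometry, and the numbers in the example are assumptions, not parameters from this disclosure:

$$\mathrm{GSD} = \frac{p \cdot h}{f}$$

where p is the pixel pitch, h the orbital altitude, and f the effective focal length. For instance, with p = 4.5 µm (a pitch mentioned later in this description), h = 500 km, and f = 1.5 m (both assumed), GSD = (4.5 × 10⁻⁶ m × 5 × 10⁵ m) / 1.5 m = 1.5 m per pixel.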
[0069] FIG. 3 is a front planar view of a telescope section of an imaging system of a satellite 302. FIG. 3 shows a primary reflector 304, a secondary reflector 308 and struts 306(1)-(4) for supporting secondary reflector 308. Struts 306 can be attached to satellite 302 in the plane of secondary reflector 308 and/or attached more to the rear, such as further back on or through primary reflector 304. In this example, secondary reflector 308 is square and occupies around 20% of the aperture. In other variations, the percentage is higher or lower.
[0070] In some variations, secondary reflector 308 is round, but square secondary reflectors might be preferred when the area of the round secondary reflector that is outside an inscribed square is not illuminated by light from the primary reflector, as that area outside the inscribed square might obscure light from entering the aperture in the first place.
[0071] FIG. 4 is an isometric view of an imaging nanosatellite 402 in a deployed mode with communications antennas deployed. In this example, there is a planar antenna 406 and another antenna 404.
FIG. 5 is an illustration of a rotation control assembly 502 including reaction wheels 504(1)-(4) and torque rods 506(1)-(3). In operation, the reaction wheels may be used for rotational control of the satellite. Because of their orientation, slowing down and speeding up the reaction wheels can impart a torque about the center of mass of the satellite. By slowing down (or speeding up) followed by speeding up (or slowing down), the reaction wheels can cause the satellite to start rotating, rotate to a position and stop at that position. A processor can compute the speed changes needed on each of the reaction wheels to maintain a "stare mode" view of a fixed point on the Earth even as the satellite moves along its orbit, thus allowing the collection of large amounts of light from a source. The torque rods can all be identical, but do not need to be. The three torque rods can be mutually orthogonal, but in some implementations, there could be more than three, for redundancy and/or additional torque.
[0072] The reaction wheels might hit a point of saturation. The three orthogonal torque rods can add torque as they pass through the Earth's magnetic field to help with desaturation. The torque rods and reaction wheels are mounted on an integrated rotation control assembly, leading to a more compact design.
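The wheel-speed computation mentioned above reduces, per axis, to conservation of angular momentum. The sketch below is an illustrative single-axis calculation, not flight software; the inertia values and function name are assumptions:

```python
# Sketch: single-axis momentum exchange between a reaction wheel and the
# satellite body. A real ADCS runs closed-loop control over three axes.
I_SAT = 0.5     # satellite moment of inertia about the slew axis, kg*m^2 (assumed)
I_WHEEL = 2e-4  # reaction wheel moment of inertia, kg*m^2 (assumed)

def wheel_rate_change_for(body_rate_rad_s: float) -> float:
    """Wheel speed change (rad/s) that imparts the requested body rate.

    Total angular momentum is conserved:
    I_SAT * dw_body + I_WHEEL * dw_wheel = 0,
    so dw_wheel = -(I_SAT / I_WHEEL) * dw_body.
    """
    return -(I_SAT / I_WHEEL) * body_rate_rad_s

# Example: starting a 0.5 deg/s slew (~0.0087 rad/s) requires the wheel to
# change speed by about -21.8 rad/s, roughly -208 RPM.
```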
[0073] FIG. 6 is an illustration of an example dual-camera imaging system with arrangement of multiple cameras and a square aperture telescope section. The imaging system includes a square aperture telescope section 602, a beam splitter 604, a mirror 610, a first camera 608 and a second camera 606.
[0074] FIG. 7 is a schematic illustration of light paths of the imaging system of FIG. 6. FIG. 7 schematically illustrates the splitting of the incident light for the two cameras. The full spectrum incoming radiation is incident upon a cube beam splitter that includes a dichroic filter that separates out the visual portion of the spectrum (R, G, B, Y) from the wavelengths outside of the RGBY space (CB, RE, Nl, N2). Other arrangements can be used, such as putting CB in the same sensing array as (G, B, Y) and putting red in with the longer wavelengths (RE, Nl, N2). In FIG. 7 both cameras are shown to be of the same spatial (sampling) resolution, but, for example, it may be desirable for the camera for the visible range to have a relatively higher sampling resolution. Other wavelength ranges can similarly be incorporated by using corresponding array sensors; and more cameras can be incorporated by further splitting of the incoming radiation.
[0075] FIG. 8 is a top, cross-sectional view of components of the imaging system of FIG. 6, including a primary reflector 902, a secondary reflector 903, and a set of lenses 905. An air gap might be provided so that vibrations, in orbit and/or during launch, do not easily transfer from the telescope section to the cameras.
[0076] FIG. 9 is a top, cross-sectional view of components of the imaging system of FIG. 6, illustrating light paths. The incoming light is incident on a primary reflector 902. For compactness in the telescope volume, the primary reflector 902 has increased hyperbolic curvature relative to most commercial telescopes. Primary reflector 902 reflects the incident light onto secondary reflector 903, also with increased curvature, which in turn reflects the light through the set of lenses 905 and on to the sensors, where the paths of a number of rays are shown. The inner primary baffle and secondary baffle are also designed to be square to minimize mass. The square shape of the secondary baffle also allows more light energy per unit time to arrive at the primary mirror than a traditional circular baffle would. The latter further enhances signal to noise ratio (SNR) of the telescope.
[0077] In the example here, the sensor array includes two separate cameras. After the lens, the optical path includes a dichroic splitter to separate out different wavelength ranges used by the sensor arrays after filtering of wavelengths by the Color Filter Arrays (CFAs) for two cameras, which in this example has one camera for the standard visible spectrum that uses an RGBY colorspace sensor array and another camera for wavelengths on either or both sides of the visible, such as bands known as Coastal Blue (near UV), Red Edge, and near infrared (NIR). More generally, other sensor arrays can be used, with sensitivity to bands well beyond those discussed here; and, more cameras with different filter and sensor arrays, or a single camera with a different filter and sensor array can be used after lenses 905.
[0078] The performance demands of this design make it sensitive to thermal variations. Accordingly, the structural materials and lens arrangement should be carefully selected to compensate for the temperature range expected in a wide range of orbits. This so-called athermal design provides for consistent imaging performance and also makes it possible to use a wide number of launch opportunities, even if the initial orbit altitude of some launches is above the nominal operating altitude of the telescope design. Good imaging will still be possible, and the spacecraft propulsion system will lower altitude to improve the GSD of the imager.
[0079] The system can have the ability to modify the telescope field of view such that the beam splitter (or beam splitters) and multiple cameras can enable imaging of wider target areas. Here, the RGB camera has a sensor array sensitive to the visual spectrum and the second camera has a sensor array that is sensitive to wavelengths on one or both sides of the visual spectrum. The use of a dichroic beam splitter allows for each of the cameras to receive more or less all of the light of their respective wavelength ranges. Additionally, this helps to keep the undesired bands from leaking through color filter arrays (CFAs) on each sensor to some degree, providing better signal-to-noise results. The result is that a very high percentage of the light for each band reaches one camera or the other, and that the full sensitivity spectrum of the CMOS (or other type) sensors can be utilized. With the dichroic beam splitter and the addition of special mirror coatings, different sensors may be used at each camera location that have sensitivity in wavelengths beyond what CMOS silicon sensors can detect.
[0080] In the example of FIG. 6, the two optical subpaths from the beam splitter surface to each of the sensor arrays in cameras 606 and 608 are different lengths. In some preferred embodiments, the optical subpaths are the same length. As illustrated in FIG. 9, this may involve a set of mirrors and/or lenses to maintain compactness. When the subpaths are the same length, that may make image correction and coregistration easier, and the distortions due to other optical elements and features are equalized. This can be important in applications where images from more than one camera are then recombined into a single image or data structure representing light capture for many different spectral wavelengths.
[0081] A number of variations and extensions of the arrangement shown in FIGs. 8 and 9 are possible. For example, one or more additional beam splitters can be added along with one or more additional cameras. For example, after the beam splitter, a second beam splitter can be added in the path to one of the shown cameras to provide the incident light (or some range of it) to a third camera. This can be used to increase spatial (sampling) resolution, the wavelengths that are sensed (spectral resolution), or some combination of these. In other variations, the beam splitter (or one or more of multiple beam splitters) may be a whole-spectrum beam splitter, so that a panchromatic sensor can be used on one of the cameras to enhance the images on another camera that uses pixelated color filtered images.
[0082] FIG. 10 is a cut-away view of an optical barrel section; FIG. 10(a) is an angled view. In the background is the primary mirror. The cut-away image of a central baffle shows a square fore-section with a circular barrel section passing through the primary mirror. Light enters the square opening from the left, and passes out of the baffle section to the right where the remaining optics and cameras are positioned.
[0083] FIG. 10(b) is a straight-on side view of the same cut-away. The baffle that passes through the primary mirror may be square, but making it round where it passes through the primary mirror and square above that (closer to the secondary mirror) can improve the handling of stray light by restricting the size of the open end of the baffle.
[0084] FIG. 11 is a cut-away view of a telescope section showing a square secondary mirror baffle and an optical barrel section. This shows the relative positioning of a square secondary mirror baffle relative to square internal baffle.
[0085] FIG. 12 illustrates one example of a correspondence between the two cameras and the wavelengths to which their sensor arrays respond. At the left of the figure, eight color bands and their corresponding wavelength ranges are listed. The array for the first camera is for the visual range using the RGBY (red, green, blue, yellow) colorspace. The second camera is sensitive to Coastal Blue (CB) with a wavelength below the visible and Red Edge (RE), Near Infrared 1 (Nl) and Near Infrared 2 (N2) at wavelengths above the visible range.
[0086] Note that this multi-camera arrangement uses full-frame imaging cameras capable of using global-shutter-mode imaging. This allows the full array of all included wavelengths to be captured in images simultaneously. Consequently, the different images at different sensor locations do not suffer from the sort of time lags that can affect images when the different wavelengths or areas of the image are not captured at the same time, such as can occur when using a push broom scanner, or rolling-shutter imaging mode, for example, for obtaining the image data.
[0087] Post-capture processing registers the pixels from the different cameras and from the different color filters. This post-capture processing might be performed by processing circuitry (e.g., a processor, memory and program instructions) located at the camera, elsewhere in the satellite, or at a ground station. Registration is desirable when, for example, a single one-pixel wide white light source on Earth is being captured, effectively as a point source. In that case, the pixel arrays from the two cameras might show that point source in different locations in the pixel array due to the optics used, or due to the differences in capture rate or capture time between the cameras (a small asynchronicity in the capture times can result in a considerable image shift given the resolution and the velocity of the satellite). The processing might be based on pixelated color filtered image data. There might also be offsets in that a white light directed at a camera with a multi-color array can illuminate four adjacent pixels so that the four-color arrays for one camera would need to be realigned. The realignment process is simplified in the implementation example using full-frame imaging sensors that perform simultaneous imaging.
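One conventional way to estimate the camera-to-camera translational offset during such registration is FFT-based phase correlation; the disclosure does not specify a method, so the following is only a hedged sketch (a production pipeline would also estimate rotation, scale, and per-filter-channel offsets):

```python
# Sketch: estimate the integer (dy, dx) translation that best aligns
# `moving` to `ref` via phase correlation of their Fourier transforms.
import numpy as np

def phase_correlation_shift(ref: np.ndarray, moving: np.ndarray):
    F_ref = np.fft.fft2(ref)
    F_mov = np.fft.fft2(moving)
    cross_power = F_ref * np.conj(F_mov)
    cross_power /= np.abs(cross_power) + 1e-12  # keep phase, drop magnitude
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks in the far half of each axis correspond to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```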
[0088] The accumulated pixel data can be combined on the satellite, sent back uncombined, or in some intermediate state. Although sending back all of the data requires more transmission time, this provides greater flexibility in that many additional combinations or views can be generated for different user needs. For example, one user may want relatively low resolution coastal blue data overlaid on an RGB image, while another may only be interested in red and longer wavelengths, but at the highest recorded resolution.
[0089] FIG. 13 illustrates examples of color filters used on pixel arrays. These filters would act on the incident light that has already been split by the beam splitter, as the filters would only see, for the most part, light in the spectra designated for that pixel array.
[0090] FIG. 14 illustrates an example of increased resolution (more accurately, increased discernible spatial content, derived from what is known as de-aliasing) from the use of subpixel shifting which reduces errors in the sampled image known as aliasing errors.
[0091] The ability to provide high quality images is improved by mechanically stabilizing the imaging system, both as this improves the images from the individual cameras and makes combining the images from the different cameras of a multi-camera system easier. A terrestrial imaging system can be mounted to an object for steadying, but for a satellite operating in a weightless environment, the situation is more complicated. Although the following is presented in the multi-camera context described above, the issues related to image stability also apply to single camera arrangements.
[0092] As noted above, in a diffraction limited (near-optimal) telescope design, the achieved resolution is limited by the pixel density of the sensor array used by the camera, with this resolution being further diluted when a filter array is used to divide the spectrum of light into multiple wavelength bands distributed to a pattern of pixels. One way to improve the effective density of the sensor pixel array is through use of a motor mechanism that can move the sensor array by some subpixel distance, so that the sensor captures multiple images, effectively providing images at subpixel resolution.
[0093] For example, the motor mechanism might move the sensor array one half of a pixel width left/right and/or one half of a pixel width up/down, resulting in four different positions. In another example, the motor mechanism shifts the sensor array by zero, one third, or two thirds of a pixel width along each axis (i.e., (0, 0), (0, 1/3), (0, 2/3), (1/3, 0), (1/3, 1/3), (1/3, 2/3), (2/3, 0), (2/3, 1/3), (2/3, 2/3)), resulting in nine different positions. This provides a sensor array with the equivalent of four or nine times the number of pixels by combining the shifted images. However, this requires moving mass around, which, when mounted terrestrially or in a much larger satellite might not be much of a problem. In the weightless environment of an orbiting satellite, however, the rapid movement of even a relatively small mass can shift the rest of the satellite, including the telescope portion, over the time period when portions of an individual image are captured via multiple sub-images to be combined. Consequently, the issue of image stability is further complicated when such a pixel shifting mechanism is used in an orbital camera.
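For the four-position (half-pixel) case above, the simplest way to see the resolution gain is to interleave the four captures onto a grid of twice the density; this sketch is illustrative only (a practical pipeline would de-alias and deconvolve rather than naively interleave, and the frame ordering is an assumption):

```python
# Sketch: interleave four half-pixel-shifted captures into a 2x-denser grid.
import numpy as np

def interleave_2x2(f00, f01, f10, f11):
    """f00..f11 are H x W captures shifted by (0, 0), (0, 1/2), (1/2, 0),
    and (1/2, 1/2) pixels (row, column), respectively."""
    H, W = f00.shape
    out = np.empty((2 * H, 2 * W), dtype=f00.dtype)
    out[0::2, 0::2] = f00   # unshifted samples
    out[0::2, 1::2] = f01   # shifted half a pixel right
    out[1::2, 0::2] = f10   # shifted half a pixel down
    out[1::2, 1::2] = f11   # shifted half a pixel down and right
    return out
```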
[0094] The improved sampling resolution (de-aliased image) made possible by pixel shifting is illustrated with respect to an RGBY colorspace sensor array in FIG. 14.
[0095] This is effective in part because the area between CMOS pixels, for example, is not all light-sensitive area. Accordingly, shifting pixels into these blank spaces, and into adjacent pixel spaces, can allow the collection of additional information. As discussed with respect to FIG. 13, each of the four colors is represented in one fourth of the array's pixels. By shifting the array, as illustrated for the red pixel, to one of each of the other three positions, the effective number of pixels is quadrupled, so that a 25-megapixel (MP) array provides the resolution of a 100 MP array. Similarly, a 3x3 sub-pixel shifting would allow a 25-megapixel (MP) array to provide the resolution of a 225 MP array. As shown in FIG. 6 and other figures, the visual spectrum camera is represented as larger than the extended spectrum camera to allow incorporation of the pixel shifting mechanism for higher resolution of the visible image.
[0096] To address the overall stabilization requirement, the satellite can incorporate a number of design elements that can provide the stable result required for good imaging. These design elements can allow the satellite to point at a target for longer periods, which provides benefits to both signal-to-noise ratio (SNR) and apparent image resolution through use of a pixel shifting super-resolution technique. Stabilization can be enabled by a combination of a high performance, 3-axis attitude determination and control system (ADCS), which is optimized to reduce disturbances and errors imparted to the image pointing function, and design features that reduce the amount of time during which the imager should be stable.
[0097] In addition, modifications to the programmed motion of the pixel shifter can be made to compensate for the disturbance caused by the pixel shifter itself. Here, the torque applied to the spacecraft by the pixel shift causes slight movements of the telescope pointing that are predictable and correctable using fine adjustments or profiled smoothing of the commanded pixel shifting motion. Other improvements include use of a second pixel shifting mechanism with the same mass and motion performance, rotated 180 degrees, such that it can cancel or modify the disturbances caused by the first pixel shifting mechanism. Either, or both, of these mechanisms may also be used in combination with disturbance sensors and software to cancel out any predictable disturbances from other sources.
[0098] In some embodiments, the countering mass that counters the movement of a pixel sensor array by subpixel distances might be the other pixel array in another camera. The movement can be timed appropriately if both sensor arrays are being used at the same time. However, in the case where only one of the pixel arrays employs subpixel shifting, other masses could be used. The mass might be equal to the mass of the moving sensor array, with both having the same distance from the center of mass of the satellite. However, that is not required, as software can be used to adjust between the movements of masses at different distances, to adjust for equal moment of inertia.
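In a simplified single-axis form (an editorial illustration; the symbols are introduced here, not taken from the disclosure), the software adjustment amounts to scaling the countermass motion so the net reaction torque about the center of mass vanishes:

$$m_s r_s \ddot{x}_s(t) + m_c r_c \ddot{x}_c(t) = 0 \quad\Rightarrow\quad \ddot{x}_c(t) = -\frac{m_s r_s}{m_c r_c}\,\ddot{x}_s(t),$$

where m_s and r_s are the pixel shifter's moving mass and lever arm about the satellite's center of mass, and m_c and r_c are those of the countermass. A countermass at twice the lever arm thus needs only half the mass-times-acceleration product.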
[0099] In one example, the spacing between pixels is around 4.5 microns, so a movement of a third of a pixel is 1.5 microns. Movements as fine as 1 micron, or finer, might be used. The sensor array might be moved using semiconductor elements that expand/contract upon the application of electrical potential. Piezoelectric elements might provide such movement in the ranges of microns. A processor, on board or ground-based, could process images to determine how much movement is occurring, allowing for calibration of the movement mechanism.
[00100] With respect to the attitude determination and control system (ADCS), the satellite can apply an inertial measurement unit with a very precise 3-axis rate sensor and use ultra-smooth bearings in the reaction wheels to cancel angular motion. It can also point a star tracker at such an angle as to provide a stable external reference and ensure optimum pointing accuracy when included in the control system's Kalman filtering algorithm. The attitude control system operates at a high sampling rate to limit the effect of signal quantization, and sensor noise is suppressed. Gains applied to the control algorithm are modified using a control-mode dependent gain schedule to ensure low jitter amplitudes during pixel-shifted imaging periods. Damping devices and vibration isolators can be added to reduce and isolate the effects of vibration, keeping them away from the telescope's pointing behavior. A notch filter algorithm can be applied to suppress the control system from responding to or exciting resonant modes in the satellite structure or vibration isolation devices, thereby suppressing instabilities. Additionally, the satellite can include a processor or processing system that includes program code that includes and executes a set of program steps to ensure that reaction wheels operate at mean rates that impart minimum resonant excitation to the satellite and control system.
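The notch filter mentioned above is a standard control-engineering element; the following sketch shows the idea using SciPy, with an assumed resonance frequency, quality factor, and loop rate (none of these values come from the disclosure):

```python
# Sketch: keep the control loop from exciting an assumed structural
# resonance by notch-filtering the actuator command stream.
from scipy.signal import iirnotch, lfilter, lfilter_zi

FS = 500.0         # control-loop sample rate, Hz (assumed)
F_RESONANT = 45.0  # structural resonance to avoid exciting, Hz (assumed)
Q = 30.0           # notch quality factor (assumed)

b, a = iirnotch(F_RESONANT, Q, fs=FS)
state = lfilter_zi(b, a) * 0.0  # zero initial filter state

def filter_command(x, state):
    """Filter one actuator command sample; `state` carries filter memory."""
    y, state = lfilter(b, a, [x], zi=state)
    return y[0], state
```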
[00101] The reaction wheels can be balanced in advance using laser ablation. When the reaction wheels are more balanced, that reduces the vibrations that spinning reaction wheels impart to the satellite and therefore to the imaging systems, telescope orientation, etc. In this approach, imbalances are measured, perhaps by lasers or other methods, and then portions of the reaction wheels are ablated by a laser to bring the reaction wheels more into balance.
[00102] The satellite can also reduce imaging sensitivity to angular rates by shortening the time period during which stable pointing must be maintained. This can be done by use of the highest available data rate for sensor readout in a pixel shifting camera, so that a high-rate data readout and recording system can be used to collect multiple images within a shortened time period to limit the effect of vibration upon image quality. Here, it should be noted that sensor readout time may be the dominant factor in the sum of things that add to the pixel shifting time period. To reduce readout time, cameras can use multiple high rate output channels to provide an image readout rate that halves the time period during which the multiple pixel shifted images are collected and transferred to storage. Further, the use of buffer memories installed in the camera electronics can shorten the readout time requirement. The resulting shortened time period means that the imaging system is less affected by angular rate effects since the area imaged remains acceptably stable during the shortened period of imaging.
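The effect of readout rate on the stability window can be seen with simple arithmetic (all numbers assumed for illustration):

$$t_{\text{frame}} = \frac{N_{\text{pixels}}}{N_{\text{channels}} \cdot R_{\text{channel}}}$$

For example, reading a 25 MP sensor through 4 output channels at 200 Mpix/s per channel takes 25 × 10⁶ / (4 × 2 × 10⁸) ≈ 31 ms per frame, so a nine-position pixel-shift sequence completes in roughly 0.28 s plus exposure time; doubling the channel count halves both figures.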
[00103] In the pixel shifting case, the movement can also be counter-balanced to add stability. For example, a mechanism similar to the pixel shifter, but with equal and opposite motion, can be added to counteract the force imparted by the pixel shifting mechanism. The image can also be stabilized by coordinating the operation of the satellite's thrusters with camera operation. When a request for the satellite to capture an image is received or scheduled, the status of the satellite's thrusters is checked. If the requested imaging requires that the thrusters be stopped, this can be done. It may be the case that the satellite needs to allow the thrusters to operate while imaging takes place; in that case, the thrusters would need to operate with very low thrust disturbance, and, more importantly, the satellite should have an attitude control system capable of suppressing angular rates sufficiently to enable imaging while the thrusters are operating. Then, to enable the imaging function to proceed, the satellite evaluates a three-axis angular rate sensor to confirm that rates are low enough for imaging events to proceed. There are at least two levels of angular rate fidelity of concern: 1) an angular rate low enough to eliminate "pixel smear" for routine single-frame imaging, and 2) a lower angular rate to ensure that pixel-shifted imaging can be performed effectively. Additionally, for extreme stability in all pointing modes, active damping motion compensation can be added as previously described.
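A minimal sketch of such gating logic follows; the two rate thresholds and the function and variable names are hypothetical, chosen only to illustrate the two levels of angular rate fidelity described above.

    # Decision-logic sketch (thresholds are assumptions, not from the patent):
    # gate imaging requests on thruster status and measured three-axis rates.

    RATE_LIMIT_SINGLE_FRAME = 1.0e-3  # rad/s, loose bound to avoid pixel smear
    RATE_LIMIT_PIXEL_SHIFT = 1.0e-4   # rad/s, tighter bound for shifted imaging

    def imaging_mode(rates_rad_s, thrusters_firing, low_disturbance_thrusters):
        """Return which imaging mode, if any, the current state permits."""
        if thrusters_firing and not low_disturbance_thrusters:
            return "wait"  # stop the thrusters or defer the image
        worst = max(abs(r) for r in rates_rad_s)
        if worst <= RATE_LIMIT_PIXEL_SHIFT:
            return "pixel_shifted"
        if worst <= RATE_LIMIT_SINGLE_FRAME:
            return "single_frame"
        return "wait"

    print(imaging_mode((2e-5, -5e-5, 8e-5), thrusters_firing=False,
                       low_disturbance_thrusters=True))  # pixel_shifted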
[00104] Additionally, embodiments of the present disclosure can be described in view of the following clauses:
[00105] 1. A satellite imaging system, comprising:
[00106] a telescope section comprising:
[00107] a) a first reflector that is substantially square and sized to fit into a substantially square aperture of a satellite body; and
[00108] b) a second reflector, positioned to reflect light reflected from the first reflector;
[00109] c) a lens set comprising one or more lenses positioned in an optical path of the telescope section; and
[00110] a sensor array positioned to receive light from the telescope section when light is received through the substantially square aperture, wherein the sensor array is substantially square.
[00111] 2. The satellite imaging system of clause 1, wherein the second reflector is substantially square.
[00112] 3. The satellite imaging system of clause 1 or 2, wherein the second reflector is constructed to counteract image distortions that might occur due to being substantially square.
[00113] 4. The satellite imaging system of any of clauses 1-3, wherein the second reflector is substantially round.
[00114] 5. The satellite imaging system of any of clauses 1-4, wherein the lens set comprises lenses that are substantially square.
[00115] 6. The satellite imaging system of any of clauses 1-5, wherein the lens set is constructed to counteract image distortions that might occur due to being substantially square.
[00116] 7. The satellite imaging system of any of clauses 1-6, wherein the lens set comprises lenses that are substantially round.
[00117] 8. The satellite imaging system of any of clauses 1-7, further comprising baffles.
[00118] 9. The satellite imaging system of any of clauses 1-8, wherein the baffles are substantially square.
[00119] 10. The satellite imaging system of any of clauses 1-9, wherein the baffles are substantially round.
[00120] 11. The satellite imaging system of any of clauses 1-10, wherein elements are substantially square when the elements have a form factor of a square adjusted for structural components needed to provide structural stability to the elements or other components of a satellite containing the satellite imaging system.
[00121] 12. The satellite imaging system of any of clauses 1-11, wherein the first reflector, the second reflector, and/or one or more lenses of the lens set comprise thermally matched materials that limit distortion of a final image over a predetermined set of operating conditions.
[00122] 13. The satellite imaging system of any of clauses 1-12, wherein the first reflector, the second reflector, and/or one or more lenses of the set of lenses are movable by motorized or deformable positioners to perform dynamic compensation of positional error and/or compensation for mechanical variations.
[00123] 14. A satellite imaging system, comprising:
[00124] a telescope section arranged to receive incoming light along an optical path;
[00125] a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges comprising one or more wavelength ranges within a visible spectrum;
[00126] a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges comprising one or more wavelength ranges outside the visible spectrum; and
[00127] a dichroic beam splitter in the optical path, whereby light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera.
[00128] 15. The satellite imaging system of clause 14, wherein the dichroic beam splitter is a whole-spectrum beam splitter, wherein one of the first sensor array and the second sensor array is a panchromatic sensor array and the other is a non-panchromatic sensor array providing pixelated, color-filtered images, and wherein outputs of the panchromatic sensor array are usable to enhance the pixelated, color-filtered images.
[00129] 16. The satellite imaging system of clause 14 or 15, further comprising processing circuitry for performing image enhancement.
[00130] 17. The satellite imaging system of any of clauses 14-16, further comprising electronic bandwidth controls for controlling and/or modifying a passband defined by the dichroic beam splitter, whereby the first set of wavelength ranges and/or the second set of wavelength ranges can be controlled and/or modified.
[00131] 18. The satellite imaging system of any of clauses 14-17, further comprising:
[00132] a third camera; and
[00133] a second beam splitter in the optical path, whereby at least a portion of the incoming light is directed to the third camera.
[00134] 19. The satellite imaging system of any of clauses 14-18, further comprising electronic field of view controls for controlling and/or modifying a telescope field of view.
[00135] 20. A satellite imaging system, comprising:
[00136] a telescope section arranged to receive incoming light;
[00137] a camera having a sensor array that shifts relative to an optical axis of the camera while capturing an image to improve the resolution thereof; and
[00138] an attitude determination and control system, whereby the satellite is stabilized while capturing an image.
[00139] 21. The satellite imaging system of clause 20, further comprising:
[00140] an active disturbance cancellation system, whereby a combination of mechanisms, sensors, and computer algorithms reduces the effect of disturbances while capturing an image; and
[00141] a high-rate data readout and recording system, whereby multiple images can be collected within a shortened time period to limit the effect of vibration upon image quality.
[00142] According to one embodiment, the techniques described herein are implemented by one or more generalized computing systems programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Although not shown, processing might be performed by a processor that accesses instructions in a program memory, controls communication, and processes information. A processing system might include random access memory (RAM) or other dynamic storage device for storing information, temporary variables, or other intermediate data during execution of instructions to be executed by the processor. Such instructions, when stored in non-transitory storage media accessible to the processor, render the processing system into a special-purpose machine that is customized to perform the operations specified in the instructions. The processing system might also include a read-only memory (ROM) or other static storage device for storing static information and instructions for the processor. The processing system may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware, and/or program logic which causes or programs the processing system to be a special-purpose machine. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
[00143] The term "storage media" as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise a processor bus. Transmission media can also take the form of radio waves or light waves. Communication can be a two-way data communication coupling to a ground station or another satellite.
[00144] The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
[00145] In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. The sole and exclusive indicator of the scope of the invention, and what is intended by the applicants to be the scope of the invention, is the literal and equivalent scope of the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction.
[00146] Further embodiments can be envisioned by one of ordinary skill in the art after reading this disclosure. In other embodiments, combinations or subcombinations of the above-disclosed invention can be advantageously made. The example arrangements of components are shown for purposes of illustration, and it should be understood that combinations, additions, re-arrangements, and the like are contemplated in alternative embodiments of the present invention. Thus, while the invention has been described with respect to exemplary embodiments, one skilled in the art will recognize that numerous modifications are possible.
[00147] For example, the processes described herein may be implemented using hardware components, software components, and/or any combination thereof. It will, however, be evident that various modifications and changes may be made without departing from the broader spirit and scope of the invention as set forth in the claims, and the invention is intended to cover all modifications and equivalents within the scope of the following claims.
[00148] All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

Claims

WHAT IS CLAIMED IS:
1. A satellite imaging system, comprising:
a telescope section comprising:
a) a first reflector that is substantially square and sized to fit into a substantially square aperture of a satellite body; and
b) a second reflector, positioned to reflect light reflected from the first reflector;
c) a lens set comprising one or more lenses positioned in an optical path of the telescope section; and
a sensor array positioned to receive light from the telescope section when light is received through the substantially square aperture, wherein the sensor array is substantially square.
2. The satellite imaging system of claim 1, wherein the second reflector is substantially square.
3. The satellite imaging system of claim 2, wherein the second reflector is constructed to counteract image distortions that might occur due to being substantially square.
4. The satellite imaging system of claim 1, wherein the second reflector is substantially round.
5. The satellite imaging system of claim 1, wherein the lens set comprises lenses that are substantially square.
6. The satellite imaging system of claim 2, wherein the lens set is constructed to counteract image distortions that might occur due to being substantially square.
7. The satellite imaging system of claim 1, wherein the lens set comprises lenses that are substantially round.
8. The satellite imaging system of claim 1, further comprising baffles.
9. The satellite imaging system of claim 8, wherein the baffles are substantially square.
10. The satellite imaging system of claim 8, wherein the baffles are substantially round.
11. The satellite imaging system of claim 1, wherein elements are substantially square when the elements have a form factor of a square adjusted for structural components needed to provide structural stability to the elements or other components of a satellite containing the satellite imaging system.
12. The satellite imaging system of claim 1, wherein the first reflector, the second reflector, and/or one or more lenses of the lens set comprise thermally matched materials that limit distortion of a final image over a predetermined set of operating conditions.
13. The satellite imaging system of claim 1, wherein the first reflector, the second reflector, and/or one or more lenses of the set of lenses are movable by motorized or deformable positioners to perform dynamic compensation of positional error and/or compensation for mechanical variations.
14. A satellite imaging system, comprising:
a telescope section arranged to receive incoming light along an optical path;
a first camera having a first sensor array positioned in the optical path and sensitive to a first set of wavelength ranges comprising one or more wavelength ranges within a visible spectrum;
a second camera having a second sensor array positioned in the optical path and sensitive to a second set of wavelength ranges comprising one or more wavelength ranges outside the visible spectrum; and
a dichroic beam splitter in the optical path, whereby light in the first set of wavelength ranges is directed to the first camera and light in the second set of wavelength ranges is directed to the second camera.
15. The satellite imaging system of claim 14, wherein the dichroic beam splitter is a whole-spectrum beam splitter, wherein one of the first sensor array and the second sensor array is a panchromatic sensor array and the other is a non-panchromatic sensor array providing pixelated, color-filtered images, and wherein outputs of the panchromatic sensor array are usable to enhance the pixelated, color-filtered images.
16. The satellite imaging system of claim 15, further comprising processing circuitry for performing image enhancement.
17. The satellite imaging system of claim 14, further comprising electronic bandwidth controls for controlling and/or modifying a passband defined by the dichroic beam splitter, whereby the first set of wavelength ranges and/or the second set of wavelength ranges can be controlled and/or modified.
18. The satellite imaging system of claim 14, further comprising:
a third camera; and
a second beam splitter in the optical path, whereby at least a portion of the incoming light is directed to the third camera.
19. The satellite imaging system of claim 14, further comprising electronic field of view controls for controlling and/or modifying a telescope field of view.
20. A satellite imaging system, comprising:
a telescope section arranged to receive incoming light;
a camera having a sensor array that shifts relative to an optical axis of the camera while capturing an image to improve the resolution thereof; and
an attitude determination and control system, whereby the satellite is stabilized while capturing an image.
21. The satellite imaging system of claim 20, further comprising:
an active disturbance cancellation system, whereby a combination of mechanisms, sensors, and computer algorithms reduces the effect of disturbances while capturing an image; and
a high-rate data readout and recording system, whereby multiple images can be collected within a shortened time period to limit the effect of vibration upon image quality.
PCT/US2017/014636 2016-01-22 2017-01-23 Imaging system optimized for nanosatellites with multiple cameras and image stabilization and pixel shifting WO2017127844A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201662286234P 2016-01-22 2016-01-22
US201662286225P 2016-01-22 2016-01-22
US201662286229P 2016-01-22 2016-01-22
US62/286,229 2016-01-22
US62/286,234 2016-01-22
US62/286,225 2016-01-22

Publications (1)

Publication Number Publication Date
WO2017127844A1 true WO2017127844A1 (en) 2017-07-27

Family

ID=59362192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/014636 WO2017127844A1 (en) 2016-01-22 2017-01-23 Imaging system optimized for nanosatellites with multiple cameras and image stabilization and pixel shifting

Country Status (1)

Country Link
WO (1) WO2017127844A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090015891A1 (en) * 2005-06-13 2009-01-15 Kane David M Optical systems and methods using large microelectromechanical-systems mirrors
US20110193814A1 (en) * 2008-11-28 2011-08-11 Gregory Gay Optical system and display
US20140195150A1 (en) * 2009-12-18 2014-07-10 Edward Oscar Rios High Altitude, Long Endurance, Unmanned Aircraft and Methods of Operation Thereof
US20130155218A1 (en) * 2009-12-22 2013-06-20 Dr. Thomas Kalkbrenner High Resolution Microscope and Image Divider Assembly
US20110292505A1 (en) * 2010-05-21 2011-12-01 Kurtz Andrew F Low thermal stress birefringence imaging lens
US20130223832A1 (en) * 2012-02-24 2013-08-29 Lockheed Martin Corporation System and method for controlling scattered light in a reflective optical filter

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11671169B2 (en) 2016-12-06 2023-06-06 Skeyeon, Inc. Radio frequency data downlink for a high revisit rate, near earth orbit satellite system
US11053028B2 (en) 2016-12-06 2021-07-06 Skeyeon, Inc. Satellite system
US10590068B2 (en) 2016-12-06 2020-03-17 Skeyeon, Inc. System for producing remote sensing data from near earth orbit
US10858309B2 (en) 2016-12-06 2020-12-08 Skeyeon, Inc. System for producing remote sensing data from near earth orbit
US10715245B2 (en) 2016-12-06 2020-07-14 Skeyeon, Inc. Radio frequency data downlink for a high revisit rate, near earth orbit satellite system
CN107444674A (en) * 2017-09-08 2017-12-08 中国人民解放军战略支援部队航天工程大学 Magic square satellite
CN107783355B (en) * 2017-10-21 2020-08-28 湖南华南光电(集团)有限责任公司 Infrared double-view-field quick switching type lens
CN107783355A (en) * 2017-10-21 2018-03-09 湖南华南光电(集团)有限责任公司 A kind of infrared double-view field fast switch type camera lens
WO2019140159A1 (en) * 2018-01-11 2019-07-18 Skeyeon, Inc. Radio frequency data downlink for a high revist rate, near earth orbit satellite system
US10583632B2 (en) 2018-01-11 2020-03-10 Skeyeon, Inc. Atomic oxygen-resistant, low drag coatings and materials
CN108844901A (en) * 2018-06-25 2018-11-20 南京工程学院 Multi-optical spectrum image collecting device
CN111147699A (en) * 2018-11-02 2020-05-12 南昌欧菲光电技术有限公司 Electronic equipment, camera device and mounting base thereof
CN111147699B (en) * 2018-11-02 2022-01-07 南昌欧菲光电技术有限公司 Electronic equipment, camera device and mounting base thereof
CN115034100A (en) * 2022-04-15 2022-09-09 中国科学院上海技术物理研究所 Integrated satellite platform camera frame structure and design method
CN115034100B (en) * 2022-04-15 2023-12-22 中国科学院上海技术物理研究所 Integrated satellite platform camera frame structure and design method

Similar Documents

Publication Publication Date Title
WO2017127844A1 (en) Imaging system optimized for nanosatellites with multiple cameras and image stabilization and pixel shifting
US10670853B2 (en) Imaging system with an optical path and telescope shape optimized for nanosatellites
US6477326B1 (en) Dual band framing reconnaissance camera
US7961386B2 (en) Low orbit missile-shaped satellite for electro-optical earth surveillance and other missions
US11320637B2 (en) Small form factor 4-mirror based imaging systems
US11579430B2 (en) Small form factor, multispectral 4-mirror based imaging systems
US11668915B2 (en) Dioptric telescope for high resolution imaging in visible and infrared bands
US10475171B2 (en) Multi-camera imaging system for nanosatellites
US6555803B1 (en) Method and apparatus for imaging a field of regard by scanning the field of view of an imaging electro-optical system in a series of conical arcs to compensate for image rotation
US10338376B2 (en) Image stabilization and pixel shifting for a nanosatellite imaging system
WO2002019030A1 (en) Dual band framing reconnaissance camera
Delamere et al. MRO high resolution imaging science experiment (HIRISE): instrument development
EP4359838A1 (en) Dioptric telescope for high resolution imaging in visible and infrared bands
KR20230136669A (en) Small form factor multi-spectrum four-reflector-based imaging system
Blommaert et al. CHIEM: the development of a new compact hyperspectral imager
Cronje et al. Multi sensor satellite imagers for commercial remote sensing
Jauffraud et al. FLEX & Sentinel 3: A tandem to monitor vegetation
Warren et al. The CONTOUR remote imager and spectrometer (CRISP)
Hawkins et al. Sustainable satellite constellation development, calibration and on-orbit results
Hoefft et al. Multispectral EO LOROP camera
Reilly et al. CCD imaging instruments for planetary spacecraft applications
Doyon et al. MIRCAMOS: a mosaic IR camera and multi-object spectrograph for CFHT

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17742137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/12/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 17742137

Country of ref document: EP

Kind code of ref document: A1