US20170054927A1 - Image capture system with motion compensation - Google Patents
- Publication number
- US20170054927A1 (application US 14/832,335)
- Authority
- US
- United States
- Prior art keywords
- image
- edge
- sensor
- objects
- exposing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N5/3532
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/64—Imaging systems using optical elements for stabilisation of the lateral and angular position of the image
- G02B27/646—Imaging systems using optical elements for stabilisation of the lateral and angular position of the image compensating for small deviations, e.g. due to vibration or shake
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/689—Motion occurring during a rolling shutter mode
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/72—Combination of two or more compensation controls
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/53—Control of the integration time
- H04N25/531—Control of the integration time by controlling rolling shutters in CMOS SSIS
- H04N5/3743
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/768—Addressed sensors, e.g. MOS or CMOS sensors for time delay and integration [TDI]
Definitions
- Defocus blur, motion blur, and noise from short exposures at low light levels often limit the quality of images captured by cameras in motion.
- cameras typically focus on a shallow range of depths.
- the image capture system includes a lens that produces an image; an image sensor having a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that it focuses on more distant objects; an image stabilizer configured to provide a time-varying compensation of image motion at the image sensor; and a controller configured to (1) operate the image capture system in a repeating cycle, wherein during each cycle the controller operates the sensor to expose and read out an image progressively from one edge to the opposite edge, and (2) operate the image stabilizer to provide an image motion compensation that varies in time such that the image motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- the image stabilizer includes an oscillation of the image capture system that is a rotational oscillation about an axis normal to a direction of the image motion.
- the rotational oscillation of the image capture system is about a vertical axis.
- the placement of the image sensor relative to the lens is unaffected by the oscillation of the image capture system.
- the controller is also configured to operate the image stabilizer to provide an amount of image motion compensation that compensates for the image motion for objects at a first distance when exposing and reading the first edge of the sensor and for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance.
- the controller is also configured to operate the image stabilizer to provide an amount of image motion compensation that compensates for the image motion for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges.
- the controller is also configured to process the composite image to extract textual information from objects in the composite image.
- the image sensor is a rolling-shutter image sensor with rows oriented vertically and readout times progressing from back to front.
- the image stabilizer compensation comprises a rotational oscillation of the image sensor about an axis normal to a direction of the image motion. In this example, the oscillation of the one or more sensors includes a rotation of 10 degrees or less.
- the method includes operating, using one or more controllers, an image capture system in a repeating cycle to capture a plurality of images, the image capture system having a lens, an image sensor, and an image stabilizer, wherein the image sensor has a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that it focuses on more distant objects; during each cycle, operating, using the one or more controllers, a sensor to expose and read out an image progressively from one edge to the opposite edge, and operating, using the one or more controllers, the image stabilizer to provide a time-varying image motion compensation such that the motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- the image stabilizer comprises an oscillation of the image capture system that is a rotational oscillation about an axis normal to a direction of the image motion.
- the rotational oscillation of the image capture system is about a vertical axis.
- the angle of the image sensor is fixed relative to the lens.
- operating the image stabilizer to provide a time-varying image motion compensation also includes compensating for the image motion for objects at a first distance when exposing and reading the first edge of the sensor, for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance, and for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges.
- the method also includes processing, by the one or more controllers, the plurality of images to extract textual information from objects in the images.
- the image sensor is a rolling-shutter image sensor with rows oriented vertically and readout times progressing from back to front.
- the method includes operating an image capture system in a repeating cycle to capture a plurality of images, the image capture system having a lens, an image sensor, and an image stabilizer, wherein the image sensor has a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that it focuses on more distant objects; during each cycle, operating a sensor to expose and read out an image progressively from one edge to the opposite edge, and operating the image stabilizer to provide a time-varying image motion compensation such that the motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- operating the image stabilizer to provide a time-varying image motion compensation also includes compensating for the image motion for objects at a first distance when exposing and reading the first edge of the sensor, for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance, and for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges.
- the method also includes processing the plurality of images to extract textual information from objects in the images.
- FIG. 1 is a functional diagram of an exemplary traditional image capture system according to aspects of the disclosure.
- FIG. 2 is a functional diagram of an example image capture system according to aspects of the disclosure.
- FIG. 3 is a pictorial diagram of the image capture system of FIG. 2 in accordance with aspects of the disclosure.
- FIG. 4 is a functional diagram of an image capture system according to aspects of the disclosure.
- FIG. 5A is a graph representing an example operation of the image capture system according to aspects of the disclosure.
- FIG. 5B is a graph representing an example operation of the image capture system according to aspects of the disclosure.
- FIG. 6 is a functional diagram of an example system in accordance with an exemplary embodiment.
- FIG. 7 is a functional diagram of an example system in accordance with an exemplary embodiment.
- FIG. 8 is a pictorial diagram of the system of FIG. 2 in accordance with aspects of the disclosure.
- FIG. 9 is a flow diagram of an example method according to aspects of the disclosure.
- the technology relates to an image capture system that is able to capture images having a range of different distances in focus while in motion.
- the image capture system may be mounted or mountable on a vehicle.
- Typical camera systems may utilize image sensors oriented parallel to a plane of a lens. In systems in which the image sensors are parallel to the lens, the distance at which objects are in focus is the same across the image sensor.
- These systems may utilize camera stabilizer systems, liquid wedge prism lenses, sensor shift, or post-processing adjustments in order to attempt to compensate for small movements and shakes due to a user's hand movement.
- these techniques have well-understood limits, making them less effective at addressing quality issues such as defocus blur, motion blur, and noise due to short exposures at low light levels.
- optical character recognition (OCR)
- typical camera systems may require either multiple passes or multiple cameras.
- an image capture system may utilize an image sensor installed on a tilt in relation to a lens rather than parallel to the lens.
- the image sensor in the image capture system may be tilted at an angle, with a first edge slightly closer to the lens than a second edge opposite the first edge; the first edge focuses best on distant objects, while the second edge focuses best on near objects.
- the first edge of the image sensor may be focused on objects farther away from the camera than objects on which the second edge of the image sensor is focused, such that objects that appear in different positions in more than one image may be in better focus in one than in another.
- the image sensor may be a rolling-shutter image sensor with rows oriented vertically, or in portrait mode, and readout times progressing from back to front, or vice versa.
- the rolling shutter reads out an image starting at one edge of the image sensor and progressing to the opposite edge, such that different portions of the image are read at slightly different times.
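The progressive exposure described above can be sketched numerically. This is an illustration rather than code from the patent; it assumes rows are read at evenly spaced times across a readout interval (the 1/48-second interval used in the examples later in the description):

```python
def row_readout_times(num_rows, readout_interval, t0=0.0):
    """Rolling shutter: each row is exposed and read at a slightly
    later time, evenly spaced across the readout interval."""
    dt = readout_interval / num_rows
    return [t0 + i * dt for i in range(num_rows)]

# a 4-row toy sensor read out over 1/48 s; successive rows
# are offset by (1/48)/4 seconds
times = row_readout_times(num_rows=4, readout_interval=1.0 / 48.0)
```

Because each row is captured at a different instant, a stabilizer that varies its compensation over the readout interval can give each row a different effective motion correction, which is what the tilted-sensor scheme exploits.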
- Image stabilization may be applied by various known means, including moving the image sensor behind the lens, rotating the whole camera, or moving the image using a liquid-wedge prism.
- the system may rotate on an axis normal to the direction in which the vehicle is traveling and normal to the direction in which the camera is pointing in order to compensate for vehicle movement; for example, a vertical axis if the camera is looking to the side of a moving car.
- the rotation may be a torsional oscillation and may also be sinusoidal or any other type of oscillation.
- the system may rotate back and forth on the axis, swinging the lens in the direction of travel and then opposite the direction of travel.
- the orientation of the image sensor relative to the lens may remain fixed.
- the system may compensate for vehicle movement. This compensation may allow the system to capture clearer images of stationary objects that would otherwise be blurry due to the vehicle movement.
- the system may be set up to capture images only when the system is rotating opposite the direction of vehicle travel.
- the combination of the tilted image sensor and rotating system may allow the system to compensate for varied angular velocities of multiple planes of focus.
- objects close to the vehicle have a higher angular velocity than objects farther away from the vehicle.
- Objects at infinity have zero angular velocity.
- the angular velocity of an object changes in a linear fashion in relation to the reciprocal distance of the object from the system.
- the tilted image sensor is rotated, the angular velocity of the first edge of the image sensor may be faster than the angular velocity of the second edge.
- This arrangement may compensate for the difference in angular velocity at different distances away from the vehicle and allow the system to capture portions of the image that are both well focused and well motion compensated for objects at different distances in different parts of the image.
- the rotation of the system may be synchronized with a rolling shutter exposure and readout, and the amount of rotational velocity may be adjusted to be proportional to the vehicle velocity, such that the image motion on the image sensor is approximately stopped for objects that are in focus in each region of the image sensor.
- the system may capture images at a high enough rate that objects of interest appear in several successive images, but are clearer or more in focus in one image or another depending on their distances and on their positions within the image. Because the tilted image sensor may provide a gradient of focal distances across a single image, neighboring images may capture the same object or location at different image locations having different best focus distances.
- the captured images may be processed in order to extract textual information from objects in the image. Because the images span a variety of focus distances and have been compensated for motion as described above, traditional machine-reading methods may be applied.
- FIG. 1 is a functional diagram of a traditional image capture system.
- Image capture system 110 may have lens 120 and an image sensor 130 arranged parallel to one another. As a result, the plane of focus 140 is also parallel to the lens.
- An image capture system may utilize an image sensor installed on a tilt in relation to a lens rather than parallel to the lens in order to improve the quality of images captured by the image capture system while the vehicle is in motion.
- an image capture system 210 may have a lens 220 and an image sensor 230.
- the image sensor 230 may be tilted at an angle, with a first edge slightly closer to the lens than a second edge opposite the first edge; the first edge focuses best on distant objects, while the second edge focuses best on near objects.
- lens 220 may have a focal length of 0.010 m (10 mm). With a far object distance of infinity, the edge of the image sensor 230 nearer the lens may be placed 0.010 m behind the lens plane; with a near object distance of 4 m, the opposite edge of the image sensor 230 may be placed 0.010025 m behind the lens plane.
- This slight tilt of 25 microns (0.000025 m) from one edge to the other corresponds to an angle of 2.5 milliradians (about 0.14 degree) if the image sensor edges are 10 mm apart.
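The edge placements above follow directly from the thin-lens equation. The sketch below is illustrative only; it reproduces the 10 mm / 4 m numbers and the resulting tilt of roughly 2.5 mrad:

```python
import math

def image_distance(focal_length, object_distance):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image
    distance d_i; an object at infinity focuses at d_i = f."""
    if math.isinf(object_distance):
        return focal_length
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

f = 0.010                               # 10 mm focal length
d_far = image_distance(f, math.inf)     # edge focused at infinity: 0.010 m
d_near = image_distance(f, 4.0)         # edge focused at 4 m: ~0.010025 m

offset = d_near - d_far                 # ~25 microns edge to edge
tilt = math.atan2(offset, 0.010)        # ~2.5 mrad for edges 10 mm apart
```

The computed tilt matches the 2.5 milliradian (about 0.14 degree) figure in the text.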
- the first edge of the image sensor may be focused on objects farther away from the camera than objects on which the second edge of the image sensor 230 is focused, such that objects that appear in different positions in more than one image may be in better focus in one than in another.
- the specific amount of tilt may be determined to be that which best covers the range of object distances to be imaged.
- the direction of tilt may be about an axis normal to the direction in which the vehicle is traveling, such as a vertical axis, but may be in any other direction as well.
- the image sensor 230 may be a rolling-shutter image sensor with rows oriented vertically, or in portrait mode, and readout times progressing from back to front, or vice versa.
- the rolling shutter reads out an image starting at one edge of the image sensor and progressing to the opposite edge, such that different portions of the image are read at slightly different times.
- the image sensor may also be a collection of pixel photosensors in a single array on one chip or on a plurality of chips.
- the system may utilize a plurality of image sensors arranged at an angle behind the lens. An image sensor at a first edge of the lens may be a smaller distance from the lens than an image sensor at a second edge of the lens. Each of the plurality of image sensors may be tilted as described above.
- the plurality of image sensors may be configured to behave as a single rolling-shutter image sensor; in other words, expose and read out images starting at one edge of the lens and progressing to the opposite edge of the lens.
- Each image sensor may be a rolling-shutter image sensor, but not necessarily.
- FIG. 3 is a pictorial diagram of the image capture system of FIG. 2.
- image capture system 210 may have one or more controllers 310 .
- the one or more controllers 310 may control operations of the image capture system.
- the one or more controllers 310 may cause the image capture system 210 to move or capture an image.
- the one or more controllers may also control components of the image capture system 210 , such as lens 220 or image sensors 230 , individually.
- the image capture system may include a memory storing data and instructions that may be executed to operate the system.
- the system may be configured to rotate on an axis normal to the direction in which the vehicle is traveling and normal to the direction in which the camera is pointing in order to compensate for vehicle movement.
- the rotation may be a torsional oscillation and may also be sinusoidal.
- the system may rotate back and forth on the axis, swinging the lens in the direction of travel and then opposite the direction of travel.
- image capture system 210 may rotate back and forth about axis 420, covering a small angular distance.
- the orientation of the image sensor relative to the lens may remain fixed.
- the image sensor may rotate together with the lens.
- the system may rotate back and forth within 10 degrees of angular distance.
- the system may therefore compensate for displacement of the system in the period of time in which the system is rotating opposite the direction of travel.
- the system may compensate for image motion across the sensor by rotating in the direction of image motion.
- the system may be configured to capture images when the system is rotating opposite the direction of displacement (or in the direction of image motion) and best compensating for the displacement.
- FIGS. 5A and 5B show the angular velocity and camera rotation angle versus time through a cycle of image capture for an image capture system calibrated to focus on object distances of 4 meters through infinity, according to example embodiments of the invention.
- FIG. 5A graphically depicts an example of the rotation of the image capture system as angular velocity over time shown in solid line 510 .
- Dotted line 512 represents the amount of rotation in radians.
- Line 514 represents the exposure and readout interval for an image, and line segment 516 represents the distance with the best motion compensation based on the tilt of the image sensor and the angular velocity of the system.
- the + marks 518 represent the ideal angular velocity to compensate for velocity of travel at 4 meter and 8 meter distances.
- the image capture rate is 8 frames per second (fps) with readout in 1/48 second, and the vehicle velocity is 10 m/s. Therefore, at 8 fps, the system images during 1/6 of a sinusoidal motion, highlighted by the bold line 520, where the system rotation best compensates for distances ranging from 4 meters to infinity.
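The 1/6 figure is simply the readout interval divided by the frame interval, and the same numbers give the 1.25 m image spacing mentioned later in the description. A quick illustrative check:

```python
fps = 8.0                 # image capture rate, frames per second
readout = 1.0 / 48.0      # exposure-and-readout interval, seconds
v = 10.0                  # vehicle velocity, m/s

frame_interval = 1.0 / fps
imaging_fraction = readout / frame_interval  # fraction of each cycle spent imaging
spacing = v * frame_interval                 # meters traveled between frames
# imaging_fraction -> 1/6, spacing -> 1.25 m
```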
- FIG. 5B graphically depicts another example of the rotation of the image capture system as angular velocity over time shown in solid line 530 .
- Dotted line 532 represents the amount of rotation in radians.
- Line 534 represents the exposure and readout interval for an image, and line segment 536 represents the distance with the best motion compensation based on the tilt of the image sensor and the angular velocity of the system.
- the + marks 538 represent the ideal angular velocity to compensate for velocity of travel at 4 meter and 8 meter distances.
- the image capture rate is 12 frames per second with readout in 1/48 second, and vehicle velocity at 10 m/s.
- the higher frame rate is obtained by using a motion closer to a sawtooth waveform (a slower ramp in one direction than the other, using harmonics up to the fifth of the 12 Hz cycle).
- the system images during 1/4 of the cycle, as highlighted by bold line 540.
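The patent specifies only that harmonics up to the fifth of the 12 Hz cycle are used; it gives no coefficients. As one hedged illustration, a band-limited sawtooth built from the standard Fourier series exhibits the described asymmetry, a slower ramp in one direction and a faster return:

```python
import math

def bandlimited_sawtooth(t, f0=12.0, harmonics=5):
    """Sawtooth-like rotation built from harmonics 1..5 of a 12 Hz cycle.
    The standard sawtooth Fourier amplitudes are an assumption here;
    the patent does not give coefficients."""
    return (2.0 / math.pi) * sum(
        ((-1) ** (k + 1)) * math.sin(2.0 * math.pi * k * f0 * t) / k
        for k in range(1, harmonics + 1))

T = 1.0 / 12.0
eps = 1e-5
# slope of the slow ramp (near t = 0) vs. the fast return (near t = T/2)
slow_ramp = (bandlimited_sawtooth(eps) - bandlimited_sawtooth(-eps)) / (2 * eps)
fast_return = (bandlimited_sawtooth(T / 2 + eps)
               - bandlimited_sawtooth(T / 2 - eps)) / (2 * eps)
```

With five harmonics the return swing is about five times steeper than the ramp, consistent with imaging during the slow portion of the cycle.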
- the system may oscillate side-to-side as a translational oscillator in a plane parallel to the direction in which the vehicle is traveling.
- the image sensor may be oscillated relative to the lens to compensate image motion.
- the combination of the tilted image sensor and rotating system may allow the system to compensate for varied angular velocities of multiple planes of focus.
- objects close to the vehicle have a higher angular velocity than objects farther away from the vehicle.
- at a vehicle speed of 10 m/s, the angular velocity of an object 4 meters away is 2.5 rad/s, or 2.5 mrad/ms.
- Objects at infinity have zero angular velocity.
- the angular velocity of an object changes in a linear fashion in relation to the reciprocal distance of the object from the system.
- the angular velocity of the first edge of the image sensor may be faster than the angular velocity of the second edge.
- the edge of the image sensor with the near focus may have the faster angular velocity.
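The relationship described above is simply omega = v/d, so angular velocity is linear in the reciprocal distance 1/d. A small check using the 10 m/s vehicle speed from the surrounding examples:

```python
v = 10.0  # vehicle velocity, m/s (the speed used in the examples above)

def angular_velocity(distance):
    """Angular rate, in rad/s, of a stationary object at the given
    distance as seen from the moving vehicle: omega = v / d."""
    return v / distance

w4 = angular_velocity(4.0)              # 2.5 rad/s for an object at 4 m
w8 = angular_velocity(8.0)              # 1.25 rad/s at 8 m
w_inf = angular_velocity(float("inf"))  # 0 rad/s at infinity
# omega = v * (1/d): doubling the reciprocal distance doubles the rate
```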
- the rotation of the system may be synchronized with a rolling shutter exposure and readout, and the amount of rotational velocity may be adjusted to be proportional to the vehicle velocity, such that the image motion on the image sensor is approximately stopped for objects that are in focus in each region of the image sensor.
- the system may capture images at a high enough rate that objects of interest appear in several successive images, but are clearer in one image or another depending on their distances and on their positions within the image. In other words, objects may appear in more than one image and at different points in different images.
- the system may capture one shot per oscillation cycle, each oscillation being within 10 degrees of rotation. This 1/6-cycle exposure and readout may correspond to a frame interval equal to about 6 times the readout time, or 1/6 of the image sensor's maximum frame rate.
- a total of 8 images may be captured per second, or an image every 1.25 meters if the vehicle is traveling at 10 m/s, which may capture multiple views of an object that is more than a few meters from the vehicle (depending on the field of view of the camera).
- neighboring images may capture the same object or location at different image locations having different best focus distances. For example, an object at 8 m distance might appear in 3 or more different images (depending on the field of view), and in at least one of those the object appears near the center of the image, where the best focus distance is 8 m (halfway between the infinity and 4 m edges in terms of reciprocal distance). If the motion compensation is adjusted to no rotation at the infinity edge and compensation for 10 m/s vehicle motion for objects at 4 m distance (rotation of 2.5 radians per second) at the other edge, varying approximately linearly between those, then it will approximately cancel the motion blur for the object at 8 m distance near the center of the image. Similarly, other objects at distances between 4 m and infinity will be both well focused and well motion compensated at the approximately corresponding image locations, so that if they appear in several images they will be sharp in at least one.
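The interpolation in the example above can be checked numerically. The sketch assumes the best-focus reciprocal distance runs linearly across the sensor from 0 (infinity edge) to 1/4 per meter (4 m edge), with the compensation rate ramping linearly from 0 to 2.5 rad/s over the same span:

```python
v = 10.0       # vehicle velocity, m/s
d_near = 4.0   # best-focus distance at the near edge, m

def best_focus_recip(x):
    """Best-focus reciprocal distance (1/d) at normalized sensor position x,
    where x = 0 is the infinity edge and x = 1 is the 4 m edge."""
    return x / d_near

def compensation_rate(x):
    """Linear compensation ramp: 0 rad/s at x = 0, v/d_near = 2.5 rad/s at x = 1."""
    return (v / d_near) * x

# An object at 8 m is in best focus where 1/d = 1/8, i.e. at x = 0.5,
# and there the ramp supplies exactly its angular rate v/d = 1.25 rad/s.
x8 = (1.0 / 8.0) * d_near
```

Because both the best-focus reciprocal distance and the compensation rate are linear in sensor position, the match at 8 m holds for every distance between 4 m and infinity, which is the cancellation the passage describes.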
- image capture system 210 may be incorporated into vehicle 600 . While certain aspects of the disclosure are particularly useful in connection with specific types of vehicles, the vehicle may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, buses, recreational vehicles, etc.
- the vehicle may have one or more computing devices, such as computing device 610 containing one or more processors 620 , memory 630 and other components typically present in general purpose computing devices.
- Image capture system 210 may be connected to computing device 610 and mounted onto vehicle 600 .
- the memory 630 stores information accessible by the one or more processors 620 , including data 632 and instructions 634 that may be executed or otherwise used by the processor 620 .
- the one or more processors 620 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor.
- the memory 630 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
- the data 632 may be retrieved, stored or modified by processor 620 in accordance with the instructions 634 .
- the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files.
- the data may also be formatted in any computing device-readable format.
- the instructions 634 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor.
- the instructions may be stored as computing device code on the computing device-readable medium.
- the terms “instructions” and “programs” may be used interchangeably herein.
- the instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
- FIG. 6 functionally illustrates the processor, memory, and other elements of computing device 610 as being within the same block, it will be understood by those of ordinary skill in the art that the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing.
- memory may be a hard drive or other storage media located in a housing different from that of computing device 610 . Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.
- Computing device 610 may have all of the components normally used in connection with a computing device such as the processor and memory described above as well as a user input 650 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device that is operable to display information).
- the vehicle includes an internal electronic display 652 as well as one or more speakers 654 to provide information or audio visual experiences.
- internal electronic display 652 may be located within a cabin of vehicle 600 and may be used by computing device 610 to provide information to passengers within the vehicle 600 .
- Computing device 610 may also include one or more wireless network connections 654 to facilitate communication with other computing devices, such as the client computing devices and server computing devices described in detail below.
- the wireless network connections may include short range communication protocols such as Bluetooth, Bluetooth low energy (LE), cellular connections, as well as various configurations and protocols including the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing.
- computing device 610 may also be in communication with one or more vehicle operation systems 660 of vehicle 600 .
- Vehicle operation systems 660 may include systems involved in operations of the vehicle, such as one or more of deceleration, acceleration, steering, signaling, navigation, positioning, detection, etc. Although one or more vehicle operation systems 660 are shown as external to computing device 610 , in actuality, the systems may also be incorporated into computing device 610 .
- Image capture system 210 may also receive or transfer information, such as captured images, to and from other computing devices.
- FIGS. 7 and 8 are pictorial and functional diagrams, respectively, of an example system 700 that includes a plurality of computing devices 710 , 720 , 730 , 740 and a storage system 750 connected via a network 760 .
- System 700 also includes image capture system 210 . Although only a few computing devices are depicted for simplicity, a typical system may include significantly more. Additionally or alternatively, one or more vehicles such as vehicle 600 may be included in system 700 .
- each of computing devices 710 , 720 , 730 , 740 may include one or more processors, memory, data and instructions. Such processors, memories, data and instructions may be configured similarly to one or more processors 620 , memory 630 , data 632 , and instructions 634 of computing device 610 .
- the network 760 may include various configurations and protocols including short range communication protocols such as Bluetooth, Bluetooth LE, the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing.
- Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces.
- server computing devices 710 may use network 760 to transmit and present information to a user, such as user 722, 732, 742, on a display, such as displays 724, 734, 744 of computing devices 720, 730, 740.
- computing devices 720 , 730 , 740 may be considered client computing devices.
- each client computing device 720, 730, 740 may be a personal computing device intended for use by a user 722, 732, 742, and have all of the components normally used in connection with a personal computing device including one or more processors (e.g., a central processing unit (CPU)), memory (e.g., RAM and internal hard drives) storing data and instructions, a display such as displays 724, 734, 744 (e.g., a monitor having a screen, a touch-screen, a projector, a television, or other device that is operable to display information), and user input devices 726, 736, 746 (e.g., a mouse, keyboard, touch screen or microphone).
- the client computing devices may also include a camera for recording video streams, speakers, a network interface device, and all of the components used for connecting these elements to one another.
- client computing devices 720 and 730 may also include components 728 and 738 for determining the position and orientation of client computing devices.
- these components may include a GPS receiver to determine the device's latitude, longitude and/or altitude as well as an accelerometer, gyroscope or another direction/speed detection device.
- Although client computing devices 720, 730, and 740 may each comprise a full-sized personal computing device, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the Internet.
- client computing device 720 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook that is capable of obtaining information via the Internet or other networks.
- client computing device 730 may be a wearable computing system, shown as a head-mounted computing system in FIG. 7 .
- the user may input information using a small keyboard, a keypad, microphone, using visual signals with a camera, or a touch screen.
- Storage system 750 may store various types of information that may be retrieved or otherwise accessed by a server computing device, such as one or more server computing devices 710 , in order to perform some or all of the features described herein.
- the storage system 750 may store images captured by image capture system 210 .
- Information associated with images such as location information and pose information may be stored in association with the images.
- storage system 750 can be of any type of computerized storage capable of storing information accessible by the server computing devices 710 , such as a hard-drive, memory card, ROM, RAM, DVD, CD-ROM, write-capable, and read-only memories.
- storage system 750 may include a distributed storage system where data is stored on a plurality of different storage devices which may be physically located at the same or different geographic locations.
- Storage system 750 may be connected to the computing devices via the network 760 as shown in FIG. 7 and/or may be directly connected to or incorporated into any of the computing devices 710 , 720 , 730 , 740 , image capture device 210 , etc.
- FIG. 9 is an example flow diagram 900 in accordance with some of the aspects described above that may be performed by one or more controllers in the system.
- one or more image sensors may be configured at an angle behind a lens of an image capture system at block 910 .
- the image capture system may then be oscillated based on a direction of motion of the image capture system at block 920 .
- the image capture system may be mounted on a vehicle, in which case, the direction the vehicle is traveling would be the direction of motion of the image capture system.
- the image capture system may expose and read images starting at a first edge of a lens in the system and progressing to a second edge of the lens opposite the first edge.
- a sensor of the image capture system may be a rolling-shutter image sensor.
- the captured images may be processed in order to extract textual information from objects in the image. Having been generated from images that have a variety of focus distances and have been compensated for motion as described above, traditional machine-reading methods may be applied. For example, words on signs may be read through use of OCR.
- the captured images may be processed at the image capture device 210 .
- the captured images may be sent via network 760 to one or more computing devices 710 , 720 , 730 , 740 and processed at the one or more computing devices.
- the captured images and/or the extracted information may be sent from image capture system 210 or one or more computing devices 710 , 720 , 730 , 740 via the network 760 to storage system 750 .
- a captured image or extracted information may be retrieved from storage system 750 .
- a composite image may be generated by stitching together select portions of the captured images that are most in focus, and then processed in order to extract information from the image.
- the composite image may be a panoramic image.
- portions of different images capturing a particular location may be compared.
- the portion of a captured image that depicts a particular location the clearest or at a location closest to the focal distance may be selected to be included in the composite image.
- the system may use LIDAR to detect the distance between the particular location and the vehicle.
- the portion of a captured image in which the particular location is captured at the image sensor portion focused on a distance matching the detected distance may be selected to be included in the composite image.
- the generated composite image may then be processed to extract textual information from objects in the image.
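As a sketch of this selection step, assuming portions are tracked as (image id, focus distance) pairs — the `pick_portion` helper and the candidate values are hypothetical, and the detected distance is assumed to come from the LIDAR measurement described above:

```python
# Choose which overlapping portion goes into the composite: each portion
# carries the focus distance of the tilted-sensor region that captured it,
# and the portion focused nearest the LIDAR-detected distance is selected.
def pick_portion(candidates, detected_distance):
    """candidates: list of (image_id, focus_distance_m) for one location."""
    return min(candidates, key=lambda c: abs(c[1] - detected_distance))

candidates = [("img_1", 4.0), ("img_2", 8.0), ("img_3", 20.0)]
best = pick_portion(candidates, detected_distance=7.2)  # LIDAR reports 7.2 m
# best is ("img_2", 8.0), the portion focused closest to the detected range
```

The same helper could instead rank candidates by a sharpness score when no distance measurement is available, matching the "depicts a particular location the clearest" alternative above.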
- the features described above allow for the capture of images that are in focus at a wide range of distances when traveling at a high velocity, such as when driving in a car. While each of a tilted image sensor, rotation or oscillation, or overlapping images provides individual advantages over a typical image capture system as discussed above, the combination of these may provide the best source of captured images for generating composite images as described above. As a result, the composite images may contain more information than what is typically captured in an image taken from a moving vehicle. For example, the composite images may be used in post-processing to extract textual information like words on signs that would otherwise be too blurry or out-of-focus to read if captured by a typical image capture system.
- the tilted image sensor and the motion of the image sensor allow for the efficient capture of images in a single pass with one camera, where the images are captured at a variety of focus distances and degrees of motion compensation.
- the features disclosed above may allow for the use of wider aperture and slower shutter speeds, enabling the image capture system to get enough light to make clean images at a high velocity.
Abstract
Description
- Defocus blur, motion blur, and noise due to short exposures at low light levels often limit cameras capturing images while in motion. In addition, cameras typically focus on a shallow range of depths.
- Aspects of the disclosure provide an image capture system. The image capture system includes a lens that produces an image; an image sensor having a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that the first edge focuses on more distant objects; an image stabilizer configured to provide a time-varying compensation of image motion at the image sensor; and a controller configured to (1) operate the image capture system in a repeating cycle, wherein during each cycle the controller operates the sensor to expose and read out an image progressively from one edge to the opposite edge, and (2) operate the image stabilizer to provide an image motion compensation that varies in time such that the image motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- In one example, the image stabilizer includes an oscillation of the image capture system that is a rotational oscillation about an axis normal to a direction of the image motion. In this example, the rotational oscillation of the image capture system is about a vertical axis. Also in this example, the placement of the image sensor relative to the lens is unaffected by the oscillation of the image capture system.
- In a further example, the controller is also configured to operate the image stabilizer to provide an amount of image motion compensation that compensates for the image motion for objects at a first distance when exposing and reading the first edge of the sensor and for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance. In this example, the controller is also configured to operate the image stabilizer to provide an amount of image motion compensation that compensates for the image motion for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges.
- In yet another example, the controller is also configured to process the composite image to extract textual information from objects in the composite image. In a further example, the image sensor is a rolling-shutter image sensor with rows oriented vertically and readout times progressing from back to front. In another example, the image stabilizer compensation comprises a rotational oscillation of the image sensor about an axis normal to a direction of the image motion. In this example, the oscillation of the one or more sensors includes a rotation of 10 degrees or less.
- Other aspects of the disclosure provide a method. The method includes operating, using one or more controllers, an image capture system in a repeating cycle to capture a plurality of images, the image capture system having a lens, an image sensor, and an image stabilizer, wherein the image sensor has a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that it focuses on more distant objects; during each cycle, operating, using the one or more controllers, a sensor to expose and read out an image progressively from one edge to the opposite edge, and operating, using the one or more controllers, the image stabilizer to provide a time-varying image motion compensation such that the motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- In one example, the image stabilizer comprises an oscillation of the image capture system that is a rotational oscillation about an axis normal to a direction of the image motion. In this example, the rotational oscillation of the image capture system is about a vertical axis. Also in this example, the angle of the image sensor is fixed relative to the lens.
- In another example, operating the image stabilizer to provide a time-varying image motion compensation also includes compensating for the image motion for objects at a first distance when exposing and reading the first edge of the sensor, for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance, and for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges. In yet another example, the method also includes processing, by the one or more controllers, the plurality of images to extract textual information from objects in the images. In a further example, the image sensor is a rolling-shutter image sensor with rows oriented vertically and readout times progressing from back to front.
- Further aspects of the disclosure provide a non-transitory, computer-readable medium on which instructions are stored, the instructions, when executed by one or more controllers, cause the one or more controllers to perform a method. The method includes operating an image capture system in a repeating cycle to capture a plurality of images, the image capture system having a lens, an image sensor, and an image stabilizer, wherein the image sensor has a first edge and a second edge opposite the first edge, the first edge being placed closer to the lens such that it focuses on more distant objects; during each cycle, operating a sensor to expose and read out an image progressively from one edge to the opposite edge, and operating the image stabilizer to provide a time-varying image motion compensation such that the motion compensation is greater when exposing and reading the second edge of the sensor than when exposing and reading the first edge of the sensor.
- In one example, operating the image stabilizer to provide a time-varying image motion compensation also includes compensating for the image motion for objects at a first distance when exposing and reading the first edge of the sensor, for objects at a second distance when exposing and reading the second edge of the sensor, the first distance being greater than the second distance, and for objects between the first and second distances when exposing and reading portions of the image sensor between the first and second edges. In another example, the method also includes processing the plurality of images to extract textual information from objects in the images.
-
FIG. 1 is a functional diagram of an exemplary traditional image capture system according to aspects of the disclosure. -
FIG. 2 is a functional diagram of an example image capture system according to aspects of the disclosure. -
FIG. 3 is a pictorial diagram of the image capture system of FIG. 2 in accordance with aspects of the disclosure. -
FIG. 4 is a functional diagram of an image capture system according to aspects of the disclosure. -
FIG. 5A is a graph representing an example operation of the image capture system according to aspects of the disclosure. -
FIG. 5B is a graph representing an example operation of the image capture system according to aspects of the disclosure. -
FIG. 6 is a functional diagram of an example system in accordance with an exemplary embodiment. -
FIG. 7 is a functional diagram of an example system in accordance with an exemplary embodiment. -
FIG. 8 is a pictorial diagram of the system of FIG. 7 in accordance with aspects of the disclosure. -
FIG. 9 is a flow diagram of an example method according to aspects of the disclosure. - The technology relates to an image capture system that is able to capture images having a range of different distances in focus while the vehicle is in motion. The image capture system may be mounted or mountable on a vehicle.
- Typical camera systems may utilize image sensors oriented parallel to a plane of a lens. In systems in which the image sensors are parallel to the lens, the distance at which objects are in focus is the same across the image sensor. These systems may utilize camera stabilizer systems, liquid wedge prism lenses, sensor shift, or post-processing adjustments in order to attempt to compensate for small movements and shakes due to a user's hand movement. However, when attempting to adapt to the movements of a vehicle, these techniques have well-understood limits, making them less effective at addressing quality issues such as defocus blur, motion blur, and noise due to short exposures at low light levels. When such images are blurry or noisy, it may be difficult to recognize certain features in images, such as text on signs, when using techniques such as optical character recognition (OCR). As a result, in order to capture the variety of shots required for OCR, typical camera systems may require either multiple passes or multiple cameras.
- In order to improve the quality of images captured by the image capture system while the vehicle is in motion, an image capture system may utilize an image sensor installed on a tilt in relation to a lens rather than parallel to the lens. The image sensor in the image capture system may be tilted at an angle, with a first edge slightly closer to the lens than a second edge opposite the first edge; the first edge focuses best on distant objects, while the second edge focuses best on near objects. The amount of tilt required to cover a range of object distances is approximately linear in reciprocal object distance: i = f + f²/o, where image distance i (from image sensor to lens plane) is slightly more than focal length f, by an amount proportional to the reciprocal of object distance o. As a result, the first edge of the image sensor may be focused on objects farther away from the camera than objects on which the second edge of the image sensor is focused, such that objects that appear in different positions in more than one image may be in better focus in one than in another.
- The image sensor may be a rolling-shutter image sensor with rows oriented vertically, or in portrait mode, and readout times progressing from back to front, or vice versa. The rolling shutter reads out an image starting at one edge of the image sensor and progressing to the opposite edge, such that different portions of the image are read at slightly different times. By applying a time-varying image stabilization related to vehicle velocity, the motion blur due to vehicle motion may be compensated for objects at different distances in different portions of the image. Image stabilization may be applied by various known means, including moving the image sensor behind the lens, rotating the whole camera, or moving the image using a liquid-wedge prism.
- The system may rotate on an axis normal to the direction in which the vehicle is traveling and normal to the direction in which the camera is pointing in order to compensate for vehicle movement; for example, a vertical axis if the camera is looking to the side of a moving car. The rotation may be a torsional oscillation and may also be sinusoidal or any other type of oscillation. In other words, the system may rotate back and forth on the axis, swinging the lens in the direction of travel and then opposite the direction of travel. During oscillation of the system, the orientation of the image sensor relative to the lens may remain fixed. When rotating opposite the direction of travel, the system may compensate for vehicle movement. This compensation may allow the system to capture clearer images of stationary objects that would otherwise be blurry due to the vehicle movement. Thus, to obtain the most images of higher quality, the system may be set up to only capture images when the system is rotating opposite the direction of vehicle travel.
- The combination of the tilted image sensor and rotating system may allow the system to compensate for varied angular velocities of multiple planes of focus. By nature of being a short distance away from a moving vehicle, objects close to the vehicle have a higher angular velocity than objects farther away from the vehicle. Objects at infinity have zero angular velocity. The angular velocity of an object changes in a linear fashion in relation to the reciprocal distance of the object from the system. When the tilted image sensor is rotated, the angular velocity of the first edge of the image sensor may be faster than the angular velocity of the second edge. This arrangement may compensate for the difference in angular velocity at different distances away from the vehicle and allow the system to capture portions of the image that are both well focused and well motion compensated for objects at different distances in different parts of the image.
- To further compensate for the difference in angular velocity, the rotation of the system may be synchronized with a rolling shutter exposure and readout, and the amount of rotational velocity may be adjusted to be proportional to the vehicle velocity, such that the image motion on the image sensor is approximately stopped for objects that are in focus in each region of the image sensor.
- The system may capture images at a high enough rate that objects of interest appear in several successive images, but are clearer or more in focus in one image or another depending on their distances and on their positions within the image. Because the tilted image sensor may provide a gradient of focal distances across a single image, neighboring images may capture the same object or location at different image locations having different best focus distances.
- The captured images may be processed in order to extract textual information from objects in the image. Having been generated from images that have a variety of focus distances and have been compensated for motion as described above, traditional machine-reading methods may be applied.
-
FIG. 1 is a functional diagram of a traditional image capture system. Image capture system 110 may have lens 120 and an image sensor 130 arranged parallel to one another. As a result, the plane of focus 140 is also parallel to the lens. - An image capture system may utilize an image sensor installed on a tilt in relation to a lens rather than parallel to the lens in order to improve the quality of images captured by the image capture system while the vehicle is in motion. As shown in
FIG. 2 , an image capture system 210 may have a lens 220 and image sensor 230. The image sensor 230 may be tilted at an angle α, with a first edge slightly closer to the lens than a second edge opposite the first edge; the first edge focuses best on distant objects, while the second edge focuses best on near objects. The amount of tilt required to cover a range of object distances is approximately linear in reciprocal object distance: i = f + f²/o, where image distance i (from image sensor to lens plane) is slightly more than focal length f, by an amount proportional to the reciprocal of object distance o. - For example,
lens 220 may have a focal length of 0.010 m (10 mm). With a far object distance of infinity, the near edge of the image sensor 230 may be placed 0.010 m behind the lens plane; with a near object distance of 4 m, the opposite edge of the image sensor 230 may be placed 0.010025 m behind the lens plane. This slight tilt of 25 microns (0.000025 m) from one edge to the other corresponds to an angle of 2.5 milliradians (about 0.14 degree) if the image sensor edges are 10 mm apart. As a result, the first edge of the image sensor may be focused on objects farther away from the camera than objects on which the second edge of the image sensor 230 is focused, such that objects that appear in different positions in more than one image may be in better focus in one than in another. - The specific amount of tilt may be determined to be that which best covers the range of object distances to be imaged. As noted above, the direction of tilt may be about an axis normal to the direction in which the vehicle is traveling, such as a vertical axis, but may be in any other direction as well.
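This placement arithmetic can be checked directly from the relation i = f + f²/o; a minimal sketch using the figures from the text:

```python
# Image distance behind the lens plane for an object at distance o,
# from the relation i = f + f^2/o quoted in the text.
def image_distance(f, o):
    return f + f * f / o

f = 0.010                                    # 10 mm focal length
far_edge = image_distance(f, float("inf"))   # edge focused at infinity: 0.010 m
near_edge = image_distance(f, 4.0)           # edge focused at 4 m: 0.010025 m
tilt = near_edge - far_edge                  # 25 microns across the sensor
angle_mrad = tilt / 0.010 * 1000             # 2.5 mrad over a 10 mm wide sensor
```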
- The
image sensor 230 may be a rolling-shutter image sensor with rows oriented vertically, or in portrait mode, and readout times progressing from back to front, or vice versa. The rolling shutter reads out an image starting at one edge of the image sensor and progressing to the opposite edge, such that different portions of the image are read at slightly different times. The image sensor may also be a collection of pixel photosensors in a single array on one chip or on a plurality of chips. Additionally or alternatively, the system may utilize a plurality of image sensors arranged on an angle behind the lens. An image sensor at a first edge of the lens may be a smaller distance from the lens than an image sensor at a second edge of the lens. Each of the plurality of image sensors may be tilted as described above. The plurality of image sensors may be configured to behave as a single rolling-shutter image sensor; in other words, expose and read out images starting at one edge of the lens and progressing to the opposite edge of the lens. Each image sensor may be a rolling-shutter image sensor, but not necessarily. -
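As a sketch of the readout progression (the row count here is illustrative; the 1/48 s readout time is the figure used in the examples below):

```python
# Rolling-shutter readout: rows are read sequentially, so each row sees
# the scene at a slightly different time across the readout interval.
def row_read_times(n_rows, readout_s, start_s=0.0):
    return [start_s + i * readout_s / (n_rows - 1) for i in range(n_rows)]

times = row_read_times(n_rows=5, readout_s=1 / 48)
# first row at t = 0, last row 1/48 s later; neighbors differ by 1/192 s
```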
FIG. 3 is a pictorial diagram of the image capture system of FIG. 2 . In addition to lens 220 and one or more image sensors 230, image capture system 210 may have one or more controllers 310. The one or more controllers 310 may control operations of the image capture system. For example, the one or more controllers 310 may cause the image capture system 210 to move or capture an image. The one or more controllers may also control components of the image capture system 210, such as lens 220 or image sensors 230, individually. In some examples, the image capture system may include a memory storing data and instructions that may be executed to operate the system. - In addition to having a tilted image sensor, the system may be configured to rotate on an axis normal to the direction in which the vehicle is traveling and normal to the direction in which the camera is pointing in order to compensate for vehicle movement. The rotation may be a torsional oscillation and may also be sinusoidal. In other words, the system may rotate back and forth on the axis, swinging the lens in the direction of travel and then opposite the direction of travel. As shown in
FIG. 4 , image capture system 210 may rotate back and forth about axis 420, covering angular distance β. - During oscillation of the system, the orientation of the image sensor relative to the lens may remain fixed. In other words, the image sensor may rotate together with the lens. For example, the system may rotate back and forth within 10 degrees of angular distance. The system may therefore compensate for displacement of the system in the period of time in which the system is rotating opposite the direction of travel. Put another way, the system may compensate for image motion across the sensor by rotating in the direction of image motion. The system may be configured to capture images when the system is rotating opposite the direction of displacement (or in the direction of image motion) and best compensating for the displacement.
-
FIGS. 5A and 5B show the angular velocity and camera rotation angle versus time through a cycle of image capture for an image capture system calibrated to focus on object distances of 4 meters through infinity, according to example embodiments of the invention. FIG. 5A graphically depicts an example of the rotation of the image capture system as angular velocity over time shown in solid line 510. Dotted line 512 represents the amount of rotation in radians. Line 514 represents the exposure and readout interval for an image, and line segment 516 represents the distance with the best motion compensation based on the tilt of the image sensor and the angular velocity of the system. The + marks 518 represent the ideal angular velocity to compensate for velocity of travel at 4 meter and 8 meter distances. In FIG. 5A , the image capture rate is 8 frames per second (fps) with readout in 1/48 second, and vehicle velocity at 10 m/s. Therefore, at 8 fps, the system images during ⅙ of a sinusoidal motion, highlighted by the bold line 520, where the system rotation best compensates for distances ranging from 4 meters to infinity. -
FIG. 5B graphically depicts another example of the rotation of the image capture system as angular velocity over time shown in solid line 530. Dotted line 532 represents the amount of rotation in radians. Line 534 represents the exposure and readout interval for an image, and line segment 536 represents the distance with the best motion compensation based on the tilt of the image sensor and the angular velocity of the system. The + marks 538 represent the ideal angular velocity to compensate for velocity of travel at 4 meter and 8 meter distances. In FIG. 5B , the image capture rate is 12 frames per second with readout in 1/48 second, and vehicle velocity at 10 m/s. The higher frame rate is obtained by using a motion closer to a sawtooth waveform (a slower ramp in one direction than the other, using up to the fifth harmonic of the 12 Hz cycle). At a rate of 12 fps, the system images during ¼ of the cycle, as highlighted by bold line 540.
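The ⅙ and ¼ fractions quoted for FIGS. 5A and 5B follow directly from the frame rate and the fixed readout time; a quick check using the figures from the text:

```python
# Fraction of each oscillation cycle spent exposing and reading out:
# at frame rate fps with a fixed per-frame readout time, the readout
# occupies fps * readout_s of every cycle.
def duty_fraction(fps, readout_s):
    return fps * readout_s

sin_duty = duty_fraction(8, 1 / 48)        # sinusoidal motion: 1/6 of cycle
sawtooth_duty = duty_fraction(12, 1 / 48)  # sawtooth-like motion: 1/4 of cycle
```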
- The combination of the tilted image sensor and rotating system may allow the system to compensate for varied angular velocities of multiple planes of focus. By nature of being a short distance away from a moving vehicle, objects close to the vehicle have a higher angular velocity than objects farther away from the vehicle. For example, for a vehicle velocity of 10 m/s, the angular velocity of an object 4 meters away is 2.5 rad/s, or 2.5 mrad/ms. Objects at infinity have zero angular velocity. The angular velocity of an object changes linearly with the reciprocal of the object's distance from the system. When the tilted image sensor is rotated, the angular velocity of the first edge of the image sensor may be faster than the angular velocity of the second edge. In other words, the edge of the image sensor with the near focus may have the faster angular velocity. This arrangement may compensate for the difference in angular velocity at different distances away from the vehicle and allow the system to capture portions of the image that are both well focused and well motion compensated for objects at different distances in different parts of the image.
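Since this relationship is simply ω = v/d for a stationary object abeam of the camera, the figures above can be checked in a few lines (a minimal sketch using the 10 m/s example; the function name is illustrative, not from the disclosure):

```python
def angular_velocity_rads(vehicle_speed_mps, distance_m):
    """Angular velocity of a stationary roadside object as seen from the
    moving vehicle at closest approach: omega = v / d, i.e. linear in
    the reciprocal distance 1/d."""
    return vehicle_speed_mps / distance_m

# Values from the text, at 10 m/s:
near = angular_velocity_rads(10.0, 4.0)           # 2.5 rad/s (2.5 mrad/ms)
mid = angular_velocity_rads(10.0, 8.0)            # 1.25 rad/s
far = angular_velocity_rads(10.0, float("inf"))   # 0.0 rad/s at infinity
```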
- To further compensate for the difference in angular velocity, the rotation of the system may be synchronized with a rolling shutter exposure and readout, and the amount of rotational velocity may be adjusted to be proportional to the vehicle velocity, such that the image motion on the image sensor is approximately stopped for objects that are in focus in each region of the image sensor.
- The system may capture images at a high enough rate that objects of interest appear in several successive images, but are clearer in one image or another depending on their distances and on their positions within the image. In other words, objects may appear in more than one image and at different points in different images. For example, the system may capture one shot per oscillation cycle, each oscillation being within 10 degrees of rotation. This ⅙ cycle exposure and readout may correspond to a frame interval equal to about 6 times the readout time, or ⅙ of the image sensor's maximum frame rate. For an image sensor capable of reading in 1/48 s, a total of 8 images may be captured per second, or an image every 1.25 meters if the vehicle is traveling at 10 m/s, which may capture multiple views of an object that is more than a few meters from the vehicle (depending on the field of view of the camera).
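The arithmetic in the paragraph above can be reproduced directly (a hedged sketch; the variable names are ours, and the frame interval follows the ⅙-cycle duty described in the text):

```python
readout_s = 1.0 / 48.0             # sensor readout time from the text
frame_interval_s = 6 * readout_s   # one shot per oscillation, 1/6-cycle duty
frame_rate_fps = 1.0 / frame_interval_s   # -> about 8 fps

vehicle_speed_mps = 10.0
spacing_m = vehicle_speed_mps / frame_rate_fps   # -> an image every 1.25 m
```

At that spacing, an object more than a few meters from the vehicle falls within the field of view of several successive captures.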
- Because the tilted image sensor may provide a gradient of focal distances across a single image, neighboring images may capture the same object or location at different image locations having different best focus distances. For example, an object at 8 m distance might appear in 3 or more different images (depending on the field of view), and in at least one of those the object may appear near the center of the image, where the best focus distance is 8 m (halfway between the infinity and 4 m edges in terms of reciprocal distance). If the motion compensation is adjusted to no rotation at the infinity edge and compensation for 10 m/s vehicle motion for objects at 4 m distance (rotation of 2.5 radians per second) at the other edge, varying approximately linearly between those, then it will approximately cancel the motion blur for the object at 8 m distance near the center of the image. Similarly, other objects at distances between 4 m and infinity will be both well focused and motion compensated at the approximately corresponding image locations, so that if they appear in several images they will be sharp in at least one.
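This linearly varying compensation can be sketched as follows (an illustration only: the normalized image-position scale, from 0 at the infinity edge to 1 at the 4 m edge, and the function names are assumptions, not from the disclosure):

```python
def compensation_rate_rads(position, vehicle_speed_mps=10.0, near_focus_m=4.0):
    """Rotation rate applied at a normalized image position: 0.0 at the
    infinity-focused edge (no rotation), v/4 at the 4 m edge, varying
    linearly. Position also indexes best focus in reciprocal distance,
    so the 8 m best-focus region sits at position 0.5."""
    return position * (vehicle_speed_mps / near_focus_m)

# An object at 8 m appears near image center (position 0.5):
center_rate = compensation_rate_rads(0.5)   # 1.25 rad/s
object_rate = 10.0 / 8.0                    # v/d for the 8 m object: 1.25 rad/s
# Equal rates -> motion blur approximately cancelled for that object.
```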
- As shown in
FIG. 6, image capture system 210 may be incorporated into vehicle 600. While certain aspects of the disclosure are particularly useful in connection with specific types of vehicles, the vehicle may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, buses, recreational vehicles, etc. The vehicle may have one or more computing devices, such as computing device 610 containing one or more processors 620, memory 630 and other components typically present in general purpose computing devices. Image capture system 210 may be connected to computing device 610 and mounted onto vehicle 600. - The
memory 630 stores information accessible by the one or more processors 620, including data 632 and instructions 634 that may be executed or otherwise used by the processor 620. The one or more processors 620 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor. The memory 630 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media. - The
data 632 may be retrieved, stored or modified by processor 620 in accordance with the instructions 634. For instance, although the claimed subject matter is not limited by any particular data structure, the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files. The data may also be formatted in any computing device-readable format. - The
instructions 634 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below. - Although
FIG. 6 functionally illustrates the processor, memory, and other elements of computing device 610 as being within the same block, it will be understood by those of ordinary skill in the art that the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. For example, memory may be a hard drive or other storage media located in a housing different from that of computing device 610. Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel. -
Computing device 610 may have all of the components normally used in connection with a computing device such as the processor and memory described above as well as a user input 650 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device that is operable to display information). In this example, the vehicle includes an internal electronic display 652 as well as one or more speakers 654 to provide information or audio visual experiences. In this regard, internal electronic display 652 may be located within a cabin of vehicle 600 and may be used by computing device 610 to provide information to passengers within the vehicle 600. -
Computing device 610 may also include one or more wireless network connections 654 to facilitate communication with other computing devices, such as the client computing devices and server computing devices described in detail below. The wireless network connections may include short range communication protocols such as Bluetooth, Bluetooth low energy (LE), cellular connections, as well as various configurations and protocols including the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. - In addition to
image capture system 210, computing device 610 may also be in communication with one or more vehicle operation systems 660 of vehicle 600. Vehicle operation systems 660 may include systems involved in operations of the vehicle, such as one or more of deceleration, acceleration, steering, signaling, navigation, positioning, detection, etc. Although one or more vehicle operation systems 660 are shown as external to computing device 610, in actuality, the systems may also be incorporated into computing device 610. -
Image capture system 210 may also receive or transfer information, such as captured images, to and from other computing devices. FIGS. 7 and 8 are pictorial and functional diagrams, respectively, of an example system 700 that includes a plurality of computing devices and a storage system 750 connected via a network 760. System 700 also includes image capture system 210. Although only a few computing devices are depicted for simplicity, a typical system may include significantly more. Additionally or alternatively, one or more vehicles such as vehicle 600 may be included in system 700. - As shown in
FIG. 8, each of the computing devices may include one or more processors, memory, data, and instructions configured similarly to the one or more processors 620, memory 630, data 632, and instructions 634 of computing device 610. - The
network 760, and intervening nodes, may include various configurations and protocols including short range communication protocols such as Bluetooth, Bluetooth LE, the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces. - In addition,
server computing devices 710 may use network 760 to transmit and present information to a user on a display of a client computing device, such as computing devices 720 and 730. In this regard, computing devices 720 and 730 may be considered client computing devices. - As shown in
FIG. 8, each client computing device may be a personal computing device intended for use by a user, with components normally used in connection with a personal computing device, such as processors, memory, user displays, and user input devices. - Although the
client computing devices client computing device 720 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook that is capable of obtaining information via the Internet or other networks. In another example,client computing device 730 may be a wearable computing system, shown as a head-mounted computing system inFIG. 7 . As an example the user may input information using a small keyboard, a keypad, microphone, using visual signals with a camera, or a touch screen. -
Storage system 750 may store various types of information that may be retrieved or otherwise accessed by a server computing device, such as one or more server computing devices 710, in order to perform some or all of the features described herein. The storage system 750 may store images captured by image capture system 210. Information associated with images, such as location information and pose information, may be stored in association with the images. - As with
memory 730, storage system 750 can be of any type of computerized storage capable of storing information accessible by the server computing devices 710, such as a hard-drive, memory card, ROM, RAM, DVD, CD-ROM, write-capable, and read-only memories. In addition, storage system 750 may include a distributed storage system where data is stored on a plurality of different storage devices which may be physically located at the same or different geographic locations. Storage system 750 may be connected to the computing devices via the network 760 as shown in FIG. 7 and/or may be directly connected to or incorporated into any of the computing devices, image capture device 210, etc. -
FIG. 9 is an example flow diagram 900 in accordance with some of the aspects described above that may be performed by one or more controllers in the system. In this example, one or more image sensors may be configured at an angle behind a lens of an image capture system at block 910. The image capture system may then be oscillated based on a direction of motion of the image capture system at block 920. For example, the image capture system may be mounted on a vehicle, in which case the direction the vehicle is traveling would be the direction of motion of the image capture system. At block 930, the image capture system may expose and read images starting at a first edge of a lens in the system and progressing to a second edge of the lens opposite the first edge. To expose and read images in such a way, a sensor of the image capture system may be a rolling-shutter image sensor. - The captured images may be processed in order to extract textual information from objects in the image. Because the images have a variety of focus distances and have been compensated for motion as described above, traditional machine-reading methods may be applied. For example, words on signs may be read through use of OCR. The captured images may be processed at the
image capture device 210. In other examples, the captured images may be sent via network 760 to one or more computing devices to be processed. The captured images may also be sent from image capture system 210 or one or more computing devices via network 760 to storage system 750. In response to a user request, a captured image or extracted information may be retrieved from storage system 750. - In some examples, a composite image may be generated by stitching together select portions of the captured images that are most in focus and processed in order to extract information from the image. The composite image may be a panoramic image. In order to select the portions of the captured images that are most in focus, portions of different images capturing a particular location may be compared. The portion of a captured image that depicts a particular location the clearest or at a location closest to the focal distance may be selected to be included in the composite image. The system may use LIDAR to detect the distance between the particular location and the vehicle. The portion of a captured image in which the particular location is captured at the image sensor portion focused on a distance matching the detected distance may be selected to be included in the composite image. The generated composite image may then be processed to extract textual information from objects in the image.
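The LIDAR-matched selection rule described above might look like the following sketch (a hedged illustration only: the candidate-record structure and function name are assumptions, and the comparison is done in reciprocal distance to match the sensor's focus gradient):

```python
def pick_sharpest_capture(candidates, lidar_distance_m):
    """Choose the capture whose focus distance at the object's image
    position best matches the LIDAR-measured distance, comparing in
    reciprocal distance (diopters)."""
    target = 0.0 if lidar_distance_m == float("inf") else 1.0 / lidar_distance_m

    def gap(candidate):
        d = candidate["focus_distance_m"]
        return abs((0.0 if d == float("inf") else 1.0 / d) - target)

    return min(candidates, key=gap)

# Hypothetical captures of the same location at three focus distances:
shots = [
    {"image_id": 0, "focus_distance_m": 4.0},
    {"image_id": 1, "focus_distance_m": 8.0},
    {"image_id": 2, "focus_distance_m": float("inf")},
]
best = pick_sharpest_capture(shots, lidar_distance_m=7.0)  # picks the 8 m shot
```

Comparing in diopters rather than meters matters here: 7 m is numerically closer to 4 m than to infinity, but in reciprocal distance it sits nearest the 8 m focus region, which is where the sensor's focus gradient would render it sharpest.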
-
- The features described above allow for the capture of images that are in focus at a wide range of distances when traveling at a high velocity, such as when driving in a car. While each of a tilted image sensor, rotation or oscillation, or overlapping images provides individual advantages over a typical image capture system as discussed above, the combination of these may provide the best source of captured images for generating composite images as described above. As a result, the composite images may contain more information than what is typically captured in an image taken from a moving vehicle. For example, the composite images may be used in post-processing to extract textual information like words on signs that would otherwise be too blurry or out-of-focus to read if captured by a typical image capture system. In addition, the tilted image sensor and the motion of the image sensor allow for the efficient capture of images in a single pass with one camera where the images are captured at a variety of focus distances and degrees of motion compensation. As such, the features disclosed above may allow for the use of wider apertures and slower shutter speeds, enabling the image capture system to get enough light to make clean images at a high velocity.
- Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims.
Claims (20)
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/832,335 US9596419B1 (en) | 2015-08-21 | 2015-08-21 | Image capture system with motion compensation |
PCT/US2016/046757 WO2017034832A1 (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
DE112016001832.0T DE112016001832T5 (en) | 2015-08-21 | 2016-08-12 | Image acquisition system with motion compensation |
GB1717735.3A GB2554579A (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
EP16757420.1A EP3275173B1 (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
DE202016007844.6U DE202016007844U1 (en) | 2015-08-21 | 2016-08-12 | Image acquisition system with motion compensation |
JP2017561645A JP6346716B1 (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
KR1020177030780A KR101887600B1 (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
AU2016312297A AU2016312297B2 (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
CN201680030439.6A CN107667520B (en) | 2015-08-21 | 2016-08-12 | Image capture system with motion compensation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/832,335 US9596419B1 (en) | 2015-08-21 | 2015-08-21 | Image capture system with motion compensation |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170054927A1 true US20170054927A1 (en) | 2017-02-23 |
US9596419B1 US9596419B1 (en) | 2017-03-14 |
Family
ID=56801808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/832,335 Active 2035-09-10 US9596419B1 (en) | 2015-08-21 | 2015-08-21 | Image capture system with motion compensation |
Country Status (9)
Country | Link |
---|---|
US (1) | US9596419B1 (en) |
EP (1) | EP3275173B1 (en) |
JP (1) | JP6346716B1 (en) |
KR (1) | KR101887600B1 (en) |
CN (1) | CN107667520B (en) |
AU (1) | AU2016312297B2 (en) |
DE (2) | DE202016007844U1 (en) |
GB (1) | GB2554579A (en) |
WO (1) | WO2017034832A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3346693B1 (en) * | 2017-01-10 | 2021-03-24 | Veoneer Sweden AB | An imaging device for a motor vehicle, and a method of mounting an imaging device in a motor vehicle |
JP7271132B2 (en) | 2018-10-26 | 2023-05-11 | キヤノン株式会社 | Imaging device and surveillance system |
US10999507B2 (en) * | 2018-10-30 | 2021-05-04 | Qualcomm Incorporated | Optical image stabilization techniques |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2899882A (en) | 1959-08-18 | Camera system for image motion compensation | ||
US1953304A (en) | 1933-05-31 | 1934-04-03 | Fairchild Aerial Camera Corp | Mount for aerial cameras and method of aerial photography |
US6351288B1 (en) | 1997-06-27 | 2002-02-26 | Eastman Kodak Company | Sensor tilt control for a digital camera |
US6304284B1 (en) | 1998-03-31 | 2001-10-16 | Intel Corporation | Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera |
DE69820871T2 (en) * | 1998-07-08 | 2004-12-16 | Hewlett-Packard Co. (N.D.Ges.D.Staates Delaware), Palo Alto | Camera with device for correcting the trapezoidal image error |
JP4433575B2 (en) * | 2000-06-16 | 2010-03-17 | 株式会社豊田中央研究所 | In-vehicle imaging device |
JP4546781B2 (en) * | 2004-07-16 | 2010-09-15 | 日本放送協会 | Imaging apparatus and color misregistration correction program |
US7493030B2 (en) * | 2005-06-24 | 2009-02-17 | Nokia Corporation | Adaptive optical plane formation with rolling shutter |
JP4789722B2 (en) * | 2006-07-13 | 2011-10-12 | Hoya株式会社 | Image blur correction device |
US8054335B2 (en) * | 2007-12-20 | 2011-11-08 | Aptina Imaging Corporation | Methods and system for digitally stabilizing video captured from rolling shutter cameras |
JP2012049892A (en) * | 2010-08-27 | 2012-03-08 | Honda Elesys Co Ltd | Imaging apparatus |
JP2015219754A (en) * | 2014-05-19 | 2015-12-07 | 株式会社リコー | Imaging device and imaging method |
-
2015
- 2015-08-21 US US14/832,335 patent/US9596419B1/en active Active
-
2016
- 2016-08-12 AU AU2016312297A patent/AU2016312297B2/en active Active
- 2016-08-12 DE DE202016007844.6U patent/DE202016007844U1/en active Active
- 2016-08-12 KR KR1020177030780A patent/KR101887600B1/en active IP Right Grant
- 2016-08-12 WO PCT/US2016/046757 patent/WO2017034832A1/en active Application Filing
- 2016-08-12 EP EP16757420.1A patent/EP3275173B1/en active Active
- 2016-08-12 DE DE112016001832.0T patent/DE112016001832T5/en not_active Ceased
- 2016-08-12 GB GB1717735.3A patent/GB2554579A/en not_active Withdrawn
- 2016-08-12 CN CN201680030439.6A patent/CN107667520B/en active Active
- 2016-08-12 JP JP2017561645A patent/JP6346716B1/en active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10681277B1 (en) * | 2019-03-07 | 2020-06-09 | Qualcomm Incorporated | Translation compensation in optical image stabilization (OIS) |
US11283999B2 (en) | 2019-03-07 | 2022-03-22 | Qualcomm Incorporated | Translation compensation in optical image stabilization (OIS) |
US12015848B2 (en) | 2021-08-10 | 2024-06-18 | Samsung Electronics Co., Ltd. | Electronic device performing image stabilization and operating method thereof |
Also Published As
Publication number | Publication date |
---|---|
EP3275173B1 (en) | 2019-04-10 |
GB2554579A (en) | 2018-04-04 |
GB201717735D0 (en) | 2017-12-13 |
AU2016312297A1 (en) | 2017-10-26 |
DE112016001832T5 (en) | 2018-01-25 |
EP3275173A1 (en) | 2018-01-31 |
JP2018518898A (en) | 2018-07-12 |
CN107667520A (en) | 2018-02-06 |
CN107667520B (en) | 2019-04-30 |
KR20170131608A (en) | 2017-11-29 |
US9596419B1 (en) | 2017-03-14 |
DE202016007844U1 (en) | 2017-01-18 |
AU2016312297B2 (en) | 2018-08-09 |
JP6346716B1 (en) | 2018-06-20 |
WO2017034832A1 (en) | 2017-03-02 |
KR101887600B1 (en) | 2018-08-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LYON, RICHARD FRANCIS;REEL/FRAME:036401/0836 Effective date: 20150824 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044097/0658 Effective date: 20170929 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |