GB2483224A - Imaging device with measurement and processing means compensating for device motion - Google Patents

Imaging device with measurement and processing means compensating for device motion Download PDF

Info

Publication number
GB2483224A
GB2483224A · GB1014253.7A · GB201014253A
Authority
GB
United Kingdom
Prior art keywords
image
data items
pixel sensor
pixel
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1014253.7A
Other versions
GB201014253D0 (en)
Inventor
Peter Richard Cronshaw
Paul Thompson
Stuart Pooley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DREAMPACT Ltd
Original Assignee
DREAMPACT Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DREAMPACT Ltd filed Critical DREAMPACT Ltd
Priority to GB1014253.7A priority Critical patent/GB2483224A/en
Publication of GB201014253D0 publication Critical patent/GB201014253D0/en
Publication of GB2483224A publication Critical patent/GB2483224A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/683Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/689Motion occurring during a rolling shutter mode
    • H04N5/23248
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control

Abstract

Imaging is carried out by obtaining image data comprising a set (e.g. a frame of display data) of pixel sensor data items representative of an image formed on an array 12 of image sensors of an imaging device 2 that moves in operation. The image data comprises a plurality of subsets (e.g. rows or columns) of pixel sensor data items, each subset of pixel sensor data items being captured using at least one image sensor of the array at a respective different time during a frame capture period. Motion or position measurements are obtained using a physical sensor 16. For each subset of pixel sensor data items, respective different position data are determined from the motion or position measurements, independently of the values of the pixel sensor data items. For each subset of pixel sensor data items, each pixel sensor data item of the subset is mapped to a respective display position using the position data determined for that subset. The mapping of each subset of pixel sensor data items is such as to at least partially compensate for motion of the imaging device or array. The physical sensor 16 may comprise three orthogonal gyroscopes or accelerometers providing measurement signals to a processor 14 for determining angular and/or translational motion of the imaging device without reference to the content of images. The imaging device may be handheld, in a projectile, or mounted on an unmanned aerial vehicle (UAV). The mapping may be such as to maintain a desired perspective, e.g. orientation, of the image on a display.

Description

INTELLECTUAL PROPERTY OFFICE. Application No. GB1014253.7, RTM Date 28 June 2011. The following terms are registered trademarks and should be read as such wherever they occur in this document: "Xilinx" and "Virtex". Intellectual Property Office is an operating name of the Patent Office. www.ipo.gov.uk

Imaging apparatus and method
Field of the invention
The present invention relates to an image processing apparatus and method, for example an image processing apparatus and method for processing image data obtained from imaging devices that move in operation, for instance projectile imaging devices, handheld imaging devices or imaging devices that are mounted on aircraft or other vehicles.
Background to the invention
A wide range of imaging devices that are intended to be used, whilst in motion, to image a scene or an object within a scene are known.
For example, imaging devices are often mounted on civilian or military aircraft or on unmanned aerial vehicles (UAVs) or other vehicles. Such imaging devices can be used for reconnaissance or surveillance purposes. The imaging devices can be subject to a wide range of rapid movements, for example due to motion of the vehicle on which they are installed. It can be difficult for an operator to monitor or identify a scene or object, or changes of the scene or object, from the resulting images produced by the imaging device due to the variations in the images caused by the motion of the device.
A common approach to stabilising images produced by mobile imaging devices that are subject to movement during use is to attempt to reduce the motion of the imaging device itself. For example, it is known to install vehicle-mounted imaging devices on physical structures such as gimbals that are intended to maintain the imaging device in a stable position and orientation, despite movement of the vehicle.
For example, US 2010/0110187 describes an airborne camera system in which the camera is mounted within a gimbal system, and measurements of heading, pitch and roll of the aircraft from aircraft gyros and calculations concerning the overflight velocity of the aircraft are used in controlling gimbals of the gimbal system in order to maintain the line of sight of the camera fixed on a target. However, such systems for stabilising images by physical stabilisation of the imaging device itself become increasingly ineffective as the size of movements and frequency of movements of a vehicle on which the device is mounted increase, due to the inertia of the imaging device itself.
In addition to the use of mobile imaging devices for reconnaissance or surveillance purposes, it is also known to mount TV or film cameras on moving platforms or objects. For example, it has been suggested to mount TV cameras on balls or other moving objects when filming sporting events. In such arrangements it is generally less important to know the exact position or orientation of an object or of the imaging device itself than is the case for reconnaissance or surveillance systems, but it is important that a reasonably stable image of an object or other point of interest, for example a goal, is provided to the viewer. One known approach is to use motion estimation techniques based upon tracking the content of images themselves to estimate the motion of the device, and then to compensate the images in an attempt to provide an acceptably stable image to the viewer. Such techniques usually require the use of image recognition techniques to identify and track common objects between different frames. WO 2006/033061 describes a technique that is based upon such image tracking techniques, and is used in a system in which several cameras are mounted on a ball, for example a soccer ball, American football or basketball, in different orientations. Initial measurements of rotational speed of the ball using strain gauges are performed and are used to obtain a subset of possible motion vectors. A motion vector is then selected from the subset using motion tracking techniques based upon tracking images of common objects between frames, and the motion vector is then used in the merging of images obtained from the different cameras.
An alternative approach is described in WO 2008/090345, which is directed to a projectile imaging device that can provide stabilised images of a scene in-flight from a desired perspective despite rotation of the device in-flight. The device includes image sensors comprising an array of pixel sensors. Motion of the device is measured in-flight using gyroscopes or accelerometers. The pixel sensor signals are mapped frame-by-frame to corresponding pixels on a display in dependence on the measured motion and so as to compensate for the measured motion. The techniques described in WO 2008/090345 allow for the production of stabilised images without requiring physical stabilisation of the imaging device to compensate for motion, and without using image tracking techniques or any other techniques that require analysis of the content of the images. Indeed, the techniques described in WO 2008/090345 can provide for image stabilisation and compensation independent of the content of images.
Imaging devices, for example video cameras, generally include an array of image sensors located at the focal plane of associated imaging optics. A focused image is formed on the array, and each image sensor generates an output signal in response to light received by that image sensor during a measurement period, with the magnitude of the signal usually being representative of the intensity of received light in the sensitivity range of the image sensor. The output signals from the image sensors are usually sampled periodically to provide a succession of frames of image data.
In some known devices, a rolling shutter mode of operation is used. In such a mode of operation, the output from each row or column of image sensors in the array is sampled in succession, rather than the whole array of image sensors being sampled simultaneously. Rolling shutter devices can be smaller (for example, having a 6mm square array versus a 35mm square array for a global shutter device having similar performance levels), less costly (for example 10 times less costly for similar performance levels) and have simpler sensor electronics than non-rolling shutter devices; for example, the same read-out circuitry can be used for different rows or columns of image sensors. In some such rolling shutter devices, a single line or column of image sensors can be used (the array is then a one-dimensional array) and different parts of the image can be successively focussed on the row or column, providing for reduced numbers of image sensors. However, it has been found for rolling shutter devices that if there is any movement of the device during or between the sampling of successive rows or columns, for example due to camera shake or jitter, or movement of a vehicle on which the device is installed, then the image represented by the resulting frame of image data can be distorted.
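Purely as an illustration of the timing involved in such a rolling shutter read-out, the following Python sketch models the row-by-row sampling times; the line period, frame start time and names used here are assumptions chosen for the example, not values taken from any particular device:

```python
# Hypothetical model of rolling-shutter row timing: each row of the array is
# sampled one line period after the previous one, so motion occurring within
# the frame capture period affects different rows differently.

FRAME_START = 0.0     # seconds; arbitrary reference time (assumed)
LINE_PERIOD = 25e-6   # seconds per row (assumed value)
NUM_ROWS = 2048       # assumed array height in rows

def row_capture_time(row_index: int) -> float:
    """Return the time at which a given row of image sensors is sampled."""
    return FRAME_START + row_index * LINE_PERIOD

# The last row is sampled roughly 51 ms after the first, so any device
# movement during that interval distorts the assembled frame.
print(row_capture_time(0), row_capture_time(NUM_ROWS - 1))
```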
An imaging device and method that is intended to stabilize video captured using a rolling shutter operation is described in US 2009/0160957. Several rows of pixels are assigned to a strip and, for each strip, motion estimation techniques based on the tracking of image content features between successive frames are used to determine a motion vector for that strip. The motion vector for each strip is used to adjust display positions for pixels of the strip. Gyroscope measurements may additionally be performed for each strip (for example each strip of 16 rows) and may be used to assist in the detection and measuring of camera motion with the aim of improving the accuracy of the motion estimation step.
It is an aim of the present invention to provide an improved or at least alternative imaging apparatus or method.
Summary of the invention
In a first, independent aspect of the invention there is provided an imaging method comprising:- obtaining image data comprising a set of pixel sensor data items representative of an image formed on an array of image sensors of an imaging device, wherein the image data comprises a plurality of subsets of pixel sensor data items, each subset of pixel sensor data items being captured using at least one image sensor of the array at a respective different time during a frame capture period; obtaining motion or position measurements performed using a physical sensor; for each subset of pixel sensor data items, determining from at least one of the plurality of motion or position measurements, and independently of values of pixel sensor data items of the subset, respective different position data; for each subset of pixel sensor data items, mapping each pixel sensor data item of the subset to a respective display position using the determined position data for that subset of pixel sensor data items and obtaining display pixel data items from the subset of pixel sensor data items for the mapped display positions; wherein the mapping of each subset of pixel sensor data items is such as to at least partially compensate for motion of the imaging device or array.
Thus, different motion compensation mappings can be applied to different subsets of pixel sensor data items obtained at different times during a rolling shutter-type frame capture procedure, which in turn can provide for improved motion compensation in the presence of motion during the frame capture period.
Furthermore, as the position data is obtained independently of the values of pixel data items, the mapping can be performed independently of the content of the image and without requiring the use of object-tracking image processing techniques.
Both of those features can be particularly useful in the context of surveillance systems, for example vehicle-mounted or projectile surveillance systems in which large movements may occur during each frame capture period. In such surveillance systems, object tracking image processing techniques to determine motion can be unreliable, particularly if the vehicle mounted system is moving rapidly in an environment without distinct landmarks or objects, and such techniques may also slow down processing of real time images and require additional processing resources. The method may allow rolling shutter type imaging systems to be used in such contexts, thus providing for smaller and cheaper imaging devices to be used whilst maintaining acceptable performance. The use of smaller devices can be particularly valuable for airborne systems for which weight reduction can be an important consideration.
The array may comprise a one-dimensional or two-dimensional array. In the case of a one-dimensional array, different parts of the image may be scanned by the array in turn, for example using imaging optics or array actuators that cause relative movement of the image and the array.
The set of pixel data items may form a frame of image data and the display pixel data items obtained for the subsets of pixel data items may combine to form a frame of display data for representing the image on a display.
The mapping may at least partially compensate for the cumulative motion experienced by the array of image sensors at the time of capture of each subset of pixel data.
The method may comprise repeatedly performing the motion or position measurements using at least one physical sensor. The method may comprise repeatedly performing the motion or position measurements during the frame capture period.
The physical sensor may perform measurements of at least one physical property of the imaging device or at least one component of the imaging device, for example the array of image sensors. The motion or position measurements may be representative of motion of the imaging device or array of image sensors. The motion or position measurements may be representative of motion of the image relative to the array.
Each pixel sensor data item may comprise a measurement by an image sensor of the array.
The position data may be any data representative of, or that can be used to obtain, position. The position data determined for a subset of pixel sensor data items may be representative of the position of the image on the array of image sensors at the time of capture of that subset of pixel sensor data items.
For each subset of pixel sensor data items, the position data may be representative of the position of the image relative to the array. The position data may be representative of an angular or lateral position.
The mapping may be such as to maintain a desired perspective of the image on the display, for example a desired orientation of the image on the display. The desired orientation may be determined with respect to the array of image sensors.
The method may further comprise obtaining a succession of frames of image data, wherein each frame of image data is obtained using a method according to any preceding claim.
Each pixel sensor data item may be obtained from a corresponding at least one image sensor of the array, and the method may comprise, for each frame of image data, capturing pixel sensor data items from a series of subsets of image sensors of the array in succession. The order of the subsets of image sensors from which pixel sensor data items are captured may be different for at least one of the frames than for at least one other of the frames.
By varying the order in which pixel sensor data items are captured, the chances of the same portions of image being missed during each frame capture period may be reduced.
Each subset of pixel data items may comprise a line of pixel sensor data items obtained from a corresponding line of the image sensors of the array, for example a row or column of pixel sensor data items obtained from a corresponding row or column of image sensors of the array.
The method may comprise, for each of the frames of image data, capturing pixel sensor data items from successive lines of image sensors in a scan direction relative to the array. The scan direction may be different for at least one of the frames than for at least one other of the frames. The different scan directions may be in substantially opposite or substantially orthogonal directions. The scan direction relative to the array for each frame may be one of up, down, left and right. The scan direction for each frame may vary from frame to frame in one of a regular or pseudo-random sequence.
The method may comprise selecting a scan direction in dependence on a measured or expected direction of motion of the imaging device or sensor array.
Each subset of pixel sensor data items may comprise a single pixel sensor data item.
Each frame of display data may comprise a plurality of display pixel data items each corresponding to a display position, and the method may comprise retaining a display pixel data item from a preceding frame in a subsequent frame, if the mapping for the subsequent frame does not update the display pixel data item.
Each display frame may comprise a plurality of display pixel data items each corresponding to a display position, and the method may comprise, for at least one of the frames, determining the value of a display pixel data item from image or display data for at least one nearby position, if the mapping for said at least one of the frames does not update the display pixel data item.
The method may comprise interpolating or extrapolating the values of at least one pixel data item or display pixel data item for nearby positions to determine the value of said display pixel data item.
The determining of position data for a subset of pixel sensor data items may comprise determining the position data from motion or position measurements obtained before and/or after the time of capture of the subset of pixel sensor data items.
The determining of position data for a subset of pixel sensor data items may comprise interpolating or extrapolating motion or position measurements obtained before and/or after the time of capture of the subset of pixel sensor data items.
In a further independent aspect of the invention there is provided an imaging apparatus comprising:- an array of image sensors; means for capturing image data from the array of image sensors, the image data comprising a set of pixel sensor data items representative of an image formed on the array, wherein the set of pixel sensor data items comprises a plurality of subsets of pixel sensor data items, each subset of pixel sensor data items being captured using at least one image sensor of the array at a respective different time during a frame capture period; a physical sensor for obtaining motion or position measurements; and a processing resource configured to:- for each subset of pixel sensor data items, determine from at least one of the plurality of motion or position measurements, and independently of values of pixel sensor data items of the subset, respective different position data; and for each subset of pixel sensor data items, map each pixel sensor data item of the subset to a respective display position using the determined position data for that subset of pixel sensor data items and obtain display pixel data items from the subset of pixel sensor data items for the mapped display positions, wherein the mapping of each subset of pixel sensor data items is such as to at least partially compensate that subset of pixel sensor data items for sensed motion of the imaging device.
In another independent aspect of the invention there is provided a computer program product comprising computer readable instructions that are executable to perform a method as claimed or described herein. There may also be provided an apparatus or method substantially as described herein with reference to the accompanying drawings.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. For example, apparatus features may be applied to method features and vice versa.
Detailed description of embodiments

Embodiments of the invention are now described, by way of non-limiting example, and are illustrated in the following figures, in which:-

Figure 1 is a schematic illustration of an imaging device installed in a vehicle;
Figure 2 is a schematic diagram showing the imaging device in more detail;
Figures 3a and 3b are schematic diagrams of portions of a pixel sensor array and a display;
Figure 4 is a flowchart illustrating in overview one mode of operation of the imaging device of Figure 1;
Figure 5 is a schematic diagram of components of the imaging device in a side view;
Figures 6a to 6e are schematic diagrams illustrating distortions in displayed images that can occur due to intra-frame motion when no motion compensation is applied;
Figures 7a to 7e are schematic diagrams illustrating motion compensation that can be obtained by applying a motion compensation mapping using the embodiment of Figure 1;
Figures 8 and 9 are schematic diagrams illustrating the correspondence between an image formed on an image sensor array and a displayed image, for a stationary image sensor array;
Figures 10a to 10d are schematic diagrams of an image sensor array, showing the movement of an image on the array during a frame capture period due to movement of the image sensor array;
Figures 11a and 11b are schematic diagrams of a portion of a display frame showing a corresponding display image obtained for the image sensor array and frame capture period of Figures 10a to 10d, without and with motion compensation mapping respectively;
Figures 12a to 12h are further schematic diagrams of an image sensor array, showing the movement of an image on the array during a frame capture period due to movement of the image sensor array; and
Figures 13a and 13b are further schematic diagrams of a portion of a display frame showing a corresponding display image obtained for the image sensor array and frame capture period of Figures 12a to 12h, without and with motion compensation mapping respectively.
An imaging device 2 according to one embodiment installed in an unmanned aerial vehicle (UAV) is illustrated in Figure 1. In alternative embodiments the imaging device can be installed in any other type of vehicle, for example an aeroplane, helicopter, car, buoy, water-borne vehicle, tank or other armoured vehicle.
Alternatively the imaging device can be a handheld device for operation whilst being held by an operator, for example a handheld visible or infra-red camera, or can be a projectile imaging device for imaging scenes in-flight or after landing, after having been thrown or otherwise projected.
The imaging device 2 is illustrated in more detail in Figure 2, and includes a camera 10 including a lens that focuses light on a sensor array 12. In this embodiment the sensor array is a rectangular two dimensional array of 2048 x 2048 CMOS pixel sensors. In alternative embodiments, larger or smaller sized sensors that use CMOS or other technologies are used. Any suitable focal plane array image sensor that is sensitive to the electromagnetic radiation of interest, for example visible, infra-red or x-ray, may be used.
The sensor array 12 is connected to processing circuitry 14 that in this embodiment comprises a core processor and a memory, for example a Xilinx Virtex-6 field programmable gate array with DDR2 SDRAM memory. The processor is programmed with, and/or is configured to execute, a motion or position determining module for determining motion or position from measurements by the motion sensing system, and a data capture and mapping module for mapping pixel signals received from the array 12.
The processor is configured to control operation of the sensor array and camera, including for example controlling frame rate, focussing, zoom functions and perspective functions. The processor is also configured to process pixel sensor signals received from the sensor array 12 to provide image data for display, and that processing includes performing image stabilisation algorithms to stabilise the images to compensate for motion of the imaging device 2, as discussed in more detail below.
A motion sensing system 16 is also connected to the processor and comprises three gyroscopes each arranged along a different orthogonal axis, in this embodiment an Analog Devices ADIS16360 module containing three orthogonal gyroscopes. Any other suitable arrangement of gyroscopes or other motion sensing devices, for example accelerometers, can be provided in alternative embodiments.
Measurement signals are provided from the motion sensing system 16 to the processor of the processing circuitry 14 and the processor is configured to determine the angular and/or translational motion of the imaging device 2 from the received measurement signals. The processor is able to determine the motion of the imaging device 2 independently of the imaging process, and for example without reference to the content of images obtained from the imaging device. The determined motion is subsequently used by the processor in processing the image data as described in more detail below.
In alternative embodiments, a motion sensing system is not provided as part of the imaging device 2 itself, and instead the processor is connected to, and determines motion from measurement signals received from, motion sensing apparatus (for example gyroscopes, accelerometers, a compass, a GPS device or a radar device) installed in the UAV 4 or other vehicle.
The processing circuitry 14 is also connected to wireless communication circuitry 18 comprising a wireless transceiver and antenna, that can be used to transmit image data from the processor to a remote server or terminal 200 for viewing on a display 20 by an operator. The terminal 200 includes wireless communication circuitry 202 for communicating with the device 2. The terminal also includes an image processing module 204 for processing image data received from the device to display images on the display 20, and a control module 206 for controlling operation of imaging processes of the device 2, via the sending of control signals to the device via the wireless communication circuitry. The control module 206 and the image processing module can be implemented in any suitable hardware, software or combination of software and hardware. The terminal includes user input devices, for example a keyboard and mouse, and the control module 206 is operable to generate a user interface on the display 20 to enable a user to select various control parameters for controlling the image capture or display processes.
In other embodiments, other communication systems may be employed as well as or instead of the wireless communication circuitry of the device 2 or terminal 200, for example wired or fibre-optic communication systems.
In operation, an image is formed on the pixel sensor array, and the processor receives a respective pixel sensor signal from each of the pixel sensors. For each pixel sensor the pixel sensor signal is representative of the light level received by that pixel sensor. In the case of a colour sensor, three pixel sensor signals are usually received for each pixel sensor, representative of red, green and blue light levels. The pixel sensor signal or signals obtained from a pixel sensor at a particular time, either before or after processing, can be considered to be a pixel sensor data item. The pixel sensor signals together form a frame of image data. The pixel sensor signals are refreshed and are read periodically by the processor to obtain a succession of frames of image data, each representative of the image at a respective time.
In the embodiment of Figure 1, pixel sensor signals from different lines of pixel sensors of the array are captured at successive, different times to build up a frame of image data, in a rolling shutter-type image capture procedure.
The processor is configured to process the pixel sensor signals and to output corresponding display data that is readable by a display device to display an image on the display device. Each frame of display data comprises or is representative of a plurality of display pixel signals, and each display pixel signal is representative of the brightness (and colour in the case of a colour image) of the output to be provided by the display device at a corresponding pixel position on the display device.
There is a direct correspondence between the pixel sensors of the sensor array and the pixel signals of the display data, as illustrated schematically in Figures 3a and 3b. Figure 3a is a schematic diagram of a portion 30 of the sensor array 12, which comprises a plurality of pixel sensors labelled a to l, and Figure 3b is a schematic diagram of a corresponding portion 32 of the display showing pixel signals of the display data that are displayed on the portion 32 of the display. It can be seen that in this example there is a one-to-one mapping of each image sensor to a corresponding display pixel, and for each frame the value of a display pixel signal corresponds to the value of a pixel sensor signal obtained from a corresponding pixel sensor.
It can be understood from Figures 3a and 3b that if the mapping between pixel sensors and display pixels is constant and does not change during or between frames, then any movement of an image on the sensor array (for example due to movement of the sensor) would be reflected in a corresponding movement of the display image represented by the display data, and in the image displayed to an operator on the display.
It is a feature of the embodiment of Figure 1 that the mapping between pixel sensors and display pixels is not constant but instead is performed in dependence on the motion of the imaging device 2 measured by the motion sensing system 16. The mapping can vary both from frame to frame, and from line to line within a frame, in order to compensate for measured motion of the device 2. The processing circuitry 14 applies a separate motion compensation procedure to the image data for each line of pixel sensor signals to compensate for the measured motion, as is described in more detail below.
The data capture and mapping process for a single image frame, for one mode of operation of the embodiment of Figure 1, is illustrated in overview in the flow chart of Figure 4.
At the first stage of the process 40 the data capture and mapping module is in an idle state. At a frame capture start time determined by the data capture and mapping module, pixel sensor signals are read from the first row of image sensors of the image sensor array, at stage 42.
In parallel with the capture of each frame of image data, motion measurements are also performed using the motion sensing system 16, as also illustrated in overview in Figure 4. In the mode of operation illustrated in Figure 4, motion measurements are performed periodically at stage 50 (for example every 2 milliseconds, or 20 times per frame). The position or motion of the imaging device 2 as a function of time is calculated by the motion or position determining module from the motion measurements, at stage 52. The processor continually determines whether the data capture and mapping module has started reading pixel sensor signals from a row of the image sensors and, if it has, the latest calculated motion or position is latched, at stage 54, and attributed to the pixel sensor signals obtained from that row of image sensors. The latched value is next updated when it is determined that the data capture and mapping module has begun to read the next row of image sensors. A different motion or position can thus be attributed to each of the successive lines (in this case rows) of pixel sensor signals.
In another mode of operation, the processor interpolates or extrapolates motions or positions determined using the motion sensors to obtain the motion or position of the device 2 at the precise time of capture of the line of pixel sensor signals, rather than using the closest determined motion or position.
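A minimal sketch of how a motion or position value might be attributed to each line of pixel sensor signals, covering both the latch-the-latest-sample behaviour of stage 54 and the interpolation variant just described, is given below; the function and parameter names are illustrative assumptions rather than features of the embodiment:

```python
import bisect

def position_for_row(sample_times, positions, row_time, interpolate=False):
    """Attribute a position value to a row of pixel sensor signals read at
    row_time, given periodic motion measurements taken at sample_times."""
    i = bisect.bisect_right(sample_times, row_time)
    if i == 0:
        return positions[0]              # row read before the first sample
    if not interpolate or i == len(sample_times):
        return positions[i - 1]          # latch the latest available sample
    # Interpolation mode: blend the two samples bracketing the row read time.
    t0, t1 = sample_times[i - 1], sample_times[i]
    p0, p1 = positions[i - 1], positions[i]
    return p0 + (p1 - p0) * (row_time - t0) / (t1 - t0)

times = [0.0, 0.002, 0.004]   # measurement times in seconds (assumed)
angles = [0.0, 0.010, 0.015]  # accumulated angle in radians (assumed)
print(position_for_row(times, angles, 0.003))                    # 0.010 (latched)
print(position_for_row(times, angles, 0.003, interpolate=True))  # 0.0125
```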
At the next stage 44 of the procedure, the processor determines a mapping for each of the pixels of the line of pixel sensor signals, based on the motion or position determined for that line of pixel sensor signals. The mapping is such as to compensate for the measured motion, for example such as to maintain a desired perspective of the resulting image on the display. Any suitable motion compensation mapping can be used, and the mapping used by the embodiment of Figure 1 in one mode of operation is described in more detail below.
At the next stage 46, the pixel sensor signals of the row are mapped to display positions using the mapping determined for the row, and display pixel signals corresponding to the pixel sensor signals are generated for the determined display positions.
It is then determined whether the row is the last row of the array of image sensors, at stage 48. If the row is not the last row then stages 42 to 48 are repeated for the next row of image sensors of the array, to provide further display pixel signals.
Once the mapping and display pixel signal generation process has been performed for all rows, a frame of display data, made up of the display pixel signals, is produced and can be output for storage or display. The process is then repeated for the next frame of image data.
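The per-frame procedure of Figure 4 can be summarised by the sketch below. The callables passed in are hypothetical stand-ins for the sensor array read-out, the motion sensing system and the mapping module, so this is an outline of the control flow rather than a definitive implementation:

```python
def capture_and_map_frame(read_row, num_rows, latest_position, mapping_for):
    """Read each row in turn, latch a position for it, and map its pixels."""
    display_frame = {}
    for row in range(num_rows):                  # stage 42: read a row
        pixels = read_row(row)
        position = latest_position()             # stage 54: latch position
        to_display = mapping_for(position)       # stage 44: build the mapping
        for col, value in enumerate(pixels):     # stage 46: apply the mapping
            dest = to_display(row, col)
            if dest is not None:                 # a pixel may map off-screen
                display_frame[dest] = value
    return display_frame                         # one frame of display data

# Toy usage: no motion and an identity mapping reproduce the sensor image.
frame = capture_and_map_frame(
    read_row=lambda r: [r * 10 + c for c in range(4)],
    num_rows=3,
    latest_position=lambda: (0.0, 0.0),
    mapping_for=lambda pos: (lambda r, c: (r, c)),
)
print(frame)
```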
It will be understood that, as the motion and position of the device can vary during a frame capture period, a different mapping can be applied to each line (in this case each row) of pixel sensor signals making up a single frame. Thus, accurate motion compensation can be obtained using the embodiment of Figure 1, which can take into account short term motion occurring during the frame capture period as well as cumulative motion occurring from frame to frame.
As mentioned above, any suitable motion-compensation mapping process can be used. In one mode of operation of the embodiment of Figure 1, the processor maps each of the pixels of the array of pixel sensors onto an imaginary three dimensional hemisphere or sphere, whose radius from the centre of the lens is chosen by giving each pixel a projection co-ordinate in three dimensional space which is determined relative to the positional centre of the imaging device 2. These projection co-ordinates on the hemisphere or sphere are hereafter referred to as x'n, y'n and z'n.
The mapping of pixel signals to projection co-ordinates is illustrated schematically in Figure 5 which shows, by way of example, light rays directed by lens 84 of camera 10 onto two pixels 80, 82 of the image sensor 12.
The origin of each light ray is in one-to-one correspondence with a pixel of the image sensor 12. Each pixel is also in one-to-one correspondence with a point x'n, y'n, z'n, where n=0,1,2..., on the notional hemisphere or sphere of centre x'0, y'0, z'0 around the lens and image sensor assembly, through which the rays of light from an imaged scene pass before arrival at the lens and image sensor 12, as illustrated in Figure 5. The co-ordinates (x'n, y'n, z'n) are defined with respect to a nominal reference point x'0, y'0, z'0 within the volume of the device, for instance at the centre of the device. The mapping of x'n, y'n, z'n projection co-ordinates to pixels of the image sensor may be determined by experimentation or by calibration or may be determined by calculation based on the physical relationship between, and performance of, the lens and image sensor.
For each line of pixel sensor signals, the processor performs trigonometric calculations for each pixel of the line, to determine an offset to be applied to the x'n, y'n and z'n projection co-ordinates for that pixel as part of the mapping process, so as to compensate for rotational and/or translational motion of the device 2 determined for that line of pixel sensor signals. For example, for each line of pixel sensor signals, a measured change in angle obtained from the motion sensing system for that line of pixel sensor signals is used to determine the trigonometric correction to be applied to the x'n, y'n and z'n co-ordinates that correspond to each pixel of the image sensor, in order to stabilise the image from the sensor in direction and/or attitude. Each pixel sensor signal, following the mapping described in relation to Figure 5, is then mapped to a corresponding display pixel using the offset x'n, y'n and z'n projection co-ordinates. It can be understood that if the resolution of the pixel sensor array 12 and the resolution of the display are different then a plurality of pixel sensor signals can be mapped to a single display pixel (for example the values of the pixel sensor signals can be averaged to provide a value for the display pixel signal) or a single pixel sensor signal can be mapped to a plurality of display pixels.
The offset applied to the x'n, y'n and z'n co-ordinates for pixel sensor signals and the subsequent mapping to display pixels is such as to compensate for a sensed motion of the device, for example a sensed change in orientation of the device. For example, it can be understood that if there is motion of the image formed on the array of image sensors (due to sensed movement of the device 2 rather than a change in the image content) then a pixel sensor signal received from one image sensor of the array would have been received from a different image sensor of the array if the motion had not occurred (for example if the device 2 was still in a reference position or orientation). In one mode of operation, the offset and mapping process effectively attributes a pixel sensor signal received from one image sensor of the array to another image sensor of the array from which the pixel sensor signal would have been received if there had been no motion. The pixel sensor signals can then be mapped to display positions using a standard mapping, such as the one-to-one mapping illustrated in Figures 3a and 3b, and display pixel signals for those display positions obtained from the pixel sensor signals.
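By way of a simplified example of such a correction, the sketch below rotates the (x'n, y'n, z'n) projection co-ordinates of one line of pixels by the angle measured for that line. Restricting the correction to a single axis is an assumption made for brevity; a full implementation would compose rotations about all three gyroscope axes:

```python
import math

def rotate_about_z(xyz, angle_rad):
    """Rotate one projection co-ordinate (x'n, y'n, z'n) about the z axis."""
    x, y, z = xyz
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y, z)

def compensate_line(projection_coords, measured_angle_rad):
    """Offset the projection co-ordinates of one line of pixels by the
    inverse of the rotation measured for that line, stabilising attitude."""
    return [rotate_about_z(p, -measured_angle_rad) for p in projection_coords]

# Example: a 0.01 rad roll measured for one line is undone before the
# co-ordinates are mapped back to display pixel positions.
line_coords = [(1.0, 0.0, 0.0), (0.999, 0.05, 0.0)]
print(compensate_line(line_coords, 0.01))
```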
The effects of motion of the device during a frame capture period are illustrated schematically in Figures 6a to 6e and 7a to 7e.
Figures 6a to 6e show an image sensor array field of view 90 and the corresponding outline of a display frame 92 representative of display data obtained from the sensor array. In this case, the image sensor array field of view 90 is larger than the display field of view, and those pixel sensor signals from the outside area of the sensor array are not mapped to the display. The outline of a displayed image 94 of a square object that appears in the field of view is also shown in Figures 6a to 6e.
The outline of the image 94 is the image as it is represented in the display frame 92 rather than the image as formed on the sensor array. In each case the image of the object formed on the sensor array, which is not shown, is a square that appears at the centre of the sensor array. For each of Figures 6a to 6e the rows of pixel signals are captured on a row-by-row basis, scanning from top to bottom of the sensor array.
Figure 6a illustrates the case where there is no motion of the device during the frame capture period, and no motion compensation is applied during processing of the image data. In this case the displayed image 94 is identical to the image formed on the sensor array.
Figure 6b illustrates the case where the imaging device moves downwards (in a direction from top to bottom of the sensor array) during the frame capture period.
Again, no motion compensation is applied and it can be seen that the displayed image 94 of the object is compressed in a vertical direction. As the image formed on the sensor array is effectively moving up relative to the sensor array during the frame capture period (due to the downward movement of the imaging device), by the time the rows of pixel sensors on which the image was formed at the start of the frame capture period are captured, the image has moved upwards away from those rows, causing the compression of the image on the display.
Figures 6c to 6e illustrate the cases where there is no motion compensation applied, and there is upward motion, left-to-right motion and right-to-left motion respectively during the frame capture period. It can be seen that in these cases there is again distortion of the image formed on the display due to the motion of the imaging device during the frame capture period.
The distortion of the displayed images can be compensated for by the individual line-by-line compensatory mappings applied by the processor of the embodiment of Figure 1, as illustrated in Figures 7a to 7e.
Figures 7a to 7e illustrate the same scenarios as illustrated in Figures 6a to 6e, but for Figures 7a to 7e a motion compensation mapping is applied to the image data, on a line by line basis, in mapping the pixel sensor signals to the display pixel signals. It can be seen that the motion compensation mapping causes the outline of the object 94 to be displayed undistorted on the display.
In the case of Figure 7b, the image of the bottom part of the object, which was formed on particular rows of pixel sensors at the start of the frame capture period, will have moved upwards relative to the array and be formed on other rows of pixel sensors higher up the array by the time that part of the image was captured. However, the processor maps the pixel sensor signals representative of that part of the image to display pixels corresponding to the original position of that part of the image on the display, based on the physical motion or position measurements obtained using the motion sensing system 16. As the motion or position used in the motion compensation represents the motion or position at the time each particular pixel sensor signal was captured, the motion compensation can compensate for motion occurring both during a frame capture period and from frame to frame. Furthermore, as the motion compensation is based upon sensing of physical motion or position and is not dependent on the content of images, it can provide for rapid and robust motion compensation which is not affected by variations in image content or properties or movement of imaged objects.
The effect of motion of the device and the motion compensation procedure is also illustrated with reference to Figures 8, 9, 10a to 10d, 11a and 11b, 12a to 12h, and 13a and 13b.
Figure 8 shows the outline of an image 100 of an object formed on an array of pixel sensors 102. The individual rows of pixel sensors are labelled row 1 to row n. In this case the pixel sensor signals are captured from pixels of each row in turn, reading from left to right along the row, and reading the rows from top to bottom.
Figure 9 shows the outline of the resulting image 104 of the object formed on a portion 106 of a display, following the mapping of the pixel sensor signals to display pixel signals. Individual rows of display pixels are labelled display row m to display row m+5. In this case there is no motion of the imaging device or the object during the frame capture period and so the image 104 displayed on the display is substantially the same as the image displayed on the array of image sensors, regardless of whether or not motion compensation is used.
Figures 10a to 10d show the same components as Figure 8, but in this case there is motion of the image sensor array in an upward direction during the frame capture period. Figures 10a to 10d show the position of the outline of the image 100 during the capture of pixel sensor signals from each of rows 2, 3, 4 and 5 respectively. Diagrams showing the position of the outline of the image during capture of pixel sensor signals from rows 1 and 6 are omitted, but pixel sensor signals are also captured from those rows.
Figure 11a shows the resulting image formed on the display from the pixel sensor signals captured from rows 1 to 6 of the pixel sensor array, without motion compensation mapping being applied. In this case image sensor row 2 is mapped to display row m+1, sensor row 3 is mapped to display row m+2, etc. It can be seen that the image formed on the display is compressed in the vertical direction, as was also illustrated in Figure 6b for a similar scenario.
Figure 11b shows the resulting image formed on the display from the pixel sensor signals captured from rows 1 to 6 of the pixel sensor array, when motion compensation mapping is applied. Again, for each row the pixel sensor signals of the row have been captured at the same time, but pixel sensor signals for different rows are captured at different times.
In variants of the embodiment, each pixel sensor signal is captured individually at a respective different time and, depending on the application, likely speed of movement, and differences between capture times for pixel sensor signals of the same row or column, different motion or position values are attributed to each pixel sensor on an individual basis. Each pixel sensor signal may be mapped individually in such variants, based on a respective motion or position value individual to that pixel sensor signal.
It can be seen in Figure 11b that the outline of the object displayed on the display is substantially the same as the image formed on the array of image sensors, due to the motion compensation mapping. However, it can also be seen that a section of the image of the object is missing from the displayed image, as indicated by cross-hatching in Figure 11b. That can be understood from review of Figures 10a to 10d, from which it can be seen that an image of the object was captured only by three different rows of pixel sensors during the frame capture period. The image of the object on the array has a height equal to four rows of pixel sensors, and as image data representative of the object was captured for only three rows of pixel sensors it is clear that part of the image must be missing from the image data.
The omission of a section of the displayed image in Figure 11b is due to the speed and nature of the movement of the image relative to the array and the scan direction of the pixel signal capture procedure.
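The effect can be quantified with a hypothetical worked example (the array size, object position and image velocity below are invented for illustration): an object four rows high whose image climbs the array during a top-to-bottom scan is seen by fewer than four scan rows, so part of it never appears in the captured frame.

```python
def rows_seeing_object(top, height, rows_moved_per_line, num_rows=64):
    """Return the scan rows that see the object during a top-to-bottom
    rolling-shutter scan. The object initially occupies rows
    [top, top + height - 1]; a negative velocity moves its image up."""
    captured = []
    for scan_row in range(num_rows):
        offset = rows_moved_per_line * scan_row       # image drift so far
        obj_top = top + offset
        obj_bottom = obj_top + height - 1
        if obj_top <= scan_row <= obj_bottom:         # this row sees the object
            captured.append(scan_row)
    return captured

# An object 4 rows high whose image moves up 1 row every 4 line periods is
# seen by only 3 scan rows, so a quarter of it is missing from the frame.
print(rows_seeing_object(top=10, height=4, rows_moved_per_line=-0.25))
```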
Figures 12a to 12h show the same components as Figures 10a to 10d (although more sensor and display rows are now shown) and illustrate the position of the outline of the image 100 during the capture of pixel sensor signals from rows 2 to 9. In this case there is downward motion of the imaging device during the frame capture period.
Figure 13a shows the resulting image formed on the display from the pixel sensor signals captured from rows 1 to 10 of the pixel sensor array for the scenario of Figures 12a to 12h, without motion compensation mapping being applied. It can be seen that the image formed on the display is stretched in the vertical direction, as was also illustrated in Figure 6c for a similar scenario.
Figure 13b shows the resulting image formed on the display from the pixel sensor signals captured from rows 1 to 10 of the pixel sensor array, when motion compensation mapping is applied, for the scenario of Figures 12a to 12h. Once again, the motion compensation causes the image of the object displayed on the display to be substantially the same as the image formed on the array of image sensors, and prevents distortion caused by movement of the imaging device. In this case, there are also no missing sections of the image, in contrast to the situation illustrated in Figure 11b.
Figures 10 to 13 show the variation of image position on the sensor array, and resulting displayed images, for a single frame. It is a feature of certain embodiments that the order in which pixel sensor signals are captured for different rows or columns is varied from frame to frame in a sequence of frames, which can ensure that at least some sections of the image missing in one frame of the sequence are present in other frames of the sequence.
For example, in some such embodiments, the scan direction for capturing image data from rows of pixel sensors is alternated between top-to-bottom and bottom-to-top for successive frames in the sequence. The resulting display images, following motion compensation mapping, may be such that any missing image sections are for different parts of the image for different frames in the sequence. If the frame repetition rate is faster than that which can be distinguished by human perception (for example, greater than or equal to around 30 frames per second) then the use of such alternating scan directions can cause a human viewer to perceive that there are no missing sections of the image when viewing the sequence of frames. Thus, in some circumstances improved, stabilised display images can be obtained in comparison with the case where the scan direction is the same from frame to frame.
In further embodiments, the scan direction alternates between left-to-right and right-to-left for successive frames, in which case motion compensation mapping may be performed on a column-by-column rather than a row-by-row basis. In other embodiments, all four scan directions (left-to-right, right-to-left, top-to-bottom, bottom-to-top) are used for successive frames either in a predetermined, regular sequence or in a random or pseudo-random sequence.
By varying the scan directions between frames, the number and position of missing sections of the display image can be randomised within the field of view.
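A sketch of how the scan direction might be varied from frame to frame, in the alternating, regular or pseudo-random sequences described above, follows; the direction names and the helper itself are illustrative assumptions:

```python
import random

DIRECTIONS = ("top_to_bottom", "bottom_to_top", "left_to_right", "right_to_left")

def scan_direction(frame_index, mode="alternate"):
    """Choose a scan direction for a frame.

    "alternate" flips between top-to-bottom and bottom-to-top each frame;
    "cycle" steps through all four directions in a regular sequence;
    "random" picks pseudo-randomly, randomising any missing image sections.
    """
    if mode == "alternate":
        return DIRECTIONS[frame_index % 2]
    if mode == "cycle":
        return DIRECTIONS[frame_index % 4]
    return random.choice(DIRECTIONS)

print([scan_direction(i, "cycle") for i in range(8)])
```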
In other embodiments, the processor monitors the measured motion of the image sensor array using the motion sensing system and selects the direction of scanning of the rows or columns of sensor pixels for the capture of pixel sensor data in dependence on the measured motion, for example to ensure that the scan direction is aligned with, and in substantially the same direction as, the direction of motion of the image relative to the array, thereby to reduce the chances of sections of the displayed image being missing.
In some embodiments, the scan direction is selected in accordance with the likely direction of motion of the imaging device. For example, if the imaging device is mounted on a UAV the scan direction may be selected to be in the opposite direction to, and along the axis of, the direction of travel of the UAV (and thus in the same direction as the direction of travel of the image relative to the array of image sensors caused by motion of the UAV).
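Selecting a scan direction from a measured or expected motion could be sketched as follows; the sign convention (positive vy meaning the image moves down the array) is an assumption made for the example:

```python
def scan_direction_for_motion(vx, vy):
    """Pick the scan direction most closely aligned with the motion of the
    image relative to the array, so the scan follows the image and the
    chance of missing sections is reduced. vx, vy are in pixels per frame."""
    if abs(vy) >= abs(vx):
        return "top_to_bottom" if vy > 0 else "bottom_to_top"
    return "left_to_right" if vx > 0 else "right_to_left"

# Example: the image drifting up the array (vy < 0) selects a bottom-to-top scan.
print(scan_direction_for_motion(vx=0.2, vy=-1.5))
```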
In alternative embodiments, the processor is configured to fill in missing sections of the display image either by, for each missing display pixel, interpolating or extrapolating other display pixel values and assigning the interpolated or extrapolated values to the missing pixel, or by assigning to the missing pixel a pixel value obtained for that missing pixel in a previous frame. In embodiments for which interpolation or extrapolation is used to assign pixel values to missing pixels in a display frame, the interpolation or extrapolation may be from pixel values for nearby display positions in the same frame, or from pixel values for the missing pixel in preceding and/or following frames.
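A minimal sketch of such a fill-in step is given below, assuming display frames are held as dictionaries keyed by (row, column) position; the preference order used (previous frame first, then interpolation from horizontal neighbours) is an illustrative choice rather than a requirement of the embodiments:

```python
def fill_missing(display_frame, prev_frame, width, height):
    """Fill display positions that the mapping did not update, either from
    the previous frame or by averaging mapped neighbours in this frame."""
    filled = dict(display_frame)
    for r in range(height):
        for c in range(width):
            if (r, c) in filled:
                continue                              # pixel was updated
            if (r, c) in prev_frame:
                filled[(r, c)] = prev_frame[(r, c)]   # reuse previous frame
                continue
            neighbours = [display_frame[p] for p in ((r, c - 1), (r, c + 1))
                          if p in display_frame]
            if neighbours:                            # interpolate nearby values
                filled[(r, c)] = sum(neighbours) / len(neighbours)
    return filled

current = {(0, 0): 10, (0, 2): 30}
previous = {(0, 1): 18}
print(fill_missing(current, previous, width=3, height=1))
# {(0, 0): 10, (0, 2): 30, (0, 1): 18}
```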
It will be understood that further embodiments are not limited to the components and arrangements of such components described in relation to the illustrated embodiments.
For example, in the embodiment of Figure 1, the mappings are performed by the processor included in the imaging device. In alternative embodiments, the pixel sensor signals are transmitted from the imaging device to a further device, for example the user terminal, and the mapping and other processing is performed at the remote device.
The individual motion-compensation mappings can be performed on a line-by-line basis, and the lines can be rows, columns, diagonal lines or curved lines of image sensors. Alternatively, the motion compensation mappings can be performed separately for individual pixel sensor signals from individual image sensors of an array, or for any desired subset of the image sensors of an array (for example with pixel sensor signals from each individual image sensor or subset of image sensors having its own, potentially different mapping).
In the embodiment described above, the pixel signals are represented by Cartesian co-ordinates (x, y, z) and the pixel signals are mapped to offset Cartesian co-ordinates (x', y', z') in accordance with trigonometric calculations to take account of the motion of the device (which may also be referred to as correction of the pixel signals or image). The device is not limited to using the Cartesian co-ordinate system or to correction of the signals using trigonometric calculations. Any suitable co-ordinates may be used, for instance spherical co-ordinates, and any suitable mapping process for pixel signals to take account of the motion of the device may be used.
The operator terminal in the embodiments illustrated herein is a suitably programmed PC, but in alternative embodiments any suitable processing device that is suitably programmed with image processing or display software, or that includes suitable image processing or display circuitry, can be used, for example a PDA, tablet or laptop, or a ROVER 5 terminal (manufactured by L3 Communications).
Embodiments, or features of such embodiments, can be implemented as a computer program product, for example for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example microwave or infrared. The computer program product may, in some embodiments, comprise one or more ASICs (application specific integrated circuits) or one or more field-programmable gate arrays (FPGAs). The series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as a semiconductor, magnetic, optical or other memory device.
It will also be well understood by persons of ordinary skill in the art that, whilst embodiments implement certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example by means of one or more ASICs) or by a mix of hardware and software.
As such, embodiments are not limited only to being implemented in software.
It will be understood that the present invention has been described above purely by way of example, and modifications of detail can be made within the scope of the invention.
Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.

Claims (21)

  1. An imaging method comprising:
obtaining image data comprising a set of pixel sensor data items representative of an image formed on an array of image sensors of an imaging device, wherein the image data comprises a plurality of subsets of pixel sensor data items, each subset of pixel sensor data items being captured using at least one image sensor of the array at a respective different time during a frame capture period;
obtaining a plurality of motion or position measurements performed using a physical sensor;
for each subset of pixel sensor data items, determining from at least one of the plurality of motion or position measurements, and independently of values of pixel sensor data items of the subset, respective different position data;
for each subset of pixel sensor data items, mapping each pixel sensor data item of the subset to a respective display position using the determined position data for that subset of pixel sensor data items and obtaining display pixel data items from the subset of pixel sensor data items for the mapped display positions;
wherein the mapping of each subset of pixel sensor data items is such as to at least partially compensate for motion of the imaging device or array.
  2. A method according to Claim 1, wherein the set of pixel data items form a frame of image data and the display pixel data items obtained for the subsets of pixel data items combine to form a frame of display data for representing the image on a display.
  3. A method according to Claim 1 or 2, comprising obtaining a succession of frames of image data, wherein each frame of image data is obtained using a method according to Claim 1.
  4. A method according to Claim 3, wherein each pixel sensor data item is obtained from a corresponding at least one image sensor of the array, and the method comprises, for each frame of image data, capturing pixel sensor data items from a series of subsets of image sensors of the array in succession, and the order of the subsets of image sensors from which pixel sensor data items are captured is different for at least one of the frames than for at least one other of the frames.
  5. A method according to any preceding claim, wherein each subset of pixel data items comprises a line of pixel sensor data items obtained from a corresponding line of the image sensors of the array, for example a row or column of pixel sensor data items obtained from a corresponding row or column of image sensors of the array.
  6. A method according to Claim 5 as dependent on Claim 3, comprising, for each of the frames of image data, capturing pixel sensor data items from successive lines of image sensors in a scan direction relative to the array.
  7. A method according to Claim 6, wherein the scan direction is different for at least one of the frames than for at least one other of the frames.
  8. A method according to Claim 6 or 7, wherein the scan direction relative to the array for each frame is one of up, down, left and right.
  9. A method according to any of Claims 6 to 8, wherein the scan direction for each frame varies from frame-to-frame in one of a regular or pseudo-random sequence.
  10. A method according to Claim 6, comprising selecting a scan direction in dependence on a measured or expected direction of motion of the imaging device or sensor array.
  11. A method according to any of Claims 1 to 4, wherein each subset of pixel sensor data items comprises a single pixel sensor data item.
  12. A method according to any preceding claim, wherein, for each subset of pixel sensor data items, the position data is representative of the position of the image relative to the array.
  13. A method according to Claim 12, wherein the position data is representative of an angular or lateral position.
  14. A method according to any preceding claim, wherein the mapping is such as to maintain a desired perspective of the image on the display, for example a desired orientation of the image on the display.
  15. A method according to any preceding claim, wherein each frame of display data comprises a plurality of display pixel data items each corresponding to a display position, and the method comprises retaining a display pixel data item from a preceding frame in a subsequent frame, if the mapping for the subsequent frame does not update the display pixel data item.
  16. A method according to any preceding claim, wherein each display frame comprises a plurality of display pixel data items each corresponding to a display position, and the method comprises, for at least one of the frames, determining the value of a display pixel data item from image or display data for at least one nearby position, if the mapping for said at least one of the frames does not update the display pixel data item.
  17. A method according to Claim 16, comprising interpolating or extrapolating the values of at least one pixel data item or display pixel data item for nearby positions to determine the value of said display pixel data item.
  18. A method according to any preceding claim, wherein the determining of position data for a subset of pixel sensor data items comprises determining the position data from motion or position measurements obtained before and/or after the time of capture of the subset of pixel sensor data items.
  19. A method according to any preceding claim, wherein the determining of position data for a subset of pixel sensor data items comprises interpolating or extrapolating motion or position measurements obtained before and/or after the time of capture of the subset of pixel sensor data items.
  20. An imaging apparatus comprising:
an array of image sensors;
means for capturing image data from the array of image sensors, the image data comprising a set of pixel sensor data items representative of an image formed on the array, wherein the set of pixel sensor data items comprises a plurality of subsets of pixel sensor data items, each subset of pixel sensor data items being captured using at least one image sensor of the array at a respective different time during a frame capture period;
a physical sensor for obtaining a plurality of motion or position measurements; and
a processing resource configured to:
for each subset of pixel sensor data items, determine from at least one of the plurality of motion or position measurements, and independently of values of pixel sensor data items of the subset, respective different position data; and
for each subset of pixel sensor data items, map each pixel sensor data item of the subset to a respective display position using the determined position data for that subset of pixel sensor data items and obtain display pixel data items from the subset of pixel sensor data items for the mapped display positions,
wherein the mapping of each subset of pixel sensor data items is such as to at least partially compensate that subset of pixel sensor data items for sensed motion of the imaging device.
  21. A computer program product comprising computer readable instructions that are executable to perform a method according to any of Claims 1 to 19.
GB1014253.7A 2010-08-26 2010-08-26 Imaging device with measurement and processing means compensating for device motion Withdrawn GB2483224A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1014253.7A GB2483224A (en) 2010-08-26 2010-08-26 Imaging device with measurement and processing means compensating for device motion

Publications (2)

Publication Number Publication Date
GB201014253D0 GB201014253D0 (en) 2010-10-13
GB2483224A true GB2483224A (en) 2012-03-07

Family

ID=43013295

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1014253.7A Withdrawn GB2483224A (en) 2010-08-26 2010-08-26 Imaging device with measurement and processing means compensating for device motion

Country Status (1)

Country Link
GB (1) GB2483224A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114846536B (en) * 2020-12-01 2023-12-12 京东方科技集团股份有限公司 Data processing method and device and display device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070036460A1 (en) * 2004-04-10 2007-02-15 Christian-Albrechts-Universitaet Zu Kiel Method for the rotation compensation of spherical images
US20090015674A1 (en) * 2006-04-28 2009-01-15 Kevin Alley Optical imaging system for unmanned aerial vehicle
WO2008090345A1 (en) * 2007-01-24 2008-07-31 Dreampact Limited Imaging apparatus
US20090160957A1 (en) * 2007-12-20 2009-06-25 Micron Technology, Inc. Methods and system for digitally stabilizing video captured from rolling shutter cameras

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8913140B2 (en) 2011-08-15 2014-12-16 Apple Inc. Rolling shutter reduction based on motion sensors
US9148569B2 (en) 2012-11-21 2015-09-29 Bank Of America Corporation Capturing an image on a mobile device
DE102013220477A1 (en) * 2013-10-10 2015-04-16 Application Solutions (Electronics and Vision) Ltd. Method for correcting a pictorial representation of a vehicle environment
DE102013220477B4 (en) * 2013-10-10 2021-07-01 Application Solutions (Electronics and Vision) Ltd. Method and device for correcting a pictorial representation of a vehicle environment
GB2530659B (en) * 2014-09-09 2019-02-13 Boeing Co Coordinating image sensing with motion
GB2530659A (en) * 2014-09-09 2016-03-30 Boeing Co Coordinating image sensing with motion
US9781378B2 (en) 2014-09-09 2017-10-03 The Boeing Company Coordinating image sensing with motion
GB2548970B (en) * 2016-02-25 2022-01-19 Bosch Gmbh Robert Method and device for generating an image of an area surrounding a vehicle
US10616555B2 (en) 2016-02-25 2020-04-07 Robert Bosch Gmbh Method and device for ascertaining an image of the surroundings of a vehicle
GB2548970A (en) * 2016-02-25 2017-10-04 Bosch Gmbh Robert Method and device for generating an image of an area surrounding a vehicle
WO2018053846A1 (en) * 2016-09-26 2018-03-29 深圳市大疆创新科技有限公司 Focusing method, camera device, and unmanned aerial vehicle
US11181809B2 (en) 2016-09-26 2021-11-23 SZ DJI Technology Co., Ltd. Focusing method, imaging device, and unmanned aerial vehicle
US10698295B2 (en) 2016-09-26 2020-06-30 SZ DJI Technology Co., Ltd. Focusing method, imaging device, and unmanned aerial vehicle
US11057566B2 (en) 2016-10-20 2021-07-06 Spookfish Innovations Pty Ltd Image synthesis system
WO2018071983A1 (en) * 2016-10-20 2018-04-26 Spookfish Innovations Pty Ltd An image synthesis system
AU2017344761B2 (en) * 2016-10-20 2022-09-15 Spookfish Innovations Pty Ltd An image synthesis system
US11689808B2 (en) 2016-10-20 2023-06-27 Spookfish Innovations Pty Ltd Image synthesis system
WO2019167672A1 (en) * 2018-03-01 2019-09-06 Sony Corporation On-chip compensation of rolling shutter effect in imaging sensor for vehicles
CN111114780A (en) * 2019-12-20 2020-05-08 山东大学 Unmanned aerial vehicle steel bar detection standard part placing and recycling system and method

Also Published As

Publication number Publication date
GB201014253D0 (en) 2010-10-13

Similar Documents

Publication Publication Date Title
GB2483224A (en) Imaging device with measurement and processing means compensating for device motion
US10771699B2 (en) Systems and methods for rolling shutter correction
US10659690B2 (en) Systems and methods for mobile platform imaging
US8964047B2 (en) Self-correcting adaptive long-stare electro-optical system
US20090015674A1 (en) Optical imaging system for unmanned aerial vehicle
US20180160110A1 (en) Method for calibrating a camera and calibration system
US20130322697A1 (en) Speed Calculation of a Moving Object based on Image Data
US11689808B2 (en) Image synthesis system
EP2430616A2 (en) Image generation method
US9794483B1 (en) Video geolocation
JP4446041B2 (en) Camera vector computing device, shake component detecting device, image stabilizing device, position and orientation stabilizing device, target object lock-on device, and live-action object attribute calling device provided in this camera vector computing device
JP6526955B2 (en) Sensor information integration method and device thereof
KR102016713B1 (en) Method for fixed pattern noise reduction and use of such method
GB2481027A (en) Image stabilising apparatus and method
CN112485785A (en) Target detection method, device and equipment
US11019245B2 (en) Bundle adjustment system
US20220174217A1 (en) Image processing method and device, electronic device, and computer-readable storage medium
US11415990B2 (en) Optical object tracking on focal plane with dynamic focal length
JP7196920B2 (en) Driving support device, driving support method, and program
WO2005048605A1 (en) Synthetic electronic imaging system
US8085299B2 (en) Digital line scan camera
US20140078294A1 (en) System and method for real time registration of images
US20230206495A1 (en) Adaptive alignment system
AU2019275236B2 (en) System and method for sensor pointing control
EP3691250A1 (en) Imaging device, control method, and recording medium

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)