GB2516918A - Method and apparatus - Google Patents
Method and apparatus
- Publication number
- GB2516918A GB2516918A GB1314077.7A GB201314077A GB2516918A GB 2516918 A GB2516918 A GB 2516918A GB 201314077 A GB201314077 A GB 201314077A GB 2516918 A GB2516918 A GB 2516918A
- Authority
- GB
- United Kingdom
- Prior art keywords
- frame
- pixel
- frames
- fraction
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/48—Increasing resolution by shifting the sensor relative to the scene
Abstract
An apparatus comprises an image sensor 46 having a plurality of pixels, a motion detector gyroscope 30, and a motion compensator in the form of an optical image stabiliser (OIS) 34. The motion compensator controls a position of said image sensor during image capture. A first frame captured by the image sensor is displaced by a fraction of an amount of a pixel in the horizontal, vertical, diagonal direction(s) with respect to at least one other frame captured by said image sensor. The frames are then combined resulting in an image with increased resolution when compared to an image captured by the sensor where the sensor is not moved between frames.
Description
METHOD AND APPARATUS
Some embodiments relate to a method and apparatus and in particular but not exclusively to a method and apparatus for use in an optical apparatus.
Optical image stabilisation is typically provided in cameras and in other optical apparatus, such as binoculars. The cameras may be standalone cameras or cameras included in other devices, for example mobile communication devices such as mobile phones, smart phones or the like. Optical image stabilisation (OIS) may be performed by providing a movable optical element within an optical system which is moved to compensate for movement, such as from hand tremor.
According to an aspect, there is provided an apparatus comprising: an image sensor having a plurality of pixels; a motion detector; and a motion compensator configured in response to motion detected by said motion detector to control a position of said image sensor such that a first frame captured by said image sensor is displaced by a fraction of an amount of a pixel with respect to at least one other frame captured by said image sensor.
The fraction may comprise said fraction of a pixel in at least one of: an x direction; a y direction; and a diagonal direction representing a shift by said fraction of a pixel in said x direction and said y direction.
The fraction may comprise half a pixel in at least one of said x direction, said y direction; and both said x direction and said y direction.
The apparatus may comprise an image processor configured to combine said first and at least one other frame.
The image processor may be configured to combine said first frame and one other frame.
The one other frame may be displaced in a diagonal direction with respect to said first frame.
The image processor may be configured to combine the first frame and at least one other frame, and to interpolate further pixels into said combined frame.
The image processor may be configured to combine said first frame with three other frames, said three other frames being respectively displaced with respect to said first frame by half a pixel in a x direction, half a pixel in a y direction and a diagonal direction representing a displacement of half a pixel in the x direction and in the y direction.
The motion detector may be a gyroscope configured to provide information about motion of said apparatus.
The motion compensator may comprise an optical image stabiliser. The image sensor may comprise a lens and an actuator is provided to control a position of said lens, said actuator being controlled by said motion compensator.
A camera may comprise an apparatus as described previously.
According to another aspect, there is provided a method comprising: detecting motion; and controlling a position of an image sensor in response to said detected motion such that a first frame captured by said image sensor is displaced by a fraction of an amount of a pixel with respect to at least one other frame captured by said image sensor.
The fraction may comprise said fraction of a pixel in at least one of: an x direction; a y direction; and a diagonal direction representing a shift by said fraction of a pixel in said x direction and said y direction.
The fraction may comprise half a pixel in at least one of said x direction, said y direction; and both said x direction and said y direction.
The method may comprise combining said first and at least one other frame.
The method may comprise combining said first frame and one other frame.
One other frame may be displaced in a diagonal direction with respect to said first frame.
The method may comprise combining the first frame and at least one other frame, and interpolating further pixels into said combined frame.
The method may comprise combining said first frame with three other frames, said three other frames being respectively displaced with respect to said first frame by half a pixel in a x direction, half a pixel in a y direction and a diagonal direction representing a displacement of half a pixel in the x direction and in the y direction.
According to another aspect, there is provided an apparatus comprising: image sensing means having a plurality of pixels; motion detecting means; and motion compensation means for, in response to motion detected by said motion detecting means, controlling a position of said image sensing means such that a first frame captured by said image sensing means is displaced by a fraction of an amount of a pixel with respect to at least one other frame captured by said image sensing means.
The fraction may comprise said fraction of a pixel in at least one of: an x direction; a y direction; and a diagonal direction representing a shift by said fraction of a pixel in said x direction and said y direction.
The fraction may comprise half a pixel in at least one of said x direction, said y direction; and both said x direction and said y direction.
The apparatus may comprise image processing means for combining said first and at least one other frame.
The image processing means may be for combining said first frame and one other frame.
The one other frame may be displaced in a diagonal direction with respect to said first frame.
The image processing means may be for combining the first frame and at least one other frame, and for interpolating further pixels into said combined frame.
The image processing means may be for combining said first frame with three other frames, said three other frames being respectively displaced with respect to said first frame by half a pixel in a x direction, half a pixel in a y direction and a diagonal direction representing a displacement of half a pixel in the x direction and in the y direction.
The motion detecting means may be gyroscope means for providing information about motion of said apparatus.
The motion compensation means may comprise optical image stabilisation means.
A camera may comprise an apparatus as described previously.
In the above, many different embodiments have been described. It should be appreciated that further embodiments may be provided by the combination of any two or more of the embodiments described above.
A computer program comprising program code means adapted to perform the method may also be provided.
Various other aspects and further embodiments are also described in the following detailed description and in the attached claims.
Embodiments will now be described by way of example only with reference to the accompanying drawings in which: Figure 1a shows an example of a sensor; Figure 1b shows the sensor of Figure 1a which is shifted by half a pixel between frames; Figure 2 shows schematically graphs of x and y position displacements for first and second frames; Figure 3 shows an optical apparatus of an embodiment; Figure 4 shows a method of an embodiment; Figure 5 schematically shows the capture of four frames and combination thereof; and Figure 6 schematically shows the combination of two frames.
Reference is made to Figure 1a which schematically shows a 16 pixel sensor in a 4 x 4 array. It should be appreciated that the number of pixels is purely by way of example only and in embodiments, any suitable number of pixels may be used.
Reference is made to Figure 1b which schematically shows how a higher resolution image may be achieved using the pixel sensor array of Figure 1a. In particular, the pixel sensor array is moved, in this example, by half a pixel in both the x and y directions between the capture of two frames. The frames may be successive frames. Thus, the same scene can be captured twice and combined to increase the spatial resolution. In some embodiments, the scene is thus super sampled in space using the two captured frames.
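As an illustration of this interleaving, the sketch below (Python/NumPy; the function name and the stand-in frame values are hypothetical) places the first frame on the even positions of a doubled grid and the diagonally shifted frame on the odd positions, leaving the two remaining quincunx positions unsampled:

```python
import numpy as np

def combine_diagonal(frame_a, frame_b):
    """Interleave two frames of the same scene, where frame_b was
    captured with the field of view shifted by half a pixel in both
    x and y. The result is a quincunx pattern on a 2x grid; the two
    unsampled positions are left as NaN for later interpolation."""
    h, w = frame_a.shape
    out = np.full((2 * h, 2 * w), np.nan)
    out[0::2, 0::2] = frame_a   # original sample positions
    out[1::2, 1::2] = frame_b   # diagonally shifted positions
    return out

a = np.arange(16, dtype=float).reshape(4, 4)  # the 4 x 4 sensor of Figure 1a
b = a + 0.5                                   # stand-in for the shifted capture
hi = combine_diagonal(a, b)
print(hi.shape)  # prints (8, 8)
```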
In capturing images, hand shake motion is often taken into account.
Handshake motion is variable from user to user and over time. The amount of handshake motion may be dependent on the shutter speed or the like. In embodiments, the hand shake motion compensation function is modified, taking into account handshake motion, to ensure that the field of view moves by 0.5 pixels (in the x and y direction) between the captures of successive images.
Reference is made to Figure 2 which shows this principle in more detail. A first frame, Frame 1, is read out. A second frame, Frame 2, is shifted by half a pixel both in the x direction and y direction compared to Frame 1. The first graph shows when the first and second frames are read out. The next graph shows an x position of the pixel sensor array over time. Line 60 represents the field of view displacement with handshake in the x direction. In other words this is the movement of the field of view in the x direction if there is no OIS.
The next graph shows a y position of the pixel sensor array over time. Line 62 represents the field of view displacement with handshake in the y direction. In other words this is the movement of the field of view in the y direction if there is no OIS.
As can be seen, the field of view in the x and y direction varies during the read out of each frame. OIS is configured to compensate for this handshake. Thus line 64 in the second graph represents the field of view displacement with optical image stabilization in the x direction. As can be seen from the second graph, whilst the first frame is being captured, the field of view displacement is compensated by the optical image stabiliser to provide a constant x position. When the next frame is being captured, the optical image stabiliser will provide compensation such that the pixel is shifted by half a pixel in the x direction with respect to the image captured in the first frame.
Similarly line 66 in the third graph represents the field of view displacement with optical image stabilization in the y direction. Again, whilst the first frame is being captured, the field of view displacement is compensated by the optical image stabiliser to provide a constant y position. When the next frame is being captured, the optical image stabiliser will provide compensation such that the pixel is shifted by half a pixel in the y direction with respect to the image captured in the first frame.
Thus, whilst the first frame is being captured, the field of view displacement is compensated by the OIS to provide a steady field of view position. When the second frame is being captured, the OIS will control the field of view displacement so that it is a constant half pixel step in the x and y directions with respect to the previous frame.
Thus, some embodiments may use the optical image stabilizer mechanism to cancel hand motion and impart a deliberate half pixel shift between two successive captures.
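A minimal sketch of that set-point calculation, in pixel units; the helper name is hypothetical and the handshake displacement is assumed to be already integrated from the gyroscope signal (the patent does not specify this arithmetic):

```python
def ois_target(handshake_x, handshake_y, frame_index):
    """Hypothetical OIS set-point in pixel units: cancel the measured
    field-of-view displacement due to handshake, then add a deliberate
    half-pixel diagonal offset for every frame after the first."""
    offset = 0.5 if frame_index > 0 else 0.0
    return (-handshake_x + offset, -handshake_y + offset)

# Frame 1: pure stabilisation; frame 2: stabilisation plus the half-pixel step.
target1 = ois_target(0.2, -0.1, frame_index=0)
target2 = ois_target(0.2, -0.1, frame_index=1)
```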
In some embodiments, this may provide super sampling of the scene by a factor of four. For example, a 4 megapixel sensor may perform as though it were a 16 megapixel sensor.
Reference is made to Figure 3 which shows schematically an apparatus and Figure 4 which schematically shows a method of an embodiment.
The apparatus comprises a gyroscope 30 which is arranged to detect hand motion. The gyroscope 30 provides an output to an optical image stabilisation (OIS) processor 34. An image signal processor (ISP) 36 is provided which provides a control output to the gyroscope 30 and the OIS processor 34. The apparatus also comprises an actuator 38 which is arranged to control the position of a lens 32. The actuator 38 is controlled by the OIS processor 34. A sensor array 46 which captures the image is configured to provide an output to the image signal processor 36.
Referring to Figure 4, in step S1, the image signal processor 36 indicates a start of frame 1 to an optical image stabilization (OIS) processor 34.
In step S2, camera motion is detected by a gyroscope 30. This motion may be due to hand motion of the user holding the camera.
In step S3, a gyroscope signal from the gyroscope representing the motion is passed to the optical image stabilization (OIS) processor 34. The OIS processor 34 uses the gyroscope signal to calculate a lens shift. The OIS processor provides a control output to an actuator 38 which will move the lens 32 by the calculated lens shift. The OIS processor, for the first frame, will simply provide compensation for handshake or the like motion.
In step S4, the first frame, frame 1, captured by the image sensor (i.e. pixel array) 46 is read from the sensor 46 to the image signal processor 36. The image signal processor will store this first frame in memory which may be part of the processor or separate from the processor.
In step S5, the image signal processor 36 indicates a start of frame 2 to the optical image stabilization processor 34. Steps S2, S3 and S4 are then repeated to obtain the second frame. In step S3 for the second frame, the OIS processor causes the actuator to shift the lens by half a pixel with respect to the first frame whilst still providing compensation for handshake or the like motion.
In step S6, the first and second frames are combined to provide an output frame with a higher resolution than the first and second frames. The combining of the frames is carried out by the image signal processor.
In some embodiments, four frames may be obtained. The first frame may be considered to be a reference or base frame. The remaining three frames may be respectively shifted by half a pixel in the horizontal direction, half a pixel in the vertical direction and half a pixel in the diagonal direction with respect to the base frame. In this regard, reference is made to Figure 5.
Figure 5 schematically shows frame 1 with the captured pixels represented by a dot. The first frame can be considered to be a reference or base frame.
Frame 2 is shown with the pixels of the first frame as a reference. The pixels of the second frame are shifted by half a pixel in the horizontal direction with respect to the first frame. These pixels of frame 2 are represented by "h" in Figure 5.
Frame 3 is shown with the pixels of the first frame as a reference. The pixels of the third frame are shifted by half a pixel in the vertical direction with respect to the first frame. These pixels of frame 3 are represented by "v" in Figure 5.
Frame 4 is shown with the pixels of the first frame as a reference. The pixels of the fourth frame are shifted by a shift represented by half a pixel in the x direction and half a pixel in the y direction to give a diagonal shift with respect to the first frame. These pixels of frame 4 are represented by "d" in Figure 5.
If frames 1 and 2 are combined, this will increase the horizontal resolution by a factor of two. If frames 1 and 3 are combined, this will increase the vertical resolution by a factor of two. If frames 1 and 4 are combined, this will increase the diagonal resolution by a factor of two. It should be appreciated that any two or more of these frames may be combined.
In some embodiments, frames one, two, three and four are combined, as also shown in Figure 5.
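The four-frame combination amounts to interleaving the captures on a grid with twice the sampling in each direction. A sketch under the assumption of perfectly aligned half-pixel shifts (NumPy; the function name is illustrative):

```python
import numpy as np

def combine_four(base, horiz, vert, diag):
    """Interleave four half-pixel-shifted captures into one frame with
    twice the resolution in each direction: 'horiz', 'vert' and 'diag'
    are shifted by half a pixel in x, in y, and in both, respectively."""
    h, w = base.shape
    out = np.empty((2 * h, 2 * w), dtype=base.dtype)
    out[0::2, 0::2] = base    # "." reference samples of frame 1
    out[0::2, 1::2] = horiz   # "h" samples of frame 2
    out[1::2, 0::2] = vert    # "v" samples of frame 3
    out[1::2, 1::2] = diag    # "d" samples of frame 4
    return out
```

Every position of the doubled grid is then filled directly by a captured sample, which is the factor-of-four super sampling described earlier (for example, a 4 megapixel sensor producing a 16 megapixel output).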
Reference is made to Figure 6 which schematically shows a combination of frames 1 and 4. The simple combination of frames will give an increase of resolution of a factor of two. In some embodiments, interpolation may be used to achieve for example an interpolated resolution increase of a factor of 4. The interpolated pixels are represented by a "+" and are provided between each pair of adjacent pixels of the combined frame obtained by the combination of frames 1 and 4.
The interpolated pixel may be determined using, for example, simple linear interpolation where the interpolated pixel is (P1+P2+P3+P4)/4, where P1, P2, P3 and P4 represent the pixels surrounding the interpolated pixel.
However, it should be appreciated that more sophisticated directional filters may be used.
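A sketch of the simple averaging variant for the frame 1 plus frame 4 case; the NaN bookkeeping and the fallback to fewer neighbours at the borders are illustrative assumptions rather than the patent's stated method:

```python
import numpy as np

def interpolate_missing(grid):
    """Fill the empty (NaN) positions of a quincunx grid by averaging
    the available horizontal and vertical neighbours, i.e. the simple
    (P1 + P2 + P3 + P4) / 4 rule, with fewer terms at the borders."""
    h, w = grid.shape
    out = grid.copy()
    for y in range(h):
        for x in range(w):
            if np.isnan(grid[y, x]):
                neigh = [grid[ny, nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w and not np.isnan(grid[ny, nx])]
                out[y, x] = sum(neigh) / len(neigh)
    return out
```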
It should be appreciated that in some embodiments, two frames may be combined. These two frames can be any two of the four frames of Figure 5. In some embodiments three or four frames may be combined.
In the above described embodiments, reference is made to an x and y compensation system. It should be appreciated that this is by way of example only and any other suitable coordinate system may alternatively or additionally be used.
In the above described embodiments, there is a half pixel shift between frames. In other embodiments, the shift may be by a different amount. In some embodiments, the amount of the shift may be a fractional amount, less than 1. In other embodiments, the amount of shift may be by more than one pixel but may be a non-integer value. With different sizes of pixel shifts, different numbers of frames may be combined in an embodiment.
For example, the shift may be a third of a pixel. Two or more frames may be combined. Advantageously three or more frames may be combined when the shift is a third of a pixel.
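One way to reason about the frame count, under the assumption that the aim is to cover every sub-position of the finer grid in both directions (a hypothetical helper; the text above only requires that two or more frames be combined):

```python
def frames_for_full_coverage(shift_fraction):
    """With a shift of 1/n pixel there are n sub-positions per axis,
    so n * n frames cover the finer grid completely: 4 frames for a
    half-pixel shift, 9 for a third of a pixel."""
    n = round(1 / shift_fraction)
    return n * n
```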
Two or more frames may be combined.
Interpolation may be used in some embodiments.
The described embodiments have used a regular array of pixels. In some embodiments, a different arrangement of pixels may be used. For example one row of pixels may be offset with respect to another.
Some embodiments may be provided by one or more integrated circuits.
The processors may be provided by the same or different processors.
One or more memories may be provided.
Some embodiments may be controlled by a computer program, stored for example in memory, and comprising computer executable instructions which when run by, for example, one or more processors, cause at least part of the above described method to be performed.
Various embodiments with different variations have been described here above. It should be noted that those skilled in the art may combine various elements of these various embodiments and variations.
Such alterations, modifications, and improvements are intended to be part of this disclosure, and are intended to be within the scope of the present invention.
Accordingly, the foregoing description is by way of example only and is not intended to be limiting. The present invention is limited only as defined in the following claims and the equivalents thereto.
Claims (20)
- CLAIMS: 1. An apparatus comprising: an image sensor having a plurality of pixels; a motion detector; and a motion compensator configured in response to motion detected by said motion detector to control a position of said image sensor such that a first frame captured by said image sensor is displaced by a fraction of an amount of a pixel with respect to at least one other frame captured by said image sensor.
- 2. An apparatus as claimed in claim 1, wherein said fraction comprises said fraction of a pixel in at least one of: an x direction; a y direction; and a diagonal direction representing a shift by said fraction of a pixel in said x direction and said y direction.
- 3. An apparatus as claimed in claim 2, wherein said fraction comprises half a pixel in at least one of said x direction, said y direction; and both said x direction and said y direction.
- 4. An apparatus as claimed in any preceding claim, comprising an image processor configured to combine said first and at least one other frame.
- 5. An apparatus as claimed in claim 4, wherein said image processor is configured to combine said first frame and one other frame.
- 6. An apparatus as claimed in claim 5, wherein said one other frame is displaced in a diagonal direction with respect to said first frame.
- 7. An apparatus as claimed in claim 4, 5 or 6, wherein said image processor is configured to combine the first frame and at least one other frame, and to interpolate further pixels into said combined frame.
- 8. An apparatus as claimed in claim 4 or 7, wherein said image processor is configured to combine said first frame with three other frames, said three other frames being respectively displaced with respect to said first frame by half a pixel in a x direction, half a pixel in a y direction and a diagonal direction representing a displacement of half a pixel in the x direction and in the y direction.
- 9. An apparatus as claimed in any preceding claim, wherein said motion detector is a gyroscope configured to provide information about motion of said apparatus.
- 10. An apparatus as claimed in any preceding claim, wherein said motion compensator comprises an optical image stabiliser.
- 11. An apparatus as claimed in any preceding claim, wherein said image sensor comprises a lens and an actuator is provided to control a position of said lens, said actuator being controlled by said motion compensator.
- 12. A camera comprising an apparatus as claimed in any preceding claim.
- 13. A method comprising: detecting motion; and controlling a position of an image sensor in response to said detected motion such that a first frame captured by said image sensor is displaced by a fraction of an amount of a pixel with respect to at least one other frame captured by said image sensor.
- 14. A method as claimed in claim 13, wherein said fraction comprises said fraction of a pixel in at least one of: an x direction; a y direction; and a diagonal direction representing a shift by said fraction of a pixel in said x direction and said y direction.
- 15. A method as claimed in claim 14, wherein said fraction comprises half a pixel in at least one of said x direction, said y direction; and both said x direction and said y direction.
- 16. A method as claimed in any of claims 13 to 15, comprising combining said first and at least one other frame.
- 17. A method as claimed in claim 16, comprising combining said first frame and one other frame.
- 18. A method as claimed in claim 17, wherein said one other frame is displaced in a diagonal direction with respect to said first frame.
- 19. A method as claimed in any of claims 16, 17 or 18, comprising combining the first frame and at least one other frame, and interpolating further pixels into said combined frame.
- 20. A method as claimed in claim 16 or 19, comprising combining said first frame with three other frames, said three other frames being respectively displaced with respect to said first frame by half a pixel in a x direction, half a pixel in a y direction and a diagonal direction representing a displacement of half a pixel in the x direction and in the y direction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1314077.7A GB2516918A (en) | 2013-08-06 | 2013-08-06 | Method and apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201314077D0 GB201314077D0 (en) | 2013-09-18 |
GB2516918A true GB2516918A (en) | 2015-02-11 |
Family
ID=49224249
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1314077.7A Withdrawn GB2516918A (en) | 2013-08-06 | 2013-08-06 | Method and apparatus |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2516918A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3261329A4 (en) * | 2015-02-19 | 2018-08-15 | Olympus Corporation | Image capturing device |
WO2022218216A1 (en) * | 2021-04-14 | 2022-10-20 | 华为技术有限公司 | Image processing method and terminal device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0669757A2 (en) * | 1994-02-28 | 1995-08-30 | Canon Kabushiki Kaisha | Image sensing apparatus |
US5754226A (en) * | 1994-12-20 | 1998-05-19 | Sharp Kabushiki Kaisha | Imaging apparatus for obtaining a high resolution image |
US20050280714A1 (en) * | 2004-06-17 | 2005-12-22 | Freeman Philip L | Image shifting apparatus for enhanced image resolution |
US20120188387A1 (en) * | 2011-01-26 | 2012-07-26 | Kabushiki Kaisha Toshiba | Camera module, electronic apparatus, and photographing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |