EP1949671A1 - Imaging system with adjustable optics - Google Patents
Imaging system with adjustable optics
- Publication number
- EP1949671A1 (application EP05808852A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- exposure time
- measurement
- exposure
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
        - H04N23/60—Control of cameras or camera modules
          - H04N23/67—Focus control based on electronic image sensor signals
        - H04N23/70—Circuitry for compensating brightness variation in the scene
          - H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
      - H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
        - H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
          - H04N1/00281—Connection or combination of a still picture apparatus with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
            - H04N1/00307—Connection or combination of a still picture apparatus with a mobile telephone apparatus
      - H04N2101/00—Still video cameras
      - H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
        - H04N2201/0077—Types of the still picture apparatus
          - H04N2201/0084—Digital still camera
Definitions
- This invention relates generally to the field of imaging, and particularly to imaging with an imaging system having adjustable optics.
- Digital imaging systems such as digital cameras have taken a remarkable role in imaging technology.
- Digital cameras are characterized by one or more built-in processors and they record images in digital form.
- a digital camera or a digital camera module
- the camera module can be readily integrated into another electronic device, of which a mobile telecommunication device (mobile terminal) is nowadays a common example.
- the camera module can communicate with several other components and systems of said device.
- the camera module is typically operatively communicating with one or more processors, and in the case of a digital camera, the device can comprise some other type of dedicated signal processing components.
- Adjustable optics in the context of a digital imaging system refers to the possibility of using electronically controlled image focusing, such as auto-focusing and optical zoom functions, to adjust the properties of the image to be captured. These operations are becoming more and more important in imaging devices. Auto-focusing and zooming may be accomplished with traditional lens optics with moving lens components, or today they can also be accomplished using optical systems based on lenses with adjustable shape or other adjustable means to affect their refractive power.
- the imaging system comprises a lens system that focuses light in order to create an image of a scene.
- the light is focused onto a semiconductor device that records light electrically.
- This semiconductor device can typically be e.g. a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor.
- the sensor is mainly composed of a collection of light-sensitive pixels which convert light into electrical charge, which is further converted into digital image data.
- binning can be used: it combines the charge of adjacent pixels in order to increase the effective sensitivity of the imaging system and to reduce the number of pixels in the image.
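- As an illustrative sketch (not from the patent; the sensor resolution and the summing of charge in the digital domain are assumptions), 2x2 binning can be expressed as follows:

```python
import numpy as np

def bin2x2(raw: np.ndarray) -> np.ndarray:
    """Combine the charge of each 2x2 pixel block into one output pixel:
    sensitivity roughly quadruples and the pixel count drops by four."""
    h, w = raw.shape
    raw = raw[:h - h % 2, :w - w % 2]   # trim to even dimensions
    return (raw[0::2, 0::2] + raw[0::2, 1::2]
            + raw[1::2, 0::2] + raw[1::2, 1::2])

# Example: a 1200 x 1600 frame binned down to 600 x 800.
frame = np.random.randint(0, 1024, size=(1200, 1600), dtype=np.int64)
print(bin2x2(frame).shape)   # (600, 800)
```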
- the imaging system also comprises shutter means.
- the main types of the shutters are a global shutter and a rolling shutter.
- the shutter means is used to restrict the exposure of the image sensor.
- the shutter operation consists at least of reset, exposure and read operations, but operations such as open and close can also take place.
- the shutter means, both global and rolling, can be implemented electronically or mechanically; in a mechanical implementation a variable aperture or Neutral Density (ND) filters may also be used.
- the imaging system also comprises a focus detector that measures current focus values, typically from one or multiple regions of the image; the results are used by a control function that is also included in the imaging system.
- the measurement of focus is typically based on the contrast between adjacent areas of the image and therefore the control function tries to find the best focus for the image by maximizing the contrast in the image.
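- The patent does not give a formula for this contrast measure; a common choice of this kind is the sum of squared differences between adjacent pixels, sketched below under that assumption:

```python
import numpy as np

def focus_measure(region: np.ndarray) -> float:
    """Contrast-based focus value: sum of squared differences between
    horizontally and vertically adjacent pixels. A sharper (better
    focused) region yields a larger value."""
    r = region.astype(np.float64)
    dx = np.diff(r, axis=1)   # horizontal neighbour differences
    dy = np.diff(r, axis=0)   # vertical neighbour differences
    return float((dx ** 2).sum() + (dy ** 2).sum())
```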
- the imaging system also comprises an exposure detector that measures the current level of light exposure in the image pixels; its result is likewise used by the control function.
- the control function utilizes the current level of the exposure and compares it to the target exposure level. Based on this comparison, the exposure time, analogue gain, digital gain, aperture and ND filters are controlled.
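- As a sketch of the control idea only (the parameter names, units and the order in which exposure and gain are adjusted are assumptions, not the patent's algorithm):

```python
def control_exposure(measured_level: float, target_level: float,
                     exposure_us: float, gain: float,
                     max_exposure_us: float, max_gain: float):
    """Scale the exposure toward the target level; once the exposure time
    saturates, make up the remainder with (analogue or digital) gain."""
    if measured_level <= 0.0:
        return max_exposure_us, max_gain   # no signal: expose as much as allowed
    ratio = target_level / measured_level
    new_exposure = exposure_us * ratio
    if new_exposure <= max_exposure_us:
        return new_exposure, gain
    # Exposure time capped: compensate the remainder with gain.
    new_gain = min(gain * new_exposure / max_exposure_us, max_gain)
    return max_exposure_us, new_gain
```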
- the control function also utilizes the information being received from the user interface. For example, if the user wants to zoom into the image, the control function starts to change lens positions.
- An optics driver is used when the lens system is moved; it is typically controlled by I2C (Inter-Integrated Circuit) commands or by using Pulse Width Modulation (PWM) signals.
- the imaging system may also comprise, or be connected with, input devices (e.g. control buttons for zoom, scene selection and shutter control). A flash is also typically used in the imaging system. All the image processing, including the focus detector, the exposure detector, the control function and the actual image processing, can be done in the camera module, in the camera processor, in the application engine, in the baseband engine or in any combination of those. The processing can also be implemented by using software or hardware processing blocks. At least the detector and control functions of the image processing have to operate in real time.
- imaging may refer to still imaging, video imaging or viewfinder imaging.
- Still imaging produces visual information that is characterized by being non-moving.
- a still image is stored into a memory right after it has been taken.
- Video imaging produces moving visual representation that changes with time.
- a series of visual representations are taken in order to give an impression of motion when shown in succession.
- Viewfinder imaging provides images for the viewfinder display.
- the viewfinder of a digital imaging system is typically an integrated color display that provides a preview of the scene that the user is capturing.
- the viewfinder image seen on the display is typically taken from the image sensor and, after being scaled down from its original resolution in the sensor or in the processor, is shown on the viewfinder display.
- Viewfinder images should preferably be updated on the viewfinder display promptly and with minimum delay in order to provide good real time feel and response to the user.
- Focusing can be done either automatically (auto-focus) or manually, involving user interaction. Further, the auto-focus (AF) function can be implemented by using single shot auto-focusing or by using continuous auto-focusing. Single shot auto-focusing is typically applied when capturing still images and continuous auto-focusing is applied in video imaging.
- Single shot auto-focus is typically implemented in such a way that the lens is moved through its range, using fixed increments, and focus detector values are recorded. When the scanning has finished, the lens is moved to the position where the contrast was found to be at its maximum.
- Single shot auto-focus can be activated, for example, by pressing the image capture button halfway. Therefore, when the capture button is pressed all the way down, the imaging optics has already been adjusted properly and hence the image can be captured immediately, giving a good user experience. The performance of the focusing system can be characterized by the time it takes to find the best focus and by the accuracy of the focused image. In continuous auto-focusing the focus detector values are determined from captured images substantially continuously, and focusing is improved by adjusting the imaging optics whenever the focus detector values indicate that this is necessary.
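- A minimal sketch of such a single shot scan, assuming a lens driver exposed as a `move_lens(position)` callable and a `capture_region()` callable returning the focus detection area (both names are illustrative; `focus_measure` is the contrast sketch given earlier):

```python
def single_shot_af(move_lens, capture_region, positions):
    """Step the lens through `positions`, record a contrast value at each
    stop, and finally move to the position with the maximum contrast."""
    best_pos, best_val = positions[0], float("-inf")
    for pos in positions:
        move_lens(pos)                     # a real system waits for settling here
        value = focus_measure(capture_region())
        if value > best_val:
            best_pos, best_val = pos, value
    move_lens(best_pos)                    # final move to the sharpest position
    return best_pos
```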
- captured images are also displayed on the viewfinder display in real-time.
- the advantage of continuous auto-focusing is that the optics can be kept continuously in focus and therefore the viewfinder image also stays in focus all the time. In video recording this is a clear necessity, but it is also highly beneficial when recording still images: a single still image can then be captured without delay, or after a short delay, by fine-tuning the basic continuous focusing with a quick single shot focusing procedure.
- the viewfinder image usually needs to be sub-sampled, binned or downscaled because of the bandwidth limitation in the interface between the camera module and subsequent parts of the image processing chain; the quality of auto-focus detection in the later parts of the chain is therefore limited by the limited resolution of the viewfinder images.
- the auto-focus detection information can be calculated by dedicated hardware or software immediately when the image information from the detection regions is available for the focus detector.
- the auto-focusing need not be based on sub-sampled, binned or downscaled viewfinder images; it can be performed on a selected part of the image using the full resolution of that part.
- the regions of the focus detection are located in the middle of the image area. In this case, the decision for the next frame's lens movement can be available before all the lines of the current image are fully exposed and transmitted.
- the problem is that, if the lenses are moved immediately after the auto-focus processing is done for the center part of the image, then the last lines of the current image are exposed with moving lens and the artefact caused by it can be easily seen in the captured or viewed image. Similar kind of an artefact can be caused, if the exposure of the next image frame is started before the lens movement has ended.
- This situation is possible with both the focus lens and the optical zoom lens movements.
- with the rolling shutter the first lines of the image are corrupted, and with the global shutter the whole image is corrupted.
- the zoom lens can be moved only at a time when the image sensor is not exposing, i.e. between image frames. Timing of the commands is very critical: even if dedicated hardware is used, there might be a delay before the lenses actually move.
- Image focusing, especially with single shot auto-focus, takes a significant amount of time; in a rapidly changing situation the scene to be captured may already be unavailable by the time the camera system is finally ready and the image is focused.
- Such a situation is typical for example when imaging sports or other activities, where the scene contains fast moving objects and rapidly changing situations.
- the command for the lens movement can be given immediately after the decision for the lens movement has been made, without considering the effect on the image being captured. In these cases, typically the last lines of the image become corrupted.
- Alternatively, the command for the lens movement is given only after the whole image has been captured. In this case the start of the lens movement is effectively delayed until the whole image has been captured and then, depending on the lengths of the blanking and exposure times, the lens is moved during the blanking period. Due to the shortness of that period, quite often the first lines of the next image become corrupted because the lens movement continues too long.
- the auto-focus detection is traditionally made by measuring auto-focus detection values frame by frame.
- This type of detection requires that the whole image frame, or the whole sub-sampled image frame, be AD converted when the focus detection is performed. Quite often some frames are skipped due to the lack of time for proper focus detection or for proper image viewing. This increases the focusing time even more. With video images the frames are not usually skipped, but then the artefacts caused by the exposure and the lens movement can be seen in the recorded video sequence. It can be clearly seen that solutions for exposing the image properly at the time when the lens also needs to be moved for focusing or zooming purposes, without damaging the images to be captured, still need to be developed further in order to overcome the deficiencies of the state of the art.
- This invention aims to provide a solution that maximizes the time available for adjustments of the optics and at the same time minimizes the artefacts caused to captured images. At the same time the invention aims to minimize the response times, providing an improved user experience.
- an imaging method for acquiring an image sequence comprising at least two images, at least one of which is used as measurement image and at least one other of which is used as final image, determining a measurement image exposure time and a final image exposure time, determining a non-exposure time between the measurement image exposure time and the final image exposure time, and allowing adjustment of imaging optics during said non-exposure time.
- The first example of the current invention is a so-called timing solution for the optics adjustment.
- a proper timing is determined by auto focus detection values.
- the timing describes how the auto focus and/or zoom optics need to be adjusted.
- since the first example of the invention defines an exact point in time when the focus or zoom optics can be adjusted, image artefacts can be avoided. If the frame blanking times for the given situation are small, the invention brings much more time margin for the optics control beyond the blanking time.
- Said first example also minimizes the latency in the control loop and improves the real-time performance, since it is guaranteed that the auto focus statistic calculations have been finished for the previous frame and the optics adjustment has been readily applied before the next frame.
- The settling time for the auto-focus/zoom hardware, i.e. the total time needed for the position of the optics to finally freeze after it starts to move, might be in the same range as the blanking time. Therefore it is important to be able to provide a long enough settling time before exposing the pixels of interest.
- A long settling time allows a small start-up current in the auto-focus/zoom actuator controller, which is an advantage especially in portable applications where only a battery with limited capacity may be available.
- the actuator does not have to be extremely fast, which means that less power can be used for the optics adjustment.
- a second example of the current invention is to detect auto-focus from multiple optics positions within one frame. This can be done by adjusting the optics while the detection area pixels are not exposed, yet still during the exposure of the total image frame.
- the detection area is the area of interest in the image, which is used for the focus detection.
- This second example enables a shorter time for finding the focus. In addition, lower power consumption can be achieved, because continuous focusing is not always needed. This example also improves usability.
- a third example of the current invention is to control the blanking period, or the exposure and lens movement times.
- This example provides video and viewfinder images that are not corrupted and that can be provided at the maximum repetition frequency for the lighting and control conditions in question.
- if the maximum image frequency is not wanted, a fixed image frequency enables the optics adjustment and assures the maximum exposure time regardless of how much the optics is adjusted (a lot, a little, or not at all).
- zooming may be accelerated, or the temporal peak effect may be decreased, when the optics are adjusted more slowly.
- the third example also enables a flexible automatic night/day mode, whereby the image frequency may slow down according to the exposure time, but not more than that.
- Figure 1 illustrates an example of an image sequence
- Figure 2 illustrates an example of timing solution for the optics adjustment
- Figure 3 illustrates an example of an image frame comprising auto-focus windows
- Figure 4 illustrates an example of an auto-focus system
- Figure 5 illustrates an example of one optics position and adjustment during a frame period
- Figure 6 illustrates an example of measuring N optics positions within one frame
- Figure 7 illustrates an example of a focus measure as a function of an optics position
- Figure 8 illustrates an example of a full focus scan during one frame
- Figure 9 illustrates an example of a focus scan with the global shutter
- Figure 10 illustrates an example of a static blanking period solution
- Figure 11 illustrates an example of a dynamic blanking period solution
- Figure 12 illustrates an example of a device according to the invention.
- the current invention relates to an imaging system with adjustable optics.
- the imaging system may be a digital still image camera, a digital video camera, a mobile terminal capable of either still imaging or video imaging or both, or any other electronic device capable of imaging.
- the imaging system comprises adjustable optics that can be moved (e.g. auto-focus lens or optical zoom lens) and a sensor (e.g. CCD sensor or CMOS sensor).
- The system further comprises image processing means that relate to the image sensor and may be located on a camera module, on a separate processing circuit, on an application engine of a mobile device, or on a combination of the previous. The processing operation consists at least of forming an image, improvement functions for the image, and real-time controls such as lighting (EC), white balance (WB) and sharpness (F).
- the real-time processing can be implemented automatically, whereupon no actions from the user are needed.
- the imaging system comprises also input devices or is connected to such, by means of which it is possible to control the operations of the camera. These operations can be e.g. a zoom control, an object selection, a mode selection and a launcher that activates the image capturing or the video imaging.
- by optics, e.g. a traditional lens, a liquid lens or similar is meant. Therefore when "lens movement" or "moving of a lens" is written in the description, the skilled person will appreciate that the moving is the actual operation of a traditional lens, but when e.g. a liquid lens is used, the moving is some other adjusting operation by means of which the light can be projected onto the image sensor and by means of which the image can be outlined.
- the imaging system comprises also shutter means, such as a global shutter or a rolling shutter.
- Figure 1 illustrates an example of an "image sequence" comprising at least two frames F1, F2.
- One of the frames is a measurement image F1
- the other is a final image F2.
- the final image is the one being stored and the measurement image can be used for measuring e.g. the focus or exposure time.
- a measurement area M1 can be defined in the measurement image F1 and it is used for the measurement.
- the final image can be a raw image being obtained from a sensor. Therefore digital image processing or other algorithms can be performed to such a final image before the actual storage.
- the measurement image is typically smaller than the final image, and it is not stored. However, when the image sequence is a video sequence, the measurement images are typically also so-called final images and will be stored.
- “Blanking time” relates to the time during which the sensor is not able to record meaningful image data due to frame/line or pixel resetting or any other sensor architecture originated reasons or due to user defined control.
- the "blanking time” not necessarily correspond time, when the image is not exposed, but the time when the pixels are not sent from the sensor forward.
- the blanking time is shown in Figure 1 between the two frames F1, F2. With the rolling shutter, light is received continuously, but the pixels and lines are reset one exposure time before the actual reading. The time when the pixels in the sensor are not exposed falls within the vertical blanking period. Still, the sensor can be exposed during each blanking period (e.g. at least the following line is exposed during line blanking).
- the blanking time can occur between image frames, between lines and also between pixels.
- the exposure of the measurement image F1 can behave completely differently from that of the final image F2.
- the exposure for the measurement image F1 starts before a readout of the measurement area M1.
- in Figure 1 the exposure of the final image F2 does not continue into the blanking time following the final image F2.
- the "non-exposure time" between the frames F1 , F2 defines how long there is time from the beginning of the blanking time, during which the exposure is not done at least to the next pixel, line or image, and during which the lens can be moved.
- the non-exposure time starts from when the exposure of the measurement image F1 has been ended.
- the non-exposure time can be extended depending on the time needed for the lens movement. In the case of video, the non-exposure time cannot be extended as much as with viewfinder or measurement images.
- the extension of the non-exposure time can be implemented by increasing the channel rate, which creates faster image readout.
- if the image is read faster, there will be a longer time for blanking, which can be used for the lens movement.
- the prior art teaches against increasing the channel rate, because of the costs and because the increase would cause EMC noise. But with smaller processes and differential serial interfaces, this kind of increase becomes possible.
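- A small worked sketch of this trade-off (all numbers are assumed): at a fixed frame rate, whatever readout time a faster channel saves becomes extra blanking time available for the lens movement.

```python
def blanking_time_s(frame_rate_hz: float, pixels_per_frame: int,
                    channel_rate_pixels_per_s: float) -> float:
    """Blanking per frame = frame period - readout time."""
    return 1.0 / frame_rate_hz - pixels_per_frame / channel_rate_pixels_per_s

pixels = 1600 * 1200                          # assumed sensor resolution
print(blanking_time_s(15, pixels, 40e6))      # ~0.0187 s of blanking per frame
print(blanking_time_s(15, pixels, 80e6))      # ~0.0427 s: faster channel, more blanking
```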
- the focus and the measurement are targeted at a smaller image, and therefore the image is sub-sampled, binned or downscaled, thus providing an image of smaller size.
- the smaller image can in addition be read faster, whereby there will be more time left for the non-exposure.
- the measurement images are smaller than the actual final image, and therefore there will be time left automatically for the non-exposure.
- the basic method comprises steps for acquiring an image sequence formed of a series of image frames or a series of image sections, e.g. lines. At least a portion of the image sequence is used for controlling the lens. Blanking times are defined in the image sequence between images, between lines or between pixels.
- the non-exposure time is defined.
- the lens is in the end moved during the non-exposure time, which non-exposure time may partly consist of the blanking time.
- the purpose of the solution is not to let the lens movement (auto-focus, zoom) overlap the exposure. Therefore a time period for the lens movement during which no exposure is done, i.e. the non-exposure time, is defined.
- the non-exposure time can be defined basically as the time gap between successive images and their exposures.
- the imaging system comprises means for enabling the operations during the non-exposure time.
- Those operations include e.g. defining the exposure time and defining the location in the image where information is taken.
- the system needs to be aware of the actions of the pixel cell and of the actions being done earlier.
- said means are arranged to know all the delays of each operation, e.g. how much delay results from different amounts of lens movement. Further, said means are capable of knowing whether the focus has gone, or is going, wrong.
- Figure 2 shows a solution for achieving a proper timing for the lens movement.
- the pixel data is exposed and transmitted (1) from the sensor 100.
- the processor 110 receives the image data, being the measurement image, and includes e.g. auto-focus logic 112, where the auto-focus detection is made.
- Detection block 112 calculates the auto-focus statistics, or the statistics are already calculated in the imaging sensor and transmitted to the control CPU 113 (e.g. through I2C or within the image frame). After being calculated, the auto-focus statistics are reported (3) to the control CPU 113.
- the control CPU 113 reads (4) the auto-focus statistics and draws the necessary conclusions about the needed non-exposure time, i.e. how the auto-focus and/or zoom lenses need to be adjusted.
- the control CPU 113 can use also the information received from a user interface, when deciding the needs for the lens movement.
- line counters in the receiver logic 111 that receives the image data from the sensor 100 can be used.
- the control CPU 113 reads (5) the line counter register in the receiver 111 after the auto-focus statistics are calculated, and by knowing the number of pixels in the image sensor, transmission clock frequency and possible delay in the zoom hardware, it is possible to determine whether the lens can be moved or not.
- for example, an image frame from a 1600 x 1200 sensor is received at a rate of 20 frames per second (f/s).
- the transmission time is then 40 microseconds per line, if there are 50 blanking lines per frame.
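- The 40 microsecond figure follows directly from the frame period and the total line count; a sketch of the arithmetic, matching the numbers of the example (the line counter helper at the end is an assumed illustration):

```python
frame_rate = 20                                 # frames per second
visible_lines = 1200                            # 1600 x 1200 sensor
blanking_lines = 50
total_lines = visible_lines + blanking_lines    # 1250 lines per frame period

frame_period_us = 1e6 / frame_rate              # 50 000 us
line_time_us = frame_period_us / total_lines    # 40 us per line
print(line_time_us)                             # 40.0

def time_left_in_frame_us(line_counter: int) -> float:
    """Estimate the time remaining in the frame from a line counter value."""
    return (total_lines - line_counter) * line_time_us
```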
- Control CPU 113 also controls exposure values and the blanking periods so that the lens movements will not corrupt the next image frame.
- the control CPU 113 takes care that the next focus detection regions are not corrupted by the lens movements, for example when a single-shot focus is attempted as soon as possible. If a global shutter is used, the timing of the global shutter needs to be known in order to start the lens movement as quickly as possible.
- Figure 3 illustrates auto-focus locations inside a measurement image frame.
- the auto-focus locations can be considered as measurement areas and they are presented with reference numbers 101–108 in the sensor array 100.
- One exposure example (with rolling shutter), relating to the first five lines in a (CMOS) sensor using a three-line exposure time, is illustrated by the following Table 1.
| | Step 1: First line reset | Step 2: Integration starts | Step 3: Integration continues | Step 4: First line output | Step 5: Second line output | Step 6: Third line output | Step 7: Fourth line output | Step 8: Picture complete |
|---|---|---|---|---|---|---|---|---|
| Line 1 | Reset | Exposure | Exposure | Read out | — | — | — | — |
| Line 2 | — | Reset | Exposure | Exposure | Read out | — | — | — |
| Line 3 | — | — | Reset | Exposure | Exposure | Read out | — | — |
| Line 4 | — | — | — | Reset | Exposure | Exposure | Read out | — |
| Line 5 | — | — | — | — | Reset | Exposure | Exposure | Read out |
- Consider step 4 of the table.
- In step 4 the first line (line 1) is read, the following two lines (lines 2, 3) are exposed, and the fourth line (line 4) is reset. After this the following steps proceed until the fifth line (line 5) has been read in step 8.
- the previous steps 1–3 initialize the exposure operation and this typically happens during the vertical blanking period. If the actions of line 1 in steps 1–4 are traced, it can be seen that the line in question is first reset, then exposed during two lines, and read at the end. If, on the other hand, the reset is traced, it can be seen that the reset moves from line 1 in step 1 one line forward until it is in line 5 during step 5. In this example the following lines or the blanking time are not illustrated, and the blanking time is assumed to be greater than the exposure time, whereby line 1 does not need to be reset for the next image until the last line of the current image has been read.
- line counters are described in context with the receiver logic 111 for evaluating the status of the imaging sensor.
- the line counter can also be used for time measurement purposes. This example works as such in a situation where the exposure time is very short and where the lens movement is very quick.
- Short exposure time means that it is shorter than vertical blanking period, whereby there is time for the lens movement within the blanking time.
- if the exposure time is shorter than this, there is enough time for the lens movement within the blanking period; otherwise the first line of the following image would be exposed before the last line of the current image has been read.
- the reading of the sensor pixels should be accelerated at the same time, because otherwise it is not possible to have 20 frames per second from the sensor. Typically at least 15 frames per second is required in order to have the distortion of the image caused by the rolling shutter tolerable.
- the table 1 presents an example of first five lines of a sensor having e.g. 1200 visible image lines and e.g. 50 blanking lines.
- the reading of the first line begins in step 4, and steps 1–3 occur during the blanking time.
- Figure 4 illustrates an example of an auto-focus system.
- the auto-focus system comprises at least a sensor 300, a focus detection module 312, an auto-focus control module 314, an optics driver 315 and optics 316.
- the basic method is to move the lens through its range and record the contrast values, and then to move the lens to the position with the best contrast.
- the current example of the invention makes it possible to find focus faster than by using implementations of related art.
- the idea of the current invention is to measure focus from one or multiple lens positions within one or many frames, thus making the focus search shorter. This example is described by the following methods. In these methods the lens positions need not advance in fixed increments; the auto-focus control takes responsibility for selecting the new lens position for the measurement.
- 2.1 Measuring one lens position within one frame
- Figure 5 illustrates an example where one lens position is measured within one frame reading time T_f. Contrast is detected from the measurement area M in the image frame.
- the measurement value for the lens position is obtained by gathering high frequency (and/or bandpass) content from sub-areas of the measurement area M. It is also possible that only the set of measured sub-areas are used in the evaluation phase.
- the lens is moved between positions P_n and P_n+1 during a lens movement time T_Lens, between the readout of the last line of the measurement area M and the start of the exposure of the first line of the measurement area M in the next frame N+1 (not shown in Figure 5). If the lens is moved outside this time window, the lines in the measurement area will get mixed data and the measurement area no longer corresponds to only one lens position.
- Position P_n+1 is measured in the next frame N+1.
- the time allocated for the lens movement, i.e. the non-exposure time, is T_Lens = T_f - T_M - T_exp, where T_f is the frame reading time, T_M is the readout time of the measurement area M, and T_exp stands for the exposure time.
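- A small numeric sketch of this relation (the frame, measurement-area and exposure times below are assumed values):

```python
def non_exposure_time(t_frame: float, t_meas_readout: float, t_exp: float) -> float:
    """T_Lens = T_f - T_M - T_exp: the window between the readout of the last
    line of area M and the start of exposure of the first line of M in the
    next frame, available for moving the lens."""
    return t_frame - t_meas_readout - t_exp

# Assumed: 15 f/s frame (~66.7 ms), area M read out in 10 ms, 20 ms exposure.
print(non_exposure_time(1 / 15, 0.010, 0.020))   # ~0.0367 s for the lens move
```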
- 2.2 Measuring two lens positions within one frame
- in this example two lens positions are measured within one exposed frame.
- the contrast is detected from areas M1 and M2 in an image frame I.
- the measurement value is obtained by gathering the high-frequency (and/or band-pass) content from the sub-areas. It is also possible to use only the set of measured sub-areas in the evaluation phase.
- the exposure of the first line in the area M1 is started at the readout of line L_read. This means that the first line in the area M1 starts exposing during the readout of line L_read.
- the lens is moved between positions P_n and P_n+1 during time T_Lens_M1-M2, between the readout of the last line of the area M1 in image frame N and the start of the exposure of the first line of the area M2.
- the lens is moved between positions P_n+1 and P_n+2 during time T_Lens_M2-M1 (not shown in the figure), between the readout of the last line of area M2 in image frame N and the start of the exposure of the first line of the area M1 in the next frame N+1. If the lens is moved outside these time windows, the lines in the measurement areas will get mixed data, and a measurement area no longer corresponds to only one lens position. Positions P_n+2 and P_n+3 are measured in the next frame.
- the exposure time T_exp is typically constant within one frame, but the skilled person will appreciate that it can also vary within one frame.
- Figure 7 shows a result after scanning the lens movement range.
- Figure 7 relates to Figure 6 and shows curves relating to two separate measurement areas, where one area has more information than the other.
- the peak focus position can be estimated by combining the curves.
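- The patent does not specify how the curves are combined; one common approach, sketched here as an assumption, is to sum the per-area curves and refine the maximum by quadratic interpolation of the three samples around it:

```python
import numpy as np

def estimate_peak(positions, curve_a, curve_b):
    """Combine two focus-measure curves and refine the maximum with a
    parabola fitted through the three samples around it."""
    combined = np.asarray(curve_a, dtype=float) + np.asarray(curve_b, dtype=float)
    i = int(np.argmax(combined))
    if i == 0 or i == len(combined) - 1:
        return positions[i]                  # peak at an endpoint: no interpolation
    y0, y1, y2 = combined[i - 1], combined[i], combined[i + 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:
        return positions[i]                  # flat top: keep the sampled position
    offset = 0.5 * (y0 - y2) / denom         # vertex of the fitted parabola
    return positions[i] + offset * (positions[i + 1] - positions[i])
```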
- Exposure time T_exp is used to define how many measurements can be done within one frame.
- lens characterization values, e.g. MTF/PSF (Modulation Transfer Function / Point Spread Function), can be utilized when the decision for focus is evaluated.
- the example 2.2 describes two measurement areas, but the skilled person will appreciate that the number is not limited to two.
- the lens positions are sequential in said example, but the positions can also be different.
- the distances between the lens positions need not always be the same. The size and place of the areas can vary depending on the exposure time and the time needed for the lens movement.
- 2.3 Continuous movement within one frame (or two frames)
- This example is similar to the example 2.1, but in this example the lens is not stopped at a particular lens position.
- the sub-areas of area M contain data from a subinterval of the total lens movement range. The interpretation of the focus values has to take this into account. Also exposure time needs to be taken into account within these calculations.
- the current example is useful when an image is taken of a flat object, such as a document or a business card.
- the lens can be moved at a fixed speed, but it can also be moved with varying speed and trajectory, for example several cycles over the range during the frame.
- the lens can move from minimum to maximum during the first frame and back from maximum to minimum during the second frame. In this way two curves can be created, and so the effect of different contrast areas in different parts of the image can be reduced.
- the lens characterization values, e.g. MTF/PSF, can be utilized here as well.
- Figure 9 shows an example where a cropped image 810 is used for auto-focus metering and a full image 800 can be used as a viewfinder image.
- the first timing chart 801 shows the normal operation mode, where the frame rate and the focus speed are usually limited by the ADC (Analog-to-Digital Conversion) speed. Naturally, if the exposure time is very long, the exposure time can also be the limiting factor.
- the second timing chart 802 shows a system, where the focus speed is maximized, but the viewfinder image is not captured at all. In this case the focus speed is limited by the lens movement, the reset and the exposure time.
- the third timing chart 803 shows an example where fast focusing can be achieved but the preview image can still be shown at a reasonable frame rate. When cropping is done, all the charges outside the cropping area window can be ignored and not AD converted.
- the fast focus detection solution has considerable advantages as regards the time needed to find focus. For example, if "X" corresponds to the number of needed measurements, then the time can be shortened to X frames by the example 2.1 (by limiting the exposure time); by the example 2.2 the time can be shortened to X/N; by the example 2.3 the time can be shortened to a single frame; and by the example 2.4 the time can be shortened by increasing the frame rate for the measurements.
- the example 2.3 also lowers power consumption because continuous focus is not needed.
- This example describes a method where either the blanking periods are controlled (dynamically varied blanking period), or the exposure and lens movement times are controlled (static blanking period).
- Dynamically varied blanking period
- When the dynamically varied blanking period is used, the maximum image frame rate can be achieved within the known exposure and lens movement times without corrupting the image information. It also enables the use of automatic night and day light scene variation within the maximum image frame rate, with or even without the lens movements.
- the frame rate is not constant when the dynamically varied blanking period is used. A static blanking period, however, enables a constant frame rate.
- the desired image frequency is achieved by selecting the maximum blanking time based on the bus rate being used, after which the exposure time is limited to a maximum that still allows moving the lens within the blanking time, defined depending on the way the lenses are moved.
- the exposure time can be limited in such a manner that the shortened exposure time is compensated with analogue or digital gain.
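- A sketch of the static-blanking constraint (names, units and the exact budget split are assumptions): the exposure budget is whatever the fixed blanking period leaves after the lens movement, and any shortfall is compensated with gain.

```python
def limit_exposure(requested_exp_ms: float, blanking_ms: float,
                   lens_move_ms: float):
    """Static blanking period: exposure and lens movement must together fit
    into the fixed blanking time; a too-long exposure is shortened and the
    lost signal is made up with analogue/digital gain."""
    max_exp_ms = blanking_ms - lens_move_ms
    if max_exp_ms <= 0:
        raise ValueError("lens movement alone exceeds the blanking period")
    if requested_exp_ms <= max_exp_ms:
        return requested_exp_ms, 1.0
    gain = requested_exp_ms / max_exp_ms   # keep the same total signal level
    return max_exp_ms, gain

print(limit_exposure(8.0, 10.0, 4.0))      # (6.0, 1.33...): shortened + gain
```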
- Reference signs 96a, 96b represent the non-exposure time of the complete visible image frame, whereas signs 97a, 97b represent the non-exposure time of the AF-statistics block.
- frame blanking 92a, 92b, 92c, line blanking 91a, 91b, and embedded/ancillary data 94a, 94b, 94c, 94d are also illustrated.
- the reading of the lines is referred to by sign 98, whereas the exposure time is referred to by sign 95 and the visible data by 99.
- the exposure and lens movement times are restricted so that the images can be captured without any artefacts. This means that the blanking time has to be as long as possible, which is enabled by the interface within the required frame rate. In addition, if long exposure or long lens movement times are required, then one or both have to be limited. This means that the exposure time has to be compensated by using e.g. analogue gain, and the speed of zooming is reduced.
- a frame blanking time 102a, 102b, 102c is used for each image that is long enough for the lens movement and for exposing the first lines of the next image, the number of which is defined by the exposure time.
- the lens movement can be started immediately after the last visible line of the previous image (Frame N-1) has been exposed.
- the control of lens movement can thus be started at the time it is known into which direction the lens is supposed to be moved (zoom control or auto-focus control) and when the delay needed for starting the lens movement is left from the exposure of the last visible pixel (line).
- the exposure of the first pixel (line) can be started by resetting the pixels of the line in question.
- the lens movement and the time (106a, 107a, 106b, 107b) required by it, which are done for the next image (Frame N+1), are already known before the current image (Frame N) is exposed, as long as it is known which operation is active (zooming or auto-focus control) and what the amount and direction of the lens movement are.
- the time (107a, 107b) for the lens movement differs from the time (106a, 106b) for the lens movement with still or video images.
- the zoom control results from the user, and there has to be enough hysteresis in a continuous auto-focus control in order not to move the lenses continuously back and forth at a high frequency.
- the exposure (105b) of the next image (Frame N+1) is already known, whereby it is easy to calculate the number of blanking lines needed in the frame blanking area (102b) of the current image (Frame N).
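- A sketch of that calculation (variable names and the example numbers are assumed): given the line time, the next frame's exposure and the planned lens movement, the number of blanking lines for the current frame follows directly.

```python
import math

def blanking_lines_needed(lens_move_us: float, next_exposure_us: float,
                          line_time_us: float, min_lines: int = 0) -> int:
    """Dynamic blanking: the blanking area of the current frame must cover
    the lens movement plus the exposure start of the next frame's first lines."""
    lines = math.ceil((lens_move_us + next_exposure_us) / line_time_us)
    return max(lines, min_lines)

# Assumed: 40 us lines, 1.2 ms lens move, 0.8 ms exposure -> 50 blanking lines.
print(blanking_lines_needed(1200.0, 800.0, 40.0))   # 50
```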
- the reading rate of the sensor is not changed and therefore there is no need to change the sensor's system clock, only the number of visible and blanking lines to be transmitted in the image.
- the blanking pixels in the line blanking areas (101a, 101b) are pixels which are not included in the image being displayed, i.e. the visible image 109a, 109b.
- There can also be other non-visible image areas, for example vertical blanking, ancillary/embedded data, dummy pixels, dark pixels, black pixels and manufacturer-specific data.
- changing the number of visible lines corresponds to image cropping, which is usually done when images are zoomed digitally.
- Figure 11 illustrates changing the frame blanking by inserting blanking lines, but at least with sensors of the SMIA (Standard Mobile Imaging Architecture) specification it is also possible to change the line blanking by inserting pixels at the end of the line (101a, 101b).
- SMIA sensors are originally intended to maintain a constant image frequency without changing the system clock. Similarly, these sensors are designed to implement long exposure times without a need to change the read rate or to continue the exposure of the image over the frame limit. By using the control structure of this example it is possible to achieve as high an image rate as possible.
- the lens movement is not shown at any phase in the image. This system therefore provides as high an image frequency as possible, without corrupting the image and with the desired exposure time, by exposing the image in such a manner that the lens can be moved when desired.
- the blanking areas can be increased dynamically even a little more than absolutely needed in order to achieve more suitable viewfinder frame updating.
- blanking lines are added so that the time from the current frame start to the next frame start is e.g. 1/60, 2/60, ..., n/60 s.
- Figures 10 and 11 illustrate static blanking period solution and dynamic blanking period solution respectively.
- the time for lens movement, exposure time and focusing data area are described.
- the biggest difference between Figures 10 and 11 is that in Figure 10 the blanking period is fixed and the time for the lens movement (96a, 97a, 96b, 97b) varies based on the required exposure time (95a, 95b) (or vice versa).
- in Figure 11 the time for the lens movement (106a, 107a, 106b, 107b) and the time for the exposure (105a, 105b) are known, and the blanking period varies based on those.
- the previous processes (3.1, 3.2) also work with a global shutter in the following manner.
- the exposure of the image is shut by the global shutter, after which the lens movement can be started.
- the lens movement is started regardless of how long the reading of the image in question from the sensor still lasts.
- the global reset of the pixels (or the opening of the global shutter), and thus the exposure of the new image, is started right after the lens has been moved to the right location, the previous visible image has been read out from the sensor, and the global shutter has been opened after said reading.
- the non-exposure time is the time during which the sensor is read after closing the shutter, plus the blanking time before the first used line of the next image is reset (typically a global reset). During the non-exposure time the sensor does not receive light (or the light is discarded) in those pixels which are visible in the final image or are used in measurements.
- temporary corruption of the viewfinder image need not be prevented, because such an image is not stored for further use. Therefore, for both processes (3.1, 3.2), as quick an auto-focus as possible with a long exposure time can be achieved by moving the lens also during the exposure of pixels or pixel lines that do not belong to the focus control. This movement is visible in the viewfinder image, but not in the final, stored still images. Also for video images it is better to perform the first auto-focus control as quickly as possible; the image may thus be corrupted as long as the areas used for statistics are not damaged.
- in the implementation targeting the maximum image frequency (3.2) the blanking time can often be set to zero (or to the sensor limit).
- said implementation targeting the maximum image frequency works well in an automatic control of the night/day mode (with or without the lens movement), whereby blanking is increased depending on the lighting conditions, but the viewfinder image is not slowed down more than needed.
- the previous examples can be implemented on the control CPU of an imaging system being part of an electronic device, such as a mobile device, a digital camera, a web camera or similar.
- Dedicated hardware implementations may be needed in the imaging sensor or in the receiver block in order to have faster and more accurate timings.
- Figure 12 illustrates a possible configuration of the electronic device.
- the device 1200 in Figure 12 comprises communication means 1220 having a transmitter 1221 and a receiver 1222, or is connected to such.
- the first communication means 1220 can be adapted for telecommunication and the other communication means 1280 can be a short-range communication means, such as a Bluetooth™ system, a WLAN system (Wireless Local Area Network) or another system which suits local use and communicating with another device.
- the device 1200 according to the example in Figure 12 also comprises a display 1240 for displaying visual information and the imaging data. Further, the device 1200 may comprise interaction means, such as a keypad 1250, for inputting data etc. In addition to or instead of the keypad 1250, the device can comprise a stylus, where the display is a touch-screen display.
- the device 1200 comprises audio means 1260, such as an earphone 1261 and a microphone 1262 and optionally a codec for coding (and decoding, if needed) the audio information.
- the device 1200 comprises an imaging system 1210 or is connected to such.
- a control unit 1230 may be incorporated to the device 1200 for controlling functions and running applications in the device 1200.
- the control unit 1230 may comprise one or more processors (CPU, DSP).
- the device comprises memory 1270 for storing e.g. data, applications, and computer program code.
- the imaging system may also incorporate any number of capabilities and functionalities which suitably enhance the efficiency of the system.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
- Exposure Control For Cameras (AREA)
- Focusing (AREA)
Abstract
Description
Claims
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/FI2005/050409 WO2007057498A1 (en) | 2005-11-15 | 2005-11-15 | Imaging system with adjustable optics |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1949671A1 true EP1949671A1 (en) | 2008-07-30 |
EP1949671A4 EP1949671A4 (en) | 2008-11-05 |
Family
ID=38048319
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05808852A Withdrawn EP1949671A4 (en) | 2005-11-15 | 2005-11-15 | Imaging system with adjustable optics |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1949671A4 (en) |
JP (1) | JP5086270B2 (en) |
KR (1) | KR20100023056A (en) |
CN (1) | CN101326814A (en) |
WO (1) | WO2007057498A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101682696B (en) * | 2007-06-04 | 2012-04-18 | 夏普株式会社 | Portable terminal, control method for portable terminal |
JP5171433B2 (en) | 2008-01-22 | 2013-03-27 | キヤノン株式会社 | Imaging device and lens device |
JP5400406B2 (en) | 2009-02-06 | 2014-01-29 | キヤノン株式会社 | Imaging device |
US8542287B2 (en) | 2009-03-19 | 2013-09-24 | Digitaloptics Corporation | Dual sensor camera |
JP5471004B2 (en) * | 2009-04-22 | 2014-04-16 | カシオ計算機株式会社 | Focus adjustment apparatus, focus adjustment method, and program |
JP5491677B2 (en) * | 2011-03-31 | 2014-05-14 | 富士フイルム株式会社 | Imaging apparatus and focus control method thereof |
EP2592821A1 (en) * | 2011-11-10 | 2013-05-15 | Research In Motion Limited | Camera autofocus apparatus and associated method |
WO2014001844A1 (en) | 2012-06-27 | 2014-01-03 | Nokia Corporation | Imaging and sensing during an auto-focus procedure |
WO2014078735A1 (en) * | 2012-11-16 | 2014-05-22 | Molecular Devices, Llc | System and method of acquiring images with a rolling shutter camera while asynchronously sequencing microscope devices |
US9462188B2 (en) * | 2014-10-31 | 2016-10-04 | Qualcomm Incorporated | Time extension for image frame processing |
CN111355895B (en) * | 2018-12-05 | 2021-07-16 | 北京图森智途科技有限公司 | Image exposure and gain adjustment method, imaging device and vehicle |
CN111818272B (en) * | 2020-06-30 | 2021-09-03 | 浙江大华技术股份有限公司 | Method for eliminating image flicker, electronic device and storage medium |
DE102022133188A1 (en) | 2022-12-14 | 2024-06-20 | Connaught Electronics Ltd. | Adjusting the focus of a vehicle camera for different areas of interest |
DE102022133187A1 (en) | 2022-12-14 | 2024-06-20 | Connaught Electronics Ltd. | Focus adjustment of a vehicle camera |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6683651B1 (en) * | 1999-10-28 | 2004-01-27 | Hewlett-Packard Development Company, L.P. | Method of automatically adjusting focus in a shutterless digital camera |
US20040165090A1 (en) * | 2003-02-13 | 2004-08-26 | Alex Ning | Auto-focus (AF) lens and process |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0834551B2 (en) * | 1988-07-19 | 1996-03-29 | 松下電器産業株式会社 | Automatic focus adjustment device |
JPH04133015A (en) * | 1990-09-26 | 1992-05-07 | Nikon Corp | Control method for automatic focusing device |
JPH04229783A (en) * | 1990-12-27 | 1992-08-19 | Sony Corp | Video camera |
US5563658A (en) * | 1994-12-16 | 1996-10-08 | Eastman Kodak Company | Electronic camera with rapid automatic focus of an image upon an image sensor |
US5668597A (en) * | 1994-12-30 | 1997-09-16 | Eastman Kodak Company | Electronic camera with rapid automatic focus of an image upon a progressive scan image sensor |
JP2001177771A (en) * | 1999-12-16 | 2001-06-29 | Toshiba Corp | Solid-state image sensing device |
JP2001296470A (en) * | 2000-04-14 | 2001-10-26 | Hitachi Ltd | Electronic still camera |
JP4548045B2 (en) * | 2004-08-25 | 2010-09-22 | コニカミノルタオプト株式会社 | Automatic focus adjustment device |
-
2005
- 2005-11-15 CN CNA2005800523022A patent/CN101326814A/en active Pending
- 2005-11-15 WO PCT/FI2005/050409 patent/WO2007057498A1/en active Application Filing
- 2005-11-15 EP EP05808852A patent/EP1949671A4/en not_active Withdrawn
- 2005-11-15 JP JP2008540635A patent/JP5086270B2/en not_active Expired - Fee Related
- 2005-11-15 KR KR1020107003295A patent/KR20100023056A/en active IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6683651B1 (en) * | 1999-10-28 | 2004-01-27 | Hewlett-Packard Development Company, L.P. | Method of automatically adjusting focus in a shutterless digital camera |
US20040165090A1 (en) * | 2003-02-13 | 2004-08-26 | Alex Ning | Auto-focus (AF) lens and process |
Non-Patent Citations (1)
Title |
---|
See also references of WO2007057498A1 * |
Also Published As
Publication number | Publication date |
---|---|
KR20100023056A (en) | 2010-03-03 |
CN101326814A (en) | 2008-12-17 |
JP5086270B2 (en) | 2012-11-28 |
WO2007057498A1 (en) | 2007-05-24 |
EP1949671A4 (en) | 2008-11-05 |
JP2009516448A (en) | 2009-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2007057498A1 (en) | Imaging system with adjustable optics | |
CN101052101B (en) | Apparatus and method for image pickup | |
US7689113B2 (en) | Photographing apparatus and method | |
JP4546565B2 (en) | Digital image processing | |
US9596398B2 (en) | Automatic image capture | |
JP5802520B2 (en) | Imaging device | |
US20020030749A1 (en) | Image capturing apparatus | |
CN102739961B (en) | Can the image processing apparatus of generating wide angle | |
CN110198418B (en) | Image processing method, image processing device, storage medium and electronic equipment | |
JP2006324976A (en) | Moving imaging apparatus and program thereof | |
EP2472852A2 (en) | Digital imaging with autofocus | |
WO2011145342A1 (en) | Imaging device | |
JP4210189B2 (en) | Imaging device | |
US7729601B1 (en) | Shutter for autofocus | |
JP7320024B2 (en) | IMAGE SENSOR, IMAGING DEVICE, IMAGE DATA PROCESSING METHOD, AND PROGRAM | |
KR102368625B1 (en) | Digital photographing apparatus and the method for the same | |
JP2020017807A (en) | Image processing apparatus, image processing method, and imaging apparatus | |
CN110266965B (en) | Image processing method, image processing device, storage medium and electronic equipment | |
JP2006148550A (en) | Image processor and imaging device | |
JP2008301161A (en) | Image processing device, digital camera, and image processing method | |
JP4534250B2 (en) | Movie imaging apparatus and program thereof | |
KR20080057345A (en) | Imaging system with adjustable optics | |
US20150085172A1 (en) | Image capturing apparatus and control method thereof | |
CN110463184B (en) | Image processing apparatus, image processing method, and non-volatile tangible medium | |
JP4757223B2 (en) | Imaging apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | 17P | Request for examination filed | Effective date: 20080331 |
 | AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
 | RIN1 | Information on inventor provided before grant (corrected) | Inventor name: OLLILA, MIKKO; KUNNARI, MIKA; KALEVO, OSSI; KAKKORI, HANNU |
 | A4 | Supplementary search report drawn up and despatched | Effective date: 20081007 |
 | 17Q | First examination report despatched | Effective date: 20081229 |
 | DAX | Request for extension of the european patent (deleted) | |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
 | 18D | Application deemed to be withdrawn | Effective date: 20120601 |