WO2014108319A1 - Method for moving a digital image sensor in order to calibrate same - Google Patents

Method for moving a digital image sensor in order to calibrate same

Info

Publication number
WO2014108319A1
Authority
WO
Grant status
Application
Patent type
Prior art keywords
sensor
path
position
point
sensitive
Prior art date
Application number
PCT/EP2013/078092
Other languages
French (fr)
Inventor
Camille DAVID
Original Assignee
Sagem Defense Securite
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/21Circuitry for suppressing or minimising disturbance, e.g. moiré, halo, even if the automatic gain control is involved
    • H04N5/217Circuitry for suppressing or minimising disturbance, e.g. moiré, halo, even if the automatic gain control is involved in picture signal generation in cameras comprising an electronic image sensor, e.g. digital cameras, TV cameras, video cameras, camcorders, webcams, to be embedded in other devices, e.g. in mobile phones, computers or vehicles
    • H04N5/2173Circuitry for suppressing or minimising disturbance, e.g. moiré, halo, even if the automatic gain control is involved in picture signal generation in cameras comprising an electronic image sensor, e.g. digital cameras, TV cameras, video cameras, camcorders, webcams, to be embedded in other devices, e.g. in mobile phones, computers or vehicles in solid-state picture signal generation
    • H04N5/2176Correction or equalization of amplitude response, e.g. dark current, blemishes, non-uniformity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/335Transforming light or analogous information into electric information using solid-state image sensors [SSIS]
    • H04N5/341Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels having been sampled or to be sampled
    • H04N5/349Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels having been sampled or to be sampled for increasing resolution by shifting the sensor relative to the scene, e.g. microscanning

Abstract

The invention relates to a method for controlling the movement actuators of a digital image sensor, and/or movement actuators of a group of optical lenses associated with a digital image sensor, along a path (CA2) in order to calibrate the sensor by comparing the integration of a single image point by different sensitive points of the sensor during multiple integration phases triggered over the course of the path (CA2). According to the invention: the path is a convex path having dimensions greater than multiple times the distance between two adjacent sensitive points of the sensor; and successive step movements are used to cover the path (CA2), blocking the movement of the sensor during each integration between two step movements.

Description

METHOD FOR MOVING A DIGITAL IMAGE SENSOR IN ORDER TO CALIBRATE IT

The invention relates to controlling the micro-displacements of digital image acquisition sensors.

BACKGROUND OF THE INVENTION

It is known from WO2007/025832 to calibrate the sensitivities of a digital image sensor in an operation during which the sensor is moved so as to carry out several integrations of the scene seen by this sensor, these integrations thus being produced while the sensor occupies different positions.

The same image point, that is to say the same point of the scene seen by the sensor, is thus integrated by several sensitive points of the sensor, generally contiguous ones. Establishing an average value of the data resulting from the integration of the same image point by these different sensitive points provides a reference from which a calibration correction can be calculated for each sensitive point.
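The averaging step described above can be sketched as follows. This is a minimal illustration, not the patent's actual calculation: the function name and the use of a simple multiplicative gain are assumptions, chosen only to show how readings of the same scene point by different sensitive points yield a common reference and a per-point correction.

```python
import numpy as np

def calibration_gains(readings):
    """Given the values read by several sensitive points that each
    integrated the SAME scene point, derive a per-point multiplicative
    correction so that, after correction, all points report the same
    reference value (here, the average response).

    readings: 1-D array-like, one value per sensitive point.
    """
    readings = np.asarray(readings, dtype=float)
    reference = readings.mean()   # average response serves as reference
    return reference / readings   # gain to apply to each sensitive point

# Four sensitive points that integrated the same image point:
readings = np.array([98.0, 102.0, 100.0, 104.0])
gains = calibration_gains(readings)
corrected = readings * gains      # every point now reports the reference
```

After correction, each of the four points reports the common reference value, which is the behaviour the calibration seeks.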

In practice, the digital image sensor is mounted on piezoelectric actuators controlled to move it along a predefined path, marking stops that each correspond to an integration phase.

In this context, a path is used that visits all the sensitive points contained within a square pad, for example one containing nine or sixteen points. As will be understood, the calibration becomes more robust as the number of points contained in the pad increases.

A path that runs through all the points of a nine-point pad requires six changes of direction, whereas a path through all the points of a sixteen-point pad has a complex shape requiring sixteen changes of direction, as explained in document WO2007/025832.

Thus, improving the robustness of the calibration results in an increase in the complexity of the path to be described by the sensor during calibration. This complexity is a significant obstacle to implementing a robust calibration, because it imposes severe constraints on the control of the actuators.

OBJECT OF THE INVENTION

The object of the invention is to propose a solution to overcome this disadvantage.

SUMMARY OF THE INVENTION

To this end, the invention relates to a method for driving the displacement actuators of a digital image sensor, and/or the displacement actuators of a group of optical lenses associated with a digital image sensor, along a path used to calibrate this sensor by comparing the integration of the same image point by different sensitive points of the sensor over several integration phases initiated during the course of the path, wherein:

- the path is a convex trajectory covering an area of dimensions greater than several times the distance between two adjacent sensitive points of the sensor;

- this path is traversed in successive displacement steps, the sensor being immobilized during each integration between two displacement steps.

The calibration is thus produced using sensitive points of the sensor which are not necessarily situated in a square pad, but are instead simply distributed along a convex, that is to say simple, path. This increases the robustness of the calibration by increasing the number of points without increasing the complexity of the path.

The invention also relates to a method as defined above, wherein the path is a sequence of rectilinear movements.

The invention also relates to a method as defined above, wherein the path has a shape of a regular polygon.

The invention also relates to a method as defined above, wherein the trajectory has a square shape.

The invention also relates to a method as defined above, wherein the path is traversed in steps of lengths less than the dimensions of the square shape of this path.

BRIEF DESCRIPTION OF FIGURES

Figure 1 is a schematic representation of a first embodiment of the invention in which a square trajectory is used to integrate an image point with eight sensitive points of the sensor;

Figure 2 is a schematic representation of a second embodiment of the invention in which a square trajectory is used to integrate an image point with sixteen sensitive points of the sensor;

Figure 3 is a schematic representation of a third embodiment of the invention wherein a generally circular polygonal path is used to integrate an image point with seven sensitive points of the sensor;

Figure 4 is a schematic representation of a fourth embodiment of the invention wherein a generally circular polygonal path is used to integrate an image point with fourteen sensitive points of the sensor.

DETAILED DESCRIPTION OF THE INVENTION

The idea underlying the invention is to achieve the calibration using sensitive points of the sensor which are not necessarily all situated in a square pad, but are instead simply distributed along a convex path, that is to say a trajectory that is simple to describe.

Figures 1 to 4 show calibration paths according to the invention, each represented on a grid of regular square meshes. These meshes are spaced apart from each other by a distance corresponding to that between two adjacent sensitive points of the sensor projected into the image plane, that is to say into the scene viewed by the sensor.

Each path is represented by a closed contour punctuated by black circles, which represent the different positions occupied by one and the same sensitive point of the sensor throughout the course of the trajectory.

The trajectory of Figure 1, denoted CA1, is a square path each side of which is four times the distance between two adjacent sensitive points of the sensor, and it is traversed with a step corresponding to twice the distance between two adjacent sensitive points of the sensor.

In the situation of Figure 1, the sensor is immobilized for a first integration, in a position in which one of its sensitive points is located at a position denoted P1 so as to integrate a corresponding point of the scene, which here corresponds to one of the grid cells.

When this integration is complete, the piezoelectric actuators are controlled to move the sensor vertically upward in the figure by one displacement step, and to immobilize the sensor before a second integration. During this second integration, the sensitive point of the sensor that was previously at position P1 is at position P2 and integrates data of another point of the scene. At the same time, a second sensitive point of the sensor is located at position P1 and integrates the image point corresponding to P1.

The sensor is then moved by its actuators to the left in Figure 1 by the value of the displacement step. The sensitive point that was at position P2 is then at a new position P3, while a third sensitive point is positioned at P1. A third integration is triggered.

The sensor is then again moved to the left by the displacement step, so that the sensitive point that was at position P3 is then at a new position P4, while a fourth sensitive point moves to position P1. A fourth integration is then triggered.

In the next step, the sensor is moved downward by its displacement step. The sensitive point that was at position P4 is then at position P5, and a fifth sensitive point is at position P1. A fifth integration is carried out.

The sensor is then moved downward again by the value of the displacement step, so that the sensitive point that was at position P5 is at position P6 and a sixth sensitive point is at position P1. A sixth integration is then performed.

In the next step, the sensor is moved by its displacement step to the right, so that the sensitive point that was at position P6 is at position P7, and a seventh sensitive point is positioned at P1. A seventh integration is then performed. In the last step, the sensor is again moved to the right by the displacement step, so that the sensitive point that was at position P7 moves to position P8, and an eighth sensitive point is located at position P1. An eighth and final integration is then performed.

Thus, throughout the path CA1, traveled with a step equal to twice the distance between two adjacent sensitive points, the sensor is immobilized at each displacement increment to carry out an integration. During these eight integrations, the point of the scene corresponding to position P1 is thus integrated by eight different sensitive points of the sensor.

Subsequent calculation steps make it possible, on the basis of these eight different integrations of the same point of the scene, to calibrate these sensitive points.

In the case of Figure 2, the trajectory, denoted CA2, has the same general shape as the trajectory CA1 of Figure 1, namely a square shape, but is traversed with steps each equal to the distance between two adjacent sensitive points of the sensor, rather than twice that distance as in the case of the path CA1.

This path CA2 thus comprises sixteen positions of the sensor, marked P1 to P16, corresponding to sixteen integrations. The same point of the scene is thus integrated sixteen times, which significantly increases the robustness of the calibration while implementing a simple path that poses no particular sensor-control difficulty. In practice, the control of the piezoelectric actuators for the trajectory CA2 is almost as simple as for the trajectory CA1.
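The stop positions along the square paths of Figures 1 and 2 can be sketched as below. This is an illustrative reconstruction, not taken from the patent: the function name, the unit of one displacement step, and the walk order (up, then left, down, right, matching the Figure 1 description) are assumptions.

```python
def square_path_positions(half_side_steps):
    """Stop positions along a square path traversed in unit displacement
    steps, starting from the middle of the right-hand side (position P1
    of Figure 1) and moving up, then left, then down, then right.

    half_side_steps: half the square's side length, in steps
        (1 -> 8 stops as for CA1, 2 -> 16 stops as for CA2).
    """
    n = half_side_steps
    x, y = 0, 0
    positions = [(x, y)]
    moves = ([(0, 1)] * n           # up to the top-right corner
             + [(-1, 0)] * 2 * n    # left along the top side
             + [(0, -1)] * 2 * n    # down the left side
             + [(1, 0)] * 2 * n     # right along the bottom side
             + [(0, 1)] * n)        # back up to the starting point
    for dx, dy in moves:
        x, y = x + dx, y + dy
        positions.append((x, y))
    assert positions[-1] == positions[0]   # the closed contour closes
    return positions[:-1]                  # the 8*n distinct stops

ca1_stops = square_path_positions(1)   # eight stops, like path CA1
ca2_stops = square_path_positions(2)   # sixteen stops, like path CA2
```

With `half_side_steps = 1` the eight stops reproduce the P1 to P8 sequence walked through above for CA1; doubling the resolution of the walk (steps of one inter-pixel distance instead of two) yields the sixteen stops of CA2 on the same square.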

In the case of Figures 3 and 4, convex paths are used, identified respectively by CE1 and CE2, which are circular rather than square and which have a radius equal to twice the distance between two adjacent sensitive points of the sensor. In practice, these trajectories are polygons, so that, to further simplify the control of the actuators, only rectilinear movements are performed. During these movements, the sensor remains parallel to itself, as in the case of the square trajectories CA1 and CA2.

In the case of the circular path CE1, one of the sensitive points of the sensor is first placed at position P1 for a first integration.

The actuators are then controlled to move the sensor so that this sensitive point passes from position P1 to a new position P2, following a rectilinear movement. Position P2 corresponds to position P1 to which a rotation R has been applied, namely a rotation of one seventh of a turn in the direct direction about a center of rotation C. The center of rotation C is located at a position shifted to the left in Figure 3 by a value corresponding to twice the distance separating two adjacent sensitive points of the sensor.

At the end of this first movement, the sensitive point initially at position P1 is at position P2, and several other sensitive points of the sensor are located in the vicinity of position P1. A second integration is then commanded, at the end of which the image point corresponding to position P1 has been integrated by the new sensitive points in the vicinity of position P1. The value for this second integration of the image point corresponding to P1 can be established by interpolation from the values integrated by the various sensitive points in the vicinity of P1 during the second integration.
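The interpolation mentioned above is not specified in detail; as one plausible sketch, bilinear interpolation between the four sensitive points surrounding position P1 could be used. The function name and the unit pixel grid are assumptions made for illustration.

```python
import math

def interpolate_at(x, y, sample):
    """Estimate the value at position (x, y) by bilinear interpolation
    between the four sensitive points surrounding it on the unit pixel
    grid. `sample(i, j)` returns the value integrated by the sensitive
    point at integer grid position (i, j).
    """
    i, j = math.floor(x), math.floor(y)
    fx, fy = x - i, y - j          # fractional offsets within the cell
    v00 = sample(i, j)
    v10 = sample(i + 1, j)
    v01 = sample(i, j + 1)
    v11 = sample(i + 1, j + 1)
    return ((1 - fx) * (1 - fy) * v00 + fx * (1 - fy) * v10
            + (1 - fx) * fy * v01 + fx * fy * v11)

# On a linear intensity ramp v(i, j) = 2*i + 3*j the interpolation
# is exact, which makes it easy to sanity-check:
value = interpolate_at(0.25, 0.75, lambda i, j: 2 * i + 3 * j)
```

Any scheme that estimates the reading at P1 from its neighbours would serve the same role in the calibration; bilinear interpolation is simply the most common choice for a regular square grid.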

The process is then repeated to produce a rectilinear displacement of the sensor corresponding to a new rotation R, so that the sensitive point previously at position P2 is then placed at position P3, and various new sensitive points of the sensor are in the vicinity of position P1. A third integration is then performed, and the values corresponding to the integration of position P1 can be determined by interpolating the values from the sensitive points in the vicinity of P1 during this third integration.

The process is then repeated analogously to achieve four further rectilinear displacements and four further integrations, following the path CE1, which is generally circular but which is described by the actuators in the form of a polygon with seven sides, so that only rectilinear movements are made.
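The seven stops of the path CE1 can be generated by applying the rotation R repeatedly, as sketched below. The function name and the coordinate conventions (start at the origin, center C shifted left by the radius) are assumptions for illustration; distances are in units of the inter-pixel distance, giving a radius of 2 as stated for Figures 3 and 4.

```python
import math

def polygon_stops(n_sides, radius, start=(0.0, 0.0)):
    """Stop positions P1..Pn obtained by applying, n_sides - 1 times, a
    rotation R of 1/n_sides of a turn (direct direction) about a center
    C placed at a distance `radius` to the left of the start position.
    Figure 3: n_sides = 7; Figure 4: n_sides = 14; radius = 2.
    """
    cx, cy = start[0] - radius, start[1]   # center of rotation C
    theta = 2 * math.pi / n_sides          # one n-th of a full turn
    stops = [start]
    x, y = start
    for _ in range(n_sides - 1):
        # rotate (x, y) about (cx, cy) by the angle theta
        dx, dy = x - cx, y - cy
        x = cx + dx * math.cos(theta) - dy * math.sin(theta)
        y = cy + dx * math.sin(theta) + dy * math.cos(theta)
        stops.append((x, y))
    return stops

ce1_stops = polygon_stops(7, 2.0)    # seven stops on a circle of radius 2
ce2_stops = polygon_stops(14, 2.0)   # fourteen stops, as for path CE2
```

The actuators only ever perform the straight segments between consecutive stops; the circle is merely the envelope on which the stops lie.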

Analogously to the case of Figures 1 and 2, the calibration of the sensitive point initially at position P1 can be performed by comparing the values provided by this point when it carried out the first integration with the values resulting from the interpolations representative of the six other integrations of position P1 performed later in the calibration process.

In the case of Figure 4, the trajectory, denoted CE2, is a generally circular path which has fourteen positions so as to carry out fourteen integrations, instead of seven as in the case of the path CE1 of Figure 3.

Here again, the generally circular path CE2 is described as a polygon so as to traverse it in successive steps corresponding to rectilinear movements, this polygon having fourteen sides. The vertices of this polygon, marked P1 to P14, correspond to the positions occupied by one and the same sensitive point of the sensor throughout the course of this path CE2.

As will be understood, this trajectory CE2 makes it possible to increase the robustness of the calibration by integrating the same image point fourteen times, without complicating the control of the actuators, which remains very close to the control implemented to traverse the path CE1.

In the examples described above, the actuators move the image acquisition sensor, but as will be understood, the invention applies equally to a situation in which the actuators are arranged to move a group of lenses associated with the sensor, so as to produce the same effect.

Claims

1. Method of controlling the displacement actuators of a digital image sensor, and/or the displacement actuators of a group of lenses associated with a digital image sensor, along a trajectory (CA1; CA2; CE1; CE2) in order to calibrate this sensor by comparing the integration of the same image point by different sensitive points of the sensor over several integration phases initiated during the course of the trajectory (CA1; CA2; CE1; CE2), in which:
- the path is a convex trajectory covering an area of dimensions greater than several times the distance between two adjacent sensitive points of the sensor;
- this path (CA1; CA2; CE1; CE2) is traversed in successive displacement steps, the sensor being immobilized during each integration between two displacement steps.
2. The method of claim 1, wherein the path is a sequence of rectilinear displacements.
3. The method of claim 1 or 2, wherein the path has the shape of a regular polygon (CE1, CE2).
4. The method of claim 1, wherein the path is a square path (CA1; CA2).
5. The method of claim 4, wherein the square trajectory (CA1) is traversed in steps of lengths smaller than the dimensions of the square shape of this path.
PCT/EP2013/078092 2013-01-09 2013-12-27 Method for moving a digital image sensor in order to calibrate same WO2014108319A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FR1350163A FR3000859B1 (en) 2013-01-09 2013-01-09 Method for moving a digital image sensor in order to calibrate it
FR1350163 2013-01-09

Publications (1)

Publication Number Publication Date
WO2014108319A1 (en) 2014-07-17

Family

ID=48570211

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/078092 WO2014108319A1 (en) 2013-01-09 2013-12-27 Method for moving a digital image sensor in order to calibrate same

Country Status (2)

Country Link
FR (1) FR3000859B1 (en)
WO (1) WO2014108319A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5877807A (en) * 1988-10-31 1999-03-02 Lenz; Reimar Optoelectronic colored image converter
US6418245B1 (en) * 1996-07-26 2002-07-09 Canon Kabushiki Kaisha Dynamic range expansion method for image sensed by solid-state image sensing device
US6642497B1 (en) * 2002-06-14 2003-11-04 Hewlett-Packard Development Company, Lp. System for improving image resolution via sensor rotation
EP1686416A1 (en) * 2005-02-01 2006-08-02 Steinbichler Optotechnik Gmbh Method and device for capturing an image, in particular by means of a CCD sensor
WO2007025832A1 (en) * 2005-08-03 2007-03-08 Thales Holdings Uk Plc Apparatus and method for imaging
US20070221825A1 (en) * 2006-03-27 2007-09-27 Benq Corporation Imaging apparatus with resolution adjustability


Also Published As

Publication number Publication date Type
FR3000859A1 (en) 2014-07-11 application
FR3000859B1 (en) 2015-01-09 grant


Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13818227

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 13818227

Country of ref document: EP

Kind code of ref document: A1