WO2023072889A1 - Device for checking an adjustment state of an image sensor and method for checking an adjustment state of an image sensor - Google Patents
- Publication number: WO2023072889A1 (application PCT/EP2022/079694)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical
- image
- optical element
- optical axis
- camera module
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
Definitions
- the present approach relates to a device for checking an adjustment state of an image sensor and a method for checking an adjustment state of an image sensor.
- Active alignment is implemented as part of a production process. After completion of the manufacturing process, it is necessary to check the quality of the alignment of the camera system.
- the alignment can be negatively influenced, for example by manufacturing steps such as the uneven hardening of the adhesive with which the optics and the sensor are fixed to one another, but also by mechanical influences or temperature effects.
- completed camera modules are checked with simple test image structures and it is determined whether they meet the defined sharpness criteria.
- a check of an adjustment state of the image sensor of a camera in relation to the associated optics can advantageously be improved.
- the degree of mechanical tilting or sensor misalignment, which can lead to a drop in sharpness, can be determined quantitatively.
- a device for checking an adjustment state of an image sensor of a camera module, having the following features: a first optical device with a first optical element that can be illuminated by a first light source and can be moved along a first optical axis; a second optical device with a second optical element that can be illuminated by a second light source and can be moved along a second optical axis, the second optical device being arranged at a distance (e.g. a radial distance) from the first optical device such that the first optical axis has a point of intersection with the second optical axis, the camera module to be tested being arrangeable in a region of the intersection; and an evaluation device that is designed to read in position information that represents a position of the first and second optical element detected at a specific point in time, and to read in an image signal that represents image information captured by the image sensor at the specific point in time, the evaluation device being designed to assign image information to each detected position using the image signal and additionally or alternatively the position information, in order to determine the adjustment state of the camera module.
- the device presented here can be used to check the image sensor of a camera in relation to the associated optics, for example at the end of the camera manufacturing process.
- An important measurement parameter when checking the camera alignment after installation can be the degree of tilting between an image plane of the optics and a sensor plane, which affects the sharpness and contrast distribution in the image field.
- the camera module to be checked which can consist of optics and a sensor, for example, can be illuminated by means of the optics devices, for example with collimated light. The illumination can take place both in the axial position, parallel to the optical axis of the specimen, and in one or more off-axis positions.
- focusable optical devices can be used with the device presented here, which can also be referred to as collimators.
- One optics device is sufficient for purely axial focusing. To determine the tilt of the image plane in one direction, an additional off-axis optics device is required. To determine the tilt of the image plane of the optical system to be tested in two directions, at least one further off-axis optics device is required, which must not be arranged along a line with the on-axis optics device and the first off-axis optics device. Additional off-axis optics devices can be added to increase the number of measurement positions and to obtain additional information about the curvature of the image plane.
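The requirement that the measurement positions must not lie on a line can be made concrete: three non-collinear best-focus points (x, y, z) determine a unique plane z = a·x + b·y + c, whose slopes a and b give the tilt in the two directions. A minimal sketch with illustrative field positions and focus values (not data from this document):

```python
import numpy as np

# Illustrative field positions (mm) of three measurement points
# (on-axis plus two off-axis, deliberately not collinear) and the
# best-focus z-position (µm) found at each of them.
xy = np.array([[0.0, 0.0],    # on-axis
               [2.0, 0.0],    # first off-axis
               [0.0, 2.0]])   # second off-axis, not on the same line
z = np.array([0.0, 4.0, -2.0])

# Solve z = a*x + b*y + c for the plane coefficients.
A = np.column_stack([xy, np.ones(3)])
a, b, c = np.linalg.solve(A, z)

# Tilt of the image plane in the two directions (small-angle, µm per mm).
print(a, b)
```

If the three points were collinear, the matrix A would be singular and no unique plane could be determined, which is exactly why the third optics device must lie off the line through the first two.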
- Determining the spatial position of the image plane of the test object represents an advantageous application of the invention.
- a first and a second optical device in the further course of this description.
- the optical elements can each be designed as a reticle that can be moved along the optical axis of the relevant optical device, so that it is possible to carry out a focusing run.
- Δz_K = Δz_OE · (f_K / f_OE)², where Δz_OE denotes the displacement of the optical element along its optical axis and Δz_K the resulting shift of the focus position in the image space of the camera, in which
- f_K is the focal length of the optics of the camera system to be tested
- f_OE is the focal length of the optical element
- for each recorded individual image, the z-position, which can be determined from the position of the reticle in the optical device, and, for example, a respective value of the image contrast, for example as a modulation transfer function (MTF) value of the projected individual image, can be recorded.
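Under the common assumption that a reticle shift Δz_OE and the resulting image-space focus shift Δz_K are linked by the longitudinal magnification (f_K/f_OE)², the scan range can be estimated as follows (the focal-length values are illustrative, not taken from this document):

```python
def image_space_shift(dz_oe_mm, f_k_mm, f_oe_mm):
    """Focus shift in the camera image space produced by moving the
    collimator reticle by dz_oe_mm (longitudinal magnification
    (f_k / f_oe)**2, a standard collimator relation)."""
    return dz_oe_mm * (f_k_mm / f_oe_mm) ** 2

# Illustrative values: 6 mm camera lens, 100 mm collimator.
# A 10 mm reticle travel then scans only about 36 µm in image space,
# which is why µm-level position/image synchronization matters.
dz_k = image_space_shift(10.0, 6.0, 100.0)
print(dz_k)  # ≈ 0.036 mm
```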
- the device includes the evaluation device, which is designed to use the image signal and additionally or alternatively the position signal to assign image information, for example an MTF value, to each detected position of the optical elements in order to determine the adjustment state of the camera module.
- the device can include an image capture circuit, which can be designed to control or read out the image sensor depending on the position of the optical elements, and which can be designed to provide the image signal.
- the image capture circuit which can also be referred to as a frame grabber, can be an electronic circuit for digitizing analog image signals or also for reading out digital image data.
- additionally or alternatively, the image capture circuit (frame grabber) can be designed to connect the camera module to a wide variety of systems.
- the device can thus be designed, for example, in such a way that the image information captured by the image sensor can be processed using the frame grabber.
- the image capture circuit can be designed, for example, to provide the image signal to the evaluation device via an interface.
- the image capture circuit can be connected or can be connected to a control device for controlling the optical devices in a manner capable of transmitting signals.
- the image capture circuit (frame grabber) is used for electronic further processing or forwarding of the image information captured by the sensor.
- the device can comprise a control device for controlling the first optical element and the second optical element.
- the control device can be designed to provide the position information.
- all optical devices, more precisely their motor controls or movement drives, can be electronically connected in parallel to the control device.
- Each optics device can in turn have a position encoder, for example, by means of which the exact position of the respective optics element can be determined.
- a movement of the individual optical elements can be optimally matched to the other optical elements by the control device.
- the control device can be designed to provide the respective positions using the position information. A synchronization of position and image information can advantageously be optimized as a result.
- the device can be designed to arrange the first and the second optical element at the specific point in time in such a way that the intermediate images of the optical elements are in the same plane (intermediate image plane). These intermediate images are mapped into the image plane of the optical system to be tested.
- the first optical element of the first optical device can be moved from a first starting position to a first end position.
- the second optical element of the second optical device can be moved from a second starting position to a second end position.
- the intermediate images of the optical elements move from a first, common, apparent object plane to a second, common, apparent object plane.
- the first, apparent object plane correlates with the first and second starting position and the second, apparent object plane with the first and second end position.
- the device can have a third optical device with a third optical element that can be illuminated by a third light source and can be moved along a third optical axis.
- the third optical device can be arranged at a distance (e.g. a radial distance) from the first and the second optical device.
- the optical elements can be illuminated both at the axial position of the first optical device, parallel to the optical axis of the test object, and also at several off-axis positions.
- the three optical elements are not arranged along one line, so that the image points projected in the camera module span an image plane whose angular position can be determined.
- a contrast (MTF) value for a fixed spatial frequency can be determined at each of the three field positions at each z-position of the optical elements.
- the result of the measurement can be the focus curve, a representation of image contrast as a function of z-position. From the position of the maxima of the three curves along the z-direction, the degree of tilting of the image plane relative to the sensor plane can advantageously be inferred, and defocusing can also be optimally determined. For camera systems that are not yet permanently installed together, a best focus position can now be determined with the help of an active alignment between the optics and the sensor.
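The evaluation of the focus curves can be sketched as follows; the curves below are synthetic Gaussian contrast peaks, and the sub-step peak refinement via a three-point parabola fit is one common choice, not necessarily the method of this document:

```python
import numpy as np

def best_focus(z, mtf):
    """Refine the maximum of a sampled focus curve with a parabola fit
    through the three samples around the highest value."""
    i = int(np.argmax(mtf))
    i = min(max(i, 1), len(z) - 2)        # keep a full three-point window
    a, b, _ = np.polyfit(z[i - 1:i + 2], mtf[i - 1:i + 2], 2)
    return float(-b / (2 * a))            # vertex of the parabola

z = np.linspace(-50.0, 50.0, 101)         # reticle z-positions (µm)
peaks = [0.0, 8.0, -5.0]                  # true best focus per field point
curves = [np.exp(-((z - p) / 20.0) ** 2) for p in peaks]

found = [best_focus(z, c) for c in curves]
# The spread of the three maxima along z indicates the tilt of the
# image plane; their common offset from zero indicates defocusing.
print(found)  # close to [0.0, 8.0, -5.0]
```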
- an arrangement of optical elements whose optical axes do not lie in one plane is particularly advantageous for determining the tilting of the image plane of a camera module.
- further optical elements can be used to make the determination more precise and to obtain information about the field curvature of the test object.
- the method can be carried out using a variant of the device presented above, in order to check the adjustment status of the image sensor of a camera in relation to the associated optics.
- a check can be useful, for example, at the end of the camera manufacturing process.
- After completion of the manufacturing process it is necessary to check the quality of the alignment of the camera system.
- the alignment can be negatively influenced, for example, by manufacturing steps such as the uneven hardening of the adhesive with which the optics and the sensor are fixed to one another, but also by mechanical influences or temperature effects.
- the method presented here can advantageously be carried out.
- each position of the optical elements can be directly assigned the corresponding image signal with high accuracy, i.e. with the lowest possible time offset (latency) and temporal inaccuracy; in other words, the image information and the associated position of the optical elements can be recorded almost simultaneously. This is necessary in order to be able to determine the position of the highest image contrast with the greatest possible accuracy (in the µm range).
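The synchronization requirement can be made concrete with a short estimate: at a given traversing speed, any latency between image capture and position read-out translates directly into a position error. The speed and latency values below are illustrative:

```python
# Position error caused by latency between image capture and encoder
# read-out: error = speed * latency.
speed_mm_s = 10.0   # illustrative traversing speed of the reticle
latency_s = 0.001   # illustrative 1 ms offset between frame and position

error_um = speed_mm_s * latency_s * 1000.0  # convert mm to µm
print(error_um)  # 10 µm of error, far too coarse for µm-level focusing
```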
- the method can have a step of outputting a position trigger signal in order to determine the point in time for detecting the position of the first and additionally or alternatively second optical element, wherein the position information can be provided in response to the position trigger signal.
- the illuminated optical elements of the optical devices can be moved continuously from a starting position to an end position.
- images of the optical elements that follow one another in time can be recorded by the test object by means of the image sensor.
- the individual pieces of image information, which can also be referred to as frames, can be processed, for example, by an image capture circuit or a frame grabber. This image capture circuit can, for example, output the position trigger signal as soon as an image has been completely recorded.
- the position trigger signal can be output at the beginning of the image acquisition.
- the image signal, which represents the image information, can then be provided to the evaluation device.
- the position trigger signal can, for example, be output to a control device for controlling the optical elements.
- the position of the optical elements that exists at this point in time can be made available to the evaluation device using the position information.
- the image information can thus advantageously be evaluated as a function of the position of the optical elements.
- direct synchronization between the frame grabber and the control device can improve the measurement process in such a way that, on the one hand, a continuous focusing run can be performed at high speed and, on the other hand, no indirect link between the image position and the encoder position via time stamps is needed, which would necessarily require a temporally linear movement process.
- a position-controlled triggering of the image recording also enables non-linear (accelerated) movement profiles.
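The trigger-based pairing of frames and positions can be illustrated with a small simulation; the encoder object and the frame loop are hypothetical stand-ins, not an interface from this document. Each completed frame plays the role of the position trigger signal and latches the current encoder position directly, so no time stamps (and no linear motion profile) are needed:

```python
class Encoder:
    """Hypothetical position encoder stand-in: the reticle follows a
    deliberately nonlinear (accelerated) motion profile."""
    def __init__(self):
        self.t = 0.0
    def step(self, dt):
        self.t += dt
    @property
    def position(self):
        return 0.5 * self.t ** 2   # accelerated, not linear in time

encoder = Encoder()
latched = []   # (frame index, reticle position) pairs

# Simulate ten frames: on each "frame complete" event the controller
# latches the encoder position, pairing frame and position directly.
for frame in range(10):
    encoder.step(0.01)             # motion during the frame exposure
    latched.append((frame, encoder.position))
```

Because the position is read at the trigger event itself, the pairing stays correct even for the accelerated profile above, which a time-stamp scheme with an assumed constant speed would get wrong.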
- the method can have a step of outputting an image trigger signal in order to determine the point in time for capturing the image information, it being possible for the image signal to be provided in response to the image trigger signal.
- the optical elements can be moved continuously from a starting position to an end position.
- the image trigger signal can be output.
- the image trigger signal can be output, for example, when non-equidistant position marks are reached, for example at the positions 0 mm, 0.1 mm, 0.2 mm, 0.5 mm, 1.0 mm, 2.0 mm and 5.0 mm.
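The position-mark variant can be sketched as follows, using the non-equidistant marks from the example above; the sampling loop is an illustrative stand-in for the continuously moving axis:

```python
# Non-equidistant position marks (mm) at which the image trigger fires,
# taken from the example in the text.
marks = [0.0, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0]

def triggers(positions, marks):
    """Return (mark, sampled position) pairs, firing once whenever the
    continuously sampled motion first reaches or passes the next mark."""
    pending = iter(marks)
    mark = next(pending, None)
    out = []
    for p in positions:
        while mark is not None and p >= mark:
            out.append((mark, p))
            mark = next(pending, None)
    return out

# Continuous motion sampled every 0.05 mm up to 5.45 mm.
samples = [i * 0.05 for i in range(110)]
fired = triggers(samples, marks)
print([m for m, _ in fired])  # all seven marks fire, in order
```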
- the image trigger signal can be output to the image capture circuit.
- the position signal can be provided to the evaluation device.
- the image capture circuit can control the start of an image acquisition process.
- the respective image information can then be read out and made available to the evaluation device using the image signal.
- Any piece of image information can be assigned to the predefined position of the optical elements by means of the evaluation device.
- the image information can be buffered and transmitted at the end of the focusing run and assigned to the positions. This step can also advantageously be used to carry out an evaluation of the image information as a function of the optical element encoder position.
- the method can include a step of storing the image information and additionally or alternatively the position of the first and second optical element.
- the respective position of the first and second optical element is stored after it has been detected at a predefined point in time as a reaction to a position trigger signal.
- the image information is also stored approximately at the same time as the position trigger signal is output.
- the image information is stored after it has been captured at a defined point in time as a reaction to an image trigger signal.
- the respective position of the first and second optical element is also stored approximately at the same time as the image trigger signal is output.
- the first optical element can be moved at a first speed and the second and/or each additional optical element can be moved at a second speed that differs from the first speed.
- the control device for controlling the optical elements can be designed in such a way that the intermediate images of the optical elements of all optical devices are advantageously located in the same object plane at the same time.
- the first optical axis of the first optical device essentially corresponds to an optical axis of the camera module to be tested and the second and/or further optical device(s) are/is arranged at a radial distance from the first optical device.
- the traversing speed of a so-called master optics device, for example the optics device corresponding to the optical axis of the camera module, can be set as a guide value to which the speeds of the other optics devices are adapted.
- each individual optical device can have its own position encoder, which can be used, for example, in what is known as closed-loop control for position and speed control.
- the signals from the individual optical devices can be transmitted electronically in parallel to the control device. Since the relationship between the positions of the optical elements and apparent object planes (intermediate image planes) is non-linear, it is also advantageous to run a corresponding speed profile in order to achieve a uniform measurement point distribution in the image space.
- the first and the second speed can have a value greater than 0 m/s at every point in time.
- the time course of the first and second and/or further speeds can be mathematically described by a nonlinear function.
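A minimal numerical sketch of this nonlinearity, assuming the thin-lens (Newton) relation Δz ≈ f²/d between the reticle offset Δz from the collimator focal plane and the apparent object distance d; the focal lengths and distances are illustrative, not values from this document:

```python
def reticle_offset(d_mm, f_mm):
    """Reticle offset from the collimator focal plane that places the
    intermediate image at apparent object distance d_mm
    (thin-lens sketch via Newton's relation: offset = f**2 / d)."""
    return f_mm ** 2 / d_mm

# Equidistant apparent object planes from 2 m to 10 m.
planes_mm = [2000.0 + i * 2000.0 for i in range(5)]

f_master, f_slave = 100.0, 80.0  # illustrative collimator focal lengths
master = [reticle_offset(d, f_master) for d in planes_mm]
slave = [reticle_offset(d, f_slave) for d in planes_mm]

# Equidistant object planes map to clearly non-equidistant reticle
# positions, so a uniform measurement-point distribution in image space
# requires a nonlinear position (and hence speed) profile.
print(master)
# Both elements must reach corresponding positions at the same time, so
# the second element's travel is scaled by the squared focal-length ratio.
print(slave[0] / master[0])  # (80/100)**2 = 0.64
```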
- a continuous focusing run can be run at high speed, with an indirect link between image information and the position of the optical elements via time stamps being able to be dispensed with, which would necessarily require a temporally linear movement process.
- the method can have a step of providing a movement signal, in which case the movement signal can represent a specification of the positions to be approached by the optical elements, in particular in which the specification can be stored as a position table.
- one or more sets of optical element z positions can be stored in the control unit.
- the individual z positions can correspond to different object planes into which the images of the optical elements are apparently projected for the camera module. These apparent object planes are also referred to as intermediate image planes.
- the speed profiles of the first optics element and the second optics element can advantageously be matched to one another in such a way that all images of the optics elements are always located at the same time in the previously defined object planes.
- the set of positions can be transferred in the form of a position table, and the positions can be equidistant or non-equidistant, thereby defining a trajectory along which the optical elements of the optical devices are moved.
- This method can be implemented, for example, in software or hardware or in a mixed form of software and hardware, for example in a control unit.
- a computer program product or computer program with program code, which can be stored on a machine-readable carrier or storage medium such as a semiconductor memory, a hard disk memory or an optical memory and which is used to carry out, implement and/or control the steps of the method according to one of the embodiments described above, is also advantageous, especially when the program product or program is run on a computer or device.
- 1 shows a schematic representation of an exemplary embodiment of a measurement of a tilting between an image plane of an optical unit and a sensor plane
- 2 shows a schematic representation of an exemplary embodiment of a device
- 3A shows a schematic representation of an exemplary embodiment of a device in plan view
- 3B shows a schematic cross-sectional illustration of an embodiment of a device in side view
- FIG. 4 shows a schematic representation of an exemplary embodiment of a device
- FIG. 5 shows a schematic representation of an exemplary embodiment of a device
- Fig. 6 is a flow chart of an embodiment of a method for testing an adjustment state of an image sensor of a camera module
- FIG. 7 shows a flow chart of an embodiment of a method for checking an adjustment state of an image sensor of a camera module
- Fig. 8 is a flow chart of an embodiment of a method for testing an adjustment state of an image sensor of a camera module
- Figure 1 shows a schematic representation of an exemplary embodiment of a measurement of a tilt between an image plane 100 of an optics unit 105 and a sensor plane 110.
- An important measurement parameter when checking a camera alignment is the degree of tilt between the image plane 100 of the optics or the optics unit 105 and the sensor plane 110.
- the camera module 115 to be tested, consisting of an optical unit 105 and an image sensor 120, can be illuminated with collimated light.
- the sensor or the optics of the test object can be moved relative to one another, with an MTF value 130 for a fixed spatial frequency being determined, by way of example, at each z-position 125 at each of the three field positions.
- the focus curve 135, a representation of the image contrast as a function of the z-position, is shown on the lower left side of the figure. From the position of the maxima of the three curves along the z-direction, the degree of tilting of the image plane 100 relative to the sensor plane 110 can be deduced, and a defocusing can also be determined. The illustration shows this in two dimensions; the evaluation can also take place in three dimensions. For camera systems that are not yet permanently installed together, the best focus position can be determined with the help of an active alignment between the optics and the sensor. After completion of the manufacturing process, it is necessary to check the quality of the alignment of the camera system.
- the alignment can be negatively influenced, for example, by manufacturing steps such as the uneven hardening of the adhesive with which the optics and the sensor are fixed to one another, but also by mechanical influences or temperature effects.
- completed camera modules are checked with simple test image setups and it is determined whether they meet the defined sharpness criteria.
- this method does not provide a quantitative indication of the degree of mechanical tilt or sensor misalignment that led to the observed drop in sharpness. This makes it difficult to systematically optimize the manufacturing process.
- FIG. 2 shows a schematic representation of an exemplary embodiment of a device 200.
- Device 200 is designed to check an adjustment state of an image sensor 120 of a camera module 115.
- the device 200 comprises a first optical device 205 with a first optical element 220 that can be illuminated by a first light source 210 and can be moved along a first optical axis 215.
- the first optical axis 215 in the illustration shown here corresponds to an optical axis 225 of the camera module 115 arranged below the first optical device 205.
- the device 200 further comprises a second optical device 235 with a second optical element 250 that can be illuminated by a second light source 240 and can be moved along a second optical axis 245.
- the second optical device 235 is arranged at a distance from the first optics device 205, for example radially (here, specifically, rotated at an angle with respect to the first optical axis 215), and the first optical axis 215 has a point of intersection 260 with the second optical axis 245, the camera module 115 to be checked being arranged in a region of the point of intersection 260.
- further optics devices can be added accordingly.
- device 200 has an evaluation device 270 which is designed to read in a position signal 275 .
- the position signal 275 represents a position of the first and the second optical element 220, 250 detected at a specific point in time and in this exemplary embodiment can be provided by a control device 280 for controlling the optical elements 220, 250 to the evaluation device 270.
- the evaluation device 270 is also designed to read in an image signal 285 that represents image information captured by the image sensor 120 at the specific point in time.
- the evaluation device 270 is designed in this exemplary embodiment to use the image signal 285 and the position signal 275 to assign image information to each detected position in order to determine the adjustment state of the camera module. In another embodiment, only the position signal or the image signal can be used.
- FIG. 3A shows a schematic representation of an exemplary embodiment of a device 200 in plan view.
- This includes an on-axis optics device 205 and a plurality of off-axis optics devices 235, 300, which are radially spaced from the on-axis optics device 205 at different angles.
- the device 200 shown here corresponds to or is similar to the device described in the preceding FIG. Analogously to the first optics device 205 and the second optics device 235, the third optics device 300 has a third optics element 315 that can be illuminated by a third light source 305 and can be moved along a third optical axis 310.
- the third optics device 300 is arranged at a radial distance from the first and second optics devices 205, 235, and the third optical axis 310 has a point of intersection 260 with the first and second optical axes 215, 245, in a region of which the camera module 115 can be arranged.
- further optical devices can also be arranged spatially radially around the entry opening of the camera module to be checked. This situation is illustrated in the plan view in FIG. 3A.
- Figure 3B shows a schematic representation of an embodiment of a first optics device 205.
- the first optics device 205 shown here corresponds to or is similar to the first optics device described in the previous Figures 2 and 3A and has a housing 330 in which the first light source 210 is arranged.
- the first light source 210 is designed to emit a light beam 335 that can be collimated by a projection objective 340 .
- the first optical element 220, which can be moved along the first optical axis 215, is arranged between the first light source 210 and the projection objective 340, the first optical axis 215 corresponding to the optical axis 225 of the camera module 115 to be checked.
- merely by way of example, the first optical element 220 can be moved by means of a motorized drive with a position encoder 345.
- the light beam 335 can be modified in such a way that different apparent object distances can be set for the image sensor 120 of the camera module 115 illuminated in this way, and, for example, different contrast (MTF) values can be evaluated for these object distances.
- the optics 340 generate a virtual intermediate image of the optics element 220, which in turn is imaged as an object by the optics of the system 115 to be tested onto its sensor 120.
- FIG. 4 shows a schematic representation of an embodiment of a device 200.
- the device 200 shown here corresponds to or is similar to the device described in the preceding FIGS.
- the image capture circuit 400 which can also be referred to as a frame grabber, is designed in this exemplary embodiment to read out the image sensor 120 depending on the position of the optical elements 220, 250, 315 and to provide the image signal 285 to the evaluation device 270.
- the frame grabber 400 is additionally designed to output a position trigger signal 405 to the control device 280 .
- the control device 280 is designed in this exemplary embodiment to determine the point in time for detecting the position of the optical elements 220, 250, 315 in response to the position trigger signal 405 and to store the respective position by means of a memory unit 407, which can also be referred to as an optical element position memory.
- the positions of the optical elements 220, 250, 315 can then be provided using the position signal 275.
- the device 200 in this exemplary embodiment is designed to process the image information captured by the test object with the aid of the image capture circuit 400, the image capture circuit 400 being connected to the control device 280 in a manner capable of signal transmission only by way of example.
- all optical devices 205, 235, 300, more precisely their motor controls are electronically connected in parallel to the control device 280.
- the control device 280 is designed to store the positions of the optical elements 220, 250, 315 at the point in time determined by the position trigger signal 405, which can be, merely by way of example, the beginning or the end of an image acquisition. In this case, the optical elements 220, 250, 315 are continuously moved from a starting position 410 to an end position 415.
- a position of the corresponding intermediate image also correlates with each position of the optical elements.
- the optical elements are arranged along their respective optical axes in such a way that all intermediate images lie in a common, apparent object plane.
- the intermediate images also move from a starting position 410 to an end position 415.
- the first object plane 410 corresponds to a starting position of the intermediate images of the optical elements 220, 250, 315, and the second object plane 415 corresponds to an end position of the intermediate images of the optical elements 220, 250, 315.
- the intermediate images of the optical elements can be moved along a variable plurality of object planes.
- a distance l2 between the starting position and the end position of the second optical element 250 is greater than a distance between the starting position and the end position of the first optical element 220.
- the speed profiles of the first optics device 205, the second optics device 235, the third optics device 300 and also further optics devices can be matched to one another in such a way that all intermediate images of all optics elements 220, 250, 315 are always arranged simultaneously in the previously defined object planes 410, 415.
- the control of the optical devices 205, 235, 300 is therefore designed in such a way that the intermediate images of the optical elements of all optical devices 205, 235, 300 are located in the same object plane 410, 415 at the same time, with the consequence that the optics elements of the off-axis optics devices 235, 300 are moved at a different speed than the optics element of the axial optics device 205.
- merely by way of example, the traversing speed of the first optics device 205 is set as a guide value, to which the speeds of the other optics devices 235, 300 are adapted accordingly by the control system.
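The speed matching described above can be sketched as follows. This is a minimal illustration under assumed values (function name, millimetre travels and the guide speed are not taken from the source): the guide device fixes a common run time, and each off-axis device's speed is scaled by its own start-to-end travel.

```python
# Minimal sketch of the speed matching: the traversing speed of the axial
# (guide) optics device fixes a common run time, and each off-axis device's
# speed is scaled by its own start-to-end travel so that all optics elements
# reach corresponding positions simultaneously. All names and numeric values
# here are illustrative assumptions, not taken from the source.

def synchronized_speeds(guide_speed, guide_travel, other_travels):
    """Scale each device's speed so every travel finishes in the guide run time."""
    run_time = guide_travel / guide_speed        # common duration for all devices
    return [travel / run_time for travel in other_travels]

# Axial device: 10 mm travel at 2 mm/s; off-axis devices travel 12 mm and 15 mm,
# so they must move faster to keep the intermediate images in a common plane.
speeds = synchronized_speeds(2.0, 10.0, [12.0, 15.0])
```

With a larger travel (compare the distance l2 of the second optical element above), the required speed scales proportionally.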
- Each of the optical devices 205, 235, 300 includes, for example, its own position encoder, which can be used in a closed-loop control for the position and speed control.
- the signals from the individual optics devices 205, 235, 300, ie the signal from the on-axis optics device 205 and the signals from the various off-axis optics devices 235, 300, can be transmitted electronically in parallel to the control device 280.
- the control device 280 is therefore designed, purely by way of example, to provide a first movement signal 420, a second movement signal 422 and a third movement signal 425 to the optics devices 205, 235, 300, with the movement signals 420, 422, 425 specifying the positions to be approached by the optics elements 220, 250, 315.
- a specification for the positions that can be approached by the optical elements 220, 250, 315 is stored as a position table 430 in the control device 280, merely by way of example.
- Figure 5 shows a schematic representation of an embodiment of a device 200.
- the device 200 shown here corresponds to or is similar to the device described in the preceding Figures 2, 3 and 4, with the difference that in this embodiment the control device 280 is designed to output an image trigger signal 500.
- the image trigger signal 500 can be provided to the image capture circuit 400 in order to determine the point in time for capturing the image information. Accordingly, the image trigger signal 500 can be triggered in this exemplary embodiment as soon as the optical elements 220, 250, 315 have reached a predefined position.
- the predefined positions to be approached can be stored in a position table 430.
- the image capture circuit 400 is designed to control the image sensor 120 in response to the image trigger signal 500 and to start an image recording process.
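The position-triggered capture can be sketched as follows; the tolerance value and the position list are hypothetical stand-ins, not values from the source.

```python
# Sketch of the position-triggered capture: when the encoder position of an
# optics element reaches one of the predefined positions from the position
# table (within a small tolerance), an image trigger would be issued for the
# image capture circuit. Tolerance and position values are assumptions.

def match_trigger_position(encoder_pos, pending, tol=1e-3):
    """Return the first pending position matched by the encoder, removing it."""
    for target in pending:
        if abs(encoder_pos - target) <= tol:
            pending.remove(target)       # each position triggers only one image
            return target
    return None

pending = [1.0, 2.5, 4.0]
hit = match_trigger_position(2.4995, pending)
```

In a real controller this comparison would typically run in hardware or firmware; the sketch only shows the matching logic.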
- the image signal 285 can be provided.
- in this exemplary embodiment, the image signal 285 is provided only indirectly to the evaluation device 270, since, merely by way of example, an image output device 505 is connected upstream of the evaluation device 270.
- likewise, merely by way of example, the position signal 275 is provided only indirectly by the control device 280 to the evaluation device 270, using a position output device 510.
- FIG. 6 shows a flow chart of an exemplary embodiment of a method 600 for checking an adjustment state of an image sensor of a camera module.
- the method 600 shown here can be carried out using a device as described in the preceding FIGS. 2, 3, 4 and 5.
- the method 600 includes a step 605 of moving a first optical element, which can be illuminated by a first light source, along a first optical axis of a first optical device.
- the first optical axis essentially corresponds to an optical axis of the camera module to be checked.
- a second optical element that can be illuminated by a second light source is also moved along a second optical axis of a second optical device.
- the second optics device is arranged at a radial distance from the first optics device and the first optical axis has an intersection with the second optical axis inside the camera module.
- merely by way of example, the first optics element is moved at a first speed and the second optics element is moved at a second speed that differs from the first speed.
- both the first and the second speed have a value greater than 0 m/s at every point in time, and, merely by way of example, the time profile of the first and second speed can be described mathematically by a non-linear function. Further optics devices can be added to this scheme.
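One possible profile with these properties is sketched below; the blend of a linear ramp with a smoothstep curve is purely an illustrative choice, not the profile used by the device.

```python
# Illustrative non-linear traversing profile with speed > 0 at every point in
# time: a blend of a linear ramp and a smoothstep curve. The derivative of the
# blend is 0.5 * (1 + 6*s - 6*s**2), which is positive everywhere on [0, 1],
# so the optics element never stands still during the run. Assumed example.

def element_position(t, t_total, p_start, p_end):
    """Position along the optical axis at time t for a strictly moving element."""
    s = t / t_total                                # normalized time in [0, 1]
    blend = 0.5 * s + 0.5 * (3 * s**2 - 2 * s**3)  # monotonic, endpoints 0 and 1
    return p_start + (p_end - p_start) * blend

# Sampled positions over a 10 s run from 0 mm to 8 mm.
pts = [element_position(t, 10, 0.0, 8.0) for t in range(11)]
```

Because the profile is non-linear in time, positions cannot be recovered from timestamps alone, which matches the direct position-image linking described further below.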
- the method 600 also includes a step 610 of reading in.
- position information is read in, which represents a position of the first and of the second optical element detected at a specific point in time.
- an image signal is read in which represents image information recorded by the image sensor at the specific point in time.
- the step 610 of reading in is followed by a step 615 of assigning the positions to the image information using the image signal and the position information in order to determine the adjustment state of the camera module.
- the aim of the method 600 described here is that the corresponding image signal is directly assigned to each position with high accuracy, i.e. with the lowest possible time offset (latency) and temporal inaccuracy, or that the image information and the associated positions of the optical devices are recorded quasi-simultaneously. This is necessary in order to determine the position of the highest image contrast with the greatest possible accuracy (in the µm range).
- a continuous focusing run can be carried out at high speed and there is no indirect linking between the image and the encoder position via a time stamp, which would necessarily require a temporally linear traversing process.
- FIG. 7 shows a flow chart of an exemplary embodiment of a method 600 for checking an adjustment state of an image sensor of a camera module.
- the method 600 presented here corresponds to or is similar to the method described in the preceding FIG. 6, with the difference that it has additional steps.
- step 605 of moving is followed by step 700 of outputting a position trigger signal.
- merely by way of example, the position trigger signal is output in order to determine the point in time for detecting the positions of the first and second optical elements and, for example, further optical elements.
- the method 600 in this exemplary embodiment includes a step 705 of storing the image information and the positions of the first and second optical elements and all other optical elements.
- optical elements in focusable collimators are moved continuously, ie not in steps, from a starting position to an end position.
- the test object records sequential images of the reticle and, for example, the individual image information (frames) is processed by a frame grabber.
- the frame grabber outputs a trigger signal as soon as an image has been completely recorded.
- the signal can also be output at the beginning of the image recording. This is followed by storing the image information and storing the optics element encoder position information in response to the trigger signal.
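The frame-complete flow can be sketched roughly as follows; the frame list and the encoder-reading callback are assumed stand-ins for the frame grabber and the position encoders, not interfaces from the source.

```python
# Rough sketch of the flow above: whenever the frame grabber reports a
# completed frame (the trigger signal), the current encoder positions of all
# optics elements are latched and stored together with that frame as one
# record. The frame list and the encoder callback are illustrative stand-ins.

def record_focus_run(frames, read_encoder_positions):
    """Pair each completed frame with the encoder positions read at its trigger."""
    records = []
    for frame in frames:                          # "frame complete" trigger
        positions = read_encoder_positions()      # latch all element positions
        records.append({"frame": frame, "positions": positions})
    return records

# Stand-in data: three frames while two optics elements advance at different speeds.
encoder_states = iter([[0.0, 0.0], [0.5, 0.6], [1.0, 1.2]])
run = record_focus_run(["frame0", "frame1", "frame2"], lambda: next(encoder_states))
```

Because the positions are read at the trigger itself, no timestamp-based matching is needed afterwards.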
- FIG. 8 shows a flow chart of an exemplary embodiment of a method 600 for checking an adjustment state of an image sensor of a camera module.
- the method 600 presented here corresponds to or is similar to the method described in the preceding FIGS. 6 and 7, with the difference that it has alternative and additional steps.
- the method 600 includes a step 800 of providing a motion signal.
- the movement signal represents a specification for positions to be approached by the optical elements in step 605 of moving.
- this specification is stored as a position table with non-equidistant position marks.
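One way such a non-equidistant table could be generated is sketched below; placing fine marks around an expected best-focus position is an assumed strategy, and all step sizes and window values are invented for illustration.

```python
# Sketch of a non-equidistant position table: marks are placed densely inside a
# window around an expected best-focus position and coarsely elsewhere, so more
# images are captured where the contrast peak is expected. Step sizes, window
# and focus value are illustrative assumptions, not from the source.

def build_position_table(start, end, focus, fine=0.05, coarse=0.5, window=0.5):
    """Coarse marks outside the focus window, fine marks inside it."""
    table, pos = [], start
    while pos <= end + 1e-9:
        table.append(round(pos, 6))
        step = fine if abs(pos - focus) <= window else coarse
        pos += step
    return table

table = build_position_table(0.0, 3.0, focus=1.5)
```

The resulting table concentrates most marks in the window around the expected focus and uses only a few coarse marks outside it.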
- an image trigger signal is triggered in order to determine the point in time for capturing the image information.
- the image signal is provided in response to the image trigger signal.
- the optical elements in the focusable collimators move continuously from a starting position to an end position.
- a trigger signal is triggered as soon as the encoder has reached a predefined position.
- merely by way of example, these trigger signals are passed on to the frame grabber, which then starts the image acquisition process.
- the image information is read out and stored, assigned to the initially predefined position.
- the image information can be temporarily stored and transmitted at the end of the focusing run and assigned to the positions.
- the image information, for example the contrast values, is evaluated as a function of the optical-element encoder position.
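A common way to evaluate contrast as a function of encoder position is parabolic interpolation around the maximum sample; the sketch below shows this standard technique, which is not necessarily the exact evaluation used by the device, and the sample data are invented.

```python
# Generic sketch of evaluating contrast versus encoder position: a parabola is
# fitted through the maximum contrast sample and its two neighbours, yielding a
# sub-step estimate of the best-focus position. Standard technique, not
# necessarily the device's exact evaluation; the sample data are invented.

def contrast_peak_position(positions, contrasts):
    """Parabolic interpolation of the contrast maximum over encoder position."""
    i = max(range(len(contrasts)), key=contrasts.__getitem__)
    if i == 0 or i == len(contrasts) - 1:
        return positions[i]               # peak at the edge: no interpolation
    c_l, c_m, c_r = contrasts[i - 1], contrasts[i], contrasts[i + 1]
    step = positions[i + 1] - positions[i]
    offset = 0.5 * (c_l - c_r) / (c_l - 2.0 * c_m + c_r)  # in units of step
    return positions[i] + offset * step

peak = contrast_peak_position([1.0, 1.1, 1.2, 1.3], [0.2, 0.8, 0.9, 0.3])
```

Interpolating between marks is what allows the best-focus position to be located more finely than the spacing of the position table itself.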
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280071823.6A CN118402230A (zh) | 2021-10-29 | 2022-10-25 | 用于测试影像传感器的调整状态的装置及方法 |
EP22809079.1A EP4424009A1 (de) | 2021-10-29 | 2022-10-25 | Vorrichtung zum prüfen eines justierzustands eines bildsensors und verfahren zum prüfen eines justierzustands eines bildsensors |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102021128335.3 | 2021-10-29 | ||
DE102021128335.3A DE102021128335A1 (de) | 2021-10-29 | 2021-10-29 | Vorrichtung zum Prüfen eines Justierzustands eines Bildsensors und Verfahren zum Prüfen eines Justierzustands eines Bildsensors |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023072889A1 true WO2023072889A1 (de) | 2023-05-04 |
Family
ID=84360641
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2022/079694 WO2023072889A1 (de) | 2021-10-29 | 2022-10-25 | Vorrichtung zum prüfen eines justierzustands eines bildsensors und verfahren zum prüfen eines justierzustands eines bildsensors |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4424009A1 (de) |
CN (1) | CN118402230A (de) |
DE (1) | DE102021128335A1 (de) |
TW (1) | TW202317961A (de) |
WO (1) | WO2023072889A1 (de) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012016337A1 (de) * | 2012-08-20 | 2014-02-20 | Jos. Schneider Optische Werke Gmbh | Verfahren zum Bestimmen einer optischen Qualität eines Fotomoduls |
US20210181058A1 (en) * | 2017-06-02 | 2021-06-17 | Trioptics Gmbh | Apparatus for detecting a modulation transfer function and centering of an optical system |
- 2021
  - 2021-10-29: DE application DE102021128335.3A filed (patent DE102021128335A1, active, pending)
- 2022
  - 2022-10-25: EP application EP22809079.1A filed (patent EP4424009A1, active, pending)
  - 2022-10-25: CN application CN202280071823.6A filed (patent CN118402230A, active, pending)
  - 2022-10-25: WO application PCT/EP2022/079694 filed (patent WO2023072889A1, active, application filing)
  - 2022-10-27: TW application TW111140866A filed (patent TW202317961A, status unknown)
Non-Patent Citations (2)
Title |
---|
BRÄUNIGER K ET AL: "Automated assembly of camera modules using active alignment with up to six degrees of freedom", PROCEEDINGS OF SPIE, IEEE, US, vol. 8992, 8 March 2014 (2014-03-08), pages 89920F - 89920F, XP060036061, ISBN: 978-1-62841-730-2, DOI: 10.1117/12.2041754 * |
TRIOPTICS GMBH: "CamTest ColMot Optical target projectors", 31 January 2021 (2021-01-31), XP093020328, Retrieved from the Internet <URL:https://trioptics.com/wp-content/uploads/2021/01/CamTest_ColMot2.0_6.0_Optical_Target_Projectors_DS.pdf> [retrieved on 20230202] * |
Also Published As
Publication number | Publication date |
---|---|
DE102021128335A1 (de) | 2023-05-04 |
EP4424009A1 (de) | 2024-09-04 |
TW202317961A (zh) | 2023-05-01 |
CN118402230A (zh) | 2024-07-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22809079; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2024522521; Country of ref document: JP; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: 2022809079; Country of ref document: EP |
 | ENP | Entry into the national phase | Ref document number: 2022809079; Country of ref document: EP; Effective date: 20240529 |