CN114390235A - Recording method and recording system for successively recording objects moving relative to a camera - Google Patents
Recording method and recording system for successively recording objects moving relative to a camera
- Publication number
- CN114390235A (application CN202111214649.XA)
- Authority
- CN
- China
- Prior art keywords
- image data
- recorded
- camera
- data set
- line image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/443—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
Abstract
The invention relates to a recording system (1) for the successive recording of objects (2) moving relative to a camera (3). The recording system includes: a camera comprising an image sensor (4) for recording line image data sets (10) of the object to be recorded in free-running operation; a receiving unit (5) for receiving rescan information related to the speed of the relative movement between the object to be recorded and the camera; and a rescanning unit (6) for rescanning the recorded line image data sets to produce rescanned line image data sets, wherein the spatial position of a rescanned line image data set with respect to the recorded line image data sets depends on the received rescan information. The image sensor is an area sensor which comprises a plurality of pixel rows and is adapted to record, in each case, a plurality of line image data sets by means of the plurality of pixel rows for generating the rescanned line image data sets.
Description
Technical Field
The present invention is in the field of electrical communication engineering and relates in particular to a recording method and a recording system for successively recording objects moving relative to a camera.
Background
In industrial environments, digital cameras are often used, for example, to monitor, control or adjust a production process or machine, or to sort objects. Such cameras, referred to here as industrial cameras, are primarily characterized by the ease with which they can be integrated into machines and equipment. Their use is not limited to industrial environments, however; they can be used in almost the same form for many other applications, such as mail sorting, toll bridges or medical technology equipment.
Industrial cameras include area-scan cameras and line-scan cameras. A camera type with only a single light-sensitive line ("line sensor") is generally referred to here as a line camera. In contrast, an area camera ("area-scan camera") has a sensor with a plurality of rows ("area sensor").
The use of area cameras is generally expedient for imaging individual objects whose aspect ratio is on the order of 1, as is the case, for example, with brake disks or SMD resistors. Line cameras, in contrast, are used to image objects whose aspect ratio deviates considerably from 1 (e.g. railway tracks) or which are produced or inspected in a continuous process. The printing of newsprint by press publishers may be mentioned here as an example of a continuous process. Line cameras are also frequently used to sequentially image a continuous stream of objects that are subsequently sorted, such as grain particles or rocks on a conveyor belt.
In order to record a two-dimensional image by means of a line camera, it is necessary to record different regions of the object temporally one after the other and to form a two-dimensional image from the line image data set obtained here. In this case, a relative movement generally takes place between the object to be recorded and the line camera. Either the object to be recorded is moved by means of a conveyor belt or the like ("facsimile principle") or the line camera is moved over a stationary object ("scanner principle").
A problem in recording with the aid of line cameras is that, in the event of an uneven relative speed between the object to be recorded and the line camera, the spacing between the lines recorded one after the other is not constant. Such a speed change therefore causes geometric distortions in the recorded image without corresponding corrective measures.
A conventional recording method for dealing with a varying relative speed consists in controlling the image repetition rate as a function of the speed of the relative motion. If the relative motion is slow, the image repetition rate is reduced accordingly in order to maintain a constant spacing between the recorded lines; if the relative motion is faster, the image repetition rate is increased correspondingly. The image repetition rate can be controlled by sending an external trigger signal to the line camera (triggered operation). However, variations in the image repetition rate typically also cause variations in the exposure, which in turn require a corresponding exposure control.
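As a rough numeric illustration of this conventional speed-proportional triggering (a sketch only; the function name and units are assumptions, not taken from the text), the trigger rate is simply chosen so that consecutive lines land one desired line pitch apart on the object:

```python
# Minimal sketch of speed-proportional line triggering (assumed units):
# the external trigger rate is chosen so that consecutively recorded lines
# are one desired line pitch apart on the object.
def trigger_rate_hz(speed_mm_s: float, desired_line_pitch_mm: float) -> float:
    return speed_mm_s / desired_line_pitch_mm

# Example: a belt speed of 500 mm/s and a desired pitch of 0.1 mm
# call for a 5 kHz line trigger.
print(trigger_rate_hz(500.0, 0.1))  # 5000.0
```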
DE 102004050422 A1 therefore takes a different approach to avoiding geometric distortions when the relative speed between the object to be recorded and the line camera changes. The line camera is operated in free-running mode ("free run"), i.e. the lines are recorded at a fixed, relatively high image repetition rate, and the line image data sets obtained are rescanned in order to produce rescanned line image data sets with the desired constant (spatial) spacing. The rescan is a function of the fixed image repetition rate, a measure of the relative speed, and the desired pitch of the rescanned line image data sets.
This solution has the drawback that a relatively high image repetition rate is required in order to keep the line image data sets sufficiently dense even in the case of large fluctuations of the relative speed between the object to be recorded and the line camera, so that a high-quality rescan of the obtained line image data sets can be achieved.
Disclosure of Invention
Based on the prior art mentioned, the invention is based on the object of specifying a recording system for the sequential recording of objects moving relative to a camera, which can achieve a high-quality rescan of the line image data set obtained in a simple manner even in the event of large fluctuations in the relative speed between the object to be recorded and the camera. The object on which the invention is based is also to propose a corresponding recording method for the successive recording of objects moving relative to a camera.
According to a first aspect of the present invention there is provided a recording system for successively recording objects moving relative to a camera, the recording system comprising:
a camera, wherein the camera comprises an image sensor for recording a line image data set of an object to be recorded in free-running operation,
a receiving unit for receiving rescan information related to the speed of the relative movement between the object to be recorded and the camera, and
A rescanning unit for rescanning the recorded line image dataset to produce a rescanned line image dataset, wherein a spatial position of the rescanned line image dataset with respect to the recorded line image dataset is related to the received rescanning information,
wherein the image sensor is an area sensor which comprises a plurality of pixel rows and which is adapted to record, in each case, a plurality of line image data sets by means of the plurality of pixel rows for generating a rescanned line image data set.
Recording "in free-running mode" is understood here to mean recording at a constant image repetition rate or image frequency based on the timing within the sensor or within the camera, which can be parameterized, for example, by the control unit. The control unit may be implemented in an integrated circuit, for example in a processor, signal processor, microcontroller or FPGA. A particularly frequently chosen architecture for modern CMOS area sensors is known from US 6,606,122B 1. Furthermore, the architecture integrates the pixel array and the control unit inside the sensor on one chip. With this architecture, continuous recording can be performed autonomously by the area sensor at a constant image repetition rate or image frequency, since the control signals required for said operation of the pixel array can be generated by the control unit inside the sensor. The free-running operation can thereby be achieved by means of a particularly simple mechanism which requires relatively little development effort on the camera side.
The inventors' insight on which the present invention is based is that, by using an area sensor comprising a plurality of pixel rows, a plurality of line image data sets can be recorded simultaneously or substantially simultaneously in each case. At any given time, the plurality of pixel rows covers a larger area of the object to be recorded than the single row of the line camera disclosed in DE 102004050422 A1. As a result, the line image data sets remain sufficiently dense even with larger fluctuations of the relative speed between the object to be recorded and the camera, so that a high-quality rescan of the recorded line image data sets can be achieved. Furthermore, a series of other advantages results: line cameras are typically high-quality professional cameras that are sold at correspondingly high prices. Typical line cameras operate at high speed, provide high-quality images and, depending on the requirements of the application, are equipped with special design features, for example an input for delivering a very fast external trigger signal. In contrast, a large number of area sensors are available at significantly lower cost than line sensors while offering otherwise comparable performance data, in particular the same image width. According to the invention, a recording system for the successive recording of objects moving relative to the camera can therefore be realized at lower material cost than with a conventional line camera having a line sensor.
The rescanning unit is preferably designed to carry out the rescanning as a function of the image repetition rate or image frequency of the image sensor in free-running operation and of the desired spatial resolution of the rescan, wherein the image repetition rate or image frequency and/or the desired spatial resolution are preferably configurable. As mentioned, the spatial position of the rescanned line image data set with respect to the recorded line image data sets depends on the received rescan information, which in turn is related to the speed of the relative movement between the object to be recorded and the camera. The object moving relative to the camera can be, for example, an object moving on a conveyor belt. In that case, the speed of the relative movement can be determined, for example, by a rotary encoder of the conveyor belt or by another sensor for detecting the speed of the conveyor belt. The speed of the relative movement is the main parameter for the rescan, since it determines how far the object to be recorded moves relative to the camera within a given time interval. Further parameters are the image repetition rate or image frequency of the image sensor, which indicates how many images the image sensor records within a given time interval, and the desired spatial resolution of the rescan, which specifies the desired "pixel density" of the rescan. If these parameters are configurable, the recording system for the successive recording of objects moving relative to a camera can be adapted very flexibly to different recording situations and requirements. The desired spatial resolution of the rescan is preferably configurable, or chosen, such that it corresponds to the spatial resolution of the image sensor. The object to be recorded can then preferably be recorded largely without distortion, i.e. with a substantially correct aspect ratio.
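To illustrate how these parameters interact, a minimal sketch follows; it is not the claimed implementation, and all names, units and the assumption of a constant speed within each frame interval are illustrative only. The speed and the image repetition rate give the object-space position of each recorded frame, while the desired spatial resolution gives the positions at which rescanned lines are wanted.

```python
# Illustrative sketch (assumed units: mm, s): derive object-space positions
# of the recorded frames and of the desired rescanned lines.
def rescan_grid(frame_speeds_mm_s, frame_rate_hz, output_pitch_mm):
    frame_period_s = 1.0 / frame_rate_hz
    # Object-space position of each recorded frame (frame 0 defines the origin);
    # the speed is assumed constant within each frame interval.
    frame_positions = [0.0]
    for v in frame_speeds_mm_s[1:]:
        frame_positions.append(frame_positions[-1] + v * frame_period_s)
    # Positions of the rescanned lines, spaced by the desired pitch.
    n_lines = int(frame_positions[-1] / output_pitch_mm) + 1
    line_positions = [k * output_pitch_mm for k in range(n_lines)]
    return frame_positions, line_positions
```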
It is also preferred that the image sensor comprises an electronic shutter adapted to simultaneously start exposure of each of the plurality of pixel rows. Electronic shutters of the type generally referred to as global shutters, frame shutters, full-frame shutters or global electronic shutters are characterized in that the exposure of the pixel rows of the image sensor starts at the same time and usually also ends at the same time. At that moment, the photocharges obtained by exposing the pixels are generally transferred to a memory inside the pixels and stored there until read. By simultaneously starting and simultaneously ending the exposure, the recording of the image data sets of the plurality of lines takes place simultaneously. Rescanning the recorded line image data set can therefore be realized particularly simply by means of an image sensor having a global shutter.
Alternatively, it is preferred that the image sensor comprises an electronic shutter adapted to start the exposure of each of the plurality of pixel rows in a temporally staggered manner, wherein the rescanning unit is adapted such that the rescan of the recorded line image data sets takes into account the temporal staggering of the start of exposure of each of the plurality of pixel rows. An electronic shutter of this type, generally referred to as a rolling shutter or electronic rolling shutter, is characterized in that the exposure of a first row starts at a first instant and the exposure of the other pixel rows starts at correspondingly later, temporally staggered instants. Since it is generally desirable for all pixel rows of an image to have the same integration time, the exposure of the individual rows usually also ends at correspondingly staggered times. In contrast to the global shutter, the exposure is ended directly by the readout, so it is not necessary to store photocharge. Because the need to store photocharge is eliminated, the pixels of an image sensor with a rolling shutter can be made substantially smaller than the pixels of an image sensor with a global shutter. An image sensor with a rolling shutter can therefore be manufactured substantially more cheaply than an image sensor with a global shutter, because the smaller pixels require less silicon area. Furthermore, due to the smaller pixels, the objective can also be made substantially smaller and can therefore likewise be produced more cost-effectively. In this way, a recording system for the successive recording of objects moving relative to a camera can be realized particularly cost-effectively by means of an image sensor with a rolling shutter. Since the rescanning unit is adapted such that the rescan of the recorded line image data sets takes into account the temporal staggering of the start of exposure of each pixel row of the plurality of pixel rows, spatial misassignments (Fehlzuordnungen) of the recorded line image data sets and the disadvantageous image artifacts associated therewith can be avoided.
Preferably, the recording system is adapted to perform the successive recording in a forward run, in which the temporal staggering of the start of exposure of each of the plurality of pixel rows takes place opposite to the direction of movement of the image of the object to be recorded on the image sensor of the camera, or in a backward run, in which the temporal staggering of the start of exposure of each of the plurality of pixel rows takes place in the direction of movement of the image of the object to be recorded on the image sensor of the camera. As explained, with a rolling shutter neither the exposure nor the reading of the pixel rows of the image sensor takes place simultaneously; both are staggered in time row by row. Intuitively, in a two-dimensional time-position diagram (time on the x-axis, spatial position on the y-axis), the exposure intervals of successive pixel rows are therefore not stacked vertically above one another but are shifted laterally by the difference between the exposure times of the respective pixel rows. This time offset has different effects depending on whether the successive recording is performed in a forward run or a backward run, because the relative movement between the object to be recorded and the camera causes the spatial positions of the recorded line image data sets to shift slightly as well. In the forward run, the recorded line image data sets are spatially pulled apart. This effect, which may also be referred to as scan expansion, causes a slight underscan compared with an image sensor with a global shutter and the same pixel size. Correspondingly, in the backward run the recorded line image data sets are spatially compressed. This effect, which may also be referred to as scan shrinkage, causes a slight overscan compared with an image sensor with a global shutter and the same pixel size. Since the time offset is very small for typical image sensors with a rolling shutter, both forward and backward operation are unproblematic as long as the information required for the rescan is available. The rescanning unit is adapted such that the rescan of the recorded line image data sets takes into account the temporal staggering of the start of exposure of each of the plurality of pixel rows and preferably also the respective operating direction.
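A minimal sketch of the geometric effect just described follows; the sign convention, names and units are assumptions for illustration and do not reproduce the claimed method. The object-space position of a pixel row within a frame is shifted by the product of the relative speed and that row's exposure-start offset, which pulls the rows apart in a forward run and compresses them in a backward run.

```python
# Illustrative sketch (assumed units and sign convention) of where pixel row r
# of a rolling-shutter frame lands in object space.
def rolling_shutter_row_position(frame_start_pos_mm: float,
                                 row_index: int,
                                 row_pitch_mm: float,
                                 row_stagger_s: float,
                                 speed_mm_s: float,
                                 forward_run: bool = True) -> float:
    sensor_offset = row_index * row_pitch_mm                # geometric row spacing
    motion_shift = speed_mm_s * row_index * row_stagger_s   # shift from the stagger
    # Forward run: the stagger opposes the imaged motion, so the recorded rows
    # are pulled apart (scan expansion). Backward run: rows are compressed
    # (scan shrinkage).
    return frame_start_pos_mm + sensor_offset + (motion_shift if forward_run
                                                 else -motion_shift)
```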
It is also preferred that the plurality of pixel rows is in the range of 4 to 128 pixel rows, preferably 4 to 64 pixel rows, still more preferably 8 to 32 pixel rows, most preferably 8 to 16 pixel rows. When recording objects moving relative to the camera one after the other by means of an area sensor, one peculiarity has to be taken into account in comparison with a line camera. Depending on the height of the object to be recorded, for example an object moving on a conveyor belt, different distances from the camera result in different angular velocities and thus different image velocities. In multi-line scanning this can lead to image artifacts, referred to here as three-dimensional distortion (3D distortion). With a plurality of pixel rows in the ranges mentioned, the 3D distortion can be limited to an acceptable level while at the same time a high-quality rescan of the obtained line image data sets can be achieved. In particular, values of 8 to 32 pixel rows, and even more so of 8 to 16 pixel rows, represent a sensible compromise between keeping the recorded line image data sets dense enough that a high-quality rescan can be achieved even with large fluctuations of the relative speed between the object to be recorded and the camera, and keeping the 3D distortion small. Furthermore, as an additional effect of the smaller number of pixel rows, the size of the area to be illuminated by a light source is reduced, whereby, for example, the energy consumption of the illumination can be reduced, or the illumination can be designed slimmer and more cost-effective. The problem that the camera may not be mechanically aligned exactly with respect to the relative direction of movement is also reduced.
Preferably, the image sensor includes a greater number of pixel rows than the plurality of pixel rows, and is adapted to restrict an image area to be read to the plurality of pixel rows. In other words, the area image sensor is adapted to read a smaller number of pixel rows than the total number of pixel rows of the area sensor in each of the successive exposures. Thereby, in order to realize a recording system for successively recording objects moving relative to the camera, it is not necessary to use a special area sensor comprising exactly a plurality of pixel rows (for example in the range of 8 to 16 pixel rows), but a commercially available area sensor can be used, which again reduces costs.
It is particularly preferred that the region to be read can be configured as a configurable region of interest in the camera. This configuration possibility exists in most of the area sensors available and thus allows the area to be read to be limited to a plurality of pixel rows.
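A purely hypothetical configuration sketch of such a readout restriction is shown below; no particular sensor or vendor API is implied, and all field names and values are illustrative assumptions.

```python
# Hypothetical ROI settings (all field names and values are illustrative):
# restrict the readout to a band of N pixel rows so that the area sensor
# behaves like a multi-line sensor in free-running operation.
N_ROWS = 16
SENSOR_HEIGHT = 3000     # assumed total number of pixel rows of the area sensor
SENSOR_WIDTH = 4096      # assumed number of pixel columns

roi = {
    "offset_x": 0,
    "width": SENSOR_WIDTH,
    "offset_y": (SENSOR_HEIGHT - N_ROWS) // 2,  # centre the band vertically
    "height": N_ROWS,
}
```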
Preferably, a diaphragm with a slit-shaped diaphragm opening is arranged in front of the image sensor in the camera. In addition to the so-called aperture stop in the objective, many cameras contain a further stop, referred to here by way of example as a field stop (Feldblende), which is usually positioned spatially close in front of the image sensor. A common embodiment of a field stop for an area sensor has a largely rectangular opening, referred to here as a rectangular stop without regard to its typically rounded corners, and generally has an aspect ratio in the range from 1:1 to 2:1. A line camera, in contrast, usually has a different embodiment of the field stop, which may also be referred to as a slit stop. Such a slit stop has a slit-shaped diaphragm opening with an aspect ratio of more than 4:1, preferably more than 8:1, still more preferably more than 12:1. The diaphragm opening can be embodied, for example, as a rectangle, as a slot-shaped rectangle with rounded corners or as a slot with rounded ends. Chamfers can also be arranged on one or both sides. When an area sensor is used in a recording system for the successive recording of objects moving relative to a camera, undesirable image artifacts can in principle occur if unwanted regions of the image sensor are exposed, in particular if the exposure is carried out with high intensity. An example of such undesirable image artifacts is described in DE 102008016393 B4. A further problem is scattered light that may be caused by undesired illumination of the area sensor; such scattered light can in principle occur not only in the objective but also in other optical components of the camera, for example at the sensor cover glass. By providing a diaphragm with a slit-shaped diaphragm opening in the camera irrespective of the area sensor used, undesirable image artifacts such as scattered light, for example at the sensor cover glass, can be reduced.
It is also preferred that, if the spatial position of the line image data set to be rescanned with respect to the recorded line image data sets lies, in two or more of the plurality of line image data sets recorded at different times, between two recorded line image data sets or on one recorded line image data set in each case, the rescanning unit is adapted to generate the rescanned line image data set from these two recorded line image data sets or this one recorded line image data set of each of the two or more sets recorded at different times. Such a multiple scan can be performed, for example, as follows: for each of the two or more sets of the plurality of line image data sets recorded at different times, a separate rescanned line image data set is determined, and the rescanned line image data set is generated by combining the two or more rescanned line image data sets thus obtained. The combination can be performed, for example, by addition, averaging, weighted averaging or linear combination. A rescanned line image data set produced in this way typically has a better signal-to-noise ratio than a rescanned line image data set produced from the recorded line image data sets of only one of the recorded frames. The improved signal-to-noise ratio generally improves the image quality and is particularly desirable when recording under low-light conditions.
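A minimal sketch of the combination step described above follows; the array layout and the optional weighting are assumptions for illustration, not prescribed by the text.

```python
import numpy as np

# Illustrative combination of several rescanned candidate lines, one per frame
# that covers the target position, into a single rescanned line.
def combine_rescans(candidate_lines, weights=None):
    lines = np.asarray(candidate_lines, dtype=np.float64)  # shape: (k, width)
    if weights is None:
        return lines.mean(axis=0)                          # plain averaging
    w = np.asarray(weights, dtype=np.float64)
    return (w[:, None] * lines).sum(axis=0) / w.sum()      # weighted average
```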
Preferably, the image sensor is a monochrome image sensor, or the image sensor is a color image sensor having a mosaic filter, for example a Bayer pattern as described in US 3,971,065 A. In the latter case, color images can also be recorded by means of the camera, and the mosaic filter allows this to be achieved with relatively little effort. Compared with processing the line image data sets of a monochrome image sensor, at least one further technical step is required, since a color image sensor with a mosaic filter requires a color interpolation in which the color values are essentially determined at intermediate pixel positions. An intermediate pixel position here means a position that does not necessarily lie exactly on the pixel grid but can be shifted with respect to the pixel grid by less than the size of one pixel. One possible way of carrying out this interpolation is to apply interpolation methods known for the color filter arrangements present on color area sensors, for example according to EP 2929503 B1. Color images each having a plurality of color components are thus obtained from the raw image, and the color component values for the desired intermediate pixel positions can then be obtained by interpolation in each color plane spanned by the values of the corresponding color component. The totality of the color component values for a given intermediate position then forms the color value for that position. Another, more elaborate method for the color interpolation of intermediate pixel positions is described, for example, in DE 102016112968 B4. This method additionally has the advantage that the color values obtained conform to the model of the EMVA 1288 standard and are thus physically equivalent to the color values of a color line camera with a color line sensor.
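As a rough illustration of the interpolation step described above (the demosaicing itself is not shown, and the simple per-plane linear interpolation as well as all names are assumptions), a color value at an intermediate row position can be obtained by interpolating within each color plane separately:

```python
import numpy as np

# Illustrative per-plane interpolation at a fractional row position y
# (assumed to satisfy 0 <= y <= rows - 1); `planes` maps a color-component
# name to a 2-D array of that component.
def color_line_at(planes: dict, y: float) -> dict:
    r0 = int(np.floor(y))
    t = y - r0
    out = {}
    for name, plane in planes.items():
        r1 = min(r0 + 1, plane.shape[0] - 1)
        out[name] = (1.0 - t) * plane[r0] + t * plane[r1]
    return out  # one interpolated row per color component
```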
It is also preferred that the camera comprises a receiving unit and a rescanning unit.
Preferably, the recording system further comprises a sensor for detecting the speed of the relative movement between the object to be recorded and the camera.
According to another aspect of the present invention, there is provided a recording method for successively recording an object moving relative to a camera, wherein the recording method includes:
recording a line image data set of the object to be recorded by means of an image sensor of the camera in free-running mode,
-receiving rescan information related to the speed of the relative movement between the object to be recorded and the camera, and
rescanning the recorded line image dataset to produce a rescanned line image dataset, wherein the spatial position of the rescanned line image dataset with respect to the recorded line image dataset is related to the received rescan information,
wherein the image sensor is an area sensor comprising a plurality of pixel rows, wherein a plurality of line image data sets are recorded, in each case, by means of the plurality of pixel rows for generating a rescanned line image data set.
It is to be understood that the recording system according to the invention and the recording method according to the invention have similar and/or identical preferred embodiments, in particular as defined in the description.
It is to be understood that a preferred embodiment of the invention may also be any combination of the examples.
Drawings
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings, in which,
figure 1 shows schematically and exemplarily an embodiment of a recording system for successively recording objects moving relative to a camera,
figure 2 shows schematically and exemplarily a rescan of a recorded line image dataset for an image sensor with a global shutter,
figure 3 shows schematically and exemplarily a rescanning of a recorded line image dataset for an image sensor with a rolling shutter in a forward run,
figure 4 shows schematically and exemplarily a rescanning of a recorded line image dataset for an image sensor with a rolling shutter in backward operation,
figure 5 shows schematically and exemplarily a rescanning of a recorded line image dataset by means of a multiple scan for an image sensor having a global shutter, and
fig. 6 shows a flow chart exemplarily illustrating an embodiment of a recording method for successively recording objects moving relative to a camera.
In the drawings, identical or corresponding elements or units are provided with identical or corresponding reference numerals. If an element or unit has already been described in connection with one figure, its detailed description may be omitted in connection with another figure.
Detailed Description
Fig. 1 schematically and exemplarily shows an embodiment of a recording system 1 for the successive recording of an object 2 moving relative to a camera 3. The recording system 1 includes: a camera 3, wherein the camera 3 comprises an image sensor 4 for recording line image data sets 10 of the object 2 to be recorded in free-running operation; a receiving unit 5 for receiving rescan information related to the speed v of the relative movement between the object 2 to be recorded and the camera 3; and a rescanning unit 6 for rescanning the recorded line image data sets 10 in order to produce rescanned line image data sets 11 (see figs. 2 to 5), wherein the spatial position of a rescanned line image data set 11 with respect to the recorded line image data sets 10 depends on the received rescan information. The image sensor 4 is an area sensor, here a monochrome image sensor, which comprises a plurality of pixel rows 12 and is adapted to record, in each case, a plurality of line image data sets 10 by means of the plurality of pixel rows 12 for generating a rescanned line image data set 11.
In the embodiment, the camera 3 comprises a receiving unit 5 and a rescanning unit 6.
The rescanning unit 6 is designed to carry out a rescan as a function of the image repetition rate or image frequency of the image sensor 4 in the free-run mode and the desired spatial resolution of the rescan. As mentioned, the spatial position of the rescanned line image data set 11 with respect to the recorded line image data set 10 is correlated with the received rescan information, which in turn is correlated with the speed v of the relative movement between the object 2 to be recorded and the camera 3. In the example, the object 2 moving relative to the camera 3 is an object 2 moving on a conveyor belt 13. The speed v of the relative movement is determined by a rotational angle encoder 14 of the conveyor belt 13. The velocity v of the relative movement is the main parameter for rescanning, since it determines how far the object 2 to be recorded moves relative to the camera 3 within a certain time interval. Other parameters include: an image repetition rate or image frequency of the image sensor 4, which indicates how many images the image sensor 4 records within a certain time interval; and a desired spatial resolution of the rescan that accounts for a desired "pixel density" of the rescan. In the embodiment described, the parameters are configurable, for example, via a suitable interface of the camera 3, which is not shown in the figure, so that the recording system 1 for the successive recording of objects 2 moving relative to the camera 3 can be adapted very flexibly to different recording situations and recording requirements.
In the embodiment, the image sensor 4 includes an electronic shutter 7. Two different variants are to be distinguished here.
In the first modification, the electronic shutter 7 is adapted to start exposure of each of the plurality of pixel rows 12 at the same time. The type of electronic shutter, which is usually referred to as global shutter, frame shutter, full frame shutter or global electronic shutter, is characterized in that the exposure of the pixel rows of the image sensor 4 starts at the same time and usually also ends at the same time.
A rescanning of the recorded line image data sets 10 for an image sensor 4 with a global shutter is shown schematically and exemplarily in fig. 2. The figure shows a two-dimensional time-position diagram (time on the x-axis, spatial position on the y-axis). At each of the times t0, t1, t2, etc., the image sensor 4 with its plurality of pixel rows 12 records a plurality of line image data sets 10, visualized here simply as individual transparent squares, for generating the rescanned line image data sets 11. As can be seen, the line image data sets 10 recorded at the different times t0, t1, t2, etc. are shifted spatially (in the y-direction of the diagram) relative to one another owing to the relative movement between the object 2 to be recorded and the camera 3. The amount of spatial shift between two adjacent times depends on the speed v of the relative movement between those times and on the image repetition rate or image frequency of the image sensor 4. In the example shown, the shift between times t0 and t1 (smaller speed v0-1) is smaller than the shift between times t1 and t2 (greater speed v1-2). The spatial position of the line image data sets 11 to be rescanned, visualized here simply as individual solid black circles, with respect to the recorded line image data sets 10 can be determined as a function of the speed v of the relative movement, the image repetition rate or image frequency of the image sensor 4 and the desired spatial resolution of the rescan. In the example, the desired spatial resolution of the rescan corresponds exactly to the resolution of the image sensor 4. The line image data sets 10 recorded at time t0 can therefore be used directly as rescanned line image data sets 11. This no longer applies, however, to the line image data sets 10 recorded at times t1 and t2. There, a rescanned line image data set 11 is generated from the respectively adjacent recorded line image data sets 10. This can be done, for example, by means of linear interpolation, which is illustrated in the figure by way of example for the rescanned line image data set 11 denoted by L.
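A minimal sketch of this linear interpolation step is given below; the names and the object-space position convention are assumptions for illustration.

```python
import numpy as np

# Illustrative linear interpolation of a rescanned line at object-space
# position y from the two recorded lines that bracket it (y0 <= y <= y1).
def interpolate_line(line0, y0, line1, y1, y):
    line0 = np.asarray(line0, dtype=np.float64)
    line1 = np.asarray(line1, dtype=np.float64)
    if y1 == y0:
        return line0
    t = (y - y0) / (y1 - y0)   # 0 at line0, 1 at line1
    return (1.0 - t) * line0 + t * line1
```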
In a second variant, the electronic shutter 7 is adapted to start the exposure of each of the plurality of pixel rows 12 in a temporally staggered manner, wherein the rescanning unit 6 is adapted such that the rescan of the recorded line image data sets 10 takes into account the temporal staggering of the start of exposure of each of the plurality of pixel rows 12. An electronic shutter of this type, generally referred to as a rolling shutter or electronic rolling shutter, is characterized in that the exposure of a first row starts at a first instant and the exposure of the other pixel rows starts at correspondingly later, temporally staggered instants. Since it is generally desirable for all pixel rows of an image to have the same integration time, the exposure of the individual rows usually also ends at correspondingly staggered times.
The recording system 1 is adapted to perform successive recording in a forward run in which the temporal staggering of the start of exposure of each of the plurality of pixel lines 12 is performed opposite to the direction of movement of the imaging of the object 2 to be recorded on the image sensor 4 of the camera 3, or in a backward run in which the temporal staggering of the start of exposure of each of the plurality of pixel lines 12 is performed in the direction of movement of the imaging of the object 2 to be recorded on the image sensor 4 of the camera 3.
The rescanning of the recorded line image data sets 10 for an image sensor 4 with a rolling shutter is shown schematically and exemplarily in fig. 3 (forward run) and fig. 4 (backward run). As explained, with a rolling shutter neither the exposure nor the reading of the pixel rows of the image sensor 4 takes place simultaneously; both are staggered in time row by row. Intuitively, in a two-dimensional time-position diagram (time on the x-axis, spatial position on the y-axis), the exposure intervals of successive pixel rows are therefore not stacked vertically above one another but are shifted laterally by the difference between the exposure times of the respective pixel rows. This time offset has different effects depending on whether the successive recording is performed in a forward run or a backward run, because the relative movement between the object 2 to be recorded and the camera 3 causes the spatial positions of the recorded line image data sets (in the y-direction of the diagram) to shift slightly as well. In the forward run (fig. 3), the recorded line image data sets 10 are spatially pulled apart. This effect, which may also be referred to as scan expansion, causes a slight underscan compared with an image sensor 4 with a global shutter (fig. 2). Correspondingly, in the backward run (fig. 4) the recorded line image data sets 10 are spatially compressed. This effect, which may also be referred to as scan shrinkage, causes a slight overscan compared with an image sensor 4 with a global shutter (fig. 2). Since the time offset is very small for typical image sensors with a rolling shutter, both forward and backward operation are unproblematic as long as the information required for the rescan is available. The rescanning unit 6 is adapted such that the rescan of the recorded line image data sets 10 takes into account the temporal staggering of the start of exposure of each of the plurality of pixel rows 12 as well as the respective operating direction.
Referring again to fig. 1, it should be noted that, for the sake of clarity, a value of 4 pixel rows is used in the figure as the number of the plurality of pixel rows 12. However, as stated, the plurality of pixel rows is preferably in the range of 4 to 128 pixel rows, preferably 4 to 64 pixel rows, still more preferably 8 to 32 pixel rows, most preferably 8 to 16 pixel rows. In the embodiment, the number of the plurality of pixel rows 12 is therefore actually in the range of 8 to 16 pixel rows.
In the exemplary embodiment, the image sensor 4 comprises a greater number of pixel rows than the plurality of pixel rows 12 (8 pixel rows are shown in fig. 1 for the sake of clarity, but commercially available area sensors can have several hundred or several thousand pixel rows) and is adapted to limit the image region to be read to the plurality of pixel rows 12. In other words, the area image sensor 4 is adapted to read, in each of the successive exposures, a smaller number of pixel rows 12 than the total number of pixel rows of the area sensor 4. In the described embodiment, the region to be read can be configured as a configurable region of interest in the camera 3. This configuration possibility exists in most of the area sensors available and allows the area to be read to be limited to the plurality of pixel rows 12.
In the camera 3, in addition to a so-called aperture stop in the objective, which is not shown in the figure, a stop 8 (slit stop) with a slit-shaped stop opening 9 is arranged in front of the image sensor 4. The diaphragm is positioned in the vicinity of the space in front of the image sensor 4 and is implemented in the embodiment as a slit with rounded ends. By providing the diaphragm 8 with the slit-shaped diaphragm opening 9 in the camera 3 irrespective of the area sensor 4 used, undesirable image artifacts, such as scattered light, for example, at the sensor cover glass, can be reduced.
If the spatial position of the line image data set 11 to be rescanned with respect to the recorded line image data sets 10 lies, in two or more of the plurality of line image data sets 10 recorded at different times t0, t1, t2, etc., between two recorded line image data sets 10 or on one recorded line image data set 10 in each case, the rescanning unit is adapted to generate the rescanned line image data set 11 from these two recorded line image data sets 10 or this one recorded line image data set 10 of each of the two or more sets recorded at the different times t0, t1, t2, etc.
Such a rescanning of the recorded line image data sets 10 by means of a multiple scan is shown schematically and exemplarily in fig. 5 for an image sensor 4 with a global shutter. In this example, the rescan is performed as follows: for each of the two or more sets of the plurality of line image data sets 10 recorded at different times, a separate rescanned line image data set is determined, and the rescanned line image data set 11 is generated by combining the two or more rescanned line image data sets thus obtained. The combination can be performed, for example, by addition, averaging, weighted averaging or linear combination. For an image sensor 4 with a rolling shutter, a corresponding rescanning of the recorded line image data sets 10 by means of a multiple scan can be performed in a similar manner.
Hereinafter, an embodiment of a recording method for successively recording the object 2 moving relative to the camera 3 is exemplarily described with reference to a flowchart shown in fig. 6. In the described embodiment, the recording method is implemented by means of a recording system 1 which is schematically and exemplarily shown in fig. 1.
In step S101, a line image data set 10 of the object 2 to be recorded is recorded in free-running mode by means of the image sensor 4 of the camera 3.
In step S102, rescan information relating to the speed of the relative movement between the object 2 to be recorded and the camera 3 is received. This is achieved in the example by the receiving unit 5.
In step S103, the recorded line image data set 10 is rescanned in order to produce a rescanned line image data set 11, wherein the spatial position of the rescanned line image data set 11 with respect to the recorded line image data set 10 is correlated with the received rescan information. This is achieved in the example by the rescanning unit 6.
The image sensor 4 is an area sensor which comprises a plurality of pixel rows 12 and is suitable for recording a plurality of line image data sets 10 by means of the plurality of pixel rows 12 in each case for generating a rescanned line image data set 11.
In the embodiment shown in fig. 1, the camera 3 comprises a receiving unit 5 and a rescanning unit 6. In that case, the receiving unit 5 may be, for example, a suitable interface of the camera 3, and the received rescan information may comprise the speed v of the relative movement between the object 2 to be recorded and the camera 3. In an alternative embodiment, the camera 3 may not comprise the receiving unit 5 and the rescanning unit 6. For example, the recording system 1 for successively recording objects 2 moving relative to the camera 3 may comprise the camera 3 and a computer or another processing unit external with respect to the camera 3, said other processing unit comprising a receiving unit 5 and a rescanning unit 6. In that case, the line image data set recorded by the image sensor 4 of the camera 3 is provided to the external processing unit via a suitable interface, and the receiving unit 5 may be an interface of the external processing unit adapted to receive rescan information. In that case, the received rescan information may also include the speed v of the relative movement between the object 2 to be recorded and the camera 3. Furthermore, the rescan information may include an image repetition rate or an image frequency of the image sensor 4 in the free-run operation and/or a desired spatial resolution of the rescan. It is also possible, however, independent of whether the camera 3 comprises the receiving unit 5 and the rescanning unit 6, that two or more parameters have been correlated with one another in the rescan information, so that the parameters already comprise, for example, the spatial position of the rescanned line image data set 11 with respect to the recorded line image data set 10, which is determined from the speed v of the relative movement, the image repetition rate or image frequency of the image sensor 4 and the desired spatial resolution of the rescan.
In an alternative embodiment, the camera 3 may comprise a frame grabber (not shown in this figure), for example in the form of a frame grabber card, a frame grabber with a USB terminal, or the like, which is designed to generate a timing internal to the camera for recording "in free-run". Furthermore, the frame grabber may comprise a receiving unit 5 and a rescanning unit 6, such that a rescanning of the recorded line image data set 10 may be performed on the frame grabber to produce a rescanned line image data set 11.
In the embodiment shown in fig. 1, the image repetition rate or image frequency of the image sensor 4 in free-running operation and the desired spatial resolution can be configured. In other embodiments, only one of the parameters may be configurable, or none of the parameters may be configurable. In this case, the image repetition rate or image frequency preset for the image sensor 4 can be used to determine the spatial position of the rescanned line image data set 11 with respect to the recorded line image data set 10, and the spatial resolution of the image sensor 4 can be used, for example, as the desired spatial resolution of the rescan.
In the embodiment shown in fig. 1, the image sensor 4 is a monochrome image sensor. However, in other embodiments, the image sensor 4 may be a color image sensor having a mosaic filter. In this case, color images can also be recorded by means of the camera 3, wherein this can be achieved with relatively little effort by means of a mosaic filter. For this purpose, at least one further technical step is required compared to processing a line image data set of a monochrome image sensor, since in a color image sensor with a mosaic filter color interpolation is required, wherein the color values are determined substantially at intermediate pixel positions.
In the claims, the terms "having" and "including" do not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality.
A single unit or device may perform the functions of several of the elements listed in the embodiments. The fact that individual functions and/or elements are listed in different embodiments does not imply nor make advantageous the use of the combination of the described functions and/or elements.
The processes performed by one or more units or devices, such as receiving rescanning information relating to the speed of relative movement between the object to be recorded and the camera, rescanning the recorded line image dataset to produce a rescanned line image dataset, etc., may also be performed by a different number of units or devices. The described procedures may be implemented as program code of a computer program and/or corresponding hardware.
A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state storage medium, for example, sold together with or as part of other hardware. However, the computer program may also be sold in other forms, for example via a network or other telecommunication systems.
Reference signs in the claims shall not be construed as limiting the subject matter and scope of the claims to said reference signs.
In summary, a recording system for the successive recording of objects moving relative to a camera is described. The recording system includes: a camera, wherein the camera comprises an image sensor for recording line image data sets of the object to be recorded in free-running operation; a receiving unit for receiving rescan information related to the speed of the relative movement between the object to be recorded and the camera; and a rescanning unit for rescanning the recorded line image data sets to produce rescanned line image data sets, wherein the spatial position of a rescanned line image data set with respect to the recorded line image data sets depends on the received rescan information. The image sensor is an area sensor which comprises a plurality of pixel rows and is adapted to record, in each case, a plurality of line image data sets by means of the plurality of pixel rows for generating the rescanned line image data sets.
Claims (14)
1. A recording system (1) for successively recording objects (2) moving relative to a camera (3), comprising:
- the camera (3), wherein the camera (3) comprises an image sensor (4) for recording a line image data set (10) of the object (2) to be recorded in free-running operation,
-a receiving unit (5) for receiving rescan information related to a velocity (v) of a relative movement between the object (2) to be recorded and the camera (3), and
a rescanning unit (6) for rescanning the recorded line image data set (10) so as to produce a rescanned line image data set (11), wherein a spatial position of the rescanned line image data set (11) with respect to the recorded line image data set (10) is related to the received rescanning information,
wherein the image sensor (4) is an area sensor comprising a plurality of pixel rows (12) and adapted to record, in each case, a plurality of line image data sets (10) by means of the plurality of pixel rows (12) for generating the rescanned line image data set (11).
2. The recording system (1) as claimed in claim 1, wherein the rescan unit (6) is configured to perform the rescan as a function of an image repetition rate or an image frequency of the image sensor (4) in the free-run operation and a desired spatial resolution of the rescan, wherein the image repetition rate or the image frequency and/or the desired spatial resolution can preferably be configured.
3. The recording system (1) according to claim 1 or 2, wherein the image sensor (4) comprises an electronic shutter (7) adapted to start exposure of each of the plurality of pixel rows (12) simultaneously.
4. The recording system (1) as defined in claim 1 or 2, wherein the image sensor (4) comprises an electronic shutter (7) adapted to start the exposure of each of the plurality of rows of pixels (12) temporally staggered, wherein the rescanning unit (6) is adapted such that the rescanning of the recorded row image data set (10) is related to the temporal staggering of the start of the exposure of each of the plurality of rows of pixels (12).
5. The recording system (1) as defined in claim 4, wherein the recording system (1) is adapted to perform the successive recording in a forward run in which the temporal staggering of the start of exposure of each of the plurality of pixel rows (12) is performed opposite to the direction of movement of the imaging of the object (2) to be recorded on the image sensor (4) of the camera (3), or in a backward run in which the temporal staggering of the start of exposure of each of the plurality of pixel rows (12) is performed in the direction of movement of the imaging of the object (2) to be recorded on the image sensor (4) of the camera (3).
6. The recording system (1) according to any one of claims 1 to 5, wherein the plurality of pixel rows (12) is in the range of 4 to 128 pixel rows, preferably 4 to 64 pixel rows, still more preferably 8 to 32 pixel rows, most preferably 8 to 16 pixel rows.
7. The recording system (1) according to any one of claims 1 to 6, wherein the image sensor (4) comprises a greater number of pixel rows than the plurality of pixel rows (12) and is adapted to limit an image area to be read onto the plurality of pixel rows (12).
8. Recording system (1) according to any of claims 1 to 7, wherein the area to be read is configurable as a configurable area of interest in the camera (3).
9. The recording system (1) according to any one of claims 1 to 8, wherein in the camera (3) a diaphragm (8) with a slit-shaped diaphragm opening (9) is arranged in front of the image sensor (4).
10. The recording system (1) according to any one of claims 1 to 9, wherein, in the case that the spatial position of the line image data set (11) to be rescanned with respect to the recorded line image data sets (10) lies, in two or more of the plurality of line image data sets (10) recorded at different times (t0, t1, t2), between two recorded line image data sets (10) or on one recorded line image data set (10) in each case, the rescanning unit (6) is adapted to generate the rescanned line image data set (11) from these two recorded line image data sets (10) or this one recorded line image data set (10) of each of the two or more sets recorded at the different times (t0, t1, t2).
11. The recording system (1) according to any one of claims 1 to 10, wherein the image sensor (4) is a monochrome image sensor, or wherein the image sensor (4) is a color image sensor with a mosaic filter.
12. The recording system (1) according to any one of claims 1 to 11, wherein the camera (3) comprises the receiving unit (5) and the rescanning unit (6).
13. The recording system (1) according to any one of claims 1 to 12, further comprising a sensor (14) for detecting a speed (v) of the relative movement between the object (2) to be recorded and the camera (3).
14. A recording method for successively recording an object (2) moving relative to a camera (3), comprising:
recording (S101) a line image data set (10) of the object (2) to be recorded by means of an image sensor (4) of the camera (3) in free-running mode,
-receiving (S102) rescan information related to a velocity (v) of a relative movement between the object (2) to be recorded and the camera (3), and
-rescanning (S103) the recorded line image data set (10) in order to produce a rescanned line image data set (11), wherein the spatial position of the rescanned line image data set (11) with respect to the recorded line image data set (10) is related to the received rescan information,
wherein the image sensor (4) is an area sensor comprising a plurality of pixel rows (12), wherein a plurality of line image data sets (10) are recorded, in each case, by means of the plurality of pixel rows (12) for generating the rescanned line image data set (11).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102020127482.3 | 2020-10-19 | ||
DE102020127482.3A DE102020127482B4 (en) | 2020-10-19 | 2020-10-19 | Recording method and recording system for successively recording an object moving relative to a camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114390235A true CN114390235A (en) | 2022-04-22 |
Family
ID=80929240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111214649.XA Pending CN114390235A (en) | 2020-10-19 | 2021-10-19 | Recording method and recording system for successively recording objects moving relative to a camera |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114390235A (en) |
DE (1) | DE102020127482B4 (en) |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3971065A (en) | 1975-03-05 | 1976-07-20 | Eastman Kodak Company | Color imaging array |
JPH05504038A (en) | 1990-11-19 | 1993-06-24 | イーストマン コダック カンパニー | Scanning speed compensation |
US5608538A (en) | 1994-08-24 | 1997-03-04 | International Business Machines Corporation | Scan line queuing for high performance image correction |
GB2314987A (en) | 1996-07-05 | 1998-01-14 | Innovision Plc | Telecine apparatus |
US6606122B1 (en) | 1997-09-29 | 2003-08-12 | California Institute Of Technology | Single chip camera active pixel sensor |
US7164810B2 (en) | 2001-11-21 | 2007-01-16 | Metrologic Instruments, Inc. | Planar light illumination and linear imaging (PLILIM) device with image-based velocity detection and aspect ratio compensation |
US7425982B2 (en) | 2003-11-12 | 2008-09-16 | Euresys Sa | Method and apparatus for resampling line scan data |
DE102008016393B4 (en) | 2008-03-29 | 2009-11-19 | Basler Ag | Method for correcting image aberrations of electronic cameras |
DE102013000301A1 (en) | 2013-01-10 | 2014-07-10 | Basler Ag | Method and device for producing an improved color image with a color filter sensor |
DE102016112968B4 (en) | 2016-07-14 | 2018-06-14 | Basler Ag | Determination of color values for pixels at intermediate positions |
2020
- 2020-10-19 DE DE102020127482.3A patent/DE102020127482B4/en not_active Expired - Fee Related
2021
- 2021-10-19 CN CN202111214649.XA patent/CN114390235A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
DE102020127482A1 (en) | 2022-04-21 |
DE102020127482B4 (en) | 2022-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6014165A (en) | Apparatus and method of producing digital image with improved performance characteristic | |
US5157499A (en) | High-speed video camera using solid-state image sensor | |
US9621825B2 (en) | Camera system with multiple pixel arrays on a chip | |
US20030179418A1 (en) | Producing a defective pixel map from defective cluster pixels in an area array image sensor | |
JPH03502755A (en) | Photoelectric color image sensor | |
JP2007143131A (en) | Method and device for image signal processing | |
JP2009111813A (en) | Projector, image data acquisition method for projector, and imaging device | |
CN102668571A (en) | Sparse color pixel array with pixel substitutes | |
WO2019078335A1 (en) | Imaging device and method, and image processing device and method | |
JPH07143439A (en) | Picture image pickup device and picture processing unit | |
CN114390234A (en) | Recording method and recording system for successively recording objects moving relative to a camera | |
US5864362A (en) | High speed scanner for reading latent images in storage phosphors | |
CN114390235A (en) | Recording method and recording system for successively recording objects moving relative to a camera | |
US9282245B2 (en) | Image capturing apparatus and method of controlling the same | |
US7961224B2 (en) | Photon counting imaging system | |
JP3531035B2 (en) | Image capturing and recording device | |
JP2003309856A (en) | Imaging apparatus and method, recording medium, and program | |
JPH09219867A (en) | Still color picture image pickup device and its method | |
US7859563B2 (en) | System and method for synchronizing a strobe in video image capturing | |
US5726776A (en) | Method and arrangement for synchronizing the image recordings of monochrome and color recordings by means of light-sensitive sensors | |
US11902683B1 (en) | Method for forming a digital image | |
US20040100571A1 (en) | System and method for communicating information in an image capture device | |
JP3218157B2 (en) | Still image pickup device | |
JPH06303499A (en) | Image pickup device | |
CN117857935A (en) | Image sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| PB01 | Publication | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20220422 |
| WD01 | Invention patent application deemed withdrawn after publication | |