EP4349002A1 - Verfahren zur verarbeitung von pixeldaten, zugehörige vorrichtung und programm - Google Patents

Verfahren zur verarbeitung von pixeldaten, zugehörige vorrichtung und programm

Info

Publication number
EP4349002A1
Authority
EP
European Patent Office
Prior art keywords
image
images
sensors
exposure time
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22734211.0A
Other languages
English (en)
French (fr)
Inventor
Dominique GINHAC
Barthélémy HEYRMAN
Steven TEL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Universite de Bourgogne
Original Assignee
Universite de Bourgogne
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universite de Bourgogne filed Critical Universite de Bourgogne
Publication of EP4349002A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • H04N25/58Control of the dynamic range involving two or more exposures
    • H04N25/581Control of the dynamic range involving two or more exposures acquired simultaneously
    • H04N25/583Control of the dynamic range involving two or more exposures acquired simultaneously with different integration times

Definitions

  • the field of the disclosure is that of the acquisition of images by means of capture devices such as mobile communication terminals, digital still cameras, video cameras, microscopes, etc. More specifically, the disclosure relates to a method for acquiring images with high dynamic range, or HDR (for “High Dynamic Range”).
  • LDR: “Low Dynamic Range”.
  • the scene to be rendered is captured several times, by the same capture device, with different exposure times: short exposure times make it possible not to saturate the areas of the image with high luminosity, and long exposure times make it possible to detect a useful signal in low-light areas.
  • the different LDR images obtained are then processed to extract from each of them the best represented parts of the image, and these different parts are combined to construct an HDR image of the scene. It is generally accepted that this method of generating HDR images is costly in terms of time and the number of exposures to be made.
  • in non-destructive readout (NDRO) mode, the electrical charges accumulated by the photoelectric conversion elements of the sensor can be read without it being necessary to reset them: it is therefore possible, while the sensor is being exposed, to take several readings of the pixel signals, allowing electrical charges to continue to build up as the sensor is exposed to light.
  • the exploitation of this non-destructive reading mode, which makes it possible to carry out several readings of the signals associated with the pixels of the sensor during a single exposure time, offers an interesting solution both to the problem of the time cost of the previous methods of generating HDR images and to the problem of the appearance of artefacts. Indeed, it is possible to generate a high dynamic range image of a scene from several images obtained by several successive non-destructive readings of the sensor during the same exposure time.
  • patent document US 7,868,938 proposes a new type of image capture device, in which a first reader operates in destructive reading mode to read the charges accumulated by the photoelectric conversion elements of the sensor, reinitializing the pixel signals after each read, at the end of a standard exposure time, and a second reader operates in non-destructive read mode to obtain multiple NDRO images associated with different short exposure times, i.e. shorter than the standard exposure time.
  • the different NDRO images associated with short exposure times are used to predict whether some pixels of the image obtained by the first reader will be saturated, due to an overexposure of the corresponding parts of the scene to be photographed during the standard exposure time.
  • an HDR image is generated in which the saturated pixels of the image obtained by the first reader at the standard exposure time are replaced by the corresponding unsaturated pixels extracted from an NDRO image associated with a shorter exposure time.
  • This solution partially solves exposure problems, especially in the sense that overexposed pixels can be replaced by less exposed pixels, and the dynamic range of the resulting image is somewhat extended. But this method remains too computationally intensive, does not correct the problems of underexposure and above all requires at least two readings: one destructive and the other non-destructive. Furthermore, the artifact presence problem is not resolved.
  • the document FR3062009A1 proposes a technique which would make it possible to generate a wide dynamic range image at lower cost, both in time and in computing power, and which would have the advantage of being adaptive.
  • it is proposed to carry out several non-destructive readings of one and the same sensor, and to adapt the replacement of the pixels of a current image by pixels of a following image according to quality criteria.
  • This method is actually more efficient in terms of dynamic range width.
  • this method does not make it possible to render the stream in real time and still requires relatively significant resources, in particular for the signal-to-noise ratio calculations used to determine the exposure times.
  • this method requires the use of a sensor allowing non-destructive reading, a sensor which is not widely available on the market and which is considerably more expensive.
  • the method implemented in the patent document FR3062009A1 requires the use of an NSC1201 sensor from New Imaging Technologies, and is therefore reserved for particular uses.
  • the disclosure responds to this need by proposing a method for generating a video stream comprising a set of high dynamic range images, called HDR video stream, from a plurality of standard dynamic range images obtained by reading at least two image sensors each having an image production rate, each sensor comprising a plurality of pixels arranged in matrix form, each pixel being associated with a photoelectric conversion element for converting received light into electrical charges and accumulating said electrical charges during a light exposure time, the method comprising a plurality of high dynamic range image creation iterations comprising determining exposure times, reading the optical sensors and combining data from these sensors in an iterative mode of operation involving temporary memory zone management.
  • a method for generating a video stream comprising a set of high dynamic range images, called HDR video stream, from a plurality of standard dynamic range images obtained by reading at least two image sensors each having an image production rate, each sensor comprising a plurality of pixels arranged in matrix form, each pixel being associated with a photoelectric conversion element for converting received light into electrical charges and accumulating said electrical charges during a light exposure time.
  • such a method comprises a plurality of iterations for creating high dynamic range images comprising: determining at least three sensor exposure times comprising: a short exposure time TC, a long exposure time TL and an intermediate exposure time TI, such that TC < TI < TL; at least one iteration of a sensor reading, among said at least two sensors, delivering at least three successive images, as a function of said at least three sensor exposure times; a recording, within at least three dedicated memory areas, of said at least three successive images, each memory area being dedicated to one sensor exposure time among said at least three sensor exposure times; a generation of a high dynamic range image from information extracted from said at least three successive images recorded respectively within said at least three dedicated memory zones; adding said high dynamic range image to said HDR video stream.
  • said determination of said at least three sensor exposure times comprises a determination of the intermediate exposure time TI as a function of said short exposure time TC and of the long exposure time TL.
  • the short exposure time is calculated so that it produces, when reading a sensor among said at least two sensors, a standard dynamic range image whose percentage of pixels saturated in white is below a predetermined threshold.
  • the long exposure time is calculated so that it produces, when reading a sensor among said at least two sensors, a standard dynamic range image whose percentage of pixels saturated in black is below a predetermined threshold.
  • the intermediate exposure time is obtained as the square root of the product of the short exposure time and the long exposure time.
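  • for example (illustrative values, not taken from the disclosure): with TC = 1/1000 s and TL = 1/60 s, the intermediate time is TI = √(TC × TL) = √(1/60000) ≈ 1/245 s, which lies halfway between TC and TL on a logarithmic exposure scale.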
  • the long exposure time is shorter than the image production period (the inverse of the image production rate) of at least one of said at least two sensors.
  • the generation of a high dynamic range image in a current creation iteration, implemented on the basis of information extracted from the at least three current successive images, is carried out in parallel with the at least three sensor-reading iterations, among said at least two sensors, delivering the at least three successive images of the following high dynamic range image creation iteration.
  • the frame rate of the HDR stream is at least equal to the frame rate of at least one image sensor among said at least two image sensors.
  • the disclosure takes the form of a device, or a system, for generating a video stream comprising a set of high dynamic range images, called HDR video stream, from a plurality of standard dynamic range images obtained by reading at least two image sensors each having an image production rate, each sensor comprising a plurality of pixels arranged in matrix form, each pixel being associated with a photoelectric conversion element making it possible to convert received light into electric charges and to accumulate said electric charges during a light exposure time, characterized in that it comprises a calculation unit adapted to implement the steps of the HDR video stream generation method described above.
  • the various steps of the methods according to the disclosure are implemented by one or more software or computer programs, comprising software instructions intended to be executed by a data processor of an execution device according to the disclosure and being designed to control the execution of the various steps of the methods, implemented at the level of a communication terminal, of an electronic execution device and/or of a control device, within the framework of a distribution of the processing to be performed and determined by a scripted source code and/or a compiled code.
  • the disclosure also covers programs capable of being executed by a computer or by a data processor, these programs comprising instructions for controlling the execution of the steps of the methods as mentioned above.
  • a program may use any programming language, and be in the form of source code, object code, or intermediate code between source code and object code, such as in partially compiled form, or in any other desirable form.
  • the disclosure also relates to an information medium readable by a data processor, and comprising instructions of a program as mentioned above.
  • the information carrier can be any entity or device capable of storing the program.
  • the medium may include a storage medium, such as a ROM, for example a CD-ROM or a microelectronic circuit ROM, or else a magnetic recording medium, for example a removable medium (memory card), a hard drive or an SSD.
  • the information medium can be a transmissible medium such as an electrical or optical signal, which can be conveyed via an electrical or optical cable, by radio or by other means.
  • the program according to the disclosure can in particular be downloaded from a network of the Internet type.
  • the information carrier may be an integrated circuit in which the program is incorporated, the circuit being adapted to execute or to be used in the execution of the method in question.
  • the disclosure is implemented by means of software and/or hardware components.
  • the term "module” may correspond in this document to a software component, a hardware component or a set of hardware and software components.
  • a software component corresponds to one or more computer programs, one or more sub-programs of a program, or more generally to any element of a program or software capable of implementing a function or a set of functions, as described below for the module concerned.
  • Such a software component is executed by a data processor of a physical entity (terminal, server, gateway, set-top-box, router, etc.) and is likely to access the hardware resources of this physical entity (memories, recording media, communication bus, electronic input/output cards, user interfaces, etc.).
  • a hardware component corresponds to any element of a hardware assembly able to implement a function or a set of functions, according to what is described below for the module concerned. It can be a hardware component that is programmable or that has an integrated processor for executing software, for example an integrated circuit, a smart card, a memory card, an electronic card for executing firmware, etc.
  • FIG. 1 schematically describes the method implemented;
  • FIG. 2 describes two situations for processing pixel data from the sensors to produce an HDR stream with a rate equivalent to the rate of the SDR sensors;
  • FIG. 3 illustrates an architecture of a device capable of implementing a method which is the subject of the disclosure;
  • FIG. 4 illustrates the parallel implementation of the method that is the subject of the disclosure.
  • the method of producing an HDR video stream of the disclosure comprises combining, from at least three SDR video streams, the images composing these SDR streams. Indeed, since an SDR camera is not able to capture the full dynamic range of the scene, it inevitably loses detail in areas that are dimly lit (pixels saturated in black) and brightly lit (pixels saturated in white). The data thus acquired are then more difficult to use by artificial vision applications. There is therefore a strong need for wide dynamic range cameras that can be used in various application fields (e.g. video surveillance, autonomous vehicles or industrial vision), at a lower cost than existing solutions, and that can produce an HDR stream in real time.
  • the method developed by the inventors aims to respond to this problem. It is more particularly based on the use of standard, inexpensive sensors, and on the implementation of suitable management of a memory for temporary storage of pixel data from these sensors, this memory playing a pivotal synchronization role between real-time acquisition and real-time production.
  • at least two sensors are used in parallel, these two sensors making it possible to generate, simultaneously, two images, which are recorded within a temporary storage space, comprising at least three storage locations.
  • the generation of the images and their recording within the temporary storage space are carried out at least at the rate of generation of the images originating from the sensors.
  • the plurality of sensors is embedded within a plurality of cameras (a sensor within a camera).
  • these cameras are for example all of the same type.
  • the cameras are for example configured to produce a stream of images at 60 images/second. In doing so, each image produced by a camera is exposed for a maximum time (i.e. the integration time) before the sensor is read, by destructive reading.
  • the integration time is directly related to the brightness of the scene and can be less than a millisecond for sufficiently bright scenes.
  • the reading time is related to the sensor reading circuit technology.
  • for example, with an image period T = 1/60th of a second: if the sum of the integration time and the reading time is less than 1/60th of a second, a waiting time occurs (up to a maximum of 1/60th of a second).
  • if the integration time is too long, then the reading rate must be reduced, going to 1/30th of a second for example so as not to truncate the acquisitions. So, in the end, the sensor can have a period of 1/60th of a second and consequently, to preserve the rate, the integration time is between 0 and (1/60th of a second − reading_time). The same logic applies for sensors at 1/30th of a second.
  • the image production rate determines a maximum exposure time for each image before it is produced.
  • one objective of the proposed method being to deliver an HDR stream produced at the same rate as that of the cameras, it is therefore necessary to produce HDR images at the rate at which the images are produced by the cameras: the maximum exposure time of the images is therefore shorter than the image production period of the sensor.
  • the exposure time of each image is adjusted throughout the execution of the HDR stream production method to ensure that it matches the exposure needed for each image.
  • the method implemented comprises a plurality of global iterations for the creation of high dynamic range images comprising: a determination (D1) of at least three sensor exposure times comprising: a short exposure time TC, a long exposure time TL and an intermediate exposure time TI, such that TC < TI < TL; at least one iteration of a reading (D2) of sensors, among said at least two sensors, delivering at least three successive images (IC, II, IL), as a function of said at least three sensor exposure times (TC, TI, TL); the number of iterations of this step (D2) depends on the number of sensors available: two sensors for three images implies at least two iterations, three sensors for three images implies one iteration for each sensor; other configurations are explained below; a recording (D3), within at least three dedicated memory areas (ZM#1, ZM#2, ZM#3), of said at least three successive images (IC, II, IL), each memory area being dedicated to one sensor exposure time among said at least three sensor exposure times; a generation (D4) of a high dynamic range image from the three recorded images; and an addition (D5) of this image to the HDR stream, as sketched below.
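As an illustration only, the global D1–D5 iteration can be sketched as follows; the names (Sensor.capture, determine_tc_tl, merge_hdr) are hypothetical placeholders, and the two-sensor read pattern shown is just one of the configurations described above:

```python
import math

def generate_hdr_stream(sensors, determine_tc_tl, merge_hdr, n_iterations):
    """Sketch of iterations D1-D5 with three dedicated memory zones."""
    zones = {"TC": None, "TI": None, "TL": None}   # ZM#1, ZM#2, ZM#3
    hdr_stream = []
    for _ in range(n_iterations):
        # D1: determine the three exposure times, with TC < TI < TL
        tc, tl = determine_tc_tl()
        ti = math.sqrt(tc * tl)                    # intermediate time
        # D2 + D3: read the sensors and record each image in its zone
        # (two sensors for three images: sensor 0 is read twice)
        zones["TC"] = sensors[0].capture(exposure=tc)
        zones["TI"] = sensors[1].capture(exposure=ti)
        zones["TL"] = sensors[0].capture(exposure=tl)
        # D4: combine the three SDR images into one HDR image
        hdr = merge_hdr(zones["TC"], zones["TI"], zones["TL"], (tc, ti, tl))
        # D5: add the HDR image to the stream
        hdr_stream.append(hdr)
    return hdr_stream
```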
  • the method can be implemented so that, at any instant, an image acquired at the short time (IC), an image acquired at the intermediate time (II) and an image acquired at the long time (IL) are respectively present within said at least three dedicated memory zones (ZM#1, ZM#2, ZM#3).
  • the method can be implemented via two processes operating in parallel: a production process, comprising the iterations of steps D1 to D3, which ensures continuous production of images in the dedicated memory zones, and a stream generation process, which continuously uses the images present in the dedicated memory areas to implement the iterations of steps D4 and D5.
  • for example, by carrying out a different number of iterations in step D2: instead of three iterations, only two captures can be carried out (one capture on each sensor), filling the memory zones corresponding to each of the two captures (for example ZM#1, ZM#2); then, at the following global iteration, again only two captures are carried out, filling the memory zones corresponding to each of the two captures (for example ZM#2, ZM#3).
  • Other modes of implementation can also be envisaged depending on the number of cameras available in particular, as explained below.
  • the system for implementing this combination technique comprises at least two cameras, each camera being equipped with a sensor capable of capturing a scene at a given speed and resolution.
  • the system includes a processing unit, which is configured to extract, from these at least two cameras, the at least three SDR video streams.
  • the processing unit also comprises at least three memory zones, intended to receive at least three different images, each image coming from one of the three SDR video streams.
  • the processing unit performs a combination of the three images of the three different memory areas to produce an HDR image from the three SDR images of the three memory areas.
  • Each of the three images recorded in the three different memory areas comes from a different exposure time of a camera sensor.
  • when two cameras are used, a first image I1 is obtained for an exposure time d1, a second image I2 is obtained for an exposure time d2 and a third image I3 is obtained for an exposure time d3, such that d1 < d2 < d3.
  • it is ensured that d3 is shorter than the image production period of the cameras. For example, if the camera produces 60 images/second, it is ensured that: d1 < d2 < d3 ≤ 1/60 s.
  • the proposed production method makes it possible to obtain at least three images from the same scene to be captured and to provide three streams, which are processed in real-time to deliver a single HDR video stream.
  • what is described here with two cameras is also possible with three or more cameras, as will be described below.
  • the principle of the proposed method being to have, at each instant, within the three memory zones, an image (one image per memory zone), each of these three images having been captured with a different exposure time (a "short” time, a "long” time and an "intermediate” time, which is determined from the "short” time and the "long” time).
  • at least two ways of determining the exposure times can be used: either by calculating the short and long times so as to minimize the black and white saturations, and then determining an intermediate time (for example in the form sqrt(TC*TL)); or by calculating the intermediate time, for example with the camera's "auto-exposure" or another auto-exposure method to be implemented, and then determining TC (short time) and TL (long time) "empirically" by removing/adding one or more EVs ("exposure values").
  • the exposure times of each of the images are determined at least partially at each capture iteration.
  • “short”, “long” and “intermediate” exposure times are set; an image is obtained by the at least two cameras at each of these times (“short”, “long” and “intermediate”) and recorded in the three memory areas: a first memory area for the image captured at the “short” exposure time, a second memory area for the image captured at the “intermediate” exposure time and a third memory area for the image captured at the “long” exposure time.
  • These three images are processed to provide an HDR image and this HDR image is added to the HDR stream.
  • the values of the acquisition times are estimated at each new acquisition from statistical analyses carried out on the previous acquisitions. For three acquisitions, an estimate is made: the short time is estimated by minimizing the number of pixels saturated in white (< 10% for example); the long time is estimated by minimizing the number of pixels saturated in black (< 10% for example).
  • the intermediate exposure time is then estimated by calculation: it can for example be a simple calculation: square root of the product of the long time and the short time.
  • the long and short times are estimated based on several factors, including that of the frame rate of the HDR stream.
  • the creation of an HDR video stream requires continuous adaptation to variations in scene brightness, which need to be quickly assessed and taken into account. Therefore, the prior techniques in which the best images are selected from a plurality of available images (as in the patent document FR3062009) are not applicable to the creation of an HDR video stream, since they require the availability of an excess number of images, from which the selection of the images to be kept takes place.
  • a rapid evaluation of short and long exposure times is carried out.
  • the step of evaluating these exposure times is based on the histograms of the images IC (short exposure time) and IL (long exposure time) previously acquired.
  • the histograms make it possible to have a precise estimate of the distribution of the pixels. If the number of pixels saturated in white (pixels with a value greater than 240 for 8-bit images, for example) of the IC image is too large (more than 10 to 15%, for example), then the exposure time of the IC image at the next iteration should be decreased, to be able to capture more information in brightly lit areas.
  • conversely, if the number of white-saturated pixels is well below the threshold, the exposure time can be increased to avoid too great a difference with the intermediate image, which would cause an information “hole” in the dynamic range.
  • symmetrically, if the number of pixels saturated in black of the IL image is too large, the exposure time of the IL image at the next iteration should be increased.
  • if that number is well below the chosen threshold, the exposure time of the IL image at the next iteration must be reduced.
  • variations in exposure times from one iteration to another can be expressed in the form of exposure values (EV): an increase (respectively decrease) of one unit of exposure index (1 EV) results in a multiplication (respectively division) by two of the exposure time. If the number of saturated pixels is very high, it is possible to increase or decrease by 2 EV (a factor of 4 on the exposure time); conversely, if one wishes to refine the number of saturated pixels around the chosen thresholds, it is possible to limit the variations to 1/2 EV (one half), or even 1/3 EV (one third). A sketch of this adjustment is given below.
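By way of illustration, the histogram-driven adjustment described above can be sketched as follows; the saturation thresholds (240 for white, 15 for black on 8-bit images), the 10% target and the exact EV step policy are illustrative assumptions consistent with the examples above, and the function name is hypothetical:

```python
import numpy as np

def update_exposure(image, exposure, kind):
    """One exposure-update step from the image histogram (sketch).

    kind == "short": keep white-saturated pixels (value >= 240) near 10%.
    kind == "long":  keep black-saturated pixels (value <= 15) near 10%.
    1 EV <=> a factor of two on the exposure time; large excesses use
    2 EV steps, fine corrections use 1/2 EV, as described above.
    """
    target, tolerance = 0.10, 0.05
    if kind == "short":
        frac = float(np.mean(image >= 240))   # white saturation of IC
        sign = -1.0                           # too saturated -> shorter time
    else:
        frac = float(np.mean(image <= 15))    # black saturation of IL
        sign = +1.0                           # too saturated -> longer time
    if frac > 4 * target:
        ev = 2.0 * sign                       # strong correction: 2 EV
    elif frac > target + tolerance:
        ev = 1.0 * sign
    elif frac < target - tolerance:
        ev = -0.5 * sign                      # fine correction: 1/2 EV
    else:
        ev = 0.0
    return exposure * (2.0 ** ev)
```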
  • the long exposure time is adjusted according to the number of cameras (and therefore of sensors) used to supply the three memory zones. For example, if six sensors are available at 1/30 sec, the acquisitions of which are time-shifted, it is possible to increase the output rate.
  • the principle being that as soon as we have three images, we can do the processing in parallel with the other acquisitions.
  • the operational implementation conditions determine the strategy to adopt, in particular according to the cost of high-performance sensors and having a better acquisition speed (but more expensive) vs. a greater number of sensors (four, five or six) but individually less expensive.
  • Each processing follows an identical implementation (pixel registration, combination of SDR images) and identical processing times are illustrated for simplicity.
  • the times t1, t2 and t3 correspond to the maximum frame rate of the SDR cameras (for example 60 imgs/sec, 30 imgs/sec, etc.).
  • an HDR image is obtained from the first iteration, at the end of the time period from 0 to t1, and so on.
  • an HDR image is only obtained from time t'2, then t'3 and so on. So there is a slight lag when producing the first HDR image. This lag at start-up ultimately makes it possible to take advantage of a longer time to perform the HDR image creation processing without reducing the image production rate.
  • the offset between t1 and t'1 corresponds to the HDR image processing time, which initially exceeds the image production period of the camera (i.e. its “frame rate”).
  • such an HDR stream acquisition device, also called an HDR camera, comprises an acquisition subassembly (SSACQ) which firstly comprises N sensors (C1, ..., CN).
  • the sensors are connected to an acquisition module (MACQ) comprising two sub-modules: an exposure time programming sub-module (CtrlEC), which is used to set the exposure time of each sensor, and the acquisition sub-module (ACQ) itself, which is in charge of reading, for each sensor, the matrix of pixels of that sensor and of transmitting the acquired sets of pixel data to the storage unit management module (MMU).
  • the acquisition sub-module can program the sensor(s), launch the acquisition and then recover the image thus acquired by the corresponding sensor.
  • the storage unit management module receives the sets of pixel data from the acquisition module.
  • Each set of pixel data relates to an image acquired by a sensor.
  • the pixel data are provided by the acquisition module with an identifier, making it possible to determine the sensor of origin, the exposure time, or both.
  • the storage unit management module (MMU) stores the obtained pixel data sets in the memory areas (ZM1, ..., ZMN) according to the exposure time, the source sensor, or these two pieces of information combined. More particularly, regardless of the number of sensors, the inventors have developed a specific memory management scheme which makes it possible to permanently have at least three images stored in memory, each of them having a short, intermediate or long exposure.
  • image 1 (short time) is acquired in memory 1.
  • Image 2 (intermediate time) in memory 2
  • image 3 (long time) in memory 3.
  • at the next iteration, for the storage of image 4 (short time), the MMU module overwrites the oldest image (image 1 - short time).
  • for image 5 (intermediate time), it is image 2 that is overwritten (this management is sketched below).
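A minimal sketch of this "overwrite the oldest image of the same exposure class" management, with hypothetical names; each slot plays the role of one of the zones ZM#1 to ZM#3:

```python
class TripleBufferMMU:
    """Sketch of the memory management described above: one zone per
    exposure class; a new image overwrites the previous image of its
    class, so that a (short, intermediate, long) triplet of the most
    recent images is always available."""

    SLOTS = ("short", "intermediate", "long")  # ZM#1, ZM#2, ZM#3

    def __init__(self):
        self.zones = {slot: None for slot in self.SLOTS}

    def store(self, image, exposure_class):
        # e.g. image 4 (short) overwrites image 1 (short) in ZM#1
        self.zones[exposure_class] = image

    def ready(self):
        return all(img is not None for img in self.zones.values())

    def read_triplet(self):
        # read back the three images to feed the processing subassembly
        return tuple(self.zones[slot] for slot in self.SLOTS)
```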
  • the MMU module reads back the images in order to feed the processing sub-assembly (SSTRT) (see below).
  • the MMU module is implemented to synchronize the different acquisitions made in parallel. At each new acquisition, at least one of the three memories is refreshed with a new image, so that as in the previous case, the HDR generation is carried out at the acquisition speed.
  • the inventors have chosen to work in real time on the flow of data acquired by the sensors.
  • the processing operations carried out can be carried out in the acquisition stream, that is to say simultaneously with the reading of the new image and the rereading of the images already recorded.
  • the new image acquired by the sensor is transferred from the sensor to the MMU line by line.
  • the SSTRT processing module is capable of processing in parallel all the pixels of the row read with the pixels of the corresponding rows in the two other stored images.
  • the processing time for each pixel is much lower than the rate for supplying a complete line of pixels, which makes it possible to produce a new line of HDR pixels for each new line of pixels acquired.
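A sketch of this in-stream, line-by-line organization, assuming images are delivered row by row; merge_rows is a hypothetical placeholder for the per-pixel HDR combination:

```python
def process_incoming_image(new_rows, stored_a, stored_b, merge_rows):
    """Produce one HDR line per newly acquired line (sketch): each row
    received from the sensor is merged immediately with the corresponding
    rows of the two images already stored in the memory zones."""
    for y, row in enumerate(new_rows):   # rows arrive one after the other
        yield merge_rows(row, stored_a[y], stored_b[y])
```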
  • the HDR stream acquisition device also includes a processing subassembly (SSTRT).
  • This processing sub-assembly performs the processing of the images recorded in the memory zones, in a continuous manner, at the theoretical speed of capture of the frames by the sensors (e.g. 60images/sec, 30images/sec, etc.), in order to produce an HDR stream having the same rate as the production rate of the sensors.
  • This processing subassembly comprises, in this example embodiment, a registration module (DEG), the function of which is to carry out a possible registration ("deghosting") of pixels corresponding to moving objects, when such objects exist in the scene to be captured.
  • the registration module (DEG) uses the N (e.g. three) acquired images to estimate the movement of the objects within these N images. This estimate is made for all the pixels. There is necessarily more movement in the case of acquisition with two sensors, since the acquisitions are made sequentially (generally speaking, when one wishes to obtain n acquisitions and one has at most n−1 sensors, some acquisitions are sequential). In the multi-sensor case (more than two), movements are minimized (mainly motion blur, increasing with exposure time) due to simultaneous acquisition.
  • the N sets of raw pixels are transmitted directly to the HDR creation module (HDRC).
  • the HDR creation module uses the N streams in parallel to evaluate the HDR value of each of the pixels.
  • the method relies on the “Debevec and Malik” algorithm, which is adapted to the device so as to be more efficient in terms of processing time.
  • the Debevec method is based on the fact that a pixel of a visual scene has a constant irradiance value and that it is possible to estimate this irradiance from the values of the pixels obtained with different acquisition times and the transfer curve of the camera used.
  • the mathematical equations of Debevec's method require calculating the logarithm of the inverse of the camera's transfer function. According to the disclosure, in a real-time context, for reasons of efficiency, all the values of this logarithm have been precalculated for all possible pixel values (between 0 and 255 for an 8-bit sensor, between 0 and 1023 for a 10-bit sensor) and stored in a memory of the system, the calculation then being limited to a simple re-reading of a memory cell. Such an implementation makes it possible to ensure real-time processing of streams.
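A minimal sketch of this lookup-table approach, assuming a known inverse response function (a linear camera is used here purely for illustration) and the classic Debevec-Malik weighted average; function names are hypothetical:

```python
import numpy as np

def build_log_lut(inv_response, n_levels=256):
    """Precompute g(v) = ln(f^-1(v)) for every possible pixel value
    (256 values for an 8-bit sensor, 1024 for a 10-bit one), so that
    the per-pixel work at run time reduces to simple table lookups."""
    v = np.arange(n_levels)
    return np.log(np.maximum(inv_response(v), 1e-9))

def merge_debevec(images, times, lut, n_levels=256):
    """Debevec-Malik style merge: ln E = sum w(z)*(g(z) - ln t) / sum w(z)."""
    zmax = n_levels - 1
    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros_like(num)
    for img, t in zip(images, times):
        z = img.astype(np.int64)
        w = np.minimum(z, zmax - z).astype(np.float64)  # hat-shaped weight
        num += w * (lut[z] - np.log(t))
        den += w
    return np.exp(num / np.maximum(den, 1e-9))  # relative irradiance map

# illustrative usage, assuming a linear camera response
lut = build_log_lut(lambda v: (v + 1.0) / 256.0)
```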
  • On-screen display (using an AFF display module);
  • Raw output (to a communication network, using a suitable ETH module).
  • the global algorithms use a processing common to all the pixels, which simplifies their implementation in real time but to the detriment of the overall quality of the result obtained.
  • the inventors have selected an algorithm of the type described in Duan et al (2010), which they have adapted to the implementation conditions previously described.
  • an Ethernet network controller allowing output of uncompressed HDR streams at the speed at which they are produced is implemented.
  • Such a controller makes it possible in particular to be able to evaluate the quality of the production algorithms of the HDR stream with metrics.
  • FIG. 4 illustrates another example of implementation, in which the pivot function of the memory areas is repurposed to produce an HDR stream at a higher rate, substantially equal to that of the sensor of the SDR camera.
  • the memory zones storing the different images are used in a desynchronized manner, according to the programming of the camera exposure times carried out by the exposure time programming sub-module (CtrlEC) presented previously. This technique can also be used with two cameras, as explained previously.
  • a single sensor is used to produce three SDR streams, each stream having a rate of 20 images per second (i.e. 60 frames per second divided by three). Each exposure of each image is therefore less than 1/60th of a second.
  • an HDR video comprising at least 60 images per second is produced.
  • the images are recorded in the memory zone and the pixel registration and HDR image combination processing is carried out in real time as the images are read in the zones.
  • it should be noted that this mode of use of the memory zone pivot function (memories ZM#1 to ZM#3) is well suited to the implementation of the HDR stream production processing in a use case where one single SDR sensor is available to capture the stream.
  • an image I[1] at short time is captured by the sensor of the SSACQ acquisition subassembly. This image is stored in the ZM#1 memory area by the MMU module.
  • an image I[2] at intermediate time is captured by the sensor. This image is stored in memory area ZM#2.
  • an image I[3] at long time is captured by the sensor. This image is stored in memory area ZM#3.
  • the three memory zones each having an image (short time, intermediate time and long time), the processing subassembly recovers these three images in memory and performs the conversion into an HDR image (IHDR[123]).
  • the first HDR image is stored in memory area ZM#3.
  • the sensor of the SSACQ acquisition subassembly captures a new image I[4], at short time, which is stored in the memory area ZM#1 by the MMU module.
  • the three memory zones once again each having an image (short time I[4], intermediate time I[2] and long time I[3]), the processing subassembly recovers these three images in memory and again performs conversion to an HDR image (IHDR[423]).
  • the second HDR image was therefore obtained by recording at 1/60th of a second.
  • the sensor of the SSACQ acquisition subassembly captures a new image I[5], at intermediate time, which is stored in the memory zone ZM#2 by the MMU module.
  • the three memory zones once again each having an image (short time I[4], intermediate time I[5] and long time I[3]), the processing subassembly recovers these three images in memory and again performs conversion to an HDR image (IHDR[453]).
  • the third HDR image was therefore again obtained by recording at 1/60th of a second.
  • the sensor of the SSACQ acquisition subassembly captures a new image I[6], at long time, which is stored in the memory zone ZM#3 by the MMU module.
  • the three memory zones once again each having an image (short time I[4], intermediate time I[5] and long time I[6]), the processing subassembly recovers these three images in memory and again performs conversion to an HDR image (IHDR[456]).
  • the fourth HDR image was therefore again obtained by recording at 1/60th of a second. This process is continued throughout the capture and transformation into an HDR stream, and it indeed delivers an HDR stream at 60 frames per second, as simulated in the sketch below.
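For illustration, the rotation just described can be simulated as follows (a hypothetical sketch; image indices follow the I[1]..I[6] example above):

```python
from itertools import cycle

def single_sensor_schedule(n_frames=8):
    """Sketch of the single-sensor rotation: the sensor cycles through
    short/intermediate/long exposures; after the first three captures,
    every new frame yields a new HDR image at the sensor rate."""
    zones = {}  # "short" -> ZM#1, "intermediate" -> ZM#2, "long" -> ZM#3
    slots = cycle(("short", "intermediate", "long"))
    for i, slot in zip(range(1, n_frames + 1), slots):
        zones[slot] = i                      # I[i] overwrites its zone
        if len(zones) == 3:                  # a full triplet is available
            print(f"t={i}/60 s: HDR from I[{zones['short']}], "
                  f"I[{zones['intermediate']}], I[{zones['long']}]")

single_sensor_schedule()  # prints IHDR[123], IHDR[423], IHDR[453], IHDR[456], ...
```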
  • a problem that may arise is the presence of artifacts. It is therefore often necessary to perform pixel registration, which is not, or less, the case when two or more sensors are used.
  • at least two identical sensors are used to implement the technique described. These at least two sensors, although identical, are each programmed to operate at different capture rates. More particularly, it was explained above that the exposure time programming sub-module (CtrlEC) programs the maximum exposure time of the sensors to obtain a short time, a long time and an intermediate time depending on the long time. These exposure times are shorter (if not much shorter) than the image production period of the camera. For example, in the case of a production rate of 120 images per second, the short time can be 1/500th of a second, the intermediate time 1/260th of a second and the long time 1/140th of a second.
  • the at least two sensors are configured so that they produce SDR images at different production rates. More particularly, one of the sensors is configured to produce images at the rate of 120 images per second, while the other sensor is configured to produce images at the rate of 60 images per second. The second sensor produces images at a lower speed, but benefits from a longer exposure time.
  • the counterpart is that it can then more efficiently produce images in which the quantity of pixels saturated in black is less than the predetermined value (for example 10%).
  • the advantage of this solution is that fewer pixel registration operations are required to deal with artefacts.
  • the advantage also is to be able to use only two sensors: the sensor operating at a rate of 120 images per second makes it possible to perform two captures during the long-time capture time; the first sensor obtains the image at short time and the image at intermediate time while the second sensor obtains the image at long time.
  • when the three images are present in the three memory areas considered, they are retrieved and processed by the processing subassembly to produce a single HDR image, according to one or other of the situations in Figure 4.
  • the second type of HDR stream is clocked at 120 frames per second, that is to say the highest possible value, set by the production rate of the sensor configured to be the fastest. In this case, the method as described in FIG. 4 is implemented. Each new image obtained by the sensor whose production rate is 120 images per second is used immediately to calculate a new HDR image. In this situation, an image from the sensor clocked at 60 frames per second will be used to produce two images of the HDR stream, as sketched below.
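As a sketch of this two-sensor, two-rate scheduling (assumed rates of 120 and 60 images per second, taken from the example above; names hypothetical), each frame of the fast sensor triggers a new HDR image, so each 60 fps long-time image is used twice:

```python
def dual_rate_schedule(fast_fps=120, slow_fps=60, duration_s=1 / 30):
    """Sketch of the two-sensor variant: the fast sensor alternates short
    and intermediate captures at 120 fps; the slow sensor provides the
    long capture at 60 fps. Each fast frame triggers an HDR computation
    using the most recent long image."""
    n_fast = int(duration_s * fast_fps)
    for k in range(n_fast):
        t = k / fast_fps
        fast_kind = "short" if k % 2 == 0 else "intermediate"
        long_index = int(t * slow_fps)        # latest available long image
        print(f"t={t:.4f} s: fast sensor -> {fast_kind} frame {k}, "
              f"long frame {long_index} reused")

dual_rate_schedule()  # each long frame index appears for two HDR images
```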

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
EP22734211.0A 2021-06-02 2022-06-01 Verfahren zur verarbeitung von pixeldaten, zugehörige vorrichtung und programm Pending EP4349002A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2105800A FR3123734A1 (fr) 2021-06-02 2021-06-02 Procédé de traitement de données de pixels, dispositif et programme correspondant
PCT/EP2022/064985 WO2022253932A1 (fr) 2021-06-02 2022-06-01 Procede de traitement de données de pixels, dispositif et programme correspondant

Publications (1)

Publication Number Publication Date
EP4349002A1 true EP4349002A1 (de) 2024-04-10

Family

ID=77021502

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22734211.0A Pending EP4349002A1 (de) 2021-06-02 2022-06-01 Verfahren zur verarbeitung von pixeldaten, zugehörige vorrichtung und programm

Country Status (6)

Country Link
EP (1) EP4349002A1 (de)
JP (1) JP2024521366A (de)
KR (1) KR20240016331A (de)
CN (1) CN117795970A (de)
FR (1) FR3123734A1 (de)
WO (1) WO2022253932A1 (de)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4998056B2 (ja) 2006-05-11 2012-08-15 セイコーエプソン株式会社 撮像装置、撮像システム及び撮像方法
US9578223B2 (en) * 2013-08-21 2017-02-21 Qualcomm Incorporated System and method for capturing images with multiple image sensing elements
EP4270976A3 (de) * 2016-02-12 2024-01-10 Contrast, Inc. Vorrichtungen und verfahren für video mit hochdynamischem bereich
US9918018B2 (en) * 2016-04-04 2018-03-13 Illinois Tool Works Inc. Dynamic range enhancement systems and methods for use in welding applications
US9979906B2 (en) * 2016-08-03 2018-05-22 Waymo Llc Beam split extended dynamic range image capture system
FR3062009B1 (fr) 2017-01-17 2019-08-16 Centre National De La Recherche Scientifique Generation adaptative d’une image a grande gamme dynamique d’une scene, a partir d’une pluralite d’images obtenues par lecture non destructive d’un capteur d’image.

Also Published As

Publication number Publication date
FR3123734A1 (fr) 2022-12-09
CN117795970A (zh) 2024-03-29
WO2022253932A1 (fr) 2022-12-08
KR20240016331A (ko) 2024-02-06
JP2024521366A (ja) 2024-05-31

Similar Documents

Publication Publication Date Title
US11854167B2 (en) Photographic underexposure correction using a neural network
US9077911B2 (en) Multi-exposure video
FR2882160A1 (fr) Procede de capture d'images comprenant une mesure de mouvements locaux
EP0347984A1 (de) Verarbeitungssystem für Fernsehbilder mit Bewegungseinschätzer und verminderter Datenrate
EP3571834B1 (de) Adaptive erzeugung eines bildes mit hohem dynamikbereich einer szene basieren auf einer vielzahl von bildern, die durch zerstörungsfreies lesen eines bildsensors gewonnen werden
WO2016202926A1 (fr) Procédé et dispositif de production d'une image numérique
CA2889811A1 (fr) Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques
JP2023548748A (ja) フレーム処理および/またはキャプチャ命令システムおよび技法
Loke Astronomical image acquisition using an improved track and accumulate method
FR3052949B1 (fr) Procede et systeme de prise de vues a l'aide d'un capteur virtuel
EP4349002A1 (de) Verfahren zur verarbeitung von pixeldaten, zugehörige vorrichtung und programm
FR2881599A1 (fr) Procede et dispositif pour remplacer des pixels defectueux dans des cameras fpa
EP2221727B1 (de) Vorrichtung und Verfahren zum Verarbeiten von digitalen Daten
EP3918576A1 (de) Verfahren zur dynamischen dreidimensionalen bildgebung
FR2968878A1 (fr) Procede et dispositif pour generer des images comportant du flou cinetique
FR3078427A1 (fr) Detection dynamique de lumiere parasite dans une image numerique
CN113347490B (zh) 视频处理方法、终端及存储介质
US20230370727A1 (en) High dynamic range (hdr) image generation using a combined short exposure image
CN118317200A (en) Image processing method, device and apparatus
EP0561674B1 (de) Verfahren zur Korrektur von Pixelnichtuniformitäten eines Festkörpersensors und Vorrichtung dafür
FR3006500A1 (fr) Capteur cmos a photosites standard
FR3140234A1 (fr) procédé de traitement de paramétrage, dispositif, système et programme correspondant
FR3002715A1 (fr) Procede de production d'images et camera a capteur lineaire
TW202416719A (zh) 用於產生組合圖像的圖像壓縮
FR2858087A1 (fr) Procede de simulation numerique d'un rendu de support d'images

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231124

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR