EP2920956A1 - Method and device for capturing and constructing a stream of panoramic or stereoscopic images - Google Patents

Method and device for capturing and constructing a stream of panoramic or stereoscopic images

Info

Publication number
EP2920956A1
Authority
EP
European Patent Office
Prior art keywords
image
capture
panoramic
stereoscopic
final
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13808073.4A
Other languages
German (de)
English (en)
French (fr)
Inventor
Richard Ollier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AVINCEL GROUP Inc
Original Assignee
GIROPTIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GIROPTIC filed Critical GIROPTIC
Publication of EP2920956A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/211Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/282Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Definitions

  • the present invention relates to a method and a device for capturing and constructing a stream of panoramic or stereoscopic images.
  • This stream of panoramic or stereoscopic images may be recorded, transmitted or broadcast in the form of a film or processed to extract from the stream one or more static panoramic or stereoscopic images.
  • each image-capture device comprising an image sensor, for example of the CCD or CMOS type, coupled to optical means (an objective) for projecting the image of a scene onto the image sensor.
  • the optical axes of the image capturing devices are oriented in different directions, and the optical fields of view of the image capturing devices may overlap to cover the entire field of the panoramic image.
  • the international patent application WO 2012/032236 discloses an example of a particularly compact optical device, comprising three image-capture devices, designated "optical groups", and allowing one-shot capture of panoramic images over a 360° field of view.
  • the terms "panoramic image" are to be taken in their broadest sense: they are not limited to an image captured over a 360° field, but more broadly cover any image constructed over a field wider than the optical field covered by each image-capture device used to capture the panoramic image.
  • each image-capture device acquires an image of a scene, in the form of a matrix of pixels, over a limited optical field, and the images are then transmitted to external digital processing means which digitally "stitch" the images together at their overlapping areas so as to produce a final panoramic image.
  • each pixel matrix representing an image captured by an image-capture device results from a two-dimensional projection of the 3D surface of the sphere portion "seen" by the image-capture device. This two-dimensional projection depends on each image-capture device, and in particular on the optical characteristics of the objective of the image-capture device and on its spatial orientation ("Yaw", "Pitch" and "Roll") when capturing the image.
  • this digital stitching can be performed automatically, as described for example in international patent application WO 2011/037964 or in US patent application US 2009/0058988, or semi-automatically with manual assistance, as described for example in international patent application WO 2010/01476.
  • the abovementioned digital stitching of the images requires computation time which is detrimental to the capture and real-time rendering of a stream of panoramic images in the form of a film.
  • the present invention aims generally to propose a new technical solution for capturing and constructing a stream of panoramic or stereoscopic images by means of one or more image-capture devices.
  • the new solution of the invention makes it possible to accelerate the digital processing times, and thus facilitates the capture and construction in real time of a flow of panoramic or stereoscopic images.
  • the new solution of the invention overcomes the aforementioned disadvantage resulting from the use of sensors having different and independent optical means, and in particular makes it easier to obtain panoramic or stereoscopic images of better quality.
  • the stream of panoramic or stereoscopic images may for example be recorded, transmitted or broadcast in the form of a film, or be processed later to extract from this stream one or more static panoramic or stereoscopic images.
  • the invention thus has as its first object a method for capturing and constructing a stream of panoramic or stereoscopic images of a scene, during which several successive operations of capturing at least two different images of the scene, in the form of pixels, with or without overlap between the images, are carried out by means of at least one image-capture device (C1), the successive capture operations being clocked at a frequency (F) which defines a capture time (T) between the start of two successive capture operations; for each capture operation, (a) the pixels of each captured image are digitally processed so as to form a final panoramic or stereoscopic image from said pixels with a processing time less than or equal to said capture time (T), and (b) a previously formed final panoramic or stereoscopic image is output for a duration less than or equal to said capture time (T); the digital processing (a) of each pixel of each captured image consists at least in keeping or discarding said pixel and, where the pixel is kept, in assigning to it one or more positions in the final panoramic or stereoscopic image with a predefined weighting coefficient (W) for each position in the final panoramic or stereoscopic image.
  • Another object of the invention is a device for capturing and constructing a stream of panoramic or stereoscopic images.
  • this device comprises one or more image-capture devices (C1), which allow the capture of at least two different images in the form of a set of pixels, and electronic processing means which make it possible to construct a panoramic or stereoscopic image from the captured images;
  • the electronic processing means make it possible, by means of the one or more image-capture devices, to carry out several successive operations of capturing at least two different images of a scene, in the form of pixels, with or without overlap between the images, with the successive capture operations clocked at a frequency (F) which defines a capture time (T) between the start of two successive capture operations;
  • the electronic processing means are capable, for each capture operation, of (a) digitally processing the pixels of each captured image so as to form a final panoramic or stereoscopic image from said pixels with a processing time less than or equal to said capture time (T), and (b) outputting, for a duration less than or equal to said capture time (T), a previously formed final panoramic or stereoscopic image.
  • the invention also relates to a method for capturing and constructing a stream of panoramic or stereoscopic images of a scene, characterized in that several successive operations of capturing at least two different images of the scene, in the form of pixels, with or without overlap between the images, are carried out by means of at least one image-capture device (C1); in that, during the image-capture operations, the pixels of the captured images are digitally processed to form panoramic or stereoscopic images and a stream of panoramic or stereoscopic images is generated; and in that the digital processing of each pixel of each captured image consists at least in keeping or discarding said pixel and, where the pixel is kept, in assigning to it one or more positions in the final panoramic or stereoscopic image with a predefined weighting coefficient (W) for each position in the final panoramic or stereoscopic image.
  • C1: image-capture device
  • W: predefined weighting coefficient
  • the invention also relates to a device for capturing and constructing a stream of panoramic or stereoscopic images, characterized in that it comprises one or more image-capture devices (C1), which allow the capture of at least two different images in the form of a set of pixels, and electronic processing means which make it possible, by means of the one or more image-capture devices, to carry out several successive operations of capturing at least two different images of a scene, in the form of pixels, with or without overlap between the images, and which are capable, during the image-capture operations, of digitally processing the pixels of the captured images in order to form panoramic or stereoscopic images and to generate a stream of panoramic or stereoscopic images, and in that the digital processing of each pixel of each captured image consists at least in keeping or discarding said pixel and, where the pixel is kept, in assigning to it one or more positions in the final panoramic or stereoscopic image with a predefined weighting coefficient (W) for each position in the final panoramic or stereoscopic image.
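To make the per-pixel processing described in these claims more tangible, here is a minimal sketch in Python. It assumes a hypothetical, precomputed remapping structure; the names (`process_pixel`, `remap`) and the dictionary layout are illustrative only and are not taken from the patent.

```python
# Minimal sketch of the claimed per-pixel processing: each captured pixel is either
# discarded or assigned one or more positions in the final image, each with a
# predefined weighting coefficient W. Names and data layout are illustrative assumptions.

def process_pixel(src_xy, value, remap):
    """remap maps a source pixel (x, y) to a list of (x_final, y_final, W) entries,
    or to None when the pixel is to be discarded."""
    targets = remap.get(src_xy)
    if targets is None:
        return []                                            # pixel discarded
    return [(xf, yf, w * value) for xf, yf, w in targets]    # weighted contributions

# Example: a kept pixel contributing to two positions of the final image
remap = {(8, 1): [(9, 1, 0.5), (10, 1, 0.5)]}
print(process_pixel((8, 1), 200, remap))   # [(9, 1, 100.0), (10, 1, 100.0)]
```

Because each pixel is handled exactly once as it is captured, the per-operation processing cost is bounded by the number of captured pixels, which is what allows the processing time to stay below the capture time (T).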
  • the subject of the invention is also a method for capturing and constructing a stream of panoramic or stereoscopic images of a scene, during which several successive operations of capturing at least two different images of the scene, in the form of pixels, with or without overlap between the images, are carried out by means of at least one image-capture device; each image-capture device allows the capture of an image in the form of a set of pixels and outputs, for each captured image, a stream of pixels synchronized at least by a first clock signal (H_captor).
  • H_captor: first clock signal
  • Each pixel of each captured image is digitally processed so as to generate a final panoramic or stereoscopic image from said pixels in the form of a stream of pixels synchronized by at least a second clock signal (H).
  • the invention also relates to a device for capturing and constructing a stream of panoramic or stereoscopic images, said device comprising one or more image-capture devices that make it possible to carry out several successive operations of capturing at least two different images of a scene, in the form of pixels, with or without overlapping between the images, and electronic processing means which make it possible to construct a stream of panoramic or stereoscopic images from the captured images .
  • Each image capture device is capable of outputting, for each captured image, a stream of pixels synchronized at least by a first clock signal (H_captor).
  • the electronic processing means are adapted to digitally process each pixel of each captured image, thereby generating a final panoramic or stereoscopic image from said pixels in the form of a stream of pixels synchronized by at least a second clock signal (H).
  • the invention also relates to a device for capturing and constructing at least one panoramic or stereoscopic image, said device comprising one or more image-capture devices (C1) that enable the capture of at least two different images, with or without overlap between the images, each image-capture device (C1) being able to deliver a stream of pixels for each captured image, and electronic processing means which make it possible to construct, during the image-capture operations, a panoramic or stereoscopic image from the pixel streams of the captured images.
  • the electronic processing means are adapted to process each pixel of the pixel stream of each captured image by keeping or discarding said pixel and, where the pixel is kept, assigning to it one or more different positions in the final panoramic or stereoscopic image with a predefined weighting coefficient (W) for each position in the final panoramic or stereoscopic image.
  • FIG. 1 is a block diagram of an example of an electronic architecture of a device of the invention.
  • FIG. 5 illustrates an example of geometric correspondence between a pixel P_i,j of the final panoramic image and the pixel matrix captured by an image sensor.
  • FIGS. 6A to 6I schematize the different remapping cases in the particular case of a RAW-type image.
  • FIGS. 7A to 7D illustrate various examples of remapping a line of a sensor on a panoramic image.
  • FIG. 8 illustrates a particular example of the result of remapping three images to form a final panoramic image.
  • FIG. 1 shows a particular example of device 1 according to the invention, which enables the capture and construction of panoramic images.
  • this device 1 has three image-capture devices C1, C2, C3, for example of the CCD or CMOS type, which each allow the capture of an image in the form of a matrix of pixels, and electronic processing means 10, which make it possible to construct a panoramic image from the pixels delivered by the image sensors C1, C2, C3.
  • each image-capture device C1, C2, C3 comprises an image sensor, for example of the CCD or CMOS type, coupled to optical means (an objective) comprising one or more lenses aligned with the image sensor, in order to focus the light rays onto the image sensor.
  • the optical axes of the image-capture devices C1, C2, C3 are oriented in different directions, and their optical fields cover the entire field of the final panoramic image, preferably with an overlap of the optical fields.
  • the terms "panoramic image" should be taken in their broadest sense: they are not limited to a panoramic image constructed over a 360° field, but more generally cover an image constructed over a field larger than the optical field covered by each image-capture device used for image capture.
  • C1, C2, C3 may for example constitute the three optical groups of the compact optical device described in international patent application WO 2012/032236, which allows one-shot capture of panoramic images.
  • the device 1 of the invention is portable equipment, so that it can easily be transported and used in various places.
  • the electronic processing means 10 deliver a base clock H10, generated for example from a quartz oscillator, which is used to clock the operation of the image sensor of each image-capture device C1, C2, C3.
  • the image sensor of each image-capture device C1, C2, C3 delivers, for each captured image, a stream of pixels on a "Pixels" data bus, synchronized by a first clock signal ("H_captor") generated by each image sensor from the base clock signal H10, and by two signals, "Line Valid" and "Frame Valid".
  • the clock signals ("H_captor") generated by each image-capture device C1, C2, C3 more particularly have the same frequency.
  • the electronic processing means 10 make it possible to construct a panoramic image from the pixels delivered by the image sensors of the image-capture devices C1, C2, C3 and, in a manner comparable to the image-capture devices C1, C2, C3, output on a "Pixels" data bus a stream of pixels representative of the final panoramic image.
  • the size of the "Pixels" data bus of the electronic processing means 10 may be the same as or different from that of the "Pixels" data buses of the image-capture devices C1, C2, C3, and is preferably greater.
  • the "Pixels" data buses of the image-capture devices C1, C2, C3 are 8 bits wide and the "Pixels" data bus of the electronic processing means 10 is 16 bits wide.
  • the stream of pixels generated by the electronic processing means 10 is synchronized by a second clock signal ("H"), which is generated by the electronic processing means 10 from the base clock signal, and by two signals, "Line Valid" and "Frame Valid", which are generated by the electronic processing means 10.
  • H: second clock signal
  • FIG. 2 illustrates a particular and non-limiting example, according to the invention, of the synchronization of the aforementioned sensor signals.
  • the data transiting on the data buses "Pixels" are not represented.
  • the rising edge of the "Frame Valid" signal of each capture device Ci, C2 , C3 synchronizes the start of the transmission, on the "Pixel" data bus of each capture device Ci, C2 , C3 , pixels of an image captured by the capture device Ci, C2 , C3.
  • the falling edge of the signal "Frame Valid" of each capture device Ci, C2 , C3 marks the end of the transmission of the pixels, on the data bus "Pixels", of an image captured by said capture device Ci, C2 , C3.
  • These rising (respectively downstream) fronts of the "Frame Valid" signals delivered by the capture devices Ci, C2 , C3 are slightly offset temporally.
  • the "Line Valid" signal of each capture device Ci, C2 , C3 is synchronized on each rising edge of the signal “Frame Valid” and marks the beginning of the transmission of a line of pixels of the image. Each falling edge of the "Line Valid” signal marks the end of the transmission of a pixel line of the image.
  • the pixels of each image transmitted on each "Pixels" data bus of the three image-capture devices C1, C2, C3 are sampled in parallel by the electronic processing means 10, respectively by means of each clock signal "H_captor" delivered by each image-capture device C1, C2, C3.
  • the rising edge of the "Frame Valid" signal delivered by the electronic processing means 10 synchronizes the start of the transmission, on the "Pixels" data bus of the electronic processing means 10, of a final panoramic image constructed from the pixels delivered by the image-capture devices C1, C2, C3.
  • this rising edge is generated automatically by the electronic processing means 10 from the rising edges of the "Frame Valid" signals delivered by the image-capture devices C1, C2, C3, more particularly by being generated on detection of the rising edge generated last, that is to say, in the particular example of FIG. 2, the rising edge of the "Frame Valid" signal delivered by the image-capture device C1.
  • the falling edge of the "Frame Valid" signal delivered by the electronic processing means 10 synchronizes the end of the transmission, on the "Pixels" data bus of the electronic processing means 10, of a final panoramic image constructed from the pixels delivered by the image-capture devices C1, C2, C3.
  • the "Line Valid" signal delivered by the electronic processing means 10 is synchronized on each rising edge of the "Frame Valid" signal delivered by the electronic processing means 10, and marks the beginning of the transmission of a line of pixels of the panoramic image. Each falling edge of the "Line Valid" signal delivered by the electronic processing means 10 marks the end of the transmission of a line of pixels of the panoramic image.
  • the writing of the pixels of each panoramic image on the data bus "Pixels" of the electronic processing means 10 is synchronized by the clock signal "H", which is generated by the electronic processing means 10, and which can be used by another external electronic device (for example the device 11) for reading these pixels on said data bus.
  • the clock signal "H" delivered by the electronic processing means 10 may be synchronous or asynchronous with the clock signals "H_captor" delivered by the image sensors C1, C2, C3.
  • the frequency of the clock signal "H" may be equal to or different from the frequency of the clock signals "H_captor" delivered by the image sensors C1, C2, C3.
  • the frequency of the clock signal "H" is greater than the frequency of the clock signals "H_captor" delivered by the image sensors C1, C2, C3, as illustrated in FIG. 2.
  • the time interval (t) is the time interval separating two successive rising edges of the "Frame Valid" signal from the image-capture device C1, that is to say the image-capture device that first transmits the pixels on its "Pixels" data bus.
  • during said time interval (t) separating the start of two successive capture operations, the electronic processing means 10 carry out the digital processing (a) of the pixels of the captured images and the generation (b) of a final panoramic image.
  • the flow of successive panoramic images is generated in real time by the electronic processing means at the same rate as the successive operations of image capture.
  • the capture time T of each time interval (t) between two successive image capture operations is 40ms, which corresponds to a capture frequency F of 25 Hz, and the electronic processing means also generate 25 panoramic images per second (a panoramic image every 40 ms).
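As a quick sanity check on these figures, and on why the output clock "H" may need to be faster than "H_captor", the arithmetic below relates the capture time, the capture frequency and the pixel rates involved. The sensor and panorama resolutions are illustrative assumptions, not values given by the patent.

```python
# Illustrative arithmetic only: the sensor resolutions and the panorama size below are
# assumed for the example, not specified by the patent.

T = 0.040                      # capture time between two capture operations: 40 ms
F = 1.0 / T                    # capture frequency: 25 Hz (25 panoramic images per second)

sensor_w, sensor_h = 1280, 960     # assumed resolution of each of the three sensors
pano_w, pano_h = 3072, 1024        # assumed resolution of the final panoramic image

# Each sensor must stream its pixels within one capture period:
h_captor_min = sensor_w * sensor_h / T        # minimum H_captor pixel rate (pixels/s)

# The processing means must stream the whole panorama within the same period,
# which is why the frequency of "H" may exceed that of "H_captor":
h_min = pano_w * pano_h / T                   # minimum "H" pixel rate (pixels/s)

print(F, h_captor_min / 1e6, h_min / 1e6)     # 25.0 Hz, ~30.7 Mpixels/s, ~78.6 Mpixels/s
```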
  • the capture time T (duration of each time interval (t) between two successive operations of capturing images) will depend on the technology of the image-capture devices C1, C2, C3. In practice, the capture time T will preferably be less than or equal to 1 s, and more preferably still less than or equal to 100 ms.
  • the final panoramic image generated during each time interval (t), which separates the start of two successive capture operations, is derived from the digital processing (a) of the pixels carried out during this same time interval (t).
  • each successive panoramic image is generated in real time and substantially simultaneously with the capture of the images that were used to build this panoramic image, and before the next operation of capturing the images that will be used to build the next panoramic image.
  • in a variant, the final image generated during each time interval (t), which separates the start of two successive capture operations, is derived from the digital processing (a) of the pixels carried out during an earlier time interval (t), for example the previous time interval (t).
  • each successive panoramic image is generated in real time and with a slight time shift with respect to the capture of the images that were used to build this panoramic image.
  • each panoramic image can start (rising edge of the "Frame Valid" signal delivered by the electronic processing means 10) during a given capture cycle (N), and terminate (falling edge of the "Frame Valid" signal delivered by the electronic processing means 10) during the next capture cycle (N+1).
  • the duration between the rising edge and the next falling edge of the "Frame Valid” signal delivered by the electronic processing means 10 is less than or equal to the capture time T.
  • the pixel processing (a) performed for each capture operation may be temporally shifted with respect to the capture cycle.
  • the pixel processing time of all the images captured during a capture operation to form a final panoramic image is less than or equal to the capture time T.
  • the processing (a) of the pixels carried out to form a final panoramic image from the images captured during a capture cycle N can thus be performed by the electronic processing means 10 during a subsequent capture cycle, for example capture cycle N+1.
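The sketch below is one way to picture this pipelined variant: the images captured during cycle N are processed and output during cycle N+1. In real hardware the capture, processing and output run concurrently within each capture period; this sequential loop, and the placeholder callables it takes, are illustrative assumptions only.

```python
# Behavioural sketch of the pipelined variant: the panorama built from the images of
# cycle N is produced during cycle N+1. capture(), process() and output() are placeholders
# for the operations described in the text; in hardware they overlap within each period T.

def run_pipeline(capture, process, output, n_cycles):
    pending = None                       # images captured during the previous cycle
    for cycle in range(n_cycles):
        images = capture(cycle)          # capture operation of cycle N
        if pending is not None:
            output(process(pending))     # panorama of cycle N-1, finished one cycle later
        pending = images
```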
  • the electronic processing means 10 constitute a programmed electronic data-processing unit which may, independently of the invention, be implemented by means of any known type of electronic circuit or set of electronic circuits, for example in the form of one or more programmable circuits of the FPGA type and/or one or more specific circuits of the ASIC type, or of a programmable processing unit whose electronic architecture implements a microcontroller or a microprocessor.
  • the flow of successive panoramic images delivered in the form of sets of pixels by the electronic processing means 10 is processed by additional electronic processing means 11, which include, for example, a DSP-type circuit and which make it possible, for example, to record in a memory and/or to display in real time on a screen a dynamic stream of panoramic images in the form of a film.
  • the additional electronic processing means 11 may be designed to process the flow of successive panoramic images delivered by the electronic processing means 10, by extracting from this stream one or more panoramic images.
  • each image-capture device C1, C2, C3 comprises "fisheye" objective optical means, associated with a capture matrix, and each captured image is characterized by three spatial-orientation parameters, commonly called "Yaw", "Pitch" and "Roll", which are specific to the spatial orientation of said image-capture device when capturing the image.
  • a "fisheye" lens has a useful circular central detection surface (gray and white areas in FIG. 3), and the useful pixels of the image captured by the image sensor result, in a manner known per se, from a two-dimensional projection of only a portion (FIG. 3: 864 pixels out of 900 pixels) of the detection surface of the image-capture device.
  • each matrix of pixels representing an image captured by an image-capture device C1, C2, or C3 results from a two-dimensional projection of the 3D surface of the sphere portion "seen" by the image-capture device C1, C2, or C3.
  • this two-dimensional projection depends on each image-capture device C1, C2, or C3, and in particular on the optical means of the image-capture device C1, C2, or C3 and on the spatial orientation ("Yaw", "Pitch" and "Roll") of the image-capture device C1, C2, or C3 when capturing the image.
  • FIG. 4 shows a matrix of pixels corresponding to an image captured by an image-capture device C_i (for example one of the image-capture devices C1, C2 or C3 of FIG. 1).
  • the black pixels correspond to the pixels located outside the useful circular central part of the "fisheye" objective of the image-capture device C_i.
  • each pixel of this image captured by means of the image-capture device C_i results from an operation called "mapping", which corresponds to the aforementioned two-dimensional projection of the 3D surface of the sphere portion "seen" by the "fisheye" lens of the image-capture device C_i, and which is specific to this image sensor C_i.
  • the useful pixels of each image captured by each sensor C_i are remapped into the final panoramic image, at least a portion of said pixels preferably undergoing, during this remapping, a new two-dimensional projection that is different from their two-dimensional projection in the image of the image-capture device C_i from which they originate.
  • a single virtual panoramic image-capture device is thus digitally reconstructed from the image-capture devices C1, C2, C3.
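To make this remapping concrete, the sketch below derives, for one sensor, which source position each pixel of the final image would draw from. It assumes an equidistant fisheye model, an equirectangular final panorama and a yaw/pitch/roll rotation; none of these modelling choices, nor the function names, are prescribed by the patent, which only requires that some predefined correspondence exist.

```python
# Sketch of how a remapping could be derived for one "fisheye" sensor, assuming an
# equidistant fisheye model and an equirectangular final panorama. This is only an
# illustration of the principle, not the patent's projection.
import math

def rotation(yaw, pitch, roll):
    """3x3 rotation matrix from yaw/pitch/roll in radians (Z-Y-X convention, assumed)."""
    cy, sy, cp, sp, cr, sr = (math.cos(yaw), math.sin(yaw), math.cos(pitch),
                              math.sin(pitch), math.cos(roll), math.sin(roll))
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def pano_to_sensor(i, j, pano_w, pano_h, sensor_w, sensor_h, fov, R):
    """Return the (possibly fractional) sensor position seen by panorama pixel (i, j),
    or None if the corresponding ray falls outside the sensor's optical field."""
    lon = (j + 0.5) / pano_w * 2 * math.pi - math.pi          # longitude of the pixel
    lat = math.pi / 2 - (i + 0.5) / pano_h * math.pi          # latitude of the pixel
    d = [math.cos(lat) * math.cos(lon), math.cos(lat) * math.sin(lon), math.sin(lat)]
    # express the ray in the sensor's frame (transpose of R applied to d);
    # the sensor's optical axis is taken as the +x axis of its frame
    x = sum(R[k][0] * d[k] for k in range(3))
    y = sum(R[k][1] * d[k] for k in range(3))
    z = sum(R[k][2] * d[k] for k in range(3))
    theta = math.acos(max(-1.0, min(1.0, x)))                 # angle from the optical axis
    if theta > fov / 2:
        return None                                           # pixel discarded for this sensor
    r = theta / (fov / 2) * min(sensor_w, sensor_h) / 2       # equidistant fisheye radius
    phi = math.atan2(z, y)
    return (sensor_w / 2 + r * math.cos(phi), sensor_h / 2 + r * math.sin(phi))
```

Running this for every pixel of the final image and for each sensor yields exactly the kind of correspondence discussed below: kept pixels with their target positions and weights, and discarded pixels for rays outside a sensor's field.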
  • this remapping of the pixels is done automatically, by a processing (a) of each pixel of each captured image which consists in keeping or discarding said pixel and, where the pixel is kept, in assigning to it one or more positions in the final panoramic image with a predefined weighting coefficient for each position in the final panoramic image.
  • in FIG. 4, only a portion of the final panoramic image is represented, said portion corresponding to the part of the panoramic image resulting from the remapping of the pixels of an image captured by a single image-capture device C_i.
  • the pixel P1,8 situated in the first line of the image captured by the image-capture device C_i is, for example, remapped into the final panoramic image in the form of four pixels P1,9, P1,10, P1,11, P1,12, at four different adjacent positions in the first line of the final panoramic image, which results in a stretching of this pixel of the starting image in the final panoramic image.
  • the mapping of this pixel P1,8 into the final panoramic image thus corresponds to a two-dimensional projection of this pixel that is different from its two-dimensional projection in the initial image captured by the image-capture device.
  • This stretching of the pixel in the final panoramic image may, for example, advantageously be implemented to compensate in whole or in part for the optical deformations of the "fisheye" objective of the image capture device in the vicinity of its upper edge.
  • the same stretch can advantageously be implemented for the pixels in the vicinity of the lower edge.
  • the central pixel P8,8 of the image captured by the image-capture device C_i is remapped identically into the final panoramic image in the form of a single pixel P11,11, the "fisheye" objective of the image-capture device causing little or no optical distortion at its center.
  • the pixel P10,3 located in the lower left part of the image captured by the sensor C_i is, for example, remapped into the final panoramic image in the form of three pixels P17,4, P18,4, P18,5, at three different adjacent positions on two adjacent lines of the final panoramic image, which results in an enlargement in both directions of this pixel P10,3 of the starting image in the final panoramic image.
  • the mapping of this pixel P10,3 into the final panoramic image thus corresponds to a two-dimensional projection of this pixel that is different from its two-dimensional projection in the initial image captured by the image-capture device.
  • a pixel may not be retained and not be included in the final panoramic image; for example, pixels in an area of overlapping images of at least two image capture devices.
  • in an overlapping zone of the image-capture devices, for example, only the pixel from one of the sensors will be retained, the corresponding pixels of the other sensors not being retained.
  • this assignment is preferably performed with a predefined weighting coefficient of between 0 and 100% for each position in the final panoramic image, that is to say for each pixel of the final panoramic image.
  • This weighting and the reasons for this weighting will be better understood in the light of Figure 5.
  • each final pixel P_i,j of the final panoramic image does not in practice correspond to the center of a pixel of the image captured by an image-capture device C_i, but corresponds geometrically to a particular real position P in the image captured by an image-capture device C_i, which in the particular example shown diagrammatically in FIG. 5 is off-center, near the lower left corner of the pixel P1 of the image captured by the image-capture device C_i.
  • the pixel P_i,j will be formed, in this particular example, not only from the pixel P2, but also from the neighboring pixels P1, P3, P4, by weighting the contribution of each pixel P1, P2, P3, P4; for example:
  • the pixel P_i,j is made up, for example, of 25% of the pixel P1, 35% of the pixel P2, 15% of the pixel P3 and 5% of the pixel P4.
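A minimal sketch of this weighted combination is given below, using the example percentages just quoted; in practice the weights W would come from the predefined correspondence for that position, and the function name is purely illustrative.

```python
# Weighted combination of four neighboring source pixels into one final pixel,
# using the example weights quoted above (25%, 35%, 15%, 5%).

def weighted_pixel(p1, p2, p3, p4, w=(0.25, 0.35, 0.15, 0.05)):
    return w[0] * p1 + w[1] * p2 + w[2] * p3 + w[3] * p4

# Example with four arbitrary neighboring pixel values:
print(weighted_pixel(100, 120, 90, 80))   # 84.5
```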
  • the invention applies to any type of image format: RAW, YUV and its RGB derivatives ... In the case of RGB images whose color reconstruction has already been performed (information R, G, B known for each pixel of the image), the weighting referred to above will be made from adjacent pixels.
  • FIGS. 6A to 6I show the different cases of correspondence between a pixel P_i,j of the final panoramic image and the matrix of pixels of the image captured by an image-capture device C_i, in the case of RAW pixel coding.
  • the letters R, V, B respectively identify a red, green and blue pixel ("V" denoting green in the figures).
  • W_i is the weight, in the final image, of the pixel R_i, V_i or B_i of the initial image captured by the image-capture device.
  • FIG. 6A corresponds to the case where the center of a red pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by an image-capture device C_i that falls on a blue pixel (B) of the image captured by that image-capture device C_i.
  • said red pixel P_i,j of the final panoramic image will be constituted from the red pixels R1, R2, R3, R4 adjacent to said blue pixel B, by applying respectively the weighting coefficients W1, W2, W3, W4.
  • the values of these weighting coefficients W1, W2, W3, W4 depend, for example, on the barycentric position of P relative to the center of each pixel R1, R2, R3, R4. For example, if the position P is located at the center of the pixel B, all the weighting coefficients W1, W2, W3, W4 will be equal to 25%.
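The sketch below illustrates this RAW case under a standard Bayer-layout assumption: for a red pixel of the final image whose position P falls on a blue photosite, the four diagonally adjacent red photosites are combined with weights derived from where P lies. The bilinear weighting used here is one plausible choice and is not prescribed by the patent, which only requires a predefined weight per position.

```python
# Illustration of the RAW (Bayer) case of FIG. 6A, under an assumed Bayer layout:
# position P falls on a blue photosite, and the four diagonally adjacent red photosites
# R1..R4 are combined. The bilinear weighting below is one plausible choice.

def red_weights(px, py):
    """px, py in [0, 1): position of P inside the blue photosite (0.5, 0.5 = its center).
    Returns (W1, W2, W3, W4) for the red neighbors at the four diagonal corners
    (top-left, top-right, bottom-left, bottom-right)."""
    w1 = (1 - px) * (1 - py)
    w2 = px * (1 - py)
    w3 = (1 - px) * py
    w4 = px * py
    return w1, w2, w3, w4

print(red_weights(0.5, 0.5))   # (0.25, 0.25, 0.25, 0.25): P at the center of the blue pixel
```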
  • FIG. 6B corresponds to the case where the center of a blue pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a red pixel (R) of the image captured by that image-capture device C_i.
  • FIG. 6C corresponds to the case where the center of a green pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a blue pixel (B) of the image captured by that image-capture device C_i.
  • FIG. 6D corresponds to the case where the center of a green pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a red pixel (R) of the image captured by that image-capture device C_i.
  • FIG. 6E corresponds to the case where the center of a green pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a green pixel (V5) of the image captured by that image-capture device C_i.
  • FIG. 6F corresponds to the case where the center of a red pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a green pixel (V) of the image captured by that image-capture device C_i.
  • FIG. 6G corresponds to the case where the center of a blue pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a green pixel (V) of the image captured by that image-capture device C_i.
  • FIG. 6I corresponds to the case where the center of a blue pixel P_i,j of the final panoramic image corresponds to a real position P in the image captured by a sensor C_i that falls on a blue pixel (B5) of the image captured by that image-capture device C_i.
  • the process of remapping into the final panoramic image each pixel of the image captured by an image-capture device C_i thus consists in keeping or discarding said pixel and, where the pixel is kept, in assigning to it one or more different positions in the final panoramic or stereoscopic image with a predefined weighting coefficient for each position (i.e. for each pixel) in the final panoramic image.
  • the notion of "position" in the final panoramic image thus coincides with the notion of "pixel" in the final panoramic image.
  • the distortions of the lens of each image-capture device C_i can be corrected, at least in part, in the final image.
  • the additional electronic processing means 11 may, for example, implement known image-processing algorithms (in particular algorithms for white balance, exposure-time management and gain management) on the final panoramic image delivered by the electronic processing means 10, which makes it possible, if necessary, to obtain a more homogeneous final panoramic image, in particular as regards colorimetry, white balance, exposure time and gain, compared with implementing these image-processing algorithms on each image from the image-capture devices C1, C2, C3 before construction of the panoramic image.
  • FIGS. 7A to 7D show particular examples of remapping the pixels of a line L present in the starting image of a "fisheye" objective, so as to take account of the optical distortion of the "fisheye" lens and of its orientation in space (Yaw, Pitch, Roll).
  • the remapping depends on the position of the line L with respect to the center and to the lower and upper edges of the "fisheye" lens (FIGS. 7A, 7B, 7C), or on the orientation in space of the "fisheye" lens (FIG. 7D).
  • FIG. 8 shows a particular example of three images I1, I2, I3 captured respectively by the three image sensors C1, C2, C3, and the final panoramic image (I) resulting from a remapping of the pixels of these three images.
  • pixel remapping can be used to construct a final panoramic image by implementing any type of two-dimensional projection different from the two-dimensional projection of the capture devices C1, C2, C3, for example for the purpose of automatically incorporating special effects into the final panoramic image.
  • this remapping can for example be implemented in the form of a correspondence table of the following type, assigning to each pixel (X, Y) of each image-capture device C_i that is kept in the final panoramic image one or more pixels (X_pano, Y_pano) of the final panoramic image, with a weighting coefficient W of the pixel (X, Y) in the pixel (X_pano, Y_pano) of the final panoramic image.
  • the remapping operation, into the final panoramic image, of each pixel of each image-capture device C1, C2, C3 is performed automatically by the electronic processing means 10, from a correspondence table stored in a memory.
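The sketch below shows how such a stored correspondence table could be applied on the fly to the pixel streams coming from the sensors, accumulating the final panoramic image as the pixels arrive. The entry format, sensor identifiers and function names follow the description above but are illustrative assumptions, not the patent's data layout.

```python
# Sketch of applying a stored correspondence table to the pixel streams of the sensors.
# table[(sensor_id, x, y)] is a list of (x_pano, y_pano, w) entries for a kept pixel,
# and is simply absent for a discarded pixel (e.g. overlap zones or pixels outside the
# useful fisheye circle).

def build_panorama(pixel_stream, table, pano_w, pano_h):
    """pixel_stream yields (sensor_id, x, y, value) tuples as pixels arrive from the sensors."""
    pano = [[0.0] * pano_w for _ in range(pano_h)]
    for sensor_id, x, y, value in pixel_stream:
        for x_pano, y_pano, w in table.get((sensor_id, x, y), ()):
            pano[y_pano][x_pano] += w * value     # weighted contribution of the kept pixel
    return pano
```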
  • the calculation of the remapping, into the final panoramic image, of each pixel of each image-capture device C1, C2, C3 can also be performed automatically by the electronic processing means 10, by means of a calibration and dynamic-computation algorithm stored in memory.
  • each pixel (X_pano, Y_pano) of the panoramic image resulting from the remapping operation is delivered at the output of the electronic processing means 10 (on the "Pixels" data bus), synchronized by the clock signal "H" delivered by the electronic processing means 10.
  • the clock signal "H" delivered by the electronic processing means 10 may be synchronous or asynchronous with the clock signals "H_captor" delivered by the image sensors C1, C2, C3.
  • an advantage of the architecture of FIG. 1 is to allow the additional electronic processing means 11 to "see" the image sensors C1, C2, C3 and the electronic processing means 10 as a single virtual panoramic sensor.
  • the device of FIG. 1 can advantageously be used to remap the pixels in real time as they are acquired by the electronic processing means 10.
  • the invention is not limited to the implementation of three fixed image-capture devices C1, C2, C3, but can more generally be implemented with at least two fixed image-capture devices C1, C2.
  • the invention can also be implemented with one or more mobile image-capture devices C1, C2, C3, each image capture then corresponding to a different orientation and/or a different position of the mobile image-capture device C1, C2, C3.
  • the capture frequency F is equal to the capture frequency of the image-capture devices C1, C2, C3.
  • the capture frequency F may be lower than the capture frequency of the image-capture devices C1, C2, C3, the electronic processing means 10 processing, for example, only one image out of m images (m > 2) delivered by each sensor, which corresponds to a frequency of successive capture operations lower than the frequency of the images delivered by the image-capture devices C1, C2, C3.
  • the invention is not limited to the construction of panoramic images, but can also be applied to the construction of stereoscopic images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Cameras In General (AREA)
EP13808073.4A 2012-11-15 2013-11-12 Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques Withdrawn EP2920956A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1260880A FR2998126B1 (fr) 2012-11-15 2012-11-15 Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques
PCT/FR2013/052707 WO2014076402A1 (fr) 2012-11-15 2013-11-12 Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques

Publications (1)

Publication Number Publication Date
EP2920956A1 true EP2920956A1 (fr) 2015-09-23

Family

ID=47754666

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13808073.4A Withdrawn EP2920956A1 (fr) 2012-11-15 2013-11-12 Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques

Country Status (15)

Country Link
US (1) US20150288864A1 (pt)
EP (1) EP2920956A1 (pt)
JP (2) JP2016503618A (pt)
KR (1) KR20150084807A (pt)
CN (1) CN104782114B (pt)
AU (1) AU2013346603B2 (pt)
BR (1) BR112015010788A8 (pt)
CA (1) CA2889811A1 (pt)
FR (4) FR2998126B1 (pt)
HK (1) HK1212835A1 (pt)
IL (1) IL238622A0 (pt)
IN (1) IN2015DN03812A (pt)
MX (1) MX355297B (pt)
TW (2) TWI612495B (pt)
WO (1) WO2014076402A1 (pt)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3190780A1 (en) * 2016-01-05 2017-07-12 Giroptic Two-lens spherical camera
WO2017182789A1 (en) 2016-04-18 2017-10-26 Argon Design Ltd Blending images
EP3249929A1 (en) 2016-05-25 2017-11-29 Thomson Licensing Method and network equipment for establishing a manifest
US20180018807A1 (en) * 2016-07-15 2018-01-18 Aspeed Technology Inc. Method and apparatus for generating panoramic image with texture mapping
CN108513119A (zh) * 2017-02-27 2018-09-07 阿里巴巴集团控股有限公司 图像的映射、处理方法、装置和机器可读介质
KR101925011B1 (ko) * 2017-03-14 2019-02-27 한국과학기술원 워터마크 삽입/검출 방법 및 장치
TWI775869B (zh) * 2017-06-29 2022-09-01 佳能企業股份有限公司 影像擷取裝置及影像處理方法
TWI642301B (zh) * 2017-11-07 2018-11-21 宏碁股份有限公司 影像處理方法與電子系統
KR102620783B1 (ko) * 2019-03-10 2024-01-04 구글 엘엘씨 베이스볼 스티치를 갖는 360도 광각 카메라
KR102294071B1 (ko) * 2020-07-21 2021-08-26 금오공과대학교 산학협력단 증강 현실에 기반한 o2o 카메라 솔루션에서의 객체 위치 표시 변환 및 id 부여 방법
KR102555534B1 (ko) * 2021-06-28 2023-07-17 한국과학기술원 구형 파노라마 영상에 대한 워터마크 검출 방법 및 장치

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6885392B1 (en) * 1999-12-31 2005-04-26 Stmicroelectronics, Inc. Perspective correction for preview area of panoramic digital camera
JP4499319B2 (ja) * 2001-08-24 2010-07-07 パナソニック株式会社 運転支援装置、運転支援方法および運転ガイドデータ作成方法
US20030135675A1 (en) * 2002-01-17 2003-07-17 Koninklijke Philips Electronics N.V. Configurable synchronous or asynchronous bus interface
US7782357B2 (en) * 2002-06-21 2010-08-24 Microsoft Corporation Minimizing dead zones in panoramic images
US7084904B2 (en) * 2002-09-30 2006-08-01 Microsoft Corporation Foveated wide-angle imaging system and method for capturing and viewing wide-angle images in real time
JP2004159014A (ja) * 2002-11-05 2004-06-03 Nec Corp デジタルカメラ付き携帯通信端末
TWI269648B (en) * 2004-03-09 2007-01-01 Chuin-Mu Wang Method and system for examining fitness by photography
EP2562578B1 (en) 2007-03-16 2017-06-14 Kollmorgen Corporation System for panoramic image processing
CN201118859Y (zh) * 2007-07-27 2008-09-17 浙江大学 单体式实时全景无缝无失真视频摄像机
CN101119482B (zh) * 2007-09-28 2011-07-20 北京智安邦科技有限公司 一种全景监控方法及设备
CA2714492C (en) * 2008-02-08 2014-07-15 Google, Inc. Panoramic camera with multiple image sensors using timed shutters
WO2010001476A1 (ja) 2008-07-04 2010-01-07 清正工業株式会社 医療廃棄物処理装置
JP2010252015A (ja) * 2009-04-15 2010-11-04 Panasonic Corp 画像合成装置、画像合成方法およびプログラム
TWI379245B (en) * 2009-04-27 2012-12-11 Asustek Comp Inc Method for continuously outputting character by video-recording
WO2011037964A1 (en) 2009-09-22 2011-03-31 Tenebraex Corporation Systems and methods for correcting images in a multi-sensor system
JP2011119974A (ja) * 2009-12-03 2011-06-16 Sony Corp パノラマ画像合成装置、パノラマ画像合成方法、及びプログラム
FR2964757B1 (fr) 2010-09-09 2013-04-05 Giroptic Dispositif optique pour la capture d'images selon un champ de 360°
CN102480622A (zh) * 2010-11-30 2012-05-30 比亚迪股份有限公司 立体图像获取方法及系统、移动终端
US9247133B2 (en) * 2011-06-01 2016-01-26 Apple Inc. Image registration using sliding registration windows
US20130321573A1 (en) * 2012-05-30 2013-12-05 Texas Instruments Incorporated Identification and display of time coincident views in video imaging

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014076402A1 *

Also Published As

Publication number Publication date
IN2015DN03812A (pt) 2015-10-02
BR112015010788A2 (pt) 2017-07-11
TWI612495B (zh) 2018-01-21
KR20150084807A (ko) 2015-07-22
BR112015010788A8 (pt) 2019-10-01
FR3011968A1 (fr) 2015-04-17
AU2013346603B2 (en) 2017-09-07
CA2889811A1 (fr) 2014-05-22
FR2998126B1 (fr) 2014-12-26
MX2015006121A (es) 2015-08-06
FR3012001B1 (fr) 2016-05-06
HK1212835A1 (zh) 2016-06-17
MX355297B (es) 2018-04-12
FR3012000B1 (fr) 2016-05-06
JP2019041389A (ja) 2019-03-14
CN104782114B (zh) 2019-05-07
JP2016503618A (ja) 2016-02-04
FR3012001A1 (fr) 2015-04-17
FR3011968B1 (fr) 2016-05-06
TW201804432A (zh) 2018-02-01
US20150288864A1 (en) 2015-10-08
CN104782114A (zh) 2015-07-15
FR2998126A1 (fr) 2014-05-16
WO2014076402A1 (fr) 2014-05-22
AU2013346603A1 (en) 2015-05-14
IL238622A0 (en) 2015-06-30
FR3012000A1 (fr) 2015-04-17
TW201435792A (zh) 2014-09-16

Similar Documents

Publication Publication Date Title
EP2920956A1 (fr) Procede et dispositif de capture et de construction d'un flux d'images panoramiques ou stereoscopiques
CA2600185C (fr) Procede pour commander une action, notamment une modification de nettete, a partir d'une image numerique en couleurs
EP3057317B1 (en) Light-field camera
BE1022488B1 (fr) Systeme d'appareil de prise de vues a temps-de-vol
FR2882160A1 (fr) Procede de capture d'images comprenant une mesure de mouvements locaux
EP3335420A1 (en) Systems and methods for multiscopic noise reduction and high-dynamic range
CN103826033A (zh) 图像处理方法、图像处理设备、图像拾取设备和存储介质
CN102300113B (zh) 基于稀疏摄像机阵列的集成成像微图像阵列生成方法
FR2982448A1 (fr) Procede de traitement d'image stereoscopique comprenant un objet incruste et dispositif correspondant
JPH1062140A (ja) 形状の再構成方法および形状の再構成装置
EP3479561B1 (en) Plenoptic camera with higher resolution using a controllable birefringent layer
US20190166335A1 (en) Plenoptic sub aperture view shuffling for a richer color sampling
FR3054093A1 (fr) Procede et dispositif de detection d'un capteur d'images
TWI504936B (zh) 影像處理裝置
TW201322754A (zh) 高動態範圍影像感測裝置、方法、及裝置製造方法
WO2011033186A1 (fr) Procédé de numérisation tridimensionnelle d'une surface comprenant la projection d'un motif combiné
WO2011033187A1 (fr) Procédé de numérisation tridimensionnelle comprenant une double mise en correspondance
FR2968108A1 (fr) Procede de reduction de la taille d’une image stereoscopique
FR2966257A1 (fr) Procede et dispositif de construction d'une image en relief a partir d'images en deux dimensions
LU102214B1 (en) Method for digital image processing
BE1020522A5 (fr) Procede de determination des parametres geometriques indiquant le mouvement d'une camera.
WO2017149248A1 (fr) Appareil de prise de vue stereoscopique
EP4015228A1 (fr) Procédé de fabrication d'un dispositif de sécurité et dispositif de sécurité associé
WO2023105164A1 (fr) Procédé et dispositif de caractérisation de distorsions dans une caméra plénoptique
Hwang et al. Perspective view reconstruction of partially occluded objects by using computational integral imaging

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20150515

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
19U Interruption of proceedings before grant

Effective date: 20180305

19W Proceedings resumed before grant after interruption of proceedings

Effective date: 20200504

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: AVINCEL GROUP INC

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210601