EP2649802A1 - Method and device for processing image information from two sensors, suitable for image acquisition, of a stereo sensor system
Info
- Publication number
- EP2649802A1 (application EP11784451.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor
- image information
- image
- sensors
- offset
- Prior art date
- 2010-12-07
- Legal status
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to a method and an apparatus for processing image information of two sensors suitable for image acquisition of a stereo sensor system that can be used in a vehicle.
- Stereo video image processing is becoming more and more important for environment detection in driver assistance systems, for robotics and other applications in automation technology.
- the spatial position of objects in the detection range of a stereo video system can be determined, which is suitable, for example, for detecting pedestrians in driver assistance systems.
- the stereo measurement from digital images is described in textbooks for image processing (e.g., Trucco and Verri, Introductory Techniques for 3-D Computer Vision, Springer Verlag, 1998).
- the basis for the stereo measurement is the search for corresponding image sections between the left and the right sensor. Taking into account the optical imaging and the geometric arrangement of the sensors, a 3D reconstruction of the image sections can take place.
- WO 02/095681 A1 describes a method in which a source image distorted by a camera optics is converted into an equalized target image with the aid of a tabular mapping rule. Each source pixel of the source image is assigned no, one or more target pixels in the target image.
- the present invention provides a method for processing image information of two sensors of a stereo sensor system suitable for image acquisition, furthermore a device which uses this method and finally a corresponding computer program product according to the independent patent claims.
- Advantageous embodiments emerge from the respective subclaims and the following description.
- Images captured by two sensors of a stereo sensor system typically have a geometric offset, that is, an image of a real object detected by the sensors is located at a different geometric position in the images.
- information about the offset is used in order, on the one hand, to optimize the acquisition of image information by the sensors and, on the other hand, to read out the image information from the sensors.
- with knowledge of the offset, different sections of the two sensors can be read out at the same time.
- a size of a buffer store in which the read-out data is buffered for further processing can be reduced.
- further processing can be accelerated.
- the approach according to the invention can advantageously be used with a hardware-based method for 3D stereo measurement with a stereo video system.
- a method for optimized readout of a video sensor of the stereo video system can be realized.
- for example, a right and a left image sensor can be read out depending on calibration information.
- the image sensors can be read out with a temporal line offset, depending on the line offset determined in a calibration.
- depending on the line offset between the left and the right camera determined in a calibration, different sections of the sensors, so-called regions of interest (ROIs), can be read out of the image sensors.
- the present invention provides a method for processing image information of two sensors, suitable for image acquisition, of a stereo sensor system, wherein each of the sensors is configured to capture the image information in sections, in sensor sections of the sensor arranged at different positions. The method comprises the following steps: providing information about a geometric offset between positions of two sensor sections of the first and the second sensor that have corresponding image information; and reading out image information of the first sensor and image information of the second sensor, the readout taking place with the offset with respect to the positions of the sensor sections from which the image information is read out.
- the stereo sensor system may be a stereo video system with two video sensors. By means of the stereo video system, the spatial position of objects in the detection range of the sensors can be determined.
- the stereo sensor system may provide information for a driver assistance system of a vehicle.
- the sensors can capture image information, which images a real environment detected by the sensors, in sections, for example line by line. Each item of image information may contain one or more brightness values.
- the sensor sections may be lines and the position of a sensor section may be a line number.
- the sensors can be identical, and in particular have the same type, number and arrangement of sensor sections.
- two sensor sections of the two sensors can correspond with regard to their position within the sensors.
- Each sensor section may have a plurality of pixels.
- Each pixel may include image information that maps a particular region of the environment sensed by the sensor.
- the individual sensor sections can capture the image information at the same time or in chronological succession.
- corresponding image information can be taken to mean image information which images an identical or approximately identical region of the environment detected by the sensors.
- the information about the offset may represent information that has been determined in a calibration process.
- the information can be stored in a memory and read out of it.
- the offset may define a difference between the positions of two sensor sections having pixels with corresponding image information. Starting from a position of a sensor section of the first sensor, a position of a sensor section of the second sensor with an image information corresponding to the sensor section of the first sensor can be determined by means of the offset. If the sensor sections are arranged line by line, the offset can specify a number of lines.
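- as a small illustration of this position mapping (not part of the patent text; the function name and the clamping to the sensor height are assumptions), the following Python sketch derives the line of the second sensor that corresponds to a given line of the first sensor from the calibrated line offset:

```python
def corresponding_line(first_sensor_line: int, line_offset: int, num_lines: int) -> int:
    """Return the line of the second sensor whose image content corresponds to
    `first_sensor_line` of the first sensor, given the calibrated line offset.
    The result is clamped to the valid line range of the sensor."""
    second_sensor_line = first_sensor_line + line_offset
    return max(0, min(num_lines - 1, second_sensor_line))

# Example: with a calibrated offset of +4 lines, line 2 of the first sensor
# corresponds to line 6 of the second sensor (compare the example of Fig. 8).
assert corresponding_line(2, 4, 1024) == 6
```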
- the image information of the first sensor and the second sensor can be read out directly from the sensor or from a temporary storage. The image information of the first sensor and the second sensor can be read out within a common read-out step.
- the image information of the second sensor can be read out in time directly before, at the same time, or in time directly after the image information of the first sensor has been read out.
- in this way, image information from different positions of the two sensors is read out in one read-out step.
- the read-out image information can be further processed directly or stored in a buffer for further processing. For example, a stereo measurement can be performed based on the read-out image information that corresponds with respect to its image content.
- the method may include a step of determining the offset.
- the offset may be based on a comparison between image information from sensor sections of the first sensor and image information from sensor sections of the second sensor. By the comparison, sensor sections with corresponding image contents can be determined, and the directional distance between such sensor sections can be stored as the offset.
- the offset can be determined once, continuously or, for example, every time the stereo video system is put into operation.
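- the patent leaves open how the image information of the two sensors is compared. As one possible illustration (a brute-force sketch with assumed NumPy inputs, not the claimed method), the line offset can be estimated from a calibration image pair by searching the vertical shift with the smallest mean absolute difference between the overlapping rows:

```python
import numpy as np

def estimate_line_offset(left: np.ndarray, right: np.ndarray, max_shift: int = 32) -> int:
    """Estimate the vertical (line) offset between two calibration images of
    equal size. Brute force: for every candidate shift, compare the overlapping
    rows and return the shift with the smallest mean absolute difference.
    Sign convention (assumption): a positive result means that line i of the
    right image corresponds to line i + offset of the left image."""
    rows = left.shape[0]
    best_shift, best_cost = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            a, b = left[shift:rows], right[:rows - shift]
        else:
            a, b = left[:rows + shift], right[-shift:rows]
        cost = np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32)))
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift
```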
- the image information of the first sensor can be read out only from a first sensor section and the image information of the second sensor only from a second sensor section, wherein an arrangement of the first sensor section within the first sensor differs from an arrangement of the second sensor section within the second sensor.
- the first sensor portion and the second sensor portion may be the same size.
- the first sensor section and the second sensor section have corresponding image contents.
- the read-out sensor sections can be processed directly in a subsequent step, without a rectification being necessary.
- the method may include a step of determining a shift between a first pixel within the first sensor section and a second pixel within the second sensor section, wherein the first pixel and the second pixel correspond to each other with respect to a captured image information.
- the first region of the first sensor section can be shifted by the value of the displacement relative to the second region of the second sensor section. In this way, partial areas of the sensor sections with corresponding image contents can be read out.
- the method may include a step of providing a first trigger signal configured to trigger the capture of the image information of the first sensor by a sensor section arranged at a first position of the first sensor.
- the method may further comprise a step of providing a second trigger signal, temporally after or temporally prior to the step of providing the first trigger signal, wherein the second trigger signal is configured to trigger the capture of the image information of the second sensor by a sensor section arranged at the first position in the second sensor.
- the method may include a step of providing a trigger signal configured to simultaneously trigger the capture of the image information of the first sensor by a sensor section arranged at a first position of the first sensor and the capture of the image information of the second sensor by a sensor section arranged at a second position of the second sensor, wherein the offset exists between the first position and the second position.
- the image information of the first sensor can be read out of a line of the first sensor and the image information of the second sensor can be read out of a line of the second sensor.
- the offset can here determine a difference between a line number of the line of the second sensor and the line of the first sensor.
- the method may include at least one further step of reading out further image information of the first sensor and further image information of the second sensor.
- the readout takes place with the offset with respect to the further positions of the further sensor sections from which the further image information is read out.
- in this way, offset image information can be read out of the two sensors continuously.
- in each read-out step, image information can be read out from one sensor section per sensor that is directly adjacent to the sensor section from which image information was read out in the preceding read-out step.
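- a minimal sketch of such a continuous, offset-aware read-out loop follows; the callables `read_left`/`read_right` are hypothetical placeholders for the sensor or buffer interface:

```python
def staggered_readout(read_left, read_right, num_lines: int, line_offset: int):
    """Yield pairs of corresponding sensor lines, read out with the calibrated
    line offset. `read_left(i)` and `read_right(i)` are hypothetical callables
    that return the pixel data of line `i` of the respective sensor; only line
    pairs whose indices both lie inside the sensors are read."""
    for left_line in range(num_lines):
        right_line = left_line + line_offset
        if 0 <= right_line < num_lines:
            # one read-out step: one line per sensor, adjacent to the previous step
            yield read_left(left_line), read_right(right_line)

# Usage sketch: with an offset of +4 lines the generator pairs line 0 with line 4,
# line 1 with line 5, and so on, so corresponding image contents arrive together
# and the downstream input buffers can stay small.
```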
- the present invention furthermore provides an apparatus for processing image information of two sensors, suitable for image acquisition, of a stereo sensor system, wherein each of the sensors is designed to capture the image information in sections, in sensor sections of the sensor arranged at different positions, the apparatus having the following features: a provision device for providing information about a geometric offset between positions of two sensor sections of the first and second sensors that have corresponding image information; and a read-out device for reading out image information of the first sensor and image information of the second sensor, the readout taking place with the offset with respect to the positions of the sensor sections from which the image information is read out.
- the device may be part of the stereo sensor system or part of a downstream processing unit.
- a device can be understood as meaning an electrical device which processes sensor signals and outputs control signals in dependence thereon.
- the device may have an interface, which may be implemented in hardware and/or software. In a hardware implementation, the interfaces may, for example, be part of a so-called system ASIC that contains a wide variety of functions of the device. However, it is also possible that the interfaces are separate integrated circuits or at least partially consist of discrete components.
- the interfaces may be software modules that are present, for example, on a microcontroller in addition to other software modules.
- also of advantage is a computer program product with program code which can be stored on a machine-readable carrier such as a semiconductor memory, a hard disk memory or an optical memory, and which is used to carry out the method according to one of the embodiments described above when the program is executed on a computer or a device corresponding to a computer.
- FIG. 2 shows a basic structure of a hardware for stereo measurement
- FIG. 3 shows an illustration of a time-synchronous readout of two image sensors
- FIG. 6 shows a representation of a time-synchronized image acquisition by two image sensors
- FIG. 8 is a flowchart of a method of processing image information.
- Fig. 1 shows a principle of stereo measurement with image rectification. Shown are an original sensor image 101 of a left sensor and an original sensor image 103 of a right sensor of a stereo imaging system.
- the sensor image 101 can be a video image on the left and the sensor image 103 can be a video image on the right.
- an object in the form of a vehicle can be recognized in the sensor images 101, 103.
- the vehicle is located in the sensor images 101, 103 at different positions.
- there is a line offset 107 between corresponding areas of the object, for example the upper right corner areas 105 of the vehicle.
- An image rectification 110 generates a rectified image 111 from the sensor image 101 and a further rectified image 113 from the further sensor image 103.
- in the rectified images 111, 113, the upper right corner portions 105 of the vehicle are in the same line, so that the stereo correspondence 109 is given within the same image line.
- the image sensor 201 may be an image sensor on the left and the image sensor 203 may be an image sensor on the right. Image information of the image sensor 201 is sent via a serial line 221 of the left sensor to a left input buffer 222, then provided to a left rectification logic 223 and then to a line buffer 224 for the rectified left image.
- the rectification changes a position of the vehicle.
- Image information of the image sensor 203 is sent via a serial line 231 of the right sensor to a right input buffer 232, then provided to a right rectification logic 233 and then to a line buffer 234 for the rectified right image.
- the rectification changes a position of the vehicle.
- Data from the line buffers 224, 234 are provided to a stereo measurement logic 240 configured to generate 3D coordinates 242.
- in a hardware-assisted real-time stereo measurement, a rectification, i.e. an equalization, of the original sensor images 101, 103 into rectified images 111, 113 usually takes place.
- subsequently, a correspondence search for features between the left and right image can take place along an image line.
- Fig. 1 shows the process schematically.
- a real-time realization typically takes place in a stereo hardware such as an FPGA.
- Fig. 2 shows the schematic structure of a corresponding hardware.
- via the serial data lines 221, 231, the camera image data, i.e. the brightness values, are transmitted from the sensors 201, 203 to the evaluation hardware, e.g. with LVDS (Low Voltage Differential Signaling).
- in the evaluation hardware there is an input buffer 222, 232 in which a limited number of lines of the input images are buffered.
- a rectification logic 223, 233 individually reads the image information from the input buffer 222, 232 and generates the rectified image information.
- the rectified image information is written into the next data buffers 224, 234 line by line. Between the lines of the intermediate buffers 224, 234 with the rectified image data of the left and right camera image, the stereo measurement and ultimately the 3D reconstruction take place.
- a calibration of the stereo camera system, i.e. the determination of the geometric position of the cameras relative to each other and of the optical distortion, is necessary for a meaningful stereo measurement.
- the image rectification then compensates for the position and aberrations using the calibration information during the evaluation.
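- the patent does not prescribe a particular rectification algorithm. As an illustration only, a software rectification with calibration data could look like the following sketch, which relies on the OpenCV functions cv2.stereoRectify, cv2.initUndistortRectifyMap and cv2.remap; the parameter names are assumptions:

```python
import cv2

def rectify_pair(img_left, img_right, K1, D1, K2, D2, R, T):
    """Rectify a stereo image pair so that corresponding features end up on the
    same image line. K1/K2: camera matrices, D1/D2: distortion coefficients,
    R/T: rotation and translation between the cameras (all from calibration)."""
    size = (img_left.shape[1], img_left.shape[0])  # (width, height)
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    map1_l, map2_l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    map1_r, map2_r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    rect_left = cv2.remap(img_left, map1_l, map2_l, cv2.INTER_LINEAR)
    rect_right = cv2.remap(img_right, map1_r, map2_r, cv2.INTER_LINEAR)
    return rect_left, rect_right, Q  # Q reprojects disparities to 3D points
```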
- the images of the left and right cameras are usually recorded synchronously in time, in order to allow an accurate stereo measurement of dynamically moving objects. Also associated with this is a time-synchronous serial transport of the image lines from the left and right sensors to the stereo measurement processing hardware.
- if CMOS sensors with a rolling shutter mode are used for image acquisition, image acquisition takes place at a different time for each line. This is shown schematically with reference to FIGS. 6 and 7. If the line offset is large due to mechanically unfavorable mounting of the cameras, the corresponding features required for the stereo measurement are recorded at different times. In particular for stereo measurement from the vehicle at high speeds, this can lead to the correspondence measurement between the left and right image line failing due to the fast movement of image features in the image.
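- to make the rolling-shutter effect concrete (a back-of-the-envelope sketch with assumed numbers, not taken from the patent): two corresponding features that lie `line_offset` lines apart are exposed `line_offset * t_line` seconds apart, and a feature moving in the image drifts by a corresponding number of pixels during that skew:

```python
def rolling_shutter_skew(line_offset: int, line_time_s: float, image_speed_px_per_s: float = 0.0):
    """Return the exposure time skew between two corresponding lines and the
    image-plane drift of a moving feature during that skew (assumed model)."""
    dt = abs(line_offset) * line_time_s        # time between the two exposures
    drift_px = image_speed_px_per_s * dt       # how far the feature moves meanwhile
    return dt, drift_px

# Assumed example values: 20 lines of offset, 30 microseconds per line and a
# feature moving at 500 px/s in the image give a 0.6 ms skew and ~0.3 px drift;
# large offsets or fast motion make the drift large enough to break matching.
dt, drift = rolling_shutter_skew(20, 30e-6, 500.0)
```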
- the inventive approach enables a reduction of the time offset of the measurement of corresponding image features when using rolling shutter sensors for stereo measurement and thus a reduction of the error of the 3D stereo measurement.
- the method for stereo measurement of point features and their correspondence point assignment is shown schematically in Fig. 1.
- Sketched is the procedure for a stereo image pair.
- the upper stereo image pair 101, 103 shows the original image pair after being captured by the cameras. Due to mechanical inaccuracies in the mounting of the cameras and due to optical distortion, the illustrated corresponding corners 105 of the imaged vehicle rear are offset by a plurality of lines.
- the line offset 107 is indicated schematically in the graph.
- the corresponding image features 105 in the left and right rectified image 111, 113 are each in one image line.
- the correspondence search is now possible in one line, which greatly facilitates a real-time realization in a computing hardware.
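- the hardware logic for the correspondence search is not detailed in the patent; a simple software stand-in for a per-line search is sum-of-absolute-differences block matching along the rectified row, sketched here with NumPy (window size and disparity range are assumptions):

```python
import numpy as np

def match_along_line(left_line: np.ndarray, right_line: np.ndarray,
                     x: int, window: int = 5, max_disparity: int = 64) -> int:
    """Return the disparity of the pixel at column `x` of a rectified left line
    by searching the corresponding rectified right line with sum-of-absolute-
    differences block matching. `x` must leave room for the matching window."""
    half = window // 2
    ref = left_line[x - half:x + half + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    for d in range(0, min(max_disparity, x - half) + 1):
        cand = right_line[x - d - half:x - d + half + 1].astype(np.float32)
        cost = float(np.sum(np.abs(ref - cand)))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```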
- the construction of a hardware for the stereo measurement is shown in FIG. 2. From the two image sensors 201, 203, the image contents are sequentially read into the respective input buffers 222, 232 of the hardware.
- An input buffer 222, 232 consists of memory for several picture lines and thus contains a section of the input picture.
- a logic 223, 233 for rectifying the images fills the rectification buffers 224, 234 sequentially, again line by line.
- a next logic 240 performs a line-by-line stereo measurement between the left and right rectification buffers 224, 234 and generates displacement measurements. From these, the 3D coordinates 242 of the image features can be derived.
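- the step from a displacement (disparity) measurement to 3D coordinates follows the standard rectified-stereo relation Z = f · b / d; a one-line sketch with assumed example values:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Standard rectified-stereo relation Z = f * b / d (disparity and focal
    length in pixels, baseline in metres, depth in metres)."""
    return focal_px * baseline_m / disparity_px

# Assumed example values: f = 800 px, b = 0.12 m, d = 16 px  ->  Z = 6.0 m
assert abs(depth_from_disparity(16.0, 800.0, 0.12) - 6.0) < 1e-9
```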
- in unadjusted stereo camera systems, a line offset is unavoidable.
- the line offset requires an enlargement of the input buffers 222, 232 in the hardware by the corresponding number of lines.
- the size of the input buffer 222, 232 can be reduced by reading out the image sensors 201, 203 in a staggered manner. The control of the time offset takes place in dependence on the calibration information.
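- an illustrative buffer model (the concrete window size is an assumption, not a figure from the patent) shows the saving: without staggering the input buffer must additionally bridge the calibrated line offset, with staggered read-out it does not:

```python
def input_buffer_lines(matching_window_lines: int, line_offset: int, staggered: bool) -> int:
    """Illustrative model of the input buffer size: without staggered read-out
    the buffer must additionally bridge the calibrated line offset between the
    two sensors, with staggered read-out it does not."""
    return matching_window_lines + (0 if staggered else abs(line_offset))

# Assumed example: a 16-line working window and a calibrated offset of 20 lines
# -> 36 buffered lines without staggering, but only 16 with staggered read-out.
assert input_buffer_lines(16, 20, staggered=False) == 36
assert input_buffer_lines(16, 20, staggered=True) == 16
```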
- Figures 3 and 4 schematically show the readout flow on the camera lines 221, 231 shown in Fig. 2.
- FIG. 3 shows a signal course during readout of the image sensor on the serial data lines in the case of a line-synchronous readout of left and right image sensors.
- the upper course is assigned to the left sensor and the lower course to the right sensor.
- image information of a line 1 351, a line 2 352, a line 3 353, etc. of the left and the right sensor is read out at the same time and transmitted via the serial data lines.
- Each of the lines 351, 352, 353 has a plurality of brightness values 356, of which only two are shown at the beginning of the line 351 for the sake of clarity. Shown is the general procedure for reading data into a stereo evaluation hardware. The two sensors are read out serially and isochronously; the same lines from the left and right sensor are transmitted on the data lines synchronously in time.
- FIG. 4 shows, corresponding to FIG. 3, a signal course during readout of the image sensors on the serial data lines in the case of a time-shifted readout of the image sensors as a function of calibration information, according to an exemplary embodiment of the present invention.
- the upper course is assigned to the left sensor and the lower course to the right sensor.
- a line 1 351, a line 2 352, a line 3 353, etc. are read out with respect to the left sensor.
- with respect to the right sensor, a line j 451, a line j + 1 452, a line j + 2 453, etc. are read out.
- in one read-out step, lines 351, 451 are read out; in a further read-out step, lines 352, 452; and in a further read-out step, lines 353, 453.
- Fig. 5 shows a readout of regions 501, 503 from a left image sensor 201 and a right image sensor 203, according to an exemplary embodiment of the present invention.
- the regions 501, 503 are the same size and cover different sensor areas of the sensors 201, 203. In this case, the regions 501, 503 are shifted relative to one another with respect to the rows of the sensors 201, 203. Based on the region 501 of the left sensor 201, the region 503 of the right sensor 203 is shifted upward by the distance 507. The distance 507 indicates a number of lines. In addition, the regions 501, 503 may also be shifted relative to each other with respect to the columns of the sensors 201, 203. Within the regions 501, 503, corresponding image contents 105 are arranged at identical or nearly identical positions.
- CMOS sensors allow the selective readout of regions 501, 503 of the image sensors 201, 203. For the stereo measurement, this makes it possible, depending on the calibration data, to individually select the regions in the left and right sensors 201, 203 in such a way that no or only a small line offset occurs between the regions 501, 503 in the left and right image. This also allows a reduction in the size of the input buffers in the stereo hardware.
- a corresponding procedure is shown schematically in FIG. 5.
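- a possible way to place such regions of interest from the calibration data is sketched below; the sign convention (a positive offset shifts the right ROI upward, as in Fig. 5) and the clamping at the sensor borders are assumptions of this sketch:

```python
def select_rois(sensor_lines: int, roi_lines: int, line_offset: int, roi_center: int):
    """Place two equally sized regions of interest (ROIs), one per sensor, so
    that the calibrated line offset is absorbed by the ROI positions and
    corresponding image contents end up at (nearly) the same line inside both
    ROIs. Assumption: a positive offset shifts the right ROI upward; clamping
    at the sensor border can leave a small residual offset."""
    half = roi_lines // 2
    start_left = min(max(roi_center - half, 0), sensor_lines - roi_lines)
    start_right = min(max(start_left - line_offset, 0), sensor_lines - roi_lines)
    return (start_left, start_left + roi_lines), (start_right, start_right + roi_lines)

# Assumed example: 1024-line sensors, a 512-line ROI centred on line 500 and a
# calibrated offset of 20 lines -> left ROI lines [244, 756), right ROI [224, 736).
assert select_rois(1024, 512, 20, 500) == ((244, 756), (224, 736))
```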
- FIGS. 6 and 7 show this schematically.
- FIG. 6 shows a chronological sequence of the image recording with a rolling shutter sensor in a schematic representation, with a chronologically synchronized recording by the left and right image sensors.
- the corner region 105 is in turn arranged offset by a number of lines 107. Furthermore, recording times t0, t1 and a time difference dt are shown.
- FIG. 7 shows a time sequence of the image recording with a rolling shutter sensor in a schematic representation, with a staggered shutter mode, according to an exemplary embodiment of the present invention. Shown are the image sensor on the left 201 and the image sensor on the right 203, by which in turn the vehicle with the corner region 105 is detected.
- the time offset is to be selected as a function of the calibration information, as shown in FIG. 7.
- the start of the image pickup of the left sensor 201 is delayed by the time dt.
- the delay dt is again calculated from calibration information and can be set individually for each stereo system.
- FIG. 8 shows a schematic illustration of a method for processing image information of two sensors 201, 203 of a stereo sensor system suitable for image acquisition.
- the sensors 201, 203 are designed to capture image information in sections, in sensor sections arranged at different positions. As sensor sections, seven lines are shown here by way of example. In order to compensate for an image offset 107 between recordings of the two sensors 201, 203, the readout is controlled by a read-out device 861.
- the read-out device 861 is designed to receive information about the image offset 107 and, based thereon in a read-out step, to read out a sensor section of the sensor 201 and a sensor section of the sensor 203. It is shown by way of example that the read-out device 861 reads out image information from the second line of the sensor 201 and image information from the sixth line of the sensor 203 in a read-out step.
- the second line of the sensor 201 and the sixth line of the sensor 203 each have the corresponding image content 105.
- sensor sections are read out with an offset of four lines.
- image information from the subsequent third line of the sensor 201 and image information from the following seventh line of the sensor 203 are read out accordingly.
- the read-out image information can be output to a processing device 865 and further processed or evaluated by it.
- the image information can be captured and buffered by the sensor sections.
- the read-out means 861 may be configured to read out the image information from a buffer.
- the readout device 861 may be configured to trigger detection of the image information by a corresponding sensor section.
- the readout device 861 may read the image information directly from the sensor sections.
- the read-out device 861 is designed to receive, in addition to the offset 107, further information about an offset within the individual sensor sections, in this case in the horizontal direction, and to read out only corresponding partial areas in response thereto. For example, if there is an offset of one quarter of a line length, in the read-out step shown, the left half of the second line of the left sensor 201 and the middle two quarters of the sixth line of the right sensor 203 could be read out.
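- the horizontal case can be written down as a small column-range computation; the following sketch (interface and sign convention are assumptions) reads only the overlapping part of two corresponding lines, which is the same shifting idea as the half-line example above:

```python
def column_ranges(line_length: int, column_offset: int):
    """Compute the column ranges to read from two corresponding lines of the
    left and right sensor when, in addition to the line offset, a horizontal
    offset of `column_offset` pixels exists (assumption: positive means the
    right image content is shifted to the right). Only the overlap is read."""
    overlap = line_length - abs(column_offset)
    if column_offset >= 0:
        return (0, overlap), (column_offset, column_offset + overlap)
    return (-column_offset, -column_offset + overlap), (0, overlap)

# With 1000-pixel lines and an offset of a quarter line (250 px) this reads
# columns [0, 750) on the left and [250, 1000) on the right; the text's example
# restricts the window further to half a line, but the principle is the same.
assert column_ranges(1000, 250) == ((0, 750), (250, 1000))
```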
- the inventive approaches of time-shifted readout of the image sensors or the selection of regions as a function of the calibration information can be used, for example, in connection with assistance systems in the motor vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102010062496.9A DE102010062496B4 (de) | 2010-12-07 | 2010-12-07 | Verfahren und Vorrichtung zum Verarbeiten von Bildinformationen zweier zur Bilderfassung geeigneter Sensoren eines Stereo-Sensor-Systems |
PCT/EP2011/069617 WO2012076274A1 (de) | 2010-12-07 | 2011-11-08 | Verfahren und vorrichtung zum verarbeiten von bildinformationen zweier zur bilderfassung geeigneter sensoren eines stereo-sensor-systems |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2649802A1 true EP2649802A1 (de) | 2013-10-16 |
Family
ID=44992889
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP11784451.4A Withdrawn EP2649802A1 (de) | 2010-12-07 | 2011-11-08 | Verfahren und vorrichtung zum verarbeiten von bildinformationen zweier zur bilderfassung geeigneter sensoren eines stereo-sensor-systems |
Country Status (6)
Country | Link |
---|---|
US (1) | US9414044B2 (zh) |
EP (1) | EP2649802A1 (zh) |
JP (1) | JP2014503408A (zh) |
CN (1) | CN103444191B (zh) |
DE (1) | DE102010062496B4 (zh) |
WO (1) | WO2012076274A1 (zh) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2254091B1 (en) * | 2009-05-19 | 2020-03-25 | Veoneer Sweden AB | Vision system and method for a motor vehicle |
EP2946336B1 (en) | 2013-01-15 | 2023-06-21 | Mobileye Vision Technologies Ltd. | Stereo assist with rolling shutters |
AU2014239979B2 (en) * | 2013-03-15 | 2017-06-22 | Aurora Operations, Inc. | Methods, systems, and apparatus for multi-sensory stereo vision for robotics |
US10412600B2 (en) * | 2013-05-06 | 2019-09-10 | Itron Networked Solutions, Inc. | Leveraging diverse communication links to improve communication between network subregions |
US20140348416A1 (en) * | 2013-05-23 | 2014-11-27 | Himax Media Solutions, Inc. | Stereo image rectification apparatus and method |
CN104811688B (zh) * | 2014-01-28 | 2017-09-01 | 聚晶半导体股份有限公司 | 图像获取装置及其图像形变检测方法 |
TWI520098B (zh) | 2014-01-28 | 2016-02-01 | 聚晶半導體股份有限公司 | 影像擷取裝置及其影像形變偵測方法 |
CN104811680B (zh) * | 2014-01-28 | 2017-04-12 | 聚晶半导体股份有限公司 | 图像获取装置及其图像形变校正方法 |
US8988509B1 (en) * | 2014-03-20 | 2015-03-24 | Gopro, Inc. | Auto-alignment of image sensors in a multi-camera system |
CN104954776A (zh) * | 2014-03-24 | 2015-09-30 | 宏达国际电子股份有限公司 | 图像校正方法及其图像撷取装置 |
KR102425895B1 (ko) * | 2015-12-24 | 2022-07-27 | 삼성전자주식회사 | 전자장치의 데이터 동기 장치 및 방법 |
DE102016104731A1 (de) * | 2016-03-15 | 2017-09-21 | Connaught Electronics Ltd. | Verfahren zum Detektieren eines Rolling-Shutter-Effekts in Bildern eines Umgebungsbereichs eines Kraftfahrzeugs, Rechenvorrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug |
DE102023108151A1 (de) | 2023-03-30 | 2023-06-15 | Daimler Truck AG | Verfahren zum Betreiben eines Stereokamerasystems für ein Nutzfahrzeug |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07107517A (ja) | 1993-09-30 | 1995-04-21 | Sanyo Electric Co Ltd | 立体撮像装置 |
JPH10224823A (ja) | 1997-02-05 | 1998-08-21 | Sanyo Electric Co Ltd | 立体映像表示方法および立体映像表示装置 |
WO2002095681A1 (de) | 2001-05-25 | 2002-11-28 | Siemens Aktiengesellschaft | Vorrichtung und verfahren zur verarbeitung von bilddaten |
US20040070667A1 (en) * | 2002-10-10 | 2004-04-15 | Fuji Photo Optical Co., Ltd. | Electronic stereoscopic imaging system |
US7112774B2 (en) * | 2003-10-09 | 2006-09-26 | Avago Technologies Sensor Ip (Singapore) Pte. Ltd | CMOS stereo imaging system and method |
JP4260129B2 (ja) | 2005-03-22 | 2009-04-30 | 富士フイルム株式会社 | 複眼撮影装置及びその調整方法 |
US8456515B2 (en) | 2006-07-25 | 2013-06-04 | Qualcomm Incorporated | Stereo image and video directional mapping of offset |
JP4382797B2 (ja) | 2006-11-09 | 2009-12-16 | 株式会社山武 | 対応点探索方法および3次元位置計測方法 |
EP2153641B2 (en) * | 2007-04-13 | 2021-10-27 | Ari M. Presler | Digital cinema camera system for recording, editing and visualizing images |
DE102008040985B4 (de) | 2008-08-05 | 2021-05-27 | Robert Bosch Gmbh | Verfahren zur Kalibrierung eines Mehrkamerasystems |
-
2010
- 2010-12-07 DE DE102010062496.9A patent/DE102010062496B4/de active Active
-
2011
- 2011-11-08 EP EP11784451.4A patent/EP2649802A1/de not_active Withdrawn
- 2011-11-08 WO PCT/EP2011/069617 patent/WO2012076274A1/de active Application Filing
- 2011-11-08 US US13/992,500 patent/US9414044B2/en active Active
- 2011-11-08 CN CN201180058509.6A patent/CN103444191B/zh active Active
- 2011-11-08 JP JP2013542435A patent/JP2014503408A/ja active Pending
Non-Patent Citations (1)
Title |
---|
See references of WO2012076274A1 * |
Also Published As
Publication number | Publication date |
---|---|
DE102010062496A1 (de) | 2012-06-14 |
US20130321588A1 (en) | 2013-12-05 |
CN103444191B (zh) | 2017-06-06 |
DE102010062496B4 (de) | 2022-01-20 |
JP2014503408A (ja) | 2014-02-13 |
WO2012076274A1 (de) | 2012-06-14 |
US9414044B2 (en) | 2016-08-09 |
CN103444191A (zh) | 2013-12-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20130708 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20160601 |