US7304680B2 - Method and device for correcting an image, particularly for occupant protection - Google Patents

Method and device for correcting an image, particularly for occupant protection

Info

Publication number
US7304680B2
US7304680B2 US10/469,782 US46978204A
Authority
US
United States
Prior art keywords
source
address
target
image
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/469,782
Other languages
English (en)
Other versions
US20040202380A1 (en)
Inventor
Thorsten Köhler
Ulrich Wagner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Assigned to SIEMENS AKTIENGESELLSCHAFT reassignment SIEMENS AKTIENGESELLSCHAFT ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KOHLER, THORSTEN, WAGNER, ULRICH
Publication of US20040202380A1 publication Critical patent/US20040202380A1/en
Application granted granted Critical
Publication of US7304680B2 publication Critical patent/US7304680B2/en
Assigned to CONTINENTAL AUTOMOTIVE GMBH reassignment CONTINENTAL AUTOMOTIVE GMBH ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SIEMENS AKTIENGESELLSCHAFT
Adjusted expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Definitions

  • the present invention relates to a method and a device for correcting an image, particularly for occupant protection systems.
  • Microsoft Research Technical Report MSR-TR-98-71 “A Flexible New Technique for Camera Calibration” discloses a method of compensating for image distortions in which a mathematical computing rule is used to map a source image recorded by a camera to a target image.
  • the computing rule calculates the corrected target image from the source image loaded into a working memory.
  • This method suffers from two principal drawbacks, namely that a large memory capacity is needed to store the source image and the calculation requires considerable computing capacity.
  • the present invention compensates for image distortions of an image caused by a camera system quickly and cost-effectively.
  • an image distorted by the optics of a camera system provides a source image in the image sensor of the camera system, which source image is distorted in different ways depending on the quality of the optics, the focal distance of the camera system and other optical parameters.
  • the source image is broken down into individual source pixels. Each individual source pixel is arranged in a predefined position in the source image and the gray scale value of each individual source pixel as recorded by the image sensor is stored at a predefined source pixel address in the image sensor.
  • the source image is mapped to a target image using a predefined imaging rule, the result being a corrected image.
  • the target image comprises target pixels, the gray scale value of each of which is stored under a target pixel address in a target memory.
  • One source pixel is mapped in the process to zero, one, or more than one target pixel.
  • the gray scale value of the source pixel address is stored in the process under the target pixel address.
  • the imaging rule for correcting a source image in a target image is preferably stored in tabular form in a memory of a microcontroller. This advantageously enables rapid processing of the imaging rule. There is, furthermore, no need for temporary storage of the source image, which considerably reduces the memory capacity required.
  • Another embodiment of the invention provides for the source pixel to be mapped directly to at least one target pixel without temporary storage as the source image is being read from the image sensor. This method advantageously not only reduces the memory capacity required, but also corrects the source image without delay, which is essential in particular in occupant protection systems.
  • Mapping the source image to the target image using the predefined imaging rule produces a target image having fewer pixels than the source image. This means that there are some source pixels that are not mapped to the target image.
  • the image sensor generally records more information than is actually necessary. This information is filtered out by the imaging rule, so the method advantageously filters and reduces the data volume. All that then has to be stored in the microcontroller that serves as the evaluation unit is the target image generated by the imaging rule. Once again this reduces the memory capacity required in the evaluation unit.
  • the microcontroller and/or evaluation unit furthermore includes a table, also referred to below as a rectifying table, which table in turn includes the imaging rule required for the correction method.
  • This table includes the addresses of the source pixels that are to be mapped to the target pixels, which source pixels are also referred to below as “selected source pixels” having a “selected source pixel address”.
  • the microcontroller in the evaluation unit triggers all source pixel addresses for the source pixels in the image memory sequentially.
  • the source pixel addresses are selected using a pointer, which is incremented by means of a counter.
  • Each “selected source pixel address” is allocated to at least one target pixel address of a target pixel in the table.
  • the target addresses read from the table are filled with the gray scale value of the corresponding selected source pixel address.
  • the use of a table for the imaging rule reduces the computing capacity required for the evaluation unit.
  • the memory capacity required is also reduced, since only a target image having a smaller number of pixels is stored in the evaluation unit.
  • the corrected target image is, moreover, available virtually immediately once the source image has been read from the image sensor.
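  • a rough software rendering of this table-driven scheme is sketched below: the rectifying table is sorted by source pixel address, the image sensor is read exactly once, and the source image is never buffered. The helper read_sensor_pixel() and the concrete types are assumptions made only so that the sketch is self-contained; they are not part of the patent.

```c
#include <stdint.h>
#include <stddef.h>

/* One row of the rectifying table: exactly one target pixel address
 * per selected source pixel address, sorted by source address. */
typedef struct {
    uint32_t source_addr;
    uint32_t target_addr;
} rectify_entry_t;

/* Hypothetical sensor access: returns the gray scale value stored in
 * the image sensor under the given linear source pixel address. */
extern uint8_t read_sensor_pixel(uint32_t source_addr);

/* Stream the whole source image once and fill the target memory
 * according to the rectifying table, without buffering the source image. */
void correct_image(const rectify_entry_t *table, size_t table_len,
                   uint32_t num_source_pixels, uint8_t *target_memory)
{
    size_t next_entry = 0;                            /* pointer into the table */

    for (uint32_t source_addr = 0; source_addr < num_source_pixels; ++source_addr) {
        uint8_t grey = read_sensor_pixel(source_addr);

        if (next_entry < table_len &&
            table[next_entry].source_addr == source_addr) {
            /* selected source pixel: store its gray value under the
             * associated target pixel address, then advance the pointer */
            target_memory[table[next_entry].target_addr] = grey;
            ++next_entry;
        }
        /* unselected source pixels are simply discarded */
    }
}
```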
  • the image information of the source image is more closely packed in the edge regions of the source image than it is in the center of the image.
  • the gray scale value of at least one selected source pixel in the edge region is therefore distributed to more than one target pixel or stored in the target memory.
  • An additional table, which contains the information required for multiple assignment (in particular the target pixel addresses allocated to one source pixel address), is used so that the pointer for reading the rectifying table, which includes one target pixel address per selected source pixel address, can be incremented smoothly and without interruption.
  • the additional table is also referred to below as the reference table. This reference table is preferably arranged directly below the rectifying table in the memory.
  • The read-out of the source image by the incrementing counter consists of reading cycles, each of which reads one source pixel; cycles in which no selected source pixel is read, that is, in which an unselected source pixel is read, are referred to below as reference cycles.
  • Such a reference cycle accesses the reference table that includes the additional target pixel addresses for multiple assignments not included in the rectifying table.
  • the gray scale value of the selected source pixel is in this way allocated to multiple target pixels and stored in the target memory in a number of successive reference cycles, which are often interrupted by reading cycles. Since this happens in the “pauses” or “reference cycles” and the number of target pixels is smaller than the number of source pixels despite the multiple assignment, the image correction process is complete as soon as the entire source image has been read from the image sensor by means of the incrementing counter.
  • the correction of the source image is thus advantageously carried out in real-time.
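  • collapsing the interleaved reference cycles into an immediate loop, purely for illustration, the multiple assignment could be sketched as below. This is a deliberate simplification of the cycle-by-cycle scheme described above; the table layout (a source address of 0 marking a row that only carries an additional target address) follows the description of the reference table further below, and the helper name is an assumption.

```c
#include <stdint.h>
#include <stddef.h>

/* One row of the reference table: a row with a non-zero source address
 * carries a reference address, a row with source address 0 carries an
 * additional target address belonging to the preceding reference address. */
typedef struct {
    uint32_t source_addr;
    uint32_t target_addr;
} reference_entry_t;

/* After the gray value of a selected source pixel has been written to its
 * primary target address ref_addr, copy it to every additional target
 * address listed for that reference address.  Returns the index of the
 * next unprocessed reference-table row. */
size_t distribute_to_extra_targets(const reference_entry_t *ref_table,
                                   size_t ref_len, size_t pos,
                                   uint32_t ref_addr, uint8_t grey,
                                   uint8_t *target_memory)
{
    if (pos >= ref_len || ref_table[pos].target_addr != ref_addr)
        return pos;                       /* no multiple assignment for this pixel */

    ++pos;                                /* skip the reference row itself */
    while (pos < ref_len && ref_table[pos].source_addr == 0) {
        target_memory[ref_table[pos].target_addr] = grey;   /* extra copy */
        ++pos;
    }
    return pos;
}
```

  • in such a sketch the function would be called directly after the primary write of a selected source pixel, whereas the patent distributes the same copies over the otherwise unused reference cycles so that the correction still finishes with the last read-out cycle.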
  • the reduced time and memory capacity required is particularly noticeable when using two image sensors to generate a stereo image.
  • the method is preferably realized in an FPGA or ASIC.
  • An implementation in a microprocessor is also conceivable in a further embodiment.
  • the source image is the raw image supplied directly from the camera.
  • the level and nature of the distortion affecting this image depend on the optics used.
  • the source pixel is a picture element in the source image.
  • the source pixel address is an address of a specific picture element in the source image.
  • the picture elements are numbered linearly. It is perfectly straightforward to convert between image coordinates and source pixel address.
  • the range of values for the source pixel address space is from zero to (number of picture elements in the source image) minus one.
  • the target image is the image after rectifying or correction.
  • This image is the corrected image, in which the pixel format or pixel address space is smaller than the format of the source image or source image address space.
  • the target pixel is one picture element in the target image.
  • the target pixel address is the address of a specific picture element in the target image.
  • the picture elements are numbered linearly. It is perfectly straightforward to convert between image coordinates and target pixel address.
  • the range of values for the target pixel address space is from zero to (number of picture elements in the target image) minus one.
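  • both address spaces therefore rely on a linear, row-by-row numbering of the picture elements. A minimal sketch of the conversion, assuming row-major ordering (the function names and the use of 32-bit addresses are illustrative only), is:

```c
#include <stdint.h>

/* Convert (column x, row y) image coordinates into the linear pixel
 * address used by the tables, assuming row-by-row numbering from 0 to
 * (width * height) - 1. */
static inline uint32_t coords_to_address(uint32_t x, uint32_t y, uint32_t width)
{
    return y * width + x;
}

/* Inverse conversion: recover the column and row from a linear address. */
static inline void address_to_coords(uint32_t addr, uint32_t width,
                                     uint32_t *x, uint32_t *y)
{
    *x = addr % width;   /* column within the row */
    *y = addr / width;   /* row index */
}
```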
  • the rectifying table is the table in which the corresponding target pixel address is indicated for the source pixel address.
  • the number of entries is equal to the number of target addresses.
  • the number of target addresses is somewhat larger than the number of source pixel addresses in the case of multiple assignments. Rectification is carried out by working through this table.
  • the reference table is a table that stores the target pixel addresses that are called up by multiple assignment of a source pixel address to more than one target pixel address.
  • the content of the reference table includes a reference address and a target address as target pixel address.
  • FIG. 1 shows the interior of a vehicle having an optical image capture system.
  • FIG. 2 a shows a source image distorted by an optical system.
  • FIG. 2 b shows the source image from FIG. 2 a corrected using an imaging rule.
  • FIG. 3 shows an imaging rule that converts a source image into a corrected target image.
  • FIG. 4 shows the pixel address space of an image with the coordinates of the pixel address arranged in the form of a matrix.
  • FIG. 5 a shows an imaging table (rectifying table) for allocating source pixel addresses to target pixel addresses.
  • FIG. 5 b shows an additional table (reference table) for allocating a source pixel address to more than one target pixel address.
  • FIG. 6 shows a functional circuit diagram with which to implement the method for correcting an image recorded by an image sensor.
  • FIG. 7 a shows a flow diagram of a method for correcting a source image.
  • FIG. 7 b shows an additional flow diagram for correcting a source image.
  • FIG. 1 shows a schematic representation of a vehicle 1 containing a vehicle seat 2 having a seat cushion 23 , a seat backrest 21 and, located thereon, a headrest 22 .
  • Arranged on the inside roof lining of the vehicle roof, preferably between the two vehicle seats, is an optical camera system 7 , 71 , 72 , 73 , 74 with which a predefined image area Bi of the vehicle interior can be captured. It is preferable for two image sensors 72 , 73 to capture the image area, with the vehicle seat 2 and an object 9 that may optionally be located thereon, through a camera optical system.
  • the object 9 in FIG. 1 is shown as a schematic representation of a vehicle occupant.
  • the objects 9 may in further embodiments be child seats, vehicle occupants, articles, etc. or the vehicle seat 2 may be empty.
  • Arranged in the front part of the vehicle 1 below the windscreen 4 is a dashboard 5 , under which is located a foot well 8 , which provides space for the feet and legs of the occupant 9 and includes an airbag 26 .
  • the foot well 8 is bounded at the bottom by the vehicle floor 6 , on which seat position adjustment rails 24 are arranged. Brackets connect the vehicle seat 2 to the seat position adjustment rails 24 in the area beneath the seat cushion 23 .
  • the vehicle seat 2 is thus arranged such that it can be adjusted in the X direction, that is to say the direction of travel.
  • the camera system 7 preferably contains two image sensors 72 , 73 , a light source 71 , which light source is preferably fitted with a number of light-emitting diodes, and an evaluation unit 74 .
  • the optical axes of the two image sensors 72 , 73 have a predefined separation L. This makes it possible using stereo image processing methods to capture range information concerning the objects in the predefined image area Bi from the images recorded by the two image sensors 72 , 73 .
  • the camera 7 preferably contains the two image sensors 72 , 73 and the light source 71 in a compact housing.
  • the evaluation unit 74 is preferably arranged remote from the compact housing and connected thereto by a data link in order to keep the camera system 7 , which is considered to be an inconvenience by the designers of vehicle interiors, as small as possible. It is, however, entirely possible to integrate the evaluation unit 74 (ECU) into the camera housing of the camera 7 .
  • the evaluation unit 74 has a target memory 105 in which the corrected target image can be stored.
  • a further embodiment provides for there to be just one image sensor 72 or 73 , which reduces costs.
  • the required range information is preferably determined in this configuration by means of light propagation time measurement or some other image processing method.
  • FIG. 2 a shows a source image S (original image) distorted by the optical system having points P 1 , P 2 , P 3 of different sizes.
  • the source image S is the image recorded by one of the image sensors 72 , 73 in FIG. 1 via a distorting camera optical system.
  • the picture elements P 1 , P 2 , P 3 , which were originally arranged in the form of a matrix, have been distorted by the wide-angle characteristic of the camera optical system (fisheye effect).
  • FIG. 2 b shows a target image T that emerges after correction of the source image S.
  • the picture elements P 1 , P 2 , P 3 from the original distorted source image S can be seen in the target image arranged in an orthogonal matrix.
  • FIG. 3 shows a schematic representation of the imaging rule for converting a source image S into a corrected target image T.
  • the source image S distorted by an image sensor 72 , 73 ( FIG. 1 ) via a camera optical system is broken down into source pixels N 1 , S 1 to S 18 , NI, NX in the form of a matrix.
  • the source image has twenty six (26) columns in the X 1 direction (horizontal) and eighteen (18) rows in the Y 1 direction (vertical). This combination spans a source pixel address space (X 1 ,Y 1 ).
  • Certain predefined source pixels S 1 to S 18 , each of which is arranged under a predefined source pixel address (X 1 ,Y 1 ), are selected from the source pixels of the source image S.
  • a corresponding gray scale value recorded by the image sensor, the value of which depends on the brightness of the corresponding source pixel, is stored under each source pixel address.
  • the target image T has a target pixel address space (X 2 ,Y 2 ) comprising 18 columns (X 2 ) and 12 rows (Y 2 ).
  • the gray scale value of the source pixel S 1 is stored under the target pixel address of the target pixel T 1 .
  • the allocation of the selected source pixel S 1 to the target pixel T 1 is unambiguous.
  • the source pixel N 1 lies outside the camera system image area Bi that is of interest for further image processing. This can be seen in FIG. 2 a at the upper left edge of the image, where the source pixel N 1 is located within a black field.
  • the image sensor 72 , 73 sees no usable image at this source pixel address (2;2), as the distortion of the camera optical system drags the image information toward the center of the image and the black edge of the source image S includes no image information.
  • Row 1 of the target image T represents the corrected image row produced from the curvilinear image row B in the source image S.
  • the source image S includes selected source pixels S 1 to S 18 and unselected source pixels N 1 , Ni.
  • the selected source pixels S 1 to S 18 are mapped to the target pixel address space of the target image T.
  • the unselected source pixels NI are not mapped to the target image.
  • the target image T is thus compressed by two pixels in the horizontal (X) direction as compared with the source image S.
  • the distortion in the source image S means that the relevant points shown at its edges are very much compressed, as is evident from the different distances Di and Dr in FIG. 2 , so the imaging rule decompresses, corrects and/or stretches the image by mapping the source pixel S 18 to two target pixels T 18 , T 18 * located one above the other.
  • the target pixel T 18 is located at address (18;1) and the target pixel T 18 * is located at address (18;2).
  • the imaging rule turns the source image S having a pixel address space (X 1 ;Y 1 ) into a target image T having a smaller pixel address space (X 2 ;Y 2 ). Redundant information is filtered out of the source image S and this reduction in data volume means that less memory capacity is needed to store the target image T.
  • the source pixel Nx (25;17), for example, is an unselected source pixel that is not contained in the imaging rule and is therefore not stored in the target memory 105 (see FIG. 1 ) for the target image T as “memory-consuming ballast”.
  • FIG. 4 shows a 7 ⁇ 16 pixel sample image to illustrate the management of the image pixels of the source image S and target image T and also the concept of a “pixel address space”.
  • This address allocation is used to read the gray scale values of an image sensor 72 , 73 ( FIG. 1 ).
  • the address Ad is incremented continuously by means of a counter and the corresponding gray scale value for each address Ad is read from the image sensor 72 , 73 .
  • the content of the image sensor 72 , 73 is thus read row-by-row.
  • the counter content including the address Ad is also referred to below as the Counter_Source.
  • the imaging rule for correcting the source image S is processed with the aid of the table TA from FIG. 5 a .
  • the table TA, also referred to below as the rectifying table TA, has three columns containing the table address TBA, the source pixel address SP and the target pixel address TP, respectively.
  • the table TA is sorted with the source pixel addresses in ascending order. No source pixel address is repeated.
  • the table address TBA includes pointer addresses A, A+1, . . . , A+17 of a pointer that indicates the respective source pixel address SP of the source pixels S 1 to S 18 in the second column and the respective target pixel address TP of the target pixels T 1 to T 18 in the third column.
  • a cyclically incrementing counter having the counter content “Counter_Target” counts up through the table addresses TBA as is described in the figures that follow.
  • the rectifying table TA outputs the source pixel address SP and associated target pixel address TP corresponding to the current pointer address. The rectifying table TA thus assigns each “selected” or “predefined” source pixel address SP to precisely one target pixel address TP.
  • One source pixel address SP is, however, allocated to more than one target pixel address TP in the case of the multiple assignments described previously.
  • These multiple assignments or multiple mappings are made using an additional table TB shown in FIG. 5 b and also referred to below as the reference table TB.
  • the reference table TB assigns one source pixel address to multiple target pixel addresses; source pixel S 18 , for example, is assigned to both of target pixels T 18 * and T 18 **.
  • the reference table TB has three columns headed “Table address TBB”, “Source address SPB” and “Target address TPB”.
  • the first column includes table addresses TBB that refer to corresponding source addresses SPB in the second column and associated target addresses TPB in the third column.
  • a reference address in the column TPB, for example the reference address 1 of the target pixel T 18 , identifies a target pixel address that also occurs in the rectifying table TA.
  • a reference address is assigned to a source address, for example the source address 1 of the column SPB, that likewise occurs in the rectifying table TA.
  • a target address in the column TPB, for example the target addresses 1 . 1 and 1 . 2 of target pixels T 18 * and T 18 **, identifies a target pixel address that is formed by multiple assignments of one source pixel address to multiple target pixel addresses and does not occur in the rectifying table.
  • a target address of this nature is assigned the source address “0” in the column SPB.
  • the reference address is thus distinguished from the target address in the method described in greater detail below by means of the source address SPB value assigned in each case.
  • the pointer B is, for example, allocated a source address 1 and a reference address 1 .
  • This source address 1 indicates further target addresses 1 . 1 , 1 . 2 , which are contained in the following rows B+1, B+2, in addition to the reference address 1 also included in the table TA.
  • the table TB is thus used for the assignment of a source address to more than one target address as explained in greater detail below.
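  • to make the two table layouts concrete, the following purely illustrative initializer mirrors the structure described above: the table address TBA or TBB is simply the array index, each rectifying-table row pairs one selected source pixel address with one target pixel address, and in the reference table a source address of 0 marks a row that only carries an additional target address. All numeric addresses in the sketch are invented; they do not come from the patent's figures.

```c
#include <stdint.h>

typedef struct { uint32_t source_addr, target_addr; } rectify_entry_t;   /* one row of TA */
typedef struct { uint32_t source_addr, target_addr; } reference_entry_t; /* one row of TB */

/* Rectifying table TA: sorted by source pixel address, no address repeated. */
static const rectify_entry_t TA[] = {
    { 100,  0 },   /* S1  -> T1   (pointer address A)      */
    { 105,  1 },   /* S2  -> T2   (pointer address A + 1)  */
    { 160, 17 },   /* S18 -> T18  (pointer address A + 17) */
};

/* Reference table TB: one reference row per multiply assigned source pixel,
 * followed by its additional target addresses. */
static const reference_entry_t TB[] = {
    { 160, 17 },   /* reference row: source address of S18, reference address of T18 */
    {   0, 35 },   /* additional target address of T18*  (source address 0 = marker) */
    {   0, 53 },   /* additional target address of T18** (source address 0 = marker) */
};
```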
  • FIG. 6 shows the method for correcting a source image S using a tabular imaging rule TA with reference to a functional circuit diagram.
  • the function blocks may take the form of software, hardware or a combination of software and hardware.
  • FIG. 6 shows an imaging rule for correcting an image in which, for the sake of clarity, one-to-one assignments are made. This means that each selected source pixel S from the rectifying table TA is allocated to precisely one target pixel T.
  • a clock generator 100 sequentially increments a counter 101 having the counter content “Counter_Source”.
  • the output of the counter 101 is connected to the address input Ad of an image sensor 72 , 73 having a source pixel address space 102 arranged in the form of a matrix.
  • the counter content “Counter_Source” corresponds to the address Ad from FIG. 4 .
  • the gray scale value GREY of the selected address Ad of the image sensor 72 , 73 is output at the output of the image sensor 72 , 73 .
  • the rectifying table TA shown in FIG. 6 is already known from FIG. 5 a and is supplied by an additional counter 104 having the counter content “Counter_Target”.
  • the additional counter 104 indicates the pointer A of the table address TBA.
  • the source pixel address SP (X 1 ;Y 1 ) of the source pixel concerned and the target pixel address TP (X 2 ;Y 2 ) of the target pixel concerned are present in each case at the output of the table.
  • a comparator 103 compares the address Ad present at the image sensor 72 , 73 at a given moment with the source image address SP (X 1 ;Y 1 ).
  • if the two addresses match, an enable signal is sent to the target memory 105 that includes the target image address space.
  • the target memory 105 receives the gray scale value GREY of the source pixel concerned, which gray scale value is present in the image sensor under the instantaneous address Ad, and stores it under the target pixel address TP 1 that it obtains from the rectifying table TA.
  • the enable signal E continues to increment the counter content “Counter_Target” of the counter 104 for the next read operation so that the next selected source pixel address SP for comparison with the next instantaneous address Ad is available at the input of the comparator 103 .
  • the gray scale value GREY of a source pixel Si which is stored under the address Ad (X 1 , Y 1 ), is thus stored under the target image address TP 1 (X 2 , Y 2 ) in the target memory 105 .
  • the counter 101 is incremented so that the entire source image address space 102 of the image sensor 72 , 73 is read and the corresponding selected source pixels Si are mapped in the target image address space of the target memory 105 .
  • the gray scale value GREY of the selected source pixel is stored in the process under the mapped target address TP 1 .
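  • the counter and comparator structure of FIG. 6 can also be mirrored almost literally in software, one call per pixel clock, as in the sketch below. It refines the loop given earlier by making the enable signal explicit; the structure and field names are assumptions, and a real FPGA or ASIC realization would of course be register-transfer logic rather than C.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

typedef struct { uint32_t source_addr, target_addr; } rectify_entry_t;

typedef struct {
    const rectify_entry_t *table;   /* rectifying table TA                         */
    size_t table_len;
    size_t counter_target;          /* "Counter_Target": pointer into TA           */
    uint32_t counter_source;        /* "Counter_Source": current sensor address Ad */
    uint8_t *target_memory;         /* target image address space (memory 105)     */
} corrector_t;

/* One pixel clock: the sensor presents the gray value GREY of the current
 * address Ad; the comparator checks Ad against the source address output
 * by the table and, on a match, enables the target-memory write and
 * increments Counter_Target.  Counter_Source is incremented in any case. */
void corrector_clock(corrector_t *c, uint8_t grey)
{
    bool enable = (c->counter_target < c->table_len) &&
                  (c->table[c->counter_target].source_addr == c->counter_source);

    if (enable) {
        c->target_memory[c->table[c->counter_target].target_addr] = grey;
        ++c->counter_target;        /* next selected source pixel address */
    }
    ++c->counter_source;            /* next address Ad for the image sensor */
}
```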
  • FIG. 7A shows a flow diagram F 1 in which the method for correcting a source image S is explained.
  • the comparator 103 checks in the next step to determine whether the instantaneous address Ad (Counter_Source) matches the source pixel address indicated by the counter content (Counter_Target) in the rectifying table TA. If the result of this check is negative, the instantaneous address Ad of the image sensor 72 , 73 is assigned no mapping instruction from the source pixel address to the target pixel address. The target memory 105 is therefore not activated by means of the enable signal and the counter 104 (Counter_Target) is not incremented. Counter 101 having the counter content “Counter_Source” is incremented, however, in order that the next source pixel in the image memory can be read.
  • the current gray scale value GREY of the source pixel, for example S, recorded under the image address Ad, is stored in the target memory 105 under the target pixel address TP.
  • the target pixel address TP required for this purpose is made available from the rectifying table TA.
  • the enable signal of the comparator 103 and the clock generator 100 ensure that the method, which is preferably realized in a microcontroller, an ASIC and/or an FPGA, is time-synchronized.
  • the counter 101 (Counter_Source) is also incremented so that the gray scale value of the next source pixel S in the image sensor 72 , 73 can be read.
  • the read cycle for a new source image begins once a check has been made to confirm that the last address in the address space of the image sensor has been reached.
  • the imaging rule for correcting a source image S by mapping individual source pixels Si to target pixels Ti shown by way of example in FIG. 3 is thus implemented using the method according to FIG. 7 a for the entire source image S.
  • FIG. 7 b shows an additional flow diagram F 2 that, in contrast to the flow diagram F 1 from FIG. 7 a , is also able to undertake the assignment of one source pixel address SP of one source pixel to more than one target pixel address TP. This is done using both the rectifying table TA from FIG. 5 a and an additional reference table TB from FIG. 5 b that includes the multiple assignments.
  • the reference table TB preferably follows immediately after the rectifying table TA in the read-only storage of the evaluation unit 74 .
  • the method shown in the flow diagram F 2 begins after startup and the associated initialization routines.
  • the counter (Counter_Target) that indicates the table address TBA in the rectifying table TA is set to zero.
  • the pointer B in the table address TBB of the table TB is set to the first source address or reference address.
  • the first reference address here is the target pixel address to which the first source pixel address with multiple assignment is allocated, which equates in FIG. 5 b to the source address 1 .
  • a state machine is, moreover, assigned a status or state “Read_Ref_Addr” intended to lead to the subsequent reading of the next value B:=B+1 in the reference table TB.
  • the source pixel address output by the table TA at the table address “Counter_Target” set by the counter 104 is compared in the next step with the instantaneous source pixel address “Counter_Source” set by the other counter 101 . If these two addresses are identical (“Counter_Source equals Counter_Target”), the gray scale value GREY of the currently selected source pixel is stored under the target pixel address in the target memory. This target pixel address corresponds to the reference address from the reference table TB in the case of multiple assignments.
  • the instantaneous address for the rectifying table TA (table address A) is incremented by one unit.
  • the method steps that follow the state assignment use the table TB to check whether the current instance is a case of multiple assignment of one source pixel address to more than one target pixel address. If the answer is affirmative, the gray scale value GREY of the target pixel address concerned, which is also referred to below as the reference address “Ref_Addr”, is read in the following read cycles and copied to the relevant additional target addresses 1 . 1 , 1 . 2 (see FIG. 5 b ).
  • This copying operation requires a total of three no-read cycles, that is to say three runs through the steps that follow the state query.
  • the instantaneous value of the state at the start of a new image was set to “Read_Ref_Addr”. This means that the state query selects the “Read_Ref_Addr” branch, which provides for the next reference address to be read from table TB in a first step if the instantaneous source pixel address is larger than the source pixel address of the reference pixel read from the reference table TB.
  • the “Ref_Pixel_Flag” is reset.
  • the “Ref_Pixel_Flag” indicates, in its reset state, that the gray scale value GREY may be read from the target memory at the position of the reference address “Ref_Addr”.
  • This gray scale value GREY was stored in the target memory as a result of a “Yes” response to the preceding “Counter_Target equals Counter_Source” query.
  • the flag “Ref_Pixel_Flag” is used to query whether the reference gray scale value GREY has already been read from the target memory at the position of the reference address “Ref_Addr” in the preceding cycle. If the reference gray scale value GREY has not already been read from the target memory, this is now done.
  • the next target address “Tar_Addr” belonging to the instantaneous reference address “Ref_Addr” is read.
  • the target address 1 . 1 from the table TB ( FIG. 5 b ) that belongs to the reference address 1 is, for example, read under table address B+1.
  • a next state “Write_Ref_Pixel” is then assigned, which state triggers writing to the target memory at this target address TAR_ADDR 1 . 1 in the next cycle.
  • if the next state assignment indicates the state “Write_Ref_Pixel”, the gray scale value GREY of the reference address read in one of the preceding cycles is written to the target memory at the position of the target address TAR_ADDR.
  • a check is then made to determine whether the next value in the reference table TB is a reference address or a target address. This is done by evaluating the source address in the column SPB. A reference address is assigned to a source address, whereas a target address is assigned a fixed value, for example zero (“0”). This makes it possible to ascertain, by incrementing the pointer B in the table address TBB, whether a reference address or target address is present as the next value in the table TB.
  • the state is set to “Read_Ref_Pixel” so that the next target address Tar_Addr belonging to the instantaneous reference address Ref_Addr is read in the next read cycle. If, on the other hand, the next value in the table TB is a reference address, the state is set to “Read_Ref_Addr”, which results in the next reference address being read from the table TB for the subsequent multiple assignment in the next cycle.
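  • one possible reading of this state machine, reduced to a single C function that is executed once per reference cycle, is sketched below. The synchronization with the main read-out counter and several guard conditions are omitted for brevity, and every name apart from the quoted states and the “Ref_Pixel_Flag” is an assumption.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

typedef enum {
    READ_REF_ADDR,    /* fetch the next reference address from table TB            */
    READ_REF_PIXEL,   /* fetch the reference gray value or the next target address */
    WRITE_REF_PIXEL   /* copy the buffered gray value to that target address       */
} ref_state_t;

typedef struct { uint32_t source_addr, target_addr; } reference_entry_t;

typedef struct {
    const reference_entry_t *tb;   /* reference table TB                */
    size_t tb_len;
    size_t pos;                    /* pointer B into TB                 */
    ref_state_t state;             /* starts as READ_REF_ADDR           */
    bool ref_pixel_read;           /* corresponds to the "Ref_Pixel_Flag" */
    uint32_t ref_addr;             /* current reference address         */
    uint32_t tar_addr;             /* additional target address         */
    uint8_t ref_grey;              /* gray value buffered from ref_addr */
    uint8_t *target_memory;
} ref_engine_t;

/* One reference cycle: executed whenever the current read cycle does not
 * hit a selected source pixel, so the multiple assignments are worked off
 * in the pauses of the main read-out. */
void reference_cycle(ref_engine_t *e)
{
    if (e->pos >= e->tb_len)
        return;                                        /* table exhausted */

    switch (e->state) {
    case READ_REF_ADDR:
        e->ref_addr = e->tb[e->pos].target_addr;       /* next reference address */
        e->ref_pixel_read = false;                     /* reset "Ref_Pixel_Flag"  */
        e->state = READ_REF_PIXEL;
        ++e->pos;                                      /* move to the first additional target row */
        break;

    case READ_REF_PIXEL:
        if (!e->ref_pixel_read) {
            /* fetch the gray value already stored under the reference address */
            e->ref_grey = e->target_memory[e->ref_addr];
            e->ref_pixel_read = true;
        } else if (e->tb[e->pos].source_addr == 0) {
            e->tar_addr = e->tb[e->pos].target_addr;   /* additional target address */
            e->state = WRITE_REF_PIXEL;
        } else {
            e->state = READ_REF_ADDR;                  /* no further additional targets */
        }
        break;

    case WRITE_REF_PIXEL:
        e->target_memory[e->tar_addr] = e->ref_grey;   /* perform the extra copy */
        ++e->pos;
        /* a following row with source address 0 holds yet another target
         * address; otherwise the next reference address comes next */
        e->state = (e->pos < e->tb_len && e->tb[e->pos].source_addr == 0)
                       ? READ_REF_PIXEL
                       : READ_REF_ADDR;
        break;
    }
}
```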
  • the invention is preferably applied in occupant protection systems, mobile telephones or web cameras.
  • a further embodiment of the invention provides for the controlled distortion (morphing) of images to create desired optical effects, for example in video cameras.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/DE2001/000825 WO2002078346A1 (de) 2001-03-05 2001-03-05 Verfahren und vorrichtung zum entzerren eines bildes, insbesondere für insassenschutzsysteme

Publications (2)

Publication Number Publication Date
US20040202380A1 US20040202380A1 (en) 2004-10-14
US7304680B2 (en) 2007-12-04

Family

ID=5648222

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/469,782 Expired - Fee Related US7304680B2 (en) 2001-03-05 2002-10-03 Method and device for correcting an image, particularly for occupant protection

Country Status (6)

Country Link
US (1) US7304680B2 (de)
EP (1) EP1368970B1 (de)
JP (1) JP2004530202A (de)
KR (1) KR100910670B1 (de)
DE (1) DE50107083D1 (de)
WO (1) WO2002078346A1 (de)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070115384A1 (en) * 2005-11-24 2007-05-24 Olympus Corporation Image processing apparatus and method for preferably correcting distortion aberration
US20070132863A1 (en) * 2005-12-14 2007-06-14 Sony Corporation Image taking apparatus, image processing method, and image processing program
US7811011B2 (en) 2006-02-06 2010-10-12 Leopold Kostal Gmbh & Co. Kg Camera arrangement behind an inclined pane
US8405726B2 (en) 2002-01-31 2013-03-26 Donnelly Corporation Vehicle accessory system
US20130155296A1 (en) * 2011-12-14 2013-06-20 Samsung Electronics Co., Ltd. Imaging apparatus and distortion compensation method
US8481916B2 (en) 1998-01-07 2013-07-09 Magna Electronics Inc. Accessory mounting system for a vehicle having a light absorbing layer with a light transmitting portion for viewing through from an accessory
US8513590B2 (en) 1998-01-07 2013-08-20 Magna Electronics Inc. Vehicular accessory system with a cluster of sensors on or near an in-cabin surface of the vehicle windshield
US8531278B2 (en) 2000-03-02 2013-09-10 Magna Electronics Inc. Accessory system for vehicle
US8531279B2 (en) 1999-08-25 2013-09-10 Magna Electronics Inc. Accessory mounting system for a vehicle
US8534887B2 (en) 1997-08-25 2013-09-17 Magna Electronics Inc. Interior rearview mirror assembly for a vehicle
US8570374B2 (en) 2008-11-13 2013-10-29 Magna Electronics Inc. Camera for vehicle
US8686840B2 (en) 2000-03-31 2014-04-01 Magna Electronics Inc. Accessory system for a vehicle
US8692659B2 (en) 1998-01-07 2014-04-08 Magna Electronics Inc. Accessory mounting system for vehicle
US9090213B2 (en) 2004-12-15 2015-07-28 Magna Electronics Inc. Accessory mounting system for a vehicle
US9233645B2 (en) 1999-11-04 2016-01-12 Magna Electronics Inc. Accessory mounting system for a vehicle
US9434314B2 (en) 1998-04-08 2016-09-06 Donnelly Corporation Electronic accessory system for a vehicle

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008077132A1 (en) 2006-12-19 2008-06-26 California Institute Of Technology Imaging model and apparatus
JP2010530087A (ja) * 2006-12-19 2010-09-02 創太 清水 画像処理プロセッサ
JPWO2008126227A1 (ja) * 2007-03-29 2010-07-22 富士通マイクロエレクトロニクス株式会社 表示制御装置、情報処理装置、および表示制御プログラム
JP5008139B2 (ja) * 2007-11-26 2012-08-22 株式会社リコー 画像撮像装置
JP4911628B2 (ja) * 2008-01-28 2012-04-04 株式会社リコー 画像処理方法、画像処理装置及び画像撮像装置
JP5062846B2 (ja) * 2008-07-04 2012-10-31 株式会社リコー 画像撮像装置
JP2012523783A (ja) * 2009-04-13 2012-10-04 テセラ・テクノロジーズ・ハンガリー・ケイエフティー 軌跡に基づいて画像センサーを読み出すための方法とシステム
DE102009028743B4 (de) 2009-08-20 2011-06-09 Robert Bosch Gmbh Verfahren und Steuergerät zur Entzerrung eines Kamerabildes
JP5376313B2 (ja) * 2009-09-03 2013-12-25 株式会社リコー 画像処理装置及び画像撮像装置
DE102011007644A1 (de) 2011-04-19 2012-10-25 Robert Bosch Gmbh Verfahren und Vorrichtung zur Bestimmung von zur Entzerrung eines Bildes geeigneten Werten und zur Entzerrung eines Bildes
US20140125802A1 (en) 2012-11-08 2014-05-08 Microsoft Corporation Fault tolerant display
EP3220099B1 (de) 2014-11-13 2019-11-06 Olympus Corporation Kalibriervorrichtung, kalibrierverfahren, optische vorrichtung, bildgebungsvorrichtung projektionsvorrichtung, messsystem und messverfahren
JP6805854B2 (ja) * 2017-02-02 2020-12-23 株式会社Jvcケンウッド 車載撮影装置及び映り込み抑制方法
US10657396B1 (en) * 2019-01-30 2020-05-19 StradVision, Inc. Method and device for estimating passenger statuses in 2 dimension image shot by using 2 dimension camera with fisheye lens

Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0450718A1 (de) 1990-04-02 1991-10-09 Koninklijke Philips Electronics N.V. Anordnung zum geometrischen Korrigieren eines verzerrten Bildes
US5067019A (en) * 1989-03-31 1991-11-19 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Programmable remapper for image processing
US5185667A (en) * 1991-05-13 1993-02-09 Telerobotics International, Inc. Omniview motionless camera orientation system
US5200818A (en) * 1991-03-22 1993-04-06 Inbal Neta Video imaging system with interactive windowing capability
US5345542A (en) * 1991-06-27 1994-09-06 At&T Bell Laboratories Proportional replication mapping system
US5384588A (en) 1991-05-13 1995-01-24 Telerobotics International, Inc. System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5508734A (en) * 1994-07-27 1996-04-16 International Business Machines Corporation Method and apparatus for hemispheric imaging which emphasizes peripheral content
US5528194A (en) * 1991-05-13 1996-06-18 Sony Corporation Apparatus and method for performing geometric transformations on an input image
JPH09202180A (ja) * 1996-01-29 1997-08-05 Niles Parts Co Ltd 車載カメラ装置
US5691765A (en) * 1995-07-27 1997-11-25 Sensormatic Electronics Corporation Image forming and processing device and method for use with no moving parts camera
US5796426A (en) * 1994-05-27 1998-08-18 Warp, Ltd. Wide-angle image dewarping method and apparatus
US5815199A (en) 1991-01-31 1998-09-29 Matsushita Electric Works, Ltd. Interphone with television
US5818527A (en) * 1994-12-21 1998-10-06 Olympus Optical Co., Ltd. Image processor for correcting distortion of central portion of image and preventing marginal portion of the image from protruding
JPH10271490A (ja) 1997-03-24 1998-10-09 Yazaki Corp 車両後方監視装置
WO1999012349A1 (en) 1997-09-04 1999-03-11 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US5990941A (en) * 1991-05-13 1999-11-23 Interactive Pictures Corporation Method and apparatus for the interactive display of any portion of a spherical image
US5999660A (en) * 1995-07-26 1999-12-07 California Institute Of Technology Imaging system for correction of perceptual distortion in wide angle images
US6043837A (en) * 1997-05-08 2000-03-28 Be Here Corporation Method and apparatus for electronically distributing images from a panoptic camera system
US6211911B1 (en) * 1994-10-14 2001-04-03 Olympus Optical Co., Ltd. Image processing apparatus
JP2001145012A (ja) * 1999-11-12 2001-05-25 Toyota Autom Loom Works Ltd 画像歪補正装置
US6538691B1 (en) * 1999-01-21 2003-03-25 Intel Corporation Software correction of image distortion in digital cameras
US6618081B1 (en) * 1996-11-28 2003-09-09 Minolta Co., Ltd. Image acquisition device removing distortion in image signals
US6704434B1 (en) * 1999-01-27 2004-03-09 Suzuki Motor Corporation Vehicle driving information storage apparatus and vehicle driving information storage method
US20040169726A1 (en) * 2001-07-20 2004-09-02 6115187 Canada Inc. Method for capturing a panoramic image by means of an image sensor rectangular in shape
US20040234100A1 (en) * 2001-05-25 2004-11-25 Horst Belau Device and method for the processing of image data
US6833843B2 (en) * 2001-12-03 2004-12-21 Tempest Microsystems Panoramic imaging and display system with canonical magnifier
US20050007478A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple-view processing in wide-angle video camera
US6920234B1 (en) * 1999-05-08 2005-07-19 Robert Bosch Gmbh Method and device for monitoring the interior and surrounding area of a vehicle
US6937282B1 (en) * 1998-10-12 2005-08-30 Fuji Photo Film Co., Ltd. Method and apparatus for correcting distortion aberration in position and density in digital image by using distortion aberration characteristic
US7058235B2 (en) * 2001-02-09 2006-06-06 Sharp Kabushiki Kaisha Imaging systems, program used for controlling image data in same system, method for correcting distortion of captured image in same system, and recording medium storing procedures for same method
US20070025636A1 (en) * 2003-06-02 2007-02-01 Olympus Corporation Image processing device
US7202888B2 (en) * 2002-11-19 2007-04-10 Hewlett-Packard Development Company, L.P. Electronic imaging device resolution enhancement
US20070115384A1 (en) * 2005-11-24 2007-05-24 Olympus Corporation Image processing apparatus and method for preferably correcting distortion aberration
US7224392B2 (en) * 2002-01-17 2007-05-29 Eastman Kodak Company Electronic imaging system having a sensor for correcting perspective projection distortion

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5853707A (ja) * 1981-09-25 1983-03-30 Toshiba Corp 三次元距離測定装置におけるテレビカメラの画像歪補正方式
JP2658642B2 (ja) * 1991-07-31 1997-09-30 松下電工株式会社 テレビインターホン
JP3033204B2 (ja) * 1991-01-31 2000-04-17 松下電工株式会社 テレビインターホン
JP3599255B2 (ja) * 1995-12-04 2004-12-08 本田技研工業株式会社 車両用環境認識装置

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5067019A (en) * 1989-03-31 1991-11-19 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Programmable remapper for image processing
EP0450718A1 (de) 1990-04-02 1991-10-09 Koninklijke Philips Electronics N.V. Anordnung zum geometrischen Korrigieren eines verzerrten Bildes
US5815199A (en) 1991-01-31 1998-09-29 Matsushita Electric Works, Ltd. Interphone with television
US5200818A (en) * 1991-03-22 1993-04-06 Inbal Neta Video imaging system with interactive windowing capability
USRE36207E (en) * 1991-05-13 1999-05-04 Omniview, Inc. Omniview motionless camera orientation system
US5384588A (en) 1991-05-13 1995-01-24 Telerobotics International, Inc. System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5528194A (en) * 1991-05-13 1996-06-18 Sony Corporation Apparatus and method for performing geometric transformations on an input image
US5185667A (en) * 1991-05-13 1993-02-09 Telerobotics International, Inc. Omniview motionless camera orientation system
US5990941A (en) * 1991-05-13 1999-11-23 Interactive Pictures Corporation Method and apparatus for the interactive display of any portion of a spherical image
US5345542A (en) * 1991-06-27 1994-09-06 At&T Bell Laboratories Proportional replication mapping system
US5796426A (en) * 1994-05-27 1998-08-18 Warp, Ltd. Wide-angle image dewarping method and apparatus
US5508734A (en) * 1994-07-27 1996-04-16 International Business Machines Corporation Method and apparatus for hemispheric imaging which emphasizes peripheral content
US6211911B1 (en) * 1994-10-14 2001-04-03 Olympus Optical Co., Ltd. Image processing apparatus
US5818527A (en) * 1994-12-21 1998-10-06 Olympus Optical Co., Ltd. Image processor for correcting distortion of central portion of image and preventing marginal portion of the image from protruding
US6795113B1 (en) * 1995-06-23 2004-09-21 Ipix Corporation Method and apparatus for the interactive display of any portion of a spherical image
US5999660A (en) * 1995-07-26 1999-12-07 California Institute Of Technology Imaging system for correction of perceptual distortion in wide angle images
US5691765A (en) * 1995-07-27 1997-11-25 Sensormatic Electronics Corporation Image forming and processing device and method for use with no moving parts camera
JPH09202180A (ja) * 1996-01-29 1997-08-05 Niles Parts Co Ltd 車載カメラ装置
US6618081B1 (en) * 1996-11-28 2003-09-09 Minolta Co., Ltd. Image acquisition device removing distortion in image signals
JPH10271490A (ja) 1997-03-24 1998-10-09 Yazaki Corp 車両後方監視装置
US6043837A (en) * 1997-05-08 2000-03-28 Be Here Corporation Method and apparatus for electronically distributing images from a panoptic camera system
WO1999012349A1 (en) 1997-09-04 1999-03-11 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US6937282B1 (en) * 1998-10-12 2005-08-30 Fuji Photo Film Co., Ltd. Method and apparatus for correcting distortion aberration in position and density in digital image by using distortion aberration characteristic
US6538691B1 (en) * 1999-01-21 2003-03-25 Intel Corporation Software correction of image distortion in digital cameras
US6704434B1 (en) * 1999-01-27 2004-03-09 Suzuki Motor Corporation Vehicle driving information storage apparatus and vehicle driving information storage method
US6920234B1 (en) * 1999-05-08 2005-07-19 Robert Bosch Gmbh Method and device for monitoring the interior and surrounding area of a vehicle
JP2001145012A (ja) * 1999-11-12 2001-05-25 Toyota Autom Loom Works Ltd 画像歪補正装置
US7058235B2 (en) * 2001-02-09 2006-06-06 Sharp Kabushiki Kaisha Imaging systems, program used for controlling image data in same system, method for correcting distortion of captured image in same system, and recording medium storing procedures for same method
US20040234100A1 (en) * 2001-05-25 2004-11-25 Horst Belau Device and method for the processing of image data
US20040169726A1 (en) * 2001-07-20 2004-09-02 6115187 Canada Inc. Method for capturing a panoramic image by means of an image sensor rectangular in shape
US6833843B2 (en) * 2001-12-03 2004-12-21 Tempest Microsystems Panoramic imaging and display system with canonical magnifier
US7224392B2 (en) * 2002-01-17 2007-05-29 Eastman Kodak Company Electronic imaging system having a sensor for correcting perspective projection distortion
US7202888B2 (en) * 2002-11-19 2007-04-10 Hewlett-Packard Development Company, L.P. Electronic imaging device resolution enhancement
US20050007478A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple-view processing in wide-angle video camera
US20070025636A1 (en) * 2003-06-02 2007-02-01 Olympus Corporation Image processing device
US20070115384A1 (en) * 2005-11-24 2007-05-24 Olympus Corporation Image processing apparatus and method for preferably correcting distortion aberration

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Claus et al.; "A Rational Function Lens Distortion Model for General Cameras"; Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition; IEEE; 2005. *
Shah et al.; "A Simple Calibration Procedure for Fish-Eye (High Distortion) Lens Camera"; 1994 IEEE International Conference on Robotics and Automation. Proceedings; May 8-13, 1994; IEEE; vol. 4, pp. 3422-3427. *
Zhengyou Zhang, A Flexible New Technique for Camera Calibration, Dec. 2, 1998 (Updated on Dec. 14, 1998), Microsoft Research, Microsoft Corporation, One Microsoft Way, Redmond, WA 98052.

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9035233B2 (en) 1997-08-25 2015-05-19 Magna Electronics Inc. Accessory mounting system for mounting an electronic device at a windshield of a vehicle
US9718357B2 (en) 1997-08-25 2017-08-01 Magna Electronics Inc. Vehicular accessory system
US8534887B2 (en) 1997-08-25 2013-09-17 Magna Electronics Inc. Interior rearview mirror assembly for a vehicle
US8926151B2 (en) 1997-08-25 2015-01-06 Magna Electronics Inc. Vehicular accessory system
US8513590B2 (en) 1998-01-07 2013-08-20 Magna Electronics Inc. Vehicular accessory system with a cluster of sensors on or near an in-cabin surface of the vehicle windshield
US8481916B2 (en) 1998-01-07 2013-07-09 Magna Electronics Inc. Accessory mounting system for a vehicle having a light absorbing layer with a light transmitting portion for viewing through from an accessory
US8692659B2 (en) 1998-01-07 2014-04-08 Magna Electronics Inc. Accessory mounting system for vehicle
US9527445B2 (en) 1998-01-07 2016-12-27 Magna Electronics Inc. Accessory mounting system for mounting an accessory at a vehicle such that a camera views through the vehicle windshield
US9434314B2 (en) 1998-04-08 2016-09-06 Donnelly Corporation Electronic accessory system for a vehicle
US9283900B2 (en) 1999-08-25 2016-03-15 Magna Electronics Inc. Accessory mounting system for a vehicle
US9446715B2 (en) 1999-08-25 2016-09-20 Magna Electronics Inc. Vision system for a vehicle
US8531279B2 (en) 1999-08-25 2013-09-10 Magna Electronics Inc. Accessory mounting system for a vehicle
US9539956B2 (en) 1999-08-25 2017-01-10 Magna Electronics Inc. Accessory system for a vehicle
US9637053B2 (en) 1999-11-04 2017-05-02 Magna Electronics Inc. Accessory mounting system for a vehicle
US9233645B2 (en) 1999-11-04 2016-01-12 Magna Electronics Inc. Accessory mounting system for a vehicle
US8749367B2 (en) 1999-11-04 2014-06-10 Magna Electronics Inc. Driver assistance system for a vehicle
US9193302B2 (en) 1999-11-04 2015-11-24 Magna Electronics Inc. Vision system for a vehicle
US10427604B2 (en) 2000-03-02 2019-10-01 Magna Electronics Inc. Vision system for a vehicle
US10059265B2 (en) 2000-03-02 2018-08-28 Magna Electronics Inc. Vision system for a vehicle
US9843777B2 (en) 2000-03-02 2017-12-12 Magna Electronics Inc. Cabin monitoring system for a vehicle
US8531278B2 (en) 2000-03-02 2013-09-10 Magna Electronics Inc. Accessory system for vehicle
US8686840B2 (en) 2000-03-31 2014-04-01 Magna Electronics Inc. Accessory system for a vehicle
US9783125B2 (en) 2000-03-31 2017-10-10 Magna Electronics Inc. Accessory system for a vehicle
US8508593B1 (en) 2002-01-31 2013-08-13 Magna Electronics Vehicle accessory system
US8405726B2 (en) 2002-01-31 2013-03-26 Donnelly Corporation Vehicle accessory system
US10543786B2 (en) 2002-01-31 2020-01-28 Magna Electronics Inc. Vehicle camera system
US9862323B2 (en) 2002-01-31 2018-01-09 Magna Electronics Inc. Vehicle accessory system
US8749633B2 (en) 2002-01-31 2014-06-10 Magna Electronics Inc. Vehicle accessory system
US9266474B2 (en) 2004-08-18 2016-02-23 Magna Electronics Inc. Accessory system for vehicle
US10773724B2 (en) 2004-08-18 2020-09-15 Magna Electronics Inc. Accessory system for vehicle
US8710969B2 (en) 2004-08-18 2014-04-29 Magna Electronics Inc. Accessory system for vehicle
US9090213B2 (en) 2004-12-15 2015-07-28 Magna Electronics Inc. Accessory mounting system for a vehicle
US10046714B2 (en) 2004-12-15 2018-08-14 Magna Electronics Inc. Accessory mounting system for a vehicle
US10710514B2 (en) 2004-12-15 2020-07-14 Magna Electronics Inc. Accessory mounting system for a vehicle
US20070115384A1 (en) * 2005-11-24 2007-05-24 Olympus Corporation Image processing apparatus and method for preferably correcting distortion aberration
US7733407B2 (en) * 2005-11-24 2010-06-08 Olympus Corporation Image processing apparatus and method for preferably correcting distortion aberration
US20070132863A1 (en) * 2005-12-14 2007-06-14 Sony Corporation Image taking apparatus, image processing method, and image processing program
US7764309B2 (en) * 2005-12-14 2010-07-27 Sony Corporation Image taking apparatus, image processing method, and image processing program for connecting into a single image a plurality of images taken by a plurality of imaging units disposed such that viewpoints coincide with each other
US7811011B2 (en) 2006-02-06 2010-10-12 Leopold Kostal Gmbh & Co. Kg Camera arrangement behind an inclined pane
US8570374B2 (en) 2008-11-13 2013-10-29 Magna Electronics Inc. Camera for vehicle
US20130155296A1 (en) * 2011-12-14 2013-06-20 Samsung Electronics Co., Ltd. Imaging apparatus and distortion compensation method

Also Published As

Publication number Publication date
KR20030081500A (ko) 2003-10-17
WO2002078346A1 (de) 2002-10-03
KR100910670B1 (ko) 2009-08-04
EP1368970A1 (de) 2003-12-10
EP1368970B1 (de) 2005-08-10
US20040202380A1 (en) 2004-10-14
JP2004530202A (ja) 2004-09-30
DE50107083D1 (de) 2005-09-15

Similar Documents

Publication Publication Date Title
US7304680B2 (en) Method and device for correcting an image, particularly for occupant protection
CN106462996B (zh) 无失真显示车辆周边环境的方法和装置
JP4903194B2 (ja) 車載用カメラユニット、車両外部ディスプレイ方法及びドライビングコリドーマーカー生成システム
US9196022B2 (en) Image transformation and multi-view output systems and methods
JP5361072B2 (ja) 改良型撮像装置
JP4859652B2 (ja) 画像表示装置
JP4677104B2 (ja) 車両用表示装置
JP2009017020A (ja) 画像処理装置及び表示画像生成方法
IES20060800A2 (en) A method and apparatus for calibrating an image capturing device, and a method and apparatus for outputting image frames from sequentially captured image frames with compensation for image capture device offset
US6584235B1 (en) Wide dynamic range fusion using memory look-up
WO2006087993A1 (ja) 周辺監視装置および周辺監視方法
CN105196917A (zh) 全景式监控影像装置及其工作方法
EP3916670A1 (de) System und verfahren zur korrektur von geometrischen verzerrungen in bildern
EP3772035B1 (de) Adaptive subkacheln zur verzerrungskorrektur in sichtbasierten assistenzsystemen und verfahren
KR101964864B1 (ko) 차량의 어라운드 뷰 영상의 왜곡 보정 방법
TWI443604B (zh) 影像校正方法及影像校正裝置
JPH10164326A (ja) 画像取り込み装置
US11483547B2 (en) System and method for adaptive correction factor subsampling for geometric correction in an image processing system
JP2012134586A (ja) 車両周辺監視装置
CN117002406A (zh) 一种车辆的后排影音娱乐系统
CN102469249B (zh) 影像校正方法及影像校正装置
KR100855426B1 (ko) 이미지 데이터를 처리하기 위한 장치 및 방법
US7953292B2 (en) Semiconductor integrated circuit device and rendering processing display system
CN113022590A (zh) 车辆用周边显示装置
WO2017086029A1 (ja) 画像処理装置、画像処理方法、移動体、身体装着型電子機器及びコンピュータプログラム

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOHLER, THORSTEN;WAGNER, ULRICH;REEL/FRAME:015070/0354

Effective date: 20030818

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: CONTINENTAL AUTOMOTIVE GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS AKTIENGESELLSCHAFT;REEL/FRAME:027263/0068

Effective date: 20110704

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20151204