WO2015004866A1 - Image generation device, image generation method, and non-transitory computer-readable storage medium therefor - Google Patents
- Publication number: WO2015004866A1 (application PCT/JP2014/003347)
- Authority: WIPO (PCT)
- Prior art keywords: image, partial image, density, adjacent, target
Classifications
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- B60R1/27 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems, for viewing an area outside the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
- B60R11/04 — Mounting of cameras operative during drive; arrangement of controls thereof relative to the vehicle
- G06T11/60 — Editing figures and text; combining figures or text
- G06T3/4038 — Image mosaicing, e.g. composing plane images from plane sub-images
- G06T5/40 — Image enhancement or restoration using histogram techniques
- G06T5/80 — Geometric correction
- H04N23/90 — Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- B60R2300/303 — Viewing arrangements using cameras and displays, characterised by image processing using joined images, e.g. multiple camera images
- B60R2300/607 — Monitoring and displaying vehicle exterior scenes from a transformed perspective, from a bird's eye viewpoint
- G06T2207/30252 — Vehicle exterior; vicinity of vehicle
- H04N7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The present disclosure relates to an image generation device for generating a composite image using a plurality of captured images, an image generation method, and a non-transitory computer-readable storage medium therefor.
- There is a known technique in which an image representing a wide subject area exceeding the imaging range of a single camera is generated by combining a plurality of captured images. For example, Patent Document 1 describes a technique in which, based on the density characteristics of two mutually adjacent images A and B in their overlapping part, one image B is corrected so that its density characteristic approaches that of the other image A.
- First, the overlapping part with image B within image A is cut out as a combining part a, and the overlapping part with image A within image B is cut out as a combining part b.
- Next, histograms Ha(x) and Hb(x) for the combining parts a and b are generated from their pixel values.
- Gradation conversion tables LUTa and LUTb that perform histogram equalization are generated for Ha(x) and Hb(x), respectively.
- Finally, gradation conversion is performed on the entire area of image B using LUTb, and the inverse of the gradation conversion defined by LUTa is further applied to the converted image. That is, gradation conversion processing that brings the histogram Hb(x) of combining part b close to the histogram Ha(x) of combining part a is performed on the entire image B.
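For illustration, the two-stage gradation conversion of Patent Document 1 described above (equalize with LUTb, then invert LUTa) can be sketched as follows. This is a minimal NumPy sketch assuming 8-bit densities; the function names are illustrative and not taken from the document.

```python
import numpy as np

def equalization_lut(hist):
    """Gradation-conversion table that equalizes the given 256-bin histogram
    (cumulative distribution scaled to the full 0-255 density range)."""
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    return np.round(cdf * 255).astype(np.uint8)

def match_b_to_a(image_b, hist_a, hist_b):
    """Apply LUTb and then the inverse of LUTa to the whole of image B,
    bringing Hb(x) close to Ha(x) as described for Patent Document 1."""
    lut_a = equalization_lut(hist_a)
    lut_b = equalization_lut(hist_b)
    # Invert LUTa: for each equalized density, find the first input density
    # of A that reaches it (lut_a is non-decreasing, so searchsorted works).
    inv_a = np.searchsorted(lut_a, np.arange(256), side="left").clip(0, 255)
    combined = inv_a[lut_b]  # LUTb followed by inverse LUTa
    return combined[image_b]
```

When images A and B already share the same histogram, the combined table reduces to (approximately) the identity mapping, which is the expected behavior.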
- The technique described in Patent Document 1 assumes a composite image in which a plurality of images are continuously arranged in a band shape (in one direction). In that case, if the other captured images are corrected in order starting from one of the plurality of captured images, the shift in density characteristics between adjacent images can be suppressed at all boundary portions (portions where images adjoin) in the composite image.
- As such a composite image, however, there is also one in which a plurality of images are continuously arranged in a ring shape, such as an image of the periphery of the vehicle viewed from above (a so-called top-view image).
- For such an image, the technique described in Patent Document 1 cannot suppress the shift in density characteristics between adjacent images at one or more boundary portions in the composite image.
- For example, suppose that four images A, B, C, and D are arranged in a ring, and that image B is corrected based on image A, image C based on the corrected image B, and image D based on the corrected image C. Even though the images B, C, and D are corrected in order as in the band-shaped case, a shift in density characteristics remains between image D and image A. If image A is then corrected based on the corrected image D, the shift between images A and D is suppressed, but a shift in density characteristics arises instead between the corrected image A and image B, so the problem is not solved.
- Such a problem is not limited to images of the periphery of a vehicle viewed from above; it can occur in any composite image in which a plurality of images are continuously arranged in a ring shape.
- An object of the present disclosure is to provide an image generation apparatus, an image generation method, and a non-transitory computer-readable storage medium therefor that suppress shifts in density characteristics at all boundary portions of a composite image in which a plurality of images are continuously arranged in a ring shape.
- According to one aspect of the present disclosure, an image generation device generates a composite image in which partial images cut out from captured images are continuously arranged in a ring shape, using a plurality of captured images whose imaged regions partly overlap. The device includes correction means that takes at least one of the plurality of partial images as a reference partial image, sets a partial image other than the reference partial image as a target partial image, and corrects the target partial image based on at least one of a first adjacent partial image and a second adjacent partial image, which are the partial images adjacent to the target partial image from both sides in the composite image. The correction means includes generation means that generates a first conversion table, a density conversion table for correcting the density characteristic of the target partial image to approach that of the first adjacent partial image, and a second conversion table, a density conversion table for correcting the density characteristic of the target partial image to approach that of the second adjacent partial image, and mixing conversion means that converts the density of the target partial image using both the first conversion table and the second conversion table.
- The generation means generates, as the first conversion table, a density conversion table for correcting the density characteristic of the target captured image (the captured image from which the target partial image is cut out) in a first common area, an area captured in common with the first adjacent captured image (the captured image from which the first adjacent partial image is cut out), so as to approach the density characteristic of the first adjacent captured image in the first common area. Likewise, it generates, as the second conversion table, a density conversion table for correcting the density characteristic of the target captured image in a second common area so as to approach the density characteristic of the second adjacent captured image in the second common area.
- The mixing conversion means converts the density of the target partial image such that the closer a pixel in the target partial image is to the first adjacent partial image, the greater the influence of the first conversion table compared with that of the second conversion table, and the closer the pixel is to the second adjacent partial image, the greater the influence of the second conversion table compared with that of the first conversion table.
- According to another aspect of the present disclosure, an image generation method used in such a generation device includes: a correction step of taking at least one of the plurality of partial images as a reference partial image, setting a partial image other than the reference partial image as a target partial image, and correcting the target partial image based on at least one of the first adjacent partial image and the second adjacent partial image adjacent to it from both sides in the composite image; a generation step of generating the first conversion table, a density conversion table for correction that brings the density characteristic of the target partial image close to that of the first adjacent partial image, and the second conversion table, a density conversion table for correction that brings it close to that of the second adjacent partial image; and a conversion step of converting the density of the target partial image using both the first conversion table and the second conversion table.
- In the generation step, the first conversion table is generated as a density conversion table for correcting the density characteristic of the target captured image in the first common area, an area captured in common with the first adjacent captured image, so as to approach the density characteristic of the first adjacent captured image in that area; the second conversion table is generated likewise for the target captured image and the second adjacent captured image in the second common area.
- In the conversion step, the density of the target partial image is converted such that the closer a pixel in the target partial image is to the second adjacent partial image, the greater the influence of the second conversion table compared with that of the first conversion table.
- With these configurations, shifts in density characteristics can be suppressed at all boundary portions of a composite image in which a plurality of partial images are continuously arranged in a ring shape.
- According to a further aspect of the present disclosure, a non-transitory computer-readable storage medium includes instructions for causing a computer to function as an image generation device that generates a composite image in which partial images cut out from captured images are continuously arranged in a ring shape, using a plurality of captured images whose subject regions partly overlap. The instructions include: a correction step of taking at least one of the plurality of partial images as a reference partial image, setting a partial image other than the reference partial image as a target partial image, and correcting the target partial image based on at least one of the first adjacent partial image and the second adjacent partial image adjacent to it from both sides in the composite image; a generation step of generating a first conversion table, a density conversion table for correction that brings the density characteristic of the target partial image close to that of the first adjacent partial image, and a second conversion table, a density conversion table for correction that brings it close to that of the second adjacent partial image; and a conversion step of converting the density of the target partial image using both the first conversion table and the second conversion table.
- In the generation step, the first conversion table is generated as a density conversion table for correcting the density characteristic of the target captured image in the first common area, an area captured in common with the first adjacent captured image, so as to approach the density characteristic of the first adjacent captured image in that area; the second conversion table is generated likewise for the second common area and the second adjacent captured image.
- In the conversion step, the density of the target partial image is converted such that the closer a pixel in the target partial image is to the second adjacent partial image, the greater the influence of the second conversion table compared with that of the first conversion table.
- FIG. 1 is a block diagram showing the configuration of the top-view image generation device;
- FIG. 2 is a diagram showing the imaging ranges of the cameras installed on the vehicle;
- FIG. 3 is a flowchart of the top-view image generation process;
- FIGS. 4(A) and 4(C) are diagrams illustrating captured images converted into top-view images;
- FIG. 4(B) is a diagram illustrating the partial images constituting the top-view image;
- FIG. 5(A) is a diagram showing the procedure for associating the processing target density B with the processing target density A;
- FIG. 5(B) is a diagram showing the procedure for generating an LUT;
- FIGS. 6(A) to 6(C) are diagrams illustrating the second density conversion process;
- FIG. 7 is a graph showing the relationship between pixel position and LUT weight in the embodiment;
- FIG. 8 is a graph showing the relationship between pixel position and LUT weight in a modified example.
- The top-view image generation device 1 shown in FIG. 1 is mounted on a vehicle 9 (see FIG. 2). It generates a top-view image, an image of the vehicle 9 and its surroundings as viewed (looked down on) from above, and displays it for the driver's visual recognition.
- the top view image generation apparatus 1 according to the present embodiment includes an imaging unit 10, a storage unit 20, an image generation unit 30, and a display unit 40.
- the imaging unit 10 images the surroundings of the vehicle 9.
- the imaging unit 10 includes four cameras 11, 12, 13, and 14 shown in FIG.
- The first camera 11 is provided at the front of the vehicle 9 and images a downward (ground-surface) imaging range 11A in the area ahead of the vehicle 9.
- The second camera 12 is provided on the left side of the vehicle 9 (for example, on the left door mirror) and images a downward imaging range 12A in the area to the left of the vehicle 9.
- The third camera 13 is provided at the rear of the vehicle 9 and images a downward imaging range 13A in the area behind the vehicle 9.
- The fourth camera 14 is provided on the right side of the vehicle 9 (for example, on the right door mirror) and images a downward imaging range 14A in the area to the right of the vehicle 9.
- These four cameras 11, 12, 13, and 14 are wide-angle cameras capable of imaging at an angle of view of 180 degrees, and their four imaging ranges 11A, 12A, 13A, and 14A cover the entire circumference of the vehicle 9.
- The storage unit 20 shown in FIG. 1 stores data (calibration data) specific to the installation state of the four cameras 11, 12, 13, and 14 on the vehicle 9. The storage unit 20 includes a storage device in which the calibration data are stored in advance.
- Images captured by the four cameras 11, 12, 13, and 14 are combined (joined), by processing described later, into a series of images continuously arranged in a ring shape so as to surround the vehicle 9 (the top-view image).
- In the composite image, the positions of the boundary portions where images captured by different cameras adjoin must be aligned as accurately as possible. Therefore, before shipment of the vehicle 9, the four cameras 11, 12, 13, and 14 are calibrated for each vehicle 9, and calibration data corresponding to the individual differences of the vehicle 9 (individual differences in camera installation state) are stored in the storage unit 20.
- The calibration is performed by actually imaging, with the four cameras 11, 12, 13, and 14, a road surface on which a specific pattern (a pattern suitable for identifying the amount of deviation between images in a composite image) is drawn. That is, a top-view image is actually generated from captured images, the amount of deviation between the images in the top-view image is identified, and the calibration data are set so that the deviation approaches zero.
- In the present embodiment, information on the coordinate position in three-dimensional space with respect to a fixed reference point, and on the pitch angle, yaw angle, and roll angle, is stored in advance in the storage unit 20 as the calibration data for each of the four cameras 11, 12, 13, and 14.
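As an illustration only, the per-camera calibration data described above could be held in a structure such as the following. The patent specifies only that a 3-D coordinate position and pitch/yaw/roll angles are stored per camera; the field names, units, keys, and numeric values below are all placeholders, not data from the document.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraCalibration:
    # Coordinate position in 3-D space with respect to a fixed reference
    # point (units assumed to be meters), plus the mounting angles
    # (assumed degrees). All names/units are illustrative assumptions.
    x_m: float
    y_m: float
    z_m: float
    pitch_deg: float
    yaw_deg: float
    roll_deg: float

# One entry per camera; the numbers are placeholders, not real data.
calibration_data = {
    "camera_11_front": CameraCalibration(0.0, 2.0, 0.8, -30.0, 0.0, 0.0),
    "camera_13_rear": CameraCalibration(0.0, -2.0, 0.9, -35.0, 180.0, 0.0),
}
```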
- The image generation unit 30 shown in FIG. 1 is configured using a microcomputer including a CPU 31, a ROM 32, a RAM 33, and the like; the CPU 31, as the processing subject (computer), executes processing according to a program recorded on a recording medium such as the ROM 32. Specifically, the image generation unit 30 performs position correction based on the calibration data and image density correction (density conversion) on the images captured by the four cameras 11, 12, 13, and 14 to generate the top-view image.
- the display unit 40 displays the top view image generated by the image generation unit 30 to a vehicle occupant (for example, a driver).
- the display unit 40 includes a display device (display) for displaying an image to a vehicle occupant.
- Next, the top-view image generation process (top-view image generation method) executed by the image generation unit 30 (specifically, the CPU 31) according to the program will be described with reference to the flowchart of FIG. 3. The top-view image generation process of FIG. 3 is executed periodically, at every imaging cycle of the four cameras 11, 12, 13, and 14 (for example, a short cycle in which imaging is performed a plurality of times per second).
- First, the image generation unit 30 acquires the images captured by the four cameras 11, 12, 13, and 14 from the imaging unit 10 (S101). The four cameras 11, 12, 13, and 14 capture images simultaneously (synchronously).
- Next, the image generation unit 30 converts each image captured by the cameras 11, 12, 13, and 14 into a top-view image, that is, an image transformed as if viewed from directly above and corrected for the distortion caused by wide-angle imaging (S102).
- That is, the image generation unit 30 generates a first top-view image 11B, a second top-view image 12B, a third top-view image 13B, and a fourth top-view image 14B by converting the images captured by the first camera 11, the second camera 12, the third camera 13, and the fourth camera 14, respectively.
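One common way to realize a conversion of this kind, sketched here only for illustration, is a precomputed per-pixel mapping table: for every output (top-view) pixel, the table names the source pixel to copy, and the table itself is derived offline from the camera's calibration and distortion model. The patent does not prescribe this implementation; the function below simply assumes such tables are given.

```python
import numpy as np

def remap_to_top_view(src, map_y, map_x):
    """Nearest-neighbour remap: for every output (top-view) pixel (i, j),
    look up the source pixel (map_y[i, j], map_x[i, j]).
    map_y / map_x are integer index arrays of the output shape, assumed
    to have been precomputed from calibration data (not shown here)."""
    return src[map_y, map_x]
```

For example, a table whose entries reverse both axes produces a 180-degree-rotated output, which is an easy sanity check for the indexing convention.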
- In FIGS. 4(A) and 4(C), to make the top-view images 11B, 12B, 13B, and 14B easy to see, the set of the first top-view image 11B and the third top-view image 13B and the set of the second top-view image 12B and the fourth top-view image 14B are shown separately.
- As shown, the left end portion of the first top-view image 11B overlaps the front end portion of the second top-view image 12B, and the right end portion of the first top-view image 11B overlaps the front end portion of the fourth top-view image 14B. Similarly, the left end portion of the third top-view image 13B overlaps the lower end portion of the second top-view image 12B, and the right end portion of the third top-view image 13B overlaps the lower end portion of the fourth top-view image 14B.
- The top-view image 10C is configured by arranging four partial images 11C, 12C, 13C, and 14C continuously in a ring shape around the vehicle 9 so as not to overlap one another. The image generation unit 30 cuts out the four partial images 11C, 12C, 13C, and 14C from the four top-view images 11B, 12B, 13B, and 14B, respectively, following the shape of the top-view image 10C shown in FIG. 4(B).
- Next, the image generation unit 30 executes a reference image setting process (S103) that sets, as the reference partial image, the one of the four partial images 11C, 12C, 13C, and 14C whose density characteristic satisfies a predetermined condition. The reference partial image serves as the reference for the density conversion described later and is itself not a target of density conversion. Specifically, the image generation unit 30 selects, from the four partial images 11C, 12C, 13C, and 14C, the partial image having the largest number N of pixels within a preset density range (densities TH1 to TH2).
- In the present embodiment, analysis target regions 11R, 12R, 13R, and 14R are set at positions near the center, where distortion is small, and the determination is made based on the pixels included in each of these regions.
- The density range (densities TH1 to TH2) is set at the center of the range from the minimum to the maximum density that a pixel can take, so that an image whose density is biased toward the minimum or maximum side is unlikely to be selected. Note that density can be restated as pixel value, gradation value, luminance, and so on.
- When such a partial image exists, the image generation unit 30 sets it as the reference partial image; at most one reference partial image is set. Conversely, when none of the four partial images 11C, 12C, 13C, and 14C satisfies the predetermined density characteristic, no reference partial image is set, and in that case, as described later, the correction (density conversion processing) of the partial images 11C, 12C, 13C, and 14C is not performed (S104: NO).
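The reference image setting process (S103) can be sketched as follows. This is a minimal NumPy sketch: the threshold values, the qualification rule (`min_count`), and the tie-breaking behavior of `argmax` are assumptions, since the patent only states that the density range is centered between the minimum and maximum densities and that a reference is set only when the predetermined condition is satisfied.

```python
import numpy as np

TH1, TH2 = 64, 192  # illustrative thresholds centred in the 0-255 range

def select_reference(regions, min_count=1):
    """Return the index of the partial image whose analysis target region
    (11R..14R) has the most pixels inside [TH1, TH2], or None when no
    partial image satisfies the condition (so no reference is set)."""
    counts = [int(np.count_nonzero((r >= TH1) & (r <= TH2))) for r in regions]
    best = int(np.argmax(counts))
    return best if counts[best] >= min_count else None
```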
- Next, the image generation unit 30 determines whether a reference partial image was set in the reference image setting process of S103 (S104). If it determines that a reference partial image was set (S104: YES), correction is performed on the partial images other than the reference partial image (hereinafter, "target partial images") among the partial images 11C, 12C, 13C, and 14C (S105 to S111).
- Specifically, the image generation unit 30 first identifies, in each of the images captured by the cameras 11, 12, 13, and 14, the overlapping images, that is, the partial areas (at both ends in this example) forming the overlapping regions 112, 123, 134, and 141 described above (see FIG. 2). Here, the overlapping images are identified in the top-view images 11B, 12B, 13B, and 14B described above.
- The image generation unit 30 then generates, for each overlapping image, a histogram representing the relationship between density (pixel value) and frequency as information representing the density characteristic (S105). Since each of the four overlapping regions 112, 123, 134, and 141 contains two overlapping images, there are eight overlapping images, and eight histograms are generated.
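The density histogram of S105 amounts to a frequency count per density value. A minimal sketch, assuming 8-bit densities:

```python
import numpy as np

def density_histogram(overlap_pixels):
    """Histogram of density (pixel value) versus frequency for one
    overlapping image (S105), assuming 8-bit densities (256 bins)."""
    return np.bincount(overlap_pixels.ravel(), minlength=256)
```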
- Next, for each overlapping region, the image generation unit 30 generates a density conversion table (lookup table; hereinafter, "LUT") for bringing the histogram B (density characteristic B) of one overlapping image B close to the histogram A (density characteristic A) of the other overlapping image A (S106). An LUT is generated for each of the four overlapping regions 112, 123, 134, and 141; that is, four LUTs are generated.
- There are various methods of generating an LUT that brings the histogram B close to the histogram A; in the present embodiment, the LUT is generated by the following procedures [M1] to [M4].
- [M1] The minimum density of the histogram A is set as the processing target density A, and the minimum density of the histogram B is set as the processing target density B.
- [M2] As shown in FIG. 5(A), the processing target density B is associated with the processing target density A.
- [M3] By repeating this association, the density of the histogram B corresponding to each density of the histogram A is determined.
- [M4] As shown in FIG. 5(B), the LUT is generated by connecting the determined corresponding points, using the density of the histogram B as the input density and the corresponding density of the histogram A as the output density.
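Procedures [M1] to [M4] pair up densities of the two histograms starting from their minimum densities. One way to implement such a pairing, sketched here as an assumption (the document does not spell out how unequal frequencies are split), is to match densities at equal cumulative frequency, which reduces the LUT construction to two cumulative sums and a search:

```python
import numpy as np

def build_lut(hist_a, hist_b):
    """Density conversion table (LUT) that brings histogram B close to
    histogram A: for each input density of B, the output is the density
    of A whose cumulative frequency first reaches the same level."""
    cdf_a = np.cumsum(hist_a) / np.sum(hist_a)
    cdf_b = np.cumsum(hist_b) / np.sum(hist_b)
    lut = np.searchsorted(cdf_a, cdf_b, side="left").clip(0, 255)
    return lut.astype(np.uint8)
```

With all of B's mass at density 50 and all of A's mass at density 100, the resulting table maps input density 50 to output density 100, as expected.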
- Next, the image generation unit 30 determines whether the density conversion processing (S110 or S111, described later) has been completed for all target partial images (S107). If it determines that the density conversion processing has not been completed for all target partial images (one or more target partial images before density conversion exist) (S107: NO), one of the target partial images before density conversion is selected as the processing target (S108). The target partial image selected here is hereinafter referred to as the "processing target image".
- The order in which processing target images are selected is set in advance according to the position of the reference partial image. In the present embodiment, the target partial images are selected sequentially in the circumferential direction (counterclockwise), starting from the target partial image adjacent to the reference partial image in the counterclockwise direction. This order is an example and is not particularly limited; at a minimum, the order is set so that a target partial image adjacent to neither the reference partial image nor an already-converted target partial image is never selected as the processing target image.
- Next, the image generation unit 30 determines whether one of the two partial images adjacent to the processing target image from both sides in the top-view image 10C (hereinafter, the "first adjacent partial image" and the "second adjacent partial image") is a target partial image before density conversion (S109). If it determines that one of the first adjacent partial image and the second adjacent partial image is a target partial image before density conversion (S109: YES), the image generation unit 30 executes the first density conversion process (S110) and returns to S107. If, on the other hand, it determines that neither the first adjacent partial image nor the second adjacent partial image is a target partial image before density conversion (each being either the reference partial image or a target partial image after density conversion) (S109: NO), the image generation unit 30 executes the second density conversion process (S111) and returns to S107.
- The first density conversion process converts the density of the processing target image using, as a reference, whichever of the first and second adjacent partial images is the reference partial image or a target partial image after density conversion (hereinafter simply the "adjacent partial image"). Specifically, density conversion is performed on the processing target image using the LUT, generated based on the overlapping region corresponding to the boundary between the processing target image and the adjacent partial image, that brings the processing target image closer to the adjacent partial image; the density of the entire processing target image is converted uniformly. As a result, discontinuity in density (a shift in density characteristics) at the boundary between the processing target image and the adjacent partial image is suppressed.
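The first density conversion process is simply a uniform application of one LUT to every pixel of the processing target image, which in array terms is a single table lookup:

```python
import numpy as np

def first_density_conversion(target_img, lut):
    """First density conversion process (S110): the single LUT toward the
    adjacent partial image is applied uniformly to every pixel of the
    processing target image (assumed 8-bit)."""
    return lut[target_img]
```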
- The second density conversion process converts the density of the processing target image using both the first adjacent partial image and the second adjacent partial image as references. Note that, at this point, the first adjacent partial image and the second adjacent partial image are each either the reference partial image or a target partial image after density conversion.
- the LUT generated based on the overlapping area is used among the pixels included in the processing target image 14C, for the pixels adjacent to the first adjacent partial image 11C, the boundary between the processing target image 14C and the first adjacent partial image 11C (line segment (P1 , P3)), the LUT generated based on the overlapping area is used.
- This LUT corrects the density characteristic of the processing target image 14C so that it approaches the density characteristic of the first adjacent partial image 11C, and is hereinafter referred to as the "first LUT".
- The overlapping area referred to here is the area where the subject regions of the target captured image 14B (the top view image that is the source, i.e., cutout source, of the processing target image 14C) and the first adjacent captured image 11B (the source of the first adjacent partial image 11C) coincide (hereinafter the "first common area"). That is, the first LUT corrects the density characteristic of the target captured image 14B in the first common area so that it approaches the density characteristic of the first adjacent captured image 11B in the first common area.
- Similarly, for pixels adjacent to the second adjacent partial image 13C, that is, pixels on the boundary between the processing target image 14C and the second adjacent partial image 13C (line segment (P2, P4)), an LUT generated based on the corresponding overlapping area is used.
- This LUT corrects the density characteristic of the processing target image 14C so that it approaches the density characteristic of the second adjacent partial image 13C, and is hereinafter referred to as the "second LUT".
- The overlapping area referred to here is the area where the subject regions of the target captured image 14B and the second adjacent captured image 13B (the source, i.e., cutout source, of the second adjacent partial image 13C) coincide (hereinafter the "second common area"). That is, the second LUT corrects the density characteristic of the target captured image 14B in the second common area so that it approaches the density characteristic of the second adjacent captured image 13B in the second common area.
- For the other pixels, a combined LUT obtained by combining the first LUT and the second LUT is used.
- Specifically, the combined LUT for converting the density of the processing target image 14C is set so that the closer a pixel in the processing target image 14C is to the first adjacent partial image 11C, the greater the influence of the first LUT relative to that of the second LUT. Conversely, the closer a pixel in the processing target image 14C is to the second adjacent partial image 13C, the greater the influence of the second LUT relative to that of the first LUT. That is, the combined LUT varies with the pixel position in the processing target image 14C. For example, for a pixel located midway between the first adjacent partial image 11C and the second adjacent partial image 13C, a combined LUT obtained by averaging the first LUT and the second LUT with equal weights is used.
- For example, the weight W1 of the first LUT and the weight W2 of the second LUT may be changed linearly according to the position on the line segment (the distance to the points P1 and P2).
- For pixels included in the rectangular region having the line segment (P3, P4) as one side and the line segment (P5, P6), which is a part of the line segment (P1, P2), as the opposite side, the weights may be set so that all pixels on a straight line parallel to the line segment (P3, P5) share the same weight.
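The linear weighting and the resulting combined-LUT output can be sketched as follows. This is an illustrative sketch under stated assumptions: the coordinates, the sample LUTs (one pulling toward a darker neighbor, one toward a brighter neighbor), and the function names are not taken from the patent.

```python
def segment_weights(x, x1, x2):
    # Linear weights along a line segment running from x1 (the end
    # adjacent to the first adjacent partial image) to x2 (the end
    # adjacent to the second): W1 falls from 1 to 0 as the pixel
    # moves from x1 to x2, and W2 = 1 - W1.
    w1 = (x2 - x) / (x2 - x1)
    return w1, 1.0 - w1

def blended_density(d, lut1, lut2, w1):
    # Combined-LUT output for one pixel of density d: a weighted mix
    # of the first and second LUT outputs.
    return round(w1 * lut1[d] + (1.0 - w1) * lut2[d])

# Hypothetical LUTs pulling toward a darker / brighter neighbor.
lut1 = [max(d - 20, 0) for d in range(256)]
lut2 = [min(d + 20, 255) for d in range(256)]

w1, w2 = segment_weights(50, 0, 100)         # pixel midway along the segment
print(w1, w2)                                # 0.5 0.5
print(blended_density(128, lut1, lut2, w1))  # 128: equal-weight average
```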
- When the image generation unit 30 determines in S107 described above that the density conversion process has been completed for all target partial images (S107: YES), it shifts the processing to S112. When the image generation unit 30 determines in S104 described above that no reference partial image has been set (S104: NO), it skips the density conversion processing (S105 to S111) and shifts to S112.
- In S112, the image generation unit 30 generates a top view image in which the four partial images 11C, 12C, 13C, and 14C (the images after density conversion, when density conversion has been performed) are continuously arranged in a ring. The image generation unit 30 then displays the generated top view image on the display unit 40 (S113) and ends the top view image generation process of FIG. 3.
- As described above, in the present embodiment, the image generation unit 30 generates a first LUT for correcting the density characteristic of the target partial image so that it approaches the density characteristic of the first adjacent partial image, and a second LUT for correcting the density characteristic of the target partial image so that it approaches the density characteristic of the second adjacent partial image (S105 to S106). The image generation unit 30 then converts the density of at least one target partial image using both the first LUT and the second LUT (S111). Specifically, the image generation unit 30 converts the density of the target partial image so that the closer a pixel's position in the target partial image is to the first adjacent partial image, the greater the influence of the first LUT relative to that of the second LUT, and the closer a pixel's position is to the second adjacent partial image, the greater the influence of the second LUT relative to that of the first LUT.
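For a ring of four partial images with a single reference (the configuration of this embodiment), the S109 branching implies the conversion order sketched below. The indices and the helper name are illustrative assumptions, not the patent's notation.

```python
def conversion_order(ref):
    # Order in which the S107-S111 loop would convert a ring of four
    # partial images indexed 0..3, given the index of the reference:
    # each ring-neighbor of the reference has exactly one converted
    # neighbor, so it gets the first (single-reference) conversion;
    # the image opposite the reference then has both neighbors
    # converted and gets the second (blended) conversion.
    left, right = (ref - 1) % 4, (ref + 1) % 4
    return [(left, "first"), (right, "first"), ((ref + 2) % 4, "second")]

print(conversion_order(0))  # [(3, 'first'), (1, 'first'), (2, 'second')]
```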
- Therefore, according to the present embodiment, it is possible to suppress shifts in density characteristics at all boundary portions of a composite image (top view image) in which a plurality of partial images are continuously arranged in a ring.
- In addition, since the image itself is corrected digitally, the adjustment can be made with higher accuracy than in a configuration in which, for example, the gain of the camera is corrected in an analog manner.
- In the present embodiment, the image generation unit 30 sets, as the reference partial image, the partial image among the plurality of partial images 11C, 12C, 13C, and 14C whose density characteristic satisfies a predetermined condition (S103). Therefore, according to the present embodiment, the partial images 11C, 12C, 13C, and 14C can be corrected more appropriately than when the reference partial image is fixed (preset). That is, the partial image that should be used as the reference partial image may vary depending on the environment around the vehicle (how light strikes the vehicle, e.g., daytime versus nighttime, or whether there is an oncoming vehicle).
- When the reference partial image is fixed, even a partial image that is not appropriate as the reference (for example, an image that is too dark or too bright overall) is used as the reference partial image. In contrast, in the present embodiment, since an image whose density characteristic satisfies a predetermined condition is set as the reference partial image, a partial image suitable as the reference can be selected from among the plurality of partial images 11C, 12C, 13C, and 14C.
- In the present embodiment, when none of the plurality of partial images 11C, 12C, 13C, and 14C satisfies the predetermined condition for being set as the reference partial image, the image generation unit 30 does not correct the partial images 11C, 12C, 13C, and 14C (S104). Therefore, according to the present embodiment, it is possible to prevent an improper composite image from being produced by performing correction even though no partial image appropriate as the reference exists among the plurality of partial images.
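The reference selection of S103 and the no-reference case of S104 might be sketched as below. The excerpt does not define the predetermined condition precisely, so counting pixels in a mid-range density band and the threshold values are assumptions for illustration only.

```python
def pick_reference(images, lo=50, hi=200):
    # Count, per partial image, the pixels whose density falls in an
    # assumed "proper" range [lo, hi]; pick the image with the most
    # such pixels as the reference, or return None when no image has
    # any (in which case no correction is performed, as in S104: NO).
    def n_proper(img):
        return sum(1 for row in img for p in row if lo <= p <= hi)
    counts = [n_proper(img) for img in images]
    best = max(range(len(images)), key=counts.__getitem__)
    return best if counts[best] > 0 else None

imgs = [[[10, 240]], [[100, 120]], [[60, 250]]]
print(pick_reference(imgs))           # 1 (two mid-range pixels)
print(pick_reference([[[0, 255]]]))   # None (no image qualifies)
```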
- In the present embodiment, the image generation unit 30 generates a top view image 10C of the periphery of the vehicle 9 using a plurality of captured images obtained by capturing the periphery of the vehicle 9 with the plurality of cameras 11, 12, 13, and 14 mounted on the vehicle 9. Therefore, according to the present embodiment, a more natural top view image can be presented to the driver.
- In the present embodiment, the image generation unit 30 corresponds to an example of an image generation apparatus.
- S103 to S111 correspond to an example of processing as a correction unit and of a correction step.
- S103 corresponds to an example of processing as a setting unit.
- S105 to S106 correspond to an example of processing as a generation unit and of a generation step.
- S110 corresponds to an example of processing as a fixed conversion unit.
- S111 corresponds to an example of processing as a mixing conversion unit and of a mixing conversion step.
- The cameras 11, 12, 13, and 14 correspond to an example of an imaging device.
- The top view image 10C corresponds to an example of a composite image.
- The first LUT corresponds to an example of a first conversion table.
- The second LUT corresponds to an example of a second conversion table.
- The LUT generation method is not limited to the method exemplified in the above embodiment; other known methods may be used. For example, as described in Patent Document 1 above, an LUT that executes histogram equalization may be generated, and the same correction processing may be performed.
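One known way to derive such an LUT from the density histograms of the two overlap areas is classic histogram matching. The sketch below is illustrative only and not necessarily the patent's exact procedure; the toy histograms are assumptions.

```python
def matching_lut(src_hist, ref_hist):
    # Build a 256-entry LUT mapping each source density to the
    # reference density whose cumulative frequency first reaches the
    # source density's cumulative frequency (histogram matching).
    def cdf(hist):
        total, acc, out = sum(hist), 0, []
        for h in hist:
            acc += h
            out.append(acc / total)
        return out
    src_cdf, ref_cdf = cdf(src_hist), cdf(ref_hist)
    lut, j = [], 0
    for s in src_cdf:
        while j < 255 and ref_cdf[j] < s:
            j += 1
        lut.append(j)
    return lut

# Toy histograms: all overlap-area mass at density 100 in the target
# image and at density 150 in the adjacent image.
src = [0] * 256; src[100] = 10
ref = [0] * 256; ref[150] = 10
lut = matching_lut(src, ref)
print(lut[100])  # 150: density 100 is mapped toward the neighbor
```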
- The density conversion method using the combined LUT is not limited to the method exemplified in the above embodiment.
- For example, the weights may be set so that all pixels on a straight line parallel to the line segment (P3, P5) have the same weight as the corresponding pixel on the line segment (P1, P2). That is, the method of setting weights for the pixels included in the two triangular regions (triangle (P1, P3, P5) and triangle (P2, P4, P6)) may differ from that of the above embodiment.
- Alternatively, in the processing target image 14C, for each straight line parallel to the line segment (P1, P2), both ends of the straight line may be treated as P1 and P2 in FIG. 7, and weights may be set for the pixels on that straight line. In this case as well, the weight W1 of the first LUT and the weight W2 of the second LUT may be changed linearly according to the position on the line segment (the distance to the points P5 and P6).
- The weights may also be set so that all pixels on a straight line parallel to the line segment (P3, P5) have the same weight.
- the weights may be set so that the weights are maximized.
- The processing target image 14C may be divided into a plurality of regions, and a weight may be set for each region.
- The timing for setting the reference partial image is not limited to every imaging cycle; the reference partial image may instead be prevented from changing frequently. For example, the reference partial image may be fixed for a certain period. Alternatively, a condition for changing the reference partial image that is more severe than the condition for keeping the same reference partial image may be set, for example, that the current reference partial image fails to satisfy the criteria for selection continuously for a certain period thereafter.
- Here, "the same reference partial image" means a partial image whose arrangement in the top view image 10C is the same, and "changing the reference partial image" means selecting, as the reference partial image, a partial image whose arrangement in the top view image 10C differs.
- In the above embodiment, the analysis target areas 11R, 12R, 13R, and 14R set in the partial images 11C, 12C, 13C, and 14C, respectively, are used, and the determination is made based on the pixels included in these areas (S103); however, the present invention is not limited to this. For example, the determination may be made based on all the pixels included in the partial images 11C, 12C, 13C, and 14C.
- In the above embodiment, at most one reference partial image is set (S103); however, the present invention is not limited to this, and a plurality of reference partial images may be set. For example, in S103, the image with the largest number of pixels N and the image with the second largest number of pixels may be selected as reference partial images.
- In the above embodiment, there are cases where no reference partial image is set (S103); however, the present invention is not limited to this. One or more reference partial images may always be set, and the density conversion process may always be performed based on a reference partial image.
- In the above embodiment, the reference partial image can be changed; however, the present invention is not limited to this, and a partial image at a predetermined position in the top view image 10C may always be used as the reference partial image.
- The positions and number of cameras are not limited to those exemplified in the above embodiment.
- The present invention can also be applied to composite images other than the top view image.
- The present invention can be realized in various forms, such as an image generation apparatus, an image generation method, an image generation program, a recording medium on which the image generation program is recorded, and a system including the image generation apparatus as a component.
- Each section is expressed, for example, as S100.
- Each section can be divided into a plurality of subsections, while a plurality of sections can be combined into a single section.
- Each section configured in this manner can also be referred to as a device, module, or means.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Geometry (AREA)
Abstract
Description
The top view image generation device 1 shown in FIG. 1 is a device mounted on a vehicle 9 (see FIG. 2) that generates a top view image, i.e., an image of the vehicle 9 viewed from above (as if looking down from the sky), and displays it for the driver's visual confirmation. The top view image generation device 1 of the present embodiment includes an imaging unit 10, a storage unit 20, an image generation unit 30, and a display unit 40.
Next, the top view image generation process (top view image generation method) that the image generation unit 30 (specifically, the CPU 31) executes according to a program will be described with reference to the flowchart of FIG. 3. The top view image generation process of FIG. 3 is executed periodically at every imaging cycle of the four cameras 11, 12, 13, and 14 (for example, a short cycle allowing multiple captures per second).
[M4] When the above expression (1) is satisfied, the processing target density B currently associated with the processing target density A is fixed; then, after the processing target density A and the processing target density B are each changed to the next higher density, the process returns to [M2] above.
According to the embodiment described in detail above, the following effects are obtained.
Although an embodiment of the present invention has been described above, the present invention is, needless to say, not limited to the above embodiment and can take various forms.
Claims (7)
- An image generation device (30) for generating, using a plurality of captured images whose subject areas partially overlap, a composite image (10C) in which partial images (11C, 12C, 13C, 14C) cut out from the captured images are arranged continuously in a ring, the device comprising:
a correction means (S103 to S111) for taking at least one of the plurality of partial images as a reference partial image and the partial images other than the reference partial image as target partial images, and correcting a target partial image with reference to at least one of a first adjacent partial image and a second adjacent partial image, which are the partial images adjacent to the target partial image from both sides in the composite image;
a generation means (S105 to S106) for generating a first conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the first adjacent partial image, and a second conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the second adjacent partial image; and
a mixing conversion means (S111) for converting the density of the target partial image using both the first conversion table and the second conversion table,
wherein the generation means
generates, as the first conversion table, a density conversion table for correcting the density characteristic of a target captured image, which is the captured image from which the target partial image is cut out, in a first common area where the subject areas of the target captured image and a first adjacent captured image, which is the captured image from which the first adjacent partial image is cut out, coincide, so as to approach the density characteristic of the first adjacent captured image in the first common area, and
generates, as the second conversion table, a density conversion table for correcting the density characteristic of the target captured image in a second common area where the subject areas of the target captured image and a second adjacent captured image, which is the captured image from which the second adjacent partial image is cut out, coincide, so as to approach the density characteristic of the second adjacent captured image in the second common area, and
the mixing conversion means converts the density of the target partial image so that the closer a pixel's position in the target partial image is to the first adjacent partial image, the greater the influence of the first conversion table relative to that of the second conversion table, and the closer a pixel's position in the target partial image is to the second adjacent partial image, the greater the influence of the second conversion table relative to that of the first conversion table.
- The image generation device according to claim 1, further comprising a setting means (S103) for setting, as the reference partial image, a partial image among the plurality of partial images whose density characteristic satisfies a predetermined condition.
- The image generation device according to claim 2, wherein the correction means does not correct the partial images when no partial image satisfying the predetermined condition exists among the plurality of partial images (S104).
- The image generation device according to any one of claims 1 to 3, further comprising a fixed conversion means (S110) for converting the density of the target partial image using one of the first conversion table and the second conversion table,
wherein the correction means converts the density of the target partial image by the fixed conversion means when one of the first adjacent partial image and the second adjacent partial image is either the reference partial image or a target partial image after density conversion and the other is a target partial image before density conversion, and
the correction means converts the density of the target partial image by the mixing conversion means when the first adjacent partial image and the second adjacent partial image are both either the reference partial image or a target partial image after density conversion.
- The image generation device according to any one of claims 1 to 4, wherein the plurality of captured images are captured by a plurality of imaging devices mounted on a vehicle, and the composite image is an image of the surroundings of the vehicle viewed from above.
- An image generation method used in an image generation device (30) for generating, using a plurality of captured images whose subject areas partially overlap, a composite image (10C) in which partial images (11C, 12C, 13C, 14C) cut out from the captured images are arranged continuously in a ring, the method comprising:
a correction step (S103 to S111) of taking at least one of the plurality of partial images as a reference partial image and the partial images other than the reference partial image as target partial images, and correcting a target partial image with reference to at least one of a first adjacent partial image and a second adjacent partial image, which are the partial images adjacent to the target partial image from both sides in the composite image;
a generation step (S105 to S106) of generating a first conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the first adjacent partial image, and a second conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the second adjacent partial image; and
a mixing conversion step (S111) of converting the density of the target partial image using both the first conversion table and the second conversion table,
wherein in the generation step,
a density conversion table for correcting the density characteristic of a target captured image, which is the captured image from which the target partial image is cut out, in a first common area where the subject areas of the target captured image and a first adjacent captured image, which is the captured image from which the first adjacent partial image is cut out, coincide, so as to approach the density characteristic of the first adjacent captured image in the first common area, is generated as the first conversion table, and
a density conversion table for correcting the density characteristic of the target captured image in a second common area where the subject areas of the target captured image and a second adjacent captured image, which is the captured image from which the second adjacent partial image is cut out, coincide, so as to approach the density characteristic of the second adjacent captured image in the second common area, is generated as the second conversion table, and
in the mixing conversion step, the density of the target partial image is converted so that the closer a pixel's position in the target partial image is to the first adjacent partial image, the greater the influence of the first conversion table relative to that of the second conversion table, and the closer a pixel's position in the target partial image is to the second adjacent partial image, the greater the influence of the second conversion table relative to that of the first conversion table.
- A non-transitory computer-readable storage medium containing instructions for causing a computer to function as an image generation device (30) for generating, using a plurality of captured images whose subject areas partially overlap, a composite image (10C) in which partial images (11C, 12C, 13C, 14C) cut out from the captured images are arranged continuously in a ring,
the instructions comprising:
a correction step (S103 to S111) of taking at least one of the plurality of partial images as a reference partial image and the partial images other than the reference partial image as target partial images, and correcting a target partial image with reference to at least one of a first adjacent partial image and a second adjacent partial image, which are the partial images adjacent to the target partial image from both sides in the composite image;
a generation step (S105 to S106) of generating a first conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the first adjacent partial image, and a second conversion table, which is a density conversion table for correcting the density characteristic of the target partial image so as to approach the density characteristic of the second adjacent partial image; and
a mixing conversion step (S111) of converting the density of the target partial image using both the first conversion table and the second conversion table,
wherein in the generation step,
a density conversion table for correcting the density characteristic of a target captured image, which is the captured image from which the target partial image is cut out, in a first common area where the subject areas of the target captured image and a first adjacent captured image, which is the captured image from which the first adjacent partial image is cut out, coincide, so as to approach the density characteristic of the first adjacent captured image in the first common area, is generated as the first conversion table, and
a density conversion table for correcting the density characteristic of the target captured image in a second common area where the subject areas of the target captured image and a second adjacent captured image, which is the captured image from which the second adjacent partial image is cut out, coincide, so as to approach the density characteristic of the second adjacent captured image in the second common area, is generated as the second conversion table, and
in the mixing conversion step, the density of the target partial image is converted so that the closer a pixel's position in the target partial image is to the first adjacent partial image, the greater the influence of the first conversion table relative to that of the second conversion table, and the closer a pixel's position in the target partial image is to the second adjacent partial image, the greater the influence of the second conversion table relative to that of the first conversion table.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/902,938 US9691142B2 (en) | 2013-07-08 | 2014-06-23 | Image generating device, image generating method, and non-transitory computer-readable storage medium |
CN201480038750.6A CN105431883B (zh) | 2013-07-08 | 2014-06-23 | 图像生成装置、图像生成方法以及用于图像生成的非暂时性的计算机可读存储介质 |
DE112014003174.7T DE112014003174B4 (de) | 2013-07-08 | 2014-06-23 | Bilderzeugungsvorrichtung, Bilderzeugungsverfahren und nichtflüchtiges computerlesbares Speichermedium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013142726A JP5929849B2 (ja) | 2013-07-08 | 2013-07-08 | 画像生成装置、画像生成方法及び画像生成プログラム |
JP2013-142726 | 2013-07-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015004866A1 true WO2015004866A1 (ja) | 2015-01-15 |
Family
ID=52279576
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003347 WO2015004866A1 (ja) | 2013-07-08 | 2014-06-23 | 画像生成装置、画像生成方法及び、そのための非遷移のコンピュータ読み取り可能記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9691142B2 (ja) |
JP (1) | JP5929849B2 (ja) |
CN (1) | CN105431883B (ja) |
DE (1) | DE112014003174B4 (ja) |
WO (1) | WO2015004866A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9691142B2 (en) | 2013-07-08 | 2017-06-27 | Denso Corporation | Image generating device, image generating method, and non-transitory computer-readable storage medium |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10614603B2 (en) * | 2016-09-19 | 2020-04-07 | Qualcomm Incorporated | Color normalization for a multi-camera system |
CN111028192B (zh) * | 2019-12-18 | 2023-08-08 | 维沃移动通信(杭州)有限公司 | 一种图像合成方法及电子设备 |
US11681297B2 (en) * | 2021-03-02 | 2023-06-20 | Southwest Research Institute | Autonomous vehicle and self-location estimating method in autonomous vehicle |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09321972A (ja) * | 1996-05-28 | 1997-12-12 | Canon Inc | 画像合成装置及び方法 |
JP2011110247A (ja) * | 2009-11-27 | 2011-06-09 | Shimadzu Corp | X線撮影装置およびx線撮影方法 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5982951A (en) | 1996-05-28 | 1999-11-09 | Canon Kabushiki Kaisha | Apparatus and method for combining a plurality of images |
US7015954B1 (en) * | 1999-08-09 | 2006-03-21 | Fuji Xerox Co., Ltd. | Automatic video system using multiple cameras |
JP3948229B2 (ja) | 2001-08-01 | 2007-07-25 | ソニー株式会社 | 画像撮像装置及び方法 |
US7352887B2 (en) * | 2003-04-11 | 2008-04-01 | Hologic, Inc. | Scatter rejection for composite medical imaging systems |
JP4085283B2 (ja) * | 2005-02-14 | 2008-05-14 | セイコーエプソン株式会社 | 画像処理システム、プロジェクタ、プログラム、情報記憶媒体および画像処理方法 |
JP4982127B2 (ja) | 2006-07-19 | 2012-07-25 | クラリオン株式会社 | 画像表示システム |
EP2355037A1 (en) | 2009-12-18 | 2011-08-10 | Nxp B.V. | Method of and system for determining an average colour value for pixels |
WO2012056518A1 (ja) * | 2010-10-26 | 2012-05-03 | 株式会社モルフォ | 画像処理装置、画像処理方法及び画像処理プログラム |
JP6102930B2 (ja) * | 2011-10-14 | 2017-03-29 | オムロン株式会社 | 射影空間監視のための方法および装置 |
JP2013142726A (ja) | 2012-01-06 | 2013-07-22 | Canon Inc | 電子写真感光体の製造装置 |
KR102062921B1 (ko) * | 2013-05-14 | 2020-01-06 | 현대모비스 주식회사 | 다수 영상의 밝기 균일화 방법 |
JP5929849B2 (ja) | 2013-07-08 | 2016-06-08 | 株式会社デンソー | 画像生成装置、画像生成方法及び画像生成プログラム |
US9426365B2 (en) * | 2013-11-01 | 2016-08-23 | The Lightco Inc. | Image stabilization related methods and apparatus |
-
2013
- 2013-07-08 JP JP2013142726A patent/JP5929849B2/ja active Active
-
2014
- 2014-06-23 US US14/902,938 patent/US9691142B2/en active Active
- 2014-06-23 CN CN201480038750.6A patent/CN105431883B/zh active Active
- 2014-06-23 WO PCT/JP2014/003347 patent/WO2015004866A1/ja active Application Filing
- 2014-06-23 DE DE112014003174.7T patent/DE112014003174B4/de active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09321972A (ja) * | 1996-05-28 | 1997-12-12 | Canon Inc | 画像合成装置及び方法 |
JP2011110247A (ja) * | 2009-11-27 | 2011-06-09 | Shimadzu Corp | X線撮影装置およびx線撮影方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9691142B2 (en) | 2013-07-08 | 2017-06-27 | Denso Corporation | Image generating device, image generating method, and non-transitory computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
DE112014003174T5 (de) | 2016-03-24 |
DE112014003174B4 (de) | 2023-07-27 |
US20160155219A1 (en) | 2016-06-02 |
CN105431883B (zh) | 2018-09-04 |
JP5929849B2 (ja) | 2016-06-08 |
US9691142B2 (en) | 2017-06-27 |
CN105431883A (zh) | 2016-03-23 |
JP2015014997A (ja) | 2015-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9544562B2 (en) | Converting an image from a dual-band sensor to a visible color image | |
WO2015004866A1 (ja) | 画像生成装置、画像生成方法及び、そのための非遷移のコンピュータ読み取り可能記憶媒体 | |
US10007853B2 (en) | Image generation device for monitoring surroundings of vehicle | |
JP6269286B2 (ja) | 画像ノイズ低減方法及び画像ノイズ低減装置 | |
US9135688B2 (en) | Method for brightness equalization of various images | |
EP3110128A1 (en) | Color gamut mapping based on the mapping of cusp colors obtained through simplified cusp lines | |
US10270942B2 (en) | Method of mapping source colors of an image in a chromaticity plane | |
US10602027B2 (en) | Color gamut mapping using a lightness mapping based also on the lightness of cusp colors belonging to different constant-hue leaves | |
JP6255928B2 (ja) | 俯瞰画像生成装置 | |
US9639916B2 (en) | Image processing device, and image processing method | |
JP5020792B2 (ja) | 合成画像生成装置および合成画像生成方法 | |
US8861850B2 (en) | Digital image color correction | |
EP3701490B1 (en) | Method and system of fast image blending for overlapping regions in surround view | |
KR101469717B1 (ko) | 광각 카메라를 이용한 영상 시스템 및 그 방법 | |
JP5202749B1 (ja) | 画像処理方法 | |
JP6957665B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
US11410274B2 (en) | Information processing device and program | |
JP2008042664A (ja) | 画像表示装置 | |
JP2020102755A (ja) | 半導体装置、画像処理方法およびプログラム | |
JP6896658B2 (ja) | ホワイトバランス調整装置及びホワイトバランス調整方法 | |
EP4266669A1 (en) | Imaging device and image processing device | |
JP6978720B2 (ja) | 画像処理装置 | |
JP2019067097A (ja) | 画像処理装置、画像処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480038750.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14823364 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14902938 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112014003174 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14823364 Country of ref document: EP Kind code of ref document: A1 |