WO2015064095A1 - Image correction parameter output device, camera system, and correction parameter output method - Google Patents


Info

Publication number
WO2015064095A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, correction parameter, control information, images, moving body
Application number: PCT/JP2014/005471
Other languages: French (fr), Japanese (ja)
Inventors: 崇功 金谷, 理洋 森島, 貴裕 岡田
Original Assignee: Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Kyocera Corporation (京セラ株式会社)
Priority to EP14858173.9A (EP3065390B1)
Priority to PCT/JP2014/005471
Priority to US15/032,910 (US10097733B2)
Priority to JP2015544805A (JPWO2015064095A1)
Publication of WO2015064095A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • H04N1/6027 Correction or control of colour gradation or colour contrast
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/75 Circuitry for compensating brightness variation in the scene by influencing optical camera components

Definitions

  • the present invention relates to an image correction parameter output device, a camera system, and a correction parameter output method for outputting parameters used for correcting a captured image.
  • the visibility of the combined image may be lowered depending on the environment around the moving body and the state of the moving body.
  • the value of an appropriate correction parameter that can ensure visibility for various light sources around the moving body is not necessarily constant.
  • the combined image may not be able to ensure sufficient visibility for various states of the moving body.
  • an object of the present invention, made in view of such circumstances, is to provide an image correction parameter output device, a camera system, and a correction parameter output method that output image correction parameters capable of improving the visibility of a combined image of the area around a moving body.
  • an image correction parameter output device according to the present invention includes: a storage unit that stores correction parameter groups of a plurality of patterns, for correcting a plurality of images captured so as to partially overlap in the peripheral area of a moving body, in association with control information of the moving body; a control information acquisition unit that acquires the control information of the moving body; and an output unit that outputs the correction parameter group corresponding to the acquired control information.
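The claimed device can be pictured as a lookup from vehicle control information to a stored parameter group. The following is a minimal sketch under that reading; the state keys, parameter fields, and camera names are hypothetical, not taken from the patent.

```python
# Storage unit: correction parameter groups of plural patterns, keyed by a
# state derived from control information. All values are illustrative.
CORRECTION_PARAMETER_GROUPS = {
    "daytime": {
        "front": {"color_gain": 1.0, "luminance_gain": 1.0},
        "rear": {"color_gain": 1.0, "luminance_gain": 1.0},
        "left": {"color_gain": 1.0, "luminance_gain": 1.0},
        "right": {"color_gain": 1.0, "luminance_gain": 1.0},
    },
    "night_headlamp_on": {
        "front": {"color_gain": 1.0, "luminance_gain": 1.0},  # reference value
        "rear": {"color_gain": 1.2, "luminance_gain": 1.3},   # raise saturation/luminance
        "left": {"color_gain": 1.2, "luminance_gain": 1.3},
        "right": {"color_gain": 1.2, "luminance_gain": 1.3},
    },
}

def output_correction_parameters(control_info: dict) -> dict:
    """Output unit: return the parameter group matching the acquired control information."""
    state = "night_headlamp_on" if control_info.get("headlamp_on") else "daytime"
    return CORRECTION_PARAMETER_GROUPS[state]
```

A real implementation would key on richer control information (time, each lamp's state, traveling direction), as described later in the text.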
  • the output unit preferably outputs different correction parameter groups depending on whether or not the control information includes information indicating lighting of the moving body.
  • the correction parameter group preferably includes a color correction parameter for correcting so as to reduce a color difference between the plurality of images.
  • furthermore, the correction parameter output device preferably includes: an image acquisition unit that acquires the plurality of images corrected based on the correction parameter group output by the output unit; and an adjustment parameter calculation unit that takes one of the plurality of corrected images as a reference image and calculates a color adjustment parameter for adjusting another corrected image based on the color signal components in the overlapping area between the reference image and that other corrected image. The output unit preferably outputs the calculated color adjustment parameter.
  • furthermore, preferably, the image acquisition unit acquires an image adjusted based on the color adjustment parameter output by the output unit, and the adjustment parameter calculation unit calculates a color adjustment parameter for adjusting a corrected image different from the reference image, based on the color signal components in the overlapping area between the adjusted image and that corrected image. The output unit preferably outputs the calculated color adjustment parameter.
  • the reference image is preferably determined from among the plurality of corrected images based on the control information.
  • more preferably, the reference image is determined from among the plurality of corrected images based on information, included in the control information, indicating the traveling direction of the moving body.
  • a camera system according to the present invention includes: a plurality of imaging units that generate a plurality of images capturing the peripheral area of a moving body so as to partially overlap; a storage unit that stores correction parameter groups of a plurality of patterns for correcting the plurality of images in association with control information of the moving body; a control information acquisition unit that acquires the control information of the moving body; an output unit that outputs the correction parameter group corresponding to the acquired control information; an image processing unit that corrects the plurality of images based on the correction parameters output by the output unit; and an image combining unit that combines the plurality of corrected images to generate a combined image.
  • an image correction parameter output method according to the present invention includes the steps of: storing correction parameter groups of a plurality of patterns, for correcting a plurality of images captured so as to partially overlap in the peripheral area of a moving body, in association with control information of the moving body; acquiring the control information of the moving body; and outputting the correction parameter group corresponding to the acquired control information.
  • according to the image correction parameter output device, the camera system, and the correction parameter output method of the present invention, it is possible to output image correction parameters that improve the visibility of a combined image of the area around the moving body.
  • FIG. 1 is a functional block diagram showing a schematic configuration of a camera system according to the first embodiment of the present invention.
  • the camera system 100 includes an imaging device 10 and a display device 11.
  • the imaging device 10 includes a plurality of imaging units, in the present embodiment, for example, a front camera 12, a rear camera 13, and a side camera 14 (a left side camera 14L and a right side camera 14R).
  • the display device 11 is disposed at a position that can be viewed from the driver's seat.
  • the front camera 12 is arranged so as to be able to image a peripheral region in front of the moving body 15.
  • the rear camera 13 is arranged so as to be able to image a peripheral region behind the moving body 15.
  • the side cameras 14 are arranged, for example on the left and right door mirrors 16, so as to be able to image the peripheral regions on the sides of the moving body 15 vertically downward. The side cameras 14 are arranged symmetrically on the left and right sides of the moving body 15.
  • the front camera 12, the rear camera 13, and the side camera 14 are provided with a lens having a wide angle of view such as a fisheye lens, for example, and a peripheral region of the moving body 15 can be photographed at a wide angle.
  • the imaging range of the front camera 12 includes a front area FA of the moving body 15.
  • the imaging range of the rear camera 13 includes a rear area ReA of the moving body 15.
  • the imaging range of the left side camera 14L and the imaging range of the right side camera 14R include a left side area LA and a right side area RA of the moving body 15, respectively.
  • the imaging ranges of the cameras 12, 13, 14L, and 14R include regions around the four corners of the moving body 15 overlapping each other.
  • the imaging range of the front camera 12 and the imaging range of the side cameras 14 (14L, 14R) include the left front area FLA and the right front area FRA of the moving body 15 overlapping each other.
  • the imaging range of the rear camera 13 and the imaging range of the side cameras 14 (14L, 14R) include the left rear region ReLA and the right rear region ReRA of the moving body 15 overlapping each other.
  • the area around the moving body 15 where the imaging ranges of the cameras 12, 13, 14L, and 14R overlap each other is referred to as an overlapping area (FLA, FRA, ReLA, ReRA).
  • the front camera 12 includes an optical system 17a, an imager 18a, an image processing unit 19a, a camera control unit 20a, an image combining device (image combining unit) 21, and a correction parameter output device 22 (see FIG. 1).
  • the optical system 17a includes a plurality of lenses and forms a subject image.
  • the optical system 17a has a wide angle of view, and can form a subject image included in the peripheral region of the moving body 15 as described above.
  • the imager 18a is, for example, a CMOS image sensor, and generates an image obtained by capturing a subject image formed by the optical system 17a.
  • the image processing unit 19a performs image processing such as image conversion, color correction, gamma correction, and luminance correction on the image generated by the imager 18a.
  • the image processing unit 19a outputs an image subjected to image processing.
  • the image processing unit 19a converts the wide-angle captured image generated by the imager 18a into an overhead image by image conversion. That is, a captured image generated by wide-angle imaging and generally distorted in the peripheral portion of the image is converted into an overhead image when the peripheral area of the moving body 15 is viewed from above the moving body 15 vertically downward. Specifically, the image processing unit 19a converts the image captured by the imager 18a into an overhead image in the front area FA and the overlapping areas FLA and FRA (see FIG. 3) of the moving body 15.
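The overhead conversion described above amounts to a projective remap of image coordinates onto the ground plane. The patent does not specify the transform, so the following is only a sketch of the underlying coordinate mapping, assuming a known 3x3 homography; the matrices here are purely illustrative placeholders for calibration results.

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) through a row-major 3x3 homography H."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Divide by the homogeneous coordinate to get the mapped image point.
    return xh / w, yh / w

# The identity homography leaves points unchanged; a real overhead transform
# would be derived from camera calibration and would also undo the fisheye
# distortion before warping the image onto the top-down view.
H_IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```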
  • the image processing unit 19a corrects the color of the captured image or the overhead image by color correction.
  • the image processing unit 19a acquires color correction parameters from the correction parameter output device 22.
  • the image processing unit 19a performs color correction by multiplying a specific color signal component in the captured image or the overhead image by the acquired color correction parameter.
  • the image processing unit 19a corrects the nonlinearity of the input signal versus the emission intensity of the display device 11 by, for example, normal gamma correction.
  • the image processing unit 19a corrects the brightness of the captured image or the overhead image by brightness correction.
  • the image processing unit 19a acquires a luminance correction parameter from the correction parameter output device 22.
  • the image processing unit 19a performs luminance correction by multiplying the luminance signal component in the captured image or the overhead image by the acquired luminance correction parameter.
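The color and luminance corrections described above are both simple multiplications of a signal component by a parameter. A minimal sketch, assuming YUV-style (luma, chroma) pixels; the component layout and gain names are illustrative, not specified by the patent.

```python
def correct_pixel(y, u, v, luminance_gain, red_chroma_gain):
    """Multiply the luminance (Y) and the red-difference chroma (V) components."""
    return y * luminance_gain, u, v * red_chroma_gain

def correct_image(pixels, luminance_gain, red_chroma_gain):
    # Apply the same multiplicative correction to every pixel of the image.
    return [correct_pixel(y, u, v, luminance_gain, red_chroma_gain)
            for (y, u, v) in pixels]
```

For example, a gain below 1.0 on the V component weakens the redness of an image, which is how the night-time brake-lamp correction described later could be realized.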
  • the camera control unit 20a controls the operation of each part of the front camera 12. For example, the camera control unit 20a causes the imager 18a to image the peripheral region of the moving body 15 in synchronization with the rear camera 13 and the side cameras 14, periodically generating images at, for example, 30 fps.
  • the camera control unit 20a transmits and receives information via the in-vehicle network 101 or a dedicated line.
  • the image combining device 21 combines images output from the image processing units 19a, 19b, 19c, and 19d of the front camera 12, the rear camera 13, and the side cameras 14L and 14R to generate a combined image.
  • the combined image is, for example, an overhead image of the entire periphery of the moving body 15.
  • in the combined image, the image of the front camera 12 is used for the front area FA and the overlapping areas FLA and FRA of the moving body, the image of the rear camera 13 for the rear area ReA and the overlapping areas ReLA and ReRA, and the images of the side cameras 14L and 14R for the left and right side areas LA and RA, respectively (see FIG. 3).
  • the image combining device 21 outputs the generated combined image to the display device 11.
  • the correction parameter output device 22 (see FIG. 1) includes a control information acquisition unit 23, a storage unit 24, an output unit 25, and a control unit 26.
  • the control information acquisition unit 23 acquires control information of the moving body 15.
  • the control information includes time information and various information regarding the state of the moving body 15.
  • the information related to the state of the moving body 15 includes, for example, information indicating whether the lighting of the moving body 15 (head lamps, tail lamps, and brake lamps) is turned on or off.
  • the control information acquisition unit 23 can acquire control information by an arbitrary method.
  • for example, the control information acquisition unit 23 may acquire the control information from the moving body 15 via the in-vehicle network 101, or may acquire, by wire or wirelessly, control information output by other components of the moving body 15.
  • the storage unit 24 stores correction parameter groups of a plurality of patterns, each associated with particular control information.
  • the correction parameter group is determined in advance through experiments or simulations.
  • for example, when the state is determined to be "daytime", the correction parameters (color correction parameter and luminance correction parameter) of the cameras 12, 13, 14L, and 14R included in the corresponding correction parameter group are the same.
  • when the time information included in the control information indicates 7:00 to 17:00, it is determined that the state is "daytime".
  • the correction parameter of the front camera 12 is a reference value.
  • the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • the correction parameter of the front camera 12 is a reference value.
  • the color correction parameter is determined to reduce (weaken) the redness of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • when the time information included in the control information indicates 17:00 to 7:00 and the control information includes information indicating lighting of the head lamps (white) and tail lamps (red), it is determined that the state is "night (head/tail lamp ON)".
  • the correction parameter of the front camera 12 is a reference value.
  • the color correction parameter is determined to strongly reduce the redness of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • when the time information included in the control information indicates 17:00 to 7:00 and the control information includes information indicating lighting of the head lamps (white) and brake lamps (red), it is determined that the state is "night (head/brake lamp ON)".
  • at night, when red light from the brake lamps is irradiated only behind the moving body 15, the rear subject is captured as an image that is more reddish than the subjects to the left, right, and front. In general, the brake lamps are brighter than the tail lamps and their red is stronger. Therefore, by using the correction parameters described above, the images captured by the cameras 12, 13, 14L, and 14R are adjusted to colors and luminances that are visually recognized as equal on average.
  • among the correction parameters of the front camera 12, the rear camera 13, and the side cameras 14L and 14R, the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image.
  • when the time information included in the control information indicates 17:00 to 7:00 and the control information includes no information indicating lighting, it is determined that the state is "night (lighting OFF)".
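The state determination described above can be sketched as a small decision function. The hour boundaries follow the text (7:00 to 17:00 is daytime); the function and argument names are hypothetical.

```python
def determine_state(hour, headlamp_on=False, tail_lamp_on=False, brake_lamp_on=False):
    """Derive the state label from time information and lamp-lighting information."""
    if 7 <= hour < 17:
        return "daytime"
    # Between 17:00 and 7:00, distinguish states by which lamps are lit.
    if headlamp_on and brake_lamp_on:
        return "night (head/brake lamp ON)"
    if headlamp_on and tail_lamp_on:
        return "night (head/tail lamp ON)"
    return "night (lighting OFF)"
```

The returned label would then serve as the key into the stored correction parameter groups.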
  • the output unit 25 (see FIG. 1) outputs the correction parameters included in the correction parameter group corresponding to the control information acquired by the control information acquisition unit 23 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
  • the control unit 26 controls the operation of each part of the correction parameter output device 22.
  • the control unit 26 causes the control information acquisition unit 23 to acquire the control information of the moving body 15 simultaneously with the image generation by the imagers 18a, 18b, 18c, and 18d of the cameras 12, 13, 14L, and 14R, and causes the output unit 25 to output the correction parameters periodically.
  • the control unit 26 transmits and receives information via the in-vehicle network 101 or a dedicated line.
  • the rear camera 13 and the side cameras 14 include optical systems 17b, 17c, and 17d, imagers 18b, 18c, and 18d, image processing units 19b, 19c, and 19d, and camera control units 20b, 20c, and 20d, respectively.
  • the functions and configurations of the optical systems 17b, 17c, 17d, imagers 18b, 18c, 18d, image processing units 19b, 19c, 19d, and camera control units 20b, 20c, 20d are the same as those of the front camera 12.
  • the image processing unit 19b of the rear camera 13 converts an image captured by the imager 18b of the rear camera 13 into an overhead view image in the rear area ReA and the overlapping areas ReLA and ReRA.
  • the image processing unit 19c of the left side camera 14L converts the image captured by the imager 18c of the left side camera 14L into an overhead view image in the left side area LA and the overlapping areas FLA and ReLA.
  • the image processing unit 19d of the right side camera 14R converts the image captured by the imager 18d of the right side camera 14R into an overhead view image in the right side area RA and the overlapping areas FRA and ReRA.
  • the display device 11 is, for example, an LCD and can display a real-time moving image.
  • the display device 11 acquires and displays the combined image output from the image combining device 21.
  • the display device 11 may be configured by a touch panel, for example, and may function as an interface that receives user operations.
  • the display device 11 can transmit and receive information via the in-vehicle network 101 or a dedicated line.
  • the camera control units 20a, 20b, 20c, and 20d of the cameras 12, 13, 14L, and 14R control the imagers 18a, 18b, 18c, and 18d to generate images capturing the peripheral area of the moving body 15 (step S100).
  • the control unit 26 of the correction parameter output device 22 controls the control information acquisition unit 23 to acquire the control information at the time the images are captured by the cameras 12, 13, 14L, and 14R (step S101).
  • the control information includes time information and information indicating lighting of the moving body 15 (head lamp, tail lamp, and brake lamp).
  • the control unit 26 reads out the correction parameter group corresponding to the control information of step S101 from the correction parameter groups of the plurality of patterns stored in the storage unit 24 (step S102), and outputs the correction parameters to the cameras 12, 13, 14L, and 14R.
  • the camera control units 20a, 20b, 20c, and 20d of the cameras 12, 13, 14L, and 14R control the image processing units 19a, 19b, 19c, and 19d to convert the images generated in step S100 into overhead images (step S103).
  • each of the camera control units 20a, 20b, 20c, and 20d controls the image processing units 19a, 19b, 19c, and 19d to correct the overhead images of step S103 based on the correction parameters of step S102 (step S104), and outputs the corrected overhead images to the image combining device 21.
  • the image combining device 21 generates a combined image of the plurality of corrected overhead images in step S104 (step S105) and outputs the combined image to the display device 11.
  • the display device 11 displays the combined image of step S105 (step S106).
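The flow of steps S100 to S106 can be sketched as follows, with each stage reduced to a stub so that only the data flow is visible. Image contents, parameter values, and function names are placeholders, not from the patent.

```python
def capture_images():                      # S100: imagers generate images
    return {"front": [100], "rear": [80], "left": [90], "right": [90]}

def acquire_control_info():                # S101: control information at capture time
    return {"hour": 12, "headlamp_on": False}

def select_parameters(info):               # S102: read the matching parameter group
    gain = 1.0 if 7 <= info["hour"] < 17 else 1.3
    return {name: gain for name in ("front", "rear", "left", "right")}

def to_overhead(image):                    # S103: convert to overhead view (stub)
    return image

def correct(image, gain):                  # S104: apply the correction parameter
    return [px * gain for px in image]

def combine(images):                       # S105: combine into one overhead image
    return [px for name in ("front", "rear", "left", "right") for px in images[name]]

images = capture_images()
info = acquire_control_info()
params = select_parameters(info)
combined = combine({n: correct(to_overhead(im), params[n]) for n, im in images.items()})
# S106: `combined` would be shown on the display device
```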
  • as described above, according to the first embodiment, correction parameters for correcting the plurality of images are output according to the control information of the moving body 15. For this reason, as described below, correction suitable for various states of the moving body 15 can be performed, and the visibility of the combined image can be improved.
  • for example, when the headlamps are turned on at night, white light from the headlamps is irradiated in front of the moving body 15, so the subject image in front of the moving body 15 becomes whiter than the subject images to the left, right, and rear. Therefore, as shown in FIG. 6A, the left and right regions of the image FIm by the front camera 12 and the images LIm and RIm by the side cameras 14 differ in average color and luminance.
  • likewise, when the tail lamps or the brake lamps are lit at night, their red light is irradiated behind the moving body 15, so the subject image behind the moving body 15 becomes more reddish than the subject images to the left, right, and front. Therefore, the entire image ReIm by the rear camera 13 and the images LIm and RIm by the side cameras 14 differ in average color and luminance.
  • the correction parameter output device of the first embodiment reads out and outputs a correction parameter group corresponding to control information of the moving body 15 from a plurality of correction parameter groups stored in advance in the storage unit 24.
  • therefore, the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R can correct the colors and luminances of the captured images so that they are visually recognized as equal on average for various states of the moving body 15 (see FIG. 6B).
  • furthermore, different correction parameters are output depending on whether or not the control information includes information indicating lighting of the moving body 15. For this reason, the differences in color and luminance between the captured images caused by the lighting of the moving body 15 can be reduced.
  • in the second embodiment, the image processing units 19a, 19b, 19c, and 19d perform image processing such as image conversion, color correction, and luminance correction on the images generated by the imagers 18a, 18b, 18c, and 18d, as in the first embodiment.
  • furthermore, the image processing units 19a, 19b, 19c, and 19d according to the present embodiment acquire adjustment parameters, described later, from the correction parameter output device 220.
  • the image processing units 19a, 19b, 19c, and 19d perform image adjustment by multiplying the color signal component of the image by the acquired color adjustment parameter.
  • the correction parameter output device 220 includes a control information acquisition unit 23, a storage unit 24, an output unit 25, a control unit 26, an image acquisition unit 270, and an adjustment parameter calculation unit 280.
  • the configurations and functions of the control information acquisition unit 23, the storage unit 24, and the control unit 26 are the same as those in the first embodiment.
  • as in the first embodiment, the output unit 25 outputs the correction parameters included in the correction parameter group corresponding to the control information acquired by the control information acquisition unit 23 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R. Further, the output unit 25 in the present embodiment outputs the adjustment parameters calculated by the adjustment parameter calculation unit 280 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
  • the image acquisition unit 270 acquires images from the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
  • the adjustment parameter calculation unit 280 calculates an adjustment parameter for adjusting a plurality of images acquired by the image acquisition unit 270. For example, the adjustment parameter calculation unit 280 calculates the adjustment parameter by a two-stage process as described below.
  • the adjustment parameter calculation unit 280 determines one of the plurality of images acquired by the image acquisition unit 270 as a reference image.
  • in the following description, it is assumed that the image by the front camera 12 is set as the reference image.
  • next, for the reference image and the adjacent images (the images by the side cameras 14L and 14R), each of which shares an overlapping region with the reference image, the adjustment parameter calculation unit 280 calculates the average value of the color signal components of each image in each overlapping region (FLA, FRA).
  • when the difference between the color signal component average values of the reference image and an adjacent image is greater than or equal to a predetermined threshold, that is, when the average color difference between the reference image and the adjacent image is large, the adjustment parameter calculation unit 280 calculates a color adjustment parameter to be multiplied by the color signal component of the adjacent image so as to reduce the color difference. For example, the adjustment parameter calculation unit 280 calculates the color adjustment parameter of the adjacent image so that the difference between the average values becomes less than the predetermined threshold, or so that the average value of the adjacent image matches the average value of the reference image.
  • the adjustment parameter calculation unit 280 outputs the calculated color adjustment parameters to the side cameras 14L and 14R via the output unit 25.
  • the adjustment parameter calculation unit 280 performs a second stage process described below.
  • in the second stage, for the images by the side cameras 14L and 14R and a third image (the image by the rear camera 13) that shares overlapping regions with those images and differs from the reference image, the adjustment parameter calculation unit 280 calculates the average values of the color signal components of each image in the overlapping regions (ReLA, ReRA).
  • the adjustment parameter calculation unit 280 calculates the average value A of the color signal component average value of the image by the left side camera 14L and the color signal component average value of the image by the right side camera 14R. Further, the adjustment parameter calculation unit 280 calculates an average value B of the color signal component average value of the image by the rear camera 13 in the overlap region ReLA and the color signal component average value of the image by the rear camera 13 in the overlap region ReRA. .
  • the adjustment parameter calculation unit 280 then calculates the color adjustment parameter of the image by the rear camera 13 so that the difference between the average values becomes less than the predetermined threshold, or so that the average value B matches the average value A.
  • the adjustment parameter calculation unit 280 outputs the calculated color adjustment parameter to the rear camera 13 via the output unit 25.
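The two-stage adjustment above can be sketched with scalar "color averages" standing in for the per-channel statistics of each overlap region. This is a minimal sketch of the matching idea only; the sample values and the pooling of the two side averages into A and B follow the text, but everything else is illustrative.

```python
def mean(values):
    return sum(values) / len(values)

def color_adjust_param(reference_overlap, other_overlap, threshold=0.0):
    """Gain that makes the other image's overlap average match the reference's."""
    ref_avg, other_avg = mean(reference_overlap), mean(other_overlap)
    if abs(ref_avg - other_avg) <= threshold:
        return 1.0  # averages already close enough; no adjustment needed
    return ref_avg / other_avg

# Stage 1: adjust a side image to the front (reference) image using the
# color averages of the shared overlap region FLA.
front_fla, left_fla = [120, 120], [100, 100]
left_gain = color_adjust_param(front_fla, left_fla)

# Stage 2: adjust the rear image so that its overlap average B matches the
# average A of the (already adjusted) side images, pooling both sides.
a = mean([mean([120, 120]), mean([120, 120])])   # side-image averages after stage 1
b = mean([mean([100, 100]), mean([140, 140])])   # rear-image averages in ReLA/ReRA
rear_gain = a / b
```

Multiplying the adjacent image's color component by the computed gain brings its overlap-region average to the reference side, which is exactly the reduction of color difference the text describes.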
  • the adjustment parameter calculation unit 280 determines which of the plurality of images acquired by the image acquisition unit 270 to use as the reference image based on the control information acquired by the control information acquisition unit 23.
  • the control information includes information indicating the traveling direction of the moving body 15.
  • for example, the adjustment parameter calculation unit 280 determines the traveling direction of the moving body 15 based on the control information, and determines as the reference image the image of the front camera 12 when moving forward, the image of the rear camera 13 when moving backward, and the image of the side camera 14 (14L or 14R) when turning left or right.
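The reference-image selection above reduces to a mapping from traveling direction to camera. A minimal sketch; the direction names and camera keys are hypothetical.

```python
def select_reference_camera(control_info):
    """Pick the reference camera from the traveling direction in the control info."""
    direction = control_info.get("direction", "forward")
    if direction == "forward":
        return "front"       # moving forward: front camera image is the reference
    if direction == "backward":
        return "rear"        # moving backward: rear camera image is the reference
    if direction == "left":
        return "left_side"   # turning left: left side camera image
    return "right_side"      # turning right: right side camera image
```

Choosing the camera facing the direction of travel as the reference keeps the most safety-relevant part of the combined image unadjusted, which appears to be the rationale behind this selection.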
  • from step S200 to step S203, processing similar to that from step S100 to step S103 in the first embodiment is performed.
  • each of the camera control units 20a, 20b, 20c, and 20d controls the image processing unit 19 so as to correct the overhead image of step S203 based on the correction parameters of step S202 (step S204).
  • the overhead image is output to the correction parameter output device 220.
  • the correction parameter output device 220 determines one of the corrected overhead images in step S204 as a reference image (step S205).
  • as the reference image, the image obtained by the front camera 12 is used here.
  • the correction parameter output device 220 calculates color adjustment parameters for adjusting the images by the side cameras 14L and 14R, based on the color signal components of the reference image and of the images by the side cameras 14L and 14R among the plurality of overhead images corrected in step S204 (step S206), and outputs them to the side cameras 14L and 14R.
  • the side cameras 14L and 14R adjust the images corrected in step S204 based on the color adjustment parameters of step S206 (step S207), and output the adjusted images to the correction parameter output device 220.
  • the correction parameter output device 220 calculates a color adjustment parameter for adjusting the image by the rear camera 13, based on the color signal components of the images by the side cameras 14L and 14R adjusted in step S207 and of the image by the rear camera 13 corrected in step S204 (step S208), and outputs it to the rear camera 13.
  • the rear camera 13 adjusts the image corrected in step S204 based on the color adjustment parameter of step S208 (step S209).
  • each of the cameras 12, 13, 14L, and 14R outputs its corrected or adjusted overhead image to the image combining device 21 (step S210). Specifically, the image by the front camera 12 corrected in step S204, the images by the side cameras 14L and 14R adjusted in step S207, and the image by the rear camera 13 adjusted in step S209 are output.
  • the image combining device 21 generates a combined image based on the overhead image in step S210 (step S211) and outputs the combined image to the display device 11.
  • the display device 11 displays the combined image in step S211 (step S212).
  • as in the camera system of the first embodiment, after the images by the cameras 12, 13, 14L, and 14R are corrected, they are adjusted with reference to the color signal components of the reference image, so the visibility of the combined image can be improved.
  • moreover, since the colors of the other images are matched to the color of the image that includes the peripheral area of the moving body 15 to which the driver pays attention, the visibility of the combined image can be further improved.
  • the imaging device 10 includes the front camera 12, the rear camera 13, and the side camera 14, but may include more cameras.
  • a configuration further including a distant camera that can capture the entire periphery of the moving body 15 may be used.
  • each component of the camera system of the above-described embodiment can be divided and rearranged.
  • for example, the image combining device 21 and the correction parameter output device 22 may be separated from the front camera 12 and configured as a single independent device.
  • a navigation system may be further provided, and the image combining device 21 and the correction parameter output device 22 may be provided in the navigation system.
  • the correction parameter may be any other parameter related to image correction.
  • the configuration in which the adjustment parameter calculation unit 280 calculates the color adjustment parameter has been described; however, a luminance adjustment parameter may be calculated by the same process.
  • the image processing units 19a, 19b, 19c, and 19d perform image adjustment by multiplying the luminance signal component of the image by the luminance adjustment parameter.
  • the control information has been described as including time information, information indicating lighting of the moving body 15, information indicating the traveling direction of the moving body 15, and the like; however, other information may also be included.
  • the configuration has been described in which the image processing units 19a, 19b, 19c, and 19d correct the captured images (or overhead images) generated by the imagers 18a, 18b, 18c, and 18d.
  • alternatively, a configuration may be adopted in which the correction parameters are input to an AFE (Analog Front End) or a white balance control unit, and the color and luminance are corrected when the images are generated by the imagers 18a, 18b, 18c, and 18d.
  • the luminance correction parameter has been described as a parameter to be multiplied with the luminance signal component of the image, but may be a parameter indicating an aperture value and a shutter speed, for example.
  • the luminance is adjusted by inputting the luminance correction parameter to the aperture and exposure time control unit.
  • a parameter to be multiplied with the luminance signal component of the image and a parameter indicating the aperture value and the shutter speed may also be used in combination.
  • the configuration in which the adjustment parameter calculation unit 280 calculates the correction parameter of the image by the rear camera 13 using the color signal components of the images by the left and right side cameras 14 has been described; however, the correction parameter may be calculated using the color signal component of the image by only one of the side cameras 14.
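The color matching of this embodiment — computing the average value A over the side-camera overlap regions, the average value B over the rear-camera overlap regions, and a parameter that makes B coincide with A — can be sketched numerically as follows. This is a minimal illustration, not the patent's implementation; the function names, sample values, and threshold are assumptions.

```python
# Illustrative sketch of the overlap-region averaging and matching
# described above; names and the threshold value are assumptions.

def average(values):
    """Average of one color signal component over an overlap region."""
    return sum(values) / len(values)

def adjustment_gain(reference_avg, target_avg, threshold=1.0):
    """Gain that makes the target average coincide with the reference
    average; 1.0 if the difference is already below the threshold."""
    if abs(reference_avg - target_avg) < threshold:
        return 1.0
    return reference_avg / target_avg

# Average value A: mean of the side-camera averages in ReLA and ReRA.
a = (average([100, 110]) + average([90, 100])) / 2   # -> 100.0
# Average value B: mean of the rear-camera averages in ReLA and ReRA.
b = (average([80, 90]) + average([70, 80])) / 2      # -> 80.0
gain = adjustment_gain(a, b)                          # -> 1.25
```

Multiplying the rear camera's color signal component by this gain brings its overlap-region average onto the side cameras' average, which is the coincidence condition stated above.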
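The reference-image selection from the traveling direction carried in the control information can be sketched as below; the direction strings and camera labels are assumptions made for this example, not identifiers from the patent.

```python
# Illustrative sketch of choosing the reference image from the
# traveling direction in the control information.

def select_reference_camera(control_info):
    """Return the camera whose image serves as the reference image."""
    direction = control_info.get("direction")
    if direction == "forward":
        return "front"                   # front camera 12
    if direction == "backward":
        return "rear"                    # rear camera 13
    if direction in ("left", "right"):
        return "side_" + direction       # side camera 14L or 14R
    return "front"                       # default to the front camera

camera = select_reference_camera({"direction": "backward"})   # -> "rear"
```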


  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

 The present invention outputs image correction parameters capable of improving the visibility of a composite image of the surrounding area of a moving body. An image correction parameter output device (22) is provided with: a storage unit (24) that stores, in association with control information for a moving body (15), correction parameter groups comprising a plurality of patterns for correcting a plurality of images that capture partially overlapping regions of the surrounding area of the moving body (15); a control information acquisition unit (23) that acquires the control information for the moving body (15); and an output unit (25) that outputs the correction parameter group associated with the acquired control information.

Description

Image correction parameter output device, camera system, and correction parameter output method

Cross-reference to related applications
 This application claims the priority of Japanese Patent Application No. 2013-224508 (filed on October 29, 2013), the entire disclosure of which is incorporated herein by reference.
 The present invention relates to an image correction parameter output device, a camera system, and a correction parameter output method for outputting parameters used for correcting a captured image.
 Conventionally, an around view monitor system is known in which a plurality of in-vehicle cameras are installed on a moving body such as an automobile, the periphery of the moving body is imaged, and a combined image overlooking the periphery of the moving body is displayed using the plurality of generated captured images. In such a system, a technique is known in which visual continuity is provided at the joints between captured images (for example, Patent Document 1).
JP 2010-116196 A
 In order to ensure the visibility of an image, it is common to correct the color and brightness of the image using predetermined correction parameters. However, the visibility of the combined image may be lowered depending on the environment around the moving body and the state of the moving body. For example, the value of an appropriate correction parameter that can ensure visibility under the various light sources around the moving body is not necessarily constant. Thus, the combined image may fail to ensure sufficient visibility for the various states of the moving body.
 An object of the present invention, made in view of such circumstances, is to provide an image correction parameter output device, a camera system, and a correction parameter output method that output image correction parameters capable of improving the visibility of a combined image around a moving body.
 In order to solve the above-described problems, an image correction parameter output device according to the present invention includes:
 a storage unit that stores correction parameter groups of a plurality of patterns for correcting a plurality of images that capture partially overlapping regions of the peripheral area of a moving body, in association with control information of the moving body;
 a control information acquisition unit that acquires the control information of the moving body; and
 an output unit that outputs the correction parameter group corresponding to the acquired control information.
 In the correction parameter output device according to the present invention, the output unit preferably outputs different correction parameter groups depending on whether or not the control information includes information indicating lighting of the moving body.
 In the correction parameter output device according to the present invention, the correction parameter group preferably includes color correction parameters for correcting the plurality of images so as to reduce color differences between them.
 The correction parameter output device according to the present invention preferably further includes:
 an image acquisition unit that acquires a plurality of images corrected based on the correction parameter group output by the output unit; and
 an adjustment parameter calculation unit that defines one of the plurality of corrected images as a reference image and calculates a color adjustment parameter for adjusting another corrected image, based on color signal components in an overlapping region between the reference image and the other corrected image,
 wherein the output unit outputs the calculated color adjustment parameter.
 In the correction parameter output device according to the present invention, preferably:
 the image acquisition unit acquires an image adjusted based on the color adjustment parameter output by the output unit;
 the adjustment parameter calculation unit calculates a color adjustment parameter for adjusting a corrected image different from the reference image, based on color signal components in an overlapping region between the adjusted image and that corrected image; and
 the output unit outputs the calculated color adjustment parameter.
 In the correction parameter output device according to the present invention, the reference image is preferably determined from among the plurality of corrected images based on the control information.
 In the correction parameter output device according to the present invention, the reference image is preferably determined from among the plurality of corrected images based on information, included in the control information, indicating the traveling direction of the moving body.
 A camera system according to the present invention includes:
 a plurality of imaging units that generate a plurality of images capturing partially overlapping regions of the peripheral area of a moving body;
 a storage unit that stores correction parameter groups of a plurality of patterns for correcting the plurality of images, in association with control information of the moving body;
 a control information acquisition unit that acquires the control information of the moving body;
 an output unit that outputs the correction parameter group corresponding to the acquired control information;
 an image processing unit that corrects the plurality of images based on the correction parameters output by the output unit; and
 an image combining unit that combines the plurality of corrected images to generate a combined image.
 An image correction parameter output method according to the present invention includes the steps of:
 storing correction parameter groups of a plurality of patterns for correcting a plurality of images that capture partially overlapping regions of the peripheral area of a moving body, in association with control information of the moving body;
 acquiring the control information of the moving body; and
 outputting the correction parameter group corresponding to the acquired control information.
 According to the image correction parameter output device, the camera system, and the correction parameter output method of the present invention, it is possible to output image correction parameters that improve the visibility of the combined image around a moving body.
FIG. 1 is a functional block diagram showing a schematic configuration of a camera system according to the first embodiment of the present invention. FIG. 2 is a schematic diagram showing the arrangement of the components of the camera system of FIG. 1. FIG. 3 is a schematic diagram showing the imaging ranges of the imaging device of FIG. 1. FIG. 4 is a diagram showing an example of the correction parameter groups stored in the storage unit of FIG. 1. FIG. 5 is a flowchart explaining the operation of the camera system of FIG. 1. FIG. 6 is a diagram showing an example of a combined image around a moving body. FIG. 7 is a functional block diagram showing a schematic configuration of a camera system according to the second embodiment of the present invention. FIG. 8 is a flowchart explaining the operation of the camera system of FIG. 7.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
(First embodiment)
 First, a correction parameter output device and a camera system according to the first embodiment of the present invention will be described. FIG. 1 is a functional block diagram showing a schematic configuration of the camera system according to the first embodiment of the present invention.
 As shown in FIG. 1, the camera system 100 includes an imaging device 10 and a display device 11. The imaging device 10 includes a plurality of imaging units; in the present embodiment, for example, a front camera 12, a rear camera 13, and side cameras 14 (a left side camera 14L and a right side camera 14R).
 As shown in FIG. 2, the display device 11 is disposed at a position visible from the driver's seat. The front camera 12 is arranged so as to be able to image the peripheral region in front of the moving body 15. The rear camera 13 is arranged so as to be able to image the peripheral region behind the moving body 15. The side cameras 14 are arranged, for example, on the left and right door mirrors 16 facing vertically downward, so that each can image the peripheral region on its side of the moving body 15. The side cameras 14 are arranged symmetrically on the left and right sides of the moving body 15.
 The front camera 12, the rear camera 13, and the side cameras 14 each include a lens with a wide angle of view, such as a fisheye lens, and can capture the peripheral region of the moving body 15 at a wide angle. As shown in FIG. 3, the imaging range of the front camera 12 includes a front area FA of the moving body 15. The imaging range of the rear camera 13 includes a rear area ReA of the moving body 15. The imaging ranges of the left side camera 14L and the right side camera 14R include a left side area LA and a right side area RA of the moving body 15, respectively.
 The imaging ranges of the cameras 12, 13, 14L, and 14R include the regions around the four corners of the moving body 15 overlapping each other. Specifically, the imaging range of the front camera 12 and the imaging ranges of the side cameras 14 (14L, 14R) include, in overlap, a left front area FLA and a right front area FRA of the moving body 15. Likewise, the imaging range of the rear camera 13 and the imaging ranges of the side cameras 14 (14L, 14R) include, in overlap, a left rear area ReLA and a right rear area ReRA of the moving body 15. Hereinafter, the areas around the moving body 15 where the imaging ranges of the cameras 12, 13, 14L, and 14R overlap each other are referred to as the overlapping areas (FLA, FRA, ReLA, ReRA).
 Next, the configuration of the imaging units (the front camera 12, the rear camera 13, and the side cameras 14) will be described. The front camera 12 includes an optical system 17a, an imager 18a, an image processing unit 19a, a camera control unit 20a, an image combining device (image combining unit) 21, and a correction parameter output device 22 (see FIG. 1).
 The optical system 17a includes a plurality of lenses and forms a subject image. In the present embodiment, the optical system 17a has a wide angle of view and, as described above, can form the subject image included in the peripheral region of the moving body 15.
 The imager 18a is, for example, a CMOS image sensor, and generates an image capturing the subject image formed by the optical system 17a.
 The image processing unit 19a performs image processing such as image conversion, color correction, gamma correction, and luminance correction on the image generated by the imager 18a, and outputs the processed image.
 Through image conversion, the image processing unit 19a converts the wide-angle captured image generated by the imager 18a into an overhead image. That is, the captured image, generated by wide-angle shooting and generally distorted at its periphery, is converted into an overhead image viewing the peripheral region of the moving body 15 vertically downward from above. Specifically, the image processing unit 19a converts the image captured by the imager 18a into an overhead image of the front area FA and the overlapping areas FLA and FRA (see FIG. 3) of the moving body 15.
 Through color correction, the image processing unit 19a corrects the color of the captured image or the overhead image. For the color correction, the image processing unit 19a acquires color correction parameters from the correction parameter output device 22. For example, the image processing unit 19a performs color correction by multiplying a specific color signal component of the captured image or the overhead image by the acquired color correction parameter.
 The image processing unit 19a corrects the nonlinearity between the input signal and the emission intensity of the display device 11 by, for example, ordinary gamma correction.
 Through luminance correction, the image processing unit 19a corrects the luminance of the captured image or the overhead image. For the luminance correction, the image processing unit 19a acquires a luminance correction parameter from the correction parameter output device 22. For example, the image processing unit 19a performs luminance correction by multiplying the luminance signal component of the captured image or the overhead image by the acquired luminance correction parameter.
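The multiplicative correction described above, applied to a color signal component and, analogously, to the luminance signal component, can be sketched minimally as follows. The function name and the 8-bit clipping are assumptions for illustration, not details from the patent.

```python
# Minimal sketch of multiplying one signal component (color or
# luminance) by a correction parameter, with assumed 8-bit clipping.

def correct_component(component, parameter, max_value=255):
    """Multiply one signal component by its correction parameter,
    clipping the result to the representable range."""
    return [min(max_value, round(v * parameter)) for v in component]

# e.g. boosting a red channel with a color correction parameter of 1.2
red = correct_component([100, 200, 250], 1.2)   # -> [120, 240, 255]
```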
 The camera control unit 20a (see FIG. 1) controls the operation of each part of the front camera 12. For example, the camera control unit 20a causes the imager 18a to image the peripheral region of the moving body 15 in synchronization with the rear camera 13 and the side cameras 14, generating images periodically, for example at 30 fps. The camera control unit 20a also transmits and receives information via the in-vehicle network 101 or a dedicated line.
 The image combining device 21 combines the images output by the image processing units 19a, 19b, 19c, and 19d of the front camera 12, the rear camera 13, and the side cameras 14L and 14R, and generates a combined image. The combined image is, for example, an overhead image of the entire periphery of the moving body 15. In the all-around overhead image of this embodiment, the image of the front camera 12 is used for the front area FA and the overlapping areas FLA and FRA, the image of the rear camera 13 for the rear area ReA and the overlapping areas ReLA and ReRA, and the images of the side cameras 14L and 14R for the left and right side areas LA and RA, respectively (see FIG. 3). The image combining device 21 outputs the generated combined image to the display device 11.
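The region assignment just described, where each area of the all-around overhead image takes its pixels from exactly one camera, can be sketched as a simple mapping. The region names follow FIG. 3; the data structures themselves are illustrative assumptions.

```python
# Toy sketch of the per-region camera assignment for the all-around
# overhead image; region names follow FIG. 3 of the patent.

REGION_SOURCE = {
    "FA": "front", "FLA": "front", "FRA": "front",
    "ReA": "rear", "ReLA": "rear", "ReRA": "rear",
    "LA": "left_side", "RA": "right_side",
}

def combine(overhead_images):
    """Build the combined image as {region: pixels} from per-camera
    overhead images given as {camera: {region: pixels}}."""
    return {region: overhead_images[camera][region]
            for region, camera in REGION_SOURCE.items()}
```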
 The correction parameter output device 22 (see FIG. 1) includes a control information acquisition unit 23, a storage unit 24, an output unit 25, and a control unit 26.
 The control information acquisition unit 23 acquires the control information of the moving body 15. In the present embodiment, the control information includes time information and various information on the state of the moving body 15, for example, information indicating whether the lighting of the moving body 15 (headlamps, tail lamps, and brake lamps) is turned on or off. The control information acquisition unit 23 can acquire the control information by an arbitrary method; for example, it may acquire it from the moving body 15 via the in-vehicle network 101, or it may acquire, by wire or wirelessly, control information output by other components of the moving body 15.
 The storage unit 24 stores correction parameter groups of a plurality of patterns, each associated with particular control information. The correction parameter groups are determined in advance by experiments, simulations, or the like.
 For example, as shown in pattern 1 of FIG. 4, when the state is "daytime", the correction parameters (color correction parameters and luminance correction parameters) of the cameras 12, 13, 14L, and 14R included in the corresponding correction parameter group are all the same. Regarding state determination, when the time information included in the control information indicates "7:00 to 17:00", the state is determined to be "daytime".
 In the daytime, the various subjects within the shooting ranges of the cameras 12, 13, 14L, and 14R are often illuminated by the same light source, such as sunlight. Therefore, by using a common value (reference value) that executes ordinary white balance as each correction parameter, the images captured by the cameras 12, 13, 14L, and 14R are adjusted so that their colors and luminances can be perceived as equal on average.
 On the other hand, as shown in pattern 2 of FIG. 4, when the state is "night (headlamps ON)", the correction parameters of the front camera 12 are the reference values. In the correction parameters of the rear camera 13 and the side cameras 14L and 14R, the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image. Regarding state determination, when the time information included in the control information indicates "17:00 to 7:00" and the control information includes information indicating that the headlamps (white) are on, the state is determined to be "night (headlamps ON)".
 At night, when only the area in front of the moving body 15 is illuminated by the headlamps, subjects ahead appear brighter and more saturated in the image than subjects to the sides and rear. Therefore, by using the correction parameters described above, the images captured by the cameras 12, 13, 14L, and 14R are adjusted so that their colors and luminances can be perceived as equal on average.
 Also, as shown in pattern 3 of FIG. 4, when the state is "night (head and tail lamps ON)", the correction parameters of the front camera 12 are the reference values. In the correction parameters of the rear camera 13, the color correction parameter is determined to reduce the redness of the image (weakly), and the luminance correction parameter is determined to increase the luminance of the image. In the correction parameters of the side cameras 14L and 14R, the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image. Regarding state determination, when the time information included in the control information indicates "17:00 to 7:00" and the control information includes information indicating that the headlamps (white) and tail lamps (red) are on, the state is determined to be "night (head and tail lamps ON)".
 At night, when only the area behind the moving body 15 is illuminated by the red tail lamps, subjects behind are captured as images with stronger redness than subjects to the sides and front. Therefore, by using the correction parameters described above, the images captured by the cameras 12, 13, 14L, and 14R are adjusted so that their colors and luminances can be perceived as equal on average.
 Also, as shown in pattern 4 of FIG. 4, when the state is "night (head and brake lamps ON)", the correction parameters of the front camera 12 are the reference values. In the correction parameters of the rear camera 13, the color correction parameter is determined to reduce the redness of the image (strongly), and the luminance correction parameter is determined to increase the luminance of the image. In the correction parameters of the side cameras 14L and 14R, the color correction parameter is determined to increase the saturation of the image, and the luminance correction parameter is determined to increase the luminance of the image. Regarding state determination, when the time information included in the control information indicates "17:00 to 7:00" and the control information includes information indicating that the headlamps (white) and brake lamps (red) are on, the state is determined to be "night (head and brake lamps ON)".
 At night, when only the area behind the moving body 15 is illuminated by the red brake lamps, subjects behind the vehicle are captured with a stronger red cast than subjects to the left, right, and front. In general, brake lamps are brighter and redder than tail lamps. By using the correction parameters described above, the images captured by the cameras 12, 13, 14L, and 14R are therefore adjusted to color and luminance that can be perceived as equal on average.
 Further, as shown in pattern 5 of FIG. 4, when the state is "night (lights OFF)", in the correction parameters of the front camera 12, the rear camera 13, and the side cameras 14L and 14R, the color correction parameter is set to increase the saturation of the image, and the luminance correction parameter is set to increase the luminance of the image. As for determining this state, when the time information included in the control information indicates "17:00 to 7:00", the state is determined to be "night (lights OFF)".
 At night, when all the lights of the moving body 15 are off, subjects around the moving body 15 appear with lower brightness and saturation in the image than in the daytime. By using the correction parameters described above, the images captured by the cameras 12, 13, 14L, and 14R are therefore adjusted so that, on average, their color and luminance can be perceived as equal.
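The state determination and the per-camera parameter patterns described above can be sketched as a simple lookup. The state names, parameter encodings, and function names below are illustrative assumptions; the patent does not specify a data format.

```python
# Hypothetical sketch of patterns 3-5 of FIG. 4. The night window (17:00-7:00)
# and the lamp conditions follow the text; all identifiers are assumptions.

def classify_state(hour, headlamp_on, taillamp_on, brakelamp_on):
    """Derive the vehicle state from control information."""
    night = hour >= 17 or hour < 7
    if not night:
        return "day"
    if headlamp_on and brakelamp_on:
        return "night_head_brake_on"   # pattern 4
    if headlamp_on and taillamp_on:
        return "night_head_tail_on"    # pattern 3
    return "night_lights_off"          # pattern 5

# Per-camera (color, luminance) correction settings for each pattern.
CORRECTION_PATTERNS = {
    "night_head_tail_on": {
        "front": ("reference", "reference"),
        "rear":  ("reduce_red_weak", "raise_luminance"),
        "left":  ("raise_saturation", "raise_luminance"),
        "right": ("raise_saturation", "raise_luminance"),
    },
    "night_head_brake_on": {
        "front": ("reference", "reference"),
        "rear":  ("reduce_red_strong", "raise_luminance"),
        "left":  ("raise_saturation", "raise_luminance"),
        "right": ("raise_saturation", "raise_luminance"),
    },
    "night_lights_off": {
        cam: ("raise_saturation", "raise_luminance")
        for cam in ("front", "rear", "left", "right")
    },
}
```

The brake-lamp condition is checked first so that pattern 4 takes precedence when both tail and brake lamps are lit, matching the brighter, redder illumination it describes.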
 The output unit 25 (see FIG. 1) outputs the correction parameters included in the correction parameter group corresponding to the control information acquired by the control information acquisition unit 23 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
 The control unit 26 controls the operation of each part of the correction parameter output device 22. For example, the control unit 26 causes the control information acquisition unit 23 to acquire the control information of the moving body 15 at the same time as the imagers 18a, 18b, 18c, and 18d of the cameras 12, 13, 14L, and 14R generate images, and causes the output unit 25 to output correction parameters periodically. The control unit 26 also transmits and receives information via the in-vehicle network 101 or dedicated lines.
 Like the front camera 12, the rear camera 13 and the side cameras 14 (14L, 14R) (see FIG. 1) each include an optical system 17b, 17c, 17d, an imager 18b, 18c, 18d, an image processing unit 19b, 19c, 19d, and a camera control unit 20b, 20c, 20d. The functions and configurations of the optical systems 17b, 17c, 17d, the imagers 18b, 18c, 18d, the image processing units 19b, 19c, 19d, and the camera control units 20b, 20c, 20d are the same as those of the front camera 12.
 For example, the image processing unit 19b of the rear camera 13 converts the image captured by the imager 18b of the rear camera 13 into an overhead image of the rear area ReA and the overlapping areas ReLA and ReRA. The image processing unit 19c of the left side camera 14L converts the image captured by the imager 18c of the left side camera 14L into an overhead image of the left side area LA and the overlapping areas FLA and ReLA. The image processing unit 19d of the right side camera 14R converts the image captured by the imager 18d of the right side camera 14R into an overhead image of the right side area RA and the overlapping areas FRA and ReRA.
 The display device 11 is, for example, an LCD capable of displaying real-time moving images. The display device 11 acquires and displays the combined image output by the image combining device 21. The display device 11 may be configured as a touch panel, for example, and may function as an interface that accepts user operations. The display device 11 can also transmit and receive information via the in-vehicle network 101 or a dedicated line.
 Next, the processing executed by the camera system 100 according to the first embodiment will be described with reference to the flowchart of FIG. 5. This processing starts, for example, when the imaging device 10 is activated, and is executed repeatedly until the user issues a termination instruction.
 First, the camera control units 20a, 20b, 20c, and 20d of the cameras 12, 13, 14L, and 14R control the imagers 18a, 18b, 18c, and 18d to generate images capturing the peripheral area of the moving body 15 (step S100).
 Next, the control unit 26 of the correction parameter output device 22 controls the control information acquisition unit 23 to acquire the control information at the time the cameras 12, 13, 14L, and 14R captured their images (step S101). The control information includes time information and information indicating the lighting state of the lamps of the moving body 15 (headlamps, tail lamps, and brake lamps).
 Subsequently, the control unit 26 reads the correction parameter group corresponding to the control information of step S101 from the plurality of patterns of correction parameter groups stored in the storage unit 24 (step S102), and outputs the correction parameters to the cameras 12, 13, 14L, and 14R.
 Next, the camera control units 20a, 20b, 20c, and 20d of the cameras 12, 13, 14L, and 14R control the image processing units 19a, 19b, 19c, and 19d to convert the images generated in step S100 into overhead images (step S103).
 Subsequently, the camera control units 20a, 20b, 20c, and 20d control the image processing units 19a, 19b, 19c, and 19d to correct the overhead images of step S103 based on the correction parameters of step S102 (step S104), and output the corrected overhead images to the image combining device 21.
 Next, the image combining device 21 generates a combined image from the plurality of corrected overhead images of step S104 (step S105) and outputs it to the display device 11.
 The display device 11 then displays the combined image of step S105 (step S106).
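The cycle of steps S100 to S106 can be sketched as one loop iteration. Every object below (cameras, parameter store, combiner, display) is a hypothetical stand-in for the units described in the text, not an implementation of them.

```python
# Hypothetical sketch of one cycle of the flowchart in FIG. 5 (steps S100-S106).

def run_first_embodiment_cycle(cameras, control_source, param_store, combiner, display):
    # S100: each camera generates an image of the peripheral area.
    frames = {name: cam.capture() for name, cam in cameras.items()}
    # S101: acquire the control information at capture time.
    control_info = control_source.acquire()
    # S102: look up the correction parameter group for that control information.
    params = param_store.lookup(control_info)
    corrected = {}
    for name, cam in cameras.items():
        overhead = cam.to_overhead(frames[name])                     # S103
        corrected[name] = cam.apply_correction(overhead, params[name])  # S104
    combined = combiner.combine(corrected)                           # S105
    display.show(combined)                                           # S106
    return combined
```

The cycle would repeat until a termination instruction, as the text describes.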
 As described above, the image correction parameter output device of the first embodiment outputs correction parameters for correcting a plurality of images in accordance with the control information of the moving body 15. As explained below, this enables correction adapted to the various states of the moving body 15 and can improve the visibility of the combined image.
 For example, when the headlamps are lit at night, white headlamp light illuminates the area in front of the moving body 15, so subject images in front of the moving body 15 appear whiter than subject images to the left, right, and rear. As shown in FIG. 6(a), the left and right regions of the image FIm from the front camera 12 therefore differ in average color and luminance from the images LIm and RIm from the side cameras 14. Similarly, when the tail lamps or brake lamps are lit at night, red lamp light illuminates the area behind the moving body 15, so subject images behind the moving body 15 appear redder than subject images to the left, right, and front. The entire image ReIm from the rear camera 13 therefore differs in average color and luminance from the images LIm and RIm from the side cameras 14.
 The correction parameter output device of the first embodiment reads and outputs the correction parameter group corresponding to the control information of the moving body 15 from the plurality of patterns of correction parameter groups stored in advance in the storage unit 24. In this way, for the various states of the moving body 15, the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R can correct the captured images to color and luminance that can be perceived as equal on average (see FIG. 6(b)).
 Furthermore, in the first embodiment, different correction parameters are output depending on whether the control information includes information indicating that the lamps of the moving body 15 are lit. This reduces the differences in color and luminance between the images caused by whether the lamps are lit, and can further improve the visibility of the combined image.
(Second Embodiment)
 Next, a second embodiment of the present invention will be described. The configuration of the camera system 100 according to the second embodiment is the same as that of the first embodiment, but the functions of the image processing units and the configuration of the correction parameter output device differ.
 As in the first embodiment, the image processing units 19a, 19b, 19c, and 19d (see FIG. 7) apply image processing such as image conversion, color correction, and luminance correction to the images generated by the imagers 18a, 18b, 18c, and 18d. The image processing units 19a, 19b, 19c, and 19d according to this embodiment acquire adjustment parameters, described later, from the correction parameter output device 220. For example, the image processing units 19a, 19b, 19c, and 19d perform image adjustment by multiplying the color signal components of an image by the acquired color adjustment parameters.
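The multiplication step just described can be sketched minimally. The (R, G, B) tuple representation and the 8-bit clamp are assumptions for illustration; the actual units would operate on the imager's own signal representation.

```python
# Minimal sketch of adjusting an image by multiplying its color signal
# components by per-channel color adjustment parameters (gains).

def apply_color_adjustment(pixels, gains):
    """Multiply each color channel of every (R, G, B) pixel by its gain,
    clamping the result to the 8-bit range."""
    r_gain, g_gain, b_gain = gains
    return [
        (min(255, round(r * r_gain)),
         min(255, round(g * g_gain)),
         min(255, round(b * b_gain)))
        for (r, g, b) in pixels
    ]
```

A luminance adjustment parameter could be applied to a luminance signal component in exactly the same multiplicative way, as the closing remarks of the document note.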
 The correction parameter output device 220 according to this embodiment includes a control information acquisition unit 23, a storage unit 24, an output unit 25, a control unit 26, an image acquisition unit 270, and an adjustment parameter calculation unit 280. The configurations and functions of the control information acquisition unit 23, the storage unit 24, and the control unit 26 are the same as in the first embodiment.
 As in the first embodiment, the output unit 25 outputs the correction parameters included in the correction parameter group corresponding to the control information acquired by the control information acquisition unit 23 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R. In this embodiment, the output unit 25 also outputs the adjustment parameters calculated by the adjustment parameter calculation unit 280 to the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
 The image acquisition unit 270 acquires images from the image processing units 19a, 19b, 19c, and 19d of the cameras 12, 13, 14L, and 14R.
 The adjustment parameter calculation unit 280 calculates adjustment parameters for adjusting the plurality of images acquired by the image acquisition unit 270. For example, the adjustment parameter calculation unit 280 calculates the adjustment parameters in a two-stage process, as described below.
 First, the first-stage processing will be described. The adjustment parameter calculation unit 280 selects one of the plurality of images acquired by the image acquisition unit 270 as the reference image. Here, the image from the front camera 12 is taken as the reference image.
 For the reference image and the adjacent images that each share an overlapping area with it (the images from the side cameras 14L and 14R), the adjustment parameter calculation unit 280 calculates the average value of the color signal components of each image in each overlapping area (FLA, FRA).
 When the difference between the average color signal component values of the reference image and an adjacent image is equal to or greater than a predetermined threshold, that is, when the average color difference between the reference image and the adjacent image is large, the adjustment parameter calculation unit 280 calculates a color adjustment parameter to be multiplied into the color signal components of the adjacent image so as to reduce the color difference. For example, the adjustment parameter calculation unit 280 calculates the color adjustment parameter of the adjacent image so that the difference between the average values becomes less than the predetermined threshold, or so that the average value of the adjacent image matches the average value of the reference image.
 The adjustment parameter calculation unit 280 then outputs the calculated color adjustment parameters to the side cameras 14L and 14R via the output unit 25.
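The first-stage calculation can be sketched as follows, under the assumption that the color adjustment parameter is the ratio of overlap-region means, so that multiplying by it brings the adjacent image's mean to the reference mean. The text also permits the weaker goal of merely pushing the difference below the threshold; the ratio form here is one concrete choice, not the patent's prescribed one.

```python
# Sketch of the first-stage adjustment-parameter calculation: compare mean
# color signal components in the overlapping area (e.g. FLA or FRA) of the
# reference image and an adjacent image, and derive per-channel gains.

def mean_channels(region):
    """Per-channel mean of an iterable of (R, G, B) pixels from an overlap region."""
    n = len(region)
    sums = [0, 0, 0]
    for px in region:
        for i in range(3):
            sums[i] += px[i]
    return tuple(s / n for s in sums)

def color_adjust_gains(ref_overlap, adj_overlap, threshold=0.0):
    """Gains to multiply into the adjacent image so its overlap-region mean
    matches the reference image's mean, if the difference exceeds threshold."""
    ref_mean = mean_channels(ref_overlap)
    adj_mean = mean_channels(adj_overlap)
    if all(abs(r - a) < threshold for r, a in zip(ref_mean, adj_mean)):
        return (1.0, 1.0, 1.0)  # difference already small: leave the image as-is
    return tuple(r / a for r, a in zip(ref_mean, adj_mean))
```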
 When the image acquisition unit 270 acquires the images from the side cameras 14L and 14R adjusted based on the color adjustment parameters calculated as described above, the adjustment parameter calculation unit 280 performs the second-stage processing described below.
 For the images from the side cameras 14L and 14R and a third image that shares an overlapping area with each of them and differs from the reference image (the image from the rear camera 13), the adjustment parameter calculation unit 280 calculates the average value of the color signal components of each image in each overlapping area (ReLA, ReRA).
 The adjustment parameter calculation unit 280 calculates an average value A of the average color signal component value of the image from the left side camera 14L and the average color signal component value of the image from the right side camera 14R. The adjustment parameter calculation unit 280 also calculates an average value B of the average color signal component value of the rear camera 13 image in the overlapping area ReLA and the average color signal component value of the rear camera 13 image in the overlapping area ReRA.
 When the difference between the average value A and the average value B is equal to or greater than a predetermined threshold, the adjustment parameter calculation unit 280 calculates a color adjustment parameter for the image from the rear camera 13 so that, for example, the difference between the average values becomes less than the predetermined threshold, or so that the average value B matches the average value A.
 The adjustment parameter calculation unit 280 then outputs the calculated color adjustment parameter to the rear camera 13 via the output unit 25.
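The second-stage calculation of averages A and B can be sketched in the same spirit. Scalar means stand in for the per-channel color signal averages, and the ratio form of the adjustment parameter is the same illustrative assumption as in the first-stage sketch.

```python
# Sketch of the second stage: average A over the two (already adjusted)
# side-camera overlap means, average B over the rear camera's means in the
# overlapping areas ReLA and ReRA, and a gain that brings B to A.

def rear_camera_gain(left_side_mean, right_side_mean,
                     rear_left_mean, rear_right_mean, threshold=0.0):
    a = (left_side_mean + right_side_mean) / 2  # side-camera average A
    b = (rear_left_mean + rear_right_mean) / 2  # rear-camera average B
    if abs(a - b) < threshold:
        return 1.0  # averages already close enough: no adjustment
    return a / b
```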
 Preferably, the adjustment parameter calculation unit 280 determines which of the plurality of images acquired by the image acquisition unit 270 to use as the reference image based on the control information acquired by the control information acquisition unit 23. Here, the control information includes information indicating the traveling direction of the moving body 15. The adjustment parameter calculation unit 280 determines the traveling direction of the moving body 15 based on the control information and selects as the reference image, for example, the image from the front camera 12 when moving forward, the image from the rear camera 13 when reversing, and the image from the side camera 14 (14L or 14R) when turning left or right.
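This direction-to-reference mapping is small enough to state directly. The direction labels and the fallback are assumptions; the patent only names the forward/reverse/turn cases.

```python
# Sketch of selecting the reference camera from the traveling direction
# carried in the control information.

def reference_camera(direction):
    return {
        "forward": "front",        # front camera 12
        "reverse": "rear",         # rear camera 13
        "left_turn": "left_side",  # side camera 14L
        "right_turn": "right_side" # side camera 14R
    }.get(direction, "front")      # assumed fallback: front camera
```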
 Next, the processing executed by the camera system 100 according to the second embodiment will be described with reference to the flowchart of FIG. 8. This processing starts, for example, when the imaging device 10 is activated, and is executed repeatedly until the user issues a termination instruction.
 In steps S200 to S203, the same processing as in steps S100 to S103 of the first embodiment is performed.
 Subsequently, the camera control units 20a, 20b, 20c, and 20d control the image processing units 19a, 19b, 19c, and 19d to correct the overhead images of step S203 based on the correction parameters of step S202 (step S204), and output the corrected overhead images to the correction parameter output device 220.
 Next, based on the control information of step S201, the correction parameter output device 220 selects one of the plurality of corrected overhead images of step S204 as the reference image (step S205). Here, the image from the front camera 12 is taken as the reference image.
 Subsequently, the correction parameter output device 220 calculates color adjustment parameters for adjusting the images from the side cameras 14L and 14R, based on the color signal components of the reference image and of the side camera 14L and 14R images among the plurality of corrected overhead images of step S204 (step S206), and outputs them to the side cameras 14L and 14R.
 Next, the side cameras 14L and 14R adjust their images corrected in step S204, based on the color adjustment parameters of step S206 (step S207), and output the adjusted images to the correction parameter output device 220.
 Next, the correction parameter output device 220 calculates a color adjustment parameter for adjusting the image from the rear camera 13, based on the color signal components of the side camera 14L and 14R images adjusted in step S207 and of the rear camera 13 image corrected in step S204 (step S208), and outputs it to the rear camera 13.
 Next, the rear camera 13 adjusts its image corrected in step S204, based on the color adjustment parameter of step S208 (step S209).
 Subsequently, the cameras 12, 13, 14L, and 14R output the corrected or adjusted overhead images to the image combining device 21 (step S210). Specifically, they output the front camera 12 image corrected in step S204, the side camera 14L and 14R images adjusted in step S207, and the rear camera 13 image adjusted in step S209.
 Next, the image combining device 21 generates a combined image from the overhead images of step S210 (step S211) and outputs it to the display device 11.
 The display device 11 then displays the combined image of step S211 (step S212).
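The sequencing of the second embodiment's adjustment steps can be sketched end to end. Images are reduced to scalars for brevity, and `gain_between` stands in for the overlap-mean comparison; both are assumptions made purely to show the ordering of steps S205 to S209.

```python
# Hypothetical sketch of the FIG. 8 adjustment sequence, after the pattern-based
# correction of step S204 has already been applied.

def second_embodiment_adjustment(corrected, gain_between):
    """corrected: dict of per-camera overhead 'images' (scalars here);
    gain_between(ref, img) returns a multiplicative adjustment parameter."""
    # S205-S207: adjust the side images against the front (reference) image.
    for side in ("left", "right"):
        corrected[side] *= gain_between(corrected["front"], corrected[side])
    # S208-S209: adjust the rear image against the adjusted side images.
    side_avg = (corrected["left"] + corrected["right"]) / 2
    corrected["rear"] *= gain_between(side_avg, corrected["rear"])
    # S210-S211: the combined image is then built from this adjusted set.
    return corrected
```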
 As described above, the camera system of the second embodiment corrects the images from the cameras 12, 13, 14L, and 14R in the same manner as the camera system of the first embodiment, and then additionally adjusts them with the color signal components of the reference image as the standard, so the visibility of the combined image can be improved.
 In addition, since the second embodiment determines the reference image according to, for example, the traveling direction of the moving body 15, the colors of the other images are corrected to match the color of the image containing the peripheral area of the moving body 15 that the driver is paying attention to, which can further improve the visibility of the combined image.
 Although the present invention has been described with reference to the drawings and embodiments, it should be noted that those skilled in the art can easily make various variations and modifications based on this disclosure. These variations and modifications are therefore included in the scope of the present invention.
 For example, in the embodiments described above, the imaging device 10 includes the front camera 12, the rear camera 13, and the side cameras 14, but it may include more cameras. For example, it may further include a distant-view camera capable of imaging the entire surroundings of the moving body 15.
 The components of the camera systems of the embodiments described above can also be divided and rearranged. For example, the image combining device 21 and the correction parameter output device 22 may be separated from the front camera 12 and configured as a standalone device. Alternatively, a navigation system may be further provided, with the image combining device 21 and the correction parameter output device 22 incorporated into the navigation system.
 In the embodiments described above, any other parameters related to image correction may be adopted as correction parameters. Further, although the second embodiment describes a configuration in which the adjustment parameter calculation unit 280 calculates color adjustment parameters, a luminance adjustment parameter may be calculated by the same processing. In that case, for example, the image processing units 19a, 19b, 19c, and 19d perform image adjustment by multiplying the luminance signal component of an image by the luminance adjustment parameter.
 In the embodiments described above, the control information has been described as including time information, information indicating that the lamps of the moving body 15 are lit, information indicating the traveling direction of the moving body 15, and so on, but it may include any other information.
 Further, although the embodiments described above describe a configuration in which the image processing units 19a, 19b, 19c, and 19d correct the captured images (or overhead images) generated by the imagers 18a, 18b, 18c, and 18d, the correction parameters may instead be input to an AFE (Analog Front End) having a gain controller or to a white balance control unit, so that color and luminance are corrected when the imagers 18a, 18b, 18c, and 18d generate the images.
 In the embodiments described above, the luminance correction parameter has been described as a parameter multiplied into the luminance signal component of an image, but it may also be, for example, a parameter indicating an aperture value and a shutter speed. In that case, the luminance correction parameter is input to the aperture and exposure time control unit to adjust the luminance. A parameter multiplied into the luminance signal component of the image and a parameter indicating the aperture value and shutter speed may also be used in combination as the luminance correction parameter.
 In the second embodiment, the adjustment parameter calculation unit 280 calculates the correction parameter for the image from the rear camera 13 using the color signal components of the images from both the left and right side cameras 14, but it may instead calculate the correction parameter using the color signal components of the image from only one of the left and right side cameras 14.
DESCRIPTION OF SYMBOLS
10  imaging device
11  display device
12  front camera
13  rear camera
14, 14L, 14R  side camera
15  moving body
16  door mirror
17a, 17b, 17c, 17d  optical system
18a, 18b, 18c, 18d  imager
19a, 19b, 19c, 19d  image processing unit
20a, 20b, 20c, 20d  camera control unit
21  image combining device
22, 220  correction parameter output device
23  control information acquisition unit
24  storage unit
25  output unit
26  control unit
100  camera system
101  in-vehicle network
270  image acquisition unit
280  adjustment parameter calculation unit

Claims (9)

  1.  An image correction parameter output device comprising:
     a storage unit configured to store a plurality of patterns of correction parameter groups, for correcting a plurality of images capturing a peripheral area of a moving body with partial overlap, in association with control information of the moving body;
     a control information acquisition unit configured to acquire the control information of the moving body; and
     an output unit configured to output the correction parameter group corresponding to the acquired control information.
  2.  The correction parameter output device according to claim 1, wherein the output unit outputs different correction parameter groups depending on whether the control information includes information indicating that lamps of the moving body are lit.
  3.  The correction parameter output device according to claim 1, wherein the correction parameter groups include color correction parameters for correcting the plurality of images so as to reduce color differences between them.
  4.  The correction parameter output device according to claim 1, further comprising:
     an image acquisition unit configured to acquire a plurality of images corrected based on the correction parameter group output by the output unit; and
     an adjustment parameter calculation unit configured to select one of the plurality of corrected images as a reference image, and to calculate, based on color signal components in an overlapping area between the reference image and another corrected image, a color adjustment parameter for adjusting the other corrected image,
     wherein the output unit outputs the calculated color adjustment parameter.
  5.  The correction parameter output apparatus according to claim 4, wherein:
     the image acquisition unit acquires an image adjusted based on the color adjustment parameter output by the output unit;
     the adjustment parameter calculation unit calculates, based on color signal components in an overlapping region between the adjusted image and a corrected image other than the reference image, a color adjustment parameter for adjusting that corrected image; and
     the output unit outputs the calculated color adjustment parameter.
  6.  The correction parameter output apparatus according to claim 4, wherein the reference image is selected from among the plurality of corrected images based on the control information.
  7.  The correction parameter output apparatus according to claim 6, wherein the reference image is selected from among the plurality of corrected images based on information, included in the control information, indicating the traveling direction of the moving body.
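Claims 4 and 5 describe deriving color adjustment parameters from the region where two camera images overlap. A minimal sketch of that idea in Python/NumPy, assuming a per-channel gain model and a boolean overlap mask; the function names and the gain formulation are illustrative, not taken from the patent:

```python
import numpy as np

def color_adjustment_gains(reference, other, overlap_mask):
    """Per-channel gains that bring `other` toward `reference` inside the
    shared overlap region (hypothetical helper; images are HxWx3 floats)."""
    ref_mean = reference[overlap_mask].mean(axis=0)  # mean RGB over overlap pixels
    oth_mean = other[overlap_mask].mean(axis=0)
    return ref_mean / np.maximum(oth_mean, 1e-6)     # guard against divide-by-zero

def apply_gains(image, gains):
    """Apply the calculated color adjustment parameters to an image."""
    return np.clip(image * gains, 0.0, 1.0)
```

Claim 5's iteration would repeat this pairwise step around the vehicle, treating each freshly adjusted image as the new anchor for its next neighbor.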
  8.  A camera system comprising:
     a plurality of imaging units configured to generate a plurality of images that capture a peripheral area of a moving body with partial overlap;
     a storage unit configured to store a plurality of patterns of correction parameter groups for correcting the plurality of images, in association with control information of the moving body;
     a control information acquisition unit configured to acquire the control information of the moving body;
     an output unit configured to output the correction parameter group corresponding to the acquired control information;
     an image processing unit configured to correct the plurality of images based on the correction parameters output by the output unit; and
     an image combining unit configured to combine the corrected images to generate a combined image.
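The camera-system pipeline of claim 8 ends by combining the corrected images into one composite. A simplified sketch of the combining step, assuming precomputed (row, column) placements on a top-view canvas; the camera names and the simple non-blending paste are assumptions for illustration, since a production surround-view system would warp and blend the overlaps:

```python
import numpy as np

def combine_corrected_images(images, placements, canvas_shape):
    """Paste each corrected camera image into its region of a combined
    top-view canvas. `placements` maps camera name -> (row, col) offset."""
    canvas = np.zeros(canvas_shape, dtype=np.float32)
    for name, img in images.items():
        r, c = placements[name]
        h, w = img.shape[:2]
        canvas[r:r + h, c:c + w] = img  # real systems blend the overlap seams
    return canvas
```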
  9.  An image correction parameter output method comprising:
     storing a plurality of patterns of correction parameter groups for correcting a plurality of images that capture a peripheral area of a moving body with partial overlap, in association with control information of the moving body;
     acquiring the control information of the moving body; and
     outputting the correction parameter group corresponding to the acquired control information.
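The method of claim 9 reduces to a keyed lookup: parameter groups are stored against control-information patterns, then the group matching the acquired control information is output. A hedged sketch, where the headlight-based keys and parameter fields are purely illustrative (the patent does not fix a concrete data format):

```python
# Hypothetical control-information patterns and correction parameter groups.
PARAMETER_GROUPS = {
    "lights_off": {"white_balance": (1.0, 1.0, 1.0), "gamma": 2.2},
    "lights_on":  {"white_balance": (1.1, 1.0, 0.8), "gamma": 2.0},
}

def output_correction_parameters(control_info):
    """Return the stored correction parameter group matching the moving
    body's control information (the store/acquire/output steps of claim 9)."""
    key = "lights_on" if control_info.get("headlights") else "lights_off"
    return PARAMETER_GROUPS[key]
```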
PCT/JP2014/005471 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method WO2015064095A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP14858173.9A EP3065390B1 (en) 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method
PCT/JP2014/005471 WO2015064095A1 (en) 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method
US15/032,910 US10097733B2 (en) 2013-10-29 2014-10-29 Image correction parameter output apparatus, camera system and correction parameter output method
JP2015544805A JPWO2015064095A1 (en) 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-224508 2013-10-29
JP2013224508 2013-10-29
PCT/JP2014/005471 WO2015064095A1 (en) 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method

Publications (1)

Publication Number Publication Date
WO2015064095A1 2015-05-07

Family

Family ID: 53003722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/005471 WO2015064095A1 (en) 2013-10-29 2014-10-29 Image correction parameter output device, camera system, and correction parameter output method

Country Status (4)

Country Link
US (1) US10097733B2 (en)
EP (1) EP3065390B1 (en)
JP (1) JPWO2015064095A1 (en)
WO (1) WO2015064095A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018533858A (en) * 2015-08-27 2018-11-15 Nokia Technologies Oy Method and apparatus for modifying multi-frame images based on anchor frames

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
JP6520919B2 (en) * 2014-03-28 2019-05-29 日本電気株式会社 Image correction apparatus, image correction method and program
JP6354578B2 (en) * 2014-12-26 2018-07-11 株式会社Jvcケンウッド Imaging system
JP2018206323A (en) * 2017-06-09 2018-12-27 アイシン精機株式会社 Image processing apparatus
CN109040613B (en) * 2017-06-09 2022-03-25 株式会社爱信 Image processing apparatus
CN109218630B (en) * 2017-07-06 2022-04-12 腾讯科技(深圳)有限公司 Multimedia information processing method and device, terminal and storage medium
US11138761B2 (en) * 2018-06-12 2021-10-05 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
CN108650495B (en) * 2018-06-28 2024-04-26 华域视觉科技(上海)有限公司 Panoramic looking-around system for vehicle and self-adaptive light supplementing method thereof
JP7115253B2 (en) * 2018-11-28 2022-08-09 トヨタ自動車株式会社 In-vehicle camera system

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2007141098A (en) * 2005-11-21 2007-06-07 Nissan Motor Co Ltd Image-processing device and image-processing method
JP2007243464A (en) * 2006-03-07 2007-09-20 Aisin Aw Co Ltd Method and device for supporting parking
JP2010116196A (en) 2008-11-14 2010-05-27 Nihon Tetra Pak Kk Cutting and opening apparatus for square-like packaged article packaged with sheet-like packaging material
JP2011049735A (en) * 2009-08-26 2011-03-10 Alpine Electronics Inc Vehicle periphery image providing device
JP2011205375A (en) * 2010-03-25 2011-10-13 Fujitsu Ten Ltd Image generating apparatus, image display system, and image generating method

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP3297040B1 (en) * 2001-04-24 2002-07-02 松下電器産業株式会社 Image composing and displaying method of vehicle-mounted camera and apparatus therefor
EP1771811A4 (en) * 2004-07-26 2010-06-09 Silicon Optix Inc Panoramic vision system and method
JP2007312126A (en) * 2006-05-18 2007-11-29 Toshiba Corp Image processing circuit
KR100966288B1 (en) 2009-01-06 2010-06-28 주식회사 이미지넥스트 Around image generating method and apparatus
JP5347716B2 (en) * 2009-05-27 2013-11-20 ソニー株式会社 Image processing apparatus, information processing method, and program
JP5299867B2 (en) * 2009-06-30 2013-09-25 日立コンシューマエレクトロニクス株式会社 Image signal processing device
US8502860B2 (en) * 2009-09-29 2013-08-06 Toyota Motor Engineering & Manufacturing North America (Tema) Electronic control system, electronic control unit and associated methodology of adapting 3D panoramic views of vehicle surroundings by predicting driver intent
JP6163207B2 (en) * 2013-07-18 2017-07-12 クラリオン株式会社 In-vehicle device
KR101519209B1 (en) * 2013-08-06 2015-05-11 현대자동차주식회사 Apparatus and method for providing image
CN105984387A (en) * 2015-02-06 2016-10-05 德尔福电子(苏州)有限公司 Aerial view monitor system with function of automatic aligning
US10040394B2 (en) * 2015-06-17 2018-08-07 Geo Semiconductor Inc. Vehicle vision system


Non-Patent Citations (1)

Title
See also references of EP3065390A4


Also Published As

Publication number Publication date
EP3065390B1 (en) 2019-10-02
EP3065390A1 (en) 2016-09-07
EP3065390A4 (en) 2017-08-02
US10097733B2 (en) 2018-10-09
US20160269597A1 (en) 2016-09-15
JPWO2015064095A1 (en) 2017-03-09

Similar Documents

Publication Publication Date Title
WO2015064095A1 (en) Image correction parameter output device, camera system, and correction parameter output method
JP6724982B2 (en) Signal processing device and imaging device
JP6305430B2 (en) Imaging setting changing device, imaging system, and imaging setting changing method
JP2009017020A (en) Image processor and method for generating display image
US11082631B2 (en) Image processing device
JP2018078420A (en) Vehicle image display device and vehicle image display program
JP4363207B2 (en) Image processing method, image processing system, and image processing apparatus
JP5190715B2 (en) In-vehicle monitor system, parking support apparatus using the in-vehicle monitor system, and color adjustment method for in-vehicle monitor system
JP2013016981A (en) Imaging display control system
US20180069998A1 (en) Imaging apparatus, imaging system, and vehicle
JP5716944B2 (en) In-vehicle camera device
JP6266022B2 (en) Image processing device, alarm device, and image processing method
JP2008230464A (en) Automatic exposure device for on-vehicle camera
KR101601324B1 (en) Image acquiring method of vehicle camera system
JP6322723B2 (en) Imaging apparatus and vehicle
JP2013009041A (en) Vehicle photographing display control system
JP2017183880A (en) Electric circuit for electronic mirror
WO2015115103A1 (en) Image processing device, camera system, and image processing method
JP2011181019A (en) Bird's-eye view image generation device
JP6713938B2 (en) Surrounding situation grasp support system
JP2019197948A (en) Imaging device and method for controlling the same
JP2018074440A (en) Vehicle rear image display device
JP2014021543A (en) Visual field support device and program for vehicle
JP2015070280A (en) Image processing system, camera system, and image processing method
JP2019129339A (en) Photographed image display system and electronic mirror system

Legal Events

Date Code Title Description

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 14858173; country of ref document: EP; kind code of ref document: A1)

ENP Entry into the national phase (ref document number: 2015544805; country of ref document: JP; kind code of ref document: A)

REEP Request for entry into the European phase (ref document number: 2014858173; country of ref document: EP)

WWE Wipo information: entry into national phase (ref document number: 2014858173; country of ref document: EP)

WWE Wipo information: entry into national phase (ref document number: 15032910; country of ref document: US)

NENP Non-entry into the national phase (ref country code: DE)