WO2016129552A1 - Camera parameter adjustment device (カメラパラメータ調整装置) - Google Patents

Camera parameter adjustment device (カメラパラメータ調整装置)

Info

Publication number
WO2016129552A1
WO2016129552A1 (PCT/JP2016/053653; JP2016053653W)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
parameter
vehicle
unit
Prior art date
Application number
PCT/JP2016/053653
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
大輔 杉浦
Original Assignee
株式会社デンソー
Priority date
Filing date
Publication date
Application filed by 株式会社デンソー
Priority to DE112016000689.6T (published as DE112016000689T5)
Publication of WO2016129552A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R 1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R 1/27 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B60R 2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/40 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
    • B60R 2300/402 Image calibration
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B 15/006 Apparatus mounted on flying objects
    • G03B 37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B 37/04 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view

Definitions

  • the present disclosure relates to a camera parameter adjustment device that adjusts camera parameters for generating a bird's-eye view image from an image captured by an in-vehicle camera that captures a vehicle periphery.
  • A device is known that generates an image in which the vehicle and its surroundings are seen from above the own vehicle (that is, an overhead image) by combining the images taken by a plurality of in-vehicle cameras that photograph respective areas around the own vehicle, and displays the image on a display.
  • In such a device, each captured image of each in-vehicle camera is converted into an overhead image.
  • an overhead image generated from an image captured by a certain in-vehicle camera is referred to as a partial overhead image.
  • A plurality of partial overhead images generated from the captured images of the respective on-vehicle cameras are then connected by a synthesis process, thereby generating an overhead image covering all directions around the host vehicle (referred to as an integrated overhead image).
  • When the camera parameters do not correspond to the actual mounting posture or mounting position of the in-vehicle camera, the relative position between a subject and the host vehicle shown in the bird's-eye view image deviates from the actual relative position.
  • Patent Document 1 discloses a method for specifying a pitch angle and a mounting height of a camera provided in a vehicle and correcting the pitch angle and the mounting height as camera parameters.
  • Patent Document 2 also discloses a method for evaluating the mounting error of the in-vehicle camera.
  • By using such methods, the camera parameters of each in-vehicle camera can be adjusted.
  • Even so, the outline of a single subject may end up being displayed discontinuously near the joint between partial overhead images in the integrated overhead image. If the contours of the subject are displayed discontinuously at the joints of the partial overhead images, the user feels uncomfortable.
  • The present disclosure has been made in view of this situation, and its object is to provide a camera parameter adjustment device capable of suppressing the contour of a subject from being displayed with a shift at the joint between partial overhead images, even when one subject is displayed across a plurality of partial overhead images in the integrated overhead image.
  • The present disclosure for achieving the above object includes: a plurality of (at least two) cameras that are mounted on a vehicle and photograph different ranges around the vehicle; a parameter storage unit that stores camera parameters, set for each of the plurality of cameras, representing the mounting position and mounting posture of the camera with respect to the vehicle; an overhead view conversion processing unit that generates, based on an image captured by a camera and the camera parameters corresponding to that camera, a partial overhead view image that is an overhead view image corresponding to the shooting range of the camera; a reference camera selection unit that determines, among the plurality of cameras, a reference camera serving as a reference for adjusting the camera parameters of a predetermined adjustment target camera; a first relative position specifying unit that specifies, based on an image captured by the reference camera and the camera parameters corresponding to the reference camera, the relative position with respect to the vehicle of an object photographed by the reference camera; a second relative position specifying unit that specifies, based on an image captured by the adjustment target camera and the camera parameters corresponding to the adjustment target camera, the relative position with respect to the vehicle of an object photographed by the adjustment target camera; a deviation degree calculation unit that calculates, for a common photographed object that is an object photographed by both the adjustment target camera and the reference camera, the degree of deviation between the relative position specified by the second relative position specifying unit and the relative position specified by the first relative position specifying unit; and a parameter calculation unit that calculates, as the camera parameters for the adjustment target camera, a matching parameter that is a camera parameter for which the deviation degree calculated by the deviation degree calculation unit becomes 0.
  • the overhead view conversion processing unit uses the matching parameter calculated by the parameter calculation unit as a camera parameter when generating a partial overhead view image from an image captured by the adjustment target camera.
  • With the above configuration, the deviation degree calculation unit calculates the degree of deviation between the relative position of an object determined from the image of the reference camera and the relative position of that object determined from the image of the adjustment target camera.
  • the parameter calculation unit calculates a camera parameter (that is, a matching parameter) in which the deviation degree calculated by the deviation degree calculation unit is 0 as the camera parameter of the adjustment target camera.
  • the overhead view conversion processing unit generates a partial overhead view image using the calculated matching parameter for the image captured by the adjustment target camera.
  • The matching parameter calculated here is, for example, a camera parameter such that the relative position of the common photographed object specified by the second relative position specifying unit, when the matching parameter is used instead of the preset camera parameter, becomes equal to the relative position of the common photographed object specified by the first relative position specifying unit.
  • Therefore, the positional relationship between the object and the vehicle represented in the partial overhead view image generated from the captured image of the adjustment target camera using the matching parameter matches the positional relationship between the object and the vehicle represented in the partial overhead view image generated from the image of the reference camera.
  • The plurality of partial overhead images generated based on the images taken by the plurality of cameras are used to generate one overhead image that represents the surroundings of the vehicle (that is, an integrated overhead image), as described above in the background art section.
  • Accordingly, even when one subject is displayed across a plurality of partial overhead images, it is possible to suppress the contour of the subject from being displayed with a shift at the joint between the partial overhead images.
  • FIG. 2 is a block diagram illustrating an example of a schematic configuration of the control unit 1.
  • FIG. 3 is a diagram for explaining the change, accompanying the movement of the own vehicle, in the positional relationship between the corresponding region of a partial overhead view image and the own vehicle.
  • FIG. 4 is a conceptual diagram showing the environment in which the own vehicle is traveling, the front bird's-eye view area Af, and the rear bird's-eye view area Ar. FIG. 5 is an example of an integrated overhead view image.
  • FIG. 10 is a conceptual diagram for explaining an effect of the configuration of modification 5.
  • FIG. 1 is a diagram showing an example of a schematic configuration of a driving support system 100 to which a camera parameter adjustment device according to the present invention is applied.
  • the driving support system 100 assists the driver in recognizing the situation around the vehicle by converting a captured image of a camera that captures a predetermined area outside the passenger compartment into a bird's-eye view image and displaying it on a display.
  • the vehicle on which the driving support system 100 is mounted is referred to as a host vehicle, and the configuration and operation of the system will be described.
  • The driving support system 100 includes a control unit 1, a front camera 2, a rear camera 3, a wheel speed sensor 4, a steering angle sensor 5, a shift position sensor 6, and a display 7.
  • The control unit 1, the front camera 2, the rear camera 3, the wheel speed sensor 4, the steering angle sensor 5, the shift position sensor 6, and the display 7 are configured to be able to communicate with each other via a known in-vehicle network.
  • the control unit 1 controls the operation of the driving support system 100. Details of the control unit 1 will be described later.
  • the front camera 2 is a camera provided to photograph a predetermined range in front of the host vehicle (referred to as a front photographing range).
  • As the front camera 2, for example, a well-known CMOS camera or CCD camera whose shooting range is set to a wide angle (for example, an angle of view of 175°) by a wide-angle lens can be used.
  • the front camera 2 may be installed, for example, near the center of the front bumper in the vehicle width direction so that a desired range in front of the host vehicle is the shooting range.
  • The installation position of the front camera 2 is not limited to the vicinity of the center of the front bumper in the vehicle width direction; it only has to be attached at a position that does not obstruct the driver's forward view, for example, near the rearview mirror in the vehicle interior or near the upper end of the windshield. Video signals captured by the front camera 2 are sequentially output to the control unit 1.
  • the rear camera 3 is a camera provided so as to photograph a predetermined range (referred to as a rear photographing range) behind the host vehicle.
  • the rear camera 3 may be a known CMOS camera or CCD camera whose shooting range is set to a wide angle by a wide angle lens.
  • the rear camera 3 may be installed, for example, near the center of the rear bumper in the vehicle width direction so that a desired range behind the host vehicle is the shooting range.
  • the installation position of the rear camera 3 is not limited to the vicinity of the center part in the vehicle width direction of the rear bumper, and may be attached to a position that does not block the view for the driver's rear confirmation, for example, near the upper end of the rear window.
  • Video signals taken by the rear camera 3 are sequentially output to the control unit 1.
  • the wheel speed sensor 4 sequentially outputs a pulse signal corresponding to the rotational speed of the wheel (for example, every several tens of milliseconds).
  • the control unit 1 converts the pulse signal input from the wheel speed sensor 4 into a vehicle speed using a known method.
  • the steering angle sensor 5 sequentially detects the steering angle and sequentially outputs a signal corresponding to the steering angle.
  • As the steering angle sensor 5, for example, a rotational torque sensor that detects the rotational torque generated when the driver operates the steering wheel can be used.
  • the shift position sensor 6 detects the shift position of the vehicle and outputs a signal corresponding to the detected shift position to the control unit 1.
  • the shift position includes a reverse position for transmitting driving force in a direction in which the host vehicle moves backward, a drive position for transmitting driving force in a direction in which the host vehicle moves forward, and the like.
  • Display 7 displays text and images based on signals input from control unit 1.
  • the display 7 is capable of full color display, for example, and can be configured using a liquid crystal display, an organic EL display, or the like.
  • the display 7 is a display disposed near the center of the instrument panel in the vehicle width direction.
  • the display 7 may be a display provided in the meter unit or a known head-up display.
  • the control unit 1 is configured as a normal computer, and includes a well-known CPU 11, a memory 12, a storage 13, an input / output interface (hereinafter referred to as I / O), a bus line connecting these configurations, and the like.
  • the CPU 11 is a well-known central processing unit, and executes various arithmetic processes by using the memory 12 as an arithmetic area.
  • the memory 12 may be realized by a temporary storage medium such as a RAM, for example, and functions as a main storage device for the CPU 11.
  • the storage 13 may be realized by a non-volatile storage medium such as a ROM or a flash memory, and functions as an auxiliary storage device for the CPU 11. Although only one CPU 11 is shown here, a plurality of CPUs 11 may be provided.
  • the I / O controls data transmission / reception between the control unit 1 and an external device connected to the control unit 1 such as the front camera 2 and the rear camera 3.
  • the I / O converts the video signals input from the front camera 2 and the rear camera 3 into image data in a format that allows image processing such as a bird's-eye view described later, and stores the image data in the memory 12.
  • the storage 13 stores programs for executing various processes.
  • the storage 13 stores camera parameters set in advance for each in-vehicle camera.
  • The camera parameters include parameters indicating the installation position and mounting orientation of each in-vehicle camera with respect to the center of the vehicle (external parameters), and parameters such as the lens distortion coefficient, focal length, optical axis center, pixel size, and pixel ratio (internal parameters).
  • The installation position of each in-vehicle camera may be represented by, for example, three-dimensional coordinates with the center of the host vehicle as the origin, the vehicle longitudinal direction as the X axis, the vehicle width direction as the Y axis, and the vehicle height direction as the Z axis.
  • the mounting posture represents the optical axis direction of the in-vehicle camera.
  • the mounting posture may be represented by the angles (the so-called pitch angle, roll angle, and yaw angle) formed by the optical axis of the in-vehicle camera with respect to each of the aforementioned X axis, Y axis, and Z axis.
  • The center of the host vehicle serving as the reference point for defining the installation position is, for example, the point on the vehicle center line (equidistant from both sides of the host vehicle) at which the distances to the front end and the rear end of the vehicle are equal.
  • Alternatively, the center position of the rear wheel axle in the vehicle width direction may be used as the center.
  • the internal parameters described above are used for performing distortion correction on image data taken by each in-vehicle camera.
  • The external parameters are used, for example, for viewpoint conversion (coordinate conversion) of the image that has undergone distortion correction processing, and for specifying the relative position of a subject with respect to the host vehicle from the position of the subject in the captured image.
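  • As a concrete illustration of the external and internal parameters and of how they let the relative position of a subject be recovered from its pixel position, the following is a minimal Python sketch. The class layout, the Euler-angle convention, and the flat-road (Z = 0) assumption are ours, not stated by the patent.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class CameraParams:
    position: np.ndarray   # (3,) mounting position in the vehicle frame, metres
    pitch: float           # mounting posture (rad)
    roll: float
    yaw: float
    fx: float              # internal parameters of the (distortion-corrected) image
    fy: float
    cx: float
    cy: float

    def rotation(self) -> np.ndarray:
        """Rotation taking camera-frame directions into the vehicle frame
        (Z-Y-X Euler composition assumed; the patent does not fix a convention)."""
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cr, sr = np.cos(self.roll), np.sin(self.roll)
        cyw, syw = np.cos(self.yaw), np.sin(self.yaw)
        Rz = np.array([[cyw, -syw, 0.0], [syw, cyw, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
        Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
        return Rz @ Ry @ Rx

def pixel_to_ground(u: float, v: float, cam: CameraParams) -> np.ndarray:
    """Project an undistorted pixel (u, v) onto the road plane Z = 0 of the
    vehicle frame, i.e. recover the subject's position relative to the vehicle."""
    ray_cam = np.array([(u - cam.cx) / cam.fx, (v - cam.cy) / cam.fy, 1.0])
    ray_veh = cam.rotation() @ ray_cam           # viewing ray in the vehicle frame
    t = -cam.position[2] / ray_veh[2]            # scale factor down to the road plane
    return cam.position + t * ray_veh            # (X, Y, 0) relative to the vehicle centre
```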
  • the area 13a storing the camera parameters described above corresponds to the parameter storage unit described in the claims.
  • the storage 13 stores data of an image obtained by looking down at the host vehicle.
  • The control unit 1 includes, as functional blocks realized by executing a program stored in the storage 13, a parameter acquisition unit F1, an overhead view conversion processing unit F2, a movement amount specifying unit F3, a history image management unit F4, an interpolation overhead image generation unit F5, an image composition unit F6, and a display processing unit F7.
  • The control unit 1 further includes a reference camera setting unit F8, a common photographed object determination unit F9, a front relative position specifying unit G1, a rear relative position specifying unit G2, a deviation degree calculation unit G3, and a parameter calculation unit G4.
  • some or all of the functions of the control unit 1 may be configured by hardware using one or a plurality of ICs.
  • the control unit 1 corresponds to the camera parameter adjusting device described in the claims.
  • the parameter acquisition unit F1 first refers to the storage 13 and acquires camera parameters for each in-vehicle camera. Further, the parameter acquisition unit F1 in the present embodiment includes an error correction unit F11 corresponding to the camera mounting error correction function disclosed in Patent Document 1.
  • The error correction unit F11 calculates, for each in-vehicle camera, the error of the pitch angle and the mounting height with respect to their design values among the various camera parameters, and corrects the pitch angle and the mounting height by the calculated error. That is, the error correction unit F11 plays the role of correcting the pitch angle and the mounting height among the various camera parameters. Data (error data) indicating the difference between the calculated pitch angle and mounting height and their design values is provided to the reference camera setting unit F8.
  • the error correction unit F11 corresponds to the error calculation unit described in the claims.
  • the overhead view conversion processing unit F2 generates an overhead view image (that is, a partial overhead view image) corresponding to the shooting range of the in-vehicle camera from the image data captured by each in-vehicle camera stored in the memory 12. For example, the overhead conversion processing unit F2 performs a known distortion correction process on the image data captured by the front camera 2 using a distortion correction parameter corresponding to the front camera 2. Then, viewpoint conversion processing is performed on the image subjected to the distortion correction processing using camera parameters corresponding to the front camera 2, and a partial overhead view image (front overhead image) corresponding to the imaging region of the front camera 2 is obtained. create.
  • a series of processes for generating a partial overhead view image from a captured image is also referred to as an overhead view conversion process.
  • The process by which the overhead view conversion processing unit F2 generates a partial overhead image (rear overhead image) corresponding to the imaging area of the rear camera 3 from the image data captured by the rear camera 3 is the same as the process for generating the front overhead image from the image data captured by the front camera 2.
  • That is, the overhead view conversion processing unit F2 generates the rear overhead image based on the image data captured by the rear camera 3 and the camera parameters corresponding to the rear camera 3.
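  • A common way to implement this viewpoint conversion is inverse mapping: for every ground-plane cell of the bird's-eye view area, compute which pixel of the distortion-corrected camera image observes that cell and sample it. The sketch below reuses the hypothetical CameraParams helper from the earlier sketch; the area bounds and resolution are illustrative values, not taken from the patent.

```python
import numpy as np

def ground_to_pixel(X: float, Y: float, cam: CameraParams):
    """Project a road-plane point (X, Y, 0) of the vehicle frame into the
    undistorted camera image; returns None if the point is behind the camera."""
    p_cam = cam.rotation().T @ (np.array([X, Y, 0.0]) - cam.position)
    if p_cam[2] <= 1e-6:
        return None
    u = cam.fx * p_cam[0] / p_cam[2] + cam.cx
    v = cam.fy * p_cam[1] / p_cam[2] + cam.cy
    return u, v

def make_partial_overhead(image: np.ndarray, cam: CameraParams,
                          x_range=(1.0, 6.0), y_range=(-3.0, 3.0),
                          px_per_m: int = 40) -> np.ndarray:
    """Render a partial overhead image of the given ground area (vehicle-frame
    metres) by nearest-neighbour sampling of the camera image."""
    h = int((x_range[1] - x_range[0]) * px_per_m)
    w = int((y_range[1] - y_range[0]) * px_per_m)
    overhead = np.zeros((h, w) + image.shape[2:], dtype=image.dtype)
    for i in range(h):
        for j in range(w):
            X = x_range[1] - i / px_per_m        # top row = farthest ahead of the vehicle
            Y = y_range[0] + j / px_per_m
            uv = ground_to_pixel(X, Y, cam)
            if uv is None:
                continue
            ui, vi = int(round(uv[0])), int(round(uv[1]))
            if 0 <= vi < image.shape[0] and 0 <= ui < image.shape[1]:
                overhead[i, j] = image[vi, ui]
    return overhead
```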
  • The front bird's-eye view image is an image representing a predetermined range within the front photographing range, and the rear bird's-eye view image is an image representing a predetermined range within the rear photographing range.
  • Hereinafter, the area represented by the front bird's-eye view image is referred to as the front bird's-eye view area, and the area represented by the rear bird's-eye view image is referred to as the rear bird's-eye view area.
  • the front overhead image and the rear overhead image generated by the overhead conversion processing unit F2 are temporarily stored in the memory 12 and used by other functional blocks.
  • the latest partial bird's-eye view image sequentially generated by the bird's-eye conversion processing unit F2 is a bird's-eye view image that represents the current state of the shooting region of the in-vehicle camera in real time.
  • the front overhead image and the rear overhead image are distinguished from each other and stored. Moreover, the front bird's-eye view image and the rear bird's-eye view image generated at a plurality of time points may be sorted and stored in time series order.
  • the area 12a that stores partial overhead images corresponding to the respective in-vehicle cameras corresponds to the image data storage unit described in the claims.
  • the image data storage unit is provided in the memory 12 is illustrated as an example, but the image data storage unit may be provided in the storage 13.
  • the movement amount specifying unit F3 calculates the movement amount of the host vehicle from a certain time point to the present time based on signals input from the wheel speed sensor 4, the steering angle sensor 5, and the shift position sensor 6.
  • the movement amount of the host vehicle here includes a movement distance and a movement direction.
  • a known method may be used as a method for specifying the amount of change in the vehicle position from the vehicle information.
  • the movement amount specifying unit F3 sequentially specifies the current vehicle speed based on the signal input from the wheel speed sensor 4, and calculates the movement distance per unit time by integrating the vehicle speed. Further, the movement amount specifying unit F3 specifies the moving direction of the host vehicle based on the signal input from the shift position sensor 6 and the signal input from the steering angle sensor 5. The change amount of the vehicle position sequentially calculated by the movement amount specifying unit F3 is used by the history image management unit F4 and the like.
  • the movement amount specifying unit F3 determines whether the host vehicle is moving forward or backward. For example, the movement amount specifying unit F3 determines whether the host vehicle is moving forward or backward based on a signal input from the shift position sensor 6. Note that the method of determining whether the host vehicle is moving forward or backward (that is, the traveling direction) is not limited to this. For example, when a sensor for detecting the rotation direction of the tire is provided, whether the host vehicle is moving forward or backward may be determined from the rotation direction. Further, when a GPS receiver or the like is provided, it may be determined whether the vehicle is moving forward or backward based on a time change in the position of the host vehicle determined according to the reception result of the GPS receiver.
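  • A rough stand-in for this dead-reckoning step, combining wheel speed, steering angle and shift position into a movement distance and direction, might look as follows (simple bicycle model; the wheelbase, update interval and function name are illustrative assumptions, not values from the patent).

```python
import math

def update_pose(x: float, y: float, heading: float,
                wheel_speed_mps: float, steering_angle_rad: float,
                reverse: bool, dt: float = 0.05, wheelbase: float = 2.7):
    """One dead-reckoning update of the host vehicle pose in a ground-fixed frame.
    The shift position decides the sign of the travelled distance."""
    v = -wheel_speed_mps if reverse else wheel_speed_mps
    distance = v * dt                                        # movement distance this step
    heading += distance * math.tan(steering_angle_rad) / wheelbase
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```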
  • The history image management unit F4 manages the plurality of front overhead images and rear overhead images stored in the memory 12. Specifically, based on the movement amount specified by the movement amount specifying unit F3, it manages, for each partial overhead image, the relative position with respect to the host vehicle (the corresponding region) of the area represented by that partial overhead image. That is, the corresponding region of each partial overhead image is sequentially calculated and updated based on the movement amount of the vehicle from the time when the partial overhead image was generated to the present.
  • FIG. 3 is a diagram for explaining a change in a positional relationship between the corresponding region of the partial overhead view image and the own vehicle accompanying the movement of the own vehicle.
  • the corresponding area of the front overhead image If (T1) newly generated at time T1 is a front overhead area.
  • The diagram shown on the right side of FIG. 3 shows the positional relationship between the corresponding region of the front bird's-eye view image If(T1) and the host vehicle at the time when the host vehicle has advanced 0.3 m from time T1 (referred to as time T2).
  • The relative position of the front bird's-eye view area itself with respect to the host vehicle is constant. That is, the corresponding region of the front bird's-eye view image If(T2) generated at time T2 is the front bird's-eye view area.
  • On the other hand, as the host vehicle advances, the corresponding region of the front bird's-eye view image If(T1) becomes a region shifted 0.3 m toward the rear of the host vehicle relative to the front bird's-eye view area.
  • the history image management unit F4 sequentially updates the corresponding area of the partial overhead image for each partial overhead image based on the movement amount specified by the movement amount specifying unit F3.
  • the corresponding area of each partial bird's-eye view image may be represented by coordinates on the XY plane composed of the X axis and the Y axis included in the XYZ coordinate system described above.
  • the data indicating the corresponding area stored in association with the partial overhead image corresponds to an example of the position specifying data described in the claims.
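  • The bookkeeping described above can be pictured with the following sketch, in which every stored partial overhead image carries its corresponding region in the vehicle frame and that region is shifted opposite to the host vehicle's own movement after every update (the field names and layout are ours, not the patent's).

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class HistoryImage:
    overhead: np.ndarray   # the partial overhead image itself
    region_x: float        # vehicle-frame X of the region's reference corner (m)
    region_y: float        # vehicle-frame Y of the region's reference corner (m)

def update_corresponding_regions(history, dx: float, dy: float) -> None:
    """After the host vehicle has moved by (dx, dy) in its own frame, shift every
    stored corresponding region the opposite way so that each history image keeps
    pointing at the same patch of road."""
    for h in history:
        h.region_x -= dx
        h.region_y -= dy
```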
  • the partial overhead view image is stored as it is (that is, in full size).
  • the difference portion is, for example, a region that does not overlap with the front overhead image If (T2) in the front overhead image If (T1) in FIG.
  • Alternatively, information at the time when the partial overhead image was generated (for example, time information and position information) may be stored in association with the partial overhead image, and a mode may be employed in which the corresponding region is calculated later as necessary. This is because, if the time and position at the time of generating the partial overhead image are available, the corresponding region of the partial overhead image with respect to the current position of the host vehicle can be calculated backward. For example, if the time at which the partial overhead image was generated is available, its corresponding region with respect to the current position of the host vehicle can be calculated from the total movement amount from that time to the present. That is, the time and position information at the time when the partial overhead image was generated also correspond to an example of the position specifying data described in the claims.
  • the partial overhead images other than the latest partial overhead image are referred to as history images.
  • A front bird's-eye view image that is not the latest is referred to as a front history image, and a rear bird's-eye view image that is not the latest is referred to as a rear history image.
  • The interpolated overhead image generation unit F5 generates a partial overhead image representing the area between the front bird's-eye view area and the rear bird's-eye view area (referred to as an interpolated overhead image) based on the history images stored in the memory 12.
  • When the host vehicle is moving forward, the interpolated overhead image is generated based on the front history image, and when the host vehicle is moving backward, it is generated based on the rear history image. That is, the interpolated overhead image is generated based on the partial overhead image generated from the captured image of the in-vehicle camera on the traveling direction side of the host vehicle (referred to as the traveling direction camera).
  • The image composition unit F6 combines the latest front bird's-eye view image, the latest rear bird's-eye view image, and the interpolated bird's-eye view image to generate an image in which the host vehicle and a predetermined area around it are seen from above the host vehicle (referred to as an integrated overhead image).
  • the image composition unit F6 corresponds to the overhead image integration processing unit described in the claims.
  • the one read from the storage 13 is used as the bird's-eye view image of the host vehicle itself.
  • FIG. 5 shows an integrated bird's-eye view image generated when the host vehicle is traveling forward on a lane defined by white lines L1 and L2 as shown in FIG.
  • Symbol Af in FIG. 4 represents an example of the front bird's-eye view area, and symbol Ar represents an example of the rear bird's-eye view area.
  • The integrated overhead image is an image in which the latest front overhead image If, the latest rear overhead image Ir, and the interpolated overhead image Im are arranged at predetermined positions determined with reference to the host vehicle.
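  • The arrangement can be illustrated with a small compositing sketch: the latest front overhead image, the interpolated overhead image, and the latest rear overhead image are stacked at fixed positions and a stored vehicle icon is overlaid in the middle (equal widths and the purely vertical layout are simplifying assumptions on our part).

```python
import numpy as np

def compose_integrated_overhead(front: np.ndarray, interp: np.ndarray,
                                rear: np.ndarray, car_icon: np.ndarray) -> np.ndarray:
    """Stack the partial overhead images (front on top, rear at the bottom) and
    overlay the host-vehicle icon read from storage over the interpolated part."""
    canvas = np.vstack([front, interp, rear])
    top = front.shape[0] + (interp.shape[0] - car_icon.shape[0]) // 2
    left = (canvas.shape[1] - car_icon.shape[1]) // 2
    canvas[top:top + car_icon.shape[0], left:left + car_icon.shape[1]] = car_icon
    return canvas
```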
  • Boundary lines B1 and B2 representing the joints are displayed at the joint portions of the images in the integrated overhead image, but as another aspect the boundary lines B1 and B2 need not be displayed.
  • At the joint portion between the front overhead image If and the interpolated overhead image Im (which is generated from the front history image), the white line L1f and the white line L1m, and the white line L2f and the white line L2m, which are the images corresponding to the white lines L1 and L2, are continuous.
  • On the other hand, at the joint between the rear overhead image Ir and the interpolated overhead image Im, the in-vehicle camera that captured the image data from which each overhead image is derived differs, so a shift due to the difference in the characteristics of the in-vehicle cameras may occur. That is, the white line L1m and the white line L1r, and the white line L2m and the white line L2r, may each be discontinuous.
  • In the present embodiment, the interpolated overhead image generation unit F5 is provided and the integrated overhead image includes the interpolated overhead image, but the present invention is not limited to this; the interpolated overhead image generation unit F5 may be omitted.
  • the display processing unit F7 displays the integrated overhead image generated by the image composition unit F6 on the display 7.
  • the reference camera setting unit F8 sets one of the front camera 2 and the rear camera 3 as an in-vehicle camera (hereinafter referred to as a reference camera) that is used as a reference when determining a matching parameter to be described later.
  • the reference camera setting unit F8 determines a reference camera based on error data for each in-vehicle camera provided from the parameter acquisition unit F1.
  • the reliability calculation unit F81 included in the reference camera setting unit F8 is a functional unit that calculates the reliability for each in-vehicle camera.
  • the reliability calculation unit F81 corresponds to the reliability evaluation unit described in the claims.
  • the reference camera setting unit F8 employs, as the reference camera, the onboard camera having the higher reliability calculated by the reliability calculation unit F81 among the front camera 2 and the rear camera 3.
  • The reliability may be calculated using a predetermined function, a table, or the like that takes the pitch angle error and the mounting height error as variables. The reliability is assumed to be smaller as the pitch angle error and the mounting height error are larger.
  • a method of evaluating the mounting error of the in-vehicle camera by the method disclosed in Patent Document 1 and calculating the reliability of each in-vehicle camera is adopted, but is not limited thereto.
  • the mounting error may be evaluated by the method disclosed in Patent Document 2, and the reliability of each in-vehicle camera may be calculated.
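  • One way such a reliability function and the subsequent reference-camera choice could look is sketched below; the functional form and weights are assumptions, not values given by the patent.

```python
def reliability(pitch_error_rad: float, height_error_m: float,
                w_pitch: float = 1.0, w_height: float = 1.0) -> float:
    """Illustrative score that decreases as the mounting errors grow."""
    return 1.0 / (1.0 + w_pitch * abs(pitch_error_rad) + w_height * abs(height_error_m))

def select_reference_camera(errors: dict) -> str:
    """Pick the in-vehicle camera with the highest reliability, e.g.
    errors = {"front": (0.01, 0.02), "rear": (0.03, 0.05)}."""
    return max(errors, key=lambda name: reliability(*errors[name]))
```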
  • This reference camera setting unit F8 corresponds to the reference camera selection unit described in the claims.
  • the vehicle-mounted camera determined as the reference camera is also simply referred to as a reference camera, and the vehicle-mounted camera on the side not employed as the reference camera is also referred to as a matching target camera.
  • the matching target camera corresponds to the adjustment target camera described in the claims.
  • the common photographed object determination unit F9 determines whether or not the front camera 2 and the rear camera 3 have photographed the same object (referred to as a common photographed object). Details of the operation of the common photographed object determination unit F9 will be described later.
  • the common photographed object determination unit F9 corresponds to the determination unit described in the claims.
  • the front side relative position specifying unit G1 specifies the relative position of the common photographed object from the front bird's-eye image stored in the memory 12 including the latest front bird's-eye view image.
  • the rear-side relative position specifying unit G2 specifies the relative position of the common photographed object from the rear bird's-eye view image stored in the memory 12 including the latest rear bird's-eye view image.
  • When the front camera 2 is the reference camera, the front side relative position specifying unit G1 corresponds to the first relative position specifying unit described in the claims, and the rear side relative position specifying unit G2 corresponds to the second relative position specifying unit described in the claims.
  • Conversely, when the rear camera 3 is the reference camera, the front side relative position specifying unit G1 corresponds to the second relative position specifying unit described in the claims, and the rear side relative position specifying unit G2 corresponds to the first relative position specifying unit described in the claims.
  • The deviation degree calculation unit G3 calculates the degree of deviation between the relative position of the common photographed object specified by the front relative position specifying unit G1 and the relative position of the common photographed object specified by the rear relative position specifying unit G2. This deviation degree corresponds to the degree of deviation described in the claims, and the deviation degree calculation unit G3 corresponds to the deviation degree calculation unit described in the claims.
  • The parameter calculation unit G4 calculates, as the camera parameter of the matching target camera, a camera parameter (matching parameter) for which the deviation degree calculated by the deviation degree calculation unit G3 becomes 0, based on the relative positions specified from the partial overhead view images derived from the respective in-vehicle cameras.
  • the matching parameters calculated by the parameter calculation unit G4 correspond to the external parameters of the camera parameters.
  • the matching parameter may correspond to an internal parameter.
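  • As an illustration of how the deviation degree and the matching parameter could be computed, the sketch below measures the deviation as the mean distance between the relative positions of corresponding feature points of the common photographed object, and searches small corrections of the matching target camera's external parameters that minimise it (ideally to 0). It reuses the hypothetical CameraParams and pixel_to_ground helpers from the earlier sketch; a real implementation would use a proper non-linear least-squares solver, and the step sizes are illustrative.

```python
import numpy as np

def deviation_degree(ref_points: np.ndarray, adj_points: np.ndarray) -> float:
    """Mean distance, in the vehicle frame, between the relative positions of the
    same feature points as specified via the reference camera and via the matching
    target camera; both arrays have shape (N, 2)."""
    return float(np.mean(np.linalg.norm(ref_points - adj_points, axis=1)))

def find_matching_parameter(ref_points: np.ndarray, adj_pixels, cam: CameraParams):
    """Coarse grid search over small yaw / position corrections of the matching
    target camera; adj_pixels are the (u, v) pixel positions of the feature points
    in the matching target camera's image."""
    best_params, best_dev = cam, np.inf
    for dyaw in np.linspace(-0.05, 0.05, 21):          # rad
        for dx in np.linspace(-0.2, 0.2, 9):           # m
            for dy in np.linspace(-0.2, 0.2, 9):       # m
                trial = CameraParams(cam.position + np.array([dx, dy, 0.0]),
                                     cam.pitch, cam.roll, cam.yaw + dyaw,
                                     cam.fx, cam.fy, cam.cx, cam.cy)
                pts = np.array([pixel_to_ground(u, v, trial)[:2] for u, v in adj_pixels])
                dev = deviation_degree(ref_points, pts)
                if dev < best_dev:
                    best_params, best_dev = trial, dev
    return best_params, best_dev   # candidate matching parameter and residual deviation
```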
  • FIG. 6 is a flowchart corresponding to a parameter calculation process performed by the control unit 1 in a situation where the host vehicle is moving forward as an example. For example, it is assumed that the host vehicle travels forward on a lane defined by white lines L1 and L2 as shown in FIG.
  • the flowchart shown in FIG. 6 may be started when a predetermined start condition is satisfied.
  • As the predetermined start condition, for example, it is assumed that the process is started when the steering angle is 0 degrees and the vehicle speed is greater than 0 km/h and equal to or less than a predetermined speed, for example, 30 km/h.
  • the range set for the vehicle speed may be appropriately designed.
  • If, in the middle of processing, the vehicle speed becomes a value outside the above-mentioned range or the steering angle becomes a value other than 0 degrees, the processing may be interrupted, or it may instead be continued.
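  • Expressed as code, the start condition above amounts to something like the following check (threshold values taken from the example; the function name is ours).

```python
def start_condition_met(steering_angle_deg: float, vehicle_speed_kph: float,
                        max_speed_kph: float = 30.0) -> bool:
    """Start the parameter calculation process only while driving straight ahead
    at a moderate, non-zero speed."""
    return steering_angle_deg == 0.0 and 0.0 < vehicle_speed_kph <= max_speed_kph
```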
  • step S1 the parameter acquisition unit F1 reads camera parameters for each in-vehicle camera from the storage 13. Further, the error correction unit F11 calculates an attachment error for each on-vehicle camera, and corrects the camera parameter read from the storage 13.
  • Then the process proceeds to step S2. At this time, error data representing the mounting error calculated by the error correction unit F11 is provided to the reference camera setting unit F8.
  • step S2 the reference camera setting unit F8 calculates the reliability for each in-vehicle camera based on the error data calculated in step S1. Then, the in-vehicle camera having the higher reliability is adopted as the reference camera, and the process proceeds to step S3.
  • the front camera 2 is adopted as the reference camera.
  • step S3 the overhead view conversion processing unit F2 generates a front overhead image and a rear overhead image using the image data input from each of the front camera 2 and the rear camera 3 and the camera parameters acquired in step S1, and then the process proceeds to step S4.
  • step S4 the movement amount specifying unit F3 determines whether or not the movement of the host vehicle (forward movement here) is continued. If the vehicle speed is greater than 0, it is determined that the movement is continuing (step S4 YES), and the process proceeds to step S5. On the other hand, when the vehicle speed is 0, it is determined that the movement is interrupted (NO in step S4), and this flow is finished.
  • step S5 the overhead conversion processing unit F2 generates a new front overhead image and a rear overhead image based on the image data newly input from each of the front camera 2 and the rear camera 3, and proceeds to step S6. Note that the front bird's-eye view image that is no longer the latest front bird's-eye view image is held in the memory 12 as a front history image.
  • step S6 the movement amount specifying unit F3 specifies the movement amount since the previous step S4 was performed, and proceeds to step S7.
  • step S7 the history image management unit F4 updates the corresponding area of the forward history image based on the movement amount specified by the movement amount specifying unit F3 in step S6, and proceeds to step S8.
  • step S8 the common photographed object determination unit F9 determines whether an object photographed by the front camera 2 is photographed by the rear camera 3 or not. If the common object determination unit F9 determines that the object photographed by the front camera 2 is also photographed by the rear camera 3, step S8 is YES and the process proceeds to step S9. On the other hand, if the common photographed object determination unit F9 determines that the object photographed by the front camera 2 is not photographed by the rear camera 3, step S8 is NO and the process returns to step S4.
  • The processes of steps S4 to S8 are repeated until the common photographed object determination unit F9 determines that an object photographed by the front camera 2 has also been photographed by the rear camera 3.
  • The common photographed object determination unit F9 determines whether or not the front overhead image includes an image of an object having a predetermined feature amount.
  • The object having a predetermined feature amount here is an object that can be distinguished from other objects by known edge extraction or contour extraction, such as a white line defining a lane, a road marking, or a block provided along the road.
  • An object having a feature amount included in the front bird's-eye view image functions as a common photographed object.
  • the road marking is a line, symbol, or character drawn on the road in order to display regulations or instructions regarding road traffic.
  • In that case, the common photographed object determination unit F9 employs the front bird's-eye view image as an image (referred to as a comparison image) for determining whether or not the object photographed by the front camera 2 has also been photographed by the rear camera 3.
  • FIG. 7 is a diagram showing the positional relationship between the corresponding area of the latest forward-looking image If (T) at a certain time T and the host vehicle.
  • the front bird's-eye view image If (T) is an image obtained by photographing white lines L1 and L2 as objects having a predetermined feature amount. That is, the common photographed object determination unit F9 employs the front overhead image If (T) as a comparison image.
  • Symbol L1f in FIG. 7 indicates the image corresponding to the actual white line L1, and symbol L2f indicates the image corresponding to the actual white line L2.
  • the rear bird's-eye view image Ir (T) represents the rear bird's-eye view image generated at time T.
  • As the host vehicle moves forward, the corresponding region of the front overhead image If(T) approaches the rear bird's-eye view area. Further, as the host vehicle moves forward, the white lines L1 and L2 come to be included in the rear photographing range.
  • FIG. 8 shows the positional relationship between the corresponding area of the front bird's-eye view image If (T) and the rear bird's-eye view area at the time when the white lines L1 and L2 are included in the rear photographing range (time T + n). . Since the white lines L1 and L2 exist in the rear photographing range, as shown in FIG. 8, the latest rear overhead image Ir (T + n) at time T + n includes images corresponding to the white lines L1 and L2. A symbol L1r in FIG. 8 indicates an image corresponding to the actual white line L1, and a symbol L2r indicates an image corresponding to the actual white line L2.
  • The common photographed object determination unit F9 compares the sequentially generated rear bird's-eye view images with the front bird's-eye view image If(T) and determines whether or not the same photographed object (here, the white lines L1 and L2) is included.
  • When it is included, the common photographed object determination unit F9 determines that the object photographed by the front camera 2 has also been photographed by the rear camera 3.
  • Whether or not the same object is captured may be determined by comparing the two images with the aid of known methods such as edge detection processing, contour extraction processing, or pixel analysis processing.
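  • As a toy version of this comparison, the sketch below slides the stored comparison image over the latest rear overhead image and reports whether any position exceeds a normalised cross-correlation threshold; real systems would match edge or contour features with a library matcher rather than this brute-force loop, and the threshold value is an assumption.

```python
import numpy as np

def contains_common_object(comparison_img: np.ndarray, rear_img: np.ndarray,
                           threshold: float = 0.8) -> bool:
    """Return True if the (grayscale) comparison patch appears somewhere in the
    rear overhead image, judged by normalised cross-correlation."""
    ph, pw = comparison_img.shape
    patch = comparison_img - comparison_img.mean()
    pnorm = np.linalg.norm(patch)
    if pnorm == 0:
        return False
    best = 0.0
    for i in range(rear_img.shape[0] - ph + 1):
        for j in range(rear_img.shape[1] - pw + 1):
            win = rear_img[i:i + ph, j:j + pw] - rear_img[i:i + ph, j:j + pw].mean()
            wnorm = np.linalg.norm(win)
            if wnorm > 0:
                best = max(best, float(np.sum(patch * win) / (pnorm * wnorm)))
    return best >= threshold
```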
  • In step S9, the front relative position specifying unit G1 specifies the relative positions of the white lines L1 and L2 with respect to the host vehicle (referred to as the front side relative positions) from the corresponding region of the front overhead image If(T) at time T and the positions of the white lines L1f and L2f in the front overhead image If(T).
  • Further, the rear relative position specifying unit G2 specifies the relative positions of the white lines L1 and L2 with respect to the host vehicle (the rear side relative positions) from the positions of the white lines L1r and L2r in the rear overhead image Ir(T+n).
  • When the processing in step S9 is completed, the process proceeds to step S10.
  • step S10 the divergence degree calculation unit G3 calculates the degree of divergence between the front relative position and the rear relative position, and proceeds to step S11.
  • step S11 the parameter calculation unit G4 calculates a matching parameter for the rear camera 3 in which the deviation degree calculated by the deviation degree calculation unit G3 is zero.
  • This matching parameter is a parameter of the rear camera 3 calculated so that the rear relative position coincides with the front relative position.
  • By generating the rear overhead image using this matching parameter, an integrated overhead view image can be displayed in which the shift of the display position of the subject (the white lines L1 and L2) between the partial bird's-eye view images is suppressed.
  • In the above, an example has been described in which the front camera 2 is used as the reference camera and the matching parameter for the rear camera 3 is calculated.
  • When the reference camera is the rear camera 3, the parameter calculation unit G4 calculates a matching parameter for the front camera 2.
  • In addition, the operation of the control unit 1 when the host vehicle is moving forward has been exemplified, but the same processing may be performed when the host vehicle is moving backward.
  • In the configuration described above, the deviation degree calculation unit G3 calculates the degree of deviation between the relative positions of the common photographed object determined from the captured images of the respective in-vehicle cameras. Further, the parameter calculation unit G4 calculates, as the camera parameter of the matching target camera, a camera parameter (that is, a matching parameter) for which the deviation degree becomes 0 (or is reduced). Then, the overhead view conversion processing unit F2 generates a partial overhead image from the image captured by the matching target camera using the calculated matching parameter, and the image composition unit F6 generates an integrated overhead view image using the partial overhead view image generated with the matching parameter.
  • the parameter calculation unit G4 calculates a matching parameter for the rear camera 3.
  • The positional relationship between an object and the vehicle represented by the rear overhead image generated using the matching parameter matches the positional relationship between the object and the vehicle represented by the front overhead image.
  • the in-vehicle camera with higher reliability is the in-vehicle camera with the smaller mounting error.
  • That is, an in-vehicle camera with a relatively small mounting error is used as the reference to generate the matching parameter for the other in-vehicle camera.
  • Therefore, in an integrated overhead image that includes a partial overhead image generated using the matching parameter, it is possible to suppress the relative position of objects around the vehicle with respect to the host vehicle from deviating from the actual relative position.
  • the configuration of the present embodiment it is possible to display a partial overhead image and an integrated overhead image that more appropriately represent the positional relationship between the host vehicle and an object existing around the host vehicle.
  • the reliability is calculated to be higher as the mounting error is smaller.
  • The reliability may also be calculated in consideration of the performance of the in-vehicle camera itself (for example, the resolution and the number of pixels) or the number of years since its installation. For example, the reliability may be calculated higher as the number of years since installation is shorter, or higher as the performance is better.
  • Normally, camera parameters cannot be adjusted by comparison between the front camera 2 and the rear camera 3, whose shooting ranges do not share a common area.
  • According to the present embodiment, however, the camera parameters of the front camera 2 or the rear camera 3 can be adjusted by using the history images stored in the memory 12.
  • ⁇ Modification 1> In the above-described embodiment, the aspect in which the relative position is specified from the position of the object in the captured partial overhead view image is illustrated, but the present invention is not limited thereto.
  • the relative position of the object may be specified based on the position of the object in the image before being converted into the partial overhead image or in the image before the distortion correction processing (that is, the captured image).
  • the image managed by the history image management unit F4 may be an image input from the in-vehicle camera instead of the partial overhead view image.
  • the reference camera setting unit F8 may use the traveling direction side camera as a reference camera. That is, the front camera 2 may be the reference camera when the host vehicle is moving forward, and the rear camera 3 may be the reference camera when the host vehicle is moving backward.
  • the surrounding situation on the traveling direction side is more important than the surrounding situation on the leaving side. Therefore, the user is more likely to gaze at the partial overhead view image corresponding to the shooting range of the traveling direction side camera in the integrated overhead view image.
  • Since the vehicle moves away from obstacles on the side opposite to the traveling direction, that is, the side it is leaving, the usefulness of information about the surrounding situation on that side is relatively low.
  • In other words, the positional relationship between an obstacle on the leaving side and the own vehicle is not required as strictly as the positional relationship between the own vehicle and an obstacle existing on the traveling direction side.
  • Note that an object that serves as a mark of the positional relationship between the vehicle and the external environment, such as an index object or a white line, may be displayed across a plurality of partial bird's-eye view images.
  • Some driving support systems include an object detection device that detects a relative position of an object existing in a detection range by transmitting a search wave to a predetermined detection range.
  • When the driving support system 100 includes such an object detection device, the reliability of each in-vehicle camera may be evaluated by comparing, for the same object, the relative position detected by the object detection device with the relative position specified from the captured image of the in-vehicle camera, and the reference camera may be determined based on the evaluation result.
  • the driving support system 100 includes a front object detection device that includes at least a part of the front shooting range in the detection range, and a rear object detection device that includes at least a part of the rear shooting range in the detection range.
  • As the front object detection device and the rear object detection device, for example, a laser radar, a sonar, a millimeter wave radar, or the like can be adopted.
  • The control unit 1 in this modification includes, in addition to the various functional blocks provided in the control unit 1 of the above-described embodiment, an image analysis unit G5 that analyzes the images captured by the in-vehicle cameras.
  • the image analysis unit G5 includes a front image analysis unit G51 that analyzes an image captured by the front camera 2 as a finer functional unit, and a rear image analysis unit G52 that analyzes an image captured by the rear camera 3.
  • the front image analysis unit G51 performs edge extraction, contour extraction, Gaussian processing, noise removal processing, pattern matching processing, and the like on the image captured by the front camera 2 to detect a predetermined detection target.
  • When a detection target is detected, its relative position with respect to the host vehicle is specified from the position and size of the detection target in the image.
  • the detection target is preferably an object that can also be detected by transmission and reception of exploration waves, such as blocks, curbs, road signs, and the like provided on the road surface. Moreover, it is preferable that it is a stationary object.
  • The rear image analysis unit G52 also performs the same processing as the front image analysis unit G51 on the image captured by the rear camera 3, detects a predetermined detection target, and specifies the relative position of the detected target with respect to the host vehicle.
  • the reference camera setting unit F8 calculates an error between the relative position specified by the front image analysis unit G51 and the relative position detected by the front object detection device with respect to a predetermined object existing in front of the host vehicle. Further, an error between the relative position specified by the rear image analysis unit G52 and the relative position detected by the rear object detection device is calculated for a predetermined object existing behind the host vehicle.
  • the relative position detected by the object detection device is higher in accuracy than the relative position specified by image analysis from an image taken by one (that is, monocular) vehicle-mounted camera. Therefore, it can be considered that the in-vehicle camera having the larger deviation from the relative position detected by the object detection device has lower reliability. In other words, the in-vehicle camera with the smaller deviation from the relative position detected by the object detection device can be evaluated with higher reliability. That is, an in-vehicle camera having a smaller deviation from the relative position detected by the object detection device may be adopted as the reference camera.
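  • A minimal sketch of that reliability evaluation, assuming the image-derived and detector-derived relative positions of the same objects are available as (N, 2) arrays in the vehicle frame, might be:

```python
import numpy as np

def camera_error_vs_detector(camera_positions: np.ndarray,
                             detector_positions: np.ndarray) -> float:
    """Mean distance between the relative positions specified by image analysis
    and those detected by the object detection device (sonar, radar, ...)."""
    return float(np.mean(np.linalg.norm(camera_positions - detector_positions, axis=1)))

# The camera whose image-derived positions deviate least from the detector's
# measurements would be evaluated as more reliable and chosen as the reference,
# e.g. (cam_pos / det_pos are hypothetical per-camera arrays):
# reference = min(("front", "rear"),
#                 key=lambda c: camera_error_vs_detector(cam_pos[c], det_pos[c]))
```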
  • <Modification 4> The data used as material for determining whether or not the front camera 2 and the rear camera 3 have photographed a common object need not be handled in units of a captured image or of an overhead image generated from it; instead, it may be handled in units of a detection target, included in the image, that has a predetermined feature amount.
  • The control unit 1 in Modification 4 will be described taking as an example the case where the host vehicle is moving forward.
  • The control unit 1 in Modification 4 also includes the image analysis unit G5 described in Modification 3.
  • The front image analysis unit G51 extracts a predetermined detection target from the image captured by the front camera 2, specifies its relative position, and stores information on the detection target, including that relative position, in the memory 12.
  • The relative position of the detection target may be expressed by coordinates on the XY plane described above.
  • The information about the detection target other than the relative position corresponds, for example, to the outline or color of the detection target. Note that the relative position of the detection target may be sequentially updated by the front-side relative position specifying unit G1 based on the movement amount specified by the movement amount specifying unit F3.
  • Detection targets detected in common across a plurality of frames may be handled as the same object.
  • Whether the object included in the image at one time and the object included in the image at the next time are the same may be determined with the aid of a known method (one simple possibility is sketched below).
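One such known method is a nearest-neighbour association in which the stored relative positions are first dead-reckoned with the vehicle movement amount and then matched to the new detections. The sketch below illustrates that idea; the function name, the motion representation (dx, dy, dtheta), and the gate distance are hypothetical choices introduced here for illustration.

```python
import numpy as np

def associate_targets(prev_positions, curr_positions, motion, gate=0.5):
    """Pair detection targets between consecutive frames.

    prev_positions : list of (x, y) relative positions at the previous time
    curr_positions : list of (x, y) relative positions at the current time
    motion         : (dx, dy, dtheta) vehicle movement between the frames
    gate           : maximum accepted distance [m] for a match

    Returns a list of (prev_index, curr_index) pairs judged to be the same
    stationary object.
    """
    dx, dy, dtheta = motion
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    pairs = []
    for i, (px, py) in enumerate(prev_positions):
        # Predict where a stationary target should now appear, given that
        # the vehicle translated by (dx, dy) and rotated by dtheta.
        qx, qy = px - dx, py - dy
        predicted = np.array([c * qx - s * qy, s * qx + c * qy])
        distances = [np.linalg.norm(predicted - np.asarray(p))
                     for p in curr_positions]
        if distances and min(distances) < gate:
            pairs.append((i, int(np.argmin(distances))))
    return pairs
```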
  • The rear image analysis unit G52 performs edge extraction, contour extraction, Gaussian processing, noise removal processing, and the like on the image captured by the rear camera 3; an image that has undergone these processes is referred to as a processed image. The common photographed object determination unit F9 then determines, using pattern matching processing or the like, whether the processed image generated by the rear image analysis unit G52 includes the object detected by the front image analysis unit G51 (a template-matching sketch is given below).
  • The data indicating the relative position of the detection target may be expressed in more detail as relative positions of each of a plurality of feature points constituting the detection target.
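For the pattern-matching determination by the common photographed object determination unit F9, one conceivable realization is normalized template matching with OpenCV, sketched below under the assumption that both the target patch from the front camera image and the processed rear image are single-channel 8-bit images of roughly comparable scale and orientation; the function name and the threshold value are illustrative only.

```python
import cv2

def contains_common_object(target_patch, processed_rear_image, threshold=0.8):
    """Return (found, location): whether the processed rear-camera image
    contains the detection target extracted from the front-camera image,
    and where the best match was found."""
    # Normalized cross-correlation between the patch and the processed image.
    scores = cv2.matchTemplate(processed_rear_image, target_patch,
                               cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_location = cv2.minMaxLoc(scores)
    return best_score >= threshold, best_location
```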
  • In the configuration described above, the front camera 2 and the rear camera 3 are provided, one of which serves as the reference camera and the other as the matching target camera.
  • However, the driving support system 100 may also include a right side camera 8 and a left side camera 9, as shown in FIG.
  • In that case, the matching parameter may be calculated using either the front camera 2 or the rear camera 3 as the reference camera and the other in-vehicle cameras, in turn, as matching target cameras.
  • For example, the common photographed object determination unit F9 may determine the presence or absence of a common photographed object by comparing the captured image of the front camera 2 with the captured image of the right side camera 8, whose shooting ranges include an overlapping area.
  • The matching parameter for the right side camera 8 may then be calculated so that the relative position of the common photographed object specified from the captured image of the right side camera 8 matches the relative position of the same object specified from the captured image of the front camera 2 (a least-squares sketch is given below).
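One possible way to compute such a matching parameter is a least-squares rigid fit (a 2D Procrustes/Kabsch solution) between the feature-point positions of the common photographed object as estimated from the two cameras. The sketch below assumes at least two corresponding points and uses names introduced purely for illustration.

```python
import numpy as np

def fit_matching_parameter(target_points, reference_points):
    """Estimate a 2D rotation R and translation t (the matching parameter)
    that best map points estimated from the matching-target camera onto the
    corresponding points estimated from the reference camera.

    target_points, reference_points : (N, 2) arrays, N >= 2, paired by index.
    Apply the result as: aligned = R @ p + t.
    """
    P = np.asarray(target_points, dtype=float)
    Q = np.asarray(reference_points, dtype=float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if one was produced.
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```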
  • When the front camera 2 is used as the reference camera, the matching parameters may be calculated for the rear camera 3, the right side camera 8, and the left side camera 9, respectively, and these matching parameters may be applied to the overhead view conversion processing performed by the overhead view conversion processing unit F2.
  • In this way, an integrated bird's-eye view image in which the shift at the joints between the partial images is suppressed can be displayed (a conceptual sketch of how the parameter could be applied is given below).
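Conceptually, applying a matching parameter during overhead view conversion could amount to warping a camera's partial bird's-eye image with the estimated rotation and translation before the partial images are composited. The sketch below assumes, purely for illustration, that the bird's-eye pixel axes are aligned with the vehicle coordinate axes and that one pixel corresponds to a fixed number of metres; a real implementation would also have to account for axis orientation and the position of the vehicle origin in the image.

```python
import cv2
import numpy as np

def apply_matching_parameter(partial_birds_eye, R, t, metres_per_pixel,
                             canvas_size):
    """Warp one camera's partial bird's-eye image by its matching parameter
    so that it lines up with the reference camera before composition.

    R                : 2x2 rotation from the matching parameter.
    t                : (2,) translation [m] from the matching parameter.
    metres_per_pixel : ground distance represented by one bird's-eye pixel.
    canvas_size      : (width, height) of the integrated bird's-eye image.
    """
    # Convert the metric transform into a 2x3 pixel-space affine matrix.
    M = np.hstack([R, (t / metres_per_pixel).reshape(2, 1)]).astype(np.float32)
    return cv2.warpAffine(partial_birds_eye, M, canvas_size)
```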
  • The left side of FIG. 12 is a conceptual diagram of an integrated overhead image generated without applying the matching parameter, and the right side shows the integrated overhead image generated by applying the matching parameter.
  • 1 control unit (camera parameter adjustment device), 2 front camera, 3 rear camera, 4 wheel speed sensor, 5 steering angle sensor, 6 shift position sensor, 7 display, F1 parameter acquisition unit, F11 error correction unit (error calculation unit), F2 overhead view conversion processing unit, F3 movement amount specifying unit, F4 history image management unit, F5 interpolated overhead image generation unit, F6 image composition unit (overhead image integration processing unit), F7 display processing unit, F8 reference camera setting unit (reference camera selection unit), F81 reliability calculation unit (reliability evaluation unit), F9 common photographed object determination unit (determination unit), G1 front-side relative position specifying unit, G2 rear-side relative position specifying unit, G3 deviation degree calculation unit, G4 parameter calculation unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
PCT/JP2016/053653 2015-02-10 2016-02-08 カメラパラメータ調整装置 WO2016129552A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112016000689.6T DE112016000689T5 (de) 2015-02-10 2016-02-08 Kameraparametereinstellvorrichtung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-024544 2015-02-10
JP2015024544A JP6471522B2 (ja) 2015-02-10 2015-02-10 カメラパラメータ調整装置

Publications (1)

Publication Number Publication Date
WO2016129552A1 true WO2016129552A1 (ja) 2016-08-18

Family

ID=56615543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/053653 WO2016129552A1 (ja) 2015-02-10 2016-02-08 カメラパラメータ調整装置

Country Status (3)

Country Link
JP (1) JP6471522B2 (de)
DE (1) DE112016000689T5 (de)
WO (1) WO2016129552A1 (de)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111566707A (zh) * 2018-01-11 2020-08-21 株式会社电装 设置位置信息提供装置以及设置位置信息提供方法
CN111693254A (zh) * 2019-03-12 2020-09-22 纬创资通股份有限公司 车载镜头偏移检测方法与车载镜头偏移检测系统

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021144257A (ja) 2018-06-06 2021-09-24 ソニーグループ株式会社 情報処理装置、情報処理方法、プログラム、及び、移動体
JP7153889B2 (ja) * 2019-03-01 2022-10-17 パナソニックIpマネジメント株式会社 画像補正装置、画像生成装置、カメラシステム及び車両
KR102611759B1 (ko) * 2019-06-20 2023-12-11 현대모비스 주식회사 차량의 어라운드 뷰 영상의 보정 장치 및 그 제어 방법
CN110719406B (zh) * 2019-10-15 2022-06-14 腾讯科技(深圳)有限公司 拍摄处理方法、拍摄设备及计算机设备
KR102387684B1 (ko) * 2020-08-28 2022-04-20 사이텍 주식회사 자율주행 차량의 카메라 검교정 장치

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006277738A (ja) * 2005-03-03 2006-10-12 Nissan Motor Co Ltd 車載画像処理装置及び車両用画像処理方法
JP2010244326A (ja) * 2009-04-07 2010-10-28 Alpine Electronics Inc 車載周辺画像表示装置
JP2014048803A (ja) * 2012-08-30 2014-03-17 Denso Corp 画像処理装置、及びプログラム

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111566707A (zh) * 2018-01-11 2020-08-21 株式会社电装 设置位置信息提供装置以及设置位置信息提供方法
CN111566707B (zh) * 2018-01-11 2023-06-16 株式会社电装 设置位置信息提供装置以及设置位置信息提供方法
CN111693254A (zh) * 2019-03-12 2020-09-22 纬创资通股份有限公司 车载镜头偏移检测方法与车载镜头偏移检测系统

Also Published As

Publication number Publication date
JP6471522B2 (ja) 2019-02-20
JP2016149613A (ja) 2016-08-18
DE112016000689T5 (de) 2017-10-19

Similar Documents

Publication Publication Date Title
JP6471522B2 (ja) カメラパラメータ調整装置
US9467679B2 (en) Vehicle periphery monitoring device
JP4530060B2 (ja) 駐車支援装置及び方法
US8880344B2 (en) Method for displaying images on a display device and driver assistance system
JP5212748B2 (ja) 駐車支援装置
WO2016002163A1 (ja) 画像表示装置、画像表示方法
JP5953824B2 (ja) 車両用後方視界支援装置及び車両用後方視界支援方法
WO2019008764A1 (ja) 駐車支援方法及び駐車支援装置
US20130010119A1 (en) Parking assistance apparatus, parking assistance system, and parking assistance camera unit
JP6586849B2 (ja) 情報表示装置及び情報表示方法
JP5516998B2 (ja) 画像生成装置
US20140043466A1 (en) Environment image display apparatus for transport machine
WO2010070920A1 (ja) 車両周囲画像生成装置
JP2009085651A (ja) 画像処理システム
WO2020012879A1 (ja) ヘッドアップディスプレイ
JP7426174B2 (ja) 車両周囲画像表示システム及び車両周囲画像表示方法
WO2015122124A1 (ja) 車両周辺画像表示装置、車両周辺画像表示方法
US20190100141A1 (en) Ascertainment of Vehicle Environment Data
JPWO2019142660A1 (ja) 画像処理装置および画像処理方法、並びにプログラム
JP7000383B2 (ja) 画像処理装置および画像処理方法
CN110053625B (zh) 距离计算装置和车辆控制装置
KR20170057684A (ko) 전방 카메라를 이용한 주차 지원 방법
JP7316620B2 (ja) 画像正規化のためのシステムと方法
US20200231099A1 (en) Image processing apparatus
US20220222947A1 (en) Method for generating an image of vehicle surroundings, and apparatus for generating an image of vehicle surroundings

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16749196

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112016000689

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16749196

Country of ref document: EP

Kind code of ref document: A1