WO2014034064A1 - Image processing device and storage medium - Google Patents
Image processing device and storage medium
- Publication number
- WO2014034064A1 (PCT application PCT/JP2013/004988)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/402—Image calibration
Definitions
- the present disclosure relates to an image processing apparatus that performs calibration related to installation of a camera used in a vehicle, and a storage medium that includes instructions for performing calibration.
- the following technology is known for performing calibration related to the installation of a camera mounted on a car. A feature point representing the same point is extracted from two image frames photographed by the camera, the amount of movement of that point in the world coordinate system is compared with the amount of movement of the own vehicle calculated from its vehicle speed, and the posture of the camera is calculated so as to minimize the error (for example, Patent Document 1).
- the above-described technology has problems in two respects.
- the first problem is that the technique described above does not take the movement of the vehicle in the yaw direction into account, so the measurer must drive the vehicle accurately straight ahead, which places a burden on the measurer.
- the second problem is that the place where the calibration is executed must allow a sufficient straight-line driving distance, which limits where the calibration can be performed.
- the present disclosure has been made in view of the above points, and an object of the present disclosure is to provide an image processing apparatus and a storage medium that can calculate the value of a posture parameter of a camera without moving the vehicle straight.
- the image processing apparatus includes a captured image input unit, a calibration target specifying unit, a vehicle information input unit, a calculation unit, and a storage control unit.
- the photographed image input unit inputs a plurality of images photographed by a camera mounted on the vehicle.
- the calibration target identifying unit identifies the same calibration target included in common in the first captured image and the second captured image, which are input by the captured image input unit and were captured at at least two points along the vehicle's travel.
- the vehicle information input unit inputs information that can specify a travel distance and a direction change of the vehicle.
- the calculating unit converts the first captured image and the second captured image including the calibration target into a first bird's-eye image and a second bird's-eye image, respectively, arranges the converted bird's-eye images on a common coordinate system while reflecting the shooting position and shooting direction of each captured image, as specified from the information input by the vehicle information input unit, and calculates the value of at least one posture parameter of the camera so that the position difference and rotation difference of the calibration target between the first bird's-eye image and the second bird's-eye image are reduced.
- the storage control unit causes the storage unit to store the value of the posture parameter calculated by the calculation unit.
- even if the vehicle turns during the measurement, the value of the camera posture parameter is calculated with that turning reflected. That is, the measurement is facilitated because the vehicle may be turned at the time of measurement and need not be moved straight.
- a computer-readable persistent and tangible storage medium includes instructions to be executed by a computer for: inputting a plurality of images taken by a camera mounted on a vehicle; identifying the same calibration target included in common in images captured and input at at least two points along the vehicle's travel; inputting information that can specify the travel distance and direction change of the vehicle; converting the first captured image and the second captured image including the calibration target into a first bird's-eye image and a second bird's-eye image, respectively; arranging the bird's-eye images on a common coordinate system while reflecting the shooting position and shooting direction of each image as specified from that information; calculating the value of at least one posture parameter of the camera so that the position difference and rotation difference of the calibration target between the first bird's-eye image and the second bird's-eye image are reduced; and storing the calculated posture parameter value.
- even if the vehicle turns during the measurement, the value of the camera posture parameter is calculated with that turning reflected. That is, the measurement is facilitated because the vehicle may be turned at the time of measurement and need not be moved straight.
- the image processing apparatus includes a captured image input unit, a calibration target specifying unit, a vehicle information input unit, a positional relationship acquisition unit, a calculation unit, and a storage control unit.
- the captured image input unit inputs, to the image processing apparatus, a first captured image captured by a camera mounted on the vehicle and a second captured image captured after the first captured image, once the vehicle has traveled.
- the calibration target specifying unit identifies the first calibration target included in the first captured image input from the captured image input unit, and the second calibration target that is included in the second captured image and has the same shape as the first calibration target.
- the vehicle information input unit inputs information that can specify the travel distance and direction change of the vehicle between the point where the first captured image is captured and the point where the second captured image is captured.
- the positional relationship acquisition unit acquires a relative positional relationship between the first calibration target and the second calibration target.
- the calculation unit converts the first captured image and the second captured image into a first bird's-eye image and a second bird's-eye image, respectively, and arranges the converted bird's-eye images on a common coordinate system while reflecting the shooting position and shooting direction of each captured image, as specified from the information input by the vehicle information input unit.
- based on the relative positional relationship between the first calibration target and the second calibration target acquired by the positional relationship acquisition unit, the calculation unit overlaps the first calibration target and the second calibration target in the common coordinate system, and calculates the value of at least one posture parameter of the camera so that the positional difference and rotational difference between the first calibration target and the second calibration target are reduced.
- the storage control unit stores the value of the posture parameter calculated by the calculation unit in the storage unit.
- even if the vehicle turns during the measurement, the value of the camera posture parameter is calculated with that turning reflected. That is, the measurement is facilitated because the vehicle may be turned at the time of measurement and need not be moved straight.
- FIG. 1 is a block diagram illustrating a schematic configuration of an image processing system according to an embodiment.
- FIG. 2A is an explanatory diagram showing the camera mounting position on the vehicle
- FIG. 2B is an explanatory diagram for explaining the coordinates of the camera mounting position and the direction of the optical axis of the camera.
- FIG. 3 is a flowchart for explaining the posture parameter determination process.
- FIG. 4A is a bird's-eye view image of the default calibration target photographed
- FIG. 4B is an ideal default calibration target.
- FIG. 5A is a bird's-eye image of the pseudo calibration target at the first time point
- FIG. 5B is a bird's-eye image of the pseudo calibration target at the second time point
- FIG. 5C is an image in which the bird's-eye image of the pseudo calibration target at the first time point and the bird's-eye image of the pseudo calibration target at the second time point are arranged on the world coordinate system.
- the image processing system (IMAG PROC SYS) 5 includes an in-vehicle camera (VH CAMERA) 11, an image processing device (IMAG PROC) 21, and a display device (DISPLAY) 31.
- VH CAMERA in-vehicle camera
- IMAG PROC image processing device
- DISPLAY display device
- the in-vehicle camera 11 has an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor, is mounted on the vehicle, images the periphery of the vehicle, and outputs the captured images to the image processing apparatus 21 at a predetermined frequency (for example, 60 frames per second).
- CMOS complementary metal oxide semiconductor
- as shown in FIG. 2A, the in-vehicle camera 11 is installed at the upper part of the rear end of the vehicle 7 so that the periphery behind the vehicle can be photographed.
- a camera for photographing the front periphery of the vehicle and a camera for photographing the side periphery may be added or replaced.
- the origin of the world coordinate system in the present embodiment is based on the position of the in-vehicle camera 11 at the time the earlier of the two captured images selected in S150 of the posture parameter determination process (described below) was taken. That is, the intersection of the ground surface and a perpendicular dropped from the center position R (camera viewpoint R) of the image sensor (not shown) of the in-vehicle camera 11 at that time is taken as the origin Q; the direction toward the rear of the vehicle is the y direction, the upward direction is the z direction, and the rightward direction when the rear of the vehicle is viewed from the vehicle center is the x direction.
- attitude parameters of the in-vehicle camera 11 will be described with reference to FIG.
- a camera coordinate system (x1, y1, z1) having the above-described camera viewpoint R as its origin is set. For this camera coordinate system, the rotation angle around the x axis of the world coordinate system is Ax, the rotation angle around the y axis is Ay (not shown), and the rotation angle around the z axis is Az (not shown).
- Ax corresponds to the pitch angle
- Ay corresponds to the roll angle
- Az corresponds to the yaw angle.
- the posture parameters of the in-vehicle camera 11 consist of the coordinates of the origin R of the camera coordinate system with respect to the world coordinate system ((0, 0, H) at the start of measurement), the pitch angle Ax, the roll angle Ay, and the yaw angle Az.
- At least one of the position in the x direction and the position in the y direction of the origin R of the camera coordinate system with respect to the world coordinate system is also referred to as a horizontal mounting position of the camera 11.
- the z-direction position of the origin R of the camera coordinate system with respect to the world coordinate system is also referred to as a vertical attachment position.
- the pitch angle Ax of the attitude parameter is also referred to as an attachment pitch angle
- the roll angle Ay is also referred to as an attachment roll angle
- the yaw angle Az is also referred to as an attachment yaw angle.
- the mounting yaw angle indicates the horizontal swing angle of the in-vehicle camera 11, and the mounting pitch angle indicates the vertical swing angle. The values (coordinates and angles) established by past measurements are used in the processing described below, which calibrates them.
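As a minimal sketch (not part of the patent; the function names and the rotation-composition order are assumptions, since the patent does not state a convention), the six posture parameters above can be represented as a rotation plus a camera position:

```python
import numpy as np

def rotation_from_angles(ax, ay, az):
    """Rotation of the camera coordinate system relative to world
    coordinates, composed from rotations about the world x axis
    (pitch Ax), y axis (roll Ay), and z axis (yaw Az).
    The composition order Rz @ Ry @ Rx is one plausible convention."""
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return rz @ ry @ rx

def camera_pose(x, y, h, ax, ay, az):
    """Posture parameters: camera viewpoint R at (x, y, h) in world
    coordinates ((0, 0, H) at the start of measurement) plus the
    three mounting angles."""
    return rotation_from_angles(ax, ay, az), np.array([x, y, h], dtype=float)
```

With all angles zero the rotation is the identity, i.e. the camera axes coincide with the world axes.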
- the display device 31 includes a liquid crystal display or an organic EL display, and can display an image processed by the image processing device 21 based on a captured image captured by the in-vehicle camera 11.
- the image processing apparatus 21 includes an image storage unit (IMAGE STOR) 22, a sensor information input unit (SENS INFO INPUT) 23, an operation unit (OPERATE) 24, a storage unit (STORAGE) 25, and a control unit (CONTROL) 26.
- IMAGE STOR image storage unit
- SENS INFO INPUT sensor information input unit
- OPERATE operation unit
- STORAGE storage unit
- CONTROL control unit
- the image storage unit 22 includes a storage device such as a DRAM, and stores captured images sequentially output from the in-vehicle camera 11 for a predetermined time (for example, the past 10 seconds).
- the sensor information input unit 23 is an input interface that inputs movement distance information obtained from a vehicle speed pulse sensor or the like capable of grasping the movement distance of the vehicle, and rotation information obtained from a steering angle sensor or a gyroscope capable of grasping the rotation angle of the vehicle.
- such information (SENS INFO) may also be input via an information processing apparatus such as an ECU.
- the operation unit 24 includes a touch panel provided on the display surface of the display device 31 and mechanical key switches installed around the display device 31, and is a device through which a driver or the like can input various operation instructions.
- the storage unit 25 is a nonvolatile storage device such as a flash memory, and stores the above-described posture parameters of the in-vehicle camera 11, the programs executed by the control unit 26, and the color information and shape information of the default calibration target described later.
- the posture parameters stored in the storage unit 25 are used for processing images captured by the in-vehicle camera 11 (for example, conversion to a bird's-eye view image).
- these posture parameters are also used for determining whether to warn that the mounting position or mounting angle of the in-vehicle camera 11 has become abnormal due to vehicle vibration or the like.
- the control unit 26 includes a microcomputer including a CPU, a RAM, a ROM, an I / O, and the like.
- the control unit 26 reads a program stored in the storage unit 25 and executes various processes.
- the posture parameter determination process is started when the measurer operates the operation unit 24 and inputs an instruction to update the posture parameters while the vehicle is stopped; the corresponding program is then read from the storage unit 25 and executed by the control unit 26.
- when inputting the instruction, the measurer can select whether to update all of the posture parameters (complete update) or only some of them (partial update).
- the execution instruction must be input after a default calibration target has been set up in advance within the imaging range of the in-vehicle camera 11.
- the default calibration target is a target installed intentionally for measurement; one example is a square panel several tens of centimeters on a side.
- when starting the posture parameter determination process, the control unit 26 first branches the process depending on whether the measurer instructed a complete update (S105): when a complete update is instructed, the process proceeds to S110; when a partial update is instructed, the process proceeds to S125.
- in S110, the control unit 26 inputs one frame of the latest captured image, including the default calibration target, that was captured by the in-vehicle camera 11 and stored in the image storage unit 22.
- the image including the default calibration target is also referred to as a third captured image.
- the control unit 26 determines, among the posture parameters described above, the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint coordinates so that the deformation of the default calibration target included in the captured image read in S110 is small (preferably minimum) (S115). This determination is described in detail below.
- the control unit 26 first converts the image input in S110 into a bird's-eye view image.
- This conversion method is well known and is described in detail, for example, in JP-A-10-211849. That is, the coordinates on the screen plane T in FIG. 2B are converted into coordinates on the ground surface.
- the xs direction is a direction parallel to the horizontal direction of the screen plane T
- the ys direction is a direction perpendicular to the xs direction on the screen plane T.
- in the method of the above publication, the only rotation about the axes of the world coordinate system considered is that about the x axis, but here the conversion also takes rotation about the y axis and the z axis into account. The conversion uses the posture parameter values stored in the storage unit 25.
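The conversion to a bird's-eye image can be illustrated by back-projecting each pixel onto the ground surface. The sketch below assumes a simple pinhole model; the intrinsic parameters fx, fy, cu, cv and the function name are assumptions not given in the patent, which instead references the method of JP-A-10-211849:

```python
import numpy as np

def to_birds_eye_point(u, v, fx, fy, cu, cv, R, t):
    """Back-project pixel (u, v) on the screen plane T onto the
    ground surface (z = 0 in world coordinates).
    fx, fy, cu, cv: pinhole intrinsics (assumed known).
    R, t: camera rotation and camera viewpoint position in world
    coordinates, i.e. the posture parameters."""
    # Ray direction through the pixel, in camera coordinates,
    # then rotated into world coordinates.
    d_cam = np.array([(u - cu) / fx, (v - cv) / fy, 1.0])
    d_w = R @ d_cam
    if abs(d_w[2]) < 1e-9:
        return None            # ray parallel to the ground surface
    # Intersect t + s * d_w with the plane z = 0.
    s = -t[2] / d_w[2]
    if s <= 0:
        return None            # intersection behind the camera
    p = t + s * d_w
    return p[0], p[1]          # (x, y) on the ground surface
```

For a camera looking straight down from height 2 m, the principal point maps to the point on the ground directly below the viewpoint.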
- when the conversion to the bird's-eye view image is finished, the control unit 26 identifies the default calibration target included in the bird's-eye view image.
- the default calibration target is specified using the color information and shape information of the default calibration target stored in the storage unit 25.
- the control unit 26 searches, among the posture parameters, for the values of the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint coordinates at which the shape of the default calibration target after bird's-eye conversion is closest to the ideal shape (the shape specified by the shape information of the default calibration target stored in the storage unit 25).
- as the search method, for example, each parameter value is increased or decreased in turn, using the posture parameter values from the previous execution stored in the storage unit 25 as central values, and the captured image is bird's-eye converted and compared with the ideal shape at each step; however, the method is not limited to this.
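The increase/decrease search described above can be sketched as a simple coordinate descent. The helper below is illustrative only; the actual cost function (bird's-eye conversion plus comparison with the ideal shape) is left abstract, and the step sizes are assumptions:

```python
def coordinate_descent(params, cost, step=0.01, iters=50):
    """Sketch of the S115/S160 search: starting from the previous
    posture-parameter values (central values), increase or decrease
    each parameter in turn and keep any change that lowers the cost
    (e.g. deviation of the bird's-eye-converted target from its
    ideal shape)."""
    params = dict(params)
    best = cost(params)
    for _ in range(iters):
        improved = False
        for name in list(params):
            for delta in (+step, -step):
                trial = dict(params)
                trial[name] += delta
                c = cost(trial)
                if c < best:
                    params, best, improved = trial, c, True
        if not improved:
            step /= 2              # refine around the current optimum
            if step < 1e-6:
                break
    return params
```

Run with a quadratic stand-in for the shape-deviation cost, the search converges to the cost minimum.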
- the storage unit 25 also functions as a default calibration target storage unit.
- for example, the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint coordinates are searched for so that the bird's-eye images of the two photographed default calibration targets 61a and 61b illustrated in FIG. 4A approach the ideal-shape default calibration target 62 illustrated in FIG. 4B.
- when the square default calibration target is distorted into a trapezoid, or when the areas of the default calibration targets differ, the influence of the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint coordinates is large.
- after determining the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint coordinates in S115, the control unit 26 updates the posture parameter values stored in the storage unit 25 with the determined values (S120). That is, among the posture parameters, the pitch angle, the roll angle, and the height H (z coordinate) of the camera viewpoint are updated to the latest values.
- in S125, the control unit 26 notifies the measurer that the vehicle is to be moved; specifically, this is realized by displaying characters on the display device 31.
- in S135, the control unit 26 starts sequentially inputting the latest captured images captured by the in-vehicle camera 11 and stored in the image storage unit 22.
- the control unit 26 starts to sequentially input the travel distance information about the travel distance of the vehicle from the vehicle speed pulse sensor and the rotation information about the direction of the vehicle from the steering angle sensor via the sensor information input unit 23 (S140).
- the input information is stored in the RAM in the control unit 26 in association with each captured image input in S135.
- that is, each captured image, which arrives at regular intervals, is associated with the movement distance and rotation angle of the vehicle since the point where the immediately preceding captured image was taken.
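The per-frame distance and rotation associated in S140 can be integrated into the shooting position and direction of each image (as needed for S155). This is a hedged sketch; the straight-segment motion model between consecutive frames is an assumption (an arc model would be slightly more accurate):

```python
import math

def integrate_poses(frames):
    """Dead-reckoning sketch: each entry of `frames` is the
    (movement distance, rotation angle) of the vehicle since the
    previous captured image, from the vehicle speed pulse sensor
    and the steering angle sensor or gyroscope. Returns the
    (x, y, heading) of the vehicle at each image in a common frame."""
    x = y = heading = 0.0
    poses = [(x, y, heading)]
    for distance, rotation in frames:
        heading += rotation
        # Straight segment between consecutive frames.
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
        poses.append((x, y, heading))
    return poses
```

The pose difference between the entries for the two selected images gives the movement distance and rotation angle between the two shooting points.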
- the control unit 26 analyzes the captured image input in S135 and determines a pseudo calibration target from the image (S145).
- a pseudo calibration target is not an object placed for the purpose of measurement, but a fixed object (an object that does not move, including white lines) near the ground surface around the road the vehicle travels, which is larger than a certain size and whose degree of rotation can be identified (that is, an object that is not a perfect circle when viewed from above). Specific examples include various things existing near the ground surface, such as white lines and yellow lines on the road, curbs, reflectors, road surface patterns, and braille blocks on the sidewalk. Only one pseudo calibration target may be selected and determined from the captured image, or a plurality may be selected. The control unit 26 repeats S145 until an appropriate pseudo calibration target can be determined.
- the pseudo calibration target is also called a calibration target.
- in S150, the control unit 26 selects, from the captured images input in S135, two images (images photographed from two different points) that include the same pseudo calibration target determined in S145.
- as the interval between the two time points, an interval over which the vehicle moves several meters is appropriate.
- An image captured at the first time point is referred to as a first captured image
- an image captured at the second time point is referred to as a second captured image.
- in S155, the control unit 26 calculates the movement distance and rotation angle of the vehicle between the two time points (between the two points) at which the images selected in S150 were captured, based on the movement distance information and rotation information associated with each image in S140.
- the control unit 26 then determines, among the posture parameters described above, the x and y coordinates of the camera viewpoint and the yaw angle so that the position difference and rotation difference of the pseudo calibration target between the two time points on the world coordinate system are small (preferably minimum) (S160). This determination is described in detail below.
- the control unit 26 first converts each of the two captured images selected in S150 into a bird's-eye view image. Specifically, the first captured image is converted into a first bird's-eye image, and the second captured image is converted into a second bird's-eye image. At this time, conversion is performed using the value of the posture parameter stored in the storage unit 25.
- next, the control unit 26 searches, among the posture parameters, for the x and y coordinates of the camera viewpoint and the yaw angle at which the difference (shift) in position and rotation of the pseudo calibration target included in the two converted bird's-eye images is reduced on the world coordinate system. As the search method, for example, each parameter value is sequentially increased or decreased, using the posture parameter values from the previous execution stored in the storage unit 25 as central values, and the captured images are bird's-eye converted to evaluate the difference at each step; however, the method is not limited to this.
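The position difference and rotation difference being evaluated can be sketched as a residual over corresponding points of the same pseudo calibration target taken from the two bird's-eye images placed on the world coordinate system. Representing the misalignment caused by the current parameter guess as a rigid transform (dx, dy, dyaw) applied to the second view is an illustrative assumption, not the patent's formulation:

```python
import math

def placement_error(target1, target2, dx, dy, dyaw):
    """Sum of squared distances between corresponding corner points
    of the pseudo calibration target as seen in the first bird's-eye
    image (target1) and in the second bird's-eye image (target2)
    after placing both on the world coordinate system. A wrong
    camera x, y, or yaw leaves a residual position and rotation
    difference, which the S160 search drives toward zero."""
    c, s = math.cos(dyaw), math.sin(dyaw)
    err = 0.0
    for (x1, y1), (x2, y2) in zip(target1, target2):
        wx = c * x2 - s * y2 + dx      # rotate then translate view 2
        wy = s * x2 + c * y2 + dy
        err += (wx - x1) ** 2 + (wy - y1) ** 2
    return err
```

With perfectly aligned views the residual is zero; any offset or rotation between the two placements produces a positive error to be minimized.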
- FIG. 5 is used to illustrate the positional difference (positional deviation) and rotational difference (rotational deviation) of the pseudo calibration target.
- FIG. 5A is an image obtained by converting the captured images of the pseudo calibration targets 71 and 72 at the first time point selected in S150 into bird's-eye images, and is an image near the pseudo calibration targets 71 and 72.
- in this bird's-eye image, the two square pseudo calibration targets 71 and 72 are arranged side by side and appear slightly rotated counterclockwise compared with FIG. 5B described later.
- FIG. 5B is an image obtained by converting the captured image of the pseudo calibration targets 71 and 72 at the second time point selected in S150 into a bird's-eye image, showing the vicinity of the same pseudo calibration targets 71 and 72 as in FIG. 5A.
- in this bird's-eye image, the two square pseudo calibration targets 71 and 72 are arranged side by side and appear slightly rotated clockwise compared with FIG. 5A (the aligned state in the drawing).
- FIG. 5C is an image in which FIGS. 5A and 5B are arranged with their coordinate systems aligned, that is, an image in which both sets of pseudo calibration targets 71 and 72 are arranged on the world coordinate system.
- the pseudo calibration targets 71 and 72 at the first time point and the pseudo calibration targets 71 and 72 at the second time point have a difference in position (deviation) and a difference in rotation (deviation).
- the x and y coordinates of the camera viewpoint and the yaw angle are searched for from among the posture parameters so that these differences are reduced. Note that the difference in position and rotation of the pseudo calibration target is largely due to the influence of the x and y coordinates of the camera viewpoint and the yaw angle.
- based on these determined values, the posture parameter values stored in the storage unit 25 are updated (S165). That is, among the posture parameters, the x and y coordinates of the camera viewpoint and the yaw angle are updated to the latest values.
- the control unit 26 notifies the measurer that the update of the posture parameter is completed (S170). Specifically, it is realized by displaying characters on the display device 31. When the notification is finished, the control unit 26 ends the present process (posture parameter determination process).
- even if the vehicle turns during the measurement, the posture parameter values of the camera 11 are calculated with that rotation reflected. That is, the measurement is facilitated because the vehicle may be turned at the time of measurement and need not be moved straight.
- as a calibration target, a marker, a panel, or the like installed around the vehicle for the purpose of measurement (a default calibration target) may be specified, but an object not intended for measurement, that is, an object that exists near the ground surface and can be distinguished from the road surface, may also be specified as a pseudo calibration target. In other words, any color information or shape information may be stored in advance in the storage unit 25 as a default calibration target; the default calibration target is not necessarily limited to a marker or panel installed around the vehicle. For example, if the color information or shape information of a white line on the road surface is known in advance, the white line can be set as the default calibration target.
- the posture parameter value of the camera 11 can be calculated without preparing a default calibration target such as a marker before measurement.
- the posture parameter value calculated in S160 is at least one of the horizontal mounting position of the camera 11 and the mounting yaw angle (horizontal swing angle) of the camera 11.
- Examples of the horizontal mounting position of the camera 11 include one (X or Y) or both (X and Y) directions of orthogonal coordinate axes on a horizontal plane.
- Shape information that can specify the original shape of the default calibration target is stored in advance, and a captured image (third captured image) obtained by capturing the default calibration target is converted into a third bird's-eye image, and the third bird's-eye image is displayed.
- there is a type of posture parameter other than the posture parameters calculated in S160.
- the value of the posture parameter may be calculated in S115.
- the value of the posture parameter calculated in S115 is also stored in the storage unit 25.
- the difference in shape includes, for example, size and distortion.
- the parameter value calculated in S115 is at least one of the mounting pitch angle (vertical swing angle) of the camera 11, the mounting roll angle, and the vertical mounting position.
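As one concrete instance of the shape-matching idea behind S115: if the assumed camera height is wrong, the default calibration target appears uniformly scaled in the bird's-eye image, so the vertical mounting position can be corrected in proportion to the size error. This relation is exact only for a straight-down view and is offered as an illustrative sketch, not the patented procedure:

```python
def corrected_height(assumed_height, known_length, measured_length):
    """Correct the vertical mounting position from the apparent size of the
    default calibration target in the bird's-eye image. If the target's known
    edge length is `known_length` but it measures `measured_length` in the
    bird's-eye image, the true height is the assumed height scaled by the
    ratio (exact for a nadir view, an approximation otherwise)."""
    return assumed_height * known_length / measured_length
```

For example, a 2.0 m panel edge that measures 2.5 m in the bird's-eye image indicates the assumed height was too large.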
- the process performed by the control unit 26 in S135 functions as a captured image input unit, and the process performed by the control unit 26 in S145 functions as a calibration target specifying unit.
- the process performed by the control unit 26 in S140 functions as a vehicle information input unit.
- the process performed by the control unit 26 in S160 functions as a first calculation unit.
- the process performed by the control unit 26 in S115 functions as a second calculation unit.
- the process performed in S165 functions as a storage control unit.
- the image processing apparatus 21 includes a captured image input unit (S135), a calibration target specifying unit (S145), a vehicle information input unit (S140), a positional relationship acquisition unit, a calculation unit (S160), and a storage control unit (S165).
- the captured image input unit (S135) inputs, to the image processing device 21, a first captured image captured by the camera 11 mounted on the vehicle and a second captured image captured after the first captured image was captured and the vehicle traveled.
- the calibration target specifying unit (S145) specifies a first calibration target included in the first captured image input by the captured image input unit (S135), and a second calibration target included in the second captured image and having the same shape as the first calibration target.
- the vehicle information input unit (S140) inputs information that can specify the travel distance and direction change of the vehicle between the point where the first captured image is captured and the point where the second captured image is captured.
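The travel distance and direction change can be turned into a relative pose between the two shooting points by simple dead reckoning. A minimal sketch, assuming the vehicle moves along a circular arc between the points (the function name and axis convention, x forward and y to the left, are illustrative assumptions):

```python
import math

def relative_pose(distance, heading_change):
    """Relative pose (dx, dy, dtheta) of the second shooting point as seen
    from the first, assuming motion along a circular arc of length `distance`
    over which the heading changes by `heading_change` radians.
    Straight travel is the limiting case of zero heading change."""
    if abs(heading_change) < 1e-9:
        return distance, 0.0, 0.0          # straight ahead
    radius = distance / heading_change     # arc radius from arc length
    dx = radius * math.sin(heading_change)
    dy = radius * (1.0 - math.cos(heading_change))
    return dx, dy, heading_change
```

Driving a quarter circle of radius 2 m (arc length pi m, heading change pi/2) puts the second point 2 m ahead and 2 m to the side of the first.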
- the positional relationship acquisition unit acquires a relative positional relationship between the first calibration target and the second calibration target.
- the calculation unit (S160) converts the first captured image and the second captured image into a first bird's-eye image and a second bird's-eye image, respectively, and arranges the converted bird's-eye images on a common coordinate system, reflecting the respective shooting positions and shooting directions of the first and second captured images specified from the information input by the vehicle information input unit (S140).
- based on the relative positional relationship between the first calibration target and the second calibration target acquired by the positional relationship acquisition unit, the calculation unit (S160) calculates the value of at least one posture parameter of the camera 11 so that the first calibration target and the second calibration target overlap in the common coordinate system and the positional difference and rotational difference between them are reduced.
- the storage control unit (S165) causes the storage unit 25 to store the posture parameter value calculated by the calculation unit (S160).
- the first calibration target and the second calibration target may be targets having the same shape located at different positions.
- in short, the image processing apparatus 21 described above arranges the same pseudo calibration target, photographed from two different points, on the ground plane of the world coordinate system, which is a common coordinate system, reflecting the movement and rotation of the vehicle. It then reviews, among the posture parameters of the in-vehicle camera 11, the values of the horizontal mounting position indicated by the x and y coordinates of the camera viewpoint and the mounting yaw angle, which are first-type posture parameters, so that the positional deviation and rotational deviation of the arranged pseudo calibration targets viewed from above are reduced. Since the posture parameter values are thus calculated reflecting the rotational movement of the vehicle, the vehicle may be turned during measurement.
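The alignment idea above can be sketched numerically: map the target observed in each bird's-eye image into the world frame through the vehicle pose and a guessed camera mounting, then adjust the mounting until the two mapped positions coincide. The following toy version searches only the mounting yaw angle with a grid; all names and the 2D frame conventions are illustrative assumptions, not the patented algorithm:

```python
import numpy as np

def to_parent(frame, p):
    """Map a 2D point `p` from a child frame into its parent frame.
    `frame` = (x, y, theta) is the child frame's pose in the parent."""
    x, y, th = frame
    c, s = np.cos(th), np.sin(th)
    return np.array([x + c * p[0] - s * p[1], y + s * p[0] + c * p[1]])

def misalignment(mount, poses, observations):
    """Spread of the same target mapped to world coordinates from each
    shooting point; zero when the mounting guess matches the true mounting.
    `mount` = (mx, my, yaw) is the camera pose in the vehicle frame,
    `poses` are vehicle poses, `observations` bird's-eye target positions."""
    pts = np.array([to_parent(pose, to_parent(mount, q))
                    for pose, q in zip(poses, observations)])
    return np.linalg.norm(pts - pts.mean(axis=0))

def search_yaw(poses, observations, mx, my):
    """Coarse grid search over the mounting yaw angle only (horizontal
    position held fixed), minimizing the misalignment."""
    grid = np.linspace(-0.2, 0.2, 401)
    errs = [misalignment((mx, my, y), poses, observations) for y in grid]
    return grid[int(np.argmin(errs))]
```

Because the vehicle turns between the two shooting points, a yaw error rotates the two mapped target positions by different amounts, so the misalignment has a unique minimum at the true yaw; a real implementation would jointly refine x, y, and yaw.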
- the image processing device 21 selects and determines the pseudo calibration target from the various objects existing near the ground surface around the vehicle. For this reason, the values of the x and y coordinates of the camera viewpoint and the yaw angle among the posture parameters can be reviewed simply by moving the vehicle near the pseudo calibration target, without placing a predetermined calibration target such as a marker around the vehicle before measurement.
- using a default calibration target (a calibration target that is not a pseudo calibration target) included in the bird's-eye view image, the values of the mounting pitch angle, the mounting roll angle, and the camera viewpoint height H (the vertical mounting position indicated by the z coordinate) among the posture parameters of the in-vehicle camera 11 can be reviewed.
- the attitude parameters are x, y, z, the pitch angle, the roll angle, and the yaw angle.
- one in-vehicle camera 11 is installed as a camera for photographing the rear periphery of the vehicle 7, but a plurality of cameras may be installed in the vehicle 7.
- the plurality of cameras may be installed so as to be able to photograph the front, rear, or side of the vehicle, respectively.
- the posture parameter determination process described above may be performed for each camera.
- in the above embodiment, the control unit 26 of the image processing apparatus 21 performs all the steps of the attitude parameter determination process, but some of the steps may be performed by another information processing apparatus. That is, the posture parameter determination process may be realized by the control unit 26 transmitting the data necessary for executing some steps to another information processing apparatus and receiving the result data from that apparatus.
- the other information processing apparatus may be an in-vehicle information processing apparatus, or a non-in-vehicle information processing apparatus that can communicate via a communication network. If some steps are performed by another information processing apparatus in this way, the processing load on the image processing apparatus 21 can be suppressed and its hardware configuration can be simplified.
- the pseudo calibration target is used from S145 to S160, but a default calibration target (a panel or the like installed around the vehicle for measurement purposes) may be used. That is, a predetermined calibration target may be arranged around the vehicle traveling path in advance, and the vehicle may travel in the vicinity thereof, and the control unit 26 may use it to perform the processes of S145 to S160.
- the present disclosure may include a computer-readable persistent and tangible storage medium including instructions for image processing executed by the control unit 26.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
As shown in FIG. 1, an image processing system (IMAG PROC SYS) 5 according to the present embodiment includes an in-vehicle camera (VH CAMERA) 11, an image processing device (IMAG PROC) 21, and a display device (DISPLAY) 31.
Next, the operation of the image processing device 21 will be described. The following description focuses on processing related to the present disclosure; conventionally known processing for displaying images captured by this kind of in-vehicle camera 11 on a display device (for example, processing that converts a captured image into a bird's-eye image, superimposes a predicted wheel trajectory, and displays the result on the display device to assist parking) is omitted.
An image processing device 21 according to a modification of the above embodiment includes a captured image input unit (S135), a calibration target specifying unit (S145), a vehicle information input unit (S140), a positional relationship acquisition unit, a calculation unit (S160), and a storage control unit (S165). The captured image input unit (S135) inputs, to the image processing device 21, a first captured image captured by the camera 11 mounted on the vehicle and a second captured image captured after the first captured image was captured and the vehicle traveled. The calibration target specifying unit (S145) specifies a first calibration target included in the first captured image input by the captured image input unit (S135), and a second calibration target included in the second captured image and having the same shape as the first calibration target. The vehicle information input unit (S140) inputs information capable of specifying the travel distance and direction change of the vehicle between the point where the first captured image was captured and the point where the second captured image was captured. The positional relationship acquisition unit acquires the relative positional relationship between the first calibration target and the second calibration target. The calculation unit (S160) converts the first captured image and the second captured image into a first bird's-eye image and a second bird's-eye image, respectively, and arranges the converted bird's-eye images on a common coordinate system, reflecting the respective shooting positions and shooting directions of the first and second captured images specified from the information input by the vehicle information input unit (S140). Then, based on the relative positional relationship between the first calibration target and the second calibration target acquired by the positional relationship acquisition unit, the calculation unit (S160) calculates the value of at least one posture parameter of the camera 11 so that the first calibration target and the second calibration target overlap in the common coordinate system and the positional difference and rotational difference between them are reduced. The storage control unit (S165) causes the storage unit 25 to store the posture parameter value calculated by the calculation unit (S160). The first calibration target and the second calibration target may be targets of the same shape located at different positions.
In short, the image processing device 21 described above arranges the same pseudo calibration target, photographed from two different points, on the ground plane of the world coordinate system, which is a common coordinate system, reflecting the movement and rotation of the vehicle. It then reviews, among the posture parameters of the in-vehicle camera 11, the values of the horizontal mounting position indicated by the x and y coordinates of the camera viewpoint and the mounting yaw angle, which are first-type posture parameters, so that the positional deviation and rotational deviation of the arranged pseudo calibration targets viewed from above are reduced. Therefore, since the posture parameter values are calculated reflecting the rotational movement of the vehicle as well, the vehicle may be turned during measurement.
(1) In the above embodiment, images at two time points containing the same pseudo calibration target (images captured from two different points) are selected (S150); however, images at three or more time points containing the same pseudo calibration target (images captured from three or more different points) may be selected and the subsequent processing performed. In that case, posture parameters can be calculated with higher accuracy.
Claims (9)
- a captured image input unit (26, S135) for inputting a plurality of images captured by a camera mounted on a vehicle;
a calibration target specifying unit (26, S145) for specifying the same calibration target commonly included in a first captured image and a second captured image which are input by the captured image input unit (26, S135) and captured at at least two points traveled by the vehicle;
a vehicle information input unit (26, S140) for inputting information capable of specifying the travel distance and direction change of the vehicle;
a calculation unit (26, S160) for converting the first captured image and the second captured image containing the calibration target into a first bird's-eye image and a second bird's-eye image, respectively, arranging the converted bird's-eye images on a common coordinate system while reflecting the respective shooting positions and shooting directions of the first captured image and the second captured image specified from the information input by the vehicle information input unit (26, S140), and calculating the value of at least one posture parameter of the camera so that the positional difference and rotational difference of the calibration target between the first bird's-eye image and the second bird's-eye image are reduced; and
a storage control unit (26, S165) for causing a storage unit (25) to store the posture parameter value calculated by the calculation unit (26, S160),
An image processing device comprising the above. - The calibration target specifying unit (26, S145) specifies, as the calibration target, an object that exists near the ground surface and is distinguishable from the road surface,
The image processing device according to claim 1. - The value of the at least one posture parameter is at least one of a horizontal mounting position of the camera and a mounting yaw angle of the camera,
The image processing device according to claim 1 or 2. - A default calibration target storage unit (25) that stores shape information capable of specifying the original shape of a default calibration target different from the calibration target, and,
with the calculation unit (26, S160) referred to as a first calculation unit and the posture parameter as a first-type posture parameter,
a second calculation unit (26, S115) that calculates the value of at least one second-type posture parameter different from the first-type posture parameter,
are further provided, wherein
the second calculation unit (26, S115)
converts a third captured image, which is captured by the camera, input by the captured image input unit (26, S135), and contains the default calibration target, into a third bird's-eye image, and
calculates the value of the second-type posture parameter so as to reduce the difference between the shape of the default calibration target included in the third bird's-eye image and the original shape of the default calibration target specified by the shape information stored in the default calibration target storage unit (25), and
the storage control unit (26, S165) also causes the storage unit (25) to store the value of the second-type posture parameter calculated by the second calculation unit (26, S115),
The image processing device according to any one of claims 1 to 3. - The value of the at least one second-type posture parameter is at least one of a mounting pitch angle, a mounting roll angle, and a vertical mounting position of the camera,
The image processing device according to claim 4. - The first calculation unit (26, S160) converts the first captured image and the second captured image into the first bird's-eye image and the second bird's-eye image, respectively, based on the value of the second-type posture parameter calculated by the second calculation unit (26, S115) and stored in the storage unit (25),
The image processing device according to claim 4 or 5. - The calibration target includes at least one of a white line on a road, a yellow line, a curbstone, a reflector, a road surface pattern, and a braille block on a sidewalk, and
the default calibration target is a target intended for use in measurement and includes at least one of a marker and a panel installed around the vehicle,
The image processing device according to any one of claims 1 to 6. - Inputting a plurality of images captured by a camera mounted on a vehicle,
specifying the same calibration target commonly included in a first captured image and a second captured image captured and input at at least two points traveled by the vehicle,
inputting information capable of specifying the travel distance and direction change of the vehicle,
converting the first captured image and the second captured image containing the calibration target into a first bird's-eye image and a second bird's-eye image, respectively,
arranging each bird's-eye image on a common coordinate system while reflecting the respective shooting positions and shooting directions of the first and second captured images specified from the information capable of specifying the travel distance and direction change of the vehicle,
calculating the value of a posture parameter of the camera so that the positional difference and rotational difference of the calibration target between the first bird's-eye image and the second bird's-eye image are reduced, and
storing the calculated value of the posture parameter,
A computer-readable persistent and tangible storage medium containing computer-implemented instructions for performing the above. - A captured image input unit (26, S135) for inputting a first captured image captured by a camera mounted on a vehicle and a second captured image captured after the first captured image was captured and the vehicle traveled;
a calibration target specifying unit (26, S145) for specifying a first calibration target included in the first captured image input by the captured image input unit (26, S135) and a second calibration target included in the second captured image and having the same shape as the first calibration target;
a vehicle information input unit (26, S140) for inputting information capable of specifying the travel distance and direction change of the vehicle between the point where the first captured image was captured and the point where the second captured image was captured;
a positional relationship acquisition unit for acquiring a relative positional relationship between the first calibration target and the second calibration target;
a calculation unit (26, S160) for converting the first captured image and the second captured image into a first bird's-eye image and a second bird's-eye image, respectively, arranging the converted bird's-eye images on a common coordinate system while reflecting the respective shooting positions and shooting directions of the first and second captured images specified from the information input by the vehicle information input unit (26, S140), and, based on the relative positional relationship between the first calibration target and the second calibration target acquired by the positional relationship acquisition unit, calculating the value of at least one posture parameter of the camera so that the first calibration target and the second calibration target overlap in the common coordinate system and the positional difference and rotational difference between the first calibration target and the second calibration target are reduced; and
a storage control unit (26, S165) for causing a storage unit (25) to store the posture parameter value calculated by the calculation unit (26, S160),
An image processing device comprising the above.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112013004266.5T DE112013004266T5 (de) | 2012-08-30 | 2013-08-23 | Bildbearbeitungsvorrichtung und Speichermedium |
US14/421,204 US9967526B2 (en) | 2012-08-30 | 2013-08-23 | Image processing device and storage medium |
CN201380044872.1A CN104641394B (zh) | 2012-08-30 | 2013-08-23 | 图像处理装置及图像处理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-190056 | 2012-08-30 | ||
JP2012190056A JP5820787B2 (ja) | 2012-08-30 | 2012-08-30 | 画像処理装置、及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014034064A1 true WO2014034064A1 (ja) | 2014-03-06 |
Family
ID=50182902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/004988 WO2014034064A1 (ja) | 2012-08-30 | 2013-08-23 | 画像処理装置、及び記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9967526B2 (ja) |
JP (1) | JP5820787B2 (ja) |
CN (1) | CN104641394B (ja) |
DE (1) | DE112013004266T5 (ja) |
WO (1) | WO2014034064A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020060550A (ja) * | 2018-10-10 | 2020-04-16 | 株式会社デンソーテン | 異常検出装置、異常検出方法、姿勢推定装置、および、移動体制御システム |
CN112184827A (zh) * | 2019-07-01 | 2021-01-05 | 威达斯高级驾驶辅助设备有限公司 | 校准多个摄像机的方法及装置 |
DE112015002764B4 (de) | 2014-06-11 | 2021-07-29 | Denso Corporation | Montagewinkeleinstellverfahren und Montagewinkelerfassungseinrichtung für bordeigene Kamera |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6151535B2 (ja) * | 2013-02-27 | 2017-06-21 | 富士通テン株式会社 | パラメータ取得装置、パラメータ取得方法及びプログラム |
EP2942951A1 (en) * | 2014-05-06 | 2015-11-11 | Application Solutions (Electronics and Vision) Limited | Image calibration |
KR101579100B1 (ko) * | 2014-06-10 | 2015-12-22 | 엘지전자 주식회사 | 차량용 어라운드뷰 제공 장치 및 이를 구비한 차량 |
US10091418B2 (en) * | 2014-10-24 | 2018-10-02 | Bounce Imaging, Inc. | Imaging systems and methods |
KR101947935B1 (ko) * | 2014-12-22 | 2019-02-13 | 사이버옵틱스 코포레이션 | 3차원 측정 시스템의 갱신 보정 방법 |
JP6471522B2 (ja) * | 2015-02-10 | 2019-02-20 | 株式会社デンソー | カメラパラメータ調整装置 |
JP6413974B2 (ja) * | 2015-08-05 | 2018-10-31 | 株式会社デンソー | キャリブレーション装置、キャリブレーション方法、及びプログラム |
WO2017057057A1 (ja) * | 2015-09-30 | 2017-04-06 | ソニー株式会社 | 画像処理装置、画像処理方法、およびプログラム |
MX367989B (es) * | 2015-10-08 | 2019-09-13 | Nissan Motor | Dispositivo de asistencia de despliegue y método de asistencia de despliegue. |
EP3176035A1 (en) * | 2015-12-03 | 2017-06-07 | Fico Mirrors S.A. | A rear vision system for a motor vehicle |
JP2017139612A (ja) * | 2016-02-03 | 2017-08-10 | パナソニックIpマネジメント株式会社 | 車載カメラ用校正システム |
JP6614042B2 (ja) * | 2016-06-15 | 2019-12-04 | 株式会社Jvcケンウッド | 姿勢変化判定装置、俯瞰映像生成装置、俯瞰映像生成システム、姿勢変化判定方法およびプログラム |
US10594934B2 (en) * | 2016-11-17 | 2020-03-17 | Bendix Commercial Vehicle Systems Llc | Vehicle display |
JPWO2018173907A1 (ja) * | 2017-03-23 | 2019-11-07 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
JP6780093B2 (ja) * | 2017-03-30 | 2020-11-04 | 富士フイルム株式会社 | 画像処理装置及び画像処理方法 |
JP6779365B2 (ja) * | 2017-03-30 | 2020-11-04 | 三菱電機株式会社 | 物体検出装置及び車両 |
JP6815935B2 (ja) * | 2017-06-05 | 2021-01-20 | 日立オートモティブシステムズ株式会社 | 位置推定装置 |
US10466027B2 (en) | 2017-06-21 | 2019-11-05 | Fujitsu Ten Corp. Of America | System and method for marker placement |
US10670725B2 (en) * | 2017-07-25 | 2020-06-02 | Waymo Llc | Determining yaw error from map data, lasers, and cameras |
JP7019431B2 (ja) * | 2018-01-22 | 2022-02-15 | 株式会社デンソーアイティーラボラトリ | カメラキャリブレーション装置、カメラキャリブレーション方法、およびプログラム |
JP7219561B2 (ja) * | 2018-07-18 | 2023-02-08 | 日立Astemo株式会社 | 車載環境認識装置 |
JP7314486B2 (ja) * | 2018-09-06 | 2023-07-26 | 株式会社アイシン | カメラキャリブレーション装置 |
JP7148064B2 (ja) * | 2018-10-25 | 2022-10-05 | 株式会社アイシン | カメラパラメータ推定装置、カメラパラメータ推定方法、およびカメラパラメータ推定プログラム |
JP7123167B2 (ja) * | 2018-12-12 | 2022-08-22 | 日立Astemo株式会社 | 外界認識装置 |
CN109782755B (zh) * | 2018-12-27 | 2022-05-31 | 广东飞库科技有限公司 | 控制agv进行校准、agv校准位置的方法、计算机存储介质及agv |
KR102277828B1 (ko) * | 2019-08-13 | 2021-07-16 | (주)베이다스 | 복수의 카메라들을 캘리브레이션하는 방법 및 장치 |
WO2021145236A1 (ja) * | 2020-01-14 | 2021-07-22 | 京セラ株式会社 | 画像処理装置、撮像装置、情報処理装置、検出装置、路側機、画像処理方法、及び較正方法 |
EP3885698B1 (en) | 2020-01-31 | 2022-09-28 | NSK Ltd. | Calibration method for rotation angle calculation device, calibration device for rotation angle calculation device, rotation angle calculation device, motor control equipment, electric actuator product, and electric power steering apparatus |
CN113572946B (zh) * | 2020-04-29 | 2023-05-02 | 杭州海康威视数字技术股份有限公司 | 图像显示方法、装置、系统及存储介质 |
JP2022016908A (ja) * | 2020-07-13 | 2022-01-25 | フォルシアクラリオン・エレクトロニクス株式会社 | 俯瞰画像生成装置、俯瞰画像生成システム及び自動駐車装置 |
CN112132829A (zh) * | 2020-10-23 | 2020-12-25 | 北京百度网讯科技有限公司 | 车辆信息的检测方法、装置、电子设备和存储介质 |
CN112801874B (zh) * | 2021-02-03 | 2024-04-05 | 广州六环信息科技有限公司 | 一种车辆图像拼接方法、装置和车辆 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009017462A (ja) * | 2007-07-09 | 2009-01-22 | Sanyo Electric Co Ltd | 運転支援システム及び車両 |
JP2009129001A (ja) * | 2007-11-20 | 2009-06-11 | Sanyo Electric Co Ltd | 運転支援システム、車両、立体物領域推定方法 |
JP2010233079A (ja) * | 2009-03-27 | 2010-10-14 | Aisin Aw Co Ltd | 運転支援装置、運転支援方法、及び運転支援プログラム |
JP2010244326A (ja) * | 2009-04-07 | 2010-10-28 | Alpine Electronics Inc | 車載周辺画像表示装置 |
JP2011217233A (ja) * | 2010-04-01 | 2011-10-27 | Alpine Electronics Inc | 車載カメラ校正システム及びコンピュータプログラム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3663801B2 (ja) | 1997-01-30 | 2005-06-22 | いすゞ自動車株式会社 | 車両後方視界支援装置 |
JP3886376B2 (ja) | 2001-12-26 | 2007-02-28 | 株式会社デンソー | 車両周辺監視システム |
JP4596978B2 (ja) * | 2005-03-09 | 2010-12-15 | 三洋電機株式会社 | 運転支援システム |
JP5124147B2 (ja) * | 2007-02-01 | 2013-01-23 | 三洋電機株式会社 | カメラ校正装置及び方法並びに車両 |
JP4857143B2 (ja) | 2007-02-20 | 2012-01-18 | アルパイン株式会社 | カメラ姿勢算出用ターゲット装置およびこれを用いたカメラ姿勢算出方法ならびに画像表示方法 |
WO2009072264A1 (ja) * | 2007-12-03 | 2009-06-11 | Panasonic Corporation | 画像処理装置、撮影装置、再生装置、集積回路及び画像処理方法 |
EP2377094B1 (en) * | 2008-10-01 | 2014-06-11 | Connaught Electronics Limited | A method and a system for calibrating an image capture device |
JP5471038B2 (ja) * | 2009-05-27 | 2014-04-16 | アイシン精機株式会社 | 校正目標検出装置と、校正目標を検出する校正目標検出方法と、校正目標検出装置のためのプログラム |
JP5491235B2 (ja) | 2010-03-02 | 2014-05-14 | 東芝アルパイン・オートモティブテクノロジー株式会社 | カメラキャリブレーション装置 |
JP5444139B2 (ja) | 2010-06-29 | 2014-03-19 | クラリオン株式会社 | 画像のキャリブレーション方法および装置 |
-
2012
- 2012-08-30 JP JP2012190056A patent/JP5820787B2/ja not_active Expired - Fee Related
-
2013
- 2013-08-23 DE DE112013004266.5T patent/DE112013004266T5/de not_active Ceased
- 2013-08-23 WO PCT/JP2013/004988 patent/WO2014034064A1/ja active Application Filing
- 2013-08-23 US US14/421,204 patent/US9967526B2/en active Active
- 2013-08-23 CN CN201380044872.1A patent/CN104641394B/zh not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009017462A (ja) * | 2007-07-09 | 2009-01-22 | Sanyo Electric Co Ltd | 運転支援システム及び車両 |
JP2009129001A (ja) * | 2007-11-20 | 2009-06-11 | Sanyo Electric Co Ltd | 運転支援システム、車両、立体物領域推定方法 |
JP2010233079A (ja) * | 2009-03-27 | 2010-10-14 | Aisin Aw Co Ltd | 運転支援装置、運転支援方法、及び運転支援プログラム |
JP2010244326A (ja) * | 2009-04-07 | 2010-10-28 | Alpine Electronics Inc | 車載周辺画像表示装置 |
JP2011217233A (ja) * | 2010-04-01 | 2011-10-27 | Alpine Electronics Inc | 車載カメラ校正システム及びコンピュータプログラム |
Non-Patent Citations (1)
Title |
---|
TOSHIHIRO ASAI ET AL.: "3D Lines Reconstruction of Road Environment Based on Structure Knowledge from In-Vehicle Camera Images", THE TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS, vol. J93-D, no. 7, 1 July 2010 (2010-07-01), pages 1236 - 1247 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112015002764B4 (de) | 2014-06-11 | 2021-07-29 | Denso Corporation | Montagewinkeleinstellverfahren und Montagewinkelerfassungseinrichtung für bordeigene Kamera |
JP2020060550A (ja) * | 2018-10-10 | 2020-04-16 | 株式会社デンソーテン | 異常検出装置、異常検出方法、姿勢推定装置、および、移動体制御システム |
JP7270499B2 (ja) | 2018-10-10 | 2023-05-10 | 株式会社デンソーテン | 異常検出装置、異常検出方法、姿勢推定装置、および、移動体制御システム |
CN112184827A (zh) * | 2019-07-01 | 2021-01-05 | 威达斯高级驾驶辅助设备有限公司 | 校准多个摄像机的方法及装置 |
CN112184827B (zh) * | 2019-07-01 | 2024-06-04 | Nc&有限公司 | 校准多个摄像机的方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
DE112013004266T5 (de) | 2015-09-24 |
US9967526B2 (en) | 2018-05-08 |
US20150208041A1 (en) | 2015-07-23 |
CN104641394A (zh) | 2015-05-20 |
JP2014048803A (ja) | 2014-03-17 |
CN104641394B (zh) | 2017-06-27 |
JP5820787B2 (ja) | 2015-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014034064A1 (ja) | 画像処理装置、及び記憶媒体 | |
US10659677B2 (en) | Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium | |
KR101265012B1 (ko) | 차량 탑재 카메라의 교정 장치, 방법 및 프로그램 | |
JP5091902B2 (ja) | 車載カメラの校正に用いられる校正指標と、当該校正指標を用いた車載カメラの校正方法と、システムと、当該システムのためのプログラム | |
CN104718750B (zh) | 校准方法及校准装置 | |
JP4861034B2 (ja) | 車載カメラのキャリブレーションシステム | |
JP6277652B2 (ja) | 車両周辺画像表示装置及びカメラの調整方法 | |
WO2016056197A1 (ja) | 車載カメラ較正装置、画像生成装置、車載カメラ較正方法、画像生成方法 | |
US20200226926A1 (en) | Parking Assistance Method and Parking Assistance Device | |
WO2010137364A1 (ja) | 校正目標検出装置と、校正目標を検出する校正目標検出方法と、校正目標検出装置のためのプログラム | |
EP3032818B1 (en) | Image processing device | |
JP2009288152A (ja) | 車載カメラのキャリブレーション方法 | |
CN105472317B (zh) | 周边监视装置及周边监视系统 | |
JP2017188738A (ja) | 車載カメラの取付角度検出装置、取付角度較正装置、取付角度検出方法、取付角度較正方法、およびコンピュータープログラム | |
JP6471522B2 (ja) | カメラパラメータ調整装置 | |
JP2007256030A (ja) | 車載カメラのキャリブレーション装置およびキャリブレーション方法 | |
JP2015194397A (ja) | 車両位置検出装置、車両位置検出方法及び車両位置検出用コンピュータプログラムならびに車両位置検出システム | |
CN109345591A (zh) | 一种车辆自身姿态检测方法和装置 | |
JP6768554B2 (ja) | キャリブレーション装置 | |
US20220132049A1 (en) | Systems and method for image normalization | |
EP3709265A1 (en) | Road surface area detection device | |
JP2012118029A (ja) | 退出判定装置、退出判定プログラム及び退出判定方法 | |
JP6020736B2 (ja) | 予測進路提示装置及び予測進路提示方法 | |
JP2012023758A (ja) | 車載カメラの校正装置と、車載カメラの校正方法と、車載カメラの校正装置のためのプログラム | |
WO2019221151A1 (ja) | 撮像システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13834220 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14421204 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130042665 Country of ref document: DE Ref document number: 112013004266 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13834220 Country of ref document: EP Kind code of ref document: A1 |