WO2014119081A1 - Image processing apparatus and image processing method - Google Patents
- Publication number
- WO2014119081A1 (PCT/JP2013/079934)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- composite image
- unit
- display
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/101—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using cameras with adjustable capturing direction
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/102—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/402—Image calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- Embodiments described herein relate generally to an image processing apparatus and an image processing method.
- the composition position of the composite image may be displaced due to individual differences among the in-vehicle cameras or slight errors in their attachment positions.
- a calibration operation for correcting such displacement is therefore performed by a dealer operator or the like.
- the present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus and an image processing method capable of easily performing a calibration operation of an in-vehicle camera.
- an image processing apparatus according to the embodiments includes a generation unit that generates a composite image in which a vehicle is viewed from a virtual viewpoint based on a plurality of in-vehicle camera images, and a change receiving unit that receives a change in the relative positional relationship between adjacent image regions in the composite image; each time a change in the positional relationship is received, the generation unit regenerates the composite image based on the changed positional relationship.
- the calibration work can be easily executed.
- FIG. 1 is an explanatory diagram of an image processing apparatus according to the embodiment.
- FIG. 2 is a block diagram illustrating a configuration of the image processing apparatus according to the embodiment.
- FIG. 3 is an explanatory diagram of a display change unit according to the embodiment.
- FIG. 4 is a diagram showing a three-dimensional orthogonal coordinate system.
- FIG. 5A is a transition diagram of the display operation unit according to the embodiment.
- FIG. 5B is a transition diagram of the display operation unit according to the embodiment.
- FIG. 5C is a transition diagram of the display operation unit according to the embodiment.
- FIG. 5D is a transition diagram of the display operation unit according to the embodiment.
- FIG. 5E is a transition diagram of the display operation unit according to the embodiment.
- FIG. 6 is an overhead view illustrating a guide locus when the vehicle is moved backward.
- FIG. 7 is a flowchart illustrating processing executed by the control unit of the image processing apparatus according to the embodiment.
- FIG. 8A is a plan view of the display operation unit according to the embodiment.
- FIG. 8B is a plan view of the display operation unit in which the display unit of FIG. 8A is enlarged.
- FIG. 9 is a plan view of the display unit according to the embodiment.
- FIG. 1 is an explanatory diagram of an image processing apparatus according to the embodiment.
- the image processing apparatus according to the embodiment performs image processing that connects camera images (car-mounted camera images) captured by cameras provided at four positions on the vehicle 2: front, rear, left, and right. It thereby generates a bird's-eye view image (hereinafter referred to as a "composite image") in which the vehicle 2 is viewed from a virtual viewpoint above.
- the image processing apparatus displays the composite image on a display operation unit 8 having a touch panel function provided in the vehicle interior.
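The stitching described above can be illustrated with a minimal sketch. This is not the patent's implementation: the function name, array shapes, and the assumption that each camera view has already been warped onto the ground plane are all hypothetical.

```python
import numpy as np

def make_composite(front, rear, left, right, size=100, strip=30):
    """Assemble a bird's-eye composite from four top-down-projected
    camera views (each a (strip, size) array).  A real system would
    first warp each fisheye image onto the ground plane; here the
    warped views are assumed as inputs.  Later pastes overwrite the
    corner overlaps, which is where seam misalignment appears."""
    canvas = np.zeros((size, size), dtype=front.dtype)
    canvas[:strip, :] = front                   # front view: top strip
    canvas[-strip:, :] = rear[::-1, :]          # rear view: bottom strip
    canvas[:, :strip] = np.rot90(left)          # left view: left strip
    canvas[:, -strip:] = np.rot90(right, k=-1)  # right view: right strip
    return canvas
```

In this toy layout the four strips meet near the corners of the canvas; those corner seams are exactly where a mis-mounted camera would make a joint visibly shift.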
- in the image processing apparatus, when the attachment position and attachment angle of the camera provided at the front of the vehicle 2 deviate from the predetermined position and angle, a shift may appear, as shown in FIG. 1, at the joint between the image of the region 4a capturing the front of the vehicle and the adjacent images.
- such misalignment of the joints may give a sense of discomfort to the user of the vehicle 2. For this reason, the shift at the seams between images is corrected by a calibration operation performed, for example, by a worker at a dealer of the vehicle 2.
- the image processing apparatus is configured to easily perform such calibration work. Specifically, in the image processing apparatus, the display unit 4 that displays the generated composite image around the vehicle, the selection reception unit 5, and the change reception unit 6 are provided in the display area of the display operation unit 8.
- the selection receiving unit 5 is an image that imitates an operation button that receives an operation of selecting one camera from a plurality of cameras provided in the vehicle 2.
- the change receiving unit 6 receives a change in the relative positional relationship between an image region based on one camera image selected by an operation of the selection receiving unit 5 and an image region based on another camera image in the composite image. It is the image which imitated the operation button. More specifically, the change accepting unit 6 accepts an operation for changing an area to be used for a composite image in a camera image captured by a camera selected by an operation of the selection accepting unit 5.
- for example, a control unit 15 selects the camera provided at the front of the vehicle, and each time the change receiving unit 6 is operated, shifts the area to be used for the composite image within that camera's image to the right by a predetermined distance, generating and displaying a composite image on the display unit 4 each time.
- as shown in the lower part of FIG. 1, the operator performs the calibration work while checking the composite image displayed on the display unit 4 each time the left arrow button of the change receiving unit 6 is touched.
- in this way, the shift at the joint between the image of the region 4a capturing the front of the vehicle and the other images can be eliminated easily.
- a specific example of the composite image sequentially generated by the control unit 15 will be described later with reference to FIGS. 5A to 5E.
- such a calibration operation can be performed for each of the cameras mounted on the vehicle 2, so that, for example, calibration after replacing any one camera becomes easy.
- FIG. 2 is a block diagram illustrating a configuration of the image processing apparatus according to the embodiment.
- the image processing apparatus 1 is an apparatus applied to a vehicle 2 on which a camera 3 is mounted.
- the camera 3 includes a front camera 3a, a rear camera 3b, a left side camera 3c, and a right side camera 3d.
- These front camera 3a, rear camera 3b, left side camera 3c, and right side camera 3d each have an image pickup device for electronically acquiring an image, such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the front camera 3a is provided at the front end of the vehicle 2, for example, in the vicinity of the license plate mounting position, and photographs the front of the vehicle 2.
- the rear camera 3b is provided at the rear end of the vehicle 2 such as in the vicinity of the license plate mounting position, for example, and photographs the rear of the vehicle 2.
- the left side camera 3c and the right side camera 3d are provided, for example, in the vicinity of the left and right door mirrors, respectively, and photograph the left side and the right side of the vehicle 2.
- for each camera 3, a wide-angle camera having a field angle of 180 degrees or more, such as one with a fisheye lens, is adopted. As a result, the four cameras 3 can photograph the entire periphery of the vehicle 2.
- the image processing apparatus 1 provided in the vehicle 2 equipped with the camera 3 includes a display operation unit 8, a control unit 15, and a storage unit 20.
- the display operation unit 8 is a display device having a touch panel function, and includes a display unit 4, a selection reception unit 5, a change reception unit 6, and a display enlargement unit 7.
- the display unit 4 displays and outputs the composite image generated by the generation unit 10 described later via the display control unit 11.
- the selection receiving unit 5 receives an operation for selecting one from the front camera 3a, the rear camera 3b, the left side camera 3c, and the right side camera 3d mounted on the vehicle 2 and outputs the operation to the image selection unit 12 described later.
- the change receiving unit 6 receives a change in the relative positional relationship between an image region based on the camera image selected by the operation of the selection receiving unit 5 and an image region based on another camera image in the composite image. More specifically, it receives an operation to change the area to be used for the composite image (hereinafter referred to as the "image composition area") in the captured image of the selected camera 3a, 3b, 3c, or 3d, and outputs the operation to the display changing unit 13 described later.
- when a position in the composite image is selected, the display enlargement unit 7 enlarges a part of the composite image including that position and displays it on the display unit 4. The operation of the display enlargement unit 7 will be described later with reference to FIGS. 8A and 8B.
- control unit 15 includes an image acquisition unit 9, a generation unit 10, a display control unit 11, an image selection unit 12, a display change unit 13, and a fixed frame superimposition unit 14.
- the image acquisition unit 9 acquires camera images from a plurality of cameras 3a, 3b, 3c, and 3d mounted on the vehicle 2, respectively.
- the camera image acquired by the image acquisition unit 9 is output to the generation unit 10.
- the generation unit 10 performs image processing that connects the images included in the areas adopted from the camera images of the cameras 3a, 3b, 3c, and 3d acquired by the image acquisition unit 9, thereby generating a composite image in which the vehicle 2 is viewed from a virtual viewpoint, and outputs the composite image to the display control unit 11.
- the display control unit 11 causes the display unit 4 to display the composite image synthesized by the generation unit 10.
- the image selection unit 12 selects a camera image acquired from one selected by the selection reception unit 5 among the plurality of cameras 3a, 3b, 3c, and 3d.
- the display changing unit 13 outputs the image included in the image composition area changed by the operation of the change receiving unit 6 to the generation unit 10. Each time the change receiving unit 6 receives a change in the relative positional relationship between adjacent image regions in the composite image, the generation unit 10 regenerates the composite image displayed on the display unit 4 based on the changed positional relationship. That is, every time the change receiving unit 6 is operated, the display changing unit 13 updates the composite image that the generation unit 10 displays on the display unit 4 via the display control unit 11.
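The regenerate-on-every-change behavior can be sketched as a small state holder that invokes a generation callback on each accepted operation. All names here are hypothetical; only the regenerate-per-change pattern is taken from the description (the history and reset methods mirror the back button B and reset button R described in the embodiment).

```python
class ChangeReceiver:
    """Hypothetical sketch of the change-receiving / regeneration loop:
    every accepted nudge of a camera's image-composition area triggers
    a fresh composite generation via the supplied callback."""

    def __init__(self, generate):
        self.offsets = {"front": [0, 0], "rear": [0, 0],
                        "left": [0, 0], "right": [0, 0]}
        self.generate = generate   # callback: offsets -> composite image
        self.history = []          # snapshots, for a "back" operation

    def nudge(self, camera, dx, dy):
        """Shift the selected camera's composition area and regenerate."""
        self.history.append({k: v[:] for k, v in self.offsets.items()})
        self.offsets[camera][0] += dx
        self.offsets[camera][1] += dy
        return self.generate(self.offsets)  # regenerate immediately

    def back(self):
        """Undo the most recent nudge, then regenerate."""
        if self.history:
            self.offsets = self.history.pop()
        return self.generate(self.offsets)

    def reset(self):
        """Return all composition areas to their default positions."""
        self.offsets = {k: [0, 0] for k in self.offsets}
        self.history.clear()
        return self.generate(self.offsets)
```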
- FIG. 3 is a diagram for explaining the change of the image composition area in the display change unit 13 according to the embodiment.
- the camera image captured by the camera 3 a mounted on the vehicle 2 is illustrated as an image processed in a planar shape.
- the image composition region r1 shown in FIG. 3 is a region that, when adopted in the composite image, produces a shift at the joint of the images, while the image composition region r2 is a region that produces no shift at the joint when adopted.
- therefore, when the image composition area r1 is adopted for the composite image, the image processing apparatus 1 changes the area used for the composite image to the image composition area r2 in accordance with the operation of the change receiving unit 6, thereby eliminating the image shift at the seam.
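Changing the image composition area from r1 to r2 amounts to moving a crop window inside the full captured frame. A minimal sketch, with hypothetical names and coordinates:

```python
import numpy as np

def composition_area(captured, top, left, height, width, dx=0, dy=0):
    """Return the image-composition area: a (height, width) window cut
    from the full captured frame, shifted by the operator's accumulated
    offsets (dx to the right, dy downward).  Moving from a misaligned
    region r1 to a correct region r2 is just changing (dx, dy)."""
    y, x = top + dy, left + dx
    return captured[y:y + height, x:x + width]
```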
- FIG. 4 is a diagram illustrating a three-dimensional orthogonal coordinate system defined by the image processing apparatus 1 according to the embodiment.
- FIG. 4 shows a three-dimensional orthogonal coordinate system in which the Z-axis takes the vertically upward direction as its positive direction (and the vertically downward direction as its negative direction), and the Y-axis takes the straight-ahead direction of the vehicle 2 as its positive direction.
- the camera image captured by the camera 3 is changed by changing the roll angle, tilt angle, pan angle, etc. of the camera 3.
- the image composition area in the camera image can be changed based on the roll angle, tilt angle, pan angle, etc. of the camera 3, but it is difficult to change it intuitively.
- therefore, an image parameter 17 described later is prepared, and the image composition area is changed based on the image parameter 17, so that the image composition area can be changed intuitively by operating the change receiving unit 6.
- the image parameter 17 can be obtained, for example, from a combination of rotations about the axes of the three-dimensional orthogonal coordinate system described with reference to FIG. 4. An example of such a combination is shown in Table 1.
- the combination of rotation axes shown in Table 1 is only an example, and the combination can be changed depending on the mounting angle of each camera 3 mounted on the vehicle 2.
- in addition, the relative position, in the composite image, of the vehicle 2 and a marker provided at a predetermined position with respect to the vehicle 2 should match their actual relative position.
- a display overlapping with the marker in the composite image (hereinafter referred to as “fixed frame”) is superimposed on a predetermined fixed position in the composite image.
- the fixed frame indicates the position at which the marker is designed to be displayed in the composite image when each camera 3 is installed on the vehicle 2 exactly as designed and the marker is provided at its predetermined position with respect to the vehicle 2 exactly as designed.
- the fixed frame will be described later.
- the storage unit 20 includes a storage device such as a nonvolatile memory or a hard disk drive.
- the storage unit 20 may include a volatile memory that temporarily stores data.
- the storage unit 20 includes a fixed frame storage unit 16, and stores an image parameter 17 and a program 18.
- the fixed frame storage unit 16 stores information related to the fixed frame superimposed on the composite screen by the fixed frame superimposing unit 14. Such information may be, for example, data regarding the relative position with respect to the image of the vehicle 2 displayed on the composite screen, or may be the shape of the fixed frame superimposed on the composite screen.
- the image parameter 17 is a parameter set used to change the image composition area adopted for the composite image within the captured image, in accordance with the operation of the change receiving unit 6 described above with reference to FIG. 3.
- the program 18 includes a program that is read from the storage unit 20 and executed when the control unit 15 generates a composite image.
- the program 18 is an image processing program for causing a computer to execute: a procedure in which the generation unit 10 generates a composite image in which the vehicle 2 is viewed from a virtual viewpoint based on the plurality of in-vehicle camera images acquired by the image acquisition unit 9; a procedure in which the display control unit 11 displays the composite image generated by the generation unit 10; a procedure in which the change receiving unit 6 receives a change in the relative positional relationship between an image region based on one camera image and an image region based on another camera image; and a procedure in which, each time the change receiving unit 6 receives a change in the positional relationship, the generation unit 10 generates a composite image based on the changed positional relationship.
- by causing the control unit 15 of the image processing apparatus 1 to execute such an image processing program, the calibration operation for the composite image can be executed easily.
- FIGS. 5A to 5E are transition diagrams of the display operation unit 8 according to the embodiment.
- the display operation unit 8 includes the display unit 4 that displays the composite image, the selection receiving unit 5 and the change receiving unit 6 described above, a reset button R, a back button B, and a determination button 19.
- the selection reception unit 5 includes a front camera selection reception button 5a, a rear camera selection reception button 5b, a left side camera selection reception button 5c, and a right side camera selection reception button 5d.
- by touching any one of the selection reception buttons 5a to 5d, any one of the cameras 3a, 3b, 3c, and 3d provided on the front, rear, left, and right of the vehicle 2 can be selected.
- the configuration of the selection receiving unit 5 can be changed according to the arrangement and number of cameras mounted on the vehicle.
- the change receiving unit 6 includes a direction change receiving unit 6a and a height change receiving unit 6b.
- the direction change receiving unit 6a includes buttons 6a1 and 6a2 for shifting the image composition area downward and upward, buttons 6a3 and 6a4 for shifting it leftward and rightward, and buttons 6a5 and 6a6 for rotating the image counterclockwise and clockwise, respectively.
- the height change receiving unit 6b includes buttons 6b1 and 6b2 for changing the image composition area in the height direction, that is, in the positive direction and the negative direction along the Z axis shown in FIG.
- the reset button R is a button for returning (resetting) the position of the image composition area in the captured image changed by the operation of the change receiving unit 6 to a default position.
- the back button B is a button for returning the composite image displayed on the display unit 4 to its state before the selection receiving unit 5 or the change receiving unit 6 was last operated.
- the determination button 19 is a button for confirming the composite image displayed on the display unit 4 after the image composition region has been changed by operating the selection receiving unit 5 and the change receiving unit 6.
- the composite image displayed on the display unit 4 contains images of markers M11 to M14, which are provided as first markers at predetermined positions with respect to the vehicle placed on the arrangement surface of a work place in a vehicle factory or vehicle maintenance shop.
- the markers M11 to M14 are provided in order to check the joints of the images connected by the generation unit 10 shown in FIG. 2, that is, the portions indicated by broken lines in the display unit 4 shown in FIG. 5A.
- each of the markers M11 to M14 is a cross in which two lines of predetermined width and length intersect orthogonally, and each is provided at a position imaged by both of two cameras mounted at adjacent positions on the outer periphery of the vehicle 2.
- by operating the change receiving unit 6 so that the markers M11 to M14 form the predetermined cross shape at the joints of the images in the composite image, the operator can perform image calibration.
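The operator's goal at each seam, a joint marker forming a single clean cross, can be stated as a simple numeric check: the marker position seen through one camera and through the adjacent camera must coincide in composite coordinates. A hedged sketch, with marker detection assumed to happen elsewhere and all names hypothetical:

```python
def seam_aligned(marker_from_cam_a, marker_from_cam_b, tol=1.0):
    """A seam marker appears in both adjacent camera images; after
    projection into composite coordinates the two detections should
    coincide within a tolerance (in composite pixels)."""
    (xa, ya), (xb, yb) = marker_from_cam_a, marker_from_cam_b
    return abs(xa - xb) <= tol and abs(ya - yb) <= tol

def calibration_done(pairs, tol=1.0):
    """True when every joint marker (e.g. M11..M14) lines up."""
    return all(seam_aligned(a, b, tol) for a, b in pairs)
```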
- the operator operates the front camera selection acceptance button 5a, so that the front camera 3a is selected.
- when the front camera selection acceptance button 5a is selected, a display clearly indicating that the image displayed in the area 4a is in a changeable state appears.
- when the operator operates the button 6a3 to move the markers M11 and M12 leftward, as shown in FIG. 5C, a composite image in which the images of the markers M11 and M12 displayed in the area 4a are moved leftward by one preset unit is generated and displayed on the display unit 4.
- the numerical value indicating the operation amount displayed beside the button 6a3 is updated from "0" to "-1". This allows the operator to confirm the operation amount of the button 6a3 numerically.
- at this point, however, the markers M11 and M12 at the joints of the images still do not form the predetermined cross shape.
- the operator again operates the button 6a3 for changing the image composition area so that the markers M11 and M12 move leftward.
- a composite image in which the images of the markers M11 and M12 displayed in the region 4a are further moved leftward by a preset unit is generated and displayed on the display unit 4.
- the numerical value displayed beside the button 6a3 is updated from "-1" to "-2".
- when the operator then operates the button 6a1, a composite image is generated and displayed on the display unit 4 in which the images of the markers M11 and M12 displayed in the region 4a are further moved downward by a preset unit, so that the markers M11 and M12 at the joints of the images form the predetermined cross shape.
- the numerical value displayed beside the button 6a1 is updated from “0” to “ ⁇ 1”.
- in this way, the image processing apparatus 1 changes the composite image so that the images of the markers M11 to M14 displayed in the composite image form the predetermined cross shape at the image joints indicated by the broken lines on the display unit 4. As a result, the uncomfortable feeling caused by misalignment at the image joints in the composite image is eliminated.
- however, even when the composite image is generated and displayed so that the images of the markers M11 to M14 form the predetermined cross shape at the image joints indicated by the broken lines on the display unit 4, the relative positions of the markers M11 to M14 and the vehicle on the composite image may still be shifted from their actual relative positions.
- Such a problem may occur, for example, when the composite image is used to assist the driver's driving operation while the vehicle is moving backward.
- this point will be described with reference to FIG. 6.
- FIG. 6 is an overhead view illustrating a guide locus when the vehicle 2 is moved backward.
- a line 60 is a bird's-eye view image showing the predicted guide locus when the vehicle 2 moves backward to the position 2A, in the case where the relative position on the composite image of the markers M11 to M14 and the vehicle coincides with the actual relative position of the markers M11 to M14 and the vehicle.
- M21 to M26, serving as second markers, are provided at predetermined positions with respect to the vehicle placed on the arrangement surface of a work place in a vehicle factory or vehicle maintenance shop. The second markers M21 to M26 appear in the composite image displayed on the display unit 4, as shown in FIG. 5A.
- the marker M21 is provided in an area photographed by the front camera 3a, and the marker M22 is provided in an area photographed by the rear camera 3b.
- the marker M21 and the marker M22 are lines having a predetermined width and length provided so as to be parallel to each other with the vehicle 2 interposed therebetween.
- both the markers M23 and M24 are provided in an area photographed by the left side camera 3c, and both the markers M25 and M26 are provided in an area photographed by the right side camera 3d.
- Marker M23 and marker M25 are lines having a predetermined width and length provided so as to be parallel to each other with vehicle 2 interposed therebetween.
- the marker M24 and the marker M26 are lines having a predetermined width and length provided so as to be parallel to each other with the vehicle 2 interposed therebetween.
- the marker M21 is arranged on a straight line connecting the intersection of the two lines constituting the marker M11 and the intersection of the two lines constituting the marker M12. Further, the marker M22 is arranged on a straight line connecting the intersection of two lines constituting the marker M13 and the intersection of two lines constituting the marker M14.
- Markers M23 and M24 are arranged on a straight line connecting the intersection of two lines constituting the marker M11 and the intersection of two lines constituting the marker M13. Further, the markers M25 and M26 are arranged on a straight line connecting the intersection of two lines constituting the marker M12 and the intersection of two lines constituting the marker M14.
- frame-shaped displays S21 to S28 (hereinafter referred to as "fixed frames") are superimposed at predetermined fixed positions in the composite image.
- the fixed frames S21 to S26 overlap with the markers M21 to M26 displayed in the composite image when the relative positions in the composite image of the markers M21 to M26, which are provided at predetermined positions with respect to the vehicle 2, and the vehicle 2 coincide with the actual relative positions of the markers M21 to M26 and the vehicle 2.
- the fixed frames S27 and S28 overlap with the images of the markers M13 and M14 displayed in the areas 4c and 4d of the composite image when the relative positions in the regions 4c and 4d of the markers M13 and M14, which are provided at predetermined positions with respect to the vehicle 2, and the vehicle 2 coincide with the actual relative positions of the markers M13 and M14 and the vehicle 2.
- the absolute position of the markers with respect to the vehicle is displayed correctly by changing the composite image based on the fixed frames S21 to S28, which further increases the accuracy of the calibration work.
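The role of a fixed frame can be illustrated with a simple containment test: a marker drawn in the composite image is considered correctly placed once its image falls inside the frame superimposed at the known position. This is a minimal sketch with assumed names and rectangle conventions, not part of the disclosure.

```python
def marker_in_fixed_frame(marker_center, frame_rect):
    """Return True when the marker's center lies inside the fixed frame.

    frame_rect is (x, y, width, height) of a fixed frame such as S21,
    in composite-image pixel coordinates (illustrative convention).
    """
    x, y, w, h = frame_rect
    mx, my = marker_center
    return x <= mx <= x + w and y <= my <= y + h
```

In a real calibration flow this check would be repeated after each adjustment until every marker image overlaps its frame.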
- FIG. 7 is a flowchart illustrating processing executed by the control unit 15 of the image processing apparatus 1 according to the embodiment.
- the control unit 15 first acquires camera images captured by the plurality of cameras 3a, 3b, 3c, and 3d mounted on the vehicle 2 (step S101). Subsequently, the control unit 15 generates a composite image showing the state of the surroundings of the vehicle 2 by performing image processing that joins the images included in the regions adopted from the acquired camera images, and displays the composite image on the display unit 4 (step S102). At this time, a fixed frame that overlaps the marker in the composite image when the relative position in the composite image of a marker, provided at a predetermined position with respect to the vehicle 2, and the vehicle 2 becomes the actual relative position of the marker and the vehicle 2 may be superimposed at a predetermined fixed position in the composite image.
- control unit 15 determines whether or not the selection receiving unit 5 has been operated (step S103). When it is determined that the selection receiving unit 5 has been operated (step S103, Yes), the process proceeds to step S104. When it is determined that the selection receiving unit 5 has not been operated (step S103, No), the process proceeds to step S107.
- the control unit 15 further selects the camera 3 based on the operation of the selection receiving unit 5 in step S103 (step S104), and proceeds to step S105.
- control unit 15 determines whether or not the change receiving unit 6 has been operated (step S105). When it is determined that the change receiving unit 6 has been operated (step S105, Yes), the process proceeds to step S106. When it is determined that the change receiving unit 6 has not been operated (step S105, No), the process proceeds to step S107.
- the control unit 15 changes, via the display change unit 13, the composite image generated by the generation unit 10 and displayed on the display unit 4 each time the change receiving unit 6 is operated (step S106), and then returns to step S102.
- the control unit 15 determines whether or not the enter button 19 has been operated (step S107). If it is determined that the enter button 19 has not been operated (step S107, No), the process returns to step S103. If it is determined that the enter button 19 has been operated (step S107, Yes), the composite image finally displayed on the display unit 4 is confirmed.
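The flow of steps S101 to S107 can be sketched as an event loop over user operations. The sketch below reduces the composite image to a single numeric offset and the UI to a list of events; every name here is an illustrative stand-in, not part of the disclosure.

```python
def run_flowchart(events, composite=0):
    """events: list of ('select', camera), ('change', delta) or ('enter',).

    'composite' stands in for the generated composite image (step S102);
    a ('change', delta) event regenerates it (step S106), and each
    regenerated image is shown again, modelled by appending to 'history'
    (the return to step S102).
    """
    selected = None
    history = [composite]                  # initial display (step S102)
    for ev in events:
        if ev[0] == 'select':              # steps S103-S104: pick a camera
            selected = ev[1]
        elif ev[0] == 'change' and selected is not None:
            composite = composite + ev[1]  # step S106: change composite
            history.append(composite)      # redisplayed (step S102)
        elif ev[0] == 'enter':             # step S107: confirm and stop
            break
    return composite, history
```

A change event is ignored until a camera has been selected, mirroring the No branch of step S103 in the flowchart.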
- as described above, the image processing apparatus generates a composite image of the vehicle viewed from a virtual viewpoint based on a plurality of in-vehicle camera images and displays the generated composite image, so that the calibration work of the in-vehicle camera can be easily executed.
- FIG. 8A is a plan view of the display operation unit 8 according to the embodiment
- FIG. 8B is a plan view of the display operation unit 8 in which the display unit 4 of FIG. 8A is enlarged.
- as shown in FIG. 8A, when a position in the vicinity of the marker M13 in the composite image displayed on the display unit 4 is selected, a part of the composite image including that position is enlarged and displayed on the display unit 4, as shown in FIG. 8B.
- with this display enlargement operation, it becomes easy to confirm whether or not the change receiving unit 6 has been operated properly, so that the in-vehicle camera calibration work can be executed easily and reliably.
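The enlargement behaviour can be sketched as choosing a crop window around the selected point and scaling that window up to the display. The function below computes such a window with integer arithmetic and clamps it to the image bounds; the names and the 2x zoom default are assumptions for illustration, not part of the disclosure.

```python
def enlarge_region(img_w, img_h, sel_x, sel_y, zoom=2):
    """Return the (x, y, w, h) crop of the composite image that will be
    scaled up to fill the display when (sel_x, sel_y) is selected."""
    w, h = img_w // zoom, img_h // zoom
    # center the crop on the selected point, clamped inside the image
    x = min(max(sel_x - w // 2, 0), img_w - w)
    y = min(max(sel_y - h // 2, 0), img_h - h)
    return x, y, w, h
```

Selecting a point near an image edge still yields a full-size crop, because the window is shifted inward rather than shrunk.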
- the display operation unit 8 has been described as a touch panel. However, only the display unit 4 may be shown on a display, while the buttons constituting the selection receiving unit 5 and the change receiving unit 6, and the display enlargement unit 7, are provided as physical buttons operated by pressing.
- FIG. 9 is a plan view of the display unit according to the embodiment.
- the composite image generated based on the camera image captured only by the rear camera 3b is a bird's-eye view image of only the rear of the vehicle 2, rather than the composite image overlooking the entire periphery of the vehicle 2 as in each of the above-described embodiments.
- the image processing apparatus 1 generates a composite image in which the vehicle 2 is viewed from a virtual viewpoint based on a single in-vehicle camera image, and causes the display unit 4 to display the generated composite image. It then accepts a change in the relative positional relationship between the fixed frame S22 displayed at a predetermined position of the composite image and the marker M22 appearing in the composite image, and each time a change in the positional relationship is accepted, generates a composite image based on the changed positional relationship.
- the single camera 3 has been described as the rear camera 3b, but any one of the front camera 3a, the left side camera 3c, and the right side camera 3d may be used.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Abstract
Description
3 Camera
4 Display unit
5 Selection receiving unit
6 Change receiving unit
7 Display enlargement unit
8 Display operation unit
9 Image acquisition unit
10 Generation unit
11 Display control unit
13 Display change unit
14 Fixed frame superimposing unit
15 Control unit
20 Storage unit
Claims (7)
- An image processing apparatus for processing images, comprising: a generation unit that generates a composite image of a vehicle viewed from a virtual viewpoint based on a plurality of in-vehicle camera images; a display control unit that causes a display unit to display the generated composite image; and a change receiving unit that receives a change in the relative positional relationship between an image region based on one camera image and an image region based on another camera image in the composite image, wherein the generation unit generates the composite image based on the changed positional relationship each time a change in the positional relationship is received.
- The image processing apparatus according to claim 1, further comprising a display superimposing unit that superimposes, at a predetermined fixed position in the composite image, a display that overlaps the marker in the composite image when the relative position in the composite image of a marker provided at a predetermined position with respect to the vehicle and the vehicle becomes the actual relative position of the marker and the vehicle.
- The image processing apparatus according to claim 1 or 2, further comprising a display enlargement unit that, when an arbitrary position in the composite image displayed on the display unit is selected, enlarges a part of the composite image including that position and causes the display unit to display it.
- An image processing method comprising: a step of generating a composite image of a vehicle viewed from a virtual viewpoint based on a plurality of in-vehicle camera images; a step of displaying the generated composite image; a step of receiving a change in the relative positional relationship between an image region based on one camera image and an image region based on another camera image in the composite image; and a step of generating the composite image based on the changed positional relationship each time a change in the positional relationship is received.
- The image processing method according to claim 4, wherein a first marker of a predetermined shape is provided on the arrangement surface on which the vehicle is placed, at a position imaged by both of two in-vehicle cameras provided at adjacent positions on the outer periphery of the vehicle, the method further comprising a step of changing the relative positional relationship so that the first marker in the composite image takes the predetermined shape, and generating the composite image.
- The image processing method according to claim 4 or 5, wherein a second marker is provided on the arrangement surface on which the vehicle is placed, at a predetermined position with respect to the vehicle, the method further comprising: a step of superimposing, at a predetermined fixed position in the composite image, a display that overlaps the marker in the composite image when the relative position in the composite image of the second marker and the vehicle becomes the actual relative position of the second marker and the vehicle; and a step of changing the composite image so that the second marker and the superimposed display overlap in the composite image.
- An image processing apparatus for processing images, comprising: a generation unit that generates a composite image of a vehicle viewed from a virtual viewpoint based on a single in-vehicle camera image; a display control unit that causes a display unit to display the generated composite image; and a change receiving unit that receives a change in the relative positional relationship between a frame-shaped display displayed at a predetermined position in the composite image and a marker appearing in the composite image, wherein the generation unit generates the composite image based on the changed positional relationship each time a change in the positional relationship is received.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380071607.2A CN104956668B (zh) | 2013-01-30 | 2013-11-05 | 图像处理装置以及图像处理方法 |
US14/655,918 US10328866B2 (en) | 2013-01-30 | 2013-11-05 | Image processing apparatus and image processing method for generating synthetic image and changing synthetic image |
DE112013006544.4T DE112013006544T8 (de) | 2013-01-30 | 2013-11-05 | Bildverarbeitungsvorrichtung und Bildverarbeitungsverfahren |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013016193A JP6223685B2 (ja) | 2013-01-30 | 2013-01-30 | 画像処理装置および画像処理方法 |
JP2013-016193 | 2013-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014119081A1 true WO2014119081A1 (ja) | 2014-08-07 |
Family
ID=51261813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/079934 WO2014119081A1 (ja) | 2013-01-30 | 2013-11-05 | 画像処理装置および画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10328866B2 (ja) |
JP (1) | JP6223685B2 (ja) |
CN (1) | CN104956668B (ja) |
DE (1) | DE112013006544T8 (ja) |
WO (1) | WO2014119081A1 (ja) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9723272B2 (en) * | 2012-10-05 | 2017-08-01 | Magna Electronics Inc. | Multi-camera image stitching calibration system |
KR101610508B1 (ko) * | 2014-09-19 | 2016-04-20 | 현대자동차주식회사 | Avm 자동 보정 시스템 및 방법 |
JP6601101B2 (ja) * | 2015-09-29 | 2019-11-06 | アイシン精機株式会社 | 表示制御装置 |
US10176554B2 (en) * | 2015-10-05 | 2019-01-08 | Google Llc | Camera calibration using synthetic images |
JP6323729B2 (ja) * | 2016-04-25 | 2018-05-16 | パナソニックIpマネジメント株式会社 | 画像処理装置及びこれを備えた撮像システムならびにキャリブレーション方法 |
JP6068710B1 (ja) * | 2016-05-30 | 2017-01-25 | 株式会社ネクスコ・エンジニアリング北海道 | 俯瞰画像調整装置および俯瞰画像調整プログラム |
US10331125B2 (en) * | 2017-06-06 | 2019-06-25 | Ford Global Technologies, Llc | Determination of vehicle view based on relative location |
US10466027B2 (en) | 2017-06-21 | 2019-11-05 | Fujitsu Ten Corp. Of America | System and method for marker placement |
JP7032950B2 (ja) | 2018-02-19 | 2022-03-09 | 株式会社デンソーテン | 車両遠隔操作装置、車両遠隔操作システム及び車両遠隔操作方法 |
JP7060418B2 (ja) | 2018-03-15 | 2022-04-26 | 株式会社デンソーテン | 車両遠隔操作装置及び車両遠隔操作方法 |
EP3644279A1 (en) * | 2018-10-25 | 2020-04-29 | Continental Automotive GmbH | Static camera calibration using motion of vehicle portion |
US10421401B1 (en) * | 2019-05-02 | 2019-09-24 | Richard C. Horian | Video system for vehicles |
JP7055158B2 (ja) * | 2020-02-13 | 2022-04-15 | 株式会社クボタ | 作業車 |
EP4195150A4 (en) * | 2020-09-21 | 2024-06-26 | Siemens Ltd., China | TARGET POSITIONING METHOD AND APPARATUS AND COMPUTER-READABLE MEDIUM |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010200054A (ja) * | 2009-02-26 | 2010-09-09 | Alpine Electronics Inc | 複数カメラ撮影画像合成システム用カメラ調整方法及び装置 |
JP2010239409A (ja) * | 2009-03-31 | 2010-10-21 | Aisin Seiki Co Ltd | 車載カメラの校正装置 |
JP2010244326A (ja) * | 2009-04-07 | 2010-10-28 | Alpine Electronics Inc | 車載周辺画像表示装置 |
JP2011151666A (ja) * | 2010-01-22 | 2011-08-04 | Fujitsu Ten Ltd | パラメータ取得装置、パラメータ取得システム、パラメータ取得方法、及び、プログラム |
JP2012124610A (ja) * | 2010-12-06 | 2012-06-28 | Fujitsu Ten Ltd | 画像表示システム、画像処理装置及び画像表示方法 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1227683B1 (en) * | 1999-10-12 | 2006-07-12 | Matsushita Electric Industrial Co., Ltd. | Monitor camera, method of adjusting camera, and vehicle monitor system |
JP2006287826A (ja) * | 2005-04-05 | 2006-10-19 | Nissan Motor Co Ltd | 車両用画像生成装置および方法 |
TW200829466A (en) * | 2007-01-03 | 2008-07-16 | Delta Electronics Inc | Advanced bird view visual system |
JP4548322B2 (ja) * | 2005-11-28 | 2010-09-22 | 株式会社デンソー | 駐車支援システム |
KR101143176B1 (ko) * | 2006-09-14 | 2012-05-08 | 주식회사 만도 | 조감도를 이용한 주차구획 인식 방법, 장치 및 그를 이용한주차 보조 시스템 |
CN201008471Y (zh) * | 2007-01-26 | 2008-01-23 | 明门实业股份有限公司 | 婴儿摇椅及其驱动装置 |
JP2008187564A (ja) * | 2007-01-31 | 2008-08-14 | Sanyo Electric Co Ltd | カメラ校正装置及び方法並びに車両 |
JP4286294B2 (ja) | 2007-02-21 | 2009-06-24 | 三洋電機株式会社 | 運転支援システム |
JP5112998B2 (ja) * | 2008-09-16 | 2013-01-09 | 本田技研工業株式会社 | 車両周囲監視装置 |
DE102009050368A1 (de) * | 2008-10-24 | 2010-05-27 | Magna Electronics Europe Gmbh & Co.Kg | Verfahren zum automatischen Kalibrieren einer virtuellen Kamera |
KR100966288B1 (ko) * | 2009-01-06 | 2010-06-28 | 주식회사 이미지넥스트 | 주변 영상 생성 방법 및 장치 |
JP2010215029A (ja) * | 2009-03-13 | 2010-09-30 | Toyota Industries Corp | 駐車支援装置 |
EP2485203B1 (en) * | 2009-09-30 | 2020-02-19 | Panasonic Intellectual Property Management Co., Ltd. | Vehicle-surroundings monitoring device |
DE102010034139A1 (de) | 2010-08-12 | 2012-02-16 | Valeo Schalter Und Sensoren Gmbh | Verfahren zur Unterstützung eines Parkvorgangs eines Kraftfahrzeugs, Fahrerassistenzsystem und Kraftfahrzeug |
US10793067B2 (en) * | 2011-07-26 | 2020-10-06 | Magna Electronics Inc. | Imaging system for vehicle |
TWI478833B (zh) * | 2011-08-31 | 2015-04-01 | Autoequips Tech Co Ltd | 調校車用影像裝置之方法及其系統 |
US9491451B2 (en) * | 2011-11-15 | 2016-11-08 | Magna Electronics Inc. | Calibration system and method for vehicular surround vision system |
-
2013
- 2013-01-30 JP JP2013016193A patent/JP6223685B2/ja active Active
- 2013-11-05 WO PCT/JP2013/079934 patent/WO2014119081A1/ja active Application Filing
- 2013-11-05 CN CN201380071607.2A patent/CN104956668B/zh active Active
- 2013-11-05 DE DE112013006544.4T patent/DE112013006544T8/de active Active
- 2013-11-05 US US14/655,918 patent/US10328866B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010200054A (ja) * | 2009-02-26 | 2010-09-09 | Alpine Electronics Inc | 複数カメラ撮影画像合成システム用カメラ調整方法及び装置 |
JP2010239409A (ja) * | 2009-03-31 | 2010-10-21 | Aisin Seiki Co Ltd | 車載カメラの校正装置 |
JP2010244326A (ja) * | 2009-04-07 | 2010-10-28 | Alpine Electronics Inc | 車載周辺画像表示装置 |
JP2011151666A (ja) * | 2010-01-22 | 2011-08-04 | Fujitsu Ten Ltd | パラメータ取得装置、パラメータ取得システム、パラメータ取得方法、及び、プログラム |
JP2012124610A (ja) * | 2010-12-06 | 2012-06-28 | Fujitsu Ten Ltd | 画像表示システム、画像処理装置及び画像表示方法 |
Also Published As
Publication number | Publication date |
---|---|
CN104956668A (zh) | 2015-09-30 |
JP6223685B2 (ja) | 2017-11-01 |
US10328866B2 (en) | 2019-06-25 |
DE112013006544T5 (de) | 2015-11-05 |
DE112013006544T8 (de) | 2015-11-19 |
CN104956668B (zh) | 2019-11-15 |
US20150356735A1 (en) | 2015-12-10 |
JP2014147057A (ja) | 2014-08-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6223685B2 (ja) | 画像処理装置および画像処理方法 | |
JP5316805B2 (ja) | 車載カメラ装置の画像調整装置及び車載カメラ装置 | |
JP4975592B2 (ja) | 撮像装置 | |
JP3632563B2 (ja) | 映像位置関係補正装置、該映像位置関係補正装置を備えた操舵支援装置、及び映像位置関係補正方法 | |
JP4196841B2 (ja) | 映像位置関係補正装置、該映像位置関係補正装置を備えた操舵支援装置、及び映像位置関係補正方法 | |
JP4863922B2 (ja) | 運転支援システム並びに車両 | |
EP2818363B1 (en) | Camera device, camera system, and camera calibration method | |
JP6099333B2 (ja) | 画像生成装置、画像表示システム、パラメータ取得装置、画像生成方法及びパラメータ取得方法 | |
JP4758481B2 (ja) | 車両用画像処理装置及び車両用画像処理プログラム | |
US20140114534A1 (en) | Dynamic rearview mirror display features | |
JP5044204B2 (ja) | 運転支援装置 | |
EP3002727B1 (en) | Periphery monitoring apparatus and periphery monitoring system | |
JP2011182236A (ja) | カメラキャリブレーション装置 | |
JP2008187564A (ja) | カメラ校正装置及び方法並びに車両 | |
WO2013038681A1 (ja) | カメラ較正装置、カメラ、及びカメラ較正方法 | |
JP2013054720A (ja) | 運転支援装置 | |
JP4286294B2 (ja) | 運転支援システム | |
JP5020621B2 (ja) | 運転支援装置 | |
JP2008307981A (ja) | 車両用運転支援装置 | |
JP2009083744A (ja) | 合成画像調整装置 | |
JP2013062692A (ja) | 車載カメラの較正装置及び方法 | |
JP5827095B2 (ja) | キャリブレーションシステム、パラメータ取得装置、標識体、及び、パラメータ取得方法 | |
JP2009123131A (ja) | 撮像装置 | |
WO2010007960A1 (ja) | 車載用カメラの視点変換映像システム及び視点変換映像取得方法 | |
JP2013207622A (ja) | キャリブレーション装置及びキャリブレーション方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13873314 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14655918 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130065444 Country of ref document: DE Ref document number: 112013006544 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13873314 Country of ref document: EP Kind code of ref document: A1 |