WO2015122389A1 - Imaging Device, Vehicle, and Image Correction Method - Google Patents

Imaging Device, Vehicle, and Image Correction Method

Info

Publication number
WO2015122389A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, marker, adjustment, unit, shift amount
Application number: PCT/JP2015/053563
Other languages: English (en), French (fr), Japanese (ja)
Inventor: Naoki Namba (難波 直樹)
Original Assignee: Yamaha Motor Co., Ltd. (ヤマハ発動機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Motor Co., Ltd. (ヤマハ発動機株式会社)
Priority to KR1020157036534A priority Critical patent/KR101812530B1/ko
Priority to JP2015532236A priority patent/JP6161704B2/ja
Publication of WO2015122389A1 publication Critical patent/WO2015122389A1/ja

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • G01C3/08Use of electric radiation detectors
    • G01C3/085Use of electric radiation detectors with electronic parallax measurement

Definitions

  • the present invention relates to a photographing apparatus, a vehicle, and an image correction method.
  • Stereo camera has multiple cameras. Each camera is installed in a predetermined posture at a predetermined position. Each camera performs shooting by stereo vision. An image obtained from each camera is processed by an image processing unit. For example, the image processing unit calculates information in the depth direction of the image using the internal parameters and the external parameters of the camera. Camera external parameters are defined by the position and orientation of the camera.
  • Hereinafter, deviation of the position and orientation of the camera is referred to as "positional deviation" as appropriate.
  • Positional deviation occurs due to various causes. For example, when the camera receives an impact or vibration, a positional shift may occur.
  • When positional deviation occurs, the actual position and the actual posture of the camera deviate from the position and the posture defined by the external parameters. As a result, the error included in the information in the depth direction increases, and the accuracy of the information in the depth direction decreases.
  • To deal with this, the operator of the stereo camera may manually adjust the camera position and orientation. According to this method, the camera can be returned to the correct position and the correct posture. However, this takes the operator time and effort.
  • In another conventional method, the camera captures a test pattern such as a grid chart.
  • the image processing unit adjusts correction parameters for correcting the image based on the obtained image. Specifically, the image processing unit adjusts correction parameters for translating the image in the vertical direction and correction parameters for rotating the image.
  • the image processing unit corrects the image using the adjusted correction parameter (see, for example, Patent Documents 1 and 2). In this method, it is not necessary to adjust the position and orientation of the camera, so that the burden on the operator can be reduced.
  • However, the conventional example having such a configuration has the following problems.
  • the conventional example does not adjust the correction parameter for translating the image in the horizontal direction. Therefore, even if the image is corrected, the error due to the positional deviation cannot be sufficiently reduced.
  • Patent Document 1 discloses only a method for adjusting a correction parameter, and does not disclose any timing for adjusting the correction parameter.
  • Patent Document 2 discloses that correction parameters are adjusted before an automobile or the like equipped with a stereo camera enters actual operation. Here, before entering actual operation is, for example, when the vehicle is stationary.
  • The present invention has been made in view of such circumstances, and an object of the present invention is to provide an imaging device, a vehicle, and an image correction method capable of suitably adjusting a correction parameter for translating an image in the horizontal direction. Another object of the present invention is to provide a vehicle and an image correction method capable of adjusting correction parameters for correcting an image while a vehicle equipped with image sensors is actually operating.
  • The present invention has the following configuration. That is, the present invention is an imaging apparatus comprising: a pair of image sensors; an image correction unit that translates an image captured by at least one of the image sensors in the lateral direction of the image by a lateral shift amount; and a lateral shift amount adjustment unit that adjusts the lateral shift amount based on adjustment images, the adjustment images being a pair of images in which each image sensor captures a first marker and a second marker from a position at which the distance from the image sensor to the first marker in the optical axis direction is substantially equal to the distance from the image sensor to the second marker in the optical axis direction.
  • The lateral shift amount adjustment unit includes: a measurement unit that measures the parallax relating to at least one of the first marker and the second marker projected in both of the adjustment images and obtains a parallax measurement value; a theoretical value calculation unit that calculates a theoretical value of the parallax based on either one of the adjustment images; a lateral adjustment amount calculation unit that calculates a lateral adjustment amount based on the parallax measurement value and the parallax theoretical value; and a lateral shift amount change unit that changes the lateral shift amount using the lateral adjustment amount.
  • a pair of image sensors constitutes a stereo camera. Each image sensor performs imaging by stereo vision.
  • the image correction unit corrects the image using the horizontal shift amount.
  • the lateral shift amount is one of correction parameters.
  • the image correction unit may correct an image obtained from each image sensor, or may correct only an image obtained from any one of the image sensors.
  • the horizontal shift amount adjustment unit adjusts the horizontal shift amount based on the adjustment image obtained from each image sensor.
  • the adjustment image is a pair of images in which each image sensor images the first marker and the second marker.
  • the adjustment image is captured by each image sensor when the distance in the optical axis direction between the image sensor and the first marker is substantially equal to the distance in the optical axis direction between the image sensor and the second marker. It is an image.
  • substantially equal means that each distance is strictly equal and that each distance is approximated to such an extent that the lateral shift amount can be adjusted.
  • the first marker and the second marker are projected on the adjustment image.
  • the information in the depth direction of the first marker projected on the adjustment image is substantially equal to the information in the depth direction of the second marker projected on the adjustment image.
  • the horizontal shift amount adjustment unit includes a measurement unit, a theoretical value calculation unit, a horizontal direction adjustment amount calculation unit, and a horizontal shift amount change unit.
  • the measurement unit obtains a “parallax measurement value” based on the two adjustment images.
  • the theoretical value calculation unit obtains a “theoretical value of parallax” based on one adjustment image.
  • the horizontal direction adjustment amount calculation unit calculates the horizontal direction adjustment amount based on the “measurement value of parallax” and the “theoretical value of parallax”.
  • the horizontal shift amount changing unit changes the horizontal shift amount using the horizontal adjustment amount.
  • Since the lateral shift amount adjustment unit is provided, the lateral shift amount can be suitably adjusted.
  • the image correction unit can appropriately correct the image. That is, the parallax between the two images obtained from each image sensor can be suitably adjusted. Therefore, even if the position and orientation of the image sensor are deviated, the accuracy of the imaging device and the reliability of the imaging device can be suitably maintained.
  • Preferably, the theoretical value calculation unit calculates the distance in the optical axis direction between the image sensor and the first and second markers based on the interval between the first marker and the second marker, the projection point of the first marker on the adjustment image, and the projection point of the second marker on the adjustment image, and calculates the theoretical value of the parallax based on that distance.
  • According to this, the theoretical value calculation unit can calculate the theoretical value of parallax with high accuracy.
  • As a result, the horizontal adjustment amount calculation unit can appropriately calculate the horizontal adjustment amount.
  • the lateral adjustment amount calculation unit determines the lateral adjustment amount so that a difference between the parallax measurement value and the parallax theoretical value is small. According to this, the actual parallax between two images obtained from each image sensor can be brought close to the theoretical value of parallax. As a result, the accuracy of information in the depth direction of the image can be preferably improved.
  • When at least one of the adjustment images is an image that has been processed by the image correction unit, it is preferable that the lateral shift amount change unit adds or subtracts the lateral adjustment amount to or from the lateral shift amount used by the image correction unit. That is, the lateral shift amount change unit sets the value obtained by adding or subtracting the lateral adjustment amount to or from that lateral shift amount as the new lateral shift amount.
  • the lateral shift amount can be appropriately adjusted according to the present invention.
  • When both of the adjustment images are images that have not been processed by the image correction unit, it is preferable that the lateral shift amount change unit sets the lateral adjustment amount as the new lateral shift amount.
  • the lateral shift amount can be appropriately adjusted according to the present invention.
  • It is preferable to provide an adjustment image specifying unit that determines, based on the projection point of the first marker and the projection point of the second marker on the image, whether the images obtained from the image sensors are adjustment images. According to this, the adjustment images can be suitably identified.
  • the first marker and the second marker are arranged at substantially the same height position. According to this, the process by the theoretical value calculation part can be simplified. Further, the processing by the adjustment image specifying unit can be simplified.
  • Preferably, the image correction unit further translates the image captured by at least one of the image sensors in the vertical direction of the image by a vertical shift amount, and the imaging device further includes a vertical shift amount adjustment unit that adjusts the vertical shift amount based on the adjustment images.
  • the vertical shift amount is one of the correction parameters. Since the photographing apparatus includes the vertical shift amount adjustment unit, the vertical shift amount can be suitably adjusted. Therefore, the image correction unit can correct the image more appropriately.
  • Preferably, the image correction unit further rotates the image captured by at least one of the image sensors by a rotation amount, and the imaging device further includes a rotation amount adjustment unit that adjusts the rotation amount based on the adjustment images.
  • the amount of rotation is one of the correction parameters. Since the photographing apparatus includes the rotation amount adjustment unit, the rotation amount can be adjusted appropriately. Therefore, the image correction unit can correct the image more appropriately.
  • the present invention is a vehicle equipped with the above-described photographing device.
  • the image sensor captures the first marker and the second marker, respectively, when the vehicle travels on a road. According to this, the lateral shift amount can be adjusted when the vehicle is actually operating. Therefore, it is possible to maintain the accuracy of the photographing apparatus without reducing the operation rate of the vehicle.
  • the runway is determined in advance and the vehicle travels autonomously on the runway. Since the vehicle travels on a predetermined road, the image sensor can appropriately capture the first marker and the second marker. Thereby, the theoretical value calculation unit can calculate the “theoretical value of parallax” with higher accuracy.
  • Further, the present invention is an image correction method including: a process in which a plurality of image sensors mounted on a vehicle capture a marker group installed outside the vehicle while the vehicle is traveling; a process of correcting an image captured by at least one of the image sensors; and a process of adjusting, based on adjustment images, a correction parameter used in the process of correcting the image, the adjustment images being the images obtained from the image sensors onto which the marker group is projected.
  • The correction parameter is a lateral shift amount.
  • In the process of correcting the image, the image is translated in the lateral direction of the image by the lateral shift amount. The marker group includes a first marker and a second marker arranged at an interval from each other.
  • When the capturing process is performed, the distance in the optical axis direction between the image sensor and the first marker is substantially equal to the distance in the optical axis direction between the image sensor and the second marker. The process of adjusting the correction parameter includes: a process of measuring, based on both of the adjustment images, the parallax relating to at least one of the first marker and the second marker to obtain a parallax measurement value; a process of calculating a theoretical value of the parallax based on the interval between the first marker and the second marker and either one of the adjustment images; a process of calculating a lateral adjustment amount based on the parallax measurement value and the parallax theoretical value; and a process of changing the lateral shift amount using the lateral adjustment amount.
  • each image sensor photographs the first marker and the second marker at a position separated from the first marker and the second marker by a substantially equal distance in the optical axis direction.
  • substantially equal means that each distance is exactly the same and that each distance is approximated to such an extent that the lateral shift amount can be adjusted.
  • the image is translated in the horizontal direction using the horizontal shift amount.
  • the horizontal shift amount is adjusted based on the adjustment image.
  • The process of adjusting the correction parameter further includes a process of acquiring a parallax measurement value, a process of calculating a parallax theoretical value, a process of calculating a lateral adjustment amount, and a process of changing a lateral shift amount. For this reason, the process of adjusting the correction parameter can appropriately adjust the lateral shift amount.
  • the process of correcting the image can appropriately correct the image.
  • the parallax between the images obtained from the image sensors can be suitably adjusted. Therefore, even if the position and orientation of the image sensor are shifted, information in the depth direction of the image can be obtained with high accuracy.
  • the present specification also discloses the invention relating to the following detection device and vehicle.
  • the first marker and the second marker are arranged at a known interval.
  • the theoretical value calculation unit can suitably calculate the theoretical value of parallax.
  • each of the image sensors is arranged so that the optical axes of the image sensors are parallel to each other, and the horizontal axis of an image obtained from each image sensor is coaxial. Preferably it is.
  • a stereo camera can be suitably configured by each image sensor.
  • Preferably, the theoretical value calculation unit calculates the distance in the optical axis direction between the image sensor and the first and second markers based on the interval between the first marker and the second marker and on the interior angles of a triangle whose vertices are the optical center associated with either one of the adjustment images, the projection point of the first marker on that adjustment image, and the projection point of the second marker on that adjustment image, and calculates the theoretical value of the parallax based on that distance.
  • the theoretical value calculator can calculate the theoretical value of parallax with high accuracy.
  • the horizontal adjustment amount calculation unit can appropriately calculate the horizontal adjustment amount.
  • Alternatively, it is preferable that the theoretical value calculation unit calculates the distance in the optical axis direction between the image sensor and the first and second markers based on the interval between the first marker and the second marker, the interior angles of a triangle whose vertices are the optical center associated with either one of the adjustment images, the projection point of the first marker on that adjustment image, and the projection point of the second marker on that adjustment image, and the positional relationship between the image sensor, the first marker, and the second marker, and calculates the theoretical value of the parallax based on that distance.
  • the theoretical value calculator can calculate the theoretical value of parallax with high accuracy.
  • the horizontal adjustment amount calculation unit can appropriately calculate the horizontal adjustment amount.
  • As described above, the correction parameter for translating the image in the horizontal direction can be suitably adjusted, so that the reliability of the imaging device can be improved. Further, the correction parameters for correcting the image can be adjusted while the vehicle equipped with the image sensors is actually operating, so that a decrease in the operating rate of the vehicle can be prevented.
  • FIG. 1 is a front view of a vehicle according to an embodiment. FIG. 2 is a plan view of the vehicle and a runway. FIG. 3 is a side view of the vehicle and markers. FIG. 4 is a front view of a marker. FIG. 5 is a block diagram showing the configuration of an imaging device according to the embodiment.
  • FIGS. 6A and 6B are diagrams schematically illustrating images.
  • FIGS. 7A and 7B are diagrams schematically illustrating projection points in an image. FIG. 8 is an explanatory diagram of a method of calculating the distance in the optical axis direction between a camera and a marker. FIG. 9 is a flowchart showing an operation example of the imaging device. FIG. 10 is an explanatory diagram of a modified method of calculating the distance in the optical axis direction between a camera and a marker. FIG. 11 is a block diagram showing the configuration of an imaging device according to a modified embodiment.
  • FIG. 1 is a front view of a vehicle 1 according to an embodiment
  • FIG. 2 is a plan view of the vehicle 1 and a runway T.
  • the vehicle 1 is a golf cart that travels in a golf course.
  • a runway T is laid in advance in the golf course.
  • the vehicle 1 travels autonomously on the track T.
  • the vehicle 1 travels on the track T repeatedly.
  • Traveling of the vehicle 1 on the runway T corresponds to the vehicle 1 being in actual operation.
  • The direction in which the vehicle 1 travels is referred to as the front direction Zf, and the direction opposite to the front direction is referred to as the rear direction Zb.
  • The user rides the vehicle 1 in a posture facing the front direction Zf.
  • "Right", "left", "upper", and "lower" mean "right", "left", "upper", and "lower" for the user riding the vehicle 1, respectively.
  • The front direction Zf and the rear direction Zb are collectively referred to as the front-rear direction Z, and the left-right direction is referred to as the lateral direction X.
  • The front-rear direction Z, the lateral direction X, and the vertical direction Y are orthogonal to each other.
  • the vehicle 1 includes a vehicle body 3 and a pair of cameras 11R and 11L.
  • the cameras 11R and 11L constitute a stereo camera.
  • Each camera 11R, 11L is fixed to the front surface of the vehicle body 3.
  • the arrangement and posture of the cameras 11R and 11L are so-called parallel stereo. That is, the optical axis AR of the camera 11R and the optical axis AL of the left camera 11L are parallel to each other. In the present embodiment, the optical axes AR and AL are parallel to the front direction Zf.
  • the cameras 11R and 11L are arranged in the horizontal direction X with a predetermined distance (baseline length) therebetween.
  • the camera 11R is disposed on the right side of the camera 11L.
  • the cameras 11R and 11L are appropriately referred to as “right camera 11R” and “left camera 11L”.
  • the cameras 11R and 11L perform stereo shooting. Each camera 11R, 11L can simultaneously photograph the same subject.
  • the subject is, for example, the ground, a runway, a tree, or an obstacle.
  • Each camera 11R, 11L is, for example, a visible light camera.
  • Each camera 11R, 11L is realized by a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor.
  • the cameras 11R and 11L are examples of image sensors in the present invention.
  • a marker group MG is installed near the runway T.
  • the marker group MG includes two markers M1 and M2.
  • the markers M1 and M2 are arranged so as to sandwich the runway T, respectively.
  • the distance LM between the marker M1 and the marker M2 is known.
  • distance LM is appropriately described as “interval LM”.
  • Reference is made to FIG. 2 and FIG. 3. FIG. 3 is a side view of the vehicle 1 and the markers M1 and M2.
  • In a section Tab of the runway T, each of the cameras 11R and 11L can capture both of the markers M1 and M2.
  • Hereinafter, the section Tab is referred to as the "marker group photographing section Tab" as appropriate.
  • While the vehicle 1 travels in the marker group photographing section Tab, the direction of the optical axes AR/AL is substantially constant. As shown in FIG. 3, in a side view, the optical axes AR/AL are substantially parallel to the road surface in the marker group photographing section Tab.
  • one plane substantially perpendicular to the optical axis AR / AL when the vehicle 1 is in the section Tab is referred to as a “plane P”.
  • the markers M1 and M2 are arranged so as to be positioned on the plane P, respectively.
  • the distance D1R in the optical axis AR direction between the right camera 11R and the marker M1 and the distance D2R in the optical axis AR direction between the right camera 11R and the marker M2 are substantially equal.
  • the distance D1L in the optical axis AL direction between the left camera 11L and the marker M1 and the distance D2L in the optical axis AL direction between the left camera 11L and the marker M2 are substantially equal.
  • Furthermore, the distances D1R and D2R and the distances D1L and D2L are substantially equal to one another. Hereinafter, this common distance is referred to as "distance D".
  • The distance D corresponds to the distance between the camera 11R/11L and the plane P.
  • Note that the "distance away in the optical axis direction AR/AL" and the "distance in the optical axis direction AR/AL" both mean the distance of the component in the optical axis direction AR/AL.
  • FIG. 4 is a front view of the markers M1 and M2.
  • the markers M1 and M2 are respectively displayed on the front surface of the marker support tool 15.
  • the marker M1 and the marker M2 are arranged at substantially the same height position.
  • the markers M1 and M2 are examples of the first marker and the second marker in the present invention.
  • FIG. 5 is a block diagram illustrating a configuration of the photographing apparatus.
  • the vehicle 1 includes a photographing device 5.
  • the imaging device 5 includes an image processing unit 13 in addition to the cameras 11R and 11L described above.
  • the image processing unit 13 processes the image IR photographed by the right camera 11R and the image IL photographed by the left camera 11L.
  • image IR or the like means image information that can be processed by the image processing unit 13.
  • the image processing unit 13 is realized by, for example, a central processing unit (CPU) and a storage unit.
  • FIG. 6A and 6B schematically illustrate a pair of images IR and IL.
  • a pair of images IR and IL means a combination of images IR and IL taken simultaneously.
  • the position on the image IR is represented by coordinates consisting of the horizontal axis uR and the vertical axis vR.
  • the horizontal axis uR is parallel to the horizontal direction of the image IR
  • the vertical axis vR is parallel to the vertical direction of the image IR.
  • the image center OR corresponds to the intersection of the image IR and the optical axis AR.
  • the position on the image IL is represented by coordinates composed of a horizontal axis uL and a vertical axis vL.
  • the horizontal axis uL is parallel to the horizontal direction of the image IL
  • the vertical axis vL is parallel to the vertical direction of the image IL.
  • the image center OL corresponds to the intersection of the image IL and the optical axis AL.
  • FIG. 6B illustrates a case where the left camera 11L is displaced. However, if the cameras 11R and 11L are not displaced, the horizontal axis uR and the horizontal axis uL are coaxial, and the vertical axis vR and the vertical axis vL are parallel. If the cameras 11R and 11L are not displaced, the horizontal axes uR and uL are parallel to the horizontal direction X, and the vertical axes vR and vL are parallel to the vertical direction Y. Furthermore, if the cameras 11R and 11L are not misaligned, the parallax regarding the markers M1 and M2 between the images IR and IL corresponds to the distance D at the time when the images IR and IL are captured.
  • The image processing unit 13 includes a correction parameter storage unit 21, an image correction unit 22, an image storage unit 23, an image re-correction unit 24, an adjustment image specifying unit 25, an imaging condition storage unit 26, a lateral shift amount adjustment unit 27, a vertical shift amount adjustment unit 28, and a rotation amount adjustment unit 29.
  • Each of the units 21 to 29 is described below.
  • the correction parameter storage unit 21 stores correction parameters.
  • the correction parameters are, for example, a horizontal shift amount Bh, a vertical shift amount Bv, and a rotation amount Br.
  • the horizontal shift amount Bh is a movement amount in the direction of the horizontal axis uL.
  • the vertical shift amount Bv is a movement amount in the direction of the vertical axis vL.
  • the rotation amount Br is an angle in the rotation direction. Note that the center of rotation is, for example, the image center OL.
  • the correction parameters Bh, Bv, Br are expressed in an appropriate format.
  • the correction parameters Bh, Bv, and Br may be represented by a correction map or determinant.
  • the image correction unit 22 corrects the image IL using the correction parameters Bh, Bv, Br stored in the correction parameter storage unit 21. Specifically, the image correction unit 22 translates the image IL by the horizontal shift amount Bh in the horizontal direction of the image IL, translates the image IL by the vertical shift amount Bv in the vertical direction of the image IL, and sets the image center OL. The image IL is rotated around the rotation amount Br.
  • For example, the pixel at each position (uL, vL) on the image IL is moved to a new position (uL + Δu, vL + Δv).
  • The pixel value at the original position (uL, vL) defines the pixel value at the new position (uL + Δu, vL + Δv).
  • In this way, a corrected image ILc is generated.
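The correction described above can be realized with a single affine transform. The following is a minimal sketch in Python using OpenCV, assuming Bh and Bv are given in pixels and Br in degrees; the function name and the use of cv2 are illustrative and not taken from the patent, which leaves the representation of the correction parameters open (for example, a correction map or matrix).

```python
import cv2

def correct_image(image_il, bh, bv, br):
    """Translate image IL by (Bh, Bv) and rotate it by Br about the image center OL."""
    h, w = image_il.shape[:2]
    center_ol = (w / 2.0, h / 2.0)                    # stand-in for the image center OL
    m = cv2.getRotationMatrix2D(center_ol, br, 1.0)   # rotation by Br around OL
    m[0, 2] += bh                                     # lateral shift Bh along the horizontal axis uL
    m[1, 2] += bv                                     # vertical shift Bv along the vertical axis vL
    return cv2.warpAffine(image_il, m, (w, h))        # corrected image ILc
```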
  • the image storage unit 23 stores the image IL.
  • the image re-correction unit 24 uses the latest correction parameters Bh, Bv, and Br to calculate the image IL. Correct again.
  • the correction itself is the same as the correction performed by the image correction unit 22.
  • the adjustment image specifying unit 25 determines whether the pair of images IR and ILc are the adjustment images ER and EL, respectively. Specific processing of the adjustment image specifying unit 25 is exemplified below.
  • FIG. 7A is a diagram schematically showing the projection points mR1 and mR2, which are the points where the markers M1 and M2 are projected on the image IR.
  • FIG. 7B is a diagram schematically showing the projection points mL1 and mL2, which are the points where the markers M1 and M2 are projected on the image ILc.
  • the adjustment image specifying unit 25 detects projection points mR1 and mR2 of the markers M1 and M2 on the image IR based on the image IR.
  • the adjustment image specifying unit 25 detects the projection points mL1 and mL2 of the markers M1 and M2 on the image ILc based on the image ILc.
  • the adjustment image specifying unit 25 performs, for example, template matching using templates related to the markers M1 and M2.
  • the adjustment image specifying unit 25 may or may not identify whether the projection point mR1 is the projection point of the marker M1 or the marker M2.
  • When the projection points mR1 and mR2 are not particularly distinguished, they are referred to as "projection points mR"; when the projection points mL1 and mL2 are not particularly distinguished, they are referred to as "projection points mL"; and when the projection points mR and mL are not particularly distinguished, they are referred to as "projection points m".
  • the adjustment image specifying unit 25 determines whether the images IR and ILc are the adjustment images ER and EL, respectively, based on the projection point m. Specifically, the determination process is performed based on the number of projection points m and the position of each projection point m.
  • the adjustment image specifying unit 25 determines whether or not the projection point m satisfies all of the following determination conditions 1 to 4.
  • Determination condition 1: The number of projection points mR in the image IR is two.
  • Determination condition 2: The difference ΔvR between the vertical position vR1 of one projection point mR1 and the vertical position vR2 of the other projection point mR2 is equal to or less than a threshold value.
  • Determination condition 3: The number of projection points mL in the image ILc is two.
  • Determination condition 4: The difference ΔvL between the vertical position vL1 of one projection point mL1 and the vertical position vL2 of the other projection point mL2 is equal to or less than a threshold value.
  • the threshold value is, for example, 10 pixels.
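As a sketch, determination conditions 1 to 4 can be checked as follows; the detection of the projection points themselves (for example by template matching) is assumed to be done elsewhere, and the function and parameter names are illustrative.

```python
def is_adjustment_pair(points_r, points_l, threshold_px=10):
    """Conditions 1-4: two projection points per image, and the vertical
    positions of the two points in each image differ by at most the threshold."""
    if len(points_r) != 2 or len(points_l) != 2:        # conditions 1 and 3
        return False
    dv_r = abs(points_r[0][1] - points_r[1][1])          # condition 2 (delta vR)
    dv_l = abs(points_l[0][1] - points_l[1][1])          # condition 4 (delta vL)
    return dv_r <= threshold_px and dv_l <= threshold_px
```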
  • the imaging condition storage unit 26 stores information related to the interval LM between the markers M1 and M2.
  • the imaging condition storage unit 26 further stores internal parameters and external parameters of the right camera 11R and the left camera 11L.
  • the horizontal shift amount adjustment unit 27 adjusts the horizontal shift amount Bh stored in the correction parameter storage unit 21 based on the adjustment images ER and EL.
  • The lateral shift amount adjustment unit 27 includes a measurement unit 31, a theoretical value calculation unit 32, a lateral direction adjustment amount calculation unit 33, and a lateral shift amount change unit 34. Each of the units 31 to 34 is described below.
  • the measuring unit 31 measures the parallax related to one of the markers M1 and M2 based on the adjustment images ER and EL, and obtains a parallax measurement value Fm. Specific processing of the measurement unit 31 is exemplified below.
  • the measuring unit 31 identifies the correspondence between the projection point mR of the adjustment image ER and the projection point mL of the adjustment image EL.
  • the measurement unit 31 performs, for example, stereo matching. Thereby, the measurement unit 31 specifies that the projection point mR1 and the projection point mL1 correspond and that the projection point mR2 and the projection point mL2 correspond.
  • the measurement unit 31 measures the difference between the position in the horizontal direction of any projection point mR and the position in the horizontal direction of the projection point mL corresponding to this projection point mR.
  • the measurement unit 31 sets the measured difference as the parallax measurement value Fm regarding either of the markers M1 and M2.
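Once the correspondence between projection points has been established (for example by stereo matching), the parallax measurement value reduces to a difference of horizontal coordinates. A minimal sketch, with points given as (u, v) tuples and the function name being illustrative:

```python
def measure_parallax(point_r, point_l):
    """Parallax measurement value Fm for one corresponding pair (mR, mL):
    the difference of their horizontal positions, uR - uL."""
    return point_r[0] - point_l[0]
```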
  • the theoretical value calculation unit 32 calculates the theoretical value Ft of parallax based on the adjustment image ER and the interval LM.
  • the theoretical value Ft of parallax is the parallax related to the markers M1 and M2 that should be obtained when it is assumed that the right camera 11R and the left camera 11L are not displaced.
  • The theoretical value calculation unit 32 estimates the distance between the camera 11R and the plane P based on the adjustment image ER and the interval LM. This estimation is synonymous with calculating the distances D1R and D2R in the optical axis direction AR between the camera 11R and the markers M1 and M2, and with calculating the distances D1L and D2L in the optical axis direction AL between the camera 11L and the markers M1 and M2. In other words, the estimated distance corresponds to the "distance D". Then, the theoretical value calculation unit 32 calculates the parallax theoretical value Ft based on the distance D.
  • FIG. 8 is a diagram in which the adjustment image ER is modeled.
  • the optical axis AR is orthogonal to the plane P at the point QR.
  • the modeled adjustment image ER is located on a plane orthogonal to the optical axis AR.
  • the modeled adjustment image ER is parallel to the plane P.
  • the point where the optical axis direction AR is orthogonal to the adjustment image ER corresponds to the image center OR.
  • the optical center CR of the right camera 11R is located on the optical axis direction AR.
  • Projection points mR1 and mR2 are on the adjustment image ER, and markers M1 and M2 are on the plane P.
  • the marker M1 is located on the line J1 connecting the optical center CR and the projection point mR1
  • the marker M2 is located on the line J2 connecting the optical center CR and the projection point mR2.
  • a triangle whose vertex is the optical center CR, the projection point mR1, and the projection point mR2 is similar to a triangle whose vertex is the optical center CR, the marker M1, and the marker M2.
  • the optical center CR is regarded as the exact position of the right camera 11R, and the distance D is the distance between the optical center CR and the plane P (point QR). Further, the distance between the optical center CR and the adjustment image ER (image center OR) is defined as “focal length d”. Further, the distance between the projection point mR1 and the projection point mR2 is defined as an “interval Lm”.
  • The interval LM between the markers M1 and M2 is a distance in real space, whereas the interval Lm is a distance on the adjustment image ER. The ratio between the interval Lm and the interval LM is equal to the ratio between the focal length d and the distance D. That is, the following relational expression (1) holds: Lm / LM = d / D, and therefore D = (LM × d) / Lm … (1).
  • interval LM is already known.
  • the interval Lm is given from the adjustment image ER (projection points mR1, mR2).
  • the focal length d is given from an internal parameter of the right camera 11R.
  • the distance D is uniquely determined by the distance LM, the distance Lm, and the focal length d.
  • the theoretical value calculation unit 32 reads the interval LM and the internal parameters of the right camera 11R from the imaging condition storage unit 26.
  • the theoretical value calculation unit 32 acquires the focal length d based on the internal parameters of the right camera 11R.
  • the theoretical value calculator 32 obtains an interval Lm between the projection point mR1 and the projection point mR2 based on the adjustment image ER.
  • the theoretical value calculator 32 calculates the distance D by substituting the distance Lm, the distance LM, and the focal distance d into the relational expression (1). Further, the theoretical value calculation unit 32 calculates a theoretical value Ft of parallax related to the markers M1 and M2 based on the distance D and the internal parameters and external parameters of the cameras 11R and 11L.
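A sketch of this calculation, assuming the interval LM is given in metres and the interval Lm and the focal length d in pixels. Relational expression (1) gives D = LM × d / Lm; for the theoretical parallax the sketch uses the standard parallel-stereo relation Ft = B × d / D with baseline length B, which is one concrete form of calculating Ft from the distance D and the camera parameters and is an assumption, not a formula quoted from the patent.

```python
def estimate_distance_d(interval_lm_m, interval_lm_px, focal_px):
    """Relational expression (1): Lm / LM = d / D, hence D = LM * d / Lm."""
    return interval_lm_m * focal_px / interval_lm_px

def parallax_theoretical(distance_d_m, baseline_m, focal_px):
    """Theoretical parallax Ft for an ideal parallel stereo rig: Ft = B * d / D."""
    return baseline_m * focal_px / distance_d_m
```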
  • the horizontal adjustment amount calculation unit 33 calculates the horizontal adjustment amount Hh based on the parallax measurement value Fm and the parallax theoretical value Ft.
  • the horizontal direction adjustment amount calculation unit 33 preferably determines the horizontal direction adjustment amount Hh so that the parallax measurement value Fm is equal to the parallax theoretical value Ft.
  • the horizontal adjustment amount Hh is, for example, the difference between the parallax measurement value Fm and the parallax theoretical value Ft.
  • the horizontal shift amount changing unit 34 changes the horizontal shift amount Bh based on the horizontal adjustment amount Hh.
  • the lateral shift amount changing unit 34 adds or subtracts the lateral adjustment amount Hh to the lateral shift amount Bh stored in the correction parameter storage unit 21.
  • the calculated value is stored in the correction parameter storage unit 21 as a new lateral shift amount Bh.
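A sketch of the update, assuming Hh is taken as the difference Fm - Ft; whether Hh is added to or subtracted from Bh depends on the sign conventions of the image axes, so the sign below is illustrative.

```python
def update_lateral_shift(bh, fm, ft):
    """Lateral adjustment amount Hh = Fm - Ft and the new lateral shift amount Bh."""
    hh = fm - ft
    new_bh = bh - hh   # add or subtract Hh depending on the axis sign convention
    return new_bh, hh
```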
  • the vertical shift amount adjustment unit 28 adjusts the vertical shift amount Bv based on the adjustment images ER and EL.
  • the vertical shift amount adjustment unit 28 includes a vertical direction adjustment amount calculation unit 36 and a vertical shift amount change unit 37.
  • the vertical adjustment amount calculation unit 36 calculates the vertical adjustment amount Hv based on one or more sets of projection points mR and mL that have a corresponding relationship.
  • the vertical adjustment amount calculation unit 36 may determine the vertical adjustment amount Hv so that the position of the projection point mR in the vertical direction is equal to the position of the projection point mL corresponding to the projection point mR in the vertical direction. preferable.
  • the vertical adjustment amount Hv is, for example, the difference between the position vR1 in the vertical direction of the projection point mR1 and the position vL1 in the vertical direction of the projection point mL1.
  • the vertical shift amount changing unit 37 changes the vertical shift amount Bv based on the vertical adjustment amount Hv.
  • the vertical shift amount changing unit 37 adds or subtracts the vertical adjustment amount Hv to the vertical shift amount Bv stored in the correction parameter storage unit 21.
  • the calculated value is stored in the correction parameter storage unit 21 as a new vertical shift amount Bv.
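The vertical adjustment follows the same pattern; a sketch assuming one corresponding pair of projection points given as (u, v) tuples, with the same caveat on the sign:

```python
def update_vertical_shift(bv, point_r, point_l):
    """Vertical adjustment amount Hv = vR - vL and the new vertical shift amount Bv."""
    hv = point_r[1] - point_l[1]
    new_bv = bv - hv   # add or subtract Hv depending on the axis sign convention
    return new_bv, hv
```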
  • Rotation amount adjustment unit 29 adjusts the rotation amount based on adjustment images ER and EL.
  • the rotation amount adjustment unit 29 includes a rotation direction adjustment amount calculation unit 38 and a rotation amount change unit 39.
  • the rotation direction adjustment amount calculation unit 38 calculates the rotation direction adjustment amount Hr based on two or more sets of projection points mR and mL having a correspondence relationship. Referring to FIG. 7, rotation direction adjustment amount calculation unit 38 rotates so that the inclination of virtual line KR connecting projection points mR1 and mR2 is equal to the inclination of virtual line KL connecting projection points mL1 and mL2. It is preferable to determine the direction adjustment amount Hr.
  • the rotation direction adjustment amount Hr is, for example, an angle formed by the virtual line KR and the virtual line KL.
  • the rotation amount changing unit 39 changes the rotation amount Br based on the rotation direction adjustment amount Hr.
  • the rotation amount changing unit 39 adds or subtracts the rotation direction adjustment amount Hr to the rotation amount Br stored in the correction parameter storage unit 21.
  • the calculated value is stored in the correction parameter storage unit 21 as a new rotation amount Br.
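For the rotation, the adjustment amount can be sketched as the angle between the virtual lines KR and KL through the two projection points of each image; the use of degrees and the function name are illustrative.

```python
import math

def rotation_adjustment(m_r1, m_r2, m_l1, m_l2):
    """Rotation direction adjustment amount Hr: the angle between the virtual line KR
    (through mR1 and mR2) and the virtual line KL (through mL1 and mL2)."""
    angle_kr = math.atan2(m_r2[1] - m_r1[1], m_r2[0] - m_r1[0])
    angle_kl = math.atan2(m_l2[1] - m_l1[1], m_l2[0] - m_l1[0])
    return math.degrees(angle_kr - angle_kl)
```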
  • the imaging device 5 may further include a stereo processing unit (not shown) that processes the images IR and ILc.
  • the stereo processing unit may calculate information in the depth direction of the images IR and ILc based on the images IR and ILc.
  • the stereo processing unit may calculate the distance to the subject in the optical axis direction AR / AL based on the images IR and ILc.
  • the stereo processing unit may detect an obstacle that prevents the vehicle 1 from traveling based on the images IR and ILc.
  • FIG. 9 is a flowchart showing an example of the operation of the photographing apparatus 5.
  • Here, it is assumed that the vehicle 1 is traveling on the runway T. That is, the following describes an example of the operation performed by the imaging device 5 while the vehicle 1 is traveling.
  • Step S1> Shooting The right camera 11R and the left camera 11L capture images IR and IL, respectively.
  • Step S2> Image Capture The image processing unit 13 captures the images IR and IL.
  • the image IL is stored in the image storage unit 23.
  • Step S3> Image Correction The image correction unit 22 reads the latest horizontal shift amount Bh, vertical shift amount Bv, and rotation amount Br from the correction parameter storage unit 21. The image correction unit 22 corrects the image IL using the read correction parameters Bh, Bv, and Br. Thereby, the image correction unit 22 generates the image ILc.
  • Step S4> Is the image for adjustment?
  • the adjustment image specifying unit 25 detects the projection point mR based on the image IR.
  • the adjustment image specifying unit 25 detects the projection point mL based on the image ILc.
  • the adjustment image specifying unit 25 determines whether the images IR and ILc are the adjustment images ER and EL based on the projection points mR and mL. If it is determined that the images IR and ILc are the adjustment images ER and EL, the process proceeds to step S5. Otherwise, the process returns to step S1.
  • Step S5> Calculation of Adjustment Amounts The horizontal shift amount adjustment unit 27 calculates the horizontal adjustment amount Hh. Specifically, the measurement unit 31 acquires the parallax measurement value Fm based on the adjustment images ER and EL. The theoretical value calculation unit 32 acquires the parallax theoretical value Ft based on the adjustment image ER. The horizontal adjustment amount calculation unit 33 calculates the horizontal adjustment amount Hh based on the parallax measurement value Fm and the parallax theoretical value Ft.
  • the vertical adjustment amount calculation unit 36 calculates the vertical adjustment amount Hv based on the adjustment images ER and EL.
  • the rotation direction adjustment amount calculation unit 38 calculates a rotation direction adjustment amount Hr based on the adjustment images ER and EL.
  • Step S6> Changing the Correction Parameters The lateral shift amount changing unit 34 changes the lateral shift amount Bh using the lateral adjustment amount Hh.
  • the vertical shift amount changing unit 37 changes the vertical shift amount Bv using the vertical adjustment amount Hv.
  • the rotation amount changing unit 39 changes the rotation amount Br using the rotation direction adjustment amount Hr.
  • Step S7> Image Re-correction
  • the image recorrection unit 24 reads the image IL from the image storage unit 23, and reads the changed correction parameters Bh, Bv, and Br from the correction parameter storage unit 21. Then, the image re-correction unit 24 corrects the image IL using the correction parameters Bh, Bv, and Br. Then, it returns to step S1.
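The flow of FIG. 9 can be summarised as the following loop body; the object and method names are purely illustrative placeholders for the units 21 to 29 and are not taken from the patent.

```python
def process_one_pair(device):
    """One pass through steps S1 to S7 of FIG. 9 (device is a hypothetical wrapper)."""
    img_r, img_l = device.capture()                         # S1: shooting (11R, 11L)
    device.image_storage.save(img_l)                         # S2: image capture
    img_lc = device.image_corrector.correct(img_l)           # S3: correction with Bh, Bv, Br
    if not device.adjustment_image_specifier.check(img_r, img_lc):
        return                                               # S4: not a pair of adjustment images
    hh, hv, hr = device.compute_adjustments(img_r, img_lc)   # S5: Hh, Hv, Hr
    device.parameter_storage.update(hh, hv, hr)              # S6: change Bh, Bv, Br
    device.image_recorrector.correct(device.image_storage.load())  # S7: re-correction of IL
```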
  • Since the imaging device 5 includes the lateral shift amount adjustment unit 27, the lateral shift amount Bh can be suitably adjusted. Therefore, the image correction unit 22 can translate the image IL in the horizontal direction using an appropriate lateral shift amount Bh. Thereby, the parallax between the images IR and ILc is closer to the parallax theoretical value Ft than the parallax between the images IR and IL. Therefore, even when the cameras 11R and 11L are displaced, information about the subject in the depth direction can be obtained with high accuracy based on the images IR and ILc. Thus, even if the positions and postures of the cameras 11R and 11L are deviated, the accuracy and reliability of the imaging device 5 can be suitably maintained. As a result, a decrease in the operating rate of the vehicle 1 can be suitably prevented.
  • the theoretical value calculation unit 32 calculates the theoretical value Ft of parallax using one of the adjustment images ER without using both the adjustment images ER and EL.
  • This calculation method is less susceptible to the displacement of the cameras 11R and 11L than when both the adjustment images ER and EL are used. Therefore, the theoretical value calculation unit 32 can acquire the theoretical value Ft of parallax with high accuracy.
  • The theoretical value calculation unit 32 calculates the distance D in the optical axis direction AR/AL between the camera 11R/11L and the markers M1 and M2 by utilizing the relationship between the interval LM between the markers M1 and M2 and the interval Lm between the projection points mR1 and mR2.
  • Therefore, the theoretical value calculation unit 32 can suitably calculate the theoretical value Ft of parallax.
  • The measurement unit 31 acquires the "parallax measurement value Fm" corresponding to the actual parallax between the image IR and the image ILc. Therefore, the horizontal adjustment amount calculation unit 33 can grasp the deviation between the parallax measurement value Fm and the parallax theoretical value Ft. As a result, the horizontal adjustment amount calculation unit 33 can determine an appropriate horizontal adjustment amount Hh. In particular, when the horizontal adjustment amount Hh is the difference between the parallax measurement value Fm and the parallax theoretical value Ft, the actual parallax between the image IR and the image ILc can be matched to the parallax theoretical value Ft.
  • Since the lateral shift amount adjustment unit 27 includes the lateral shift amount change unit 34, the lateral shift amount Bh stored in the correction parameter storage unit 21 can be suitably updated.
  • The lateral shift amount changing unit 34 adds or subtracts the lateral adjustment amount Hh to or from the lateral shift amount Bh that was used for generating the adjustment image EL. Thereby, the lateral shift amount can be suitably adjusted.
  • Since the adjustment image specifying unit 25 is provided, the images IR and ILc in which the markers M1 and M2 are appropriately captured can be suitably identified.
  • Since the vertical shift amount adjustment unit 28 is provided, the vertical shift amount Bv can be suitably adjusted. Therefore, the image correction unit 22 can correct the image IL more appropriately.
  • Since the rotation amount adjustment unit 29 is provided, the rotation amount Br can be suitably adjusted. Therefore, the image correction unit 22 can correct the image IL more appropriately.
  • the marker group photographing section Tab can be suitably formed on the runway T by appropriately arranging the markers M1 and M2 in consideration of the layout of the runway T and the optical axes AR / AL of the cameras 11R and 11L. Thereby, the images for adjustment ER and EL can be suitably obtained.
  • markers M1 and M2 are installed so that the cameras 11R and 11L can photograph both the markers M1 and M2 in the linear marker group photographing section Tab. Further, markers M1 and M2 are arranged on a plane P substantially perpendicular to the optical axis AR / AL of the cameras 11R and 11L in the marker group photographing section Tab.
  • substantially vertical means to include both being strictly vertical and being nearly vertical to the extent that the lateral shift amount Bh can be appropriately adjusted.
  • the camera 11R can simultaneously photograph a plurality of markers M1 and M2 that are separated by substantially the same distance in the optical axis direction AR.
  • substantially the same means to include both being exactly equal and approximating to the extent that the lateral shift amount Bh can be appropriately corrected. Therefore, the theoretical value calculation unit 32 can accurately estimate the distance D based on the image IR and can calculate the parallax theoretical value Ft with high accuracy.
  • the camera 11L can photograph the markers M1 and M2 similarly to the camera 11R. Therefore, the measurement unit 31 can suitably acquire the parallax measurement value Fm based on the images IR and ILc.
  • The markers M1 and M2 are arranged at substantially the same height position. Thereby, the processing by the theoretical value calculation unit 32 can be simplified. Further, the processing by the adjustment image specifying unit 25 can be simplified.
  • As described above, the cameras 11R and 11L can suitably capture the markers M1 and M2, and the image processing unit 13 can suitably adjust the correction parameters Bh, Bv, and Br. Therefore, the accuracy and reliability of the imaging device 5 can be suitably maintained without reducing the operating rate of the vehicle 1.
  • the markers M1 and M2 can be installed at appropriate positions in advance. Therefore, the cameras 11R and 11L can appropriately photograph the markers M1 and M2. Therefore, the image processing unit 13 can adjust the correction parameters Bh, Bv, Br more accurately.
  • the present invention is not limited to the above embodiment, and can be modified as follows.
  • the theoretical value calculation unit 32 calculates the distance D based on the interval LM and the interval Lm, but is not limited thereto.
  • the theoretical value calculator 32 may calculate the distance D by a triangulation technique or a technique using geometry.
  • FIG. 10 is a diagram modeling the adjustment image ER.
  • In FIG. 10, some reference symbols are omitted as appropriate.
  • a straight line J3 is a line connecting the projection point mR1 and the projection point mR2.
  • the straight line J4 is a line connecting the marker M1 and the marker M2.
  • The angle α is the angle formed by the straight line J1 and the straight line J3.
  • The angle β is the angle formed by the straight line J2 and the straight line J3.
  • The angles α and β correspond to interior angles of the triangle having the optical center CR, the projection point mR1, and the projection point mR2 as vertices, respectively. This triangle is similar to the triangle having the optical center CR, the marker M1, and the marker M2 as vertices. Therefore, the angle α is equal to the angle formed by the straight line J1 and the straight line J4, and the angle β is equal to the angle formed by the straight line J2 and the straight line J4.
  • The point QR′ is the foot of the perpendicular drawn from the optical center CR to the straight line J4.
  • the distance between the optical center CR and the point QR ′ is defined as “distance Da”.
  • the triangle whose vertex is the optical center CR, the point QR, and the point QR ′ is a right triangle.
  • the distance between the point QR and the point QR ′ is defined as “distance Db”.
  • the distance Db is calculated based on the positional relationship between the camera 11R and the markers M1 and M2. As a relatively simple example, the distance Db is the difference between the height of the camera 11R and the heights of the markers M1 and M2. As a relatively complicated example, the distance Db is calculated using the position and angle of the camera 11R and the positions of the markers M1 and M2.
  • The distance D is expressed by the following equation (3): D = √(Da² - Db²) … (3).
  • The theoretical value calculation unit 32 calculates the distance Da based on the interval LM and the interior angles α and β of the triangle having the optical center CR and the projection points mR1 and mR2 as vertices. Specifically, the theoretical value calculation unit 32 reads the interval LM and the internal parameters of the right camera 11R from the imaging condition storage unit 26. The theoretical value calculation unit 32 calculates the relative positional relationship between the optical center CR and the projection points mR1 and mR2 based on the adjustment image ER and the internal parameters, and calculates the angles α and β based on that relative positional relationship. The theoretical value calculation unit 32 then calculates the distance Da by substituting the interval LM and the angles α and β into the relational expression (2).
  • the theoretical value calculation unit 32 reads the external parameters of the right camera 11R and the positions of the markers M1 and M2 from the imaging condition storage unit 26. Note that the positions of the markers M1 and M2 are also stored in the imaging condition storage unit 26 in advance.
  • the theoretical value calculator 32 calculates the distance Db based on the external parameters of the right camera 11R and the positions of the markers M1 and M2. Further, the theoretical value calculation unit 32 calculates the distance D using the relational expression (3).
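A sketch of the modified calculation, with angles in radians and distances in metres. Da is computed as the distance from the optical center CR to the straight line J4, that is, the altitude of the triangle CR-M1-M2 over the base LM with base angles α and β; this closed form is one expression consistent with the description (the patent's relational expression (2) itself is not reproduced here), and equation (3) then gives D.

```python
import math

def distance_d_modified(interval_lm_m, alpha_rad, beta_rad, db_m):
    """Distance D by the method of FIG. 10: Da from LM and the angles alpha and beta,
    then equation (3): D = sqrt(Da^2 - Db^2)."""
    da = interval_lm_m * math.sin(alpha_rad) * math.sin(beta_rad) / math.sin(alpha_rad + beta_rad)
    return math.sqrt(da * da - db_m * db_m)
```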
  • the markers M1 and M2 are arranged at substantially the same height position. Further, it is particularly preferable that the optical axis AR / AL is substantially parallel to the road surface in the marker group photographing section Tab in a side view. This is because the theoretical value calculator 32 can easily calculate the distance Db.
  • the cameras 11R / 11L are arranged at substantially the same height as the markers M1 and M2. According to this, the point QR ′ coincides with the point QR, and the distance Da is equal to the distance D. Therefore, the theoretical value calculation unit 32 can directly acquire the distance D using the relational expression (2), and can omit the calculation of the distance Db.
  • the theoretical value calculation unit 32 calculates the theoretical value Ft of parallax based on the adjustment image ER, but is not limited thereto.
  • the theoretical value calculation unit 32 may calculate the theoretical value Ft of parallax based on the adjustment image EL. Even when the adjustment image EL is used, the theoretical parallax value Ft can be suitably calculated.
  • the theoretical value calculator 32 may calculate the theoretical value FtR of parallax based on the adjustment image ER, and may calculate the theoretical value FtL of parallax based on the adjustment image EL. In this case, the theoretical value calculation unit 32 may further calculate one parallax theoretical value Ft based on the parallax theoretical values FtR and FtL. According to this, the theoretical value Ft of parallax can be calculated with higher accuracy.
  • the measurement unit 31 acquires the parallax measurement value Fm related to one of the markers M1 and M2 based on one set of projection points mR and mL that are in a correspondence relationship.
  • the measurement unit 31 may acquire the parallax measurement value Fm for both the markers M1 and M2 based on the two sets of projection points mR and mL that are in a correspondence relationship.
  • the measurement unit 31 may further calculate one parallax measurement value Fm based on the parallax measurement value Fm related to the marker M1 and the parallax measurement value Fm related to the marker M2. According to this, the parallax measurement value Fm can be calculated with higher accuracy.
  • the horizontal adjustment amount calculation unit 33 determines the horizontal adjustment amount Hh so that the parallax measurement value Fm is equal to the theoretical parallax value Ft, but is not limited thereto.
  • the horizontal adjustment amount calculation unit 33 may determine the horizontal adjustment amount Hh so that the difference between the parallax measurement value Fm and the theoretical parallax value Ft becomes small. Even in this case, the horizontal shift amount Bh can be suitably adjusted.
  • the vertical adjustment amount calculation unit 36 determines the vertical adjustment amount Hv so that the vertical position of the projection point mR is equal to the vertical position of the corresponding projection point mL.
  • the vertical adjustment amount calculation unit 36 may determine the vertical adjustment amount Hv so that the difference between the vertical position of the projection point mR and the vertical position of the corresponding projection point mL becomes small. Even in this case, the vertical shift amount Bv can be suitably adjusted.
  • the rotation direction adjustment amount calculation unit 38 determines the rotation direction adjustment amount Hr so that the inclination of the virtual line KR and the inclination of the virtual line KL are equal, but the present invention is not limited to this.
  • the rotation direction adjustment amount calculation unit 38 may determine the rotation direction adjustment amount Hr so that the difference between the inclination of the virtual line KR and the inclination of the virtual line KL becomes small. Even in this case, the rotation amount Br can be suitably adjusted.
  • the horizontal shift amount adjustment unit 27 changes the horizontal shift amount Bh whenever the adjustment images ER and EL are specified, but the present invention is not limited to this. The horizontal shift amount adjustment unit 27 may selectively change the horizontal shift amount Bh according to the value of the horizontal adjustment amount Hh.
  • the vertical shift amount adjustment unit 28 changes the vertical shift amount Bv whenever the adjustment images ER and EL are specified, but the present invention is not limited to this. The vertical shift amount adjustment unit 28 may selectively change the vertical shift amount Bv according to the value of the vertical adjustment amount Hv.
  • the rotation amount adjustment unit 29 changes the rotation amount Br whenever the adjustment images ER and EL are specified, but the present invention is not limited to this. The rotation amount adjustment unit 29 may selectively change the rotation amount Br according to the value of the rotation direction adjustment amount Hr.
  • FIG. 11 is a block diagram illustrating a configuration of an imaging apparatus according to a modified embodiment.
  • the same components as those in the above-described embodiment are denoted by the same reference numerals and detailed description thereof is omitted.
  • the horizontal shift amount adjustment unit 27 further includes a horizontal direction adjustment amount determination unit 41.
  • the horizontal direction adjustment amount determination unit 41 determines whether or not the horizontal direction adjustment amount Hh is equal to or greater than a predetermined threshold value. When it is determined that the horizontal adjustment amount Hh is equal to or greater than the predetermined threshold, the horizontal shift amount changing unit 34 changes the horizontal shift amount Bh. Otherwise, the horizontal shift amount changing unit 34 does not change the horizontal shift amount Bh.
  • the vertical shift amount adjustment unit 28 further includes a vertical direction adjustment amount determination unit 42.
  • the vertical adjustment amount determination unit 42 determines whether or not the vertical adjustment amount Hv is greater than or equal to a predetermined threshold value. When it is determined that the vertical adjustment amount Hv is equal to or greater than the predetermined threshold, the vertical shift amount changing unit 37 changes the vertical shift amount Bv. Otherwise, the vertical shift amount changing unit 37 does not change the vertical shift amount Bv.
  • the rotation amount adjustment unit 29 further includes a rotation direction adjustment amount determination unit 43.
  • the rotation direction adjustment amount determination unit 43 determines whether or not the rotation direction adjustment amount Hr is greater than or equal to a predetermined threshold value. When it is determined that the rotation direction adjustment amount Hr is equal to or greater than the predetermined threshold value, the rotation amount changing unit 39 changes the rotation amount Br. Otherwise, the rotation amount changing unit 39 does not change the rotation amount Br.
  • the correction parameters Bh, Bv, and Br are changed individually. As a result, all of the correction parameters Bh, Bv, and Br may be changed, only some of them may be changed, or none of them may be changed (a sketch of this threshold-gated update appears after this list).
  • in this way, the image correction unit 22 can correct the image IL more appropriately.
  • the vertical shift amount Bv and the rotation amount Br further allow the image IL to be corrected appropriately.
  • the image correction unit 22 corrects one of the images IR and IL, but is not limited thereto.
  • both the images IR and IL may be corrected.
  • the correction parameter storage unit 21 may store correction parameters BhR, BvR, and BrR for the image IR and correction parameters BhL, BvL, and BrL for the image IL. According to this, the image correction unit 22 can correct both the images IR and IL.
  • the horizontal shift amount adjustment unit 27 may adjust the horizontal shift amount BhR for the image IR and the horizontal shift amount BhL for the image IL, respectively.
  • the vertical shift amount adjustment unit 28 may adjust the vertical shift amounts BvR and BvL, respectively.
  • the rotation amount adjustment unit 29 may adjust the rotation amounts BrR and BrL, respectively.
  • the image correction unit 22 performs the process of moving the image IL, but is not limited thereto (a sketch of such a move-and-rotate correction appears after this list).
  • the image correction unit 22 may also correct image distortion in addition to moving the image IL. According to this, the image IL can be corrected more suitably.
  • the adjustment image ER is the image IR that has not been corrected by the image correction unit 22, and the adjustment image EL is the image ILc that has been corrected by the image correction unit 22.
  • the adjustment image ER may be an image that has been corrected by the image correction unit 22, and the adjustment image EL may be an image that has not been corrected by the image correction unit 22.
  • one of the adjustment images ER and EL is the image ILc corrected by the image correction unit 22, but is not limited thereto.
  • both of the adjustment images ER and EL may be images corrected by the image correction unit 22.
  • the horizontal shift amount adjustment unit 27, the vertical shift amount adjustment unit 28, and the rotation amount adjustment unit 29 can suitably adjust the correction parameters Bh, Bv, and Br, respectively.
  • both of the adjustment images ER and EL may be images that have not been corrected by the image correction unit 22.
  • the adjustment image specifying unit 25 may determine whether the images IR and IL are the adjustment images ER and EL.
  • the horizontal shift amount changing unit 34 sets the horizontal adjustment amount Hh as a new horizontal shift amount Bh. That is, the horizontal adjustment amount Hh is stored in the correction parameter storage unit 21 as the new horizontal shift amount Bh.
  • the vertical shift amount changing unit 37 sets the vertical adjustment amount Hv as a new vertical shift amount Bv.
  • the rotation amount changing unit 39 preferably sets the rotation direction adjustment amount Hr as a new rotation amount Br. According to this, when both the adjustment images ER and EL are uncorrected images, the correction parameters Bh, Bv, and Br can be suitably adjusted.
  • the adjustment units 27 to 29 may change the correction parameters Bh, Bv, and Br, and the image correction unit 22 may then correct the images IR and IL that were determined to be the adjustment images ER and EL using the changed correction parameters Bh, Bv, and Br. According to this, the image re-correction unit 24 described in the embodiment can be omitted.
  • the adjustment image specifying unit 25 may use the following determination condition 5 in addition to the determination conditions 1 to 4.
  • Determination condition 5: the difference between the difference ΔvR and the difference ΔvL is equal to or less than a threshold value.
  • the difference ΔvR is defined in determination condition 2, and the difference ΔvL is defined in determination condition 4.
  • the adjustment image specifying unit 25 may use determination conditions 6 to 9 shown below.
  • Determination condition 6: the number of projection points mR in the image IR is two.
  • Determination condition 7: one projection point mR1 is in the region WR1 of the image IR, and the other projection point mR2 is in the region WR2 of the image IR.
  • Determination condition 8: the number of projection points mL in the image IL is two.
  • Determination condition 9: one projection point mL1 is in the region WL1 of the image IL, and the other projection point mL2 is in the region WL2 of the image IL.
  • each of the regions WR1 and WR2 is defined by coordinate values on the horizontal axis uR and the vertical axis vR, and each of the regions WL1 and WL2 is defined by coordinate values on the horizontal axis uL and the vertical axis vL.
  • with these conditions, the adjustment image specifying unit 25 can suitably determine whether the images IR and ILc are the adjustment images ER and EL (a sketch of this check appears after this list).
  • the horizontal shift amount adjustment unit 27, the vertical shift amount adjustment unit 28, and the rotation amount adjustment unit 29 are provided, but the present invention is not limited thereto.
  • any one or two of the adjustment units 27 to 29 may be omitted.
  • the image processing unit 13 can suitably adjust at least one of the correction parameters Bh, Bv, and Br.
  • the photographing apparatus 5 includes the two cameras 11R and 11L, but is not limited thereto.
  • the imaging device 5 may include three or more cameras 11. That is, a stereo camera may be configured by three or more cameras 11.
  • the optical axes AR and AL are parallel to the front direction Zf, but the present invention is not limited to this. That is, the optical axes AR and AL may not be parallel to the front direction Zf.
  • the marker group MG includes the two markers M1 and M2.
  • the marker group MG may include three or more markers M.
  • the marker M may be arranged at an appropriate position.
  • three or more markers M may be arranged on the plane P described in the embodiment.
  • two or more markers M may be arranged on the plane P, and another marker M may be arranged at a position off the plane P. According to this, the horizontal adjustment amount Hh, the vertical adjustment amount Hv, and the rotation direction adjustment amount Hr can be calculated more appropriately.
  • the vehicle 1 is a golf cart, but is not limited thereto.
  • the vehicle 1 may be used for various applications.
  • the vehicle 1 may be a vehicle for traveling in a farm.
  • the vehicle 1 may be an unmanned traveling vehicle.
  • the marker M1 and the marker M2 are arranged at substantially the same height, but the present invention is not limited to this.
  • the height position of at least one of the markers M1 and M2 may be arbitrarily changed.
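The following is a minimal sketch, in Python, of the theoretical-value calculation referenced above. The patent does not reproduce relational expressions (2) and (3) in this section, so the sketch assumes that expression (2) is the standard law-of-sines triangulation for a triangle with base LM and the two internal angles adjacent to the base, and that the theoretical parallax follows the usual pinhole relation Ft = f·B/D. The function names and numeric values are illustrative assumptions, not taken from the patent.

```python
import math

def distance_from_angles(interval_lm: float, alpha: float, beta: float) -> float:
    """Perpendicular distance from the triangle apex (the optical center CR) to a
    base of length interval_lm, given the two internal angles (in radians)
    adjacent to the base. Assumed form of relational expression (2)."""
    return interval_lm * math.sin(alpha) * math.sin(beta) / math.sin(alpha + beta)

def theoretical_parallax(distance: float, baseline: float, focal_length_px: float) -> float:
    """Assumed pinhole relation: theoretical parallax Ft = f * B / D."""
    return focal_length_px * baseline / distance

# Hypothetical numbers: markers 2.0 m apart, base angles of 70 and 80 degrees,
# stereo baseline 0.3 m, focal length 800 px.
Da = distance_from_angles(2.0, math.radians(70), math.radians(80))
Ft = theoretical_parallax(Da, baseline=0.3, focal_length_px=800.0)
print(f"Da = {Da:.2f} m, theoretical parallax Ft = {Ft:.1f} px")
```

If the cameras are not at the marker height, the value computed above corresponds to the distance Da, and an additional step (the assumed role of expression (3)) would convert Da into the distance D before computing Ft.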
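Below is a minimal sketch of the threshold-gated parameter update performed by the determination units 41 to 43 in the modified embodiment of FIG. 11. The threshold values, the absolute-value comparison, and the parameter container are assumptions; replacing the stored value with the adjustment amount mirrors the behavior of the changing units 34, 37, and 39 described above.

```python
from dataclasses import dataclass

@dataclass
class CorrectionParams:
    bh: float = 0.0  # horizontal shift amount Bh
    bv: float = 0.0  # vertical shift amount Bv
    br: float = 0.0  # rotation amount Br

def apply_if_significant(current: float, adjustment: float, threshold: float) -> float:
    """Adopt the adjustment amount as the new parameter only when it is at least
    as large as the threshold; otherwise keep the stored value unchanged."""
    return adjustment if abs(adjustment) >= threshold else current

def update_params(params: CorrectionParams, hh: float, hv: float, hr: float,
                  th_h: float = 0.5, th_v: float = 0.5, th_r: float = 0.1) -> CorrectionParams:
    # Each parameter is gated individually, so any subset of Bh, Bv, Br (or none) may change.
    return CorrectionParams(
        bh=apply_if_significant(params.bh, hh, th_h),
        bv=apply_if_significant(params.bv, hv, th_v),
        br=apply_if_significant(params.br, hr, th_r),
    )

params = update_params(CorrectionParams(), hh=1.2, hv=0.1, hr=0.05)
print(params)  # with these hypothetical thresholds, only bh is updated
```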
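Next, a minimal sketch of the move-and-rotate correction mentioned for the image correction unit 22, using OpenCV. The sign conventions, units (pixels for Bh and Bv, degrees for Br), and rotation about the image center are assumptions; distortion correction is not shown.

```python
import cv2
import numpy as np

def correct_image(img: np.ndarray, bh: float, bv: float, br_deg: float) -> np.ndarray:
    """Translate the image by (bh, bv) pixels and rotate it by br_deg degrees
    about the image center, as one affine warp."""
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), br_deg, 1.0)  # rotation about the center
    m[0, 2] += bh  # add the horizontal shift amount
    m[1, 2] += bv  # add the vertical shift amount
    return cv2.warpAffine(img, m, (w, h))
```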
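Finally, a sketch of the check for determination conditions 6 to 9 used by the adjustment image specifying unit 25. Representing the regions WR1, WR2, WL1, and WL2 as axis-aligned rectangles, and the point and region data types, are assumptions for illustration only.

```python
from typing import List, Tuple

Point = Tuple[float, float]                 # (u, v) pixel coordinates of a projection point
Region = Tuple[float, float, float, float]  # (u_min, u_max, v_min, v_max)

def in_region(p: Point, r: Region) -> bool:
    u, v = p
    u_min, u_max, v_min, v_max = r
    return u_min <= u <= u_max and v_min <= v <= v_max

def satisfies_conditions_6_to_9(points_r: List[Point], points_l: List[Point],
                                wr1: Region, wr2: Region,
                                wl1: Region, wl2: Region) -> bool:
    # Conditions 6 and 8: exactly two projection points in each image.
    if len(points_r) != 2 or len(points_l) != 2:
        return False
    # Conditions 7 and 9: one point in each predefined region, in either order.
    cond_r = (in_region(points_r[0], wr1) and in_region(points_r[1], wr2)) or \
             (in_region(points_r[1], wr1) and in_region(points_r[0], wr2))
    cond_l = (in_region(points_l[0], wl1) and in_region(points_l[1], wl2)) or \
             (in_region(points_l[1], wl1) and in_region(points_l[0], wl2))
    return cond_r and cond_l
```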

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
  • Manufacturing & Machinery (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
PCT/JP2015/053563 2014-02-12 2015-02-09 撮影装置、車両および画像補正方法 WO2015122389A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020157036534A KR101812530B1 (ko) 2014-02-12 2015-02-09 촬영 장치, 차량 및 화상 보정 방법
JP2015532236A JP6161704B2 (ja) 2014-02-12 2015-02-09 撮影装置、車両および画像補正方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-024466 2014-02-12
JP2014024466 2014-02-12

Publications (1)

Publication Number Publication Date
WO2015122389A1 true WO2015122389A1 (ja) 2015-08-20

Family

ID=53800132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/053563 WO2015122389A1 (ja) 2014-02-12 2015-02-09 撮影装置、車両および画像補正方法

Country Status (4)

Country Link
JP (1) JP6161704B2 (ko)
KR (1) KR101812530B1 (ko)
TW (1) TWI551490B (ko)
WO (1) WO2015122389A1 (ko)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007024590A (ja) * 2005-07-13 2007-02-01 Toyota Motor Corp 物体検出装置
WO2010062863A2 (en) * 2008-11-26 2010-06-03 Satiogen Pharmaceuticals, Inc. Compositions containing satiogens and methods of use
JP5915268B2 (ja) * 2012-03-05 2016-05-11 富士通株式会社 パラメータ算出方法、情報処理装置及びプログラム
TWM466839U (zh) * 2013-07-26 2013-12-01 Univ Chaoyang Technology 車輛警示系統

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10341458A (ja) * 1997-06-10 1998-12-22 Toyota Motor Corp 車載ステレオカメラの校正方法、および、その方法を適用した車載ステレオカメラ
JP2001272210A (ja) * 2000-03-27 2001-10-05 Toyoda Mach Works Ltd 距離認識装置
JP2007256030A (ja) * 2006-03-23 2007-10-04 Clarion Co Ltd 車載カメラのキャリブレーション装置およびキャリブレーション方法
JP2008304248A (ja) * 2007-06-06 2008-12-18 Konica Minolta Holdings Inc 車載用ステレオカメラの校正方法、車載用距離画像生成装置及びプログラム
JP2013190281A (ja) * 2012-03-13 2013-09-26 Honda Elesys Co Ltd 設置状態検出システム、設置状態検出装置、及び設置状態検出方法

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018125844A (ja) * 2017-11-15 2018-08-09 パナソニックIpマネジメント株式会社 車載カメラ
WO2019234936A1 (ja) * 2018-06-08 2019-12-12 マクセル株式会社 携帯端末、カメラ位置推定システム、カメラ位置推定方法および標識板
JPWO2019234936A1 (ja) * 2018-06-08 2021-02-25 マクセル株式会社 携帯端末、カメラ位置推定システム、カメラ位置推定方法および標識板
JP7041262B2 (ja) 2018-06-08 2022-03-23 マクセル株式会社 携帯端末、カメラ位置推定システム、カメラ位置推定方法および標識板
JP2020112959A (ja) * 2019-01-10 2020-07-27 株式会社ダイフク 物品搬送装置
JP2021056180A (ja) * 2019-10-02 2021-04-08 株式会社Subaru 画像処理装置
JP7291594B2 (ja) 2019-10-02 2023-06-15 株式会社Subaru 画像処理装置

Also Published As

Publication number Publication date
TW201534514A (zh) 2015-09-16
JP6161704B2 (ja) 2017-07-12
TWI551490B (zh) 2016-10-01
KR101812530B1 (ko) 2017-12-27
KR20160013994A (ko) 2016-02-05
JPWO2015122389A1 (ja) 2017-03-30

Similar Documents

Publication Publication Date Title
KR101787304B1 (ko) 교정 방법, 교정 장치 및 컴퓨터 프로그램 제품
KR101892595B1 (ko) 서라운드뷰 시스템 카메라 자동 보정 전용 외부 변수
JP5027747B2 (ja) 位置測定方法、位置測定装置、およびプログラム
JP6767998B2 (ja) 画像の線からのカメラの外部パラメータ推定
WO2010001940A1 (ja) 位置測定方法、位置測定装置、およびプログラム
WO2014199929A1 (ja) 単眼モーションステレオ距離推定方法および単眼モーションステレオ距離推定装置
JP5027746B2 (ja) 位置測定方法、位置測定装置、およびプログラム
US9862417B2 (en) Turning angle correction method, turning angle correction device, image-capturing device, and turning angle correction system
JP6161704B2 (ja) 撮影装置、車両および画像補正方法
JP6602982B2 (ja) 車載カメラ、車載カメラの調整方法、車載カメラシステム
JP2014074632A (ja) 車載ステレオカメラの校正装置及び校正方法
KR101379787B1 (ko) 구멍을 가진 구조물을 이용한 카메라와 레이저 거리 센서의 보정 장치 및 보정 방법
US8941732B2 (en) Three-dimensional measuring method
JP7256734B2 (ja) 姿勢推定装置、異常検出装置、補正装置、および、姿勢推定方法
JP6755709B2 (ja) 溶接装置及び溶接方法
TW201203173A (en) Three dimensional distance measuring device and method
JP3842988B2 (ja) 両眼立体視によって物体の3次元情報を計測する画像処理装置およびその方法又は計測のプログラムを記録した記録媒体
WO2019087253A1 (ja) ステレオカメラのキャリブレーション方法
JP2009212734A (ja) 自動校正単眼ステレオ視装置
JP7303064B2 (ja) 画像処理装置、および、画像処理方法
WO2015182771A1 (ja) 撮像装置、画像処理装置、画像処理方法およびコンピュータプログラム
CN113504385B (zh) 复数相机测速方法及测速装置
WO2022118513A1 (ja) 位置姿勢算出装置、位置姿勢算出方法及び測量装置
JP2019212203A (ja) 3dモデル作成システム
KR102479253B1 (ko) 차량용 카메라 영상 기반 공차 보정 방법

Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 2015532236; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15749631; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 20157036534; Country of ref document: KR; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15749631; Country of ref document: EP; Kind code of ref document: A1)