WO2016047808A1 - Imaging device calibration system, work machine, and imaging device calibration method - Google Patents
- Publication number
- WO2016047808A1 (PCT/JP2015/077873; JP2015077873W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging device
- imaging
- pair
- parameter
- devices
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/08—Superstructures; Supports for superstructures
- E02F9/0858—Arrangement of component parts installed on superstructures not otherwise provided for, e.g. electric components, fenders, air-conditioning units
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
- G01C11/08—Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
- G01C11/10—Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken using computers to control the position of the pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/30—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom
- E02F3/32—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom working downwardly and towards the machine, e.g. with backhoes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to an imaging apparatus calibration system, a working machine, and an imaging apparatus calibration method for calibrating an imaging apparatus included in a work machine.
- Such a working machine captures an image of an object using an imaging device, controls its own operation based on the imaging result, and sends information about the imaged object to a management device.
- Patent Document 1 describes a technique for calibrating a work machine using an imaging device. However, Patent Document 1 neither describes nor suggests calibrating the imaging device included in the work machine.
- An object of the present invention is to calibrate an imaging device included in a work machine.
- the present invention is a calibration system for an imaging apparatus in which a processing device changes a parameter that defines the attitude of a second imaging device while keeping the distance between a first imaging device and the second imaging device, among at least two imaging devices, constant, searches for corresponding portions between a pair of images obtained by the first imaging device and the second imaging device, and obtains the parameter based on the search result.
- preferably, the processing device includes a search unit that changes a parameter defining the posture of the second imaging device while keeping the distance between the first imaging device and the second imaging device, among at least two of the imaging devices, constant and that searches for corresponding portions between a pair of images obtained by the first imaging device and the second imaging device, and a determination unit that obtains a posture parameter defining the posture of the imaging device based on the search result of the search unit.
- the parameter defines rotation of the second imaging device.
- preferably, the parameter includes a first parameter for rotating the second imaging device around the first imaging device and a second parameter for rotating the second imaging device around its own center.
- preferably, the processing device determines the first imaging device and the second imaging device for which the parameter needs to be obtained, based on a result of searching for corresponding portions between a pair of images obtained by a pair of the imaging devices out of the at least two imaging devices.
- preferably, the parameter is obtained for a pair of imaging devices whose search success rate is less than a threshold value.
- the present invention is a work machine including the above-described imaging device calibration system and a plurality of the imaging devices.
- the present invention obtains a plurality of images by imaging a target with at least two imaging devices, and searches for corresponding portions between a pair of images obtained by a pair of the imaging devices among the plurality of imaging devices.
- based on the search result, it is determined whether to obtain a parameter that defines the posture of one of the pair of imaging devices. When the parameter is to be obtained, the parameter defining the attitude of the second imaging device is changed while the distance between the first imaging device and the second imaging device, which form the pair, is kept constant, and corresponding portions between a pair of images obtained by the first imaging device and the second imaging device are searched for.
- This is a calibration method for an imaging apparatus that obtains an attitude parameter defining the attitude of the imaging apparatus based on the search result.
- the present invention can suppress a decrease in work efficiency when work is performed using a work machine that has a work tool.
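The claimed procedure — vary the attitude of the second imaging device while the distance to the first imaging device stays constant, and keep the attitude that makes the correspondence search succeed — can be sketched numerically. The following toy Python example is an illustration under assumed conditions (a synthetic scene, a single pitch-angle error, simulated pixel measurements, a row-alignment test as a stand-in for the correspondence search); none of the names or values come from the patent:

```python
import numpy as np

def rot_x(a):
    """Rotation about the X axis (pitch) by angle a in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def project(points, R, center, f=1000.0):
    """Pinhole projection of world points into a camera with attitude R
    (camera-from-world) and optical center `center`."""
    cam = (R @ (points - center).T).T
    return f * cam[:, :2] / cam[:, 2:3]

rng = np.random.default_rng(0)
points = rng.uniform([-2.0, -2.0, 8.0], [2.0, 2.0, 20.0], (200, 3))

B = 0.3                        # baseline: held constant during the search
true_err = np.deg2rad(1.2)     # unknown attitude error of the second camera

left = project(points, np.eye(3), np.zeros(3))
right = project(points, rot_x(true_err), np.array([B, 0.0, 0.0]))

def search_successes(cand, f=1000.0, tol=0.5):
    """Rectify the second camera's pixels by a candidate rotation and
    count point pairs whose image rows then correspond (a stand-in for
    a successful stereo correspondence search)."""
    rays = np.column_stack([right / f, np.ones(len(right))])
    rays = rays @ rot_x(cand)  # each row becomes rot_x(cand).T @ ray
    rect = f * rays[:, :2] / rays[:, 2:3]
    return int(np.sum(np.abs(rect[:, 1] - left[:, 1]) < tol))

cands = np.deg2rad(np.linspace(-3.0, 3.0, 121))
best = cands[np.argmax([search_successes(c) for c in cands])]
# `best` recovers a value close to the injected 1.2 degree error
```

The grid search over one angle is the simplest instance of "changing the parameter and keeping the one with the best search result"; a real calibration would search all the attitude parameters the claims mention.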
- FIG. 1 is a perspective view of a hydraulic excavator including an imaging apparatus calibration system according to an embodiment.
- FIG. 2 is a perspective view showing the vicinity of the driver's seat of the hydraulic excavator according to the embodiment.
- FIG. 3 is a diagram illustrating dimensions of a working machine included in the hydraulic excavator according to the embodiment and a coordinate system of the hydraulic excavator.
- FIG. 4 is a diagram illustrating an example of an image obtained by imaging a target by a plurality of imaging devices.
- FIG. 5 is a diagram illustrating an example of an object captured by a plurality of imaging devices.
- FIG. 6 is a diagram illustrating a calibration system of the imaging apparatus according to the embodiment.
- FIG. 7 is a diagram illustrating an example of three-dimensional measurement of the blade edge of the bucket using a pair of imaging devices.
- FIG. 8 is a diagram illustrating a pair of images obtained by the pair of imaging devices.
- FIG. 9 is a diagram illustrating a pair of images obtained by the pair of imaging devices.
- FIG. 10 is a perspective view showing the positional relationship between a pair of imaging devices.
- FIG. 11 is an explanatory diagram of a shift of one imaging device with respect to the other imaging device.
- FIG. 12 is a diagram illustrating a pair of images obtained by the pair of imaging devices.
- FIG. 13 is a diagram illustrating a pair of images obtained by a pair of imaging devices.
- FIG. 14 is a flowchart illustrating processing when the calibration system according to the embodiment executes the calibration method according to the embodiment.
- FIG. 15 is a diagram for explaining a method of determining an imaging device for obtaining a posture parameter.
- FIG. 16 is a diagram illustrating an example of a table for determining an imaging device for which posture parameters are obtained.
- FIG. 17 is a diagram for explaining posture parameters.
- FIG. 18 is a diagram for explaining posture parameters.
- FIG. 19 is a diagram for explaining posture parameters.
- FIG. 20 is a diagram for explaining posture parameters.
- FIG. 21 is a diagram for explaining posture parameters.
- FIG. 1 is a perspective view of a hydraulic excavator 100 including a calibration system for an imaging apparatus according to an embodiment.
- FIG. 2 is a perspective view showing the vicinity of the driver's seat of the excavator 100 according to the embodiment.
- FIG. 3 is a diagram illustrating dimensions of the working machine 2 included in the hydraulic excavator according to the embodiment and a coordinate system of the hydraulic excavator 100.
- a hydraulic excavator 100 that is a work machine has a vehicle body 1 and a work machine 2.
- the vehicle body 1 includes a swing body 3, a cab 4, and a traveling body 5.
- the swing body 3 is attached to the traveling body 5 so as to be able to swing.
- the swing body 3 houses devices such as a hydraulic pump and an engine (not shown).
- the cab 4 is disposed in the front part of the swing body 3.
- An operation device 25 shown in FIG. 2 is arranged in the cab 4.
- the traveling body 5 has crawler belts 5a and 5b, and the excavator 100 travels as the crawler belts 5a and 5b rotate.
- the work machine 2 is attached to the front portion of the vehicle body 1 and has a boom 6, an arm 7, a bucket 8 as a work tool, a boom cylinder 10, an arm cylinder 11, and a bucket cylinder 12.
- the front side of the vehicle body 1 is the direction from the backrest 4SS of the driver's seat 4S shown in FIG. 2 toward the operation device 25.
- the rear side of the vehicle body 1 is the direction side from the operation device 25 toward the backrest 4SS of the driver's seat 4S.
- the front portion of the vehicle body 1 is a portion on the front side of the vehicle body 1 and is a portion on the opposite side of the counterweight WT of the vehicle body 1.
- the operating device 25 is a device for operating the work implement 2 and the swing body 3 and includes a right lever 25R and a left lever 25L.
- the base end portion of the boom 6 is rotatably attached to the front portion of the vehicle body 1 via a boom pin 13.
- the boom pin 13 corresponds to the rotation center of the boom 6 with respect to the swing body 3.
- a base end portion of the arm 7 is rotatably attached to a tip end portion of the boom 6 via an arm pin 14.
- the arm pin 14 corresponds to the rotation center of the arm 7 with respect to the boom 6.
- a bucket 8 is rotatably attached to the tip of the arm 7 via a bucket pin 15.
- the bucket pin 15 corresponds to the rotation center of the bucket 8 with respect to the arm 7.
- the length of the boom 6, that is, the length between the boom pin 13 and the arm pin 14 is L1.
- the length of the arm 7, that is, the length between the arm pin 14 and the bucket pin 15 is L2.
- the length of the bucket 8, that is, the length between the bucket pin 15 and the blade tip P3 that is the tip of the blade 9 of the bucket 8 is L3.
- the base end portion of the boom cylinder 10 is rotatably attached to the swing body 3 via a boom cylinder foot pin 10a.
- the tip of the boom cylinder 10 is rotatably attached to the boom 6 via a boom cylinder top pin 10b.
- the boom cylinder 10 drives the boom 6 by expanding and contracting by hydraulic pressure.
- the base end of the arm cylinder 11 is rotatably attached to the boom 6 via an arm cylinder foot pin 11a.
- the tip of the arm cylinder 11 is rotatably attached to the arm 7 via an arm cylinder top pin 11b.
- the arm cylinder 11 drives the arm 7 by expanding and contracting by hydraulic pressure.
- the base end portion of the bucket cylinder 12 is rotatably attached to the arm 7 via a bucket cylinder foot pin 12a.
- the tip of the bucket cylinder 12 is rotatably attached to one end of the first link member 47 and one end of the second link member 48 via the bucket cylinder top pin 12b.
- the other end of the first link member 47 is rotatably attached to the distal end portion of the arm 7 via the first link pin 47a.
- the other end of the second link member 48 is rotatably attached to the bucket 8 via a second link pin 48a.
- the bucket cylinder 12 drives the bucket 8 by expanding and contracting by hydraulic pressure.
- the boom 6, the arm 7 and the bucket 8 are provided with a first angle detector 18A, a second angle detector 18B and a third angle detector 18C, respectively.
- the first angle detector 18A, the second angle detector 18B, and the third angle detector 18C are, for example, stroke sensors. By detecting the stroke lengths of the boom cylinder 10, the arm cylinder 11, and the bucket cylinder 12, they indirectly detect the rotation angle of the boom 6 with respect to the vehicle body 1, the rotation angle of the arm 7 with respect to the boom 6, and the rotation angle of the bucket 8 with respect to the arm 7.
- the first angle detection unit 18A detects the stroke length of the boom cylinder 10.
- based on the stroke length of the boom cylinder 10 detected by the first angle detector 18A, the processing device 20 described later calculates the rotation angle θ1 of the boom 6 with respect to the Zm axis of the coordinate system (Xm, Ym, Zm) of the excavator 100 shown in FIG. 3.
- the coordinate system of the excavator 100 is appropriately referred to as a vehicle body coordinate system. As shown in FIG. 2, for example, the origin of the vehicle body coordinate system is the center of the boom pin 13.
- the center of the boom pin 13 is the center of the cross section when the boom pin 13 is cut on a plane orthogonal to the direction in which the boom pin 13 extends, and the center in the direction in which the boom pin 13 extends.
- the vehicle body coordinate system is not limited to the example of the embodiment. For example, the turning center of the swing body 3 may be the Zm axis, the axis parallel to the direction in which the boom pin 13 extends may be the Ym axis, and the axis orthogonal to both may be the Xm axis.
- the second angle detector 18B detects the stroke length of the arm cylinder 11.
- the processing device 20 calculates the rotation angle θ2 of the arm 7 with respect to the boom 6 from the stroke length of the arm cylinder 11 detected by the second angle detection unit 18B.
- the third angle detection unit 18C detects the stroke length of the bucket cylinder 12.
- the processing device 20 calculates the rotation angle θ3 of the bucket 8 relative to the arm 7 from the stroke length of the bucket cylinder 12 detected by the third angle detection unit 18C.
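The rotation angles θ1, θ2, θ3 together with the lengths L1, L2, L3 determine the blade tip P3 in the vehicle body coordinate system by planar forward kinematics. A minimal sketch, assuming cumulative joint angles measured in the boom/arm/bucket plane (the angle reference and sign conventions are assumptions for illustration, not taken from the patent):

```python
import math

def blade_tip_position(theta1, theta2, theta3, L1, L2, L3):
    """Blade tip P3 in the working plane, origin at the boom pin 13.

    theta1: boom angle; theta2: arm angle relative to the boom;
    theta3: bucket angle relative to the arm (radians, cumulative).
    Returns (horizontal, vertical) coordinates in metres.
    """
    a1 = theta1
    a2 = theta1 + theta2
    a3 = theta1 + theta2 + theta3
    x = L1 * math.cos(a1) + L2 * math.cos(a2) + L3 * math.cos(a3)
    z = L1 * math.sin(a1) + L2 * math.sin(a2) + L3 * math.sin(a3)
    return x, z

# with all joints in a straight line, the reach is simply L1 + L2 + L3
x, z = blade_tip_position(0.0, 0.0, 0.0, 5.7, 2.9, 1.4)
```

The link lengths above are placeholders; in the system described here they would come from the machine dimensions L1, L2, L3 stored for the work implement 2.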
- the excavator 100 includes a plurality of imaging devices 30a, 30b, 30c, and 30d in the cab 4, for example.
- when the plurality of imaging devices 30a, 30b, 30c, and 30d are not distinguished, they are appropriately referred to as the imaging device 30.
- the imaging device 30a and the imaging device 30c are disposed on the work machine 2 side.
- the type of the imaging device 30 is not limited, but in the embodiment, for example, an imaging device including a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used.
- the imaging device 30a and the imaging device 30b are arranged in the cab 4 facing the same direction or different directions at a predetermined interval.
- the imaging device 30c and the imaging device 30d are arranged in the cab 4, for example, facing the same direction or different directions at a predetermined interval.
- a plurality of imaging devices 30a, 30b, 30c, and 30d are combined to form a stereo camera.
- a stereo camera is configured by a combination of the imaging devices 30a and 30b and a combination of the imaging devices 30c and 30d.
- the imaging device 30a and the imaging device 30b face upward, and the imaging device 30c and the imaging device 30d face downward.
- at least the imaging device 30a and the imaging device 30c face the front of the excavator 100, in the embodiment, the front of the swing body 3.
- the imaging device 30b and the imaging device 30d may be arranged slightly toward the work machine 2, that is, slightly toward the imaging device 30a and the imaging device 30c.
- the excavator 100 includes the four image pickup devices 30, but the number of the image pickup devices 30 included in the excavator 100 may be at least two, and is not limited to four. This is because the excavator 100 configures a stereo camera with at least a pair of the imaging devices 30 to capture the subject in stereo.
- the plurality of imaging devices 30a, 30b, 30c, and 30d are arranged at the front and upper side in the cab 4.
- the upward direction is a direction perpendicular to the ground contact surfaces of the crawler belts 5a, 5b of the excavator 100 and away from the ground contact surfaces.
- the ground contact surfaces of the crawler belts 5a and 5b are planes defined by at least three points that do not exist on the same straight line at a portion where at least one of the crawler belts 5a and 5b is grounded.
- the plurality of imaging devices 30a, 30b, 30c, and 30d capture, in stereo, a subject existing in front of the vehicle body 1 of the excavator 100.
- the target is, for example, a target excavated by the work machine 2.
- the processing device 20 illustrated in FIGS. 1 and 2 measures a target three-dimensionally using a result of stereo shooting performed by at least a pair of imaging devices 30.
- the place where the plurality of imaging devices 30a, 30b, 30c, and 30d are arranged is not limited to the front and upper side in the cab 4.
- FIG. 4 is a diagram illustrating an example of an image obtained by imaging a target by a plurality of imaging devices 30a, 30b, 30c, and 30d.
- FIG. 5 is a diagram illustrating an example of the target OJ imaged by the plurality of imaging devices 30a, 30b, 30c, and 30d.
- the images PIa, PIb, PIc, and PId shown in FIG. 4 are obtained by, for example, imaging the target OJ by the plurality of imaging devices 30a, 30b, 30c, and 30d shown in FIG.
- the target OJ has a first part OJa, a second part OJb, and a third part OJc.
- the image PIa is captured by the imaging device 30a
- the image PIb is captured by the imaging device 30b
- the image PIc is captured by the imaging device 30c
- the image PId is captured by the imaging device 30d. Since the pair of imaging devices 30a and 30b are arranged facing the upper side of the hydraulic excavator 100, the upper side of the target OJ is shown in the images PIa and PIb. Since the pair of imaging devices 30c and 30d are arranged facing the lower side of the excavator 100, the lower side of the target OJ is shown in the images PIc and PId.
- the images PIa and PIb captured by the pair of imaging devices 30a and 30b and the images PIc and PId captured by the pair of imaging devices 30c and 30d overlap in part of the region of the target OJ, namely the second portion OJb. That is, the imaging region of the pair of imaging devices 30a and 30b facing upward and the imaging region of the pair of imaging devices 30c and 30d facing downward have an overlapping portion.
- when performing stereo image processing on the images PIa, PIb, PIc, and PId of the same target OJ captured by the plurality of imaging devices 30a, 30b, 30c, and 30d, the processing device 20 obtains a first parallax image from the images PIa and PIb captured by the pair of imaging devices 30a and 30b.
- the processing device 20 obtains a second parallax image from the images PIc and PId captured by the pair of imaging devices 30c and 30d. Thereafter, the processing device 20 combines the first parallax image and the second parallax image to obtain one parallax image.
- the processing device 20 measures the target three-dimensionally using the obtained parallax image. As described above, the processing device 20 and the plurality of imaging devices 30a, 30b, 30c, and 30d measure the entire predetermined area of the target OJ in a three-dimensional manner with a single imaging.
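Combining the first parallax image and the second parallax image into one can be done in several ways; the following minimal sketch assumes each parallax image marks pixels without a valid disparity as NaN, and the merge policy (average where both are valid, otherwise take whichever is valid) is an illustrative choice, not the patent's:

```python
import numpy as np

def merge_parallax(first, second):
    """Merge two parallax images with overlapping coverage.

    Invalid pixels are NaN. Where only one image is valid its value is
    used; where both are valid the mean is taken; where neither is
    valid the result stays NaN.
    """
    out = np.where(np.isnan(first), second, first)
    both = ~np.isnan(first) & ~np.isnan(second)
    out[both] = (first[both] + second[both]) / 2.0
    return out
```

In the configuration described above, the overlap corresponds to the second portion OJb, which both the upward-facing and downward-facing pairs observe.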
- the imaging device 30c is used as the reference.
- the four imaging devices 30a, 30b, 30c, and 30d each have a coordinate system. These coordinate systems are appropriately referred to as imaging device coordinate systems.
- FIG. 2 shows only the coordinate system (Xs, Ys, Zs) of the imaging device 30c serving as a reference. The origin of the imaging device coordinate system is the center of each imaging device 30a, 30b, 30c, 30d.
- FIG. 6 is a diagram illustrating the calibration system 50 of the imaging apparatus according to the embodiment.
- the imaging apparatus calibration system 50 (hereinafter, appropriately referred to as the calibration system 50) includes a plurality of imaging apparatuses 30a, 30b, 30c, and 30d and the processing apparatus 20. These are provided in the vehicle body 1 of the excavator 100 as shown in FIGS. 1 and 2.
- the processing device 20 includes a processing unit 21, a storage unit 22, and an input / output unit 23.
- the processing unit 21 is realized by, for example, a processor such as a CPU (Central Processing Unit) and a memory.
- the processing unit 21 includes a search unit 21A and a determination unit 21B.
- the processing device 20 realizes a calibration method for the imaging apparatus according to the embodiment (hereinafter referred to as a calibration method as appropriate).
- the processing unit 21 reads and executes the computer program stored in the storage unit 22.
- This computer program is for causing the processing unit 21 to execute the calibration method according to the embodiment.
- the calibration method according to the embodiment corrects a positional deviation of the imaging device 30 when the imaging device 30 has moved due to some factor, so that at least three-dimensional measurement using the result of stereo shooting by the pair of imaging devices 30 can still be performed.
- the processing unit 21 of the processing device 20 executes the calibration method according to the embodiment.
- the imaging device 30c and the imaging device 30d on which the calibration method according to the embodiment is executed are referred to as a first imaging device 30c and a second imaging device 30d, respectively.
- the processing unit 21 changes the parameter that defines the attitude of the second imaging device 30d while keeping the distance between the first imaging device 30c and the second imaging device 30d, among the at least two imaging devices 30a, 30b, 30c, and 30d in the embodiment, constant.
- the processing unit 21 performs image processing on a pair of images obtained by the first imaging device 30c and the second imaging device 30d, in the embodiment stereo processing that searches for corresponding portions between the pair of images, and obtains the above-mentioned parameter based on the search result.
- the search unit 21A of the processing unit 21 executes the parameter change and search described above.
- the determination unit 21B of the processing unit 21 obtains the above-described parameter based on the search result.
- Image processing in the stereo system is a method of obtaining a distance to an object from two images obtained by observing the same object from two different imaging devices 30.
- the distance to the object is expressed as, for example, a distance image obtained by visualizing the distance information to the object by shading.
- when the processing device 20 executes the calibration method according to the embodiment, the processing device 20 performs stereo image processing on the pair of images captured by the pair of imaging devices 30 to obtain the target position, specifically, the coordinates of the target in a three-dimensional coordinate system.
- the processing device 20 can measure the target three-dimensionally using a pair of images obtained by capturing the same target with at least the pair of imaging devices 30. That is, at least a pair of the imaging device 30 and the processing device 20 measures a target three-dimensionally by a stereo method.
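For a rectified pair separated by a baseline B with focal length f expressed in pixels, the three-dimensional measurement reduces to similar triangles: depth Z = f·B/d for a disparity of d pixels, with the lateral coordinates following by back-projection. A minimal sketch (the variable names and the example values are illustrative):

```python
def stereo_point(u, v, disparity, f_px, baseline_m):
    """Back-project a left-image pixel (u, v) with disparity d into the
    left camera's 3D coordinates: Z = f*B/d, X = u*Z/f, Y = v*Z/f."""
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    z = f_px * baseline_m / disparity
    return (u * z / f_px, v * z / f_px, z)

# e.g. a 10-pixel disparity with f = 1000 px and B = 0.3 m gives Z = 30 m
point = stereo_point(100.0, 0.0, 10.0, 1000.0, 0.3)
```

This is why the predetermined distance B between the pair of imaging devices must be known and kept constant: any error in B or in the relative attitude propagates directly into the measured coordinates.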
- the storage unit 22 uses at least one of a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, and a magneto-optical disk.
- the storage unit 22 stores a computer program for causing the processing unit 21 to execute the calibration method according to the embodiment.
- the storage unit 22 stores information used when the processing unit 21 executes the calibration method according to the embodiment. This information includes, for example, the internal calibration data of each imaging device 30, the posture of each imaging device 30, the positional relationship between the imaging devices 30, and information necessary for obtaining the position of a part of the work implement 2 from the posture of the work implement 2.
- the input / output unit 23 is an interface circuit that connects the processing device 20 to other devices.
- the hub 51, the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C are connected to the input / output unit 23.
- the hub 51 is connected to the plurality of imaging devices 30a, 30b, 30c, and 30d. Results captured by the imaging devices 30a, 30b, 30c, and 30d are input to the input / output unit 23 via the hub 51.
- the processing unit 21 acquires the results captured by the imaging devices 30a, 30b, 30c, and 30d via the hub 51 and the input / output unit 23.
- the processing device 20 may be realized by dedicated hardware, or a plurality of processing circuits may cooperate to realize the function of the processing device 20.
- FIG. 7 is a diagram illustrating an example in which the blade tip P3 of the blade 9 of the bucket 8 is three-dimensionally measured using the pair of imaging devices 30L and 30R.
- 8 and 9 are diagrams illustrating a pair of images 32L and 32R obtained by the pair of imaging devices 30L and 30R.
- the processing device 20 illustrated in FIG. 6 obtains a target position by performing image processing by a stereo method on a pair of images captured by the pair of imaging devices 30.
- the pair of imaging devices 30 that capture the blade edge P3 are referred to as an imaging device 30L and an imaging device 30R.
- the pair of imaging devices 30L and 30R is the imaging device 30 included in the excavator 100 shown in FIG.
- FIG. 7 shows, as an imaging device 30L′ drawn with a two-dot chain line, a state in which the position of the imaging device 30L has moved due to some external factor.
- the imaging device 30L includes an imaging element 31L.
- the origin of the imaging device coordinate system (Xs, Ys, Zs) of the imaging device 30L, that is, the center of the imaging device 30L is defined as the optical center OCL.
- the Zs axis of the imaging device 30L is the optical axis of the imaging device 30L and passes through the optical center OCL.
- an image 32L including the target is obtained.
- the imaging device 30R includes an imaging element 31R.
- the origin of the imaging device coordinate system (Xs, Ys, Zs) of the imaging device 30R, that is, the center of the imaging device 30R is defined as the optical center OCR.
- the Zs axis of the imaging device 30R is the optical axis of the imaging device 30R and passes through the optical center OCR.
- the object whose position is obtained by the stereo method is the blade tip P3 of the bucket 8 shown in FIG.
- the imaging device 30L and the imaging device 30R capture the bucket 8, a pair of images 32L and 32R as shown in FIG. 8 are obtained.
- the imaging device 30L is disposed on the left side toward the bucket 8, and the imaging device 30R is disposed on the right side toward the bucket 8, separated from the imaging device 30L by a predetermined distance B. Because the imaging device 30L and the imaging device 30R are arranged apart from each other by this predetermined distance, the direction in which the same object is seen differs between the two devices depending on the position of the target observation point.
- the processing device 20 performs stereo image processing on the image 32L of the blade tip P3 of the bucket 8 imaged by the imaging device 30L and the image 32R of the blade tip P3 of the bucket 8 imaged by the imaging device 30R.
- the position of the blade tip P3 of the bucket 8 that is the same target is three-dimensionally measured by image processing using the stereo method.
- the stereo image processing includes a step of generating the parallax image 33 from the pair of images 32L and 32R and a step of three-dimensionally measuring the imaging range of the imaging devices 30L and 30R based on the parallax information included in the parallax image 33.
- as shown in FIG. 8, the processing device 20 searches for corresponding portions between the pair of images 32L and 32R, in the embodiment the pixels PXl and PXr corresponding to the cutting edge P3.
- the parallax is obtained from the search result of the corresponding pixels PXl and PXr.
- the parallax is information indicating how far apart in the images the pixels PXl and PXr corresponding to the cutting edge P3 are, for example, by how many pixels they are separated.
- the parallax image 33 is an image expressing parallax in a two-dimensional array.
- the parallax is generally defined as a change amount of an angle formed by the line of sight of the pair of imaging devices 30 at a reference point to be measured.
- in practice, the parallax is obtained as the pixel shift, within the image, of the projection point of a measurement point captured by the other imaging device 30 relative to the projection point of the same measurement point in the image of the imaging device serving as the reference.
- in the parallax image 33, 0 is stored in a pixel PXs for which the search for the corresponding pixel failed, and a numerical value greater than 0 is stored in a pixel PXs for which the search succeeded.
- a pixel PXs storing 0 appears black, and a pixel PXs storing a numerical value larger than 0 appears as grayscale. Therefore, to confirm whether or not the image processing by the stereo method has succeeded, the ratio of pixels PXs storing numerical values other than 0 in the parallax image 33 may be compared with a threshold.
- the threshold value can be, for example, 80% to 90%, but is not limited to a value within this range.
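- The success check described here can be sketched in Python as follows; the function names are illustrative (the patent does not name them), and the parallax image 33 is modeled as a 2-D list in which 0 marks a pixel whose correspondence search failed.

```python
def grayscale_ratio(parallax_image):
    """Proportion of pixels storing a value other than 0 (grayscale pixels)."""
    pixels = [value for row in parallax_image for value in row]
    return sum(1 for value in pixels if value > 0) / len(pixels)

def stereo_processing_succeeded(parallax_image, threshold=0.8):
    """Stereo processing is judged successful when the grayscale ratio
    is at least the threshold (80% here, per the example range)."""
    return grayscale_ratio(parallax_image) >= threshold
```

For a 2 x 2 parallax image [[5, 0], [3, 7]] the ratio is 0.75, so an 80% threshold reports failure.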
- the processing apparatus 20 obtains the distance to the object using triangulation in the three-dimensional measurement process.
- a three-dimensional coordinate system (X, Y, Z) with the optical center OCL of the imaging device 30L as the origin is set.
- the imaging device 30L and the imaging device 30R are assumed to be arranged in parallel. That is, it is assumed that the image planes of the images 32L and 32R are the same and the positions in the X-axis direction are the same between the imaging device 30L and the imaging device 30R.
- let the distance between the optical center OCL of the imaging device 30L and the optical center OCR of the imaging device 30R be B, the Y coordinate of the cutting edge P3 in the image 32L captured by the imaging device 30L, that is, of the pixel PXl, be YL, the Y coordinate of the blade tip P3 in the image 32R captured by the imaging device 30R, that is, of the pixel PXr, be YR, and the Z coordinate of the blade tip P3 be ZP.
- YL, YR, and ZP are all coordinates in the three-dimensional coordinate system (X, Y, Z).
- the distance between the Y axis and the image planes of the images 32L and 32R is the focal length f of the imaging devices 30L and 30R.
- in each pixel PXs of the parallax image 33 shown in FIG. 9, information indicating whether or not the search succeeded and, when it succeeded, the parallax d are stored.
- the processing device 20 can obtain the distance to the object using the parallax d of the successfully searched pixels in the images 32L and 32R, the coordinates of those pixels, and the focal length f of the imaging devices 30L and 30R.
- the processing device 20 searches for a corresponding pixel between the pair of images 32L and 32R and generates a parallax image 33.
- the processing device 20 searches for the pixels PXl and PXr corresponding to the cutting edge P3 whose distance is to be obtained.
- the processing device 20 obtains YL and YR which are Y coordinates of the searched pixels PXl and PXr.
- the processing device 20 uses the obtained parallax d, distance B, and focal length f to obtain the distance ZP from the imaging devices 30L, 30R to the blade tip P3 by the above-described equation.
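- Although the equation itself is not reproduced in this excerpt, the relation it refers to for a parallel stereo pair is the standard triangulation formula ZP = B * f / d, where the parallax d = YL - YR. A minimal sketch, with a hypothetical function name:

```python
def depth_from_parallax(baseline_b, focal_length_f, parallax_d):
    """Distance ZP to the target for a parallel stereo pair: ZP = B * f / d."""
    if parallax_d <= 0:
        raise ValueError("parallax must be positive for a valid correspondence")
    return baseline_b * focal_length_f / parallax_d
```

For example, a baseline B of 0.5 m, a focal length f of 1000 pixels, and a parallax d of 40 pixels give ZP = 12.5 m.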
- FIG. 10 is a perspective view showing the positional relationship between the pair of imaging devices 30L and 30R.
- the pair of imaging devices 30L and 30R constitutes a stereo camera.
- one imaging device, 30R, is used as the base device, and the other imaging device, 30L, is used as the reference device.
- a straight line connecting the optical center OCR of the imaging device 30R and the optical center OCL of the imaging device 30L is a baseline BL.
- the length of the baseline BL is B.
- when the imaging device 30L is not arranged in parallel with the imaging device 30R, a corresponding pixel may not be found between the pair of images 32L and 32R. For this reason, the relative positional relationship between the imaging device 30L and the imaging device 30R is obtained in advance. Stereo image processing and three-dimensional measurement are then enabled by correcting at least one of the images 32L and 32R based on the deviation between the two devices derived from that relative positional relationship.
- the deviation between the imaging device 30L and the imaging device 30R can be represented as the deviation of the reference device with respect to the base device, that is, the deviation of the imaging device 30L with respect to the imaging device 30R.
- this deviation includes the rotation RTx around the Xs axis of the imaging device 30L, the rotation RTy around the Ys axis, the rotation RTz around the Zs axis, and the deviations of the imaging device 30L in the Xs-, Ys-, and Zs-axis directions.
- FIG. 11 is an explanatory diagram of the shift of the imaging device 30L with respect to the imaging device 30R.
- in FIG. 11, for example, when the rotation RTz around the Zs axis occurs in the imaging device 30L, an image 32Lr obtained in the deviated posture of the imaging device 30L can be corrected to the image 32L of the undeviated imaging device 30L by rotating it around the Zs axis by the amount of the deviation due to RTz.
- the deviation due to the rotation RTz can be expressed as an angle γ around the Zs axis. Therefore, by rotating the position (xs, ys) of the image 32Lr of the imaging device 30L on the xs-ys plane around the Zs axis using equation (1), it can be converted to the position (Xs, Ys) on the Xs-Ys plane of the image 32L of the undeviated imaging device 30L.
- the deviation due to the rotation RTx around the Xs axis is corrected by the expression (2)
- the deviation due to the rotation RTy around the Ys axis is corrected by the expression (3).
- the angle ⁇ in the equation (2) represents a deviation due to the rotation RTx
- the angle ⁇ in the equation (3) represents a deviation due to the rotation RTy.
- the angles ⁇ , ⁇ , and ⁇ are amounts for correcting a shift in the rotational direction around the axis in the imaging device coordinate system of the imaging device 30L.
- the angles ⁇ , ⁇ , ⁇ are appropriately referred to as rotational direction correction amounts ⁇ , ⁇ , ⁇ or simply rotational direction correction amounts.
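- Equations (1) to (3) are not reproduced in this excerpt; assuming equation (1) is the standard planar rotation about the Zs axis, the correction for the rotation RTz can be sketched as follows (the function name is illustrative):

```python
import math

def correct_rtz(xs, ys, gamma):
    """Rotate an image position (xs, ys) about the Zs axis by the rotational
    direction correction amount gamma, yielding the corrected (Xs, Ys)."""
    corrected_xs = xs * math.cos(gamma) - ys * math.sin(gamma)
    corrected_ys = xs * math.sin(gamma) + ys * math.cos(gamma)
    return corrected_xs, corrected_ys
```

The corrections for RTx and RTy (equations (2) and (3)) follow the same pattern about the Xs and Ys axes.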
- a displacement of the imaging device 30L in the Xs-axis direction of the imaging device 30R is corrected by moving the position of the image 32Lr captured by the imaging device 30L in parallel with that Xs-axis direction by an amount ΔX that cancels the displacement.
- deviations of the imaging device 30L in the Ys-axis and Zs-axis directions of the imaging device 30R are corrected in the same manner: the position of the image 32Lr captured by the imaging device 30L is moved in parallel with the Ys-axis and Zs-axis directions of the imaging device 30R by amounts ΔY and ΔZ that cancel those deviations.
- the amounts ⁇ X, ⁇ Y, ⁇ Z for canceling the shift are amounts for correcting the shift in the translation direction of the pair of imaging devices 30.
- the amounts ⁇ X, ⁇ Y, ⁇ Z for canceling the deviation are appropriately referred to as translation direction correction amounts ⁇ X, ⁇ Y, ⁇ Z or simply as translation direction correction amounts.
- the external calibration is performed, for example, when the excavator 100 is shipped from the factory.
- the rotational direction correction amounts ⁇ , ⁇ , ⁇ and translational direction correction amounts ⁇ X, ⁇ Y, ⁇ Z that are obtained in the external calibration are parameters that define the attitude of the imaging device 30.
- these parameters are referred to as posture parameters as appropriate.
- the posture parameter is a six-dimensional parameter.
- the posture parameters obtained by the external calibration are stored in the storage unit 22 of the processing device 20 shown in FIG.
- the processing device 20 performs stereo image processing on the images captured by at least the pair of imaging devices 30 using the orientation parameters stored in the storage unit 22, and three-dimensionally measures the captured object.
- the relative positional shift is corrected by the above-described method.
- when an imaging device 30 that was calibrated after being attached to the excavator 100 is physically moved by some external factor, the posture parameters obtained before the move may no longer correspond to the actual posture of the imaging device 30.
- FIGS. 12 and 13 are diagrams illustrating a pair of images 32L and 32R obtained by the pair of imaging devices 30L and 30R.
- FIGS. 12 and 13 show a pair of images 32L ′ and 32R captured by the imaging device 30R shown in FIG. 7 and the imaging device 30L ′ moved by an external factor.
- as the imaging device 30L′ shown in FIG. 7, the imaging device 30L that was arranged in parallel with the imaging device 30R has, for example, been rotated around the Xs axis of the imaging device coordinate system, changing the direction in which the imaging surface of the imaging element 31L′ faces.
- in the image 32L′ captured by the imaging device 30L′ in this state, the position of the blade edge P3 has moved in the direction indicated by the arrow Lt, that is, toward the left side of the image, compared with the image 32L captured by the imaging device 30L before it was moved by the external factor.
- when the processing device 20 searches for the pixel PXl′ and the pixel PXr corresponding to the cutting edge P3 between the pair of images 32L′ and 32R, the search fails. Therefore, as shown in FIG. 13, in the parallax image 33′ obtained from the search between the pair of images 32L′ and 32R, the proportion of pixels storing 0, indicating that the search for the corresponding pixel failed, is high. In the parallax image 33′, the proportion of grayscale pixels is low and the proportion of black pixels PXs is high over the entire image, and stereo three-dimensional measurement cannot be realized.
- the posture parameter may be obtained again by external calibration, but it takes time and labor to construct the equipment for external calibration and to perform external calibration.
- when the posture of an imaging device 30 changes, the calibration system 50 shown in FIG. 6 executes the calibration method according to the embodiment and recalculates the posture parameters, thereby automatically correcting the deviation between the plurality of imaging devices 30 and restoring stereo three-dimensional measurement.
- this process is appropriately referred to as automatic calibration.
- FIG. 14 is a flowchart showing processing when the calibration system 50 according to the embodiment executes the calibration method according to the embodiment.
- FIG. 15 is a diagram for explaining a method of determining an imaging device for obtaining a posture parameter.
- FIG. 16 is a diagram illustrating an example of a table for determining an imaging device for which posture parameters are obtained.
- in step S101, the processing device 20 causes all of the plurality of imaging devices 30 illustrated in FIG. 2 to image the target.
- the target can be the bucket 8, but is not limited to this.
- in step S102, the processing device 20 performs stereo image processing on the images captured in step S101. Specifically, image processing by the stereo method is performed on the images captured by each pair of imaging devices 30 constituting a stereo camera. This image processing generates a parallax image from a pair of images.
- in step S102, the processing device 20 generates a parallax image from each of the pairs of images obtained by all the combinations of the plurality of imaging devices 30 included in the excavator 100 that can form a stereo camera.
- the excavator 100 includes four imaging devices 30a, 30b, 30c, and 30d.
- the processing device 20 generates a parallax image from six pairs of images obtained from the following six combinations R1, R2, R3, R4, R5, and R6:
- R1: Imaging device 30a and imaging device 30b
- R2: Imaging device 30a and imaging device 30c
- R3: Imaging device 30a and imaging device 30d
- R4: Imaging device 30b and imaging device 30c
- R5: Imaging device 30b and imaging device 30d
- R6: Imaging device 30c and imaging device 30d
- when parallax images are generated by the six combinations described above, each of the imaging devices 30a, 30b, 30c, and 30d takes part in generating a parallax image three times.
- when the proportion of grayscale pixels in a parallax image is equal to or greater than the threshold, the parallax image is determined to be normal.
- the magnitude of the threshold is as described above.
- the processing device 20 uses a determination table TB illustrated in FIG. 16 in order to determine the imaging device 30 for obtaining the posture parameter from the six parallax images obtained by the six combinations R1 to R6.
- the determination table TB is stored in the storage unit 22 of the processing device 20.
- in the determination table TB, 1 is written for an imaging device 30 belonging to a combination for which a normal parallax image was generated, and 0 is written for an imaging device 30 belonging to a combination for which a normal parallax image was not generated.
- the total number of times 1 was written for each of the imaging devices 30a, 30b, 30c, and 30d is entered in the total column.
- in this way, the determination table TB makes it possible to know how many times each of the imaging devices 30a, 30b, 30c, and 30d took part in generating a normal parallax image.
- the processing unit 21 writes a value in the determination table TB.
- 1 or 0 is written according to the following rules:
- (1) When the parallax image generated by the combination R1 is normal, 1 is written for the imaging devices 30a and 30b.
- (2) When the parallax image generated by the combination R2 is normal, 1 is written for the imaging devices 30a and 30c.
- (3) When the parallax image generated by the combination R3 is normal, 1 is written for the imaging devices 30a and 30d.
- (4) When the parallax image generated by the combination R4 is normal, 1 is written for the imaging devices 30b and 30c.
- (5) When the parallax image generated by the combination R5 is normal, 1 is written for the imaging devices 30b and 30d.
- (6) When the parallax image generated by the combination R6 is normal, 1 is written for the imaging devices 30c and 30d.
- the determination table TB shown in FIG. 16 shows a case where the parallax images generated by the combinations R2, R3, and R6 are normal and the parallax images generated by the combinations R1, R4, and R5 are not normal.
- as described in the total column of the determination table TB, 1 was written twice for each of the imaging devices 30a, 30c, and 30d, while 1 was written zero times for the imaging device 30b. Because the imaging device 30b has an unacceptable deviation with respect to the imaging devices 30a, 30c, and 30d, none of its combinations generated a normal parallax image. For this reason, the imaging device 30b is the target for obtaining the posture parameters again.
- thus the determination table TB identifies the imaging device 30 whose posture parameters must be obtained using the number of times 1 was written, that is, the number of times a normal parallax image was generated from that imaging device's results. In other words, the processing device 20 determines the pair of imaging devices 30 whose posture parameters are to be obtained based on the parallax image, which is the result of searching for corresponding portions between the pair of images obtained by a pair among the at least two imaging devices 30.
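- The counting logic of the determination table TB can be sketched as follows; the function name and the pair representation are illustrative, not from the patent:

```python
from itertools import combinations

def cameras_needing_new_parameters(cameras, normal_pairs):
    """Count, per camera, how many of its stereo pairs produced a normal
    parallax image (the 'total' column), and return the cameras whose
    total is 0, i.e. the ones assumed to have an unacceptable deviation."""
    totals = {camera: 0 for camera in cameras}
    for a, b in combinations(cameras, 2):  # the six combinations R1..R6
        if (a, b) in normal_pairs or (b, a) in normal_pairs:
            totals[a] += 1
            totals[b] += 1
    return [camera for camera, total in totals.items() if total == 0]
```

With the example above (R2, R3, and R6 normal), only the imaging device 30b ends with a total of 0.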
- the above-described method for determining the pair of imaging devices 30 for obtaining the posture parameters described in the embodiment is an example, and the present invention is not limited to this.
- in step S103, the processing device 20 counts, for each of the imaging devices 30a, 30b, 30c, and 30d, the number of times a normal parallax image was generated, using the determination table TB.
- in step S104, the processing device 20 determines, based on the number of times a normal parallax image was generated, the imaging device 30 whose posture parameters must be obtained again because a deviation has occurred.
- in the embodiment, the processing device 20 obtains the posture parameters again for an imaging device 30 whose search success rate is less than the threshold, that is, an imaging device 30 that is not part of any pair for which a normal parallax image was generated.
- the processing device 20 executes processing for obtaining the posture parameter.
- in step S105, the processing device 20 changes the posture parameter.
- in step S106, the search unit 21A of the processing device 20 uses the changed posture parameter to perform image processing by the stereo method on the pair of images captured by the imaging device 30 whose posture parameter is being obtained again and its paired imaging device 30.
- the pair of images that are subjected to image processing by the stereo method are images captured in step S101.
- the stereo image processing is processing for generating a parallax image from a pair of images.
- when step S106 ends, in step S107 the determination unit 21B of the processing unit 21 compares the grayscale ratio SR, which is the proportion of grayscale pixels, that is, pixels storing a numerical value other than 0, in the parallax image generated in step S106, with the threshold SRc.
- Step S107 is a process of determining the success rate of the stereo image processing.
- the magnitude of the threshold SRc can be set to, for example, 80% to 90%, but is not limited to a value within this range.
- in step S107, when the grayscale ratio SR is smaller than the threshold SRc (step S107, No), the determination unit 21B of the processing device 20 returns to step S105 and repeats steps S105 through S107 until the grayscale ratio SR becomes equal to or greater than the threshold SRc.
- in step S107, when the grayscale ratio SR of the parallax image is equal to or greater than the threshold SRc (step S107, Yes), the determination unit 21B of the processing device 20 adopts the posture parameter at that time as the new posture parameter in step S108. Thereafter, image processing by the stereo method is executed using the posture parameters determined in step S108.
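- The loop over steps S105 to S108 can be sketched as below. The callables generate_parallax and grayscale_ratio stand in for the stereo processing and the SR computation described in the text, and the candidate-stepping scheme is an assumption of this sketch:

```python
def auto_calibrate(initial_parameter, candidate_steps, generate_parallax,
                   grayscale_ratio, sr_threshold=0.8):
    """Repeat: generate a parallax image (S106), compare SR with SRc (S107),
    and either adopt the current posture parameter (S108) or change it
    again (S105)."""
    parameter = initial_parameter
    for step in candidate_steps:
        parallax_image = generate_parallax(parameter)          # S106
        if grayscale_ratio(parallax_image) >= sr_threshold:    # S107: Yes
            return parameter                                   # S108
        parameter = step(parameter)                            # back to S105
    raise RuntimeError("no posture parameter candidate reached SRc")
```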
- in the embodiment, for a pair of imaging devices 30 whose posture parameters are to be changed, the processing device 20 changes the posture parameter of one device while leaving that of the other unchanged, and performs stereo image processing on the pair of images they capture.
- of the pair of imaging devices 30 whose posture parameters are to be changed, the one whose posture parameter is not changed is referred to as the first imaging device, and the one whose posture parameter is changed is referred to as the second imaging device.
- in the embodiment, the pair whose posture parameters are changed consists of the imaging device 30c and the imaging device 30d illustrated in FIG. 2, and the posture parameter of the imaging device 30d is the one that is changed. Therefore, the imaging device 30c is the first imaging device and the imaging device 30d is the second imaging device.
- the imaging device 30c is appropriately referred to as a first imaging device 30c
- the imaging device 30d is appropriately referred to as a second imaging device 30d.
- FIG. 17 to FIG. 21 are diagrams for explaining the posture parameters.
- the posture parameters are the rotation direction correction amounts ⁇ , ⁇ , ⁇ and the translation direction correction amounts ⁇ X, ⁇ Y, ⁇ Z.
- in obtaining the posture parameters, the processing device 20 changes a first parameter that defines the positional relationship in the translation direction between the first imaging device 30c and the second imaging device 30d, and a second parameter that defines the posture of the second imaging device 30d in its imaging device coordinate system.
- both the first parameter and the second parameter, that is, the parameters that define the posture of the second imaging device 30d, represent rotations of the second imaging device 30d.
- the processing device 20 changes the rotation direction correction amounts ⁇ , ⁇ , ⁇ and the translation direction correction amounts ⁇ X, ⁇ Y, ⁇ Z, which are posture parameters, by changing the first parameter and the second parameter.
- the second parameters are the angles α′, β′, and γ′, as shown in FIG. 17.
- Angles ⁇ ′, ⁇ ′, and ⁇ ′ are rotation angles of the second imaging device 30d around each axis of the imaging device coordinate system (Xs, Ys, Zs) of the second imaging device 30d.
- the first parameter is the angle ⁇ shown in FIGS. 18 and 19 and the angle ⁇ shown in FIGS. 20 and 21.
- the angle ⁇ is an angle formed between the base line BL and the Zs axis of the imaging device coordinate system (Xs, Ys, Zs) of the second imaging device 30d.
- the angle ⁇ is an angle formed between the base line BL and the Xs axis of the imaging device coordinate system (Xs, Ys, Zs) of the second imaging device 30d.
- when the angles δ and ε, which are the first parameters, are changed, the second imaging device 30d rotates around the first imaging device 30c, more specifically around the origin of the imaging device coordinate system of the first imaging device 30c (which in this example coincides with the optical center OCc). That is, the first parameters rotate the second imaging device 30d around the first imaging device 30c.
- when the angles α′, β′, and γ′, which are the second parameters, are changed, the second imaging device 30d rotates around its own center, more specifically around the origin of the imaging device coordinate system of the second imaging device 30d (which in this example coincides with the optical center OCd). That is, the second parameters rotate the second imaging device 30d around its own center.
- both the first parameter and the second parameter are parameters that define the attitude of the second imaging device 30d.
- by defining the posture of the second imaging device 30d, the relative positional relationship between the first imaging device 30c and the second imaging device 30d is defined.
- the processing device 20 changes the parameters that define the posture of the second imaging device 30d while keeping constant the distance between the first imaging device 30c and the second imaging device 30d, that is, the length B of the baseline BL between them.
- the base line BL between the first imaging device 30c and the second imaging device 30d is a straight line connecting the optical center OCc of the first imaging device 30c and the optical center OCd of the second imaging device 30d.
- when the second imaging device 30d rotates around the first imaging device 30c, the translational component of the second imaging device 30d also changes. Therefore, changing the first and second parameters changes the rotation direction correction amounts α, β, γ and the translation direction correction amounts ΔX, ΔY, ΔZ, which are the posture parameters.
- by using the angles δ and ε as the first parameters with the length of the baseline BL held constant, the number of parameters to be changed when obtaining the posture parameters can be reduced. As a result, the calculation load of the processing device 20 is reduced, which is preferable.
- when the angles δ and ε, which are the first parameters, and the angles α′, β′, and γ′, which are the second parameters, are obtained, the relative positional relationship between the first imaging device 30c and the second imaging device 30d is obtained.
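- One way to see why holding the baseline length B constant reduces the search is that the second optical center is confined to a sphere of radius B around the first, so two angles suffice to fix its position. Treating δ and ε as spherical angles of the baseline direction is an assumption of this sketch (the patent defines them against the Zs and Xs axes of the second imaging device), and the function name is illustrative:

```python
import math

def second_optical_center(first_center, baseline_b, delta, epsilon):
    """Place the second optical center at distance B from the first,
    with its direction given by the two angles delta and epsilon."""
    dx = baseline_b * math.sin(delta) * math.cos(epsilon)
    dy = baseline_b * math.sin(delta) * math.sin(epsilon)
    dz = baseline_b * math.cos(delta)
    return (first_center[0] + dx, first_center[1] + dy, first_center[2] + dz)
```

Whatever the angles, the distance to the first center stays B, which is why the search runs over five quantities (δ, ε, α′, β′, γ′) instead of six.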
- the processing device 20 generates a parallax image while changing the first parameter and the second parameter until the grayscale ratio SR of the parallax image is equal to or greater than the threshold SRc.
- in the embodiment, the processing device 20 changes the angles δ and ε and the angles α′, β′, and γ′ in predetermined increments, in both the positive and negative directions up to a predetermined amount, using the pre-change values as the reference. FIGS. 17 to 21 exemplarily show the angles δ and ε and the angles α′, β′, and γ′ being changed in the positive and negative directions.
- the processing device 20 generates a parallax image from the pair of images captured by the first imaging device 30c and the second imaging device 30d using the changed angles δ and ε and angles α′, β′, and γ′.
- specifically, the processing device 20 obtains the rotation direction correction amounts α, β, γ and the translation direction correction amounts ΔX, ΔY, ΔZ, which are the posture parameters, from the changed angles δ and ε and angles α′, β′, and γ′, and generates a parallax image using the obtained posture parameters.
- the processing device 20 compares the gray scale ratio SR of the generated parallax image with the threshold value SRc.
- when the grayscale ratio SR of the parallax image is equal to or greater than the threshold SRc, the processing device 20 obtains the rotation direction correction amounts α, β, γ and the translation direction correction amounts ΔX, ΔY, ΔZ, which are the posture parameters, from the first and second parameters at that point.
- thereafter, images captured by the imaging devices 30 are subjected to stereo image processing using the newly obtained rotation direction correction amounts α, β, γ and translation direction correction amounts ΔX, ΔY, ΔZ, and three-dimensional measurement is performed.
- when the three imaging devices 30b, 30c, and 30d shown in FIG. 15 are the targets for changing the posture parameters, there are three combinations: the imaging device 30c with the imaging device 30b, the imaging device 30c with the imaging device 30d, and the imaging device 30d with the imaging device 30b.
- in this case, one of the three imaging devices 30b, 30c, and 30d is the first imaging device and the remaining two are second imaging devices. Two pairs sharing the first imaging device are then formed, and the processing device 20 obtains new posture parameters for each pair.
- the imaging device 30c is a first imaging device, and the imaging devices 30b and 30d are second imaging devices. Then, a combination of the imaging device 30c and the imaging device 30b and a combination of the imaging device 30c and the imaging device 30d are established.
- the processing device 20 changes the orientation parameter of the imaging device 30b for the former combination, and changes the orientation parameter of the imaging device 30d for the latter combination.
- the method for obtaining the posture parameter when the three imaging devices 30 are targets for changing the posture parameter is not limited to the above-described method.
- for example, the processing device 20 may first determine the posture parameter of the imaging device 30b in the combination of the imaging device 30c and the imaging device 30b, and then, with the imaging device 30b as the first imaging device and the imaging device 30d as the second imaging device, determine the posture parameter of the imaging device 30d as described above.
- when the four imaging devices 30a, 30b, 30c, and 30d shown in FIG. 15 are the targets for changing the posture parameters, two pairs are formed: a first combination consisting of the imaging device 30a and the imaging device 30b, and a second combination consisting of the imaging device 30c and the imaging device 30d.
- in the first combination, either device may serve as the first imaging device with the other as the second imaging device, and likewise in the second combination.
- the processing device 20 obtains a new posture parameter by changing the posture parameter of the second imaging device in each of the first combination and the second combination.
- the calibration system 50 and the calibration method according to the embodiment process as follows when a positional deviation due to an external factor occurs in at least one of the at least two imaging devices 30 included in the excavator 100, which is a work machine. That is, they obtain new posture parameters by changing the posture parameters while keeping constant the distance between the first imaging device and the second imaging device among the at least two imaging devices 30, based on the parallax image that results from searching for corresponding portions between the pair of images obtained by the first imaging device and the second imaging device.
- at least one of the first imaging device and the second imaging device is an imaging device in which a positional shift has occurred due to an external factor.
- the calibration system 50 and the calibration method according to the embodiment can calibrate the imaging device 30 included in the hydraulic excavator 100 that is a work machine.
- since the calibration system 50 and the calibration method according to the embodiment do not require dedicated calibration equipment to be set up, a positional shift of the imaging device 30 that has occurred at the user's site where the hydraulic excavator 100 is used can be corrected easily and quickly.
- the calibration system 50 and the calibration method according to the embodiment can correct the displacement of the imaging device 30 even in a place where there is no facility for calibrating the imaging device 30, and thus can suppress interruption of work.
- the calibration system 50 and the calibration method according to the embodiment also have an advantage that the positional deviation of the imaging device 30 can be corrected easily and in a short time by software processing without moving the imaging device 30 in which the positional deviation has occurred.
- the calibration system 50 and the calibration method according to the embodiment determine the imaging device 30 for which the attitude parameter needs to be obtained, based on the result of searching for corresponding portions between a pair of images obtained by a pair of imaging devices 30 among the at least two imaging devices 30, that is, based on the grayscale ratio occupied in the parallax image. Specifically, an imaging device 30 that can never produce a normal parallax image is the imaging device 30 for which the attitude parameter needs to be obtained, that is, the imaging device 30 in which an unacceptable positional shift has occurred. For this reason, the calibration system 50 and the calibration method according to the embodiment can determine the imaging device 30 for which the attitude parameter needs to be obtained easily and reliably.
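The grayscale-ratio test described in this bullet can be sketched as follows. The function names, the encoding of failed pixels as a single fixed value, and the concrete threshold are assumptions (the patent names the symbols SR for the grayscale ratio and SRc for the threshold but does not give a value):

```python
def grayscale_ratio(disparity_image, invalid=0):
    """Fraction of pixels in the parallax (disparity) image for which the
    correspondence search failed; failed pixels are assumed to be
    rendered with one fixed grayscale value."""
    pixels = [px for row in disparity_image for px in row]
    return sum(1 for px in pixels if px == invalid) / len(pixels)

def needs_attitude_parameter(disparity_image, threshold=0.3, invalid=0):
    """True when the pair cannot produce a normal parallax image, i.e.
    the failure ratio SR exceeds the threshold SRc (0.3 is an assumed
    illustrative value)."""
    return grayscale_ratio(disparity_image, invalid) > threshold
```

A pair flagged by this test is the one whose second imaging device must have its attitude parameter re-estimated.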
- the work machine is not limited to the hydraulic excavator 100, as long as it includes at least a pair of imaging devices and measures an object three-dimensionally using that pair; it may be a work machine such as a wheel loader or a bulldozer.
- the processing for obtaining the attitude parameter may be executed by a processing device outside the hydraulic excavator 100. In this case, the images captured by the imaging devices 30 are sent to the external processing device by communication, for example.
Abstract
Description
FIG. 1 is a perspective view of a hydraulic excavator 100 provided with the imaging-device calibration system according to the embodiment. FIG. 2 is a perspective view showing the vicinity of the operator's seat of the hydraulic excavator 100 according to the embodiment. FIG. 3 is a diagram showing the dimensions of the work implement 2 of the hydraulic excavator and the coordinate system of the hydraulic excavator 100.
As shown in FIG. 2, the hydraulic excavator 100 has a plurality of imaging devices 30a, 30b, 30c, 30d, for example inside the cab 4. Hereinafter, when the imaging devices 30a, 30b, 30c, 30d need not be distinguished, they are referred to simply as imaging devices 30. The imaging devices 30a and 30c are arranged on the work implement 2 side. The type of imaging device 30 is not limited; in the embodiment, for example, an imaging device equipped with a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used.
FIG. 6 is a diagram showing the imaging-device calibration system 50 according to the embodiment. The calibration system 50 for imaging devices (hereinafter referred to as the calibration system 50 as appropriate) includes the plurality of imaging devices 30a, 30b, 30c, 30d and a processing device 20. As shown in FIGS. 1 and 2, these are mounted on the vehicle body 1 of the hydraulic excavator 100. The processing device 20 has a processing unit 21, a storage unit 22, and an input/output unit 23. The processing unit 21 is realized, for example, by a processor such as a CPU (Central Processing Unit) and a memory. The processing unit 21 has a search unit 21A and a determination unit 21B. The processing device 20 implements the imaging-device calibration method according to the embodiment (hereinafter referred to as the calibration method as appropriate). In this case, the processing unit 21 reads and executes a computer program stored in the storage unit 22. This computer program causes the processing unit 21 to execute the calibration method according to the embodiment.
FIG. 7 is a diagram explaining an example in which the cutting edge P3 of the blade 9 of the bucket 8 is measured three-dimensionally using a pair of imaging devices 30L, 30R. FIGS. 8 and 9 show a pair of images 32L, 32R obtained by the pair of imaging devices 30L, 30R. In the embodiment, the processing device 20 shown in FIG. 6 obtains the position of the object by applying stereo image processing to the pair of images captured by the pair of imaging devices 30. In FIG. 7, the pair of imaging devices 30 that image the cutting edge P3 are referred to as imaging device 30L and imaging device 30R. The pair of imaging devices 30L, 30R are imaging devices 30 of the hydraulic excavator 100 shown in FIG. 2. FIG. 7 also shows, as a two-dot chain line labeled 30L', a state in which the position of the imaging device 30L has moved due to some external factor.
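The three-dimensional measurement in FIG. 7 follows the standard parallel-stereo relation Z = f · BL / d, with focal length f, baseline BL, and parallax d as listed in the reference signs. A minimal sketch in Python (the function name and units are illustrative, not from the patent):

```python
def depth_from_parallax(f_pixels, baseline_m, parallax_px):
    """Depth Z = f * BL / d for a parallel stereo pair.

    f_pixels: focal length f expressed in pixels
    baseline_m: baseline BL, the distance between the optical centers
    parallax_px: parallax d of the matched point, in pixels
    """
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a valid match")
    return f_pixels * baseline_m / parallax_px

# e.g. f = 1000 px, BL = 0.3 m, d = 60 px  ->  Z = 5.0 m
print(depth_from_parallax(1000.0, 0.3, 60.0))  # 5.0
```

This is also why a positional shift of one camera is damaging: the measured parallax no longer corresponds to the assumed baseline geometry, so the computed depth is wrong until the attitude parameter is recalibrated.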
R1: imaging device 30a and imaging device 30b
R2: imaging device 30a and imaging device 30c
R3: imaging device 30a and imaging device 30d
R4: imaging device 30b and imaging device 30c
R5: imaging device 30b and imaging device 30d
R6: imaging device 30c and imaging device 30d
(1) When the parallax image generated by the combination R1 is normal, 1 is written for the imaging devices 30a and 30b
(2) When the parallax image generated by the combination R2 is normal, 1 is written for the imaging devices 30a and 30c
(3) When the parallax image generated by the combination R3 is normal, 1 is written for the imaging devices 30a and 30d
(4) When the parallax image generated by the combination R4 is normal, 1 is written for the imaging devices 30b and 30c
(5) When the parallax image generated by the combination R5 is normal, 1 is written for the imaging devices 30b and 30d
(6) When the parallax image generated by the combination R6 is normal, 1 is written for the imaging devices 30c and 30d
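The table-filling rules (1) to (6) above can be sketched directly; the data structures and function names are illustrative stand-ins for the determination table TB:

```python
# The six pairings R1-R6 of the four imaging devices, as listed above.
PAIRS = {
    "R1": ("30a", "30b"), "R2": ("30a", "30c"), "R3": ("30a", "30d"),
    "R4": ("30b", "30c"), "R5": ("30b", "30d"), "R6": ("30c", "30d"),
}

def misaligned_cameras(normal_pairs):
    """Fill the determination table: write 1 for both cameras of every
    pair that produced a normal parallax image, then report the cameras
    that never received a 1 (i.e. never appeared in a normal pair)."""
    table = {cam: 0 for cam in ("30a", "30b", "30c", "30d")}
    for name in normal_pairs:
        for cam in PAIRS[name]:
            table[cam] = 1
    return [cam for cam, ok in sorted(table.items()) if ok == 0]

# If a shift occurred in 30d, only R1, R2 and R4 yield normal parallax
# images, and 30d is the camera that never receives a 1.
print(misaligned_cameras(["R1", "R2", "R4"]))  # ['30d']
```

A camera that appears in no normal pair is the one whose attitude parameter must be re-estimated, matching the determination logic of the embodiment.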
2 work implement
3 swing body
4 cab
5 traveling body
5a, 5b crawler belts
6 boom
7 arm
8 bucket
9 blade
10 boom cylinder
11 arm cylinder
12 bucket cylinder
13 boom pin
14 arm pin
15 bucket pin
20 processing device
21 processing unit
22 storage unit
23 input/output unit
30, 30a, 30b, 30c, 30d, 30L, 30R imaging device
31L, 31R imaging element
32L, 32R, 32Lr image
33, 33' parallax image
50 calibration system for imaging devices
100 hydraulic excavator
BL baseline
d parallax
f focal length
OCL, OCR, OCc, OCd optical center
P3 cutting edge
SR grayscale ratio
SRc threshold
TB determination table
α, β, γ, θ, φ angles
Claims (7)
- A calibration system for imaging devices, comprising: at least two imaging devices; and a processing device that changes a parameter defining the attitude of a second imaging device while keeping the distance between a first imaging device and the second imaging device among the at least two imaging devices constant, searches for corresponding portions between a pair of images obtained by the first imaging device and the second imaging device, and obtains the parameter based on a result of the search.
- The calibration system for imaging devices according to claim 1, wherein the parameter defines a rotation of the second imaging device.
- The calibration system for imaging devices according to claim 1 or 2, wherein the parameter includes a first parameter for rotating the second imaging device about the first imaging device and a second parameter for rotating the second imaging device about the center of the second imaging device.
- The calibration system for imaging devices according to any one of claims 1 to 3, wherein the processing device determines the first imaging device and the second imaging device for which the parameter needs to be obtained, based on a result of searching for corresponding portions between a pair of images obtained by a pair of the imaging devices among the at least two imaging devices.
- The calibration system for imaging devices according to claim 4, wherein, when there are a plurality of pairs of the imaging devices, the processing device obtains the parameter for a pair of the imaging devices whose search success rate is less than a threshold.
- A work machine comprising: the calibration system for imaging devices according to any one of claims 1 to 5; and a plurality of the imaging devices.
- A calibration method for imaging devices, comprising: determining whether to obtain a parameter defining the attitude of one of a pair of imaging devices, based on a result of searching for corresponding portions between a pair of images obtained by the pair of imaging devices among a plurality of imaging devices; when the parameter is to be obtained, changing the parameter defining the attitude of a second imaging device while keeping the distance between a first imaging device and the second imaging device, which constitute the pair, constant, and searching for corresponding portions between a pair of images obtained by the first imaging device and the second imaging device; and obtaining, based on a result of the search, an attitude parameter defining the attitude of the imaging device.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/908,392 US20170094154A1 (en) | 2015-09-30 | 2015-09-30 | Correction system of image pickup apparatus, work machine, and correction method of image pickup apparatus |
JP2015551893A JPWO2016047808A1 (ja) | 2015-09-30 | 2015-09-30 | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 |
CN201580001082.4A CN105518221A (zh) | 2015-09-30 | 2015-09-30 | 拍摄装置的校正系统、作业机械和拍摄装置的校正方法 |
PCT/JP2015/077873 WO2016047808A1 (ja) | 2015-09-30 | 2015-09-30 | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 |
DE112015000108.5T DE112015000108T5 (de) | 2015-09-30 | 2015-09-30 | Korrektursystem einer Bildaufnahmevorrichtung, Arbeitsmaschine und Korrekturverfahren einer Bildaufnahmevorrichtung |
KR1020167001504A KR20170048231A (ko) | 2015-09-30 | 2015-09-30 | 촬상 장치의 교정 시스템, 작업 기계 및 촬상 장치의 교정 방법 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/077873 WO2016047808A1 (ja) | 2015-09-30 | 2015-09-30 | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016047808A1 true WO2016047808A1 (ja) | 2016-03-31 |
Family
ID=55581323
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/077873 WO2016047808A1 (ja) | 2015-09-30 | 2015-09-30 | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20170094154A1 (ja) |
JP (1) | JPWO2016047808A1 (ja) |
KR (1) | KR20170048231A (ja) |
CN (1) | CN105518221A (ja) |
DE (1) | DE112015000108T5 (ja) |
WO (1) | WO2016047808A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018128397A (ja) * | 2017-02-09 | 2018-08-16 | 株式会社小松製作所 | 位置計測システム、作業機械、及び位置計測方法 |
CN112334733A (zh) * | 2018-06-29 | 2021-02-05 | 株式会社小松制作所 | 拍摄装置的校正装置、监视装置、作业机械及校正方法 |
WO2023063219A1 (ja) * | 2021-10-15 | 2023-04-20 | 住友重機械工業株式会社 | 作業機械の周辺監視システム、情報処理装置、及び周辺監視方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6684682B2 (ja) * | 2016-08-18 | 2020-04-22 | 株式会社神戸製鋼所 | 建設機械 |
JP6840645B2 (ja) * | 2017-09-08 | 2021-03-10 | 株式会社小松製作所 | 施工管理装置および施工管理方法 |
JP7114907B2 (ja) * | 2018-01-19 | 2022-08-09 | コベルコ建機株式会社 | 先端アタッチメント判別装置 |
WO2019144289A1 (en) * | 2018-01-23 | 2019-08-01 | SZ DJI Technology Co., Ltd. | Systems and methods for calibrating an optical system of a movable object |
WO2020110248A1 (ja) * | 2018-11-29 | 2020-06-04 | 本田技研工業株式会社 | 作業機、作業機の制御方法及びプログラム |
US11447931B2 (en) | 2019-05-15 | 2022-09-20 | Caterpillar Inc. | Ground engaging tool monitoring system |
US11466984B2 (en) | 2019-05-15 | 2022-10-11 | Caterpillar Inc. | Bucket get monitoring system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010217001A (ja) * | 2009-03-17 | 2010-09-30 | Tokyo Electric Power Co Inc:The | 飛翔体位置測定装置 |
JP2012177676A (ja) * | 2011-01-31 | 2012-09-13 | Sony Corp | 画像処理装置および方法、並びにプログラム |
JP2012233353A (ja) * | 2011-05-02 | 2012-11-29 | Komatsu Ltd | 油圧ショベルの較正システム及び油圧ショベルの較正方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100826243B1 (ko) * | 2002-11-20 | 2008-04-29 | 삼성전자주식회사 | 화각 조정이 가능한 촬영장치 및 그의 제어방법 |
JP5714232B2 (ja) * | 2009-03-12 | 2015-05-07 | オムロン株式会社 | キャリブレーション装置および3次元計測のためのパラメータの精度の確認支援方法 |
JP2011253376A (ja) * | 2010-06-02 | 2011-12-15 | Sony Corp | 画像処理装置、および画像処理方法、並びにプログラム |
JP5588812B2 (ja) * | 2010-09-30 | 2014-09-10 | 日立オートモティブシステムズ株式会社 | 画像処理装置及びそれを用いた撮像装置 |
JP5625976B2 (ja) * | 2011-02-09 | 2014-11-19 | ソニー株式会社 | 撮像装置、および撮像装置制御方法、並びにプログラム |
JP2013048334A (ja) * | 2011-08-29 | 2013-03-07 | Sony Corp | 画像処理装置および方法、画像処理システム、並びにプログラム |
CN104581136B (zh) * | 2013-10-14 | 2017-05-31 | 钰立微电子股份有限公司 | 图像校准系统和立体摄像机的校准方法 |
TWI520098B (zh) * | 2014-01-28 | 2016-02-01 | 聚晶半導體股份有限公司 | 影像擷取裝置及其影像形變偵測方法 |
- 2015
- 2015-09-30 CN CN201580001082.4A patent/CN105518221A/zh active Pending
- 2015-09-30 JP JP2015551893A patent/JPWO2016047808A1/ja active Pending
- 2015-09-30 KR KR1020167001504A patent/KR20170048231A/ko active Search and Examination
- 2015-09-30 WO PCT/JP2015/077873 patent/WO2016047808A1/ja active Application Filing
- 2015-09-30 US US14/908,392 patent/US20170094154A1/en not_active Abandoned
- 2015-09-30 DE DE112015000108.5T patent/DE112015000108T5/de not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010217001A (ja) * | 2009-03-17 | 2010-09-30 | Tokyo Electric Power Co Inc:The | 飛翔体位置測定装置 |
JP2012177676A (ja) * | 2011-01-31 | 2012-09-13 | Sony Corp | 画像処理装置および方法、並びにプログラム |
JP2012233353A (ja) * | 2011-05-02 | 2012-11-29 | Komatsu Ltd | 油圧ショベルの較正システム及び油圧ショベルの較正方法 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018128397A (ja) * | 2017-02-09 | 2018-08-16 | 株式会社小松製作所 | 位置計測システム、作業機械、及び位置計測方法 |
WO2018147340A1 (ja) * | 2017-02-09 | 2018-08-16 | 株式会社小松製作所 | 位置計測システム、作業機械、及び位置計測方法 |
US11120577B2 (en) | 2017-02-09 | 2021-09-14 | Komatsu Ltd. | Position measurement system, work machine, and position measurement method |
CN112334733A (zh) * | 2018-06-29 | 2021-02-05 | 株式会社小松制作所 | 拍摄装置的校正装置、监视装置、作业机械及校正方法 |
CN112334733B (zh) * | 2018-06-29 | 2022-09-27 | 株式会社小松制作所 | 拍摄装置的校正装置、监视装置、作业机械及校正方法 |
WO2023063219A1 (ja) * | 2021-10-15 | 2023-04-20 | 住友重機械工業株式会社 | 作業機械の周辺監視システム、情報処理装置、及び周辺監視方法 |
Also Published As
Publication number | Publication date |
---|---|
DE112015000108T5 (de) | 2016-06-16 |
KR20170048231A (ko) | 2017-05-08 |
US20170094154A1 (en) | 2017-03-30 |
CN105518221A (zh) | 2016-04-20 |
JPWO2016047808A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016047808A1 (ja) | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 | |
JP6050525B2 (ja) | 位置計測システム、作業機械及び位置計測方法 | |
WO2016047807A1 (ja) | 校正システム、作業機械及び校正方法 | |
US11441294B2 (en) | Measurement system, work machine, and measurement method | |
CN108700402B (zh) | 位置测量系统、作业机械及位置测量方法 | |
WO2018043301A1 (ja) | 作業機械の画像表示システム | |
CN106029994B (zh) | 校正系统、作业机械和校正方法 | |
US10527413B2 (en) | Outside recognition device | |
KR20170107076A (ko) | 작업 기계의 화상 표시 시스템, 작업 기계의 원격 조작 시스템 및 작업 기계 | |
WO2018062523A1 (ja) | 作業機械の検出処理装置及び作業機械の検出処理方法 | |
KR102231510B1 (ko) | 작업기의 외형 형상 측정 시스템, 작업기의 외형 형상 표시 시스템, 작업기의 제어 시스템 및 작업 기계 | |
JP7138538B2 (ja) | レーザスキャナのキャリブレーション方法、運搬機械 | |
CN113874862A (zh) | 工程机械 | |
JP7128497B2 (ja) | 作業機械の画像表示システム | |
JP2010146357A (ja) | 3次元画像処理方法および3次元画像処理装置 | |
JP7333551B2 (ja) | 作業機械の画像表示システム | |
JP6923144B2 (ja) | 作業機械の画像表示システム | |
KR20180076458A (ko) | 우주 환경에서 스테레오 카메라를 이용한 마커기반의 거리 추정 알고리즘 | |
JP2023027554A (ja) | ロボットビジョンシステム | |
JP5629874B2 (ja) | 三次元座標計測装置及び三次元座標計測方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015551893 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20167001504 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14908392 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015000108 Country of ref document: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15844776 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15844776 Country of ref document: EP Kind code of ref document: A1 |