WO2016047807A1 - Calibration system, work machine, and calibration method - Google Patents
- Publication number
- WO2016047807A1 (PCT/JP2015/077872)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detection unit
- information
- coordinate system
- position detection
- work machine
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/22—Hydraulic or pneumatic drives
- E02F9/2203—Arrangements for controlling the attitude of actuators, e.g. speed, floating function
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
- E02F9/262—Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
- E02F9/265—Sensors and their calibration for indicating the position of the work tool with follow-up actions (e.g. control signals sent to actuate the work tool)
-
- G—PHYSICS
- G12—INSTRUMENT DETAILS
- G12B—CONSTRUCTIONAL DETAILS OF INSTRUMENTS, OR COMPARABLE DETAILS OF OTHER APPARATUS, NOT OTHERWISE PROVIDED FOR
- G12B13/00—Calibrating of instruments and apparatus
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2200/00—Type of vehicle
- B60Y2200/40—Special vehicles
- B60Y2200/41—Construction vehicles, e.g. graders, excavators
- B60Y2200/412—Excavators
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/30—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom
- E02F3/32—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom working downwardly and towards the machine, e.g. with backhoes
Definitions
- the present invention relates to a calibration system, a work machine, and a calibration method for calibrating a position detection unit that is provided in a work machine and detects a target position.
- A work machine including an imaging device has been proposed (for example, Patent Document 1).
- Patent Document 1 describes a technique for calibrating a work machine using an imaging device. However, Patent Document 1 neither describes nor suggests converting the target position detected by the target position detecting means provided in the work machine into a coordinate system other than that of the detecting means.
- An object of the present invention is to obtain conversion information for converting the position information of a target, detected by means provided in a work machine for detecting the position of the target, into a coordinate system other than that of the detecting means.
- To achieve this object, the present invention provides a calibration system including: a first position detection unit that is provided in a work machine having a work implement and that detects and outputs a target position; and a processing unit that obtains and outputs conversion information used for converting a position detected by the first position detection unit from the coordinate system of the first position detection unit into a coordinate system different from that coordinate system. The processing unit uses first position information, which is information on a predetermined position of the work machine detected by the first position detection unit, and second position information, which is information on the predetermined position detected by a second position detection unit in the posture the work machine had when the first position detection unit detected the predetermined position.
- Preferably, the first position information is a plurality of pieces of information obtained by the first position detection unit detecting the predetermined position with the work machine in different postures, and the second position information is a plurality of pieces of information obtained by the second position detection unit detecting the predetermined position with the work machine in those postures.
- Preferably, the first position detection unit is a stereo camera including at least a pair of imaging devices, and the second position detection unit is a sensor that is provided in the work machine and detects an operation amount of an actuator that operates the work implement.
- the predetermined position is preferably a plurality of positions of the work machine in a direction in which a pair of the imaging devices constituting the stereo camera are arranged.
- The present invention is also a work machine including a work implement and the above calibration system.
- The present invention is also a calibration method in which the predetermined position of the work machine is detected by a first method and by a second method while the work machine takes different postures, and conversion information, used to convert a position detected by the first method from the coordinate system of the first method into a different coordinate system, is obtained using first position information, which is information on the predetermined position detected by the first method, and second position information, which is information on the predetermined position detected by the second method in the posture the work machine had when the predetermined position was detected by the first method.
- Preferably, a plurality of pieces of information obtained by the first position detection unit detecting the predetermined position with the work machine in different postures is the first position information, a plurality of pieces of information obtained by the second position detection unit detecting the predetermined position with the work machine in different postures is the second position information, and the first position detection unit and the second position detection unit each detect the predetermined position while the work machine takes the different postures.
- Preferably, the predetermined position is measured three-dimensionally by a stereo method, and the predetermined position is a plurality of positions of the work machine in a direction in which a pair of imaging devices used for the three-dimensional measurement by the stereo method are arranged.
- According to the present invention, conversion information for converting the position information of a target, detected by means provided in a work machine for detecting the position of the target, into a coordinate system other than that of the detecting means can be obtained.
- FIG. 1 is a perspective view of a hydraulic excavator provided with a calibration system according to an embodiment.
- FIG. 2 is a perspective view of the vicinity of the driver's seat of the hydraulic excavator according to the embodiment.
- FIG. 3 is a diagram illustrating dimensions of a working machine included in the hydraulic excavator according to the embodiment and a coordinate system of the hydraulic excavator.
- FIG. 4 is a diagram illustrating an example of an image obtained by imaging a target by a plurality of imaging devices.
- FIG. 5 is a diagram illustrating an example of an object captured by a plurality of imaging devices.
- FIG. 6 is a diagram illustrating a calibration system according to the embodiment.
- FIG. 7 is a diagram for explaining a calibration method according to the embodiment.
- FIG. 8 is a flowchart illustrating a processing example when the processing apparatus according to the embodiment executes the calibration method according to the embodiment.
- FIG. 9 is a diagram illustrating an object to be imaged by the imaging device 30 when the processing device according to the embodiment executes the calibration method according to the embodiment.
- FIG. 10 is a diagram illustrating an object to be imaged by the imaging apparatus when the processing apparatus according to the embodiment executes the calibration method according to the embodiment.
- FIG. 11 is a diagram illustrating a posture of an object captured by the imaging device when the processing device according to the embodiment executes the calibration method according to the embodiment.
- FIG. 12 is a diagram illustrating a posture of an object captured by the imaging apparatus when the processing apparatus according to the embodiment executes the calibration method according to the embodiment.
- FIG. 13 is a diagram illustrating a posture of an object captured by the imaging device when the processing device according to the embodiment executes the calibration method according to the embodiment.
- FIG. 1 is a perspective view of a hydraulic excavator 100 including a calibration system according to an embodiment.
- FIG. 2 is a perspective view of the vicinity of the driver's seat of the excavator 100 according to the embodiment.
- FIG. 3 is a diagram illustrating dimensions of the working machine 2 included in the hydraulic excavator according to the embodiment and a coordinate system of the hydraulic excavator 100.
- a hydraulic excavator 100 that is a work machine has a vehicle body 1 and a work machine 2.
- the vehicle body 1 includes a revolving body 3, a cab 4, and a traveling body 5.
- the turning body 3 is attached to the traveling body 5 so as to be turnable.
- the swivel body 3 houses devices such as a hydraulic pump and an engine (not shown).
- the cab 4 is disposed in the front part of the revolving structure 3.
- An operation device 25 shown in FIG. 2 is arranged in the cab 4.
- the traveling body 5 has crawler belts 5a and 5b, and the excavator 100 travels as the crawler belts 5a and 5b rotate.
- the work machine 2 is attached to the front portion of the vehicle body 1 and has a boom 6, an arm 7, a bucket 8 as a work tool, a boom cylinder 10, an arm cylinder 11, and a bucket cylinder 12.
- The front side of the vehicle body 1 is the direction from the backrest 4SS of the driver's seat 4S shown in FIG. 2 toward the operation device 25.
- the rear side of the vehicle body 1 is the direction side from the operation device 25 toward the backrest 4SS of the driver's seat 4S.
- the front portion of the vehicle body 1 is a portion on the front side of the vehicle body 1 and is a portion on the opposite side of the counterweight WT of the vehicle body 1.
- the operating device 25 is a device for operating the work implement 2 and the swing body 3 and includes a right lever 25R and a left lever 25L.
- a monitor panel 26 is provided in the cab 4 in front of the driver's seat 4S.
- the base end portion of the boom 6 is rotatably attached to the front portion of the vehicle body 1 via a boom pin 13.
- the boom pin 13 corresponds to the rotation center of the boom 6 with respect to the swing body 3.
- a base end portion of the arm 7 is rotatably attached to a tip end portion of the boom 6 via an arm pin 14.
- the arm pin 14 corresponds to the rotation center of the arm 7 with respect to the boom 6.
- a bucket 8 is rotatably attached to the tip of the arm 7 via a bucket pin 15.
- the bucket pin 15 corresponds to the rotation center of the bucket 8 with respect to the arm 7.
- the length of the boom 6, that is, the length between the boom pin 13 and the arm pin 14 is L1.
- the length of the arm 7, that is, the length between the arm pin 14 and the bucket pin 15 is L2.
- the length of the bucket 8, that is, the length between the bucket pin 15 and the blade tip P3 that is the tip of the blade 9 of the bucket 8 is L3.
- The boom cylinder 10, the arm cylinder 11, and the bucket cylinder 12 shown in FIG. 1 are actuators that are provided in the vehicle body 1 of the excavator 100 and operate the work implement 2.
- the base end portion of the boom cylinder 10 is rotatably attached to the swing body 3 via a boom cylinder foot pin 10a.
- the tip of the boom cylinder 10 is rotatably attached to the boom 6 via a boom cylinder top pin 10b.
- the boom cylinder 10 drives the boom 6 by expanding and contracting by hydraulic pressure.
- the base end of the arm cylinder 11 is rotatably attached to the boom 6 via an arm cylinder foot pin 11a.
- the tip of the arm cylinder 11 is rotatably attached to the arm 7 via an arm cylinder top pin 11b.
- the arm cylinder 11 drives the arm 7 by expanding and contracting by hydraulic pressure.
- the base end portion of the bucket cylinder 12 is rotatably attached to the arm 7 via a bucket cylinder foot pin 12a.
- the tip of the bucket cylinder 12 is rotatably attached to one end of the first link member 47 and one end of the second link member 48 via the bucket cylinder top pin 12b.
- the other end of the first link member 47 is rotatably attached to the distal end portion of the arm 7 via the first link pin 47a.
- the other end of the second link member 48 is rotatably attached to the bucket 8 via a second link pin 48a.
- the bucket cylinder 12 drives the bucket 8 by expanding and contracting by hydraulic pressure.
- the boom 6, the arm 7 and the bucket 8 are provided with a first angle detector 18A, a second angle detector 18B and a third angle detector 18C, respectively.
- The first angle detector 18A, the second angle detector 18B, and the third angle detector 18C are, for example, stroke sensors. By detecting the stroke lengths of the boom cylinder 10, the arm cylinder 11, and the bucket cylinder 12, they indirectly detect the rotation angle of the boom 6 with respect to the vehicle body 1, the rotation angle of the arm 7 with respect to the boom 6, and the rotation angle of the bucket 8 with respect to the arm 7.
- the first angle detection unit 18A detects the operation amount of the boom cylinder 10, that is, the stroke length.
- The processing device 20, described later, calculates the rotation angle θ1 of the boom 6 with respect to the Zm axis of the coordinate system (Xm, Ym, Zm) of the excavator 100 shown in FIG. 3, based on the stroke length of the boom cylinder 10 detected by the first angle detector 18A.
- In the following, the coordinate system of the excavator 100 is appropriately referred to as the vehicle body coordinate system. As shown in FIG. 3, the origin of the vehicle body coordinate system is the center of the boom pin 13.
- the center of the boom pin 13 is the center of the cross section when the boom pin 13 is cut on a plane orthogonal to the direction in which the boom pin 13 extends, and the center in the direction in which the boom pin 13 extends.
- The vehicle body coordinate system is not limited to the example of the embodiment. For example, the turning center of the revolving structure 3 may be taken as the Zm axis, the axis parallel to the direction in which the boom pin 13 extends as the Ym axis, and the axis orthogonal to both as the Xm axis.
- the second angle detector 18B detects the amount of movement of the arm cylinder 11, that is, the stroke length.
- The processing device 20 calculates the rotation angle θ2 of the arm 7 with respect to the boom 6 from the stroke length of the arm cylinder 11 detected by the second angle detection unit 18B.
- the third angle detector 18C detects the amount of movement of the bucket cylinder 12, that is, the stroke length.
- The processing device 20 calculates the rotation angle θ3 of the bucket 8 with respect to the arm 7 from the stroke length of the bucket cylinder 12 detected by the third angle detection unit 18C.
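The embodiment does not spell out how a stroke length is turned into a rotation angle. One common realization, sketched below under assumed geometry, treats the cylinder foot pin, the cylinder top pin, and the joint pin as a triangle whose fixed pin-to-pin offsets are known from the machine's dimensions, and applies the law of cosines; all parameter names here are hypothetical, not taken from the patent.

```python
import math

def joint_angle_from_stroke(stroke: float, retracted_len: float,
                            offset_a: float, offset_b: float) -> float:
    """Rotation angle (rad) at a joint pin from a cylinder stroke reading.

    The cylinder foot pin, cylinder top pin, and joint pin form a triangle:
    the side between foot pin and top pin equals the cylinder length
    (retracted length + measured stroke), while offset_a and offset_b are
    the fixed pin-to-pin distances known from the machine's dimensions.
    The law of cosines then gives the enclosed angle at the joint pin.
    """
    cyl_len = retracted_len + stroke
    cos_angle = (offset_a ** 2 + offset_b ** 2 - cyl_len ** 2) / (2.0 * offset_a * offset_b)
    # clamp against rounding slightly outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, cos_angle)))
```

The angles θ1, θ2, θ3 of the embodiment would then be fixed offsets away from such triangle angles, depending on where the pins sit on the boom, arm, and bucket linkage.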
- the excavator 100 includes, for example, a plurality of imaging devices 30 a, 30 b, 30 c, and 30 d in the cab 4.
- In the following, when the plurality of imaging devices 30a, 30b, 30c, and 30d are not distinguished, they are appropriately referred to as the imaging device 30.
- Although the kind of the imaging device 30 is not limited, in the embodiment, for example, an imaging device including a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used.
- a plurality (four) of the imaging devices 30a, 30b, 30c, and 30d are attached to the excavator 100. More specifically, as illustrated in FIG. 2, the imaging device 30 a and the imaging device 30 b are disposed in the cab 4, for example, facing the same direction at a predetermined interval. The imaging device 30c and the imaging device 30d are arranged in the cab 4 with a predetermined interval and facing the same direction. The imaging device 30b and the imaging device 30d may be arranged slightly toward the work machine 2, that is, slightly toward the imaging device 30a and the imaging device 30c. A plurality of imaging devices 30a, 30b, 30c, and 30d are combined to form a stereo camera. In the embodiment, a stereo camera is configured by a combination of the imaging devices 30a and 30b and a combination of the imaging devices 30c and 30d.
- In the embodiment, the excavator 100 includes four imaging devices 30, but the number of imaging devices 30 may be at least two and is not limited to four. This is because the excavator 100 forms a stereo camera with at least a pair of imaging devices 30 to capture a subject in stereo.
- the plurality of imaging devices 30 a, 30 b, 30 c, and 30 d are arranged in front of and above the cab 4.
- the upward direction is a direction perpendicular to the ground contact surfaces of the crawler belts 5a, 5b of the excavator 100 and away from the ground contact surfaces.
- the ground contact surfaces of the crawler belts 5a and 5b are planes defined by at least three points that do not exist on the same straight line at a portion where at least one of the crawler belts 5a and 5b is grounded.
- the plurality of imaging devices 30 a, 30 b, 30 c, and 30 d shoots a subject existing in front of the vehicle body 1 of the excavator 100 in stereo.
- the target is, for example, a target excavated by the work machine 2.
- the processing device 20 illustrated in FIGS. 1 and 2 measures a target three-dimensionally using a result of stereo shooting performed by at least a pair of imaging devices 30. In other words, the processing device 20 performs stereo image processing on the same target image captured by at least the pair of imaging devices 30 to measure the above-described target three-dimensionally.
- the place where the plurality of imaging devices 30 a, 30 b, 30 c, and 30 d are arranged is not limited to the front and upper side in the cab 4.
- FIG. 4 is a diagram illustrating an example of an image obtained by imaging a target by a plurality of imaging devices 30a, 30b, 30c, and 30d.
- FIG. 5 is a diagram illustrating an example of the target OJ imaged by the plurality of imaging devices 30a, 30b, 30c, and 30d.
- the images PIa, PIb, PIc, and PId shown in FIG. 4 are obtained by, for example, imaging the target OJ by the plurality of imaging devices 30a, 30b, 30c, and 30d shown in FIG.
- the target OJ has a first part OJa, a second part OJb, and a third part OJc.
- The image PIa is captured by the imaging device 30a, the image PIb by the imaging device 30b, the image PIc by the imaging device 30c, and the image PId by the imaging device 30d. Since the pair of imaging devices 30a and 30b are arranged facing the upper side of the hydraulic excavator 100, the upper side of the target OJ appears in the images PIa and PIb. Since the pair of imaging devices 30c and 30d are arranged facing the lower side of the excavator 100, the lower side of the target OJ appears in the images PIc and PId.
- The images PIa and PIb captured by the pair of imaging devices 30a and 30b and the images PIc and PId captured by the pair of imaging devices 30c and 30d overlap in a part of the region of the target OJ, namely the second part OJb. That is, the imaging region of the pair of imaging devices 30a and 30b facing upward and the imaging region of the pair of imaging devices 30c and 30d facing downward have an overlapping portion.
- When performing stereo image processing on the images PIa, PIb, PIc, and PId of the same target OJ captured by the plurality of imaging devices 30a, 30b, 30c, and 30d, the processing device 20 obtains a first parallax image from the images PIa and PIb captured by the pair of imaging devices 30a and 30b.
- the processing device 20 obtains a second parallax image from the images PIc and PId captured by the pair of imaging devices 30c and 30d. Thereafter, the processing device 20 combines the first parallax image and the second parallax image to obtain one parallax image.
- the processing device 20 measures the target three-dimensionally using the obtained parallax image. As described above, the processing device 20 and the plurality of imaging devices 30a, 30b, 30c, and 30d measure the entire predetermined area of the target OJ in a three-dimensional manner with a single imaging.
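The three-dimensional measurement from the combined parallax image can be illustrated with the standard back-projection for an ideal rectified stereo pair. The focal length, baseline, and principal point below are placeholder values for illustration, not parameters given in the patent.

```python
def disparity_to_point(u: float, v: float, d: float,
                       f: float, baseline: float,
                       cu: float, cv: float) -> tuple:
    """Back-project a pixel (u, v) with disparity d into camera coordinates
    (Xs, Ys, Zs) for an ideal rectified stereo pair.

    f        -- focal length in pixels
    baseline -- distance between the paired imaging devices
    cu, cv   -- principal point (optical center) in pixels
    """
    zs = f * baseline / d          # depth along the optical axis
    xs = (u - cu) * zs / f         # lateral offset from the optical center
    ys = (v - cv) * zs / f
    return (xs, ys, zs)
```

Applying this to every pixel of the parallax image yields the kind of per-point position information Ps (xs, ys, zs) in the imaging device coordinate system that the later sections of the document convert into the vehicle body coordinate system.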
- In the embodiment, the imaging device 30c is used as the reference among the imaging devices 30a, 30b, 30c, and 30d.
- the coordinate system (Xs, Ys, Zs) of the imaging device 30c is appropriately referred to as an imaging device coordinate system.
- the origin of the imaging device coordinate system is the center of the imaging device 30c.
- the origin of each coordinate system of the imaging device 30a, the imaging device 30b, and the imaging device 30d is the center of each imaging device.
- FIG. 6 is a diagram illustrating a calibration system 50 according to the embodiment.
- the calibration system 50 includes a plurality of imaging devices 30 a, 30 b, 30 c, 30 d and the processing device 20. These are provided in the vehicle body 1 of the excavator 100 as shown in FIGS. 1 and 2.
- the plurality of imaging devices 30 a, 30 b, 30 c, and 30 d are attached to a hydraulic excavator 100 that is a work machine, images a target, and outputs an image of the target obtained by the imaging to the processing device 20.
- the processing device 20 includes a processing unit 21, a storage unit 22, and an input / output unit 23.
- the processing unit 21 is realized by a processor such as a CPU (Central Processing Unit) and a memory, for example.
- the processing device 20 implements the calibration method according to the embodiment.
- the processing unit 21 reads and executes the computer program stored in the storage unit 22. This computer program is for causing the processing unit 21 to execute the calibration method according to the embodiment.
- By executing the calibration method according to the embodiment, the processing device 20 performs stereo image processing on at least a pair of images captured by the pair of imaging devices 30 and obtains the target position, specifically the coordinates of the target in a three-dimensional coordinate system.
- the processing device 20 can measure the target three-dimensionally using a pair of images obtained by capturing the same target with at least the pair of imaging devices 30. That is, at least a pair of the imaging device 30 and the processing device 20 measures a target three-dimensionally by a stereo method.
- At least a pair of the imaging devices 30 together with the processing device 20 correspond to a first position detection unit that is provided in the excavator 100 and detects and outputs a target position.
- When the imaging device 30 itself has a function of performing three-dimensional measurement of a target by executing stereo image processing, at least a pair of the imaging devices 30 corresponds to the first position detection unit.
- the first position detection unit detects and outputs the target position by the first method.
- The first method is a method of three-dimensionally measuring, by a stereo method, a predetermined position of a target, for example, the hydraulic excavator 100 that is the work machine of the embodiment. A method in which the predetermined position of the hydraulic excavator 100 is measured by a laser length measuring device may also be used; the first method is not limited to stereo three-dimensional measurement. In the embodiment, the predetermined position of the excavator 100 used in the first method is a predetermined position of the work implement 2, but any predetermined position of an element constituting the excavator 100 may be used; it is not limited to a predetermined position of the work implement 2.
- The storage unit 22 is at least one of a nonvolatile or volatile semiconductor memory, such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, and a magneto-optical disk.
- the storage unit 22 stores a computer program for causing the processing unit 21 to execute the calibration method according to the embodiment.
- the storage unit 22 stores information used when the processing unit 21 executes the calibration method according to the embodiment.
- Examples of such information include the internal calibration data of each imaging device 30, the posture of each imaging device 30, the positional relationship between the imaging devices 30, and the known dimensions of the work implement 2.
- the input / output unit 23 is an interface circuit for connecting the processing device 20 and devices.
- the input / output unit 23 is connected to the hub 51, the input device 52, the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- the hub 51 is connected to a plurality of imaging devices 30a, 30b, 30c, and 30d.
- the imaging device 30 and the processing device 20 may be connected without using the hub 51.
- Results obtained by the imaging devices 30 a, 30 b, 30 c, and 30 d are input to the input / output unit 23 via the hub 51.
- the processing unit 21 acquires the results of imaging of the imaging devices 30a, 30b, 30c, and 30d via the hub 51 and the input / output unit 23.
- the input device 52 is used for inputting information necessary when the processing unit 21 executes the calibration method according to the embodiment.
- the input device 52 is exemplified by a switch and a touch panel, but is not limited thereto.
- The input device 52 is provided in the cab 4 shown in FIG. 2, more specifically in the vicinity of the driver's seat 4S.
- the input device 52 may be attached to at least one of the right lever 25R and the left lever 25L of the operation device 25, or may be provided on the monitor panel 26 in the cab 4.
- the input device 52 may be removable from the input / output unit 23, or information may be input to the input / output unit 23 by radio communication using radio waves or infrared rays.
- A predetermined position of the work implement 2 in the vehicle body coordinate system (Xm, Ym, Zm) is obtained from the dimensions of the work implement 2 and the rotation angles θ1, θ2, and θ3. Examples of the predetermined position obtained in this way are the position of the tip of the blade 9 of the bucket 8 included in the work implement 2, the position of the bucket pin 15, and the position of the first link pin 47a.
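The computation of such a predetermined position from the known dimensions L1, L2, L3 and the rotation angles θ1, θ2, θ3 can be sketched as planar forward kinematics in the plane of the work implement, with the boom pin 13 at the origin of the vehicle body coordinate system. θ1 is measured from the Zm axis as in the embodiment; the sign conventions for θ2 and θ3 below are illustrative assumptions.

```python
import math

def blade_tip_position(l1: float, l2: float, l3: float,
                       th1: float, th2: float, th3: float) -> tuple:
    """Blade tip P3 in the (Xm, Zm) plane of the vehicle body coordinate
    system, with the boom pin 13 at the origin.

    l1, l2, l3     -- boom, arm, and bucket lengths (L1, L2, L3)
    th1, th2, th3  -- rotation angles; th1 is measured from the Zm axis,
                      and the accumulation of th2/th3 is an assumption
    """
    a1 = th1                 # boom direction from the Zm axis
    a2 = th1 + th2           # arm direction
    a3 = th1 + th2 + th3     # bucket direction
    xm = l1 * math.sin(a1) + l2 * math.sin(a2) + l3 * math.sin(a3)
    zm = l1 * math.cos(a1) + l2 * math.cos(a2) + l3 * math.cos(a3)
    return (xm, zm)
```

Intermediate points such as the bucket pin 15 follow from the same chain truncated after l2.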
- The first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C correspond to a second position detection unit that detects a position of the excavator 100, which is the work machine of the embodiment, for example a position of the work implement 2.
- the second position detection unit detects the position of the target by the second method.
- The second method obtains a predetermined position of the excavator 100 from the dimensions and posture of the excavator 100, which is the work machine of the embodiment; however, any method different from the first method may be used, and the second method is not limited to that of the embodiment.
- the predetermined position of the excavator 100 used in the second method is the same as the predetermined position of the excavator 100 that is the measurement target of the first method.
- In the embodiment, the predetermined position of the excavator 100 used in the second method is a predetermined position of the work implement 2, but any predetermined position of an element constituting the excavator 100 may be used; it is not limited to a predetermined position of the work implement 2.
- FIG. 7 is a diagram for explaining a calibration method according to the embodiment.
- the target position information Ps (xs, ys, zs) is obtained by performing stereo image processing on the target image captured by at least the pair of imaging devices 30.
- The obtained position information Ps (xs, ys, zs) is converted from the imaging device coordinate system (Xs, Ys, Zs), which is the coordinate system of the first position detection unit, into position information Pm (xm, ym, zm) in a different coordinate system.
- the coordinate system different from the imaging device coordinate system (Xs, Ys, Zs) is the vehicle body coordinate system (Xm, Ym, Zm), but is not limited thereto.
- the position information Ps (xs, ys, zs) obtained from at least the pair of imaging devices 30 is three-dimensional information, and is represented by coordinates in the embodiment. Using the position information Ps (xs, ys, zs), the distance from the imaging device 30 to the target is obtained.
- the calibration method according to the embodiment obtains conversion information used when the position information Ps (xs, ys, zs) obtained from at least the pair of imaging devices 30 is converted from the imaging device coordinate system (Xs, Ys, Zs) into the position information Pm (xm, ym, zm) in the vehicle body coordinate system (Xm, Ym, Zm). That is, the conversion information is information used to convert positions detected by at least the pair of imaging devices 30, which serve as the first position detection unit, from the coordinate system of the first position detection unit into the coordinate system of the vehicle body 1.
- the position information Ps of the imaging apparatus coordinate system is converted into the position information Pm of the vehicle body coordinate system by the expression (1).
- R in equation (1) is a rotation matrix represented by equation (2)
- T in equation (1) is a translation vector represented by equation (3).
- α is a rotation angle around the Xs axis of the imaging device coordinate system
- β is a rotation angle around the Ys axis of the imaging device coordinate system
- γ is a rotation angle around the Zs axis of the imaging device coordinate system.
- the rotation matrix R and the translation vector T are conversion information.
- Pm = R·Ps + T … (1)
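As a sketch of how Expression (1) is applied, the conversion from the imaging device coordinate system to the vehicle body coordinate system can be written as follows. The rotation composition order (Rz·Ry·Rx) and the function names are illustrative assumptions; the patent defines R only through Expression (2).

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation matrix R from the angles alpha, beta, gamma around the
    Xs, Ys and Zs axes of the imaging device coordinate system.
    The composition order Rz @ Ry @ Rx is an assumption for this sketch."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def to_vehicle_body(ps, r, t):
    """Expression (1): Pm = R*Ps + T, converting position information Ps in the
    imaging device coordinate system into Pm in the vehicle body coordinate system."""
    return r @ ps + t
```

With all three angles zero, R is the identity matrix and the conversion reduces to a pure translation.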
- the processing unit 21 obtains the conversion information described above when executing the calibration method according to the embodiment. Specifically, the processing unit 21 obtains and outputs the conversion information using the first position information detected by at least the pair of imaging devices 30 and the second position information detected by the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- in the embodiment, the imaging devices 30 used are the imaging devices 30c and 30d; however, any pair may be used as long as it includes the reference imaging device 30c.
- the conversion information may also be obtained by additionally using the detection values of the IMU (Inertial Measurement Unit) 24 shown in FIGS. 1 and 2 mounted on the excavator 100.
- the first position information is information on a predetermined position of the work machine 2, for example, the position of the blade 9 of the bucket 8, detected by at least a pair of the imaging device 30 and the processing device 20 serving as the first position detection unit.
- the second position information is information on a predetermined position of the work implement 2 detected by the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- the second position information is information detected by the second position detection unit, for example the first angle detection unit 18A, with the work machine 2 in the posture it had when the first position detection unit detected the predetermined position.
- both the first position information and the second position information relate to the same position of the work machine 2 with the work machine 2 in the same posture; that is, they are obtained by different methods for the same position of the work machine 2 while the work machine 2 holds the same posture.
- the first position information and the second position information are pluralities of pieces of information obtained while the work implement 2 is moved through different postures, that is, information obtained in a plurality of states.
- the first position information and the second position information may be information that can specify a predetermined position of the work machine 2.
- the first position information and the second position information may be information on a predetermined position of the work machine 2 itself, or information on the position of a member that is attached to the work machine 2 and whose positional relationship with the work machine 2 is known. That is, the first position information and the second position information are not limited to information on a predetermined position of the work machine 2 itself.
- the processing device 20 may be realized by dedicated hardware, or a plurality of processing circuits may cooperate to realize the function of the processing device 20. Next, a processing example when the processing device 20 executes the calibration method according to the embodiment will be described.
- FIG. 8 is a flowchart illustrating a processing example when the processing device 20 according to the embodiment executes the calibration method according to the embodiment.
- 9 and 10 are diagrams illustrating objects to be imaged by the imaging device 30 when the processing device 20 according to the embodiment executes the calibration method according to the embodiment.
- FIG. 11 to FIG. 13 are diagrams illustrating postures of objects to be imaged by the imaging device 30 when the processing device 20 according to the embodiment executes the calibration method according to the embodiment.
- in the calibration method, the angles α, β, and γ included in the rotation matrix R and the translation vector components x0, y0, and z0, which are the unknowns, are obtained using the first position information on the work machine 2 obtained by at least the pair of imaging devices 30 and the second position information detected by the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- the processing unit 21 sets the count numbers N and M to 0 in step S101.
- in step S102, the processing unit 21 causes the pair of imaging devices 30c and 30d to image the target. The processing unit 21 also acquires the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- the target to be imaged by the pair of imaging devices 30c and 30d is a predetermined position of the work machine 2, and in the embodiment, the bucket 8 of the excavator 100, more specifically, the blade 9.
- the bucket 8 is provided with marks MKl, MKc, and MKr on the blade 9.
- the mark MKl is provided on the leftmost blade 9, the mark MKc is provided on the central blade 9, and the mark MKr is provided on the rightmost blade 9.
- when the marks MKl, MKc, and MKr are not distinguished, they are referred to simply as marks MK.
- in step S102, the processing unit 21 acquires the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C for the posture of the work machine 2 at the time the pair of imaging devices 30c and 30d image the bucket 8.
- that is, the processing unit 21 performs the imaging by the pair of imaging devices 30c and 30d and the acquisition of the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C with the work machine 2 in the same posture.
- the processing unit 21 causes the storage unit 22 to store the image obtained by the imaging device 30 and the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C.
- the marks MKl, MKc, and MKr are arranged in the width direction W of the bucket 8, that is, the direction parallel to the direction in which the bucket pin 15 extends.
- the width direction W of the bucket 8 is a direction in which the pair of imaging devices 30c and 30d are arranged.
- the central blade 9 in the width direction W of the bucket 8 moves only on one plane, the Xm-Zm plane of the vehicle body coordinate system. For this reason, if only the position of the central blade 9 were obtained, the constraint condition would be weak, and the accuracy in the Ym-axis direction of the vehicle body coordinate system would fall in position measurement by the stereo method using the pair of imaging devices 30c and 30d.
- in the embodiment, positions at a plurality of locations in the width direction W of the bucket 8, specifically the positions of three blades 9, are measured and used as the first position information. Position information on a plurality of planes with respect to the width direction W of the bucket 8 can therefore be used when obtaining the rotation matrix R and the translation vector T as the conversion information, and the loss of accuracy is suppressed.
- since the rotation matrix R and the translation vector T obtained by the calibration method according to the embodiment are used for position measurement by the stereo method using the pair of imaging devices 30c and 30d, a reduction in measurement accuracy in the Ym-axis direction of the vehicle body coordinate system is suppressed.
- the bucket 8 is provided with the marks MKl, MKc, and MKr on the three blades 9, but the number of marks MK, that is, the number of blades 9 to be measured is not limited to three.
- the mark MK may be provided on at least one blade 9.
- two or more marks MK are arranged in the width direction W of the bucket 8. In order to obtain high measurement accuracy, it is preferable that two or more blades 9 are measured at positions apart from each other.
- FIG. 10 shows an example in which a measurement target 60 attached to the work machine 2 is used instead of the position of the blade 9.
- at least a pair of the imaging device 30 and the processing unit 21 measure the position of the measurement target 60 attached to the work machine 2 and use it as the first position information in the calibration method according to the embodiment.
- the measurement target 60 includes target members 63a and 63b provided with marks MKa and MKb, a shaft member 62 connecting the two target members 63a and 63b, and a fixing member 61 attached to one end of the shaft member 62.
- the target members 63a and 63b are arranged side by side in the direction in which the shaft member 62 extends.
- the fixing member 61 has a magnet.
- the fixing member 61 is attracted to the work implement 2, for example magnetically, thereby attaching the target members 63a and 63b and the shaft member 62 to the work implement 2.
- the fixing member 61 can be attached to the work machine 2 and can be detached from the work machine 2.
- the fixing member 61 is attracted to the bucket pin 15, and the target members 63a and 63b and the shaft member 62 are fixed to the work machine 2.
- the target members 63a and 63b are arranged side by side in the width direction W of the bucket 8.
- the positions of the marks MKa and MKb of the measurement target 60 are obtained in advance from the dimensions of the measurement target 60.
- the positional relationship between the portion of the work machine 2 to which the fixing member 61 of the measurement target 60 is attached and the blade 9 is obtained in advance from the dimensions of the bucket 8. Therefore, if the positions of the marks MKa and MKb of the measurement target 60 are known, the position of the blade 9 of the bucket 8 is also known.
- the positional relationship between the marks MKa and MKb included in the measurement target 60 and the blade 9 of the bucket 8 is stored in the storage unit 22 of the processing device 20.
- the processing unit 21 reads the positional relationship between the marks MKa and MKb and the blade 9 of the bucket 8 from the storage unit 22 and uses it when generating the first position information or the second position information.
- when the imaging by the pair of imaging devices 30c and 30d in step S102 and the measurement of the predetermined position using the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C are completed, the processing proceeds to step S103.
- in step S103, the processing unit 21 operates the work machine 2 to move the bucket 8 away from the ground, that is, upward.
- in step S104, the processing unit 21 sets a value obtained by adding 1 to the count number N as the new count number N.
- in step S105, when the current count number M is equal to or less than Mc − 1, the processing unit 21 compares the current count number N with the count number threshold Nc1; otherwise, the processing unit 21 compares the current count number N with the count number threshold Nc2.
- the count number threshold Nc1 is 2.
- the count number threshold Nc2 is smaller than the count number threshold Nc1, for example, 1.
- in step S105, when the count number N does not equal the count number threshold Nc1 (No in step S105), the processing unit 21 repeats the processing from step S102 to step S105.
- when the count number N equals the count number threshold Nc1 (Yes in step S105), the process proceeds to step S106.
- in step S106, the processing unit 21 operates the work machine 2 to move the bucket 8 in the depth direction, that is, in the direction away from the swing body 3.
- in step S107, the processing unit 21 sets a value obtained by adding 1 to the count number M as the new count number M.
- in step S108, the processing unit 21 compares the current count number M with the count number threshold Mc.
- the count number threshold Mc is 2.
- in step S108, when the count number M does not equal the count number threshold Mc (No in step S108), the processing unit 21 sets the count number N to 0 in step S109 and then executes the processing from step S102 to step S105 again.
- in this way, the pair of imaging devices 30c and 30d images the bucket 8 Nc + 1 times in the vertical direction of the excavator 100 under the condition that the horizontal distance L between the imaging devices 30 and the bucket 8 is constant. That is, the pair of imaging devices 30c and 30d performs imaging Nc + 1 times while changing the vertical position of the bucket 8.
- the horizontal distance L is the distance to the bucket 8 in the direction parallel to the ground contact surface of the excavator 100, that is, the ground contact surfaces of the crawler belts 5a and 5b shown in FIG. 1, and perpendicular to the direction in which the boom pin 13 extends.
- by repeating steps S106 to S108, the plurality of imaging devices 30 change the horizontal distance L, which is the distance between the swing body 3 and the bucket 8 parallel to the ground contact surface of the excavator 100, Mc + 1 times. That is, the pair of imaging devices 30c and 30d performs imaging while changing the horizontal distance L of the bucket 8 Mc + 1 times.
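The imaging schedule of steps S101 to S109 can be summarized as a nested loop. This is a simplified sketch, not a literal transcription of the flowchart, and the callback names `capture`, `raise_bucket`, and `extend_bucket` are assumptions standing in for the actual machine commands.

```python
def run_capture_sequence(capture, raise_bucket, extend_bucket, nc1=2, nc2=1, mc=2):
    """Image the bucket at several heights for each of Mc + 1 horizontal
    distances, using fewer heights (Nc2) at the farthest distance."""
    samples = []
    for m in range(mc + 1):             # Mc + 1 horizontal distances (L1, L2, L3)
        nc = nc1 if m < mc else nc2     # the Nc1/Nc2 switch of step S105
        for _ in range(nc + 1):         # Nc + 1 heights per distance
            samples.append(capture())   # imaging + angle detection (step S102)
            raise_bucket()              # move the bucket upward (step S103)
        extend_bucket()                 # increase the horizontal distance L (step S106)
    return samples
```

With the embodiment's values Nc1 = 2, Nc2 = 1, and Mc = 2, this yields 3 + 3 + 2 = 8 imaging positions.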
- at the horizontal distance L1, the bucket 8 is imaged at three locations, so information on the positions of the marks MKl, MKc, and MKr is obtained at three different heights.
- the positions A, B, and C are higher in the direction indicated by the arrow h in FIG.
- the horizontal distance L2 is larger than the horizontal distance L1.
- the horizontal distance L2 being larger than the horizontal distance L1 indicates that the bucket 8 is at a position farther from the imaging device 30c and the imaging device 30d.
- the positions D, E, and F are higher in the direction indicated by the arrow h in FIG.
- the horizontal distance L3 is larger than the horizontal distance L2. That the horizontal distance L3 is larger than the horizontal distance L2 indicates that the bucket 8 is located farther from the imaging device 30c and the imaging device 30d.
- the positions G and H are higher in the direction indicated by the arrow h in FIG.
- at the horizontal distance L3, the pair of imaging devices 30c and 30d images the bucket 8 at two locations in the vertical direction, but the number of imaging locations in the vertical direction is not limited to two. When the bucket 8 is imaged while being moved in the vertical direction with the horizontal distance L held constant, the imaging locations in the vertical direction are not limited to those of the embodiment.
- the bucket 8 is imaged by the pair of imaging devices 30c and 30d for a total of 8 times, 3 times at the horizontal distance L1, 3 times at the horizontal distance L2, and 2 times at the horizontal distance L3.
- the constraint condition becomes stronger, and measurement accuracy improves, when the portions to be measured (in the embodiment, the marks MKl, MKc, and MKr) are near the edges of the images captured by the pair of imaging devices 30c and 30d.
- the processing unit 21 changes the height and causes the bucket 8, more specifically, the marks MKl, MKc, and MKr to be imaged by the pair of imaging devices 30c and 30d at the same horizontal distance L.
- the marks MKl, MKc, and MKr are arranged at both ends of the images picked up by the plurality of image pickup devices 30, specifically, both ends in the vertical direction, so that the measurement accuracy is improved.
- the horizontal distance L is changed in three stages, and the number of times of imaging in the height direction is three or two, but is not limited to this.
- the number of times the horizontal distance L is changed is changed by changing the count number threshold Mc.
- the number of times of imaging in the height direction is changed by changing at least one of the count number threshold Nc1 and the count number threshold Nc2.
- the processing unit 21 changes the horizontal distance L of the bucket 8 and causes the pair of imaging devices 30 to image the bucket 8, more specifically the marks MKl, MKc, and MKr. In this way, the accuracy of three-dimensional measurement improves over a wide range.
- in step S110, the processing unit 21 obtains the first position information and the second position information. Specifically, the processing unit 21 acquires from the storage unit 22 the plurality of pairs of images (eight pairs in the embodiment) obtained by the pair of imaging devices 30c and 30d imaging the bucket 8 a plurality of times (eight times in the embodiment). The processing unit 21 then applies stereo image processing to each acquired pair of images and measures the positions of the marks MKl, MKc, and MKr three-dimensionally.
- the processing unit 21 extracts the marks MKl, MKc, and MKr by image processing.
- the processing unit 21 can extract these from the image based on the shape characteristics of the marks MKl, MKc, and MKr.
- the marks MKl, MKc, and MKr may be selected by the operator operating the input device 52 shown in FIG.
- the processing unit 21 obtains by triangulation the positions of the marks MKl, MKc, and MKr present in the pair of images obtained from the pair of imaging devices 30c and 30d. The information on the positions of the marks MKl, MKc, and MKr obtained in this way is the first position information.
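A minimal example of the triangulation step, assuming an ideal rectified stereo pair: the baseline, focal length (in pixels), and pixel coordinates are illustrative, since the patent does not specify the camera model of the imaging devices 30c and 30d.

```python
import numpy as np

def triangulate(xl, yl, xr, baseline, focal):
    """Position of a mark MK in the imaging device coordinate system from its
    pixel coordinates (xl, yl) in the reference image (30c) and xr in the
    paired image (30d), for ideal rectified cameras."""
    disparity = xl - xr                 # horizontal shift between the two images
    z = focal * baseline / disparity    # depth from disparity
    x = xl * z / focal                  # back-project the reference pixel
    y = yl * z / focal
    return np.array([x, y, z])
```

The farther the mark, the smaller the disparity; this is why imaging at several horizontal distances L exercises a wide depth range.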
- the processing unit 21 obtains the first position information for each of the imaging results at the eight locations obtained in steps S101 to S109, and outputs it to, for example, the storage unit 22 for temporary storage.
- in step S110, the processing unit 21 also acquires the detection values of the first angle detection unit 18A, the second angle detection unit 18B, and the third angle detection unit 18C, together with the dimensions of the work implement 2.
- the detection values of the first angle detection unit 18A and the like are the values detected when the work machine 2 is in the posture it had at the time the bucket 8 was imaged by the pair of imaging devices 30c and 30d.
- the processing unit 21 obtains the position of the blade 9 of the bucket 8, more specifically the positions of the marks MKl, MKc, and MKr, from the acquired detection values and the dimensions of the work machine 2.
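This computation can be sketched as planar forward kinematics in the Xm-Zm plane of the vehicle body coordinate system. The link lengths and the angle convention (angles accumulated along boom, arm, and bucket) are assumptions for illustration; the patent only states that the position follows from the detection values and the dimensions of the work machine 2.

```python
import numpy as np

def blade_position(theta1, theta2, theta3, len_boom, len_arm, len_bucket):
    """Position of the central blade 9 from the rotation angles theta1-theta3
    and (assumed) link lengths, in the vehicle body coordinate system.
    The central blade stays on the Xm-Zm plane, so ym is zero."""
    a1 = theta1
    a2 = theta1 + theta2
    a3 = theta1 + theta2 + theta3
    xm = len_boom * np.cos(a1) + len_arm * np.cos(a2) + len_bucket * np.cos(a3)
    zm = len_boom * np.sin(a1) + len_arm * np.sin(a2) + len_bucket * np.sin(a3)
    return np.array([xm, 0.0, zm])
```

The zero Ym component is exactly why measuring only the central blade weakens the constraint, as noted above for the stereo measurement.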
- the processing unit 21 obtains the second position information for each of the imaging results at the eight locations obtained in steps S101 to S109, and outputs it to, for example, the storage unit 22 for temporary storage.
- three pieces of second position information are obtained for each imaging location. Since the bucket 8 is imaged at eight locations as described above, a total of 24 pieces of second position information are obtained.
- the processing unit 21 associates the first position information and the second position information obtained with the same posture of the work machine 2 and temporarily stores them in the storage unit 22. In the embodiment, there are a total of 24 combinations of the first position information and the second position information.
- in step S111, the processing unit 21 obtains the rotation matrix R and the translation vector T using the first position information and the second position information. More specifically, the processing unit 21 uses the first position information and the second position information to find the angles α, β, and γ included in the rotation matrix R and the components x0, y0, and z0 of the translation vector T. All 24 combinations of the first position information and the second position information are used in obtaining the angles α, β, and γ and the components x0, y0, and z0, but combinations with large errors may be excluded; doing so suppresses a loss of accuracy in the angles α, β, and γ and the components x0, y0, and z0.
- the first position information is expressed as (xs, ys, zs) because it is a coordinate in the imaging device coordinate system, and the second position information is expressed as (xm, ym, zm) because it is a coordinate in the vehicle body coordinate system.
- the processing unit 21 reads from the storage unit 22 the first position information and the second position information obtained with the work implement 2 in the same posture, gives the second position information to the position information Pm of Expression (4), and gives the first position information to the position information Ps of Expression (4). Each combination then yields three expressions containing some of the angles α, β, and γ included in the rotation matrix R and the components x0, y0, and z0 of the translation vector T. Since there are 24 combinations of the first position information and the second position information in the embodiment, the processing unit 21 gives all 24 combinations to Expression (4) and obtains a total of 72 Js, each containing some of the angles α, β, and γ included in the rotation matrix R and the components x0, y0, and z0 of the translation vector T.
- the sum JS of all 72 Js is obtained by Expression (5).
- the processing unit 21 obtains the sum JS from Expression (5).
- the processing unit 21 minimizes JS. To do so, the processing unit 21 sets to zero the partial derivative of Σ{Pmi − (R·Psi + T)}² with respect to each of the angle α, the angle β, the angle γ, the component x0, the component y0, and the component z0.
- the processing unit 21 obtains the angles α, β, and γ and the components x0, y0, and z0 of the translation vector T by solving the six equations thus obtained, for example by the Newton-Raphson method.
- the processing unit 21 then obtains the rotation matrix R and the translation vector T from the obtained angles α, β, and γ and the components x0, y0, and z0 of the translation vector T.
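The patent minimizes JS by solving the six normal equations with the Newton-Raphson method. As a sketch, the same least-squares problem also admits a closed-form solution (the Kabsch/SVD method), shown here as an alternative; the function name and the use of NumPy are assumptions, not the patent's implementation.

```python
import numpy as np

def fit_transform(ps_points, pm_points):
    """Estimate R and T minimising JS = sum ||Pmi - (R @ Psi + T)||^2 over
    paired first (Ps, imaging device) and second (Pm, vehicle body) position
    information, via the closed-form Kabsch/SVD solution."""
    ps = np.asarray(ps_points, float)
    pm = np.asarray(pm_points, float)
    ps_c = ps - ps.mean(axis=0)             # centre both point sets
    pm_c = pm - pm.mean(axis=0)
    u, _, vt = np.linalg.svd(ps_c.T @ pm_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = pm.mean(axis=0) - r @ ps.mean(axis=0)
    return r, t
```

At least three non-collinear point pairs are needed; the 24 combinations collected in step S110 comfortably exceed this.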
- the rotation matrix R and the translation vector T obtained in this way are conversion information for converting the target position information detected by the first position detection unit into a coordinate system other than that of the first position detection unit, in the embodiment the vehicle body coordinate system.
- the processing unit 21 may instead obtain conversion information for converting the target position detected by the second position detection unit into a coordinate system different from that of the second position detection unit, for example the coordinate system of the first position detection unit.
- the position of the object in the coordinate system of the second position detection unit detected by the second position detection unit can be converted into the coordinate system of the first position detection unit by Expression (6).
- the coordinate system of the second position detector is a vehicle body coordinate system
- the coordinate system of the first position detector is an imaging device coordinate system.
- R⁻¹ in Expression (6) is the inverse of the rotation matrix represented by Expression (2) above
- T in Expression (6) is the translation vector represented by Expression (3) above.
- the position information Pm is the position of the target in the vehicle body coordinate system
- the position information Ps is the position of the target in the imaging device coordinate system.
- in this case, the inverse matrix R⁻¹ and the product R⁻¹·T are the conversion information.
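Expression (6) above can be sketched directly; the function name is an assumption. For a pure rotation matrix the inverse equals the transpose, but the general inverse is used here to stay close to the notation R⁻¹ of the text.

```python
import numpy as np

def to_imaging_device(pm, r, t):
    """Expression (6): Ps = R^-1 * Pm - R^-1 * T, converting a position in the
    vehicle body coordinate system back into the imaging device coordinate system."""
    r_inv = np.linalg.inv(r)
    return r_inv @ pm - r_inv @ t
```

Applying this to a point converted by Expression (1) recovers the original coordinates, confirming that the two expressions are inverses of each other.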
- the processing by the processing unit 21 and the calibration method according to the embodiment can thus also obtain and output conversion information used to convert a position detected by the second position detection unit from the coordinate system of the second position detection unit into a coordinate system different from the coordinate system of the second position detection unit.
- the second position detector is the first angle detector 18A, the second angle detector 18B, and the third angle detector 18C, but is not limited thereto.
- suppose the excavator 100 is equipped with antennas for RTK-GNSS (Real Time Kinematic - Global Navigation Satellite Systems) and has a position detection system that detects the position of the vehicle by measuring the positions of the antennas by GNSS. In this case, this position detection system is the second position detection unit, and the position of a GNSS antenna is the predetermined position of the work machine.
- in this case, the first position information and the second position information are obtained by having the first position detection unit and the second position detection unit detect the position of the GNSS antenna while the position of the GNSS antenna is changed.
- the processing unit 21 uses the obtained first position information and second position information to find conversion information for converting the target position information detected by the first position detection unit into a coordinate system other than that of the first position detection unit, in the embodiment the vehicle body coordinate system. The processing unit 21 can also use the obtained first position information and second position information to find conversion information for converting the target position information detected by the second position detection unit into a coordinate system other than that of the second position detection unit.
- by attaching a removable GNSS receiver to a predetermined position of the excavator 100, for example a predetermined position of the traveling body 5 or the work machine 2, and using the GNSS receiver as the second position detection unit, conversion information can be obtained in the same manner as when the position detection system for detecting the vehicle position described above serves as the second position detection unit.
- the calibration system 50 and the calibration method according to the embodiment obtain a predetermined position of the work implement 2 using the first position detection unit and a second position detection unit different from the first position detection unit, both of which detect the position of the target with the work machine 2 of the excavator 100 in the same posture.
- the calibration system 50 and the calibration method according to the embodiment obtain the rotation matrix R and the translation vector T using the first position information obtained by the first position detection unit and the second position information obtained by the second position detection unit.
- as a result, the calibration system 50 and the calibration method according to the embodiment can obtain conversion information for converting the target position information detected by the first position detection unit into a coordinate system other than that of the first position detection unit.
- the excavator 100 obtains the position information of a target in the imaging device coordinate system by applying stereo image processing to the target images captured by at least a pair of the imaging devices 30. Once the conversion information has been obtained by the calibration system 50 and the calibration method according to the embodiment, the position information of the target in the imaging device coordinate system can be converted into position information in the vehicle body coordinate system, so the excavator 100 can control the work machine 2 using the converted position information or display a guidance screen for the work machine 2 on the monitor.
- since the calibration system 50 and the calibration method according to the embodiment use the processing device 20 and the pair of imaging devices 30c and 30d provided in the excavator 100, no external equipment is needed to obtain the rotation matrix R and the translation vector T. For this reason, the rotation matrix R and the translation vector T can be obtained, for example, at the site where the user operates the excavator 100. The calibration system 50 and the calibration method according to the embodiment thus have the advantage that the rotation matrix R and the translation vector T can be obtained even in a place where no external equipment for obtaining them is available.
- in the calibration system 50 and the calibration method according to the embodiment, the first position information and the second position information are information on predetermined positions detected with the work equipment 2 in different postures, so the amount of information available for obtaining the rotation matrix R and the translation vector T as the conversion information can be increased.
- the calibration system 50 and the calibration method according to the embodiment can accurately obtain the rotation matrix R and the translation vector T.
- the first position detection unit is a stereo camera including at least a pair of imaging devices 30, but is not limited thereto.
- the first position detection unit may be, for example, a laser scanner or a 3D scanner.
- the work machine is not limited to the hydraulic excavator 100 as long as it includes at least a pair of imaging devices and uses the pair of imaging devices to three-dimensionally measure an object in a stereo manner.
- the work machine only needs to have a work implement, and may be, for example, a wheel loader or a bulldozer.
- in the embodiment, the marks MKl, MKc, and MKr are provided on the blade 9, but they are not strictly necessary.
- the portion whose position the processing unit 21 obtains, for example the blade 9 of the bucket 8, may instead be designated in the target image captured by the imaging device 30 using the input device 52 shown in FIG. In this case, the processing unit 21 performs three-dimensional measurement on the designated portion.
Abstract
Description
FIG. 1 is a perspective view of a hydraulic excavator 100 provided with the calibration system according to the embodiment. FIG. 2 is a perspective view of the vicinity of the operator's seat of the hydraulic excavator 100 according to the embodiment. FIG. 3 is a diagram showing the dimensions of the work equipment 2 of the hydraulic excavator according to the embodiment and the coordinate system of the hydraulic excavator 100.
As shown in FIG. 2, the hydraulic excavator 100 has, for example, a plurality of imaging devices 30a, 30b, 30c, 30d in the cab 4. Hereinafter, when the imaging devices 30a, 30b, 30c, 30d are not distinguished from one another, they are referred to as the imaging device 30 as appropriate. The type of the imaging device 30 is not limited; in the embodiment, for example, an imaging device provided with a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used.
FIG. 6 is a diagram showing the calibration system 50 according to the embodiment. The calibration system 50 includes the plurality of imaging devices 30a, 30b, 30c, 30d and the processing device 20. As shown in FIGS. 1 and 2, these are provided on the vehicle body 1 of the hydraulic excavator 100. The plurality of imaging devices 30a, 30b, 30c, 30d are attached to the hydraulic excavator 100, which is a work machine, capture images of an object, and output the captured images of the object to the processing device 20.
Pm = R·Ps + T ... (1)
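The coordinate conversion of equation (1) maps a position Ps detected in the coordinate system of the first position detection unit (the stereo camera) to a position Pm in the machine coordinate system using the rotation matrix R and the translation vector T. A minimal sketch, with an assumed example rotation and translation (the function name is an illustration, not part of the disclosure):

```python
import numpy as np

def stereo_to_machine(ps, R, T):
    """Apply equation (1): Pm = R·Ps + T.

    ps: 3D point in the stereo-camera coordinate system
    R:  3x3 rotation matrix, T: 3-element translation vector
    """
    return R @ np.asarray(ps, dtype=float) + np.asarray(T, dtype=float)

# Assumed example: a 90-degree rotation about the z axis plus a translation.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([1.0, 0.0, 0.5])

pm = stereo_to_machine([1.0, 0.0, 0.0], R, T)
# → [1.0, 1.0, 0.5]
```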
FIG. 8 is a flowchart showing an example of processing when the processing device 20 according to the embodiment executes the calibration method according to the embodiment. FIGS. 9 and 10 are diagrams showing objects imaged by the imaging device 30 when the processing device 20 according to the embodiment executes the calibration method according to the embodiment. FIGS. 11 to 13 are diagrams showing postures of the object imaged by the imaging device 30 when the processing device 20 according to the embodiment executes the calibration method according to the embodiment.
J = {Pmi - (R·Psi + T)}² ... (4)
JS = ΣJi = Σ{Pmi - (R·Psi + T)}², (i: 1 to 72) ... (5)
Ps = R⁻¹·Pm - R⁻¹·T ... (6)
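Minimizing the sum of squared residuals JS in equation (5) over corresponding point pairs (Psi, Pmi) yields R and T. One well-known way to solve this least-squares problem in closed form is the SVD-based (Kabsch) method; the patent does not specify this particular solver, so the following is an illustrative sketch under that assumption, checked against a synthetic noiseless data set:

```python
import numpy as np

def estimate_rotation_translation(ps_pts, pm_pts):
    """Estimate R and T minimizing Σ|Pmi - (R·Psi + T)|², as in equation (5),
    using the SVD-based closed-form (Kabsch) solution."""
    ps_pts = np.asarray(ps_pts, dtype=float)
    pm_pts = np.asarray(pm_pts, dtype=float)
    cs, cm = ps_pts.mean(axis=0), pm_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (ps_pts - cs).T @ (pm_pts - cm)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = cm - R @ cs  # translation follows from the centroids
    return R, T

# Synthetic check: recover an assumed known transform from noiseless points.
rng = np.random.default_rng(0)
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
T_true = np.array([2.0, -1.0, 0.5])
ps = rng.normal(size=(10, 3))          # points in the stereo coordinate system
pm = ps @ R_true.T + T_true            # same points in the machine system
R_est, T_est = estimate_rotation_translation(ps, pm)
```

With R and T in hand, equation (6) gives the inverse mapping Ps = R⁻¹·Pm - R⁻¹·T, i.e. converting a machine-system position back into the stereo-camera coordinate system.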
2 work equipment
3 swing body
4 cab
5 travel body
6 boom
7 arm
8 bucket
9 blade
10 boom cylinder
11 arm cylinder
12 bucket cylinder
13 boom pin
14 arm pin
15 bucket pin
18A first angle detection unit
18B second angle detection unit
18C third angle detection unit
20 processing device
21 processing unit
22 storage unit
23 input/output unit
25 operation device
26 monitor panel
30a, 30b, 30c, 30d imaging device
50 calibration system
52 input device
60 measurement target
100 hydraulic excavator
P3 cutting edge
R rotation matrix
T translation vector
W width direction
x0, y0, z0 components
α, β, γ angles
Claims (8)
- A calibration system comprising:
a first position detection unit that is provided in a work machine having work equipment and detects and outputs a position of an object; and
a processing unit that, using first position information, which is information on a predetermined position of the work machine detected by the first position detection unit, and second position information, which is information on the predetermined position detected by a second position detection unit with the work machine in the posture it had when the first position detection unit detected the predetermined position, obtains and outputs conversion information used to convert a position detected by the first position detection unit from a coordinate system of the first position detection unit to a coordinate system different from the coordinate system of the first position detection unit, or obtains and outputs conversion information used to convert a position detected by the second position detection unit from a coordinate system of the second position detection unit to a coordinate system different from the coordinate system of the second position detection unit. - The calibration system according to claim 1, wherein the first position information is a plurality of pieces of information obtained by the first position detection unit detecting the predetermined position from the work machine in different postures, and
the second position information is a plurality of pieces of information obtained by the second position detection unit detecting the predetermined position from the work machine in different postures. - The calibration system according to claim 1 or 2, wherein the first position detection unit is a stereo camera composed of at least a pair of imaging devices, and
the second position detection unit is a sensor that is provided in the work machine and detects an operation amount of an actuator that operates the work equipment. - The calibration system according to claim 3, wherein the predetermined position is a plurality of positions of the work machine in a direction in which the pair of imaging devices constituting the stereo camera are arranged.
- A work machine comprising:
work equipment; and
the calibration system according to any one of claims 1 to 4. - A calibration method comprising:
detecting a predetermined position of a work machine by a first method and a second method, with the work machine in different postures; and
using first position information, which is information on the predetermined position detected by the first method, and second position information, which is information on the predetermined position detected by the second method with the work equipment in the posture it had when the predetermined position was detected by the first method, obtaining conversion information used to convert a position detected by the first method from a coordinate system of the first method to a coordinate system different from the coordinate system of the first position detection unit, or obtaining conversion information used to convert a position detected by the second position detection unit from a coordinate system of the second position detection unit to a coordinate system different from the coordinate system of the second position detection unit.
A calibration method. - The calibration method according to claim 6, wherein the first position information is a plurality of pieces of information obtained by the first position detection unit detecting the predetermined position from the work machine in different postures, the second position information is a plurality of pieces of information obtained by the second position detection unit detecting the predetermined position from the work machine in different postures, and
when detecting the predetermined position, the first position detection unit and the second position detection unit detect the predetermined position from the work machine in different postures. - The calibration method according to claim 6 or 7, wherein the first method three-dimensionally measures the predetermined position by a stereo method, and
the predetermined position is a plurality of positions of the work machine in a direction in which a pair of imaging devices used for the three-dimensional measurement by the stereo method are arranged.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112015000132.8T DE112015000132T5 (de) | 2015-09-30 | 2015-09-30 | Kalibrierungssystem, Arbeitsmaschine und Kalibrierungsverfahren |
PCT/JP2015/077872 WO2016047807A1 (ja) | 2015-09-30 | 2015-09-30 | 校正システム、作業機械及び校正方法 |
KR1020167004863A KR20170039612A (ko) | 2015-09-30 | 2015-09-30 | 교정 시스템, 작업 기계 및 교정 방법 |
JP2015551654A JPWO2016047807A1 (ja) | 2015-09-30 | 2015-09-30 | 校正システム、作業機械及び校正方法 |
CN201580001358.9A CN106817905A (zh) | 2015-09-30 | 2015-09-30 | 校正系统、作业机械和校正方法 |
US14/915,743 US9790666B2 (en) | 2015-09-30 | 2015-09-30 | Calibration system, work machine, and calibration method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/077872 WO2016047807A1 (ja) | 2015-09-30 | 2015-09-30 | 校正システム、作業機械及び校正方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016047807A1 true WO2016047807A1 (ja) | 2016-03-31 |
Family
ID=55581322
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/077872 WO2016047807A1 (ja) | 2015-09-30 | 2015-09-30 | 校正システム、作業機械及び校正方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9790666B2 (ja) |
JP (1) | JPWO2016047807A1 (ja) |
KR (1) | KR20170039612A (ja) |
CN (1) | CN106817905A (ja) |
DE (1) | DE112015000132T5 (ja) |
WO (1) | WO2016047807A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180103868A (ko) * | 2017-02-09 | 2018-09-19 | 가부시키가이샤 고마쓰 세이사쿠쇼 | 위치 계측 시스템, 작업 기계, 및 위치 계측 방법 |
JP2018152022A (ja) * | 2017-03-15 | 2018-09-27 | セイコーエプソン株式会社 | プロジェクターシステム |
JP2018184815A (ja) * | 2017-04-27 | 2018-11-22 | 株式会社小松製作所 | 撮像装置の校正装置、作業機械および校正方法 |
CN111195897A (zh) * | 2018-11-20 | 2020-05-26 | 财团法人工业技术研究院 | 用于机械手臂系统的校正方法及装置 |
CN112334733A (zh) * | 2018-06-29 | 2021-02-05 | 株式会社小松制作所 | 拍摄装置的校正装置、监视装置、作业机械及校正方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10132060B2 (en) * | 2017-02-27 | 2018-11-20 | Caterpillar Inc. | Implement orientation by image processing |
JP6714534B2 (ja) * | 2017-03-29 | 2020-06-24 | 日立建機株式会社 | 建設機械 |
US10781575B2 (en) | 2018-10-31 | 2020-09-22 | Deere & Company | Attachment calibration control system |
CN109816778B (zh) | 2019-01-25 | 2023-04-07 | 北京百度网讯科技有限公司 | 物料堆三维重建方法、装置、电子设备和计算机可读介质 |
CN109903337B (zh) * | 2019-02-28 | 2022-06-14 | 北京百度网讯科技有限公司 | 用于确定挖掘机的铲斗的位姿的方法和装置 |
EP4097552A4 (en) * | 2020-01-28 | 2023-11-22 | Topcon Positioning Systems, Inc. | SYSTEM AND METHOD FOR CONTROLLING A WORK DEVICE ON A WORK MACHINE USING MACHINE VISION |
GB2593759B (en) * | 2020-04-02 | 2023-04-26 | Caterpillar Inc | Method and control unit for generating a control command to at least one actuator of an electro-hydraulic machine |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001055762A (ja) * | 1999-08-13 | 2001-02-27 | Hitachi Constr Mach Co Ltd | 自動運転建設機械およびその位置計測手段の校正方法 |
JP2012233353A (ja) * | 2011-05-02 | 2012-11-29 | Komatsu Ltd | 油圧ショベルの較正システム及び油圧ショベルの較正方法 |
JP2014181092A (ja) * | 2013-03-18 | 2014-09-29 | Tadano Ltd | ブームの先端位置検出装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8718880B2 (en) * | 2008-03-10 | 2014-05-06 | Deere & Company | Hydraulic system calibration method and apparatus |
JP2010014535A (ja) | 2008-07-03 | 2010-01-21 | Yanmar Co Ltd | 回転検出装置及び不整地用走行車両 |
JP5390813B2 (ja) * | 2008-09-02 | 2014-01-15 | 東急建設株式会社 | 空間情報表示装置及び支援装置 |
JP5052463B2 (ja) * | 2008-09-10 | 2012-10-17 | 日立建機株式会社 | 作業機械のステレオ画像処理装置 |
JP5237409B2 (ja) * | 2011-03-24 | 2013-07-17 | 株式会社小松製作所 | 油圧ショベルの較正装置及び油圧ショベルの較正方法 |
JP2012225111A (ja) * | 2011-04-22 | 2012-11-15 | Kajima Corp | 建設車両周辺の作業者検出装置 |
US8543298B2 (en) * | 2011-06-03 | 2013-09-24 | Caterpillar Inc. | Operator interface with tactile feedback |
JP5783184B2 (ja) * | 2013-01-10 | 2015-09-24 | コベルコ建機株式会社 | 建設機械 |
-
2015
- 2015-09-30 WO PCT/JP2015/077872 patent/WO2016047807A1/ja active Application Filing
- 2015-09-30 CN CN201580001358.9A patent/CN106817905A/zh active Pending
- 2015-09-30 KR KR1020167004863A patent/KR20170039612A/ko active Search and Examination
- 2015-09-30 JP JP2015551654A patent/JPWO2016047807A1/ja active Pending
- 2015-09-30 DE DE112015000132.8T patent/DE112015000132T5/de not_active Withdrawn
- 2015-09-30 US US14/915,743 patent/US9790666B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001055762A (ja) * | 1999-08-13 | 2001-02-27 | Hitachi Constr Mach Co Ltd | 自動運転建設機械およびその位置計測手段の校正方法 |
JP2012233353A (ja) * | 2011-05-02 | 2012-11-29 | Komatsu Ltd | 油圧ショベルの較正システム及び油圧ショベルの較正方法 |
JP2014181092A (ja) * | 2013-03-18 | 2014-09-29 | Tadano Ltd | ブームの先端位置検出装置 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180103868A (ko) * | 2017-02-09 | 2018-09-19 | 가부시키가이샤 고마쓰 세이사쿠쇼 | 위치 계측 시스템, 작업 기계, 및 위치 계측 방법 |
KR102076631B1 (ko) | 2017-02-09 | 2020-02-12 | 가부시키가이샤 고마쓰 세이사쿠쇼 | 위치 계측 시스템, 작업 기계, 및 위치 계측 방법 |
JP2018152022A (ja) * | 2017-03-15 | 2018-09-27 | セイコーエプソン株式会社 | プロジェクターシステム |
JP2018184815A (ja) * | 2017-04-27 | 2018-11-22 | 株式会社小松製作所 | 撮像装置の校正装置、作業機械および校正方法 |
CN112334733A (zh) * | 2018-06-29 | 2021-02-05 | 株式会社小松制作所 | 拍摄装置的校正装置、监视装置、作业机械及校正方法 |
US11508091B2 (en) | 2018-06-29 | 2022-11-22 | Komatsu Ltd. | Calibration device for imaging device, monitoring device, work machine and calibration method |
CN111195897A (zh) * | 2018-11-20 | 2020-05-26 | 财团法人工业技术研究院 | 用于机械手臂系统的校正方法及装置 |
CN111195897B (zh) * | 2018-11-20 | 2021-09-14 | 财团法人工业技术研究院 | 用于机械手臂系统的校正方法及装置 |
US11524406B2 (en) | 2018-11-20 | 2022-12-13 | Industrial Technology Research Institute | Calibration method and device for robotic arm system |
Also Published As
Publication number | Publication date |
---|---|
US9790666B2 (en) | 2017-10-17 |
JPWO2016047807A1 (ja) | 2017-04-27 |
DE112015000132T5 (de) | 2016-11-10 |
US20170089041A1 (en) | 2017-03-30 |
CN106817905A (zh) | 2017-06-09 |
KR20170039612A (ko) | 2017-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016047807A1 (ja) | 校正システム、作業機械及び校正方法 | |
JP6229097B2 (ja) | 校正システム、作業機械及び校正方法 | |
JP6050525B2 (ja) | 位置計測システム、作業機械及び位置計測方法 | |
JP6995767B2 (ja) | 計測システム、作業機械及び計測方法 | |
WO2017061518A1 (ja) | 施工管理システム、施工管理方法、及び管理装置 | |
JP6322612B2 (ja) | 施工管理システム及び形状計測方法 | |
KR20170107076A (ko) | 작업 기계의 화상 표시 시스템, 작업 기계의 원격 조작 시스템 및 작업 기계 | |
JP6674846B2 (ja) | 形状計測システム、作業機械及び形状計測方法 | |
JP6867132B2 (ja) | 作業機械の検出処理装置及び作業機械の検出処理方法 | |
JP6585697B2 (ja) | 施工管理システム | |
WO2016047808A1 (ja) | 撮像装置の校正システム、作業機械及び撮像装置の校正方法 | |
KR102231510B1 (ko) | 작업기의 외형 형상 측정 시스템, 작업기의 외형 형상 표시 시스템, 작업기의 제어 시스템 및 작업 기계 | |
JP2019190193A (ja) | 作業機械 | |
JP2017193958A (ja) | 校正システム、作業機械及び校正方法 | |
JP6606230B2 (ja) | 形状計測システム | |
JP2019203291A (ja) | 油圧ショベル、およびシステム | |
JP6616149B2 (ja) | 施工方法、作業機械の制御システム及び作業機械 | |
US20230291989A1 (en) | Display control device and display method | |
KR20230002979A (ko) | 정보 취득 시스템 및 정보 취득 방법 | |
Borthwick | Mining haul truck pose estimation and load profiling using stereo vision | |
Borthwick et al. | Mining haul truck localization using stereo vision | |
JP2022145942A (ja) | 作業機械の画像表示システム | |
JP2021156000A (ja) | 作業機械 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015551654 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20167004863 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14915743 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015000132 Country of ref document: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15845287 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15845287 Country of ref document: EP Kind code of ref document: A1 |