WO2019049515A1 - Work machine measurement system, work machine, and work machine measurement method - Google Patents

Work machine measurement system, work machine, and work machine measurement method Download PDF

Info

Publication number
WO2019049515A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging
work machine
common part
imaging device
Prior art date
Application number
PCT/JP2018/026709
Other languages
French (fr)
Japanese (ja)
Inventor
Shun Kawamoto (川本 駿)
Yohei Seki (関 洋平)
Original Assignee
Komatsu Ltd. (株式会社小松製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Komatsu Ltd. (株式会社小松製作所)
Publication of WO2019049515A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/245 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B 21/00 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 - Details
    • G01C 3/06 - Use of electric means to obtain final indication
    • G01C 7/00 - Tracing profiles
    • G01C 7/02 - Tracing profiles of land surfaces
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G06T 7/593 - Depth or shape recovery from multiple images from stereo images

Definitions

  • the present invention relates to a measurement system for a work machine, a work machine, and a measurement method for a work machine.
  • the position of the imaging device may be acquired based on the captured images. For example, when the position of the imaging device is acquired based on feature points common to a plurality of images, it may be difficult to acquire the position of the imaging device accurately, depending on the extraction state of the feature points. As a result, the measurement accuracy of the three-dimensional shape of the object may be reduced.
  • An aspect of the present invention aims to suppress a decrease in measurement accuracy of a three-dimensional shape of an object.
  • according to an aspect of the present invention, a measurement system of a work machine is provided, comprising: an image acquisition unit that acquires a plurality of images of a construction target captured by an imaging device mounted on a revolving structure during operation of the work machine; a common part extraction unit that divides an extraction target area, set for extracting a common part in the plurality of images, into divided areas and extracts the common part from each of the divided areas; a position detection unit that detects the position of the work machine; a posture detection unit that detects the posture of the work machine; an imaging position calculation unit that calculates the position and posture of the imaging device based on position data of the work machine detected by the position detection unit, posture data detected by the posture detection unit, and the common part; and a three-dimensional position calculation unit that calculates a three-dimensional position of the construction target based on the position data, the posture data, and the images.
  • FIG. 1 is a perspective view showing an example of a working machine according to the present embodiment.
  • FIG. 2 is a perspective view showing a part of the work machine according to the present embodiment.
  • FIG. 3 is a functional block diagram showing an example of a measurement system according to the present embodiment.
  • FIG. 4 is a diagram schematically showing an example of operation of the measurement system according to the present embodiment.
  • FIG. 5 is a diagram schematically showing an example of operation of the measurement system according to the present embodiment.
  • FIG. 6 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
  • FIG. 7 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
  • FIG. 8 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
  • FIG. 9 schematically shows a state in which one extraction target area is set in the image.
  • FIG. 10 is a diagram showing an example of an extraction target area of an image according to the present embodiment.
  • FIG. 11 is a diagram in which a common portion is extracted in an image in which the contrast distribution of each divided region has been corrected by the histogram equalization algorithm.
  • FIG. 12 is a flowchart showing an example of the measurement method according to the present embodiment.
  • FIG. 13 is a timing chart of the measurement method according to the present embodiment.
  • FIG. 14 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
  • in the following description, a three-dimensional on-site coordinate system (Xg, Yg, Zg), a three-dimensional vehicle body coordinate system (Xm, Ym, Zm), and a three-dimensional imaging device coordinate system (Xs, Ys, Zs) are defined, and the positional relationship of each part is explained using them.
  • the on-site coordinate system is a coordinate system based on an origin fixed to the earth.
  • the on-site coordinate system is a coordinate system defined by the Global Navigation Satellite System (GNSS).
  • GNSS refers to a global navigation satellite system; in the present embodiment, GPS (Global Positioning System) is used.
  • the on-site coordinate system is defined by the Xg axis in the horizontal plane, the Yg axis in the horizontal plane orthogonal to the Xg axis, and the Zg axis orthogonal to the Xg axis and the Yg axis.
  • the rotation or inclination direction about the Xg axis is the ⁇ Xg direction
  • the rotation or inclination direction about the Yg axis is the ⁇ Yg direction
  • the rotation or inclination direction about the Zg axis is the ⁇ Zg direction.
  • the Zg axis direction is the vertical direction.
  • the vehicle body coordinate system is defined by an Xm axis in a first predetermined plane with respect to an origin defined on the vehicle body of the work machine, a Ym axis in the first predetermined plane orthogonal to the Xm axis, and a Zm axis orthogonal to the Xm axis and the Ym axis.
  • the rotation or inclination direction about the Xm axis is the ⁇ Xm direction
  • the rotation or inclination direction about the Ym axis is the ⁇ Ym direction
  • the rotation or inclination direction about the Zm axis is the ⁇ Zm direction.
  • the Xm-axis direction is the longitudinal direction of the working machine
  • the Ym-axis direction is the vehicle width direction of the working machine
  • the Zm-axis direction is the vertical direction of the working machine.
  • the imaging device coordinate system is defined by an Xs axis in a second predetermined plane with respect to an origin defined in the imaging device, a Ys axis in the second predetermined plane orthogonal to the Xs axis, and a Zs axis orthogonal to the Xs axis and the Ys axis.
  • the rotation or tilt direction about the Xs axis is the ⁇ Xs direction
  • the rotation or tilt direction about the Ys axis is the ⁇ Ys direction
  • the rotation or tilt direction about the Zs axis is the ⁇ Zs direction.
  • the Xs axis direction is the vertical direction of the imaging device
  • the Ys axis direction is the width direction of the imaging device
  • the Zs axis direction is the front and back direction of the imaging device.
  • the Zs axis direction is parallel to the optical axis of the optical system of the imaging device.
  • the position in the work site coordinate system, the position in the vehicle body coordinate system, and the position in the imaging device coordinate system can be mutually converted.
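  • the excerpt does not spell out the conversion formulas, but each conversion can be modeled as a rigid-body transform (a rotation plus a translation). The following Python sketch chains an imaging-device-to-vehicle-body transform and a vehicle-body-to-site transform; all numeric values and names are illustrative assumptions, not values from the patent.

      import numpy as np

      def rot_z(theta):
          """Rotation matrix about the Z axis by theta radians."""
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

      def to_parent(R, t, p):
          """Map point p from a child frame into its parent frame: R @ p + t."""
          return R @ p + t

      # Assumed mounting of the imaging device in the vehicle body frame.
      R_cam = rot_z(np.radians(5.0))
      t_cam = np.array([1.2, 0.4, 2.0])           # camera offset [m]

      # Assumed pose of the vehicle body in the site frame (from GNSS / IMU).
      R_body = rot_z(np.radians(30.0))            # azimuth of the swing body
      t_body = np.array([100.0, 200.0, 50.0])     # position of the swing body [m]

      p_cam = np.array([0.0, 0.0, 10.0])          # point 10 m along the optical axis
      p_body = to_parent(R_cam, t_cam, p_cam)     # imaging device -> vehicle body
      p_site = to_parent(R_body, t_body, p_body)  # vehicle body -> site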
  • FIG. 1 is a perspective view showing an example of a working machine 1 according to the present embodiment.
  • the work machine 1 is a hydraulic shovel
  • the work machine 1 is appropriately referred to as a hydraulic shovel 1.
  • the hydraulic shovel 1 has a vehicle body 1B and a work implement 2.
  • the vehicle body 1B includes a revolving unit 3 and a traveling unit 5 that supports the revolving unit 3 so as to be rotatable.
  • the swing body 3 has a cab 4.
  • a hydraulic pump and an internal combustion engine are arranged in the revolving unit 3.
  • the pivoting body 3 is pivotable about a pivot axis Zr.
  • the pivot axis Zr is parallel to the Zm axis of the vehicle coordinate system.
  • the origin of the vehicle body coordinate system is defined, for example, at the center of the swing circle of the swing body 3.
  • the center of the swing circle is located on the pivot axis Zr of the swing body 3.
  • the traveling body 5 has crawler belts 5A and 5B.
  • the hydraulic shovel 1 travels by rotation of the crawler belts 5A and 5B.
  • the Zm axis of the vehicle body coordinate system is orthogonal to the ground contact surface of the crawler belts 5A and 5B.
  • the upper side (+ Zm direction) of the vehicle body coordinate system is a direction away from the ground contact surface of the crawler belts 5A and 5B, and the lower side (-Zm direction) of the vehicle body coordinate system is a direction opposite to the upper side of the vehicle body coordinate system.
  • the work implement 2 is connected to the swing body 3.
  • in the vehicle body coordinate system, at least a part of the work implement 2 is disposed in front of the swing body 3.
  • the front (+Xm direction) of the vehicle body coordinate system is the direction in which the work implement 2 exists with reference to the revolving unit 3, and the rear (-Xm direction) of the vehicle body coordinate system is the direction opposite to the front.
  • the work implement 2 includes a boom 6 coupled to the revolving unit 3, an arm 7 coupled to the boom 6, a bucket 8 coupled to the arm 7, a boom cylinder 10 for driving the boom 6, an arm cylinder 11 for driving the arm 7, and a bucket cylinder 12 for driving the bucket 8.
  • the boom cylinder 10, the arm cylinder 11, and the bucket cylinder 12 are hydraulic cylinders driven by hydraulic pressure.
  • the hydraulic shovel 1 has a position detection device 23 that detects the position of the swing body 3, a posture detection device 24 that detects the posture of the swing body 3, and a control device 40.
  • the position detection device 23 detects the position of the swing body 3 in the on-site coordinate system.
  • the position detection device 23 functions as a position detection unit that detects the position of the hydraulic shovel 1.
  • the position of the swing body 3 includes coordinates in the Xg axis direction, coordinates in the Yg axis direction, and coordinates in the Zg axis direction.
  • the position detection device 23 includes a GPS receiver.
  • the position detection device 23 is provided on the revolving unit 3.
  • a GPS antenna 21 is provided on the revolving unit 3.
  • two GPS antennas 21 are arranged in the Ym axis direction of the vehicle body coordinate system.
  • the GPS antenna 21 receives a radio wave from a GPS satellite, and outputs a signal generated based on the received radio wave to the position detection device 23.
  • the position detection device 23 detects the position of the GPS antenna 21 in the on-site coordinate system based on the signal from the GPS antenna 21.
  • the position detection device 23 performs arithmetic processing based on at least one of the positions of the two GPS antennas 21 to calculate the position of the revolving unit 3.
  • the position of the revolving unit 3 may be the position of one GPS antenna 21 or the position between the position of one GPS antenna 21 and the position of the other GPS antenna 21.
  • the posture detection device 24 detects the posture of the rotating body 3 in the on-site coordinate system.
  • the posture detection device 24 functions as a posture detection unit that detects the posture of the hydraulic shovel 1.
  • the posture of the revolving unit 3 includes a roll angle indicating the inclination of the revolving unit 3 in the rotation direction about the Xm axis, a pitch angle indicating the inclination of the revolving unit 3 in the rotation direction about the Ym axis, and an azimuth angle indicating the orientation of the revolving unit 3 in the rotation direction about the Zm axis.
  • the attitude detection device 24 includes an inertial measurement unit (IMU).
  • the posture detection device 24 is provided on the revolving unit 3.
  • a gyro sensor may be mounted on the revolving unit 3 as the posture detection device 24.
  • the posture detection device 24 detects an acceleration and an angular velocity that act on the posture detection device 24. By detecting the acceleration and angular velocity acting on the posture detection device 24, the acceleration and angular velocity acting on the revolving unit 3 are detected. The posture detection device 24 performs arithmetic processing based on the acceleration and angular velocity acting on the swing body 3 to calculate the posture of the swing body 3 including the roll angle, the pitch angle, and the azimuth angle.
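  • as an illustration of the static case, roll and pitch can be recovered from the gravity direction measured by the accelerometer; this is textbook IMU arithmetic rather than a formula disclosed in the excerpt, and the sample reading below is made up.

      import numpy as np

      def roll_pitch_from_accel(ax, ay, az):
          """Roll and pitch [rad] from a static accelerometer reading,
          assuming the only sensed acceleration is gravity."""
          roll = np.arctan2(ay, az)
          pitch = np.arctan2(-ax, np.hypot(ay, az))
          return roll, pitch

      roll, pitch = roll_pitch_from_accel(0.0, 0.17, 9.80)  # ~1 degree of roll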
  • the azimuth may be calculated based on the detection data of the position detection device 23.
  • the position detection device 23 can calculate the azimuth angle of the rotating body 3 with respect to the reference azimuth in the on-site coordinate system based on the position of one GPS antenna 21 and the position of the other GPS antenna 21.
  • the reference orientation is, for example, north.
  • the position detection device 23 calculates a straight line connecting the position of one GPS antenna 21 and the position of the other GPS antenna 21, and based on the angle formed by the calculated straight line and the reference direction, The azimuth angle can be calculated.
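  • a minimal sketch of that azimuth computation, assuming both antenna positions are already expressed in site coordinates and the reference azimuth lies along +Xg (the antenna coordinates are made up):

      import numpy as np

      def azimuth_from_antennas(p1, p2):
          """Azimuth [rad] of the line from antenna 1 to antenna 2 in the
          horizontal Xg-Yg plane, measured from the +Xg reference direction."""
          return np.arctan2(p2[1] - p1[1], p2[0] - p1[0])

      p1 = np.array([10.0, 5.0, 2.5])   # hypothetical antenna positions [m]
      p2 = np.array([10.5, 6.8, 2.5])
      print(np.degrees(azimuth_from_antennas(p1, p2)))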
  • FIG. 2 is a perspective view showing a part of the hydraulic shovel 1 according to the present embodiment.
  • the hydraulic shovel 1 has a stereo camera 300.
  • the stereo camera 300 refers to a camera capable of measuring the distance to the construction object SB by simultaneously photographing the construction object SB from a plurality of directions to generate parallax data.
  • the stereo camera 300 is mounted on the revolving unit 3.
  • the stereo camera 300 is provided in the cab 4.
  • the stereo camera 300 is disposed in front (+ Xm direction) and above (+ Zm direction) of the cab 4.
  • the stereo camera 300 photographs the construction target SB in front of the hydraulic shovel 1.
  • the stereo camera 300 has a plurality of imaging devices 30.
  • the imaging device 30 is mounted on the revolving unit 3.
  • the imaging device 30 has an optical system and an image sensor.
  • the image sensor includes a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • the imaging device 30 includes four imaging devices 30A, 30B, 30C, and 30D.
  • Stereo camera 300 is configured by the pair of imaging devices 30.
  • Stereo camera 300 includes a first stereo camera 301 configured by a pair of imaging devices 30A and 30B, and a second stereo camera 302 configured by a pair of imaging devices 30C and 30D.
  • the imaging devices 30A and 30C are disposed on the + Ym side (the work machine 2 side) of the imaging devices 30B and 30D.
  • the imaging device 30A and the imaging device 30B are arranged at intervals in the Ym axis direction.
  • the imaging device 30C and the imaging device 30D are arranged at intervals in the Ym axis direction.
  • the imaging devices 30A and 30B are disposed on the + Zm side of the imaging devices 30C and 30D. In the Zm-axis direction, the imaging device 30A and the imaging device 30B are disposed at substantially the same position. In the Zm-axis direction, the imaging device 30C and the imaging device 30D are disposed at substantially the same position.
  • the imaging devices 30A and 30B face upward (in the + Zm direction).
  • the imaging devices 30C and 30D face downward (-Zm direction). Further, the imaging devices 30A and 30C face forward (in the + Xm direction).
  • the imaging devices 30B and 30D are oriented slightly toward the +Ym side (work implement 2 side) of straight ahead. In other words, the imaging devices 30A and 30C face the front of the swing structure 3, and the imaging devices 30B and 30D are inclined slightly toward the imaging devices 30A and 30C.
  • conversely, the imaging devices 30B and 30D may face the front of the swing body 3, with the imaging devices 30A and 30C inclined slightly toward the imaging devices 30B and 30D.
  • the imaging device 30 photographs the construction target SB existing in front of the revolving unit 3.
  • the stereo image processing of the pair of images taken by the pair of imaging devices 30 is performed by the control device 40, whereby three-dimensional data indicating the three-dimensional shape of the construction target SB is calculated.
  • the control device 40 converts three-dimensional data of the construction object SB in the imaging device coordinate system into three-dimensional data of the construction object SB in the on-site coordinate system.
  • the three-dimensional data indicates the three-dimensional position of the construction object SB.
  • the three-dimensional position of the construction object SB includes three-dimensional coordinates of each of a plurality of portions of the surface of the construction object SB.
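  • the excerpt does not disclose the stereo algorithm itself. As one illustration, a rectified image pair can be processed with OpenCV's semi-global block matching, converting disparity d to depth with Z = f * B / d for focal length f and baseline B; the file names and numbers below are placeholders.

      import cv2
      import numpy as np

      left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder paths
      right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

      stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
      disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # to pixels

      f = 1000.0   # assumed focal length [px]
      B = 0.3      # assumed baseline between the paired imaging devices [m]
      valid = disparity > 0
      depth = np.zeros_like(disparity)
      depth[valid] = f * B / disparity[valid]  # depth along the optical (Zs) axis [m]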
  • An imaging device coordinate system is defined for each of the plurality of imaging devices 30.
  • the imaging device coordinate system is a coordinate system based on the origin fixed to the imaging device 30.
  • the Zs axis of the imaging device coordinate system coincides with the optical axis of the optical system of the imaging device 30.
  • in the present embodiment, the first stereo camera 301 and the second stereo camera 302 are mounted on the revolving unit 3, but only one stereo camera may be mounted, or three or more stereo cameras may be mounted.
  • the hydraulic shovel 1 has a driver's seat 4S, an input device 32, and an operation device 35.
  • the driver's seat 4S, the input device 32, and the operating device 35 are disposed in the driver's cab 4.
  • the driver of the hydraulic shovel 1 sits on the driver's seat 4S.
  • the input device 32 is operated by the driver for start or end of imaging by the imaging device 30.
  • the input device 32 is provided near the driver's seat 4S. When the input device 32 is operated, imaging by the imaging device 30 starts or ends.
  • the operating device 35 is operated by the driver to drive or stop the work implement 2, to swing or stop the swing body 3, and to cause the traveling body 5 to travel or stop.
  • the operating device 35 includes a right operating lever 35R and a left operating lever 35L for operating the work machine 2 and the swing body 3.
  • the operating device 35 includes a right travel lever and a left travel lever (not shown) for operating the traveling body 5.
  • FIG. 3 is a functional block diagram showing an example of a measurement system 50 according to the present embodiment.
  • the measurement system 50 is provided to the hydraulic shovel 1.
  • the measurement system 50 includes the control device 40, the stereo camera 300 including the first stereo camera 301 and the second stereo camera 302, the position detection device 23, the posture detection device 24, an operation amount sensor 36 that detects an operation amount of the operation device 35, and the input device 32.
  • the control device 40 is provided on the swing body 3 of the hydraulic shovel 1.
  • Control device 40 includes a computer system.
  • the control device 40 includes an arithmetic processing unit 41 including a processor such as a CPU (Central Processing Unit), a storage device 42 including volatile memory such as RAM (Random Access Memory) and non-volatile memory such as ROM (Read Only Memory), and an input/output interface 43.
  • the arithmetic processing unit 41 includes an image acquisition unit 410, a signal acquisition unit 411, an image processing unit 412, a common part extraction unit 413, an imaging position calculation unit 416, a three-dimensional position calculation unit 417, and a determination unit 418. Have.
  • the image acquisition unit 410 acquires a plurality of images PC of the construction target SB captured by the imaging device 30 mounted on the swing structure 3 during the operation of the hydraulic shovel 1. Further, the image acquisition unit 410 acquires the image PC of the construction object SB captured by the imaging device 30 in the operation stop state of the hydraulic shovel 1, that is, in the state in which traveling and turning are also stopped.
  • the operation of the hydraulic shovel 1 includes one or both of the turning of the revolving unit 3 and the traveling of the traveling unit 5.
  • the fact that the hydraulic shovel 1 is in motion stop includes that the swing body 3 is in the swing stop state and that the traveling body 5 is in the travel stop state.
  • the image acquisition unit 410 acquires a plurality of images PC of the target SB captured by the imaging device 30 while the traveling body 5 is stopped and the swing body 3 is turning. Further, the image acquisition unit 410 acquires the image PC of the object SB captured by the imaging device 30 while the traveling body 5 is stopped and the swing body 3 is stopped.
  • the signal acquisition unit 411 acquires a command signal generated by operating the input device 32.
  • the input device 32 is operated to start or end imaging by the imaging device 30.
  • the command signal includes a shooting start command signal and a shooting end command signal.
  • the arithmetic processing unit 41 outputs a control signal for causing the imaging device 30 to start imaging based on the imaging start instruction signal acquired by the signal acquisition unit 411. Further, the arithmetic processing unit 41 outputs a control signal for causing the imaging device 30 to end shooting based on the shooting end command signal acquired by the signal acquisition unit 411.
  • the images PC captured in the period between the time when the signal acquisition unit 411 acquires the imaging start command signal and the time when it acquires the imaging end command signal are stored in the storage device 42, and the stored images PC may be used for stereo processing.
  • the image processing unit 412 corrects the contrast distribution of the image PC acquired by the image acquisition unit 410.
  • the image processing unit 412 corrects the contrast distribution of the image PC based on a histogram equalization algorithm, which is a type of histogram correction method.
  • histogram equalization refers to a process of converting the image PC so that the histogram indicating the frequency of each tone value of the pixels of the image PC is distributed uniformly over the entire tone range.
  • the image processing unit 412 performs image processing based on the histogram equalization algorithm to generate an image PC with improved contrast.
  • the image processing unit 412 corrects the luminance (Luminance) of each pixel of the image PC to improve the contrast of the image PC.
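  • a minimal sketch of this correction with OpenCV's built-in histogram equalization (the file name is a placeholder):

      import cv2

      gray = cv2.imread("image_pc.png", cv2.IMREAD_GRAYSCALE)
      equalized = cv2.equalizeHist(gray)  # spread the tone histogram over the full range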
  • the common part extraction unit 413 extracts a common part KS of the plurality of images PC captured by the imaging device 30 during turning of the swing structure 3 of the hydraulic shovel 1.
  • the common part KS will be described later.
  • the imaging position calculation unit 416 calculates the position P and orientation of the imaging device 30 at the time of imaging.
  • the position P of the imaging device 30 at the time of shooting includes the position of the imaging device 30 in the turning direction RD.
  • the position P of the imaging device 30 at the time of shooting includes the position of the imaging device 30 in the traveling direction MD.
  • the imaging position calculation unit 416 calculates the turning angle ⁇ of the turning body 3.
  • the imaging position calculation unit 416 includes position data of the hydraulic shovel 1 detected by the position detection device 23, posture data of the hydraulic shovel 1 detected by the posture detection device 24, and the common part KS extracted by the common part extraction unit 413. Based on the above, the position P and the posture of the imaging device 30 at the time when the image PC is captured are calculated.
  • the three-dimensional position calculation unit 417 performs stereo processing on a pair of images PC captured by the pair of imaging devices 30, to calculate the three-dimensional position of the construction target SB in the imaging device coordinate system.
  • the three-dimensional position calculation unit 417 converts the three-dimensional position of the construction object SB in the imaging device coordinate system into a three-dimensional position in the on-site coordinate system based on the position P and the posture of the imaging device 30 calculated by the imaging position calculation unit 416.
  • the storage device 42 includes an image storage unit 423.
  • the image storage unit 423 sequentially stores a plurality of images PC captured by the imaging device 30.
  • the input/output interface 43 includes an interface circuit that connects the arithmetic processing unit 41 and the storage device 42 to external devices.
  • the hub 31, the position detection device 23, the posture detection device 24, the operation amount sensor 36, and the input device 32 are connected to the input / output interface 43.
  • the plurality of imaging devices 30 (30A, 30B, 30C, 30D) are connected to the arithmetic processing unit 41 via the hub 31.
  • the imaging device 30 captures an image PC of the construction target SB based on the imaging start command signal from the signal acquisition unit 411.
  • the image PC of the construction object SB captured by the imaging device 30 is input to the arithmetic processing unit 41 and the storage device 42 through the hub 31 and the input / output interface 43.
  • Each of the image acquisition unit 410 and the image storage unit 423 acquires the image PC of the construction target SB captured by the imaging device 30 via the hub 31 and the input / output interface 43.
  • the hub 31 may be omitted.
  • the input device 32 is operated to start or end imaging by the imaging device 30. By operating the input device 32, a shooting start command signal or a shooting end command signal is generated. As the input device 32, at least one of an operation switch, an operation button, a touch panel, a voice input, and a keyboard is exemplified.
  • FIG. 4 is a diagram schematically showing an example of operation of the measurement system 50 according to the present embodiment.
  • the measurement system 50 continuously captures the image PC of the construction target SB around the hydraulic shovel 1 by the imaging device 30 while the swing body 3 is swinging.
  • the imaging device 30 sequentially captures images PC of the construction target SB at a predetermined cycle while the revolving unit 3 is turning.
  • the imaging device 30 is mounted on the revolving unit 3. When the swing body 3 swings, the imaging region FM of the imaging device 30 moves in the swing direction RD.
  • the imaging device 30 can obtain the images PC of the plurality of regions of the construction object SB by the imaging device 30 continuously photographing the images PC of the construction object SB while the revolving structure 3 is turning.
  • the three-dimensional position calculation unit 417 can calculate the three-dimensional position of the construction target SB around the hydraulic shovel 1 by performing stereo processing on the pair of images PC captured by the pair of imaging devices 30.
  • the three-dimensional position of the construction target SB calculated by stereo processing is defined in the imaging device coordinate system.
  • the three-dimensional position calculation unit 417 converts a three-dimensional position in the imaging device coordinate system into a three-dimensional position in the work site coordinate system.
  • for this conversion, the position and posture of the revolving unit 3 in the on-site coordinate system are required.
  • the position and posture of the swing body 3 in the site coordinate system can be detected by the position detection device 23 and the posture detection device 24.
  • when the hydraulic shovel 1 operates, each of the position detection device 23 and the posture detection device 24 mounted on the hydraulic shovel 1 is displaced.
  • the detection data output from each of the position detection device 23 and the posture detection device 24 in a moving state may be unstable or the detection accuracy may decrease.
  • the position detection device 23 outputs detection data at a predetermined cycle. Therefore, when position detection by the position detection device 23 and photographing by the imaging device 30 are performed in parallel while the hydraulic shovel 1 is turning, the timing at which the imaging device 30 captures an image may not be synchronized with the timing at which the position detection device 23 detects the position. If the three-dimensional position of the construction object SB is coordinate-converted based on detection data of the position detection device 23 detected at a timing different from the imaging timing, the measurement accuracy of the three-dimensional position may be reduced.
  • the measurement system 50 calculates with high accuracy the position and orientation of the swing body 3 at the time when the image PC is photographed by the imaging device 30 while the swing body 3 is swinging, based on a method described later.
  • the measurement system 50 can calculate the three-dimensional position of the construction object SB in the on-site coordinate system with high accuracy.
  • the imaging position calculation unit 416 acquires detection data of each of the position detection device 23 and the posture detection device 24 detected when the hydraulic shovel 1 is in the operation stop state. There is a high possibility that the detection data detected by the position detection device 23 and the posture detection device 24 when the hydraulic shovel 1 is in the operation stop state is stable.
  • the imaging position calculation unit 416 acquires detection data detected in the operation stop state before and after the turning of the turning body 3.
  • the imaging device 30 captures an image PC of the construction object SB in the operation stop state before and after the turning of the swing body 3.
  • based on the detection data of the position detection device 23 and the posture detection device 24 acquired while the hydraulic shovel 1 is in the operation stop state, the three-dimensional position of the construction object SB in the imaging device coordinate system, calculated from the images PC captured in the operation stop state, is converted into a three-dimensional position in the on-site coordinate system.
  • for the images PC captured by the imaging device 30 while the swing body 3 is swinging, the imaging position calculation unit 416 calculates the position and posture of the swing body 3 at the time each image PC was captured.
  • the imaging position calculation unit 416 converts the three-dimensional position in the imaging device coordinate system calculated from the captured image PC into a three-dimensional position in the site coordinate system based on the calculated position and posture of the swing body 3.
  • FIG. 5 is a diagram schematically showing an example of operation of the measurement system 50 according to the present embodiment.
  • FIG. 5 is a schematic diagram for explaining that the imaging device 30 captures an image of the construction target SB while the revolving structure 3 is revolving.
  • the traveling body 5 is in the traveling stop state.
  • the imaging device 30 mounted on the swing body 3 and the imaging region FM of the imaging device 30 move in the swing direction RD.
  • the imaging region FM of the imaging device 30 is defined based on the field of view of the optical system of the imaging device 30.
  • the imaging device 30 acquires an image PC of the construction target SB disposed in the imaging region FM. Due to the swing of the swing body 3, the imaging region FM of the imaging device 30 moves in the swing direction RD of the swing body 3.
  • the imaging device 30 captures an image PC of the construction target SB sequentially disposed in the moving imaging region FM.
  • FIG. 5 illustrates an example in which the imaging region FM moves in the rotation direction RD in the order of the imaging region FM1, the imaging region FM2, and the imaging region FM3 by the rotation of the revolving structure 3.
  • the imaging area FM1 is defined at the first position PJ1 in the turning direction RD.
  • the imaging area FM2 is defined at the second position PJ2 in the turning direction RD.
  • the imaging area FM3 is defined at the third position PJ3 in the turning direction RD.
  • the second position PJ2 is a position turned from the first position PJ1 by a turning angle ⁇ 1.
  • the third position PJ3 is a position turned from the second position PJ2 by a turning angle ⁇ 2.
  • the imaging device 30 displays each of the image PC1 of the construction object SB disposed in the imaging area FM1, the image PC2 of the construction object SB disposed in the imaging area FM2, and the image PC3 of the construction object SB disposed in the imaging area FM3. Take a picture.
  • the image PC1, the image PC2, and the image PC3 are images captured by the same imaging device 30 (the imaging device 30C in the example shown in FIG. 5).
  • the imaging device 30 performs imaging at a predetermined timing so that the overlapping region OB is provided in the adjacent imaging regions FM.
  • FIG. 5 shows an example in which the overlapping area OB1 is provided in the imaging area FM1 and the imaging area FM2, and the overlapping area OB2 is provided between the imaging area FM2 and the imaging area FM3.
  • the overlapping area OB1 is an overlapping area OB in which the image PC1 and a part of the image PC2 overlap.
  • the overlapping area OB2 is an overlapping area OB in which the image PC2 and a part of the image PC3 overlap.
  • the common part KS of the image PC is present in the overlapping area OB.
  • the common part KS1 present in the overlapping area OB1 is a common part KS between the image PC1 and the image PC2.
  • the common part KS2 present in the overlapping area OB2 is a common part KS between the image PC2 and the image PC3.
  • the common part extraction unit 413 extracts a common part KS of the plurality of two-dimensional images PC captured by the imaging device 30.
  • FIG. 6 is a schematic view for explaining an example of processing of the measurement system 50 according to the present embodiment.
  • FIG. 6 is a view showing an example of an image PC (PC1, PC2) captured when the revolving unit 3 is pivoting.
  • the common part extraction unit 413 extracts the common part KS from the image PC.
  • the common part extraction unit 413 uses the image PC1 and the image PC2 from the image PC1 of the construction object SB disposed in the imaging region FM1 and the image PC2 of the construction object SB disposed in the imaging region FM2. Extract common part KS1 with.
  • the imaging position calculation unit 416 calculates an estimated angle ⁇ s ( ⁇ 1 in FIG. 5) of the rotating body 3 based on the common part KS1 extracted by the common part extraction unit 413. Further, the imaging position calculation unit 416 calculates an estimated position Ps (PJ2 in FIG. 5) at the time of imaging based on the estimated angle ⁇ s.
  • the common part KS is a feature point in the image PC.
  • the common part extraction unit 413 extracts the common part KS based on a known feature point detection algorithm such as, for example, ORB (Oriented FAST and Rotated Brief) or Harris corner detection.
  • the common part extraction unit 413 extracts a plurality of feature points from each of the plurality of images PC, and extracts a common part KS by searching for similar feature points from the plurality of extracted feature points.
  • the common part extraction unit 413 may extract a plurality of common parts KS.
  • the common part extraction unit 413 extracts, for example, a corner of the construction target SB in the image PC as a feature point.
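  • a minimal sketch of extracting and matching such feature points with OpenCV's ORB implementation; the surviving matches play the role of the common parts KS (file names and parameters are illustrative):

      import cv2

      pc1 = cv2.imread("pc1.png", cv2.IMREAD_GRAYSCALE)
      pc2 = cv2.imread("pc2.png", cv2.IMREAD_GRAYSCALE)

      orb = cv2.ORB_create(nfeatures=500)
      kp1, des1 = orb.detectAndCompute(pc1, None)
      kp2, des2 = orb.detectAndCompute(pc2, None)

      # Brute-force Hamming matching with cross-checking, best matches first.
      matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
      matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
      common_parts = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:100]]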
  • the imaging area FM of the imaging device 30 moves in the revolving direction RD by the rotation of the rotating body 3, and the common part KS in the image PC is displaced by the movement of the imaging area FM.
  • the common part KS exists at the pixel position PX1 in the image PC1 and exists at the pixel position PX2 in the image PC2.
  • the position where the common part KS exists differs between the image PC1 and the image PC2. That is, the common part KS1 is displaced between the image PC1 and the image PC2.
  • the imaging position calculation unit 416 can calculate the turning angle ⁇ 1 from the position PJ1 to the position PJ2 based on the position of the common part KS in the plurality of images PC.
  • FIG. 7 is a schematic view for explaining an example of processing of the measurement system 50 according to the present embodiment.
  • FIG. 7 is a diagram for explaining the movement of the imaging device coordinate system (Xs, Ys, Zs) as the revolving unit 3 revolves.
  • the swing body 3 swings around the swing axis Zr in the Xm-Ym plane of the vehicle body coordinate system.
  • the imaging device coordinate system (Xs, Ys, Zs) moves around the swing axis Zr by the swing angle ⁇ .
  • under the constraint that the swing body 3 turns about the swing axis Zr, the imaging position calculation unit 416 can calculate the estimated angle θs of the swing body 3 when it has swung from the pre-swing angle θra by the swing angle θ, and the corresponding estimated position Ps of the imaging device 30.
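  • the excerpt does not spell out the estimator. One simple sketch consistent with the fixed-pivot constraint is a least-squares fit of a single rotation angle about the swing axis, here in the horizontal Xm-Ym plane with made-up point positions:

      import numpy as np

      def swing_angle(before, after, pivot):
          """Least-squares rotation angle [rad] about a fixed 2-D pivot that
          maps the points `before` onto the points `after`."""
          u = before - pivot
          v = after - pivot
          cross = np.sum(u[:, 0] * v[:, 1] - u[:, 1] * v[:, 0])
          dot = np.sum(u[:, 0] * v[:, 0] + u[:, 1] * v[:, 1])
          return np.arctan2(cross, dot)

      pivot = np.array([0.0, 0.0])      # swing axis Zr pierces the Xm-Ym plane here
      before = np.array([[5.0, 1.0], [6.0, -2.0], [7.5, 0.5]])
      theta = np.radians(12.0)          # ground-truth angle for the demo
      R = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
      after = (R @ before.T).T
      print(np.degrees(swing_angle(before, after, pivot)))  # ~12.0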
  • FIG. 8 is a schematic view for explaining an example of processing of the measurement system 50 according to the present embodiment.
  • FIG. 8 is a schematic view showing an example of calculating the turning angle θ of the turning body 3 and the position P of the imaging device 30. FIG. 8 shows the turning angle θ of the turning body 3 and the position P of the imaging device 30 in the vehicle body coordinate system.
  • the swing body 3 turns in the turning direction RD in the order of the pre-turning angle ⁇ ra, the first estimated angle ⁇ s1, the second estimated angle ⁇ s2, the third estimated angle ⁇ s3, and the after-turning angle ⁇ rb.
  • the imaging device 30 moves in order of the pre-rotation position Pra, the first estimated position Ps1, the second estimated position Ps2, the third estimated position Ps3, and the post-rotation position Prb by the rotation of the rotating body 3.
  • the imaging area FM of the imaging device 30 moves in the order of the imaging area FMra, the imaging area FMs1, the imaging area FMs2, the imaging area FMs3, and the imaging area FMrb by the movement of the imaging apparatus 30 in the turning direction RD.
  • at the pre-rotation angle θra and the post-rotation angle θrb, the swing body 3 is in the swing stop state.
  • starting from the swing stop state at the pre-swing angle θra, the swing body 3 turns to the post-swing angle θrb via the first estimated angle θs1, the second estimated angle θs2, and the third estimated angle θs3.
  • the imaging device 30 acquires the image PCra of the construction target SB disposed in the imaging region FMra in a state where the imaging device 30 is disposed at the pre-turning position Pra.
  • the imaging device 30 acquires the image PCs1 of the construction target SB disposed in the imaging region FMs1 in the state of being disposed at the first estimated position Ps1.
  • the imaging device 30 acquires the image PCs2 of the construction target SB disposed in the imaging region FMs2 in the state of being disposed at the second estimated position Ps2.
  • the imaging device 30 acquires the image PCs3 of the construction target SB disposed in the imaging region FMs3 in the state of being disposed at the third estimated position Ps3.
  • the imaging device 30 acquires the image PCrb of the construction target SB disposed in the imaging region FMrb in a state where the imaging device 30 is disposed at the post-turning position Prb.
  • the imaging position calculation unit 416 calculates the post-swing angle θrb based on the position and posture of the swing body 3 detected by the position detection device 23 and the posture detection device 24 in the swing stop state after the swing body 3 finishes swinging. The pre-swing angle θra is calculated in the same way from detection data acquired in the swing stop state before the swing starts.
  • the imaging position calculation unit 416 also calculates the pre-turn position Pra based on the pre-turn angle ⁇ ra.
  • the imaging position calculation unit 416 also calculates the after-turn position Prb based on the after-turn angle ⁇ rb.
  • the pre-turning angle ⁇ ra, the post-turning angle ⁇ rb, the pre-turning position Pra, and the post-turning position Prb are calculated with high accuracy based on detection data of the position detection device 23 and detection data of the posture detection device 24.
  • the imaging position calculation unit 416 estimates the estimated angle ⁇ s (the first estimated angle ⁇ s1, the second estimated angle ⁇ s2, and the like) based on the common part KS of the plurality of images PC captured while the rotating body 3 is turning. And a third estimated angle ⁇ s3). Further, the imaging position calculation unit 416 calculates an estimated position Ps (a first estimated position Ps1, a second estimated position Ps2, and a third estimated position Ps3) based on the estimated angle ⁇ s.
  • the common part extraction unit 413 extracts a common part KS1 between the image PCra and the image PCs1.
  • under the constraint that the turning body 3 turns about the turning axis Zr, the imaging position calculation unit 416 calculates the turning angle θ1 based on the pre-turning angle θra of the turning body 3 and the common part KS1 of the image PCra and the image PCs1.
  • in the same way, the imaging position calculation unit 416 can calculate the estimated angles θs (θs1, θs2, θs3, θs4) of the swing body 3 at the times when the imaging device 30 captured images while the swing body 3 was swinging. Further, from the estimated angles θs, the imaging position calculation unit 416 can calculate the estimated positions Ps (Ps1, Ps2, Ps3, Ps4) of the imaging device 30 while the swing body 3 is swinging.
  • the imaging position calculation unit 416 acquires, from the imaging device 30, point-in-time data indicating a point in time when each of the plurality of images PC (PCra, PCs1, PCs2, PCs3, PCs4, and PCrb) is captured.
  • the imaging position calculation unit 416 can calculate the turning speed V based on the time at which each of the plurality of images PC was captured and the turning angles θ (θ1, θ2, θ3, θ4).
  • for example, the turning speed V when turning from the estimated angle θs1 to the estimated angle θs2 can be calculated. Further, the imaging position calculation unit 416 can calculate the turning direction RD based on the time at which each of the plurality of images PC was captured and the turning angle θ.
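  • a sketch of that arithmetic; the capture times and cumulative angles are illustrative:

      import numpy as np

      times = np.array([0.0, 0.5, 1.0, 1.5])        # capture times [s], assumed
      angles = np.radians([0.0, 6.0, 12.5, 18.0])   # cumulative swing angles [rad]

      turning_speed = np.diff(angles) / np.diff(times)  # [rad/s] between captures
      turning_direction = np.sign(turning_speed)        # sense of the swing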
  • the imaging position calculation unit 416 can calculate the turning angle ⁇ based on the common part KS of the plurality of images PC.
  • the post-turning angle ⁇ rb is calculated with high accuracy based on the detection data of the position detection device 23 and the detection data of the posture detection device 24.
  • the estimated angle ⁇ s4 is calculated based on the displacement of the common portion KS. If the estimated angle ⁇ s4 and the estimated position Ps4 calculated using the common part KS are accurate, the difference between the estimated angle ⁇ s4 and the post-turning angle ⁇ rb is small, and the difference between the estimated position Ps4 and the post-turning position Prb is small.
  • however, depending on the extraction state of the common part KS, the errors of the estimated angle θs4 and the estimated position Ps4 calculated using the common part KS may be large.
  • in that case, the difference between the estimated angle θs4 and the post-turning angle θrb is large, and the difference between the estimated position Ps4 and the post-turning position Prb is large.
  • therefore, based on the angles θr (the pre-turning angle θra and the post-turning angle θrb), which are the turning angles θ in the turning stop state, the imaging position calculation unit 416 corrects the estimated angles θs (the first estimated angle θs1, the second estimated angle θs2, the third estimated angle θs3, and the fourth estimated angle θs4) calculated while the swing body was turning. Further, the imaging position calculation unit 416 corrects the estimated positions Ps based on the corrected estimated angles θs.
  • the imaging position calculation unit 416 can accurately calculate the estimated position Ps1 of the imaging device 30 at the time of shooting based on the corrected estimated angle θs1'. Similarly, it can accurately calculate the estimated position Ps2 based on the corrected estimated angle θs2', and the estimated position Ps3 at the time of shooting the image PCs3 based on the corrected estimated angle θs3'. Therefore, the three-dimensional position calculation unit 417 can calculate the three-dimensional position of the construction target SB in the on-site coordinate system with high accuracy, based on the accurately calculated estimated positions Ps of the imaging device 30 at the times of imaging and the images PCs captured by the imaging device 30.
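  • the text states that the estimated angles are corrected using the accurately measured end-point angles but leaves the correction method open. One plausible sketch, purely an assumption, rescales the visually estimated angles so that the final estimate coincides with the post-swing angle:

      import numpy as np

      def correct_estimates(theta_ra, theta_rb, estimates):
          """Proportionally rescale estimated swing angles so the last one
          matches the measured post-swing angle (an assumed scheme, not
          necessarily the patent's)."""
          estimates = np.asarray(estimates, dtype=float)
          span_est = estimates[-1] - theta_ra
          scale = (theta_rb - theta_ra) / span_est if span_est else 1.0
          return theta_ra + (estimates - theta_ra) * scale

      corrected = correct_estimates(np.radians(0.0), np.radians(60.0),
                                    np.radians([14.0, 29.0, 45.0, 58.0]))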
  • in the above description, an example in which the common part extraction unit 413 extracts the common part KS from the entire image PC has been described.
  • the common part extraction unit 413 may set an extraction target area WD for extracting the common part KS to a part of the image PC, and extract the common part KS from the extraction target area WD.
  • the following description will be made on the assumption that the extraction target area WD is set in the central portion of the image PC.
  • the common part extraction unit 413 can divide the extraction target area WD set in the central part of the image PC into a plurality of divided areas, and extract the common part KS from each of the plurality of divided areas.
  • FIG. 9 schematically shows a state in which one extraction target area WD is set in the image PC.
  • FIG. 9 shows, for example, an image PC captured by the imaging device 30C facing downward.
  • when feature points (common parts KS) are extracted by a feature point detection algorithm such as the above-described ORB, the common parts KS may concentrate in a specific area without being dispersed, depending on the shape of the construction object SB or the imaging conditions of the image PC, as shown in FIG. 9.
  • because such a feature point detection algorithm extracts common parts KS in descending order of feature amount, if points with high feature amounts concentrate in one portion of the image, the extracted common parts KS concentrate in that portion.
  • for example, when a hole HL appears in the image PC, the inside of the hole HL is a dark portion and the periphery of the hole HL is a bright portion.
  • the edge of the hole HL, which is the boundary between the dark portion and the bright portion, has high contrast in the image PC, so the extracted common parts KS are likely to concentrate on the edge of the hole HL.
  • on the other hand, since the dark portion inside the hole HL has low contrast, it is difficult to extract the common part KS there.
  • the accuracy of the estimated angle θs and the estimated position Ps calculated based on the common parts KS improves when the common parts KS, which are feature points, are dispersed in the image PC or the extraction target area WD.
  • therefore, the common part extraction unit 413 divides the extraction target area WD of the image PC into a plurality of divided areas and extracts the common parts KS from each of the plurality of divided areas, so that the common parts KS extracted in the image PC are dispersed. For example, when 100 common parts KS are extracted in descending order of feature amount based on the feature point detection algorithm, if the extraction target region WD is not divided, the 100 common parts KS concentrate along the edge of the hole HL as shown in FIG. 9. On the other hand, when the extraction target area WD is divided into four, 25 common parts KS are extracted in descending order of feature amount in each of the four divided areas, so the common parts KS extracted in the image PC are dispersed. A sketch of this per-region extraction follows.
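  • a minimal sketch of the per-region extraction just described: detect ORB keypoints separately in each grid cell and keep the strongest points of each cell so the extracted points stay dispersed (grid size and counts are illustrative):

      import cv2

      def keypoints_per_region(gray, rows=2, cols=2, per_region=25):
          """Return (x, y) keypoint coordinates, the strongest `per_region`
          from each grid cell of the image, in full-image coordinates."""
          orb = cv2.ORB_create(nfeatures=per_region)
          h, w = gray.shape
          points = []
          for r in range(rows):
              for c in range(cols):
                  y0, y1 = r * h // rows, (r + 1) * h // rows
                  x0, x1 = c * w // cols, (c + 1) * w // cols
                  kps = orb.detect(gray[y0:y1, x0:x1], None)
                  kps = sorted(kps, key=lambda k: k.response, reverse=True)
                  points += [(k.pt[0] + x0, k.pt[1] + y0) for k in kps[:per_region]]
          return points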
  • the imaging position calculation unit 416 can calculate the estimated angle ⁇ s and the estimated position Ps with high accuracy based on the dispersed common part KS.
  • FIG. 10 is a diagram showing an example of the extraction target area WD of the image PC according to the present embodiment.
  • the common part extraction unit 413 divides the extraction target area WD of the image PC into a plurality of divided areas WD1, WD2, WD3, and WD4, and extracts the common parts KS of the plurality of images PC from each of the divided areas.
  • the imaging position calculation unit 416 acquires the estimated angle θs and the estimated position Ps at the time of imaging based on the common parts KS1, KS2, KS3, and KS4 extracted from the divided areas WD1, WD2, WD3, and WD4, respectively.
  • the common part extraction unit 413 extracts at least one common part KS from each of the plurality of divided areas WD1, WD2, WD3, and WD4.
  • the common part extraction unit 413 extracts a predetermined number or more of common parts KS from each of the plurality of divided areas WD1, WD2, WD3, and WD4.
  • FIG. 10 shows an example in which at least six common parts KS (KS1, KS2, KS3, KS4) are extracted from each of the plurality of divided areas WD1, WD2, WD3, WD4. Thereby, the common part KS extracted in the image PC is dispersed.
  • the extraction target area WD has a rectangular shape.
  • the extraction target area WD is divided at its center in the turning direction RD by a first dividing line, and at its center in the vertical direction orthogonal to the turning direction RD by a second dividing line.
  • the outer shapes and areas of divided regions WD1, WD2, WD3, and WD4 are the same.
  • the shapes of the extraction target area WD and the divided areas WD1, WD2, WD3, and WD4 can be arbitrarily determined.
  • the extraction target area WD need not be rectangular; it may be square or circular.
  • the outer shapes and areas of divided regions WD1, WD2, WD3 and WD4 may be the same or different. Further, the number of divided regions is not limited to four, and can be set to any plural number.
  • since a dark portion such as the inside of the hole HL has low contrast, it is difficult to extract the common part KS there.
  • therefore, in the present embodiment, the image processing unit 412 corrects the contrast distribution of the image PC.
  • the image processing unit 412 corrects the contrast distribution of the image PC based on, for example, the histogram equalization algorithm.
  • the common part extraction unit 413 extracts a common part KS of the plurality of images PC from the extraction target area WD of the image PC for which the contrast distribution has been corrected. As a result, even if a portion with low contrast is present in the divided area, the contrast distribution is corrected so that the contrast of the divided area becomes high. Since the contrast of the divided area becomes high, the common part extraction unit 413 can extract the common part KS from the divided area.
  • FIG. 11 is a diagram in which the common parts KS are extracted in the image PC in which the contrast distribution of each divided region has been corrected by the histogram equalization algorithm.
  • because the image processing unit 412 corrects the luminance of each pixel of the divided areas to improve the contrast of the image PC, the common part extraction unit 413 can extract common parts KS even at the hole HL in the divided areas WD1, WD2, WD3, and WD4.
  • the extraction target area WD is defined in the center of the image PC.
  • a frame-like non-target area WE is defined around the extraction target area WD.
  • when the swing body 3 turns, the common part KS extracted in the extraction target area WD moves into the non-target area WE on the left side or the right side of the extraction target area WD. That is, since the common part KS extracted in the extraction target area WD does not immediately disappear from the image PC but moves into the non-target area WE, the common part KS can be extracted continuously and the turning angle θ can be estimated accurately.
  • in the vertical direction, the extraction target area WD is defined in the central part of the image PC so that the traveling body 5 is located outside the extraction target area WD, in other words, so that the image of the traveling body 5 does not enter the extraction target area WD. This suppresses extraction of feature points on the traveling body 5, and the common part extraction unit 413 can extract more feature points of the construction object SB.
  • the image processing unit 412 may correct the contrast distribution for each of the plurality of divided areas WD1, WD2, WD3, and WD4, may correct the contrast distribution over the entire undivided extraction target area WD, or may correct the contrast distribution over the entire image PC.
  • the extraction target area WD may be set at a position shifted from the center of the image PC.
  • the entire area of the image PC may be set as the extraction target area WD without providing the non-target area WE.
  • the input device 32 has a swing continuous shooting switch 32A capable of generating a shooting start command signal for commanding the start of the swing continuous shooting mode and a shooting end command signal for instructing the end of the swing continuous shooting mode.
  • FIG. 12 is a flowchart showing an example of the measurement method according to the present embodiment.
  • FIG. 13 is a timing chart of the measurement method according to the present embodiment.
  • the driver of the hydraulic shovel 1 operates the operation device 35 to turn the swing body 3 so that the imaging device 30 faces the measurement start position of the construction target SB.
  • the determination unit 418 determines that the position detection device 23 and the posture detection device 24 are each in a stationary state in which detection data can be output stably.
  • the imaging position calculation unit 416 acquires detection data indicating the position of the swing body 3 from the position detection device 23, and detection data indicating the posture of the swing body 3 from the posture detection device 24 (step S10).
  • the imaging position calculation unit 416 acquires the pre-turning angle ⁇ ra and the pre-turning position Pra.
  • the detection data of the position detection device 23 and the detection data of the attitude detection device 24 are temporarily stored in the storage device 42.
  • when starting the turning continuous shooting mode, the driver of the hydraulic shovel 1 operates (depresses) the turning continuous shooting switch 32A. In the example shown in FIG. 13, the turning continuous shooting switch 32A is operated at time t2. The shooting start command signal generated by operating the turning continuous shooting switch 32A is output to the arithmetic processing unit 41.
  • the signal acquisition unit 411 acquires a photographing start instruction signal (step S20).
  • the arithmetic processing unit 41 starts shooting of the imaging device 30 (step S30).
  • the image acquisition unit 410 acquires the image PCra of the target SB captured by the imaging device 30 in the operation stop state before the hydraulic shovel 1 starts the operation. Further, the image storage unit 423 stores the image PCra.
  • the driver operates the operation device 35 to start the turning of the swing body 3 in a state in which the traveling of the traveling body 5 is stopped (step S40).
  • the driver operates the operation device 35 to start the turning of the swing body 3, from the turning start position, where the imaging device 30 faces the measurement start position of the target SB, to the turning end position, where the imaging device 30 faces the measurement end position of the target SB.
  • the operating device 35 is operated at time t3 to start turning of the swing body 3.
  • Each of the plurality of imaging devices 30 (30A, 30B, 30C, 30D) captures images PC of the target SB a plurality of times, at time intervals, while the swing body 3 is turning.
  • the image acquisition unit 410 sequentially acquires the plurality of images PC of the construction target SB captured by the imaging device 30 while the swing body 3 is turning. Further, the image storage unit 423 sequentially stores the plurality of images PC.
  • When the swing body 3 reaches the turning end position, the driver releases the operation of the operating device 35 and ends the turning of the swing body 3 (step S60). In the example shown in FIG. 13, the operation of the operating device 35 is released at time t4, and the turning of the swing body 3 ends.
  • the determination unit 418 determines that the position detection device 23 and the posture detection device 24 are each in a stationary state in which detection data can be output stably.
  • the imaging position calculation unit 416 acquires detection data indicating the position of the swing body 3 from the position detection device 23, and detection data indicating the posture of the swing body 3 from the posture detection device 24 (step S70).
  • the imaging position calculation unit 416 acquires the post-turning angle θrb and the post-turning position Prb.
  • the detection data of the position detection device 23 and the detection data of the attitude detection device 24 are temporarily stored in the storage device 42.
  • the image acquisition unit 410 acquires the image PCrb of the construction target SB captured by the imaging device 30 in the operation stop state after the hydraulic shovel 1 ends the operation. Further, the image storage unit 423 stores the image PCrb.
  • when ending the turning continuous shooting mode, the driver of the hydraulic shovel 1 operates (depresses) the turning continuous shooting switch 32A.
  • the turning continuous shooting switch 32A is operated at time t6.
  • a photographing end command signal generated by operating the turning continuous photographing switch 32A is output to the arithmetic processing unit 41.
  • the signal acquisition unit 411 acquires a photographing end instruction signal (step S80).
  • the imaging of the imaging device 30 ends (step S90).
  • the imaging device 30 transitions to the imaging disabled state.
  • the common part extraction unit 413 extracts the common part KS from each of the plurality of images PC stored in the image storage unit 423 (step S120).
  • the common part extraction unit 413 extracts the common part KS of two images PC from at least two images PC captured by the imaging device 30 while the swing body 3 is stopped and while it is turning. The extraction of the common part KS may be performed on all the images PC acquired from the start to the end of the turning of the swing body 3 in the turning direction RD, or may be performed on images PC selected based on a predetermined rule.
  • the common part extraction unit 413 divides the extraction target area WD of the image PC into a plurality of divided areas WD1, WD2, WD3, and WD4, and extracts at least a specified number of common parts KS from each of the divided areas WD1, WD2, WD3, and WD4.
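A sketch of this per-region extraction is shown below, using an ORB detector restricted to each divided area by a mask so that the extracted points are dispersed over the whole extraction target area. ORB and the minimum count of 5 are illustrative assumptions; the patent does not name a specific feature detector.

```python
import cv2
import numpy as np

def detect_features_per_region(image_gray, grid=(2, 2), min_per_region=5):
    """Detect keypoints separately in each divided region of an 8-bit
    grayscale image so that no region is left without features."""
    orb = cv2.ORB_create(nfeatures=200)
    h, w = image_gray.shape
    gh, gw = grid
    keypoints = []
    for i in range(gh):
        for j in range(gw):
            y0, y1 = i * h // gh, (i + 1) * h // gh
            x0, x1 = j * w // gw, (j + 1) * w // gw
            # Restrict detection to one divided region via a mask.
            mask = np.zeros_like(image_gray)
            mask[y0:y1, x0:x1] = 255
            kps = orb.detect(image_gray, mask)
            if len(kps) < min_per_region:
                print(f"region ({i}, {j}): only {len(kps)} features")
            keypoints.extend(kps)
    return keypoints
```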
  • the common part extraction unit 413 can correct the contrast distribution of the image PC using a histogram flattening algorithm.
  • the common part extraction unit 413 can extract the common part KS from the extraction target area WD of the image PC for which the contrast distribution has been corrected.
  • the imaging position calculation unit 416 calculates the estimated angle θs of the swing body 3 based on the common part KS of the plurality of images PC (step S140).
  • the imaging position calculation unit 416 can obtain the estimated angle θs by calculating the turning angle θ based on the positions of the common part KS in the plurality of images PC.
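A heavily simplified sketch of this estimation follows. It assumes a pure rotation about the vertical axis, a pinhole camera, matched common parts between two consecutive images, and a small-angle approximation in which horizontal pixel displacement is proportional to the turning angle; the horizontal field of view hfov_deg is an assumed calibration value, not a figure from the patent.

```python
import numpy as np

def estimate_turn_angle(pts_prev: np.ndarray, pts_curr: np.ndarray,
                        image_width: int, hfov_deg: float = 90.0) -> float:
    """pts_prev, pts_curr: (N, 2) arrays of matched pixel coordinates
    of the common parts KS in two successive images."""
    dx = float(np.mean(pts_curr[:, 0] - pts_prev[:, 0]))
    deg_per_pixel = hfov_deg / image_width
    # Features drift opposite to the turning direction of the camera.
    return -dx * deg_per_pixel
```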
  • the imaging position calculation unit 416 corrects the estimated angle θs during turning based on the angle θr (the pre-turning angle θra and the post-turning angle θrb). Further, the imaging position calculation unit 416 corrects the estimated position Ps of the imaging device 30 during turning based on the corrected estimated angle θs. The imaging position calculation unit 416 can correct the estimated angle θs according to the procedure described above.
  • the three-dimensional position calculation unit 417 performs stereo processing on the images PC to calculate three-dimensional positions, in the imaging device coordinate system, of a plurality of portions of the construction target SB (step S160). The three-dimensional position calculation unit 417 also converts the three-dimensional positions in the imaging device coordinate system into three-dimensional positions in the on-site coordinate system.
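The two operations in this step can be sketched as follows: triangulating 3D points in the imaging device coordinate system from a calibrated stereo pair, then applying a rigid transform into the on-site coordinate system. P_left and P_right are the 3x4 projection matrices of the stereo pair, and R_site, t_site the camera pose produced by the imaging position calculation; all names are illustrative assumptions.

```python
import cv2
import numpy as np

def stereo_to_site(pts_left, pts_right, P_left, P_right, R_site, t_site):
    """pts_left, pts_right: (N, 2) float arrays of matched pixels."""
    # Triangulate to homogeneous points in the camera coordinate system.
    pts4d = cv2.triangulatePoints(P_left, P_right, pts_left.T, pts_right.T)
    pts_cam = (pts4d[:3] / pts4d[3]).T          # (N, 3), camera frame
    return pts_cam @ R_site.T + t_site          # (N, 3), site frame
```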
  • as described above, when the position of the imaging device 30 is acquired based on the common part KS, which is a feature point common to a plurality of images PC, the extraction target area WD from which the common part KS is extracted is divided into a plurality of divided areas WD1, WD2, WD3, and WD4.
  • the common part extraction unit 413 extracts the common part KS from each of the plurality of divided areas WD1, WD2, WD3, and WD4.
  • the imaging position calculation unit 416 can calculate the estimated position Ps of the imaging device 30 with high accuracy based on the dispersed common parts KS. Therefore, a decrease in the measurement accuracy of the three-dimensional shape of the target SB is suppressed.
  • the image processing unit 412 corrects the contrast distribution of the image PC. As a result, even if a region with low contrast exists in the extraction target area WD, the contrast distribution is corrected so that the contrast of the extraction target area WD becomes high, and the common part extraction unit 413 can extract the common part KS from the extraction target area WD. Therefore, the imaging position calculation unit 416 can calculate the estimated position Ps of the imaging device 30 with high accuracy based on the extracted common part KS.
  • FIG. 14 is a schematic view for explaining an example of processing of the measurement system 50 according to the present embodiment.
  • in the above-described embodiment, that the hydraulic shovel 1 is in operation means that the traveling body 5 is stopped and the swing body 3 is turning. In the present embodiment, that the hydraulic shovel 1 is in operation may mean that the swing body 3 is stopped and the traveling body 5 is traveling, or that the swing body 3 is turning and the traveling body 5 is traveling.
  • the imaging device 30 and the imaging region FM of the imaging device 30 move in the traveling direction MD.
  • the imaging device 30 captures images of the target SB such that an overlapping area is provided between the imaging area FM1 and the imaging area FM2, and between the imaging area FM2 and the imaging area FM3.
  • the imaging device 30 captures images PC1, PC2, and PC3 of the target SB disposed in the imaging regions FM1, FM2, and FM3, respectively.
  • the common part extraction unit 413 can extract a common part KS1 between the image PC1 and the image PC2 and a common part KS2 between the image PC2 and the image PC3.
  • the imaging position calculation unit 416 calculates the movement amount ΔD1 of the imaging device 30 based on the displacement amount of the common part KS1, and the movement amount ΔD2 based on the displacement amount of the common part KS2. The imaging position calculation unit 416 acquires the position of the imaging device 30 at the time of shooting while the traveling body 5 is traveling, based on the movement amounts ΔD (ΔD1, ΔD2) of the imaging device 30. In the above-described embodiment, the imaging position calculation unit 416 calculates the turning angle θ, whereas in the present embodiment the position of the imaging device 30 includes the position in the X-axis direction, the position in the Y-axis direction, and the position in the Z-axis direction.
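A simplified sketch of recovering the movement amount ΔD from the pixel displacement of a common part is shown below. It assumes a pinhole camera, a known depth Z to the common part (available from stereo processing), and a translation parallel to the image plane; these assumptions are illustrative, not the patent's formulation.

```python
def movement_from_displacement(dx_pixels: float, depth_m: float,
                               focal_px: float) -> float:
    """Movement of the camera, in metres, implied by a feature that
    shifted dx_pixels at depth depth_m (focal length in pixels)."""
    return dx_pixels * depth_m / focal_px
```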
  • the three-dimensional position calculation unit 417 calculates the three-dimensional position of the target SB based on the position of the imaging device 30 at the time of imaging and the image PC captured by the imaging device 30.
  • in the above-described embodiment, the position of the imaging device 30 is calculated based on the common part KS while the hydraulic shovel 1 is turning, but the present invention is not limited to this embodiment.
  • the position of the imaging device 30 while the hydraulic shovel 1 is turning may be calculated based on detection data of the position detection device 23 and detection data of the posture detection device 24.
  • in the above-described embodiment, the imaging position calculation unit 416 calculates the imaging position using the turning axis Zr as a constraint condition, with the turning angle θ as the only variable. However, without using the turning axis Zr as a constraint condition, the position and orientation of the imaging device 30 may be calculated based on six variables: the position in the X-axis direction, the position in the Y-axis direction, the position in the Z-axis direction, the roll angle, the pitch angle, and the yaw angle.
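A sketch of this unconstrained six-variable alternative is a standard PnP solve, shown below: the full camera pose (three translations plus roll, pitch, and yaw) is recovered from 3D-2D correspondences. object_pts, image_pts, and camera_matrix are assumed inputs (known 3D points of the target, their pixel positions, and the camera intrinsics).

```python
import cv2
import numpy as np

def solve_six_dof(object_pts: np.ndarray, image_pts: np.ndarray,
                  camera_matrix: np.ndarray):
    """object_pts: (N, 3) float32, image_pts: (N, 2) float32, N >= 4."""
    dist = np.zeros(5)  # assume an undistorted image
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist)
    R, _ = cv2.Rodrigues(rvec)  # rotation matrix; roll/pitch/yaw derivable
    return ok, R, tvec
```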
  • the input device 32 may be provided, for example, on at least one of the right operating lever 35R and the left operating lever 35L of the operation device 35, or may be provided in a portable terminal device. In addition, the input device 32 may be provided outside the hydraulic shovel 1, and the start or end of imaging by the imaging device 30 may be remotely controlled.
  • the imaging position calculation unit 416 may calculate the turning angle based on the detection result of a detection device. For example, the imaging position calculation unit 416 may calculate the turning angle θ of the swing body 3 based on the detection data of the position detection device 23, the detection data of the posture detection device 24, or the detection data of the operation amount sensor 36, or may calculate the turning angle θ based on the detection data of an angle sensor capable of detecting the turning angle of the swing body 3, for example, a rotary encoder.
  • the arithmetic processing unit 41 synchronizes the timing at which the turning angle θ, which is the detection value of the angle detection sensor, is acquired from the angle detection sensor with the timing at which at least a pair of imaging devices 30 photograph the construction target SB. In this manner, the timing at which an image PC is captured by at least a pair of imaging devices 30 is associated with the turning angle θ of the swing body 3 at that timing.
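Where the angle sensor and the cameras are not driven by a common trigger, this association can be sketched as a nearest-timestamp lookup, as below. The data layout (parallel, time-sorted lists of timestamps and angles) is an assumption made for illustration.

```python
import bisect

def nearest_angle(image_time: float, angle_times: list, angles: list):
    """Return the turning angle whose timestamp is closest to the time
    at which the image PC was captured. angle_times must be sorted."""
    i = bisect.bisect_left(angle_times, image_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(angle_times)]
    best = min(candidates, key=lambda j: abs(angle_times[j] - image_time))
    return angles[best]
```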
  • in the above-described embodiment, the arithmetic processing unit 41 performs stereo processing on the images PC captured by at least a pair of imaging devices 30 to realize three-dimensional measurement, but the present invention is not limited to this.
  • for example, the images PC of the excavation target SB around the hydraulic shovel 1 captured by at least a pair of imaging devices 30, together with the position and posture of the hydraulic shovel 1 at rest determined by the position detection device 23 and the posture detection device 24, may be transmitted to an external management device (a portable terminal, a server device, or the like) of the hydraulic shovel 1. The external management device may then perform stereo processing on the images PC of the excavation target SB around the hydraulic shovel 1, and may determine the three-dimensional position of the excavation target SB around the hydraulic shovel 1 at the time of turning, using the turning angle θ at the time of turning of the swing body 3 and the position and posture of the hydraulic shovel 1.
  • in such a configuration, the external management device of the hydraulic shovel 1 corresponds to the arithmetic processing unit 41. That is, in this other embodiment, the measurement system 50 does not have to realize all of its functions within the hydraulic shovel 1 alone; the external management device may provide some of the functions, for example, the function of the arithmetic processing unit 41.
  • the imaging device is a stereo camera that includes at least one pair of imaging devices 30.
  • the imaging device may be capable of stereo imaging with one camera, that is, an imaging device capable of stereo processing based on two images captured at different timings by one camera.
  • the imaging device is not limited to a stereo camera.
  • the imaging device may be, for example, a sensor that can obtain both an image and three-dimensional data, such as a TOF (Time Of Flight) camera.
  • the imaging device may be an imaging device in which three-dimensional data can be obtained by one camera.
  • the imaging device may be an imaging device that can perform stereo measurement with one camera.
  • the imaging device may be a laser scanner.
  • the working machine 1 is the hydraulic shovel 1 having the swing body 3.
  • the work machine 1 may be a work machine that does not have a swing body.
  • the work machine may be at least one of a bulldozer, a wheel loader, a dump truck, and a motor grader.
  • SYMBOLS: 1: hydraulic shovel (work machine), 1B: vehicle body, 2: work implement, 3: swing body, 4: cab, 4S: driver's seat, 5: traveling body, 5A, 5B: crawler belt, 6: boom, 7: arm, 8: bucket, 9: counterweight, 10: boom cylinder, 11: arm cylinder, 12: bucket cylinder, 21: GPS antenna, 23: position detection device, 24: posture detection device, 30, 30A, 30B, 30C, 30D: imaging device, 31: hub, 32: input device, 32A: turning continuous shooting switch, 35: operating device, 35L: left operating lever, 35R: right operating lever, 36: operation amount sensor, 37: ..., 40: control device, 41: arithmetic processing device, 42: storage device, 43: input/output interface, 50: measurement system, 300: stereo camera, 301: first stereo camera, 302: second stereo camera, 410: image acquisition unit, 411: signal acquisition unit, 412: image processing unit, 413: common part extraction unit, 416: imaging position calculation unit, 417: three-dimensional position calculation unit, 418: determination unit, 423: image storage unit, FM: shooting area, KS: common part, MD: traveling direction, PC: ...


Abstract

The present invention comprises: an image acquisition unit that acquires a plurality of images of a construction target, said images having been captured during operation of a work machine by an imaging device that is mounted to a turning body; a common section extraction unit that divides an extraction target region, from which a common section of the plurality of images is extracted, into division regions, and extracts the common section from each of the division regions; an imaging location calculation unit that calculates the location and posture of the imaging device at the time at which the images were captured, on the basis of work machine location data detected by a location detection unit that detects the location of the work machine, work machine posture data detected by a posture detection unit that detects the posture of the work machine, and the common section; and a three-dimensional location calculation unit that calculates the three-dimensional location of the construction target on the basis of the location data, the posture data, and the images.

Description

Measuring system for working machine, working machine, and measuring method for working machine

The present invention relates to a measurement system of a work machine, a work machine, and a measurement method of the work machine.
In the technical field relating to work machines, work machines equipped with an imaging device, as disclosed in Patent Document 1, are known. By stereo processing a pair of images captured by a pair of imaging devices, the three-dimensional shape of the construction target around the work machine is measured.
Patent Document 1: International Publication No. 2017/033991
When continuously shooting a construction target in the vicinity of the work machine by the imaging device while turning the revolving body, the position of the imaging device may be acquired based on the photographed images. For example, when acquiring the position of an imaging device based on a feature point common to a plurality of images, it may be difficult to accurately acquire the position of the imaging device depending on the extraction state of the feature point. As a result, the measurement accuracy of the three-dimensional shape of the object may be reduced.
An aspect of the present invention aims to suppress a decrease in the measurement accuracy of the three-dimensional shape of an object.
According to an aspect of the present invention, there is provided a measurement system of a work machine comprising: an image acquisition unit that acquires a plurality of images of a construction target captured by an imaging device mounted on a swing body during operation of the work machine; a common part extraction unit that divides an extraction target area, from which a common part of the plurality of images is extracted, into divided areas and extracts the common part from each of the divided areas; an imaging position calculation unit that calculates the position and posture of the imaging device at the time the images were captured, based on position data of the work machine detected by a position detection unit that detects the position of the work machine, posture data of the work machine detected by a posture detection unit that detects the posture of the work machine, and the common part; and a three-dimensional position calculation unit that calculates a three-dimensional position of the construction target based on the position data, the posture data, and the images.
According to the aspect of the present invention, it is possible to suppress a decrease in the measurement accuracy of the three-dimensional shape of an object.
FIG. 1 is a perspective view showing an example of a work machine according to the present embodiment.
FIG. 2 is a perspective view showing a part of the work machine according to the present embodiment.
FIG. 3 is a functional block diagram showing an example of a measurement system according to the present embodiment.
FIG. 4 is a diagram schematically showing an example of the operation of the work machine according to the present embodiment.
FIG. 5 is a diagram schematically showing an example of the operation of the measurement system according to the present embodiment.
FIG. 6 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
FIG. 7 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
FIG. 8 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
FIG. 9 is a diagram schematically showing a state in which one extraction target area is set in an image.
FIG. 10 is a diagram showing an example of an extraction target area of an image according to the present embodiment.
FIG. 11 is a diagram in which a common portion is extracted in an image in which the contrast distribution of each divided region has been corrected by the histogram flattening algorithm.
FIG. 12 is a flowchart showing an example of the measurement method according to the present embodiment.
FIG. 13 is a timing chart of the measurement method according to the present embodiment.
FIG. 14 is a schematic view for explaining an example of processing of the measurement system according to the present embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings, but the present invention is not limited thereto. The components of the embodiments described below can be combined as appropriate. In addition, some components may not be used.
In the following description, a three-dimensional on-site coordinate system (Xg, Yg, Zg), a three-dimensional vehicle body coordinate system (Xm, Ym, Zm), and a three-dimensional imaging device coordinate system (Xs, Ys, Zs) are defined, and the positional relationship of each part is described with reference to them.
The on-site coordinate system is a coordinate system based on an origin fixed to the earth. The on-site coordinate system is a coordinate system defined by a GNSS (Global Navigation Satellite System). GNSS refers to a global navigation satellite system. One example of a global navigation satellite system is GPS (Global Positioning System).
The on-site coordinate system is defined by an Xg axis in the horizontal plane, a Yg axis in the horizontal plane orthogonal to the Xg axis, and a Zg axis orthogonal to the Xg axis and the Yg axis. The rotation or inclination direction about the Xg axis is the θXg direction, the rotation or inclination direction about the Yg axis is the θYg direction, and the rotation or inclination direction about the Zg axis is the θZg direction. The Zg axis direction is the vertical direction.
The vehicle body coordinate system is defined by an Xm axis in a first predetermined plane with respect to an origin defined on the vehicle body of the work machine, a Ym axis in the first predetermined plane orthogonal to the Xm axis, and a Zm axis orthogonal to the Xm axis and the Ym axis. The rotation or inclination direction about the Xm axis is the θXm direction, the rotation or inclination direction about the Ym axis is the θYm direction, and the rotation or inclination direction about the Zm axis is the θZm direction. The Xm-axis direction is the longitudinal direction of the work machine, the Ym-axis direction is the vehicle width direction of the work machine, and the Zm-axis direction is the vertical direction of the work machine.
The imaging device coordinate system is defined by an Xs axis in a second predetermined plane with respect to an origin defined in the imaging device, a Ys axis in the second predetermined plane orthogonal to the Xs axis, and a Zs axis orthogonal to the Xs axis and the Ys axis. The rotation or inclination direction about the Xs axis is the θXs direction, the rotation or inclination direction about the Ys axis is the θYs direction, and the rotation or inclination direction about the Zs axis is the θZs direction. The Xs-axis direction is the vertical direction of the imaging device, the Ys-axis direction is the width direction of the imaging device, and the Zs-axis direction is the front-rear direction of the imaging device. The Zs-axis direction is parallel to the optical axis of the optical system of the imaging device.
The position in the on-site coordinate system, the position in the vehicle body coordinate system, and the position in the imaging device coordinate system can be mutually converted.
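This mutual conversion can be sketched as a chain of rigid transforms, as below: a point in the imaging device coordinate system is mapped into the vehicle body coordinate system and then into the on-site coordinate system. The rotation matrices and translation vectors are assumed calibration and pose values, not quantities given in the patent.

```python
import numpy as np

def cam_to_site(p_cam: np.ndarray,
                R_cam2body: np.ndarray, t_cam2body: np.ndarray,
                R_body2site: np.ndarray, t_body2site: np.ndarray):
    """p_cam: (3,) point in the imaging device coordinate system."""
    p_body = R_cam2body @ p_cam + t_cam2body    # camera -> vehicle body
    return R_body2site @ p_body + t_body2site   # vehicle body -> site
```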
First embodiment.
[Working machine]
FIG. 1 is a perspective view showing an example of a working machine 1 according to the present embodiment. In the present embodiment, an example in which the work machine 1 is a hydraulic shovel will be described. In the following description, the work machine 1 is appropriately referred to as the hydraulic shovel 1.
As shown in FIG. 1, the hydraulic shovel 1 has a vehicle body 1B and a work implement 2. The vehicle body 1B includes a revolving unit 3 and a traveling unit 5 that supports the revolving unit 3 so as to be rotatable.
The swing body 3 has a cab 4. A hydraulic pump and an internal combustion engine are arranged in the revolving unit 3. The revolving unit 3 is pivotable about a pivot axis Zr. The pivot axis Zr is parallel to the Zm axis of the vehicle body coordinate system. The origin of the vehicle body coordinate system is defined, for example, at the center of the swing circle of the swing body 3. The center of the swing circle is located on the pivot axis Zr of the swing body 3.
The traveling body 5 has crawler belts 5A and 5B. The hydraulic shovel 1 travels by rotation of the crawler belts 5A and 5B. The Zm axis of the vehicle body coordinate system is orthogonal to the ground contact surface of the crawler belts 5A and 5B. The upper side (+Zm direction) of the vehicle body coordinate system is a direction away from the ground contact surface of the crawler belts 5A and 5B, and the lower side (-Zm direction) of the vehicle body coordinate system is the direction opposite to the upper side of the vehicle body coordinate system.
The work implement 2 is connected to the swing body 3. In the vehicle body coordinate system, at least a part of the work implement 2 is disposed in front of the swing body 3. The front (+Xm direction) of the vehicle body coordinate system is the direction in which the work implement 2 exists with reference to the swing body 3, and the rear (-Xm direction) of the vehicle body coordinate system is the direction opposite to the front of the vehicle body coordinate system.
The work implement 2 has a boom 6 connected to the swing body 3, an arm 7 connected to the boom 6, a bucket 8 connected to the arm 7, a boom cylinder 10 that drives the boom 6, an arm cylinder 11 that drives the arm 7, and a bucket cylinder 12 that drives the bucket 8. The boom cylinder 10, the arm cylinder 11, and the bucket cylinder 12 are each hydraulic cylinders driven by hydraulic pressure.
Further, the hydraulic shovel 1 has a position detection device 23 that detects the position of the swing body 3, a posture detection device 24 that detects the posture of the swing body 3, and a control device 40.
The position detection device 23 detects the position of the swing body 3 in the on-site coordinate system. The position detection device 23 functions as a position detection unit that detects the position of the hydraulic shovel 1. The position of the swing body 3 includes coordinates in the Xg axis direction, coordinates in the Yg axis direction, and coordinates in the Zg axis direction. The position detection device 23 includes a GPS receiver. The position detection device 23 is provided on the revolving unit 3.
A GPS antenna 21 is provided on the revolving unit 3. For example, two GPS antennas 21 are arranged in the Ym axis direction of the vehicle body coordinate system. The GPS antenna 21 receives radio waves from GPS satellites, and outputs a signal generated based on the received radio waves to the position detection device 23. The position detection device 23 detects the position of the GPS antenna 21 in the on-site coordinate system based on the signal from the GPS antenna 21.
The position detection device 23 performs arithmetic processing based on at least one of the positions of the two GPS antennas 21 to calculate the position of the revolving unit 3. The position of the revolving unit 3 may be the position of one GPS antenna 21, or a position between the position of one GPS antenna 21 and the position of the other GPS antenna 21.
The posture detection device 24 detects the posture of the swing body 3 in the on-site coordinate system. The posture detection device 24 functions as a posture detection unit that detects the posture of the hydraulic shovel 1. The posture of the swing body 3 includes a roll angle indicating the inclination angle of the swing body 3 in the rotational direction about the Xm axis, a pitch angle indicating the inclination angle of the swing body 3 in the rotational direction about the Ym axis, and an azimuth angle indicating the inclination angle of the swing body 3 in the rotational direction about the Zm axis. The posture detection device 24 includes an inertial measurement unit (IMU: Inertial Measurement Unit). The posture detection device 24 is provided on the swing body 3. A gyro sensor may be mounted on the swing body 3 as the posture detection device 24.
The posture detection device 24 detects an acceleration and an angular velocity acting on the posture detection device 24. By detecting the acceleration and angular velocity acting on the posture detection device 24, the acceleration and angular velocity acting on the revolving unit 3 are detected. The posture detection device 24 performs arithmetic processing based on the acceleration and angular velocity acting on the swing body 3 to calculate the posture of the swing body 3 including the roll angle, the pitch angle, and the azimuth angle.
The azimuth angle may be calculated based on the detection data of the position detection device 23. The position detection device 23 can calculate the azimuth angle of the swing body 3 with respect to a reference azimuth in the on-site coordinate system based on the position of one GPS antenna 21 and the position of the other GPS antenna 21. The reference azimuth is, for example, north. The position detection device 23 calculates a straight line connecting the position of one GPS antenna 21 and the position of the other GPS antenna 21, and can calculate the azimuth angle of the swing body 3 with respect to the reference azimuth based on the angle formed by the calculated straight line and the reference azimuth.
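A minimal sketch of this azimuth calculation follows, assuming +Yg points to the reference azimuth (north) and +Xg to the east, with the azimuth measured clockwise; the axis convention is an assumption, since the patent only fixes Zg as vertical.

```python
import math

def azimuth_from_antennas(p1, p2):
    """p1, p2: (Xg, Yg) site coordinates of the two GPS antennas 21.
    Returns the azimuth of the line from p1 to p2 in degrees [0, 360)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0
```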
Next, the stereo camera 300 according to the present embodiment will be described. FIG. 2 is a perspective view showing a part of the hydraulic shovel 1 according to the present embodiment. As shown in FIG. 2, the hydraulic shovel 1 has a stereo camera 300. The stereo camera 300 refers to a camera capable of measuring the distance to the construction target SB by simultaneously photographing the construction target SB from a plurality of directions to generate parallax data.
The stereo camera 300 photographs the construction target SB around the hydraulic shovel 1. The construction target SB includes a digging target to be excavated by the work implement 2 of the hydraulic shovel 1. The construction target SB may be a construction target to be constructed by a work machine other than the hydraulic shovel 1, or may be a construction target to be constructed by a worker. The construction target SB is a concept including a construction target before construction, a construction target during construction, and a construction target after construction.
The stereo camera 300 is mounted on the revolving unit 3. The stereo camera 300 is provided in the cab 4. The stereo camera 300 is disposed in the front (+Xm direction) and upper (+Zm direction) part of the cab 4. The stereo camera 300 photographs the construction target SB in front of the hydraulic shovel 1.
The stereo camera 300 has a plurality of imaging devices 30. The imaging device 30 is mounted on the revolving unit 3. The imaging device 30 has an optical system and an image sensor. The image sensor includes a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. In the present embodiment, the imaging device 30 includes four imaging devices 30A, 30B, 30C, and 30D.
The stereo camera 300 is configured by a pair of imaging devices 30. The stereo camera 300 includes a first stereo camera 301 configured by a pair of imaging devices 30A and 30B, and a second stereo camera 302 configured by a pair of imaging devices 30C and 30D.
The imaging devices 30A and 30C are disposed on the +Ym side (the work implement 2 side) of the imaging devices 30B and 30D. The imaging device 30A and the imaging device 30B are arranged at an interval in the Ym axis direction. The imaging device 30C and the imaging device 30D are arranged at an interval in the Ym axis direction. The imaging devices 30A and 30B are disposed on the +Zm side of the imaging devices 30C and 30D. In the Zm axis direction, the imaging device 30A and the imaging device 30B are disposed at substantially the same position. In the Zm axis direction, the imaging device 30C and the imaging device 30D are disposed at substantially the same position.
The imaging devices 30A and 30B face upward (+Zm direction). The imaging devices 30C and 30D face downward (-Zm direction). Further, the imaging devices 30A and 30C face forward (+Xm direction). The imaging devices 30B and 30D face slightly toward the +Ym side (the work implement 2 side) from the front. That is, the imaging devices 30A and 30C face the front of the swing body 3, and the imaging devices 30B and 30D face the imaging devices 30A and 30C. The imaging devices 30B and 30D may face the front of the swing body 3, and the imaging devices 30A and 30C may face the imaging devices 30B and 30D.
The imaging device 30 photographs the construction target SB existing in front of the revolving unit 3. The stereo image processing of the pair of images taken by the pair of imaging devices 30 is performed by the control device 40, whereby three-dimensional data indicating the three-dimensional shape of the construction target SB is calculated. The control device 40 converts three-dimensional data of the construction target SB in the imaging device coordinate system into three-dimensional data of the construction target SB in the on-site coordinate system. The three-dimensional data indicates the three-dimensional position of the construction target SB. The three-dimensional position of the construction target SB includes the three-dimensional coordinates of each of a plurality of portions of the surface of the construction target SB.
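As a sketch of the stereo processing mentioned here, the following computes a disparity map from a rectified image pair and converts it to depth with Z = f * B / d. The StereoSGBM parameters, the focal length f (pixels), and the baseline B (metres) are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    """Rectified 8-bit grayscale pair -> per-pixel depth in metres."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64,
                                    blockSize=7)
    # compute() returns fixed-point disparities scaled by 16.
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan                 # mask invalid matches
    return focal_px * baseline_m / disp
```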
An imaging device coordinate system is defined for each of the plurality of imaging devices 30. The imaging device coordinate system is a coordinate system based on an origin fixed to the imaging device 30. The Zs axis of the imaging device coordinate system coincides with the optical axis of the optical system of the imaging device 30.
In the present embodiment, two sets of stereo cameras (the first stereo camera 301 and the second stereo camera 302) are mounted on the revolving unit 3, but one set of stereo cameras may be mounted, or three or more sets of stereo cameras may be mounted.
Further, as shown in FIG. 2, the hydraulic shovel 1 has a driver's seat 4S, an input device 32, and an operating device 35. The driver's seat 4S, the input device 32, and the operating device 35 are disposed in the cab 4. The driver of the hydraulic shovel 1 sits on the driver's seat 4S.
The input device 32 is operated by the driver to start or end imaging by the imaging device 30. The input device 32 is provided near the driver's seat 4S. When the input device 32 is operated, imaging by the imaging device 30 starts or ends.
The operating device 35 is operated by the driver to drive or stop the work implement 2, to turn or stop turning the swing body 3, and to cause the traveling body 5 to travel or stop traveling. The operating device 35 includes a right operating lever 35R and a left operating lever 35L for operating the work implement 2 and the swing body 3. The operating device 35 also includes a right travel lever and a left travel lever (not shown) for operating the traveling body 5. By operating the operating device 35, driving or stopping of the work implement 2, turning or stopping of the swing body 3, and traveling or stopping of the traveling body 5 are carried out.
[Measurement system]
Next, a measurement system 50 according to the present embodiment will be described. FIG. 3 is a functional block diagram showing an example of the measurement system 50 according to the present embodiment. The measurement system 50 is provided in the hydraulic shovel 1.
The measurement system 50 includes the control device 40, the stereo camera 300 including the first stereo camera 301 and the second stereo camera 302, the position detection device 23, the posture detection device 24, an operation amount sensor 36 that detects the operation amount of the operating device 35, and the input device 32.
The control device 40 is provided on the swing body 3 of the hydraulic shovel 1. The control device 40 includes a computer system. The control device 40 has an arithmetic processing device 41 including a processor such as a CPU (Central Processing Unit), a storage device 42 including a volatile memory such as a RAM (Random Access Memory) and a non-volatile memory such as a ROM (Read Only Memory), and an input/output interface 43.
The arithmetic processing device 41 has an image acquisition unit 410, a signal acquisition unit 411, an image processing unit 412, a common part extraction unit 413, an imaging position calculation unit 416, a three-dimensional position calculation unit 417, and a determination unit 418.
The image acquisition unit 410 acquires a plurality of images PC of the construction target SB captured by the imaging device 30 mounted on the swing body 3 during the operation of the hydraulic shovel 1. The image acquisition unit 410 also acquires the image PC of the construction target SB captured by the imaging device 30 while the hydraulic shovel 1 is in the operation stop state, that is, in the state in which both traveling and turning are stopped.
That the hydraulic shovel 1 is in operation includes one or both of the swing body 3 turning and the traveling body 5 traveling. That the hydraulic shovel 1 is in the operation stop state includes the swing body 3 being stopped from turning and the traveling body 5 being stopped from traveling.
The image acquisition unit 410 acquires a plurality of images PC of the target SB captured by the imaging device 30 while the traveling body 5 is stopped and the swing body 3 is turning. Further, the image acquisition unit 410 acquires the image PC of the target SB captured by the imaging device 30 while the traveling body 5 is stopped and the swing body 3 is stopped.
The signal acquisition unit 411 acquires a command signal generated by operating the input device 32. The input device 32 is operated to start or end imaging by the imaging device 30. The command signal includes a shooting start command signal and a shooting end command signal. The arithmetic processing unit 41 outputs a control signal for causing the imaging device 30 to start imaging based on the shooting start command signal acquired by the signal acquisition unit 411. Further, the arithmetic processing unit 41 outputs a control signal for causing the imaging device 30 to end shooting based on the shooting end command signal acquired by the signal acquisition unit 411.
Note that the images PC captured in the period between the time when the signal acquisition unit 411 acquires the shooting start command signal and the time when it acquires the shooting end command signal may be stored in the storage device 42, and the images PC stored in the storage device 42 may be used for stereo processing.
The image processing unit 412 corrects the contrast distribution of the image PC acquired by the image acquisition unit 410. The image processing unit 412 corrects the contrast distribution of the image PC based on a histogram flattening algorithm (Histogram Equalization Algorithm), which is a type of histogram correction method. Histogram flattening refers to a process of converting the image PC so that a histogram indicating the relationship between the tone and the frequency of each pixel of the image PC is uniformly distributed over the entire range of the tone. The image processing unit 412 performs image processing based on the histogram flattening algorithm to generate an image PC with improved contrast. The image processing unit 412 corrects the luminance of each pixel of the image PC to improve the contrast of the image PC.
The common part extraction unit 413 extracts a common part KS of the plurality of images PC captured by the imaging device 30 during turning of the swing body 3 of the hydraulic shovel 1. The common part KS will be described later.
The imaging position calculation unit 416 calculates the position P and posture of the imaging device 30 at the time of shooting. When the swing body 3 is turning, the position P of the imaging device 30 at the time of shooting includes the position of the imaging device 30 in the turning direction RD. When the traveling body 5 is traveling, the position P of the imaging device 30 at the time of shooting includes the position of the imaging device 30 in the traveling direction MD. The imaging position calculation unit 416 also calculates the turning angle θ of the swing body 3. The imaging position calculation unit 416 calculates the position P and posture of the imaging device 30 at the time when the image PC was captured, based on the position data of the hydraulic shovel 1 detected by the position detection device 23, the posture data of the hydraulic shovel 1 detected by the posture detection device 24, and the common part KS extracted by the common part extraction unit 413.
The three-dimensional position calculation unit 417 performs stereo processing on a pair of images PC captured by the pair of imaging devices 30 to calculate the three-dimensional position of the construction target SB in the imaging device coordinate system. The three-dimensional position calculation unit 417 converts the three-dimensional position of the construction target SB in the imaging device coordinate system into a three-dimensional position in the on-site coordinate system, based on the position P and posture of the imaging device 30 calculated by the imaging position calculation unit 416.
The storage device 42 has an image storage unit 423. The image storage unit 423 sequentially stores a plurality of images PC captured by the imaging device 30.
The input/output interface 43 includes an interface circuit that connects the arithmetic processing unit 41 and the storage device 42 to external devices. The hub 31, the position detection device 23, the posture detection device 24, the operation amount sensor 36, and the input device 32 are connected to the input/output interface 43.
The plurality of imaging devices 30 (30A, 30B, 30C, 30D) are connected to the arithmetic processing unit 41 via the hub 31. The imaging device 30 captures an image PC of the construction target SB based on the shooting start command signal from the signal acquisition unit 411. The image PC of the construction target SB captured by the imaging device 30 is input to the arithmetic processing unit 41 and the storage device 42 through the hub 31 and the input/output interface 43. Each of the image acquisition unit 410 and the image storage unit 423 acquires the image PC of the construction target SB captured by the imaging device 30 via the hub 31 and the input/output interface 43. The hub 31 may be omitted.
The input device 32 is operated to start or end imaging by the imaging device 30. By operating the input device 32, a shooting start command signal or a shooting end command signal is generated. As the input device 32, at least one of an operation switch, an operation button, a touch panel, voice input, and a keyboard is exemplified.
[Turning terrain measurement]
Next, an example of the operation of the hydraulic shovel 1 according to the present embodiment will be described. FIG. 4 is a diagram schematically showing an example of the operation of the hydraulic shovel 1 according to the present embodiment. The measurement system 50 continuously captures images PC of the construction target SB around the hydraulic shovel 1 with the imaging device 30 while the swing body 3 is turning. The imaging device 30 sequentially captures images PC of the construction target SB at a predetermined cycle while the swing body 3 is turning.
 撮像装置30は、旋回体3に搭載されている。旋回体3が旋回することにより、撮像装置30の撮影領域FMは、旋回方向RDに移動する。旋回体3が旋回中において撮像装置30が施工対象SBの画像PCを連続撮影することによって、撮像装置30は、施工対象SBの複数の領域のそれぞれの画像PCを取得することができる。三次元位置算出部417は、一対の撮像装置30で撮影された一対の画像PCをステレオ処理することにより、油圧ショベル1の周辺の施工対象SBの三次元位置を算出することができる。 The imaging device 30 is mounted on the revolving unit 3. When the swing body 3 swings, the imaging region FM of the imaging device 30 moves in the swing direction RD. The imaging device 30 can obtain the images PC of the plurality of regions of the construction object SB by the imaging device 30 continuously photographing the images PC of the construction object SB while the revolving structure 3 is turning. The three-dimensional position calculation unit 417 can calculate the three-dimensional position of the construction target SB around the hydraulic shovel 1 by performing stereo processing on the pair of images PC captured by the pair of imaging devices 30.
 The three-dimensional position of the construction target SB calculated by stereo processing is defined in the imaging device coordinate system. The three-dimensional position calculation unit 417 converts the three-dimensional position in the imaging device coordinate system into a three-dimensional position in the site coordinate system. This conversion requires the position and posture of the swing body 3 in the site coordinate system, which can be detected by the position detection device 23 and the posture detection device 24.
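 As an illustration of this coordinate conversion, the sketch below maps a point measured in the imaging device coordinate system into the site coordinate system using the detected position and posture of the swing body 3. This is a minimal sketch, not the patent's implementation: the yaw-pitch-roll rotation order, the camera mounting parameters R_cam_to_body and t_cam_in_body, and all function names are assumptions.

```python
import numpy as np

def rot_x(a):  # roll about the X axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):  # pitch about the Y axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):  # yaw about the Z axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def camera_to_site(p_cam, body_pos, roll, pitch, yaw, R_cam_to_body, t_cam_in_body):
    """Map a 3-D point from the imaging device frame to the site frame."""
    p_body = R_cam_to_body @ p_cam + t_cam_in_body   # camera -> vehicle body frame
    R_body_to_site = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
    return R_body_to_site @ p_body + body_pos        # body frame -> site frame
```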
 While the hydraulic excavator 1 is turning, the position detection device 23 and the posture detection device 24 mounted on it are both displaced. The detection data output by the position detection device 23 and the posture detection device 24 while they are moving may be unstable, or their detection accuracy may decrease.
 The position detection device 23 outputs detection data at a predetermined cycle. Therefore, when position detection by the position detection device 23 and imaging by the imaging device 30 are performed in parallel while the hydraulic excavator 1 is turning, the timing at which the imaging device 30 captures an image may not be synchronized with the timing at which the moving position detection device 23 detects a position. If the three-dimensional position of the construction target SB is coordinate-converted based on detection data acquired at a timing different from the imaging timing, the measurement accuracy of the three-dimensional position may decrease.
 In the present embodiment, the measurement system 50 calculates, with high accuracy, the position and posture of the swing body 3 at the time each image PC was captured by the imaging device 30 while the swing body 3 was turning, based on the method described later. The measurement system 50 can thereby calculate the three-dimensional position of the construction target SB in the site coordinate system with high accuracy.
 The imaging position calculation unit 416 acquires the detection data of the position detection device 23 and the posture detection device 24 detected while the hydraulic excavator 1 is stopped. Detection data detected while the hydraulic excavator 1 is stopped is highly likely to be stable. The imaging position calculation unit 416 acquires detection data detected in the stopped states before the swing body 3 starts turning and after it finishes turning, and the imaging device 30 captures images PC of the construction target SB in these stopped states. Based on the detection data of the position detection device 23 and the posture detection device 24 acquired while the hydraulic excavator 1 is stopped, the imaging position calculation unit 416 converts the three-dimensional position of the construction target SB, calculated in the imaging device coordinate system from an image PC captured while the hydraulic excavator 1 was stopped, into a three-dimensional position in the site coordinate system.
 On the other hand, when the imaging device 30 captures an image PC of the construction target SB while the swing body 3 is turning, the imaging position calculation unit 416 calculates the position and posture of the swing body 3 at the time the image PC was captured, based on the method described later. Based on the calculated position and posture of the swing body 3, the imaging position calculation unit 416 converts the three-dimensional position in the imaging device coordinate system calculated from the captured image PC into a three-dimensional position in the site coordinate system.
 FIG. 5 schematically shows an example of the operation of the measurement system 50 according to the present embodiment. FIG. 5 is a schematic diagram illustrating the imaging device 30 capturing images of the construction target SB while the swing body 3 is turning.
 In the following description, the traveling body 5 is assumed to be stationary. As shown in FIG. 5, when the swing body 3 turns, the imaging device 30 mounted on the swing body 3 and its imaging region FM move in the turning direction RD. The imaging region FM is defined by the field of view of the optical system of the imaging device 30. The imaging device 30 acquires images PC of the construction target SB located in the imaging region FM. As the swing body 3 turns, the imaging region FM moves in the turning direction RD, and the imaging device 30 captures images PC of the construction target SB that successively enters the moving imaging region FM.
 FIG. 5 shows an example in which, as the swing body 3 turns, the imaging region FM moves in the turning direction RD in the order of imaging region FM1, imaging region FM2, and imaging region FM3. The imaging region FM1 is defined at a first position PJ1 in the turning direction RD, the imaging region FM2 at a second position PJ2, and the imaging region FM3 at a third position PJ3. The second position PJ2 is reached by turning from the first position PJ1 through a turning angle Δθ1, and the third position PJ3 by turning from the second position PJ2 through a turning angle Δθ2. The imaging device 30 captures an image PC1 of the construction target SB in the imaging region FM1, an image PC2 of the construction target SB in the imaging region FM2, and an image PC3 of the construction target SB in the imaging region FM3. The images PC1, PC2, and PC3 are captured by the same imaging device 30 (the imaging device 30C in the example shown in FIG. 5).
 The imaging device 30 captures images at timings such that adjacent imaging regions FM have an overlapping region OB. FIG. 5 shows an example in which an overlapping region OB1 is provided between the imaging regions FM1 and FM2, and an overlapping region OB2 is provided between the imaging regions FM2 and FM3. The overlapping region OB1 is where the image PC1 and a part of the image PC2 overlap; the overlapping region OB2 is where the image PC2 and a part of the image PC3 overlap.
 A common part KS of the images PC exists in each overlapping region OB. The common part KS1 in the overlapping region OB1 is common to the images PC1 and PC2; the common part KS2 in the overlapping region OB2 is common to the images PC2 and PC3. The common part extraction unit 413 extracts the common parts KS of the plurality of two-dimensional images PC captured by the imaging device 30.
 FIG. 6 is a schematic diagram illustrating an example of the processing of the measurement system 50 according to the present embodiment. FIG. 6 shows an example of images PC (PC1, PC2) captured while the swing body 3 is turning. The common part extraction unit 413 extracts the common part KS from the images PC.
 As shown in FIG. 6, the common part extraction unit 413 extracts the common part KS1 of the images PC1 and PC2 from the image PC1 of the construction target SB in the imaging region FM1 and the image PC2 of the construction target SB in the imaging region FM2. The imaging position calculation unit 416 calculates an estimated angle θs of the swing body 3 (Δθ1 in FIG. 5) based on the common part KS1 extracted by the common part extraction unit 413. Based on the estimated angle θs, the imaging position calculation unit 416 also calculates an estimated position Ps of the imaging device 30 at the time of imaging (PJ2 in FIG. 5).
 The common part KS is a feature point in the images PC. The common part extraction unit 413 extracts the common part KS based on a known feature point detection algorithm such as ORB (Oriented FAST and Rotated BRIEF) or Harris corner detection. The common part extraction unit 413 extracts a plurality of feature points from each of the plurality of images PC and extracts the common part KS by searching the extracted feature points for similar ones. The common part extraction unit 413 may extract a plurality of common parts KS. For example, the common part extraction unit 413 extracts corners of the construction target SB in the images PC as feature points.
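 A minimal sketch of such an extraction, assuming OpenCV's ORB implementation (the function name and parameter values are illustrative, not the patent's own code):

```python
import cv2

def extract_common_parts(img1, img2, n_features=500):
    """Return matched pixel pairs (common parts KS) between two overlapping images."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    # Hamming-distance brute-force matching with cross-checking for binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    # Each match pairs a pixel position in PC1 (e.g. PX1) with one in PC2 (PX2).
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches]
```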
 As the swing body 3 turns, the imaging region FM of the imaging device 30 moves in the turning direction RD, and the common part KS is displaced within the images PC. In the example shown in FIG. 6, the common part KS is located at pixel position PX1 in the image PC1 and at pixel position PX2 in the image PC2; its position differs between the two images. In other words, the common part KS1 is displaced between the images PC1 and PC2. The imaging position calculation unit 416 can calculate the turning angle Δθ1 from the position PJ1 to the position PJ2 based on the positions of the common part KS in the plurality of images PC.
 FIG. 7 is a schematic diagram illustrating an example of the processing of the measurement system 50 according to the present embodiment. FIG. 7 illustrates how the imaging device coordinate system (Xs, Ys, Zs) moves as the swing body 3 turns. As shown in FIG. 7, the swing body 3 turns about the swing axis Zr in the Xm-Ym plane of the vehicle body coordinate system. When the swing body 3 turns about the swing axis Zr through a turning angle Δθ, the imaging device coordinate system (Xs, Ys, Zs) also moves about the swing axis Zr through the turning angle Δθ. Using the constraint that the swing body 3 turns about the swing axis Zr, the imaging position calculation unit 416 can calculate the estimated angle θs of the swing body 3 and the estimated position Ps of the imaging device 30 after turning from the pre-turning angle θra through the turning angle Δθ.
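 Under this constraint the unknown reduces to a single turning increment Δθ, so it can be found as a one-variable fit of predicted to observed feature displacement. The sketch below assumes a caller-supplied projection model predict_px; the patent does not give a formulation, so everything here is an illustrative assumption:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def estimate_dtheta(pairs, predict_px):
    """pairs: [(px_before, px_after), ...] for each common part KS.
    predict_px(px_before, dtheta): pixel position predicted after rotating
    the camera about the swing axis Zr by dtheta (hypothetical model)."""
    def cost(dtheta):
        return sum(np.sum((np.asarray(predict_px(p0, dtheta)) - np.asarray(p1)) ** 2)
                   for p0, p1 in pairs)
    res = minimize_scalar(cost, bounds=(-np.pi / 4, np.pi / 4), method="bounded")
    return res.x  # turning increment between the two shots
```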
[Calculation of estimated angle]
 FIG. 8 is a schematic diagram illustrating an example of the processing of the measurement system 50 according to the present embodiment. FIG. 8 shows an example of calculating the turning angle θ of the swing body 3 and the position P of the imaging device 30, both in the vehicle body coordinate system.
 In the example shown in FIG. 8, the swing body 3 turns in the turning direction RD through the pre-turning angle θra, a first estimated angle θs1, a second estimated angle θs2, a third estimated angle θs3, and the post-turning angle θrb, in that order. As the swing body 3 turns, the imaging device 30 moves through the pre-turning position Pra, a first estimated position Ps1, a second estimated position Ps2, a third estimated position Ps3, and the post-turning position Prb, in that order. The imaging region FM of the imaging device 30 accordingly moves through the imaging regions FMra, FMs1, FMs2, FMs3, and FMrb, in that order.
 At the pre-turning angle θra and the post-turning angle θrb, the swing body 3 is stationary. Starting from the stationary state at the pre-turning angle θra, the swing body 3 turns to the post-turning angle θrb via the first estimated angle θs1, the second estimated angle θs2, and the third estimated angle θs3.
 At the pre-turning position Pra, the imaging device 30 acquires an image PCra of the construction target SB in the imaging region FMra. At the first estimated position Ps1, it acquires an image PCs1 of the construction target SB in the imaging region FMs1; at the second estimated position Ps2, an image PCs2 in the imaging region FMs2; at the third estimated position Ps3, an image PCs3 in the imaging region FMs3; and at the post-turning position Prb, an image PCrb in the imaging region FMrb.
 As described above, the imaging position calculation unit 416 calculates the pre-turning angle θra based on the position and posture of the swing body 3 detected by the position detection device 23 and the posture detection device 24 while the swing body 3 is stationary before the turn begins. Likewise, it calculates the post-turning angle θrb based on the position and posture of the swing body 3 detected while the swing body 3 is stationary after the turn ends. The imaging position calculation unit 416 also calculates the pre-turning position Pra from the pre-turning angle θra and the post-turning position Prb from the post-turning angle θrb.
 The pre-turning angle θra, the post-turning angle θrb, the pre-turning position Pra, and the post-turning position Prb are calculated with high accuracy based on the detection data of the position detection device 23 and the posture detection device 24.
 As described above, the imaging position calculation unit 416 calculates the estimated angles θs (the first estimated angle θs1, the second estimated angle θs2, and the third estimated angle θs3) based on the common parts KS of the plurality of images PC captured while the swing body 3 is turning. It also calculates the estimated positions Ps (the first estimated position Ps1, the second estimated position Ps2, and the third estimated position Ps3) from the estimated angles θs.
 The common part extraction unit 413 extracts the common part KS1 of the images PCra and PCs1. Using the constraint that the swing body 3 turns about the swing axis Zr, the imaging position calculation unit 416 calculates the turning angle Δθ1 based on the pre-turning angle θra of the swing body 3 and the common part KS1 of the images PCra and PCs1. The first estimated angle θs1 is the sum of the pre-turning angle θra and the turning angle Δθ1 (θs1 = θra + Δθ1).
 Similarly, the turning angles Δθ2, Δθ3, and Δθ4 can be calculated from the corresponding common parts KS, and the imaging position calculation unit 416 can calculate the second estimated angle θs2 (θs2 = θra + Δθ1 + Δθ2), the third estimated angle θs3 (θs3 = θra + Δθ1 + Δθ2 + Δθ3), and the fourth estimated angle θs4 (θs4 = θra + Δθ1 + Δθ2 + Δθ3 + Δθ4).
 In this way, the imaging position calculation unit 416 can calculate, step by step, the estimated angles θs (θs1, θs2, θs3, θs4) of the swing body 3 at the times the imaging device 30 captured images while the swing body 3 was turning. With the estimated angles θs calculated, the imaging position calculation unit 416 can also calculate the estimated positions Ps (Ps1, Ps2, Ps3, Ps4) of the imaging device 30 while the swing body 3 is turning, based on the estimated angles θs.
 The imaging position calculation unit 416 also acquires, from the imaging device 30, time data indicating when each of the plurality of images PC (PCra, PCs1, PCs2, PCs3, PCs4, PCrb) was captured. The imaging position calculation unit 416 can calculate the turning speed V based on the capture times of the images PC and the turning angles Δθ (Δθ1, Δθ2, Δθ3, Δθ4). For example, the turning speed V when the swing body 3 turns from the estimated angle θs1 to the estimated angle θs2 can be calculated from the time between the capture of the image PCs1 and the capture of the image PCs2 and the angular travel from the estimated angle θs1 to the estimated angle θs2. The imaging position calculation unit 416 can also determine the turning direction RD based on the capture times of the images PC and the turning angles Δθ.
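 The bookkeeping just described amounts to accumulating the increments onto the pre-turning angle and differencing against the timestamps; a minimal sketch, with illustrative variable names:

```python
def estimated_angles(theta_ra, increments):
    """increments = [d1, d2, d3, d4] -> [theta_s1, theta_s2, theta_s3, theta_s4]."""
    angles, theta = [], theta_ra
    for d in increments:
        theta += d   # theta_s1 = theta_ra + d1, theta_s2 = theta_s1 + d2, ...
        angles.append(theta)
    return angles

def turning_speed(theta_prev, theta_next, t_prev, t_next):
    """Turning speed V between two shots, from angles and capture times."""
    return (theta_next - theta_prev) / (t_next - t_prev)
```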
 In this way, in the present embodiment, the imaging position calculation unit 416 can calculate the turning angle θ based on the common parts KS of the plurality of images PC.
[Correction of estimated angle]
 The post-turning angle θrb is calculated with high accuracy based on the detection data of the position detection device 23 and the posture detection device 24, whereas the estimated angle θs4 is calculated from the displacement of the common parts KS. If the estimated angle θs4 and the estimated position Ps4 calculated using the common parts KS are accurate, the difference between the estimated angle θs4 and the post-turning angle θrb is small, as is the difference between the estimated position Ps4 and the post-turning position Prb.
 On the other hand, the errors of the estimated angle θs4 and the estimated position Ps4 calculated using the common parts KS may grow, for example due to accumulated errors in the displacements of the common parts KS. When these errors are large, the difference between the estimated angle θs4 and the post-turning angle θrb becomes large, as does the difference between the estimated position Ps4 and the post-turning position Prb.
 When there is a difference between the estimated angle θs4 and the post-turning angle θrb, the imaging position calculation unit 416 corrects the estimated angles θs during the turn (the first estimated angle θs1, the second estimated angle θs2, the third estimated angle θs3, and the fourth estimated angle θs4) based on the angles θr in the stationary states, namely the pre-turning angle θra and the post-turning angle θrb. The imaging position calculation unit 416 then corrects the estimated positions Ps based on the corrected estimated angles θs.
 Based on the corrected estimated angle θs1', the imaging position calculation unit 416 can accurately calculate the estimated position Ps1 of the imaging device 30 at the time the image PCs1 was captured. Similarly, it can accurately calculate the estimated position Ps2 at the time the image PCs2 was captured based on the corrected estimated angle θs2', and the estimated position Ps3 at the time the image PCs3 was captured based on the corrected estimated angle θs3'. The three-dimensional position calculation unit 417 can therefore calculate the three-dimensional position of the construction target SB in the site coordinate system with high accuracy, based on the accurately calculated estimated positions Ps of the imaging device 30 at the times of imaging and the images PCs captured by the imaging device 30.
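 The patent does not spell out the correction formula. One plausible scheme consistent with the text, sketched below as an assumption, is to spread the closure error between the accumulated estimate and the post-turning angle θrb linearly over the intermediate estimates:

```python
def correct_estimates(theta_ra, theta_rb, estimates):
    """estimates: [theta_s1, ..., theta_s4] accumulated from common parts KS."""
    closure_error = theta_rb - estimates[-1]   # mismatch at the turn end
    total_turn = theta_rb - theta_ra           # assumed non-zero (the body turned)
    # Distribute the error in proportion to how far each estimate lies into the turn.
    return [th + closure_error * (th - theta_ra) / total_turn for th in estimates]
```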
[Division of extraction target region and histogram equalization]
 Next, division of the extraction target region WD will be described. FIG. 6 illustrated an example in which the common part extraction unit 413 extracts the common part KS from the entire image PC. Alternatively, the common part extraction unit 413 may set an extraction target region WD for extracting the common part KS in a part of the image PC and extract the common part KS from that region. The following description assumes that the extraction target region WD is set in the central portion of the image PC. The common part extraction unit 413 can divide the extraction target region WD set in the central portion of the image PC into a plurality of divided regions and extract the common part KS from each of them.
 FIG. 9 schematically shows a state in which a single extraction target region WD is set in the image PC. FIG. 9 shows, for example, an image PC captured by the downward-facing imaging device 30C. When feature points (common parts KS) are extracted by a feature point detection algorithm such as the above-mentioned ORB, the common parts KS may concentrate in a specific area of the extraction target region WD instead of being dispersed, as shown in FIG. 9, depending on the shape of the construction target SB or the imaging conditions of the image PC. One cause of this concentration is that, with some feature point detection algorithms, common parts KS are extracted in descending order of feature strength (feature amount). Therefore, when the image PC contains a high-contrast portion, for example when the construction target SB contains artificial straight edges, corners, or arcs, the common parts KS concentrate in that portion. For example, as shown in FIG. 9, when the construction target SB has been excavated by the bucket of the hydraulic excavator 1 and a hole HL has been formed, the inside of the hole HL appears dark in the image PC while its surroundings appear bright. The edge of the hole HL, the boundary between the dark and bright portions, is a high-contrast portion of the image PC, so the common parts KS are highly likely to be extracted concentrated along it. Conversely, the dark portion inside the hole HL has low contrast, so common parts KS are difficult to extract there.
 When the common parts KS, which are feature points, concentrate in a specific area of the image PC, the accuracy of the estimated angle θs and the estimated position Ps calculated from them may decrease. Conversely, when the common parts KS are dispersed within the image PC or the extraction target region WD, the accuracy of the estimated angle θs and the estimated position Ps calculated from them improves.
 Therefore, the common part extraction unit 413 divides the extraction target region WD of the image PC into a plurality of divided regions and extracts common parts KS from each of them, so that the common parts KS extracted from the image PC are dispersed. For example, when 100 common parts KS are to be extracted in descending order of feature amount based on the feature point detection algorithm, all 100 would concentrate along the edge of the hole HL, as shown in FIG. 9, if the extraction target region WD were not divided. If the extraction target region WD is divided into four, however, the 25 common parts KS with the highest feature amounts are extracted in each of the four divided regions, and the common parts KS extracted from the image PC are dispersed. The imaging position calculation unit 416 can then calculate the estimated angle θs and the estimated position Ps with high accuracy based on the dispersed common parts KS.
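 A minimal sketch of this per-region extraction, assuming OpenCV's ORB; the grid size and counts are the example values above, not fixed by the patent:

```python
import cv2

def spread_common_parts(img, region, nx=2, ny=2, per_cell=25):
    """region: (x, y, w, h) of the extraction target region WD.
    Keep the per_cell strongest keypoints in each of the nx*ny divided regions."""
    x, y, w, h = region
    orb = cv2.ORB_create(nfeatures=4 * nx * ny * per_cell)
    keypoints = orb.detect(img[y:y + h, x:x + w], None)  # coordinates relative to WD
    cells = {}
    for kp in keypoints:
        cx = min(int(kp.pt[0] * nx / w), nx - 1)
        cy = min(int(kp.pt[1] * ny / h), ny - 1)
        cells.setdefault((cx, cy), []).append(kp)
    kept = []
    for cell in cells.values():
        kept += sorted(cell, key=lambda k: -k.response)[:per_cell]  # strongest first
    return kept
```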
 FIG. 10 shows an example of the extraction target region WD of the image PC according to the present embodiment. As shown in FIG. 10, the common part extraction unit 413 divides the extraction target region WD of the image PC into a plurality of divided regions WD1, WD2, WD3, and WD4 and extracts common parts KS of the plurality of images PC from each of them. The imaging position calculation unit 416 acquires the estimated angle θs and the estimated position Ps at the time of imaging based on the common parts KS1, KS2, KS3, and KS4 extracted from the divided regions WD1, WD2, WD3, and WD4.
 The common part extraction unit 413 extracts at least one common part KS from each of the divided regions WD1, WD2, WD3, and WD4, and extracts at least a prescribed number of common parts KS from each. FIG. 10 shows an example in which at least six common parts KS (KS1, KS2, KS3, KS4) are extracted from each of the divided regions WD1, WD2, WD3, and WD4. The common parts KS extracted from the image PC are thereby dispersed.
 In the present embodiment, the extraction target region WD is rectangular. It is divided at its center in the turning direction RD by a first dividing line, and at its center in the vertical direction orthogonal to the turning direction RD by a second dividing line. The divided regions WD1, WD2, WD3, and WD4 have the same shape and area.
 The shapes of the extraction target region WD and the divided regions WD1, WD2, WD3, and WD4 can be determined arbitrarily. The extraction target region WD need not be rectangular; it may be square or circular. The shapes and areas of the divided regions WD1, WD2, WD3, and WD4 may be the same or different, and the number of divided regions is not limited to four but may be any plural number.
 As described above, a dark portion such as the inside of the hole HL has low contrast, so common parts KS are difficult to extract there. Therefore, when the extraction target region WD is divided into a plurality of divided regions and a low-contrast portion of the construction target SB falls within a divided region, it is difficult to extract a common part KS from the dark portion of that divided region.
 Therefore, the image processing unit 412 corrects the contrast distribution of the image PC, for example based on a histogram equalization algorithm. The common part extraction unit 413 extracts common parts KS of the plurality of images PC from the extraction target region WD of the image PC whose contrast distribution has been corrected. In this way, even if a low-contrast portion is present in a divided region, the contrast distribution is corrected so that the contrast of that divided region increases, and the common part extraction unit 413 can extract common parts KS from it.
 FIG. 11 shows the common parts KS extracted from an image PC in which the contrast distribution of each divided region has been corrected by the histogram equalization algorithm. Histogram equalization is a process of converting the image PC so that the histogram relating the gradation of each pixel to its frequency is distributed uniformly over the entire gradation range.
 In FIG. 11, the image processing unit 412 corrects the brightness of each pixel in the divided regions to improve the contrast of the image PC, so that the common part extraction unit 413 can extract common parts KS within the hole HL in each of the divided regions WD1, WD2, WD3, and WD4.
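 A minimal sketch of the contrast correction, assuming OpenCV and an 8-bit grayscale image; per-region equalization as in FIG. 11 (a tile-wise variant such as CLAHE would behave similarly):

```python
import cv2

def equalize_divided_regions(gray, regions):
    """gray: uint8 grayscale image PC; regions: list of (x, y, w, h) divided regions."""
    out = gray.copy()
    for x, y, w, h in regions:
        # Equalize each divided region independently so dark regions gain contrast.
        out[y:y + h, x:x + w] = cv2.equalizeHist(out[y:y + h, x:x + w])
    return out
```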
 As shown in FIG. 9, the extraction target region WD is defined in the central portion of the image PC, and a frame-shaped non-target region WE is defined around it.
 Because the extraction target region WD is defined in the central portion of the image PC in the turning direction RD, when the swing body 3 turns right or left, a common part KS extracted in the extraction target region WD moves into the non-target region WE to the left or right of the extraction target region WD. That is, when the swing body 3 turns, a common part KS extracted in the extraction target region WD does not immediately leave the image PC but moves into the non-target region WE, so common parts KS can be extracted continuously and the turning angle θ can be estimated accurately.
 The extraction target region WD is also defined in the central portion of the image PC in the vertical direction so that the traveling body 5 lies outside it, in other words so that the image of the traveling body 5 does not enter the extraction target region WD. This suppresses the extraction of feature points on the traveling body 5, allowing the common part extraction unit 413 to extract more feature points on the construction target SB.
 The image processing unit 412 may correct the contrast distribution for each of the divided regions WD1, WD2, WD3, and WD4, for the entire undivided extraction target region WD, or for the whole image PC.
 The extraction target region WD may be set at a position offset from the central portion of the image PC. Alternatively, the entire image PC may be set as the extraction target region WD without providing the non-target region WE.
[Input device]
 In the present embodiment, the input device 32 has a turning continuous shooting switch 32A capable of generating an imaging start command signal that commands the start of the turning continuous shooting mode and an imaging end command signal that commands its end.
[Measurement method]
 FIG. 12 is a flowchart showing an example of the measurement method according to the present embodiment. FIG. 13 is a timing chart of the measurement method according to the present embodiment.
 The operator of the hydraulic excavator 1 operates the operation device 35 to turn the swing body 3 so that the imaging device 30 faces the measurement start position of the construction target SB. When a predetermined time has elapsed from time t0, at which the operation of the hydraulic excavator 1 including the turning of the swing body 3 stopped, and time t1 is reached, the determination unit 418 determines that the position detection device 23 and the posture detection device 24 are each in a settled state in which they can stably output detection data.
 When the imaging start command signal is acquired while the hydraulic excavator 1 is stopped and the position detection device 23 and the posture detection device 24 are in the settled state, the imaging position calculation unit 416 acquires detection data indicating the position of the swing body 3 from the position detection device 23 and detection data indicating the posture of the swing body 3 from the posture detection device 24 (step S10).
 The imaging position calculation unit 416 acquires the pre-turning angle θra and the pre-turning position Pra.
 The detection data of the position detection device 23 and the posture detection device 24 are temporarily stored in the storage device 42.
 To start the turning continuous shooting mode, the operator of the hydraulic excavator 1 operates (presses) the turning continuous shooting switch 32A. In the example shown in FIG. 13, the turning continuous shooting switch 32A is operated at time t2. The imaging start command signal generated by operating the turning continuous shooting switch 32A is output to the arithmetic processing unit 41, and the signal acquisition unit 411 acquires it (step S20).
 When the turning continuous shooting switch 32A is operated and the imaging start command signal is acquired, the arithmetic processing unit 41 starts imaging with the imaging device 30 (step S30). The image acquisition unit 410 acquires the image PCra of the construction target SB captured by the imaging device 30 while the hydraulic excavator 1 is stopped before the operation begins, and the image storage unit 423 stores the image PCra.
 The operator operates the operation device 35 to start turning the swing body 3 with the traveling body 5 stationary (step S40). The operator operates the operation device 35 to start the turn so that the swing body 3 turns from the turning start position, where the imaging device 30 faces the measurement start position of the construction target SB, to the turning end position, where the imaging device 30 faces the measurement end position of the construction target SB.
 In the example shown in FIG. 13, the operation device 35 is operated at time t3 and the swing body 3 starts turning. Each of the plurality of imaging devices 30 (30A, 30B, 30C, 30D) captures images PCs of the construction target SB a plurality of times at time intervals while the swing body 3 is turning.
 The image acquisition unit 410 sequentially acquires the plurality of images PCs of the construction target SB captured by the imaging device 30 while the swing body 3 is turning, and the image storage unit 423 sequentially stores them.
 When the swing body 3 reaches the turning end position, the operator releases the operation device 35 and the turn of the swing body 3 ends (step S60). In the example shown in FIG. 13, the operation of the operation device 35 is released at time t4 and the turn of the swing body 3 ends.
 When a predetermined time has elapsed from time t4, at which the turning of the swing body 3 stopped, and time t5 is reached, the determination unit 418 determines that the position detection device 23 and the posture detection device 24 are each in the settled state in which they can stably output detection data.
 With the hydraulic excavator 1 stopped and the position detection device 23 and the posture detection device 24 in the settled state, the imaging position calculation unit 416 acquires detection data indicating the position of the swing body 3 from the position detection device 23 and detection data indicating the posture of the swing body 3 from the posture detection device 24 (step S70).
 The imaging position calculation unit 416 acquires the post-turning angle θrb and the post-turning position Prb.
 The detection data of the position detection device 23 and the posture detection device 24 are temporarily stored in the storage device 42.
 The image acquisition unit 410 also acquires the image PCrb of the construction target SB captured by the imaging device 30 while the hydraulic excavator 1 is stopped after the operation ends, and the image storage unit 423 stores the image PCrb.
 To end the turning continuous shooting mode, the operator of the hydraulic excavator 1 operates (presses) the turning continuous shooting switch 32A. In the example shown in FIG. 13, the turning continuous shooting switch 32A is operated at time t6. The imaging end command signal generated by operating the turning continuous shooting switch 32A is output to the arithmetic processing unit 41, and the signal acquisition unit 411 acquires it (step S80).
 When the imaging end command signal is acquired, imaging by the imaging device 30 ends (step S90), and the imaging device 30 transitions to a state in which it cannot capture images.
 The common part extraction unit 413 extracts the common parts KS from each of the plurality of images PC stored in the image storage unit 423 (step S120).
 The common part extraction unit 413 extracts the common part KS of at least two images PC captured by the imaging device 30 while the swing body 3 was stationary and while it was turning. The extraction of the common parts KS may be performed on all the images PC acquired in the turning direction RD from the start to the end of the turn of the swing body 3, or on images PC selected according to a predetermined rule.
 As described with reference to FIG. 10, the common part extraction unit 413 divides the extraction target region WD of each image PC into the divided regions WD1, WD2, WD3, and WD4 and extracts at least the prescribed number of common parts KS from each.
 As described with reference to FIG. 11, the common part extraction unit 413 can correct the contrast distribution of the images PC using the histogram equalization algorithm and extract the common parts KS from the extraction target region WD of the corrected images PC.
 Next, the imaging position calculation unit 416 estimates the estimated angle θs of the swing body 3 based on the common parts KS of the plurality of images PC (step S140). The imaging position calculation unit 416 can calculate the turning angles Δθ from the positions of the common parts KS in the plurality of images PC and thereby estimate the estimated angles θs.
 When the difference between the estimated angle θs at the end of the turn and the post-turning angle θrb is less than a prescribed value, the imaging position calculation unit 416 corrects the estimated angles θs during the turn based on the angles θr (the pre-turning angle θra and the post-turning angle θrb). It then corrects the estimated positions Ps of the imaging device 30 during the turn based on the corrected estimated angles θs. The imaging position calculation unit 416 can correct the estimated angles θs following the procedure described above.
 The three-dimensional position calculation unit 417 performs stereo processing on the captured images PC to calculate the three-dimensional positions of the construction target SB in the imaging device coordinate system (step S160), and converts them into three-dimensional positions in the site coordinate system.
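 As a hedged sketch of the stereo step, depth for a rectified pair follows from disparity via the pinhole relation Z = f·B/d. The OpenCV matcher parameters below are placeholders, not values from the patent:

```python
import cv2
import numpy as np

def stereo_depth(left, right, focal_px, baseline_m):
    """Depth map for a rectified grayscale pair from one pair of imaging devices 30."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # fixed point -> px
    # Z = f * B / d, valid only where a disparity was found (d > 0).
    return np.where(disparity > 0, focal_px * baseline_m / disparity, 0.0)
```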
[Effects]
 As described above, according to the present embodiment, when the position of the imaging device 30 is acquired based on the common parts KS, which are feature points common to a plurality of images PC, the extraction target region WD from which the common parts KS are extracted is divided into the divided regions WD1, WD2, WD3, and WD4, and the common part extraction unit 413 extracts common parts KS from each of them. The common parts KS extracted from the images PC are thereby dispersed, so the imaging position calculation unit 416 can calculate the estimated position Ps of the imaging device 30 with high accuracy based on the dispersed common parts KS, and the decrease in the measurement accuracy of the three-dimensional shape of the construction target SB is suppressed.
 In the present embodiment, the image processing unit 412 also corrects the contrast distribution of the images PC. Thus, even if a low-contrast portion is present in the extraction target region WD, the contrast distribution is corrected so that the contrast of the extraction target region WD increases. With the higher contrast, the common part extraction unit 413 can extract common parts KS from the extraction target region WD, and the imaging position calculation unit 416 can calculate the estimated position Ps of the imaging device 30 with high accuracy based on the extracted common parts KS.
Second embodiment.
 A second embodiment will be described. In the following description, components identical or equivalent to those of the above-described embodiment are denoted by the same reference numerals, and their description is simplified or omitted.
 FIG. 14 is a schematic diagram illustrating an example of the processing of the measurement system 50 according to the present embodiment. In the above-described embodiment, the hydraulic excavator 1 being in operation meant that the traveling body 5 was stationary and the swing body 3 was turning. The hydraulic excavator 1 being in operation may also mean that the swing body 3 is stationary and the traveling body 5 is traveling, or that the swing body 3 is turning while the traveling body 5 is traveling.
 As shown in FIG. 14, when the traveling body 5 travels in the traveling direction MD, the imaging device 30 and its imaging region FM move in the traveling direction MD. The imaging device 30 captures the construction target SB so that an overlapping region is provided between the imaging regions FM1 and FM2 and between the imaging regions FM2 and FM3. The imaging device 30 captures images PC1, PC2, and PC3 of the construction target SB located in the imaging regions FM1, FM2, and FM3, respectively.
 The common part extraction unit 413 can extract the common part KS1 of the images PC1 and PC2 and the common part KS2 of the images PC2 and PC3.
 The imaging position calculation unit 416 calculates the movement amount ΔD1 of the imaging device 30 based on the displacement of the common part KS1, and the movement amount ΔD2 based on the displacement of the common part KS2. Based on the movement amounts ΔD (ΔD1, ΔD2) of the imaging device 30, the imaging position calculation unit 416 acquires the position of the imaging device 30 at each imaging time while the traveling body 5 is traveling. In the above-described embodiment, the imaging position calculation unit 416 calculated the turning angle θ; in the present embodiment, it calculates six variables for the position and posture of the imaging device 30: the positions in the X-axis, Y-axis, and Z-axis directions, and the roll, pitch, and yaw angles. The three-dimensional position calculation unit 417 calculates the three-dimensional position of the construction target SB based on the position of the imaging device 30 at the time of imaging and the image PC it captured.
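 One standard way to recover such a six-variable camera motion from the common parts KS is essential matrix decomposition, sketched below assuming OpenCV. The recovered translation is scale-free, so an external scale such as the stereo baseline would be needed; this is one possible realization, not necessarily the patent's method:

```python
import cv2

def relative_pose(pts1, pts2, K):
    """pts1, pts2: Nx2 float arrays of matched pixels; K: 3x3 camera intrinsics."""
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # rotation and unit-length translation between the two shots
```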
In the above-described embodiment, the position of the imaging device 30 while the hydraulic excavator 1 is swinging was calculated based on the common part KS, but the present invention is not limited to that embodiment. For example, the position of the imaging device 30 while the hydraulic excavator 1 is swinging may be calculated based on detection data of the position detection device 23 and detection data of the posture detection device 24.
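A sketch of that sensor-based alternative: the camera pose follows from the detected body position and attitude plus a fixed mounting offset of the camera on the swing body. The lever-arm vector, the example sensor values, and the rotation convention are assumptions.

```python
# Illustrative sensor-based camera position: body pose from the
# position detection device 23 and posture detection device 24, plus
# an assumed fixed mounting offset of the imaging device 30.
import numpy as np

def rot_zyx(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Body-to-site rotation, Z(yaw) @ Y(pitch) @ X(roll) (assumed convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

body_pos = np.array([100.0, 200.0, 30.0])  # example position data [m]
roll, pitch, yaw = 0.01, -0.02, 1.20       # example posture data [rad]
cam_offset = np.array([1.5, 0.3, 2.1])     # assumed lever arm, body frame [m]

cam_pos = body_pos + rot_zyx(roll, pitch, yaw) @ cam_offset
```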
In the above-described embodiment, the imaging position calculation unit 416 calculated the shooting position with the swing axis Zr as a constraint condition, using the swing angle θ as the only variable. Alternatively, without using the swing axis Zr as a constraint condition, the position and posture of the imaging device 30 may be calculated from the six variables of the position in the X-axis direction, the position in the Y-axis direction, the position in the Z-axis direction, the roll angle, the pitch angle, and the yaw angle.
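To make the single-variable, axis-constrained variant concrete, the toy sketch below rotates points about the swing axis by a candidate θ and minimizes the fit error with a scalar optimizer; the cost function, the synthetic data, and the use of SciPy are all assumptions standing in for whatever solver the imaging position calculation unit 416 actually uses.

```python
# Toy one-variable solve for the swing angle θ under the swing-axis
# constraint Zr. Synthetic data and the solver are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def rot_about_z(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Synthetic common-part points before the swing, and their observed
# positions after a true swing of 0.4 rad.
rng = np.random.default_rng(0)
pts3d = rng.uniform(-5.0, 5.0, size=(50, 3))
observed = (rot_about_z(0.4) @ pts3d.T).T

def cost(theta: float) -> float:
    predicted = (rot_about_z(theta) @ pts3d.T).T
    return float(np.mean(np.sum((predicted - observed) ** 2, axis=1)))

res = minimize_scalar(cost, bounds=(-np.pi, np.pi), method="bounded")
theta_hat = res.x  # recovers ~0.4 rad
```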
In the above-described embodiment, the input device 32 may be attached, for example, to at least one of the right operation lever 35R and the left operation lever 35L of the operation device 35, may be provided on a monitor panel arranged in the cab 4, or may be provided in a portable terminal device. In addition, the input device 32 may be provided outside the hydraulic excavator 1, and the start or end of shooting by the imaging device 30 may be remotely controlled.
In the above-described embodiment, the imaging position calculation unit 416 calculated the swing angle based on the common part KS, but it may instead calculate the swing angle based on the detection result of a detection device. For example, the imaging position calculation unit 416 may calculate the swing angle θ of the swing body 3 based on detection data of the position detection device 23 or detection data of the posture detection device 24, based on detection data of the operation amount sensor 36, or based on detection data of an angle sensor capable of detecting the swing angle of the swing body 3, for example a rotary encoder. In this case, the arithmetic processing device 41 synchronizes the timing at which the swing angle θ, the detection value of the angle sensor, is acquired from the angle sensor with the timing at which the at least one pair of imaging devices 30 photograph the construction target SB. In this manner, the timing at which the images PC are captured by the at least one pair of imaging devices 30 is associated with the swing angle θ of the swing body 3 at that timing.
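One simple way to realize that synchronization in software, sketched below, is to timestamp both streams and attach the nearest swing-angle sample to each image; hardware triggering would be the stricter alternative. The sample times and data layout are assumed.

```python
# Illustrative association of swing-angle samples with image capture
# times by nearest-sample lookup. All values are assumed examples.
import numpy as np

angle_t = np.array([0.00, 0.05, 0.10, 0.15, 0.20])  # sensor sample times [s]
angle_v = np.array([0.00, 0.04, 0.09, 0.13, 0.18])  # swing angle θ [rad]
image_t = np.array([0.03, 0.12, 0.19])              # image capture times [s]

idx = np.clip(np.searchsorted(angle_t, image_t), 1, len(angle_t) - 1)
# Pick whichever neighbouring sample is closer in time.
left_closer = (image_t - angle_t[idx - 1]) < (angle_t[idx] - image_t)
nearest = np.where(left_closer, idx - 1, idx)

theta_at_images = angle_v[nearest]  # θ associated with each image PC
```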
In the above-described embodiment, the arithmetic processing device 41 performed stereo processing on the images PC captured by the at least one pair of imaging devices 30 to realize the three-dimensional measurement, but the measurement is not limited to this. For example, the images PC of the excavation target SB around the hydraulic excavator 1 captured by the at least one pair of imaging devices 30, together with the position and posture of the hydraulic excavator 1 at rest obtained by the position detection device 23 and the posture detection device 24, may be transmitted to a management device outside the hydraulic excavator 1 (a portable terminal, a server device, or the like). The external management device may then perform stereo processing on the images PC of the excavation target SB, obtain the swing angle θ during swinging of the swing body 3 and the position and posture of the hydraulic excavator 1, and use the results to obtain the three-dimensional position of the excavation target SB around the hydraulic excavator 1 during swinging. In this case, the management device outside the hydraulic excavator 1 corresponds to the arithmetic processing device 41. That is, in other embodiments, the measurement system 50 need not realize all of its functions on the hydraulic excavator 1 alone; an external management device may provide some of the functions, for example the functions of the arithmetic processing device 41.
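As a sketch of what such offloading could look like, the machine side might bundle each image with the detected position and posture and send it to the management device; the endpoint URL, field names, and JSON-over-HTTP transport below are purely assumptions, since the publication defines no protocol.

```python
# Illustrative upload of one capture to an external management device.
# The URL, payload fields, and transport are assumptions.
import base64
import json
import urllib.request

def upload_capture(image_bytes: bytes, position, posture, timestamp: float,
                   url: str = "http://example.invalid/captures"):  # assumed URL
    payload = {
        "timestamp": timestamp,
        "position": list(position),  # from position detection device 23
        "posture": list(posture),    # from posture detection device 24
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }
    req = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```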
In the above-described embodiments, the imaging device is a stereo camera including at least one pair of imaging devices 30, and when shooting with the stereo camera, the shooting timings of the individual cameras are synchronized. The imaging device may instead be one capable of stereo shooting with a single camera, that is, an imaging device capable of stereo processing based on two images captured by one camera at different shooting timings. The imaging device is not limited to a stereo camera. It may be a sensor that obtains both an image and three-dimensional data, such as a TOF (Time Of Flight) camera, an imaging device that obtains three-dimensional data with a single camera, or a laser scanner.
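For the synchronized stereo-pair case, a minimal sketch of how depth follows from disparity; the semi-global matcher, baseline, and focal length below are assumed example values, not parameters from the publication.

```python
# Illustrative stereo depth from a synchronized, rectified image pair.
# Baseline, focal length, and matcher settings are assumptions.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed files,
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified pair

stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64,
                               blockSize=5)
# OpenCV returns fixed-point disparity scaled by 16.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

focal_px = 1000.0   # assumed focal length [px]
baseline_m = 0.30   # assumed spacing between the paired cameras [m]
with np.errstate(divide="ignore"):
    depth_m = focal_px * baseline_m / disparity  # Z = f * B / d
depth_m[disparity <= 0] = np.nan                 # mask invalid matches
```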
In the above-described embodiments, the work machine 1 is the hydraulic excavator 1 having the swing body 3, but the work machine 1 may be a work machine without a swing body. For example, the work machine may be at least one of a bulldozer, a wheel loader, a dump truck, and a motor grader.
Description of reference signs: 1: hydraulic excavator (work machine); 1B: vehicle body; 2: work implement; 3: swing body; 4: cab; 4S: operator's seat; 5: traveling body; 5A, 5B: crawler tracks; 6: boom; 7: arm; 8: bucket; 9: counterweight; 10: boom cylinder; 11: arm cylinder; 12: bucket cylinder; 21: GPS antenna; 23: position detection device; 24: posture detection device; 30, 30A, 30B, 30C, 30D: imaging device; 31: hub; 32: input device; 32A: continuous swing shooting switch; 35: operation device; 35L: left operation lever; 35R: right operation lever; 36: operation amount sensor; 37: swing sensor; 40: control device; 41: arithmetic processing device; 42: storage device; 43: input/output interface; 50: measurement system; 300: stereo camera; 301: first stereo camera; 302: second stereo camera; 410: image acquisition unit; 411: signal acquisition unit; 412: image processing unit; 413: common part extraction unit; 416: imaging position calculation unit; 417: three-dimensional position calculation unit; 418: determination unit; 423: image storage unit; FM: imaging area; KS: common part; MD: traveling direction; PC: image; PK: calculation image; Pr: swing position; Pra: pre-swing position; Prb: post-swing position; Ps: estimated position; RD: swing direction; SB: construction target; WD: extraction target area; WD1, WD2, WD3, WD4: divided areas; WE: non-target area; Zr: swing axis; θr: swing angle; θra: pre-swing angle; θrb: post-swing angle; θs: estimated angle.

Claims (6)

1. A measurement system for a work machine, comprising:
an image acquisition unit that acquires a plurality of images of a construction target captured, during operation of the work machine, by an imaging device mounted on a swing body;
a common part extraction unit that divides an extraction target area for extracting a common part of the plurality of images into divided areas and extracts the common part from each of the divided areas;
an imaging position calculation unit that calculates a position and posture of the imaging device at a time the images were captured, based on position data of the work machine detected by a position detection unit that detects a position of the work machine, posture data of the work machine detected by a posture detection unit that detects a posture of the work machine, and the common part; and
a three-dimensional position calculation unit that calculates a three-dimensional position of the construction target based on the position data, the posture data, and the images.
2. The measurement system for a work machine according to claim 1, further comprising an image processing unit that corrects a contrast distribution of the images, wherein the common part extraction unit extracts the common part from the images whose contrast distribution has been corrected.
3. A measurement system for a work machine, comprising:
an image acquisition unit that acquires a plurality of images of a construction target captured, while the work machine is operating, by an imaging device mounted on a swing body;
an image processing unit that corrects a contrast distribution of the images;
a common part extraction unit that extracts a common part of the images from an extraction target area for extracting the common part of the plurality of images whose contrast distribution has been corrected;
an imaging position calculation unit that calculates a position and posture of the imaging device at a time the images were captured, based on position data of the work machine detected by a position detection unit that detects a position of the work machine, posture data of the work machine detected by a posture detection unit that detects a posture of the work machine, and the common part; and
a three-dimensional position calculation unit that calculates a three-dimensional position of the construction target based on the position data, the posture data, and the images.
4. The measurement system for a work machine according to any one of claims 1 to 3, wherein the work machine has a traveling body, and the extraction target area is defined such that the traveling body is located outside the extraction target area in the images.
5. A work machine comprising the measurement system for a work machine according to any one of claims 1 to 4.
6. A measurement method for a work machine, comprising:
acquiring a plurality of images of a construction target captured, while the work machine is operating, by an imaging device mounted on a swing body;
dividing an extraction target area for extracting a common part of the images into a plurality of divided areas, and extracting the common part from each of the plurality of divided areas;
calculating a position and posture of the imaging device at a time the images were captured, based on position data of the work machine, posture data of the work machine, and the common part; and
calculating a three-dimensional position of the construction target based on the position and posture of the imaging device and the images.
PCT/JP2018/026709 2017-09-08 2018-07-17 Work machine measurement system, work machine, and work machine measurement method WO2019049515A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-173564 2017-09-08
JP2017173564A JP6895350B2 (en) 2017-09-08 2017-09-08 Work machine measurement system and work machine

Publications (1)

Publication Number Publication Date
WO2019049515A1 (en)

Family

ID=65633817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/026709 WO2019049515A1 (en) 2017-09-08 2018-07-17 Work machine measurement system, work machine, and work machine measurement method

Country Status (2)

Country Link
JP (1) JP6895350B2 (en)
WO (1) WO2019049515A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111703202B (en) 2019-03-18 2021-10-29 精工爱普生株式会社 Liquid absorber, and liquid ejecting apparatus
JP7341861B2 (en) * 2019-11-11 2023-09-11 株式会社トプコン Management system and method using eyewear device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004325355A (en) * 2003-04-25 2004-11-18 Topcon Corp Method and device for measuring three-dimensional coordinate
JP2006350897A (en) * 2005-06-20 2006-12-28 Toyota Central Res & Dev Lab Inc Motion measurement device
WO2014192061A1 (en) * 2013-05-27 2014-12-04 パイオニア株式会社 Image processing device, image processing method, and image processing program
JP2017026552A (en) * 2015-07-27 2017-02-02 株式会社パスコ Three-dimensional measurement device, three-dimensional measurement method and program
JP2017072425A (en) * 2015-10-05 2017-04-13 株式会社小松製作所 Construction management system and shape measuring method

Also Published As

Publication number Publication date
JP6895350B2 (en) 2021-06-30
JP2019049460A (en) 2019-03-28

Similar Documents

Publication Publication Date Title
CN110249203B (en) Work machine surveying system, work machine, and work machine surveying method
CN108700402B (en) Position measurement system, working machine, and position measurement method
KR102089454B1 (en) Measuring system, working machine and measuring method
JP6777375B2 (en) Work machine image display system, work machine remote control system and work machine
WO2020241618A1 (en) Map generation system and map generation method
JP6898816B2 (en) Display system, display method, and display device
JP2016008484A (en) Construction machinery
WO2018062523A1 (en) Detection processing device of working machine and detection processing method of working machine
WO2019049515A1 (en) Work machine measurement system, work machine, and work machine measurement method
JP2022164713A (en) Image display system of work machine and image display method of work machine
JP7135191B2 (en) Working machine measurement system, working machine, and working machine measurement method
KR20230171035A (en) Control system of working machine, working machine, and control method of working machine
JP2020197045A (en) Display system and display method
JP2019152098A (en) Shovel

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18854779; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18854779; Country of ref document: EP; Kind code of ref document: A1)