WO2020196874A1 - Construction machine and support system - Google Patents
Construction machine and support system
- Publication number
- WO2020196874A1 (PCT/JP2020/014204)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- excavator
- information
- work area
- construction machine
- controller
- Prior art date
Classifications
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/205—Remotely operated machines, e.g. unmanned vehicles
- E02F9/2054—Fleet management
- E02F9/2058—Electric or electro-mechanical or mechanical control devices of vehicle sub-units
- E02F9/2095—Control of electric, electro-mechanical or mechanical equipment not otherwise provided for, e.g. ventilators, electro-driven fans
- E02F9/24—Safety devices, e.g. for preventing overload
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
- E02F9/262—Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0022—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- This disclosure relates to construction machinery and support systems.
- A construction machine that acquires useful information about the work area around it (for example, information about the detection of an object to be monitored, such as a nearby worker) is known (for example, Patent Document 1).
- The information about the work area acquired by one construction machine may also be useful for other construction machines working in the same work area (work site). It is therefore desirable that the information about the surrounding work area acquired by one construction machine also be made available to other construction machines.
- According to one aspect of the present disclosure, a construction machine is provided that includes an acquisition unit that acquires information about the work area around the construction machine, and a transmission unit that transmits the information acquired by the acquisition unit to other construction machines around the construction machine.
- According to another aspect of the present disclosure, a support system is provided that includes a plurality of construction machines located within a given work area, each of the plurality of construction machines including an acquisition unit that acquires information about the work area and a transmission unit that transmits the information acquired by the acquisition unit to the other construction machines.
- FIG. 1 is a schematic diagram showing an example of the configuration of the excavator support system SYS.
- The excavator support system SYS includes a plurality of excavators 100 arranged relatively close to each other (for example, working at the same work site (work area)), and supports the work of each excavator 100.
- In the following description, the plurality of excavators 100 are each assumed to have the same configuration with respect to the excavator support system SYS.
- The excavator 100 (an example of a construction machine) includes a lower traveling body 1, an upper swing body 3 rotatably mounted on the lower traveling body 1 via a turning mechanism 2, an attachment AT composed of a boom 4, an arm 5, and a bucket 6, and a cabin 10.
- the lower traveling body 1 includes a pair of left and right crawlers 1C, specifically, a left crawler 1CL and a right crawler 1CR.
- the lower traveling body 1 travels the excavator 100 by hydraulically driving the left crawler 1CL and the right crawler 1CR by the traveling hydraulic motors 2M (2ML, 2MR), respectively.
- the upper swing body 3 turns with respect to the lower traveling body 1 by being driven by the swing hydraulic motor 2A. Further, the upper swing body 3 may be electrically driven by an electric motor instead of being hydraulically driven by the swing hydraulic motor 2A.
- the side of the upper swing body 3 to which the attachment AT is attached is referred to as the front, and the side to which the counterweight is attached is referred to as the rear.
- The boom 4 is pivotally attached to the center of the front portion of the upper swing body 3 so as to be vertically movable, the arm 5 is pivotally attached to the tip of the boom 4 so as to be vertically rotatable, and the bucket 6 is pivotally attached to the tip of the arm 5 so as to be vertically rotatable.
- the boom 4, arm 5, and bucket 6 are hydraulically driven by the boom cylinder 7, arm cylinder 8, and bucket cylinder 9 as hydraulic actuators, respectively.
- The cabin 10 is a driver's cab in which the operator rides, and is mounted on the front left side of the upper swing body 3.
- The excavator 100 is communicably connected to other excavators 100 in a peer-to-peer (P2P) manner by a predetermined short-range wireless communication method conforming to a predetermined communication protocol, for example, Bluetooth (registered trademark) communication or WiFi (registered trademark) communication.
- the excavator 100 can acquire various information from the other excavator 100 and transmit various information to the other excavator 100. Details will be described later.
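The peer-to-peer exchange described above can be illustrated with a minimal in-memory sketch. The `ExcavatorNode` class, its method names, and the message format are assumptions made for illustration only; an actual system would carry the messages over the short-range wireless link (Bluetooth/WiFi) mentioned above.

```python
# Minimal in-memory sketch of P2P information sharing between excavators.
# Class and field names are illustrative assumptions, not from the patent.

class ExcavatorNode:
    def __init__(self, machine_id):
        self.machine_id = machine_id
        self.peers = []      # other ExcavatorNode instances within range
        self.received = []   # (sender_id, work_area_info) tuples from peers

    def connect(self, other):
        """Establish a bidirectional P2P link with another excavator."""
        if other not in self.peers:
            self.peers.append(other)
            other.peers.append(self)

    def broadcast(self, work_area_info):
        """Send acquired work-area information to every connected peer."""
        for peer in self.peers:
            peer.received.append((self.machine_id, work_area_info))

a, b, c = ExcavatorNode("EX-A"), ExcavatorNode("EX-B"), ExcavatorNode("EX-C")
a.connect(b)
a.connect(c)
a.broadcast({"object": "person", "position": (3.2, -1.5)})
print(len(b.received), b.received[0][0])  # each peer received EX-A's message
```

Because every machine is both a sender and a receiver, no central server is needed, which matches the P2P connection style described above.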
- FIG. 2 is a top view of the excavator 100.
- FIG. 3 is a configuration diagram showing an example of the configuration of the excavator 100.
- The excavator 100 includes hydraulic actuators such as the traveling hydraulic motors 2M (2ML, 2MR), the swing hydraulic motor 2A, the boom cylinder 7, the arm cylinder 8, and the bucket cylinder 9 as a configuration related to the hydraulic system. The excavator 100 further includes an engine 11, a regulator 13, a main pump 14, an oil temperature sensor 14c, a pilot pump 15, a control valve 17, an operating device 26, a discharge pressure sensor 28, an operation pressure sensor 29, a pressure reducing valve 50, and a control valve 60 as configurations related to the hydraulic system.
- The excavator 100 includes a controller 30, an engine control unit (ECU: Engine Control Unit) 74, an engine speed adjustment dial 75, a boom angle sensor S1, an arm angle sensor S2, a bucket angle sensor S3, and the like as configurations related to the control system.
- the engine 11 is the main power source of the hydraulic system, and is mounted on the rear part of the upper swing body 3, for example. Specifically, the engine 11 rotates constantly at a preset target rotation speed under the control of the ECU 74 to drive the main pump 14, the pilot pump 15, and the like.
- the engine 11 is, for example, a diesel engine that uses light oil as fuel.
- the regulator 13 controls the discharge amount of the main pump 14. For example, the regulator 13 adjusts the angle of the swash plate of the main pump 14 (hereinafter, “tilt angle”) in response to a control command from the controller 30.
- the main pump 14 is mounted on the rear part of the upper swing body 3 like the engine 11, and is driven by the engine 11 as described above to supply hydraulic oil to the control valve 17 through the high-pressure hydraulic line.
- The main pump 14 is, for example, a variable displacement hydraulic pump; under the control of the controller 30, the tilt angle of the swash plate is adjusted by the regulator 13 to adjust the piston stroke length, whereby the discharge flow rate (discharge pressure) is controlled.
- the oil temperature sensor 14c detects the temperature of the hydraulic oil flowing into the main pump 14. The detection signal corresponding to the detected hydraulic oil temperature is taken into the controller 30.
- the pilot pump 15 is mounted on the rear part of the upper swing body 3, for example, and supplies the pilot pressure to the operating device 26 via the pilot line.
- the pilot pump 15 is, for example, a fixed displacement hydraulic pump, and is driven by the engine 11 as described above.
- the control valve 17 is, for example, a hydraulic control device mounted in the central portion of the upper swing body 3 and controlling a hydraulic actuator in response to an operator's operation on the operating device 26.
- The control valve 17 is connected to the main pump 14 via the high-pressure hydraulic line, and selectively supplies the hydraulic oil supplied from the main pump 14 to the hydraulic actuators (the traveling hydraulic motors 2ML, 2MR, the turning hydraulic motor 2A, the boom cylinder 7, the arm cylinder 8, and the bucket cylinder 9) according to the operating state (operation content) of the operating device 26.
- The operating device 26 is provided near the driver's seat in the cabin 10, and is an operation input means for the operator to operate the various driven elements (the lower traveling body 1, the upper swing body 3, the boom 4, the arm 5, the bucket 6, etc.).
- In other words, the operating device 26 is an operation input means for the operator to operate the hydraulic actuators that drive the respective driven elements (that is, the traveling hydraulic motors 2ML, 2MR, the swing hydraulic motor 2A, the boom cylinder 7, the arm cylinder 8, the bucket cylinder 9, etc.).
- the operating device 26 is connected to the control valve 17 through a pilot line on the secondary side thereof.
- the control valve 17 can selectively drive each hydraulic actuator according to the operating state of the operating device 26.
- the discharge pressure sensor 28 detects the discharge pressure of the main pump 14. The detection signal corresponding to the discharge pressure detected by the discharge pressure sensor 28 is taken into the controller 30.
- The operation pressure sensor 29 detects the pilot pressure on the secondary side of the operating device 26, that is, the pilot pressure corresponding to the operating state (operation content) of each driven element (that is, each hydraulic actuator) in the operating device 26 (hereinafter, “operating pressure”).
- The detection signals of the operation pressure sensor 29, which correspond to the operating states of the lower traveling body 1, the upper swing body 3, the boom 4, the arm 5, the bucket 6, and the like in the operating device 26, are taken into the controller 30.
- The pressure reducing valve 50 is provided in the pilot line on the secondary side of the operating device 26, that is, in the pilot line between the operating device 26 and the control valve 17, and, under the control of the controller 30, adjusts (reduces) the pilot pressure corresponding to the operation content (operation amount) of the operating device 26. As a result, the controller 30 can control (limit) the operation of the various driven elements by controlling the pressure reducing valve 50.
- The control valve 60 switches between enabled and disabled states of operation of the operating device 26, that is, of the operation of the various driven elements of the excavator 100.
- the control valve 60 is, for example, a gate lock valve configured to operate in response to a control command from the controller 30.
- the control valve 60 is arranged on the pilot line between the pilot pump 15 and the operating device 26, and switches the communication / interruption (non-communication) of the pilot line in response to the control command from the controller 30.
- When the gate lock lever provided near the entrance of the driver's seat of the cabin 10 is pulled up, the gate lock valve enters the communication state and operation of the operating device 26 is enabled (operable state); when the gate lock lever is pushed down, the gate lock valve enters the interruption state and operation of the operating device 26 is disabled.
- the controller 30 can limit (stop) the operation of the excavator 100 by outputting a control command to the control valve 60.
- the controller 30 is, for example, a control device mounted inside the cabin 10 to drive and control the excavator 100.
- the controller 30 operates on the electric power supplied from the storage battery BT.
- the function of the controller 30 may be realized by arbitrary hardware or a combination of arbitrary hardware and software.
- The controller 30 is mainly composed of a computer including, for example, a CPU (Central Processing Unit), a memory device such as a RAM (Random Access Memory), a non-volatile auxiliary storage device such as a ROM (Read Only Memory), and an interface device for input/output to and from the outside. In this case, the controller 30 realizes various functions by reading one or more programs stored (installed) in the auxiliary storage device, loading them into the memory device, and executing them on the CPU.
- Part of the functions of the controller 30 may be realized by another controller (control device); that is, the functions of the controller 30 may be distributed among a plurality of controllers. The storage battery BT is charged by the power generated by the alternator 11b driven by the engine 11.
- The controller 30 controls the regulator 13 and the like based on detection signals taken in from various sensors such as the boom angle sensor S1, the arm angle sensor S2, the bucket angle sensor S3, the discharge pressure sensor 28, and the operation pressure sensor 29.
- When an object to be monitored (for example, a person, a truck, etc.) is detected by the object detection device 70 within a predetermined monitoring area around the excavator 100 (for example, a work area within 5 meters of the excavator 100), the controller 30 performs control for avoiding contact between the excavator 100 and the object to be monitored (hereinafter, “contact avoidance control”).
- the controller 30 may output a control command to the alarm device 49 to output an alarm as an example of contact avoidance control.
- the controller 30 may output a control command to the pressure reducing valve 50 or the control valve 60 to limit the operation of the excavator 100.
- The target of the operation restriction may be all of the driven elements, or only the part of the driven elements necessary for avoiding contact between the object to be monitored and the excavator 100.
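The contact avoidance control above can be sketched as a simple distance-based decision. Only the 5 m monitoring radius comes from the description; the closer restriction threshold and the action labels are illustrative assumptions.

```python
# Illustrative sketch of contact avoidance control: alarm first, then also
# limit operation when the object is closer. Thresholds below 5 m and the
# returned action labels are assumptions, not values from the patent.

MONITORING_RADIUS_M = 5.0   # example monitoring area (within 5 m)
ALARM_RADIUS_M = 3.0        # hypothetical closer range for restricting operation

def contact_avoidance_action(distance_m):
    """Choose a response based on the detected object's distance."""
    if distance_m > MONITORING_RADIUS_M:
        return "none"                   # object outside the monitoring area
    if distance_m > ALARM_RADIUS_M:
        return "alarm"                  # alert via alarm device 49
    return "alarm+limit_operation"      # also limit operation via valves 50/60

print(contact_avoidance_action(6.0))   # -> none
print(contact_avoidance_action(4.0))   # -> alarm
print(contact_avoidance_action(1.0))   # -> alarm+limit_operation
```

The graded response mirrors the description: an alarm output first, and operation limiting through the pressure reducing valve 50 or the control valve 60 as a stronger measure.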
- The controller 30 acquires information about the work area around the excavator 100 (hereinafter, “work area information”), and transmits the acquired work area information, which may be useful for other excavators 100 in the vicinity, to those other excavators 100 through a communication device 90 (an example of a transmission unit).
- Specifically, the controller 30 acquires information regarding the presence or absence of object detection by the object detection device 70, which will be described later, that is, information regarding the determination result of the presence or absence of an object around the excavator 100 (hereinafter, “object detection information”), and transmits it to the other excavators 100 around the excavator 100 through the communication device 90.
- The object detection information includes, for example, the presence or absence of an object, the type of the object, and the position of the object. The object detection information may be transmitted only when an object is detected by the object detection device 70, or may be transmitted regardless of whether an object is detected. Details of this function for sharing information among the plurality of excavators 100 in the excavator support system SYS (hereinafter, “information sharing function”) will be described later (see FIGS. 5 to 7).
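As a concrete illustration of what such object detection information might look like when serialized for transmission, the field names and JSON encoding below are assumptions for the sketch; the patent only specifies the kinds of content (presence/absence, type, position).

```python
# Hedged sketch of an "object detection information" payload. Field names
# and the JSON wire format are illustrative assumptions.

import json

def make_object_detection_info(detected, obj_type=None, position=None):
    """Build the object detection information shared between excavators."""
    return {
        "detected": detected,   # presence/absence of an object
        "type": obj_type,       # e.g. "person", "truck", or None
        "position": position,   # e.g. (x, y) in a site coordinate frame
    }

info = make_object_detection_info(True, "person", (12.0, 4.5))
payload = json.dumps(info)       # what communication device 90 would transmit
restored = json.loads(payload)   # what a receiving excavator would decode
print(restored["type"])
```

Note that JSON has no tuple type, so the position round-trips as a list; a real system would fix a coordinate convention shared by all machines on the site.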
- The controller 30 also has a function for analyzing the situation of the work site including the work area around the excavator 100 (hereinafter, “work site situation analysis function”). Specifically, the controller 30 recognizes surrounding objects in time series based on the outputs of the object detection device 70 and the image pickup device 80, and analyzes the situation at the work site. Details of the work site situation analysis function will be described later (see FIG. 8).
- the ECU 74 drives and controls the engine 11 under the control of the controller 30.
- the ECU 74 appropriately controls the fuel injection device and the like in accordance with the operation of the starter 11a driven by the electric power from the storage battery BT in response to the ignition on operation, and starts the engine 11.
- the ECU 74 appropriately controls the fuel injection device and the like so that the engine 11 rotates constantly at a set rotation speed specified by a control signal from the controller 30 (isochronous control).
- the engine 11 may be directly controlled by the controller 30.
- the ECU 74 may be omitted.
- the engine speed adjustment dial 75 is an operating means for adjusting the speed of the engine 11 (hereinafter, "engine speed").
- Data regarding the engine speed setting state output from the engine speed adjustment dial 75 is taken into the controller 30.
- the engine speed adjustment dial 75 is configured so that the engine speed can be switched in four stages of SP (Super Power) mode, H (Heavy) mode, A (Auto) mode, and idling mode.
- the SP mode is an engine speed mode selected when it is desired to prioritize the amount of work, and a target speed having the highest engine speed is set.
- the H mode is an engine speed mode selected when it is desired to achieve both work load and fuel consumption, and the engine speed is set to the second highest target speed.
- the A mode is an engine speed mode selected when it is desired to operate the excavator 100 with low noise while giving priority to fuel consumption, and the engine speed is set to the third highest target speed.
- the idling mode is an engine speed mode selected when the engine 11 is desired to be in an idling state, and the engine speed is set to the lowest target speed. Under the control of the ECU 74, the engine 11 is controlled so as to be constant at a target rotation speed corresponding to the engine rotation speed mode set by the engine rotation speed adjustment dial 75.
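The four dial settings above amount to a mapping from mode to a constant target speed held by the ECU. The rpm values in this sketch are purely illustrative assumptions; the description only fixes their ordering (SP > H > A > idling).

```python
# Sketch of the four engine speed modes. The rpm values are illustrative
# assumptions -- only the ordering SP > H > A > IDLE comes from the text.

TARGET_RPM = {
    "SP": 2200,    # highest target speed (work amount prioritized)
    "H": 2000,     # second highest (work load and fuel consumption)
    "A": 1800,     # third highest (fuel consumption, low noise)
    "IDLE": 1000,  # lowest (idling state)
}

def target_speed(mode):
    """Return the constant target speed the ECU holds for a dial setting."""
    return TARGET_RPM[mode]

assert target_speed("SP") > target_speed("H") > target_speed("A") > target_speed("IDLE")
print(target_speed("A"))
```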
- the boom angle sensor S1 is attached to the boom 4 and detects the elevation angle (hereinafter, “boom angle”) ⁇ 1 of the boom 4 with respect to the upper swing body 3.
- the boom angle ⁇ 1 is, for example, an ascending angle from the state in which the boom 4 is most lowered. In this case, the boom angle ⁇ 1 becomes maximum when the boom 4 is raised most.
- The boom angle sensor S1 may include, for example, a rotary encoder, an acceleration sensor, a 6-axis sensor, an IMU (Inertial Measurement Unit), or the like; the same applies hereinafter to the arm angle sensor S2, the bucket angle sensor S3, and the body tilt sensor S4.
- the boom angle sensor S1 may be a stroke sensor attached to the boom cylinder 7, and the same applies to the arm angle sensor S2 and the bucket angle sensor S3 below.
- the detection signal corresponding to the boom angle ⁇ 1 by the boom angle sensor S1 is taken into the controller 30.
- the arm angle sensor S2 is attached to the arm 5 and detects the rotation angle (hereinafter, “arm angle”) ⁇ 2 of the arm 5 with respect to the boom 4.
- the arm angle ⁇ 2 is, for example, an opening angle from the most closed state of the arm 5. In this case, the arm angle ⁇ 2 becomes maximum when the arm 5 is opened most.
- the detection signal corresponding to the arm angle by the arm angle sensor S2 is taken into the controller 30.
- the bucket angle sensor S3 is attached to the bucket 6 and detects the rotation angle (hereinafter, “bucket angle”) ⁇ 3 of the bucket 6 with respect to the arm 5.
- the bucket angle ⁇ 3 is an opening angle from the most closed state of the bucket 6. In this case, the bucket angle ⁇ 3 becomes maximum when the bucket 6 is opened most.
- the detection signal corresponding to the bucket angle by the bucket angle sensor S3 is taken into the controller 30.
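The three joint angles measured by S1, S2, and S3 are enough to compute an attachment pose by planar forward kinematics. The link lengths and the angle sign conventions in this sketch are illustrative assumptions; real machines calibrate these values per model.

```python
# Sketch: combining boom/arm/bucket angles into a bucket-tip position.
# Link lengths and angle conventions are hypothetical, for illustration.

import math

BOOM_LEN = 5.7    # m, assumed
ARM_LEN = 2.9     # m, assumed
BUCKET_LEN = 1.5  # m, assumed

def bucket_tip_position(theta1, theta2, theta3):
    """Planar forward kinematics from the boom foot pin (angles in radians).

    theta1: boom angle from horizontal; theta2, theta3: relative joint
    angles, subtracted under an assumed sign convention.
    """
    a1 = theta1          # boom absolute angle
    a2 = a1 - theta2     # arm absolute angle (assumed convention)
    a3 = a2 - theta3     # bucket absolute angle (assumed convention)
    x = BOOM_LEN * math.cos(a1) + ARM_LEN * math.cos(a2) + BUCKET_LEN * math.cos(a3)
    z = BOOM_LEN * math.sin(a1) + ARM_LEN * math.sin(a2) + BUCKET_LEN * math.sin(a3)
    return x, z

x, z = bucket_tip_position(math.radians(45), math.radians(30), math.radians(60))
print(round(x, 2), round(z, 2))
```

With all angles zero the tip lies fully extended on the horizontal axis, which is a quick sanity check on the convention chosen here.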
- The body tilt sensor S4 detects the tilted state of the machine body (for example, the upper swing body 3) with respect to a predetermined plane (for example, a horizontal plane).
- The body tilt sensor S4 is attached to, for example, the upper swing body 3, and detects the tilt angles of the excavator 100 (that is, the upper swing body 3) around the two axes in the front-rear direction and the left-right direction (hereinafter, “front-rear tilt angle” and “left-right tilt angle”).
- The detection signals corresponding to the tilt angles (front-rear tilt angle and left-right tilt angle) from the body tilt sensor S4 are taken into the controller 30.
- the swivel state sensor S5 is attached to the upper swivel body 3 and outputs detection information regarding the swivel state of the upper swivel body 3.
- the turning state sensor S5 detects, for example, the turning angular velocity and the turning angle of the upper swing body 3.
- the swivel state sensor S5 includes, for example, a gyro sensor, a resolver, a rotary encoder, and the like.
- When the body tilt sensor S4 includes a gyro sensor, a 6-axis sensor, an IMU, or the like capable of detecting angular velocities around three axes, the turning state of the upper swing body 3 (for example, the turning angular velocity) may be detected based on the detection signal of the body tilt sensor S4. In this case, the turning state sensor S5 may be omitted.
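When only yaw angular velocity is available from such a gyro, the turning angle itself can be estimated by numerical integration. The sample data, sampling period, and function name below are illustrative assumptions.

```python
# Sketch: estimating the turning angle of the upper swing body by
# integrating yaw-rate samples from a gyro/IMU. Data here is synthetic.

def integrate_turning_angle(angular_velocities, dt):
    """Integrate yaw rate samples (rad/s) into a turning angle (rad)."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt   # simple rectangular (Euler) integration
    return angle

# 1 s of samples at 100 Hz with a constant yaw rate of 0.5 rad/s
samples = [0.5] * 100
angle = integrate_turning_angle(samples, dt=0.01)
print(round(angle, 3))  # ~0.5 rad after 1 s
```

In practice such open-loop integration drifts, which is one reason a dedicated turning state sensor S5 (resolver, rotary encoder) may still be preferred for the turning angle.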
- the alarm device 49 alerts a person involved in the work of the excavator 100 (for example, an operator in the cabin 10 or a worker around the excavator 100).
- the alarm device 49 includes, for example, an indoor alarm device for alerting an operator or the like inside the cabin 10.
- the indoor alarm device includes, for example, at least one of an audio output device, a vibration generator, and a light emitting device provided in the cabin 10.
- the indoor alarm device may include a display device DS.
- the alarm device 49 may include an outdoor alarm device for alerting workers and the like outside the cabin 10 (for example, around the excavator 100).
- the outdoor alarm device includes, for example, at least one of an audio output device and a light emitting device provided outside the cabin 10.
- the voice output device may be, for example, a traveling alarm device attached to the bottom surface of the upper swing body 3.
- the outdoor alarm device may be a light emitting device provided on the upper swivel body 3.
- As described above, the alarm device 49 can notify persons involved in the work of the excavator 100 under the control of the controller 30.
- the object detection device 70 detects an object existing around the excavator 100.
- Objects to be detected include, for example, people, animals, vehicles, construction machinery, buildings, walls, fences, holes, and the like.
- The object detection device 70 includes, for example, at least one of a monocular camera (an example of a camera), an ultrasonic sensor, a millimeter wave radar, a stereo camera, a LIDAR (Light Detection and Ranging) sensor, a range image sensor, and an infrared sensor.
- the object detection device 70 may be configured to detect a predetermined object in a predetermined area set around the excavator 100.
- the object detection device 70 may be configured in a mode in which the types of objects can be distinguished, for example, a mode in which a person and a non-human object can be distinguished.
- the object detection device 70 may have a configuration capable of detecting a predetermined object or distinguishing the type of the object based on a predetermined model such as a pattern recognition model or a machine learning model.
- the object detection device 70 includes a front sensor 70F, a rear sensor 70B, a left sensor 70L, and a right sensor 70R.
- the output signal corresponding to the detection result by the object detection device 70 (front sensor 70F, rear sensor 70B, left sensor 70L, and right sensor 70R, respectively) is taken into the controller 30.
- the front sensor 70F is attached to, for example, the front end of the upper surface of the cabin 10 and detects an object existing in front of the upper swing body 3.
- the rear sensor 70B is attached to, for example, the rear end of the upper surface of the upper swing body 3 and detects an object existing behind the upper swing body 3.
- the left sensor 70L is attached to, for example, the left end of the upper surface of the upper swing body 3 and detects an object existing on the left side of the upper swing body 3.
- the right sensor 70R is attached to, for example, the right end of the upper surface of the upper swing body 3 and detects an object existing on the right side of the upper swing body 3.
- the object detection device 70 may only acquire environmental information around the excavator 100 that serves as the basis for object detection (for example, captured images, or reflected-wave data for detection waves such as millimeter waves and lasers transmitted to the surroundings), and the specific object detection process, the process of distinguishing the type of the object, and the like may be executed outside the object detection device 70 (for example, by the controller 30).
- the image pickup device 80 takes an image of the surroundings of the excavator 100 and outputs the captured image.
- the image pickup apparatus 80 includes a front camera 80F, a rear camera 80B, a left camera 80L, and a right camera 80R.
- the images captured by the image pickup device 80 (front camera 80F, rear camera 80B, left camera 80L, and right camera 80R, respectively) are taken into the display device DS.
- the image captured by the image pickup device 80 is taken into the controller 30 via the display device DS.
- the image captured by the image pickup device 80 may be taken into the controller 30 directly, without going through the display device DS.
- the front camera 80F is attached to the front end of the upper surface of the cabin 10 so as to be adjacent to the front sensor 70F, and images the front state of the upper swing body 3.
- the rear camera 80B is attached to, for example, the rear end of the upper surface of the upper swing body 3 so as to be adjacent to the rear sensor 70B, and images the rear state of the upper swing body 3.
- the left camera 80L is attached to the left end of the upper surface of the upper swing body 3 so as to be adjacent to the left sensor 70L, for example, and images the state of the left side of the upper swing body 3.
- the right camera 80R is attached to the right end of the upper surface of the upper swing body 3 so as to be adjacent to the right sensor 70R, and images the state of the right side of the upper swing body 3.
- when the object detection device 70 includes an image pickup device such as a monocular camera or a stereo camera, some or all of the functions of the image pickup device 80 may be integrated into the object detection device 70.
- for example, when the front sensor 70F includes an image pickup device, the functions of the front camera 80F may be integrated into the front sensor 70F.
- the same applies to the functions of the rear camera 80B, the left camera 80L, and the right camera 80R when the rear sensor 70B, the left sensor 70L, and the right sensor 70R each include an image pickup device.
- the orientation detection device 85 is configured to detect information regarding the relative relationship between the orientation of the upper swing body 3 and the orientation of the lower traveling body 1 (hereinafter referred to as "direction information").
- the orientation detection device 85 may be composed of a combination of a geomagnetic sensor attached to the lower traveling body 1 and a geomagnetic sensor attached to the upper swing body 3.
- the orientation detection device 85 may be composed of a combination of a GNSS (Global Navigation Satellite System) receiver attached to the lower traveling body 1 and a GNSS receiver attached to the upper swing body 3.
- the orientation detection device 85 may be configured by a resolver attached to the electric motor.
- the orientation detection device 85 may be arranged, for example, at a center joint provided in connection with the turning mechanism 2 that realizes the relative rotation between the lower traveling body 1 and the upper swing body 3. The detection information from the orientation detection device 85 is taken into the controller 30.
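As a rough illustration of the direction information described above, the relative angle between the upper swing body and the lower traveling body can be derived from two heading readings (for example, from the pair of geomagnetic sensors). This is a minimal sketch under that assumption, not the patented implementation:

```python
def relative_orientation(upper_heading_deg, lower_heading_deg):
    """Relative angle of the upper swing body with respect to the
    lower traveling body, normalized to the range [-180, 180)."""
    diff = (upper_heading_deg - lower_heading_deg) % 360.0
    if diff >= 180.0:
        diff -= 360.0
    return diff

# Example: upper body heading 350 deg, undercarriage heading 10 deg
# -> the upper body is rotated 20 deg counterclockwise (-20.0).
print(relative_orientation(350.0, 10.0))
```

A resolver on the electric turning motor would yield this relative angle directly, whereas the two-sensor arrangements above require the subtraction shown here.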
- the communication device 90 is an arbitrary device that performs short-range communication of a predetermined method with various devices in the work site (for example, a position information management device that measures and manages the position information of other construction machines and workers in the work site), other excavators 100 around the excavator 100, and the like.
- the position information management device is, for example, a terminal device installed in a temporary office or the like in the work site of the excavator 100.
- the terminal device may be, for example, a stationary terminal device such as a desktop computer terminal, or may be a mobile terminal such as a smartphone, a tablet terminal, or a laptop computer terminal.
- the position information management device may be, for example, an edge server installed in a temporary office in the work site of the excavator 100 or at a place relatively close to the work site (for example, a communication facility such as a station building or a base station near the work site). Further, the position information management device may be, for example, a cloud server installed in a facility such as a management center outside the work site of the excavator 100.
- the communication device 90 may be, for example, a Bluetooth communication module, a WiFi communication module, or the like.
- the display device DS is attached to a place that is easily visible to an operator or the like seated in the cockpit inside the cabin 10, and displays various information images.
- the display device DS is, for example, a liquid crystal display or an organic EL (Electroluminescence) display.
- the display device DS displays an image captured by the image pickup device 80, or a converted image obtained by subjecting the captured image to predetermined conversion processing (for example, a viewpoint-converted image or a composite image obtained by synthesizing a plurality of captured images).
- the display device DS includes a display control unit DSa, an image display unit DS1, and an operation input unit DS2.
- the display control unit DSa performs control processing for displaying various information images on the image display unit DS1 in response to an operation input by an operator or the like to the operation input unit DS2. Similar to the controller 30, the display control unit DSa may be configured around a computer including, for example, a CPU, a memory device, an auxiliary storage device, an interface device, and the like.
- the function of the display control unit DSa may be provided outside the display device DS, and may be realized by, for example, the controller 30.
- the image display unit DS1 is an area portion for displaying an information image on the display device DS.
- the image display unit DS1 is composed of, for example, a liquid crystal panel, an organic EL panel, or the like.
- the operation input unit DS2 receives the operation input related to the display device DS.
- the operation input signal corresponding to the operation input to the operation input unit DS2 is taken into the display control unit DSa.
- the operation input unit DS2 may accept various operation inputs related to the excavator 100 other than the display device DS.
- the operation input signals corresponding to the various operation inputs to the operation input unit DS2 are taken into the controller 30 directly or indirectly via the display control unit DSa.
- the operation input unit DS2 includes, for example, a touch panel mounted on a liquid crystal panel or an organic EL panel as the image display unit DS1.
- the operation input unit DS2 may include an arbitrary operation member such as a touch pad, a button, a switch, a toggle, a lever, etc., which are separate from the image display unit DS1.
- an operation input unit that receives various operation inputs related to the excavator 100, other than those related to the display device DS, may be provided separately from the display device DS (operation input unit DS2); an example is the lever button LB.
- the lever button LB is provided on the operation device 26 and receives a predetermined operation input regarding the excavator 100.
- the lever button LB is provided at the tip of the operating lever as the operating device 26.
- the operator or the like can operate the lever button LB while operating the operation lever (for example, the lever button LB can be pressed with the thumb while holding the operation lever by hand).
- FIG. 4 is a diagram illustrating an example of an object detection method.
- the object detection device 70 detects an object around the excavator 100 by using a trained model composed mainly of a neural network DNN.
- the neural network DNN is a so-called deep neural network having one or more intermediate layers (hidden layers) between the input layer and the output layer.
- a weighting parameter representing the connection strength with the lower layer is defined for each of the plurality of neurons constituting each intermediate layer. The neural network DNN is configured so that the neurons in each layer output, to the neurons in the lower layer through a threshold function, the sum of the values obtained by multiplying the input values from the plurality of neurons in the upper layer by the weighting parameter defined for each of those neurons.
- machine learning, specifically deep learning, is performed on the neural network DNN to optimize the above-mentioned weighting parameters.
- the neural network DNN receives the environmental information (for example, a captured image) acquired by the object detection device 70 as the input signal x, and can output, as the output signal y, the probability (prediction probability) that an object of each type corresponding to a predetermined monitoring target list exists.
- for example, the output signal y1 output from the neural network DNN represents that the prediction probability that a "person" exists around the excavator 100, specifically within the range in which the object detection device 70 acquires environmental information, is 10%.
- the neural network DNN is, for example, a convolutional neural network (CNN).
- a CNN is a neural network to which existing image processing techniques (convolution processing and pooling processing) are applied. Specifically, the CNN extracts feature amount data (a feature map) smaller in size than the captured image by repeating a combination of convolution processing and pooling processing on the captured image acquired by the object detection device 70. The pixel value of each pixel of the extracted feature map is then input to a neural network composed of a plurality of fully connected layers, and the output layer of that neural network can output, for example, the prediction probability that an object exists for each type of object.
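As a hedged sketch of the final step described above, mapping output-layer scores to a prediction probability per monitoring-target type, the following toy Python normalizes raw class scores into probabilities. The target list and the scores are illustrative stand-ins, not values from the specification:

```python
import math

# Illustrative monitoring-target list (not from the specification).
MONITORING_TARGETS = ["person", "truck", "dump_truck", "utility_pole"]

def softmax(scores):
    """Normalize raw scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict_probabilities(class_scores):
    """Map raw output-layer scores to a prediction probability for
    each object type in the monitoring-target list."""
    return dict(zip(MONITORING_TARGETS, softmax(class_scores)))

probs = predict_probabilities([2.0, 0.5, 0.2, -1.0])
# Probabilities sum to 1; "person" gets the highest value here.
```

In the patented system these per-type probabilities would come from the trained model itself; this only illustrates the shape of the output signal y.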
- alternatively, a captured image acquired by the object detection device 70 may be input as the input signal x, and the position and size of the object in the captured image (that is, the occupied area of the object on the captured image) and the type of the object may be output as the output signal y. That is, the neural network DNN may be configured to detect an object on the captured image (determine the occupied area portion of the object on the captured image) and determine the classification of the object. Further, in this case, the output signal y may be configured in an image data format in which information regarding the occupied area of the object and its classification is superimposed on the captured image serving as the input signal x.
- the object detection device 70 can specify the relative position (distance and direction) of the object from the excavator 100 based on the position and size of the occupied area of the object in the captured image output from the trained model (neural network DNN). This is because the object detection device 70 (front sensor 70F, rear sensor 70B, left sensor 70L, and right sensor 70R) is fixed to the upper swing body 3 and its imaging range (angle of view) is predetermined (fixed).
- in this case, the output signal y1 output from the neural network DNN includes the coordinates of the position where the "person" exists around the excavator 100, specifically within the range in which the object detection device 70 acquires environmental information. When the detected position is within the monitoring area, the object detection device 70 can determine that an object to be monitored has been detected in the monitoring area.
- the neural network DNN may have neural networks respectively corresponding to a process of extracting an occupied area (window) in which an object exists in the captured image and a process of specifying the type of the object in the extracted area. That is, the neural network DNN may have a configuration in which object detection and object classification are performed step by step. Further, for example, the neural network DNN may have neural networks respectively corresponding to a process of classifying an object and defining an occupied area (bounding box) of the object for each grid cell obtained by dividing the entire area of the captured image into a predetermined number of partial areas, and a process of combining the occupied areas of the objects for each type based on the grid cells and determining the final occupied area of the object. That is, the neural network DNN may have a configuration in which object detection and object classification are performed in parallel.
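The combination of per-grid-cell occupied areas into a final occupied area can be sketched, under a strong simplification (a plain union of same-type boxes, instead of the non-maximum-suppression-style merging a real single-shot detector would use), as:

```python
def merge_boxes(boxes):
    """Combine per-grid-cell occupied areas (x1, y1, x2, y2) of the
    same object type into one final occupied area (their union).
    Simplified stand-in for the merging step described in the text."""
    x1 = min(b[0] for b in boxes)
    y1 = min(b[1] for b in boxes)
    x2 = max(b[2] for b in boxes)
    y2 = max(b[3] for b in boxes)
    return (x1, y1, x2, y2)

# Two neighboring grid cells both report part of the same "person".
cells = [(10, 20, 30, 40), (25, 35, 50, 60)]
print(merge_boxes(cells))  # (10, 20, 50, 60)
```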
- the object detection device 70 calculates, for example, the prediction probability for each type of object on the captured image for each predetermined control cycle.
- the object detection device 70 may further increase the current prediction probability when the current determination result and the previous determination result match. For example, if an object appearing in a predetermined area on the captured image that was determined to be a "person" (y1) during the previous object detection process is continuously determined to be a "person" (y1) this time as well, the prediction probability of the current "person" (y1) determination may be further increased.
- as a result, when the same object continues to be detected, its prediction probability is calculated to be relatively high. Therefore, the object detection device 70 can suppress erroneous determinations in which the prediction probability of an object of a certain type is calculated to be relatively low due to some noise even though an object of that type actually exists.
- the object detection device 70 may make the determination regarding an object on the captured image in consideration of operations such as traveling and turning of the excavator 100. This is because, even when an object around the excavator 100 is stationary, the position of the object on the captured image may move due to the traveling or turning of the excavator 100, and the object may then fail to be recognized as the same object. For example, the image area determined to be a "person" (y1) in the current process may differ from the image area determined to be a "person" (y1) in the previous process due to the traveling or turning of the excavator 100.
- in this case, if the image area determined to be a "person" (y1) in the current process is within a predetermined range from the image area determined to be a "person" (y1) in the previous process, the object detection device 70 may regard the two as the same object and make a continuous match determination (that is, a determination that the same object is being detected continuously).
- the image area used in the current determination may include, in addition to the image area used in the previous determination, an image area within a predetermined range from that image area.
- the object detection device 70 can make a continuous match determination with respect to the same object around the excavator 100.
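The continuous match determination and probability boost described above might look roughly like the following toy logic; the boost amount, the distance threshold, and the data layout are all illustrative assumptions rather than values from the specification:

```python
def update_prediction(prev, curr, boost=0.1):
    """If the same object type is detected again at (nearly) the same
    image position, raise the current prediction probability.
    `boost` and the default matching range are illustrative."""
    same_type = prev["type"] == curr["type"]
    dx = curr["pos"][0] - prev["pos"][0]
    dy = curr["pos"][1] - prev["pos"][1]
    nearby = (dx * dx + dy * dy) ** 0.5 <= prev.get("range", 50)
    if same_type and nearby:
        curr = dict(curr, prob=min(1.0, curr["prob"] + boost))
    return curr

prev = {"type": "person", "pos": (100, 80), "prob": 0.6}
curr = {"type": "person", "pos": (110, 85), "prob": 0.6}
print(round(update_prediction(prev, curr)["prob"], 3))  # 0.7
```

Allowing the match window to move with the machine's own travel or turning, as the text suggests, would amount to shifting `prev["pos"]` by the estimated ego-motion before the distance check.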
- the object detection device 70 may detect an object around the excavator 100 by using an object detection method based on any machine learning other than the method using the neural network DNN.
- for example, for each type of object, a trained model representing boundaries that classify, in a multivariable space of local feature amounts, the range corresponding to that type of object and the range not corresponding to it may be generated by supervised learning.
- the machine learning (supervised learning) method applied to the generation of information about the boundary may be, for example, a support vector machine (SVM: Support Vector Machine), a k-nearest neighbor method, a mixed Gaussian distribution model, or the like.
- the object detection device 70 can detect an object based on the trained model, according to whether the local feature amount acquired from the captured image falls within the range of being an object of the predetermined type or within the range of not being an object of that type.
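A minimal sketch of the boundary-based decision described above, using a linear decision function with stand-in weights (a real SVM would learn the weights and bias offline through supervised learning; the 2-D feature space and numbers here are illustrative):

```python
def svm_classify(features, weights, bias):
    """Decide on which side of the learned boundary a local feature
    vector falls: a positive margin means 'object of this type'.
    weights/bias are illustrative stand-ins for a trained model."""
    margin = sum(w * f for w, f in zip(weights, features)) + bias
    return margin > 0.0, margin

# Toy 2-D feature space with the boundary x0 + x1 = 1.
is_person, margin = svm_classify([0.9, 0.8], [1.0, 1.0], -1.0)
print(is_person)  # True
```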
- FIGS. 5 and 6 are diagrams for explaining a first example of the operation related to the information sharing function of the excavator support system SYS according to the present embodiment.
- FIG. 5 is a diagram illustrating a situation in which object detection information is shared among excavators 100 in the same work site (work area 400) by the information sharing function.
- FIG. 6 is a diagram for explaining the recognition state of surrounding objects for each excavator 100 in the same work site (work area 400); more specifically, it is a diagram showing the recognition state of surrounding objects for each excavator 100 during the operation related to the information sharing function of FIG. 5.
- hereinafter, the excavator 100 during excavation work is referred to as the excavator 100A, and the traveling excavator 100 is referred to as the excavator 100B.
- the excavator 100A is excavating the construction target area 401 in the work area 400, and a pile 402 of excavated earth and sand is formed on the right side of the excavator 100A. Further, in the work area 400, the excavator 100B is traveling so as to pass on the left side of the excavator 100A and the construction target area 401. Further, the worker W is working in the monitoring area relatively close to the excavator 100A, between the excavator 100A and the band-shaped range through which the excavator 100B travels.
- the object detection device 70 of the excavator 100A cannot recognize the face of the worker W from the captured image and, depending on the object detection algorithm, may fail to detect the worker W as an object (person) to be monitored. In this example, the object detection device 70 of the excavator 100A cannot detect the worker W even though the worker W has actually entered the monitoring area close to the excavator 100A, where contact avoidance control should be performed. As a result, the alarm device 49, which should be activated, does not operate, resulting in a failure-to-alarm state, and although a control command should be output to the pressure reducing valve 50 and the control valve 60 to restrict the operation of the excavator 100, the operation of the excavator 100 is not restricted.
- therefore, if the operator of the excavator 100A turns the upper swing body 3 to the right to discharge the excavated earth and sand onto the pile 402 without noticing the worker W, the positional relationship between the rear part of the upper swing body 3 of the excavator 100A and the worker W becomes very close, and in the worst case, there is a possibility of contact.
- on the other hand, the excavator 100B is traveling so as to pass in front of the worker W, and the worker W is working with his face directed forward as seen from the excavator 100B. Therefore, although the distance from the excavator 100B to the worker W is long, there is a high possibility that the object detection device 70 of the excavator 100B can recognize the face of the worker W from the captured image and detect the worker W. In this example, the object detection device 70 of the excavator 100B can detect the worker W.
- the controller 30 of the excavator 100B acquires the object detection information related to the detection of the worker W from the object detection device 70, and notifies the excavator 100A of that object detection information from the excavator 100B through the communication device 90.
- the controller 30 of the excavator 100A can recognize that the worker W is present at a close position on the left side of the excavator 100A based on the object detection information received from the excavator 100B through the communication device 90. Therefore, the controller 30 of the excavator 100A can activate the alarm device 49 to notify the operator and the surrounding worker W that the worker W has been detected in the monitoring area around the excavator 100A.
- thereby, the operator of the excavator 100A can temporarily suspend the work of the excavator 100A, the worker W can move away from the excavator 100A, and each of them can take actions to ensure safety. In this way, the excavator support system SYS can improve, through the information sharing function, the safety of the work area 400 in which the excavators 100A and 100B work.
- for example, the controller 30 of the excavator 100B can acquire and confirm, from the position information management device in the work site through the communication device 90, the position information of the excavator 100B itself in the local coordinate system having the reference point RP of the work area 400 as its origin (hereinafter, the "local coordinate system"). Further, the controller 30 of the excavator 100B can confirm the relative position of the worker W with respect to the excavator 100B based on the object detection information of the object detection device 70. Further, the controller 30 of the excavator 100B can confirm the relative position of the excavator 100A, which is performing excavation work, with respect to the excavator 100B based on the object detection information of the object detection device 70.
- the controller 30 of the excavator 100B can derive the positional relationship between the excavator 100A and the worker W on the local coordinate system by using this information. Therefore, the controller 30 of the excavator 100B can notify the excavator 100A of information regarding the positional relationship between the excavator 100A and the worker W through the communication device 90.
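The derivation of the positional relationship on the local coordinate system can be sketched as a plain 2-D coordinate transform. The axis conventions (x forward, y left, counterclockwise headings) and all the numbers here are illustrative assumptions, not values taken from the specification:

```python
import math

def to_local(own_pos, own_heading_deg, rel):
    """Convert a position measured relative to the machine (x forward,
    y left) into the site-local coordinate system with origin RP.
    The axis conventions are an illustrative assumption."""
    th = math.radians(own_heading_deg)
    x = own_pos[0] + rel[0] * math.cos(th) - rel[1] * math.sin(th)
    y = own_pos[1] + rel[0] * math.sin(th) + rel[1] * math.cos(th)
    return (x, y)

# Excavator 100B at local (20, 10), heading 90 deg; it sees the
# worker W 5 m ahead and excavator 100A 8 m ahead, 3 m to the right.
w = to_local((20.0, 10.0), 90.0, (5.0, 0.0))
a = to_local((20.0, 10.0), 90.0, (8.0, -3.0))

# Positional relationship between 100A and W on the local frame.
a_to_w = (w[0] - a[0], w[1] - a[1])
print(tuple(round(v, 2) for v in a_to_w))  # (-3.0, -3.0)
```

This is the information the excavator 100B could notify to the excavator 100A: the worker W's position relative to the excavator 100A, expressed in a coordinate frame both machines share.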
- in this example, the object detection device 70 of the excavator 100A cannot detect the worker W, whereas the object detection device 70 of the excavator 100B can. In this way, the excavator 100B (object detection device 70) can also interpolate and monitor the blind spot region of the object detection device 70 of the excavator 100A.
- the controller 30 of the excavator 100B may acquire the position information of the excavator 100B in the local coordinate system based on the detection information of a positioning device (for example, a GNSS receiver) mounted on the excavator 100B and information regarding the reference point RP of the local coordinate system defined in advance; the same applies to the excavator 100A.
- the controller 30 of the excavator 100B may acquire the position information of the excavator 100B in an absolute coordinate system (for example, a world geodetic system represented by latitude, longitude, and altitude) instead of the local coordinate system, and may derive the positional relationship between the excavator 100A and the worker W in that coordinate system; the same applies to the excavator 100A.
- the controller 30 of the excavator 100A can acquire and confirm the position information of the excavator 100A itself in the local coordinate system from the position information management device in the work site through the communication device 90. Further, the controller 30 of the excavator 100A can confirm the positional relationship between the excavator 100A itself and the worker W by receiving the notification from the traveling excavator 100B. Therefore, the controller 30 of the excavator 100A can recognize, on the local coordinate system, the relative position of the worker W as seen from the excavator 100A by using this information. The controller 30 of the excavator 100A may then activate the alarm device 49 or restrict the operation of the excavator 100A, for example by braking or stopping it, after confirming that the worker W is in the monitoring area.
- at this time, a correspondence relationship regarding the safety level, such as whether to continue the operation, slow down the operation, or stop the operation, may be set in advance. This correspondence relationship may also be set in advance for each actuator based on the type of the detected object.
- in this example, the excavator 100A receives the object detection information from one excavator 100B, but it may further receive object detection information from other excavators 100. That is, one excavator 100 may receive object detection information from a plurality of excavators 100 working in its surroundings. In this case, the one excavator 100 may comprehensively judge the object detection information received from the plurality of excavators 100 to determine the presence or absence of a monitoring target such as a surrounding worker. Specifically, among the object detection information received from the plurality of excavators 100, object detection information affirming the existence of a certain monitoring target may conflict with object detection information denying it that is notified from an excavator 100 at a position where that monitoring target could be detected.
- in that case, the controller 30 of the one excavator 100 on the receiving side may, for example, prioritize safety and preferentially adopt the object detection information affirming the existence of the monitoring target. Alternatively, emphasizing the balance between the safety of the excavator 100 and the deterioration of workability due to false alarms, the controller 30 of the one excavator 100 on the receiving side may determine which information to adopt by comparing the number of pieces of object detection information affirming the existence of the monitoring target with the number of pieces denying it, or by comparing the accuracy information of the object detection devices 70 of the excavators 100 that are the transmission sources of the object detection information.
- in the latter case, the controller 30 of the one excavator 100 compares the two and can adopt the result with the higher identification accuracy. For example, suppose that, for an object existing at the same position, the controller 30 of the one excavator 100 identifies it as wood with an identification rate of 50%, while the controller 30 of another excavator 100 in the surroundings identifies it as a person with an identification rate of 60%. In this case, the controller 30 of the one excavator 100 adopts the identification result of the controller 30 of the other excavator 100, on which the object detection device 70 with the higher accuracy is mounted.
- alternatively, the controller 30 of the one excavator 100 may compare the two and control its own machine based on the information with the higher degree of safety. For example, suppose that, for an object existing at the same position, the controller 30 of the one excavator 100 makes a determination to continue the operation (that is, a determination with low safety) based on an identification result of wood with an identification rate of 50%, while the controller 30 of another excavator 100 in the surroundings makes a determination to stop the operation (that is, a determination with high safety) based on an identification result of a person with an identification rate of 30%. In this case, even though the identification result of the controller 30 of the other excavator 100 is a determination of a person with an identification rate of only 30%, the controller 30 of the one excavator 100 controls its own machine based on the determination result with the relatively higher degree of safety, that is, the determination result of the other excavator 100 in the surroundings.
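The safety-first adoption rule described above might be sketched as follows; the action names, the safety ordering, and the accuracy tie-break are illustrative assumptions, not part of the specification:

```python
# Illustrative safety ordering: higher value = safer action.
SAFETY_ORDER = {"continue": 0, "slow_down": 1, "stop": 2}

def fuse_reports(reports):
    """Adopt the judgment with the highest safety level among the
    reports from one's own machine and surrounding excavators;
    ties are broken by detector accuracy (a simplifying assumption)."""
    return max(reports, key=lambda r: (SAFETY_ORDER[r["action"]],
                                       r.get("accuracy", 0.0)))

own = {"source": "100A", "action": "continue", "accuracy": 0.5}
peer = {"source": "100B", "action": "stop", "accuracy": 0.3}
print(fuse_reports([own, peer])["action"])  # stop
```

Here the peer's "stop" judgment wins despite its lower identification rate, mirroring the example in the text where the 30% "person" identification overrides the 50% "wood" identification because it is the safer determination.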
- conversely, the information sharing function may also be used to avoid false alarms of the alarm device 49 on the excavator 100.
- for example, suppose the object detection device 70 of the excavator 100A erroneously detects a monitoring target that does not actually exist at the position of the worker W in FIGS. 5 and 6. In this case, the object detection device 70 of the excavator 100B is likely to determine that no monitoring target exists on the left side of the excavator 100A and to output object detection information indicating that the monitoring target does not exist.
- the controller 30 of the excavator 100B transmits the object detection information denying the existence of the monitoring target from the excavator 100B to the excavator 100A through the communication device 90. The controller 30 of the excavator 100A can then give priority to the notification from the excavator 100B based on certain determination criteria, determine that the monitoring target does not exist, and cancel the operation of the alarm device 49 or stop the alarm device 49 after its operation has started, and cancel the operation restriction of the excavator 100 or release the operation restriction of the excavator 100 after the restriction has started.
- the determination criteria may include, for example, that the accuracy information of the object detection device 70 of the excavator 100B, which is the transmission source of the object detection information, exceeds a certain standard, and that the information on the existence probability (prediction probability) of the monitoring target included in the object detection information, when it is determined that no monitoring target exists, is below a certain standard.
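A toy version of these determination criteria for accepting a "no monitoring target" notification; the threshold values are illustrative, not values from the specification:

```python
def accept_negative_report(report, min_accuracy=0.8, max_prob=0.1):
    """Accept a 'no monitoring target' notification only if the
    sender's detector accuracy is high enough AND the reported
    existence (prediction) probability is low enough.
    The thresholds are illustrative assumptions."""
    return (report["accuracy"] >= min_accuracy
            and report["existence_prob"] <= max_prob)

# A high-accuracy sender reporting a very low existence probability
# would allow the receiving excavator to cancel its alarm.
print(accept_negative_report({"accuracy": 0.9, "existence_prob": 0.05}))
```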
- the excavator 100B determines whether or not the monitoring target exists in the monitoring area of the excavator 100B, and also determines whether or not the monitoring target exists outside the monitoring area of the excavator 100B.
- the controller 30 of the excavator 100B stores each determination result (for example, information on the presence or absence of a monitoring target, the type of the monitoring target, the position of the monitoring target, and the like) in a predetermined storage unit (for example, an auxiliary storage device).
- the excavator 100A also determines whether or not a monitoring target exists within the monitoring area of the excavator 100A, and also determines whether or not a monitoring target exists outside the monitoring area of the excavator 100A.
- similarly, the controller 30 of the excavator 100A stores each determination result (for example, information on the presence or absence of a monitoring target, the type of the monitoring target, the position of the monitoring target, and the like) in a predetermined storage unit (for example, an auxiliary storage device). Therefore, the blind spot regions of the object detection devices 70 of the excavators 100 can be mutually interpolated and monitored. Further, the determination of whether or not a monitoring target exists outside the monitoring area of the excavator 100 is executed even when the excavator 100 is not in operation.
- the excavator 100A may receive, in place of or in addition to the excavator 100B, object detection information from a stationary device that is installed at a fixed point in the work area 400 and includes an object detection device similar to the object detection device 70. That is, the excavator support system SYS may include, in addition to the plurality of excavators 100, the above-mentioned stationary device arranged at a position relatively close to the plurality of excavators 100 (for example, in the work site (work area) where the plurality of excavators 100 work).
- thereby, the excavator 100A can receive object detection information regarding the presence or absence of surrounding objects not only from the excavator 100B but also from the stationary device.
- the work area information shared among the plurality of excavators 100 may be information on the construction area in the work area.
- for example, the controller 30 of one excavator 100 may transmit, to the other excavators 100 through the communication device 90, information about the target construction surface corresponding to the shape of a groove set on that excavator 100 (for example, the side surfaces and the bottom surface of the groove).
- the information about the target construction surface specifying the shape of the groove may be set by an operation input of the operator through the operation input unit DS2, or may be set automatically by recognizing, through images captured by the image pickup apparatus 80, for example, the wall surfaces and bottom surface of a part of the groove that has already been excavated, sheet piles installed on the wall surfaces, and the like.
- further, the controller 30 of one excavator 100 may transmit information about the work range virtual surface set on that excavator 100 to the other excavators 100 through the communication device 90.
- the information about the work range virtual surface may be set by an operation input of the operator through the operation input unit DS2, or may be set automatically by recognizing, through images captured by the image pickup apparatus 80 or the like, for example, a plurality of road cones that define the work range, or obstacles (for example, fences, utility poles, and electric wires).
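As an illustration of how a work range virtual surface delimited by recognized road cones might be used, the following sketch builds a boundary polygon from cone positions and checks whether a given point lies inside it. The ray-casting test and the coordinate values are assumptions for illustration; the text does not describe the geometry check:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is pt=(x, y) inside the polygon given by poly vertices?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at height y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Cone positions (e, n) recognized in the captured image, ordered along the boundary:
cones = [(0, 0), (10, 0), (10, 8), (0, 8)]
print(point_in_polygon((5, 4), cones))   # a point inside the work range
print(point_in_polygon((12, 4), cones))  # a point outside the work range
```

A polygon of this kind, expressed in the shared local coordinate system, is something one excavator could transmit to another as "information about the work range virtual surface".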
- the excavator support system SYS includes a drone 700 in addition to a plurality of excavators 100.
- FIG. 7 is a diagram illustrating a third example of the operation related to the information sharing function of the excavator support system SYS according to the present embodiment.
- in this example, the excavators 100A and 100B are in the same situation as in the first example (FIGS. 5 and 6) described above, and the explanation proceeds on the assumption that a drone 700 having an object detection function similar to that of the excavator 100 is further flying above the work area 400.
- the worker W in the work area 400 is facing away from the excavator 100A and is working in a state where his face cannot be seen from it. Therefore, in this example, even when the acquired captured image is input to the trained model (neural network DNN), the object detection device 70 of the excavator 100A outputs a prediction probability of only 10% that a "person" exists, and cannot detect the worker W at the position (e1, n1, h1) on the local coordinate system.
- on the other hand, the excavator 100B is traveling so as to pass in front of the worker W, and the worker W is working with his face visible when viewed from the excavator 100B. Therefore, when the acquired captured image is input to the trained model (neural network DNN), the object detection device 70 of the excavator 100B outputs a prediction probability of 80% that a "person" exists, and can detect the worker W at the position (e1, n1, h1) on the local coordinate system. The controller 30 of the excavator 100B therefore transmits the object detection information regarding the detection of the worker W, acquired from the object detection device 70, to the excavator 100A through the communication device 90, as in the first example described above.
- similarly, the drone 700 is flying over the front side of the worker W, and the worker W is working with his face visible when viewed from the drone 700. Therefore, when the captured image acquired by the image pickup device mounted on the drone 700 is input to the trained model (neural network), a prediction probability of 80% that a "person" exists is output, and the worker W at the position (e1, n1, h1) on the local coordinate system can be detected. The drone 700 therefore transmits the object detection information regarding the detection of the worker W to the excavators 100A and 100B through a predetermined communication device mounted on the drone 700.
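The detection decision described in these examples — a trained model outputs a prediction probability that a "person" exists, which is then accepted or rejected — can be sketched as follows. The 0.5 cutoff and the report layout are assumptions; the text gives only the 10% and 80% probabilities, not a threshold:

```python
DETECTION_THRESHOLD = 0.5  # hypothetical cutoff; not specified in the text

def detect_person(prediction_probability, position):
    """Turn the trained model's 'person' probability into a detection report, or None."""
    if prediction_probability >= DETECTION_THRESHOLD:
        return {"type": "person", "position": position, "probability": prediction_probability}
    return None

undetected = detect_person(0.10, (1, 1, 0))  # excavator 100A: 10% -> worker W not detected
report = detect_person(0.80, (1, 1, 0))      # excavator 100B / drone 700: 80% -> detected
```

The non-`None` report is the kind of object detection information the excavator 100B and the drone 700 transmit to the excavator 100A.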
- the drone 700 may detect an object by using environmental information or an object detection method different from that of the excavator 100 (object detection device 70).
- as described above, the excavator 100A cannot detect the worker W with its own object detection device 70, but it can receive, from the excavator 100B and the drone 700, object detection information regarding the detection of the worker W at the coordinates (e1, n1, h1) of the local coordinate system. As a result, the excavator 100A can recognize, through the information sharing function with the excavator 100B and the drone 700, the existence of the worker W that cannot be detected by its own object detection device 70. Further, the excavator 100A can receive the object detection information from the drone 700 in addition to the object detection information from the excavator 100B.
- the excavator 100A can improve the detection accuracy of surrounding objects.
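One way the excavator 100A might combine the reports it receives for the same local coordinates is independent-probability (noisy-OR) fusion. This particular fusion rule is an assumption: the text states only that receiving multiple reports improves detection accuracy, not how they are combined:

```python
def fused_probability(probs):
    """Noisy-OR fusion of independent detection probabilities for one position."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)   # probability that no source would fire if no person exists
    return 1.0 - p_none

# Excavator 100A (10%), excavator 100B (80%), drone 700 (80%) all report (e1, n1, h1):
p = fused_probability([0.10, 0.80, 0.80])
print(round(p, 3))  # → 0.964
```

Under this rule the fused probability (about 96%) clears any reasonable threshold even though the excavator 100A's own 10% output would not, matching the behavior the example describes.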
- the excavator 100 (100A, 100B) may have an information sharing function capable of receiving object detection information, in place of or in addition to the drone 700, from another device capable of detecting objects in the work area 400.
- the other device may be, for example, a fixed point camera installed in the work area 400.
- FIG. 8 is a diagram illustrating the operation related to the work site situation analysis function of the excavator 100. Specifically, it illustrates the process of analyzing the movement status (movement history) of the dump truck DT in the work site in the time series from time t1 to time tn (n: an integer of 3 or more) and grasping the travel road of the dump truck DT in the work site.
- the excavator 100 grasps the movement status of the dump truck DT in the work site in the time series from time t1 to time tn.
- at time t1, the excavator 100 is loading earth and sand onto the stopped dump truck DT.
- the excavator 100 can grasp the position of the dump truck DT at the time of loading from the coordinates of the dump truck DT on the local coordinate system in the work site at time t1, as detected by the object detection device 70.
- at time tk, the loading work of the excavator 100 has been completed, and the dump truck DT is moving toward the entrance/exit of the work site in order to carry out the earth and sand.
- the excavator 100 can grasp the position of the dump truck DT at the time of carrying out from the coordinates of the dump truck DT on the local coordinate system in the work site at time tk, as detected by the object detection device 70.
- at time tn, the dump truck DT has reached the entrance/exit of the work site.
- in this way, the excavator 100 (controller 30) can grasp the movement of the dump truck DT from time t1 (at the time of loading) to time tn (when passing through the entrance/exit of the work site).
- the controller 30 can grasp the travel road (travel route) of the vehicle such as the dump truck DT at the work site by analyzing the movement history from the time t1 to the time tn.
- the travel road includes a loading place 811 for the dump truck DT, a turning point 812 where the dump truck DT turns when being carried out and carried in, and a carry-in/carry-out road 813 on which the dump truck DT travels toward the entrance/exit of the work site.
- the controller 30 may grasp not only the movement history of the dump truck DT but also the position of a building (for example, a temporary office) in the work site detected by the object detection device 70.
- usually, no fixed road is laid at the work site of the excavator 100, and in many cases there is no road information or the like indicating the travel road of dump trucks and the like at the work site.
- the installation location of a temporary building at the work site may be changed from the plan depending on the situation at the work site.
- the travel route of a dump truck or the like may also change frequently depending on the progress of the work at the work site, the weather, and the like. Therefore, for example, it is difficult to grasp the situation of the work site only from information representing its current state.
- on the other hand, the excavator 100 (controller 30) can grasp the situation of the work site, such as the travel road, by analyzing the movement history of vehicles in the work site, such as the dump truck DT, using the time-series object detection information.
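A minimal sketch of inferring the travel road from the time-series movement history is given below, using a grid-occupancy count: cells the dump truck occupied repeatedly across the time series are treated as part of the road. The grid method, cell size, and visit threshold are assumptions; the text does not specify the analysis:

```python
from collections import Counter

def travel_road_cells(positions, cell_size=5.0, min_visits=2):
    """Grid cells the vehicle occupied at least min_visits times across the time series."""
    counts = Counter((int(e // cell_size), int(n // cell_size)) for e, n in positions)
    return {cell for cell, c in counts.items() if c >= min_visits}

# (e, n) positions of the dump truck DT detected from time t1 to tn (hypothetical values):
history = [(2, 2), (3, 2), (8, 2), (13, 3), (14, 3), (13, 4), (22, 3), (23, 3)]
road = travel_road_cells(history, cell_size=5.0, min_visits=2)
```

Cells visited only once (a single pass-through detection) are discarded, so the surviving cells approximate the habitual travel road rather than one-off positions.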
- the excavator 100 may alert a person such as a worker when the person enters a high-risk place (for example, a range relatively close to the travel road), based on the grasped situation of the work site.
- the controller 30 may activate the alarm device 49, for example, to alert the operator.
- the controller 30 may vibrate the mobile terminal by transmitting a predetermined signal to the mobile terminal possessed by the operator by using the communication device 90 to alert the operator.
- for example, the controller 30 may activate the alarm device 49, or may transmit a predetermined signal to the mobile terminal of the worker W by using the communication device 90.
- the excavator 100 can improve the safety of the work site.
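The alerting step — warning when a person enters a range relatively close to the travel road — might be sketched as a simple distance check. The 3 m threshold and the sampled road points are assumptions made for illustration:

```python
import math

ALERT_DISTANCE = 3.0  # metres; hypothetical threshold for "relatively close"

def should_alert(worker_pos, road_points):
    """True if the worker is within ALERT_DISTANCE of any sampled travel-road point."""
    return any(math.dist(worker_pos, p) < ALERT_DISTANCE for p in road_points)

road = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]  # sampled travel-road centreline (e, n)
print(should_alert((5.0, 2.0), road))   # worker close to the road: alert
print(should_alert((5.0, 9.0), road))   # worker far from the road: no alert
```

On a `True` result, the controller 30 would activate the alarm device 49 or signal the worker's mobile terminal through the communication device 90, as described above.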
- FIG. 9 is a schematic view showing another example of the configuration of the excavator support system SYS.
- the excavator support system SYS of this example includes a support device 200 and a management device 300 in addition to the plurality of excavators 100.
- the excavator support system SYS manages the plurality of excavators 100 with the management device 300.
- the number of support devices 200 included in the excavator support system SYS may be one or more.
- the number of management devices 300 included in the excavator support system SYS may be one or more.
- the support device 200 is communicably connected to the management device 300 through a predetermined communication line. Further, the support device 200 may be communicably connected to the excavator 100 through a predetermined communication line.
- the predetermined communication line may include, for example, a mobile communication network with base stations as its ends, a satellite communication network using a communication satellite, and a short-range wireless communication network based on communication standards such as Bluetooth (registered trademark) or Wi-Fi.
- the support device 200 is a user terminal used by a user such as an operator or owner of the excavator 100, a worker or supervisor at the work site, or a manager or worker of the management device 300 (hereinafter, a "support device user").
- the support device 200 is, for example, a mobile terminal such as a laptop computer terminal, a tablet terminal, or a smartphone. Further, the support device 200 may be, for example, a stationary terminal device such as a desktop computer terminal.
- the management device 300 is communicably connected to the excavator 100 and the support device 200 through a predetermined communication line.
- the management device 300 is, for example, a cloud server installed in a management center or the like outside the work site. Further, the management device 300 may be, for example, an edge server installed in a temporary office or the like in the work site or a communication facility (for example, a base station or a station building) relatively close to the work site. Further, the management device 300 may be, for example, a terminal device used in the work site.
- the terminal device may be, for example, a mobile terminal such as a laptop computer terminal, a tablet terminal, or a smartphone, or may be, for example, a stationary terminal device such as a desktop computer terminal.
- At least one of the support device 200 and the management device 300 may be provided with a display device or an operation device for remote control.
- the operator who uses the support device 200 or the management device 300 may remotely control the excavator 100 by using the remote control operation device.
- the support device 200 or the management device 300 equipped with the remote control operation device is communicably connected to the controller 30 mounted on the excavator 100 through a predetermined communication line such as a short-range wireless communication network, a mobile communication network, or a satellite communication network.
- in this case, information images similar to the contents that can be displayed on the display device DS in the cabin 10 may be displayed on the display devices of the support device 200 and the management device 300.
- the image information representing the surrounding state of the excavator 100 may be generated based on images captured by the image pickup apparatus 80 or the like.
- the management device 300 may perform a function corresponding to, for example, the position information management device of the above-mentioned example.
- the controller 30 of the excavator 100 may transmit various information to at least one of the support device 200 and the management device 300 by using, for example, the communication device 90.
- the controller 30 may transmit, for example, at least one of the output of the object detection device 70 (object detection information) and the captured image of the image pickup device 80 to at least one of the support device 200 and the management device 300.
- the controller 30 of the excavator 100 may transmit, for example, information on the analysis result by the work site situation analysis function (that is, information representing the state of the work site) to at least one of the support device 200 and the management device 300.
- in this way, the excavator support system SYS can store various information acquired by the excavator 100, such as the object detection information and the information indicating the situation at the work site, in predetermined storage units in the support device 200 and the management device 300. Further, the support device user and the management device user can check the object detection information, the information indicating the situation at the work site, and the like through the display devices of the support device 200 and the management device 300.
- in this way, the excavator support system SYS can share the information related to the excavator 100 (the information acquired by the excavator 100) with the support device user and the management device user. Further, in this example, the excavator support system SYS can store the object detection information of the excavator 100 in predetermined storage units in the support device 200 and the management device 300. For example, the support device 200 and the management device 300 can store, in chronological order, information about monitoring targets outside the monitoring area of the excavator 100, such as the type and the position of each monitoring target.
- the information about the monitoring target stored in the storage unit of the support device 200 or the management device 300 may be information about the type or the position of a monitoring target that is outside the monitoring area of the excavator 100 but within the monitoring area of another excavator 100.
- the controller 30 acquires information about the work area around the construction machine, and the communication device 90 transmits the information acquired by the controller 30 to other excavators 100 around the excavator 100.
- the information about the work area acquired by the controller 30 may include the result of a predetermined determination (for example, a determination of the presence or absence of an object in the work area, a determination of the type of an object, and the like) made based on an image captured by the camera (object detection device 70) that images the work area around the excavator 100.
- the information regarding the work area acquired by the controller 30 may include information regarding the construction area of the work area (for example, information regarding the target construction surface and information regarding the work range virtual surface).
- the communication device 90 may receive information about the work area from a predetermined device located around the excavator 100 (for example, another excavator 100, a stationary device having a stationary camera that images the surrounding work area, a drone 700 flying above the work area, or the like).
- thereby, the excavator 100 on the transmitting side of the work area information can also use information about the work area acquired by a predetermined device such as another excavator 100, a stationary device, or the drone 700.
- the communication device 90 may receive, from the stationary device, an image captured by the stationary camera, or information about the work area based on that captured image (for example, a determination result regarding the presence or absence of an object in the work area).
- thereby, the excavator 100 can specifically use the image captured by the stationary camera included in the stationary device and the information about the work area based on that captured image.
- in the above-described embodiment, the plurality of excavators 100 transmit and receive work area information and the like to and from each other; however, in place of or in addition to the excavators 100, a plurality of construction machines including other types of construction machines may be configured to transmit and receive work area information and the like to and from each other. That is, the excavator support system SYS according to the above-described embodiment may include, in place of or in addition to the excavators 100, other construction machines such as bulldozers, wheel loaders, road machines such as asphalt finishers, and forestry machines provided with harvesters and the like.
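The exchange described throughout, in which each construction machine transmits its work area information to the other machines around it, can be sketched as a minimal in-memory broadcast. All class and attribute names here are hypothetical, and a real system would use the communication device 90 over a network rather than shared memory:

```python
class WorkAreaBus:
    """Minimal stand-in for the communication links between machines:
    one machine publishes work-area information, every other machine receives it."""
    def __init__(self):
        self.machines = []

    def register(self, machine):
        self.machines.append(machine)

    def broadcast(self, sender, info):
        for m in self.machines:
            if m is not sender:        # a machine does not re-receive its own report
                m.received.append(info)

class Machine:
    def __init__(self, name, bus):
        self.name = name
        self.received = []             # work-area information from other machines
        bus.register(self)

bus = WorkAreaBus()
excavator_a = Machine("excavator 100A", bus)
excavator_b = Machine("excavator 100B", bus)
bulldozer = Machine("bulldozer", bus)  # other construction machine types may join too

excavator_b_report = {"type": "person", "position": (1, 1, 0)}
bus.broadcast(excavator_b, excavator_b_report)
```

Because the bus is agnostic to the machine type, adding a bulldozer, wheel loader, or drone to the exchange requires no change to the broadcast logic, mirroring the generalization stated above.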
Landscapes
- Engineering & Computer Science (AREA)
- Structural Engineering (AREA)
- Mining & Mineral Resources (AREA)
- Civil Engineering (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Evolutionary Computation (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Astronomy & Astrophysics (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Component Parts Of Construction Machinery (AREA)
- Operation Control Of Excavators (AREA)
Abstract
Description
A construction machine is provided, comprising: an acquisition unit that acquires information about a work area around the construction machine; and a transmission unit that transmits the information acquired by the acquisition unit to other construction machines around the construction machine.
A support system is provided that includes a plurality of construction machines located within a predetermined work area, wherein each of the plurality of construction machines comprises: an acquisition unit that acquires information about the work area; and a transmission unit that transmits the information acquired by the acquisition unit to the other construction machines.
An example of the excavator support system SYS according to the present embodiment will be described with reference to FIGS. 1 to 8.
First, the excavator support system SYS according to this example will be described with reference to FIG. 1.
Next, the specific configuration of the excavator support system SYS (excavator 100) will be described with reference to FIGS. 2 and 3 in addition to FIG. 1.
Next, a specific example of the object detection method will be described with reference to FIG. 4.
Next, with reference to FIGS. 5 and 6, the operation of the excavator support system SYS, specifically, a first example of the operation related to the information sharing function among the plurality of excavators 100 in the excavator support system SYS, will be described.
Next, a second example of the operation related to the information sharing function of the excavator support system SYS will be described.
Next, a third example of the operation related to the information sharing function of the excavator support system SYS will be described with reference to FIG. 7.
Next, the operation related to the work site situation analysis function of the excavator 100 will be described with reference to FIG. 8.
Next, another example of the excavator support system SYS will be described with reference to FIG. 9.
Next, the effects of the excavator support system SYS according to the present embodiment will be described.
Although the embodiment has been described in detail above, the present disclosure is not limited to this specific embodiment, and various modifications and changes are possible within the scope of the gist described in the claims.
49 alarm device
50 pressure reducing valve
60 control valve
70 object detection device (camera)
70B rear sensor (camera)
70F front sensor (camera)
70L left sensor (camera)
70R right sensor (camera)
90 communication device (transmission unit)
100 excavator (construction machine)
700 drone
SYS excavator support system (support system)
Claims (10)
- A construction machine comprising: an acquisition unit that acquires information about a work area around the construction machine; and a transmission unit that transmits the information acquired by the acquisition unit to other construction machines around the construction machine.
- The construction machine according to claim 1, further comprising a camera that images the work area around the construction machine, wherein the information about the work area acquired by the acquisition unit includes a determination result of a predetermined determination about the work area made based on an image captured by the camera.
- The construction machine according to claim 2, wherein the predetermined determination includes a determination of the presence or absence of an object in the work area.
- The construction machine according to claim 1, wherein the information about the work area acquired by the acquisition unit includes information about a construction area of the work area.
- The construction machine according to claim 1, further comprising a receiving unit that receives information about the work area from a predetermined device located around the construction machine.
- The construction machine according to claim 5, wherein the predetermined device includes a stationary device having a stationary camera that images the work area, and the receiving unit receives, from the stationary device, an image captured by the stationary camera or information about the work area based on the captured image.
- The construction machine according to claim 5, wherein the predetermined device includes the other construction machine, and the receiving unit receives information about the work area acquired by the other construction machine.
- A support system including a plurality of construction machines located within a predetermined work area, wherein each of the plurality of construction machines comprises: an acquisition unit that acquires information about the work area; and a transmission unit that transmits the information acquired by the acquisition unit to the other construction machines.
- The support system according to claim 8, further comprising a storage unit, wherein the information acquired by the acquisition unit is information about a monitoring target outside a monitoring area of the construction machine, and the information about the monitoring target is stored in the storage unit.
- The support system according to claim 9, wherein the information about the monitoring target stored in the storage unit is information about a monitoring target within a monitoring area of the other construction machine.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021509668A JP7387718B2 (ja) | 2019-03-27 | 2020-03-27 | 建設機械、支援システム |
EP20777179.1A EP3951084A4 (en) | 2019-03-27 | 2020-03-27 | CONSTRUCTION MACHINE AND AUXILIARY SYSTEM |
KR1020217031425A KR20210139297A (ko) | 2019-03-27 | 2020-03-27 | 건설기계, 지원시스템 |
CN202080024831.6A CN113661296A (zh) | 2019-03-27 | 2020-03-27 | 施工机械、支援系统 |
US17/448,396 US20220002978A1 (en) | 2019-03-27 | 2021-09-22 | Construction machine and support system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019061771 | 2019-03-27 | ||
JP2019-061771 | 2019-03-27 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/448,396 Continuation US20220002978A1 (en) | 2019-03-27 | 2021-09-22 | Construction machine and support system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020196874A1 true WO2020196874A1 (ja) | 2020-10-01 |
Family
ID=72609575
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/014204 WO2020196874A1 (ja) | 2019-03-27 | 2020-03-27 | 建設機械、支援システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220002978A1 (ja) |
EP (1) | EP3951084A4 (ja) |
JP (1) | JP7387718B2 (ja) |
KR (1) | KR20210139297A (ja) |
CN (1) | CN113661296A (ja) |
WO (1) | WO2020196874A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023120598A1 (ja) * | 2021-12-22 | 2023-06-29 | 住友建機株式会社 | 情報処理システム、プログラム、及び情報処理方法 |
DE112022001908T5 (de) | 2021-03-31 | 2024-02-08 | Sumitomo Heavy Industries, Ltd. | Baumaschine und baumaschinen-unterstützungssystem |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11773567B2 (en) * | 2020-07-22 | 2023-10-03 | Baidu Usa Llc | Engineering machinery equipment, and method, system, and storage medium for safety control thereof |
US20230339734A1 (en) * | 2022-04-26 | 2023-10-26 | Deere & Company | Object detection system and method on a work machine |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006106685A1 (ja) * | 2005-03-31 | 2006-10-12 | Hitachi Construction Machinery Co., Ltd. | 作業機械の周囲監視装置 |
WO2017208997A1 (ja) * | 2016-05-31 | 2017-12-07 | 株式会社小松製作所 | 形状計測システム、作業機械及び形状計測方法 |
JP6290497B2 (ja) | 2017-06-16 | 2018-03-07 | 住友重機械工業株式会社 | ショベル |
JP2019061771A (ja) | 2017-09-25 | 2019-04-18 | トヨタ自動車株式会社 | プラズマ処理装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3007239B2 (ja) | 1993-03-31 | 2000-02-07 | 新日本製鐵株式会社 | 光磁気記録媒体用ガーネット二層膜及び光磁気記録ディスク |
JP2002188183A (ja) * | 2000-10-12 | 2002-07-05 | Komatsu Ltd | 作機機械の管理装置 |
JP4720386B2 (ja) * | 2005-09-07 | 2011-07-13 | 株式会社日立製作所 | 運転支援装置 |
JP4964321B2 (ja) * | 2010-04-16 | 2012-06-27 | 三菱電機株式会社 | 乗員保護装置 |
CN106462962B (zh) * | 2014-06-03 | 2020-08-04 | 住友重机械工业株式会社 | 施工机械用人检测系统以及挖土机 |
CN104943689B (zh) * | 2015-06-03 | 2017-05-10 | 奇瑞汽车股份有限公司 | 一种汽车主动防撞系统的控制方法 |
JP6980391B2 (ja) * | 2017-01-25 | 2021-12-15 | 住友重機械工業株式会社 | 作業機械用周辺監視システム |
US10109198B2 (en) * | 2017-03-08 | 2018-10-23 | GM Global Technology Operations LLC | Method and apparatus of networked scene rendering and augmentation in vehicular environments in autonomous driving systems |
US10349011B2 (en) * | 2017-08-14 | 2019-07-09 | GM Global Technology Operations LLC | System and method for improved obstacle awareness in using a V2X communications system |
JP6960279B2 (ja) * | 2017-08-31 | 2021-11-17 | 古野電気株式会社 | 車載装置、基地局装置、映像情報提供システム、映像情報提供方法、及びコンピュータプログラム |
- 2020-03-27 KR KR1020217031425 patent/KR20210139297A/ko not_active Application Discontinuation
- 2020-03-27 WO PCT/JP2020/014204 patent/WO2020196874A1/ja unknown
- 2020-03-27 CN CN202080024831.6A patent/CN113661296A/zh active Pending
- 2020-03-27 JP JP2021509668A patent/JP7387718B2/ja active Active
- 2020-03-27 EP EP20777179.1A patent/EP3951084A4/en active Pending
- 2021-09-22 US US17/448,396 patent/US20220002978A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP3951084A4 |
Also Published As
Publication number | Publication date |
---|---|
EP3951084A4 (en) | 2022-05-18 |
KR20210139297A (ko) | 2021-11-22 |
JP7387718B2 (ja) | 2023-11-28 |
JPWO2020196874A1 (ja) | 2020-10-01 |
US20220002978A1 (en) | 2022-01-06 |
EP3951084A1 (en) | 2022-02-09 |
CN113661296A (zh) | 2021-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020196874A1 (ja) | 建設機械、支援システム | |
WO2020204007A1 (ja) | ショベル及び施工システム | |
JP7472034B2 (ja) | ショベル、ショベル支援システム | |
WO2020218455A1 (ja) | ショベル | |
WO2021025123A1 (ja) | ショベル、情報処理装置 | |
WO2020218454A1 (ja) | 表示装置、ショベル、情報処理装置 | |
EP3885495B1 (en) | Excavator and excavator control device | |
US20220341124A1 (en) | Shovel and remote operation support apparatus | |
US20240026654A1 (en) | Construction machine and support system of construction machine | |
US20230008338A1 (en) | Construction machine, construction machine management system, and machine learning apparatus | |
JP2021095718A (ja) | ショベル、情報処理装置 | |
CN113661295B (zh) | 挖土机 | |
WO2021149775A1 (ja) | 作業機械、情報処理装置 | |
JP2022154722A (ja) | ショベル | |
JP7488753B2 (ja) | 周辺監視装置 | |
JP7349947B2 (ja) | 情報処理装置、作業機械、情報処理方法、情報処理プログラム | |
WO2022210613A1 (ja) | ショベル及びショベルの制御装置 | |
JP2022085617A (ja) | 周辺監視システム、表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20777179 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021509668 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20217031425 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020777179 Country of ref document: EP Effective date: 20211027 |