WO2020204007A1 - Excavator and construction system - Google Patents
- Publication number: WO2020204007A1 (application PCT/JP2020/014696)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- excavator
- image
- construction
- input
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
- E02F3/435—Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/08—Superstructures; Supports for superstructures
- E02F9/10—Supports for movable superstructures mounted on travelling or walking gears or on other superstructures
- E02F9/12—Slewing or traversing gears
- E02F9/121—Turntables, i.e. structure rotatable about 360°
- E02F9/123—Drives or control devices specially adapted therefor
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/2033—Limiting the movement of frames or implements, e.g. to avoid collision between implements and the cabin
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/22—Hydraulic or pneumatic drives
- E02F9/226—Safety arrangements, e.g. hydraulic driven fans, preventing cavitation, leakage, overheating
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/24—Safety devices, e.g. for preventing overload
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/2431—Multiple classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
Definitions
- This disclosure relates to excavators and construction systems.
- Although the above-mentioned excavator can detect a person who has entered a predetermined range set around the excavator, it merely compares the relative positional relationship between the excavator and that person; it does not grasp the situation at the work site.
- the excavator according to the embodiment of the present invention includes a lower traveling body, an upper swing body rotatably mounted on the lower traveling body, a storage device provided on the upper swing body, an information acquisition device for acquiring information on construction, and a control device for controlling a notification device. The control device determines a dangerous situation based on the information acquired by the information acquisition device.
- the excavator mentioned above can prevent a dangerous situation from occurring.
- FIG. 1 is a side view of the excavator 100.
- FIG. 2 is a top view of the excavator 100.
- FIG. 3 shows a configuration example of the basic system mounted on the excavator 100 of FIG.
- the lower traveling body 1 of the excavator 100 includes the crawler 1C.
- the crawler 1C is driven by a traveling hydraulic motor 2M as a traveling actuator mounted on the lower traveling body 1.
- the crawler 1C includes a left crawler 1CL and a right crawler 1CR
- the traveling hydraulic motor 2M includes a left traveling hydraulic motor 2ML and a right traveling hydraulic motor 2MR.
- the left crawler 1CL is driven by the left traveling hydraulic motor 2ML
- the right crawler 1CR is driven by the right traveling hydraulic motor 2MR.
- the upper swing body 3 is rotatably mounted on the lower traveling body 1 via the swivel mechanism 2.
- the swivel mechanism 2 is driven by a swivel hydraulic motor 2A as a swivel actuator mounted on the upper swivel body 3.
- the swivel actuator may be a swivel motor generator as an electric actuator.
- a boom 4 is attached to the upper swing body 3.
- An arm 5 is attached to the tip of the boom 4, and a bucket 6 as an end attachment is attached to the tip of the arm 5.
- the boom 4, arm 5, and bucket 6 form an excavation attachment AT, which is an example of an attachment.
- the boom 4 is driven by the boom cylinder 7, the arm 5 is driven by the arm cylinder 8, and the bucket 6 is driven by the bucket cylinder 9.
- the boom 4 is supported so as to be rotatable up and down with respect to the upper swing body 3.
- a boom angle sensor S1 is attached to the boom 4.
- the boom angle sensor S1 can detect the boom angle θ1, which is the rotation angle of the boom 4.
- the boom angle θ1 is, for example, an ascending angle from the state in which the boom 4 is most lowered. Therefore, the boom angle θ1 becomes maximum when the boom 4 is raised most.
- the arm 5 is rotatably supported with respect to the boom 4.
- An arm angle sensor S2 is attached to the arm 5.
- the arm angle sensor S2 can detect the arm angle θ2, which is the rotation angle of the arm 5.
- the arm angle θ2 is, for example, an opening angle from the most closed state of the arm 5. Therefore, the arm angle θ2 becomes maximum when the arm 5 is opened most.
- the bucket 6 is rotatably supported with respect to the arm 5.
- a bucket angle sensor S3 is attached to the bucket 6.
- the bucket angle sensor S3 can detect the bucket angle θ3, which is the rotation angle of the bucket 6.
- the bucket angle θ3 is an opening angle from the most closed state of the bucket 6. Therefore, the bucket angle θ3 becomes maximum when the bucket 6 is opened most.
- each of the boom angle sensor S1, the arm angle sensor S2, and the bucket angle sensor S3 is composed of a combination of an acceleration sensor and a gyro sensor, but may be composed of an acceleration sensor alone. The boom angle sensor S1 may also be a stroke sensor attached to the boom cylinder 7, a rotary encoder, a potentiometer, an inertial measurement unit, or the like. The same applies to the arm angle sensor S2 and the bucket angle sensor S3.
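As a rough illustration of why an acceleration sensor and a gyro sensor are combined, the following sketch fuses the two readings with a complementary filter: the gyro integrates smoothly but drifts, while the accelerometer is noisy but drift-free. The function name, blend factor, and update rate are all hypothetical; the patent does not specify how the outputs are combined.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend a gyro rate (rad/s) with an accelerometer-derived angle (rad).

    The gyro term tracks fast motion; the accelerometer term slowly
    corrects the accumulated drift.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: a stationary boom whose true angle is 0.5 rad.  Starting from a
# wrong estimate of 0.0, the filter converges to the accelerometer value.
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.5, dt=0.01)
print(round(angle, 3))  # → 0.5
```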
- the upper swing body 3 is provided with a cabin 10 as a driver's cab, and is equipped with a power source such as an engine 11. Further, a space recognition device 70, an orientation detection device 71, a positioning device 73, a body tilt sensor S4, a turning angular velocity sensor S5, and the like are attached to the upper swing body 3. Inside the cabin 10, an operating device 26, an operating pressure sensor 29, a controller 30, an information input device 72, a display device D1, a sound output device D2, and the like are provided.
- the side of the upper swing body 3 to which the excavation attachment AT is attached (+X side) is referred to as the front
- the side to which the counterweight is attached (−X side) is referred to as the rear.
- the operating device 26 is a device used by the operator to operate the actuator.
- the operating device 26 includes, for example, an operating lever and an operating pedal.
- the actuator includes at least one of a hydraulic actuator and an electric actuator.
- the operating device 26 is configured to supply the hydraulic oil discharged by the pilot pump 15 to the pilot ports of the corresponding control valves in the control valve 17 via pilot lines.
- the pressure of the hydraulic oil (pilot pressure) supplied to each of the pilot ports is a pressure corresponding to the operation direction and the operation amount of the operation device 26 corresponding to each of the hydraulic actuators.
- the operating device 26 may be an electrically controlled type instead of such a pilot pressure type.
- the control valve in the control valve 17 may be an electromagnetic solenoid type spool valve.
- the operating device 26 includes a left operating lever and a right operating lever, as shown in FIG.
- the left operating lever is used for turning and operating the arm 5.
- the right operating lever is used for operating the boom 4 and the bucket 6.
- the operating pressure sensor 29 is configured to be able to detect the content of the operation of the operating device 26 by the operator.
- the operating pressure sensor 29 detects the operating direction and operating amount of the operating device 26 corresponding to each of the actuators in the form of pressure (operating pressure), and outputs the detected value to the controller 30.
- the content of the operation of the operating device 26 may be detected by using a sensor other than the operating pressure sensor.
- the operating pressure sensor 29 includes a left operating pressure sensor and a right operating pressure sensor.
- the left operating pressure sensor detects, in the form of pressure, the content of the operator's operation of the left operating lever in the front-rear direction and in the left-right direction, and outputs the detected values to the controller 30.
- the contents of the operation are, for example, the lever operation direction and the lever operation amount (lever operation angle). The same applies to the right operating lever.
- the space recognition device 70 is configured to acquire information about the three-dimensional space around the excavator 100. Further, the space recognition device 70 may be configured to calculate the distance from the space recognition device 70 or the excavator 100 to the object recognized by the space recognition device 70.
- the space recognition device 70 is, for example, an ultrasonic sensor, a millimeter wave radar, a monocular camera, a stereo camera, a LIDAR, a distance image sensor, an infrared sensor, or the like.
- the space recognition device 70 includes the front camera 70F attached to the front end of the upper surface of the cabin 10, the rear camera 70B attached to the rear end of the upper surface of the upper swing body 3, the left camera 70L attached to the left end of the upper surface of the upper swing body 3, and the right camera 70R attached to the right end of the upper surface of the upper swing body 3.
- the front camera 70F may be omitted.
- the space recognition device 70 is, for example, a monocular camera having an image sensor such as a CCD or CMOS, and outputs the captured image to the display device D1.
- the space recognition device 70 need not rely on a captured image: when a LIDAR, a millimeter-wave radar, an ultrasonic sensor, a laser radar, or the like is used, it may transmit a large number of signals (laser light or the like) toward an object and detect the object from the reflected signals.
- the space recognition device 70 may be configured to detect an object existing around the excavator 100.
- the object is, for example, a terrain feature (an inclination, a hole, or the like), an electric wire, a utility pole, a person, an animal, a vehicle, a construction machine, a building, a wall, a helmet, a safety vest, work clothes, or a predetermined mark on a helmet.
- the space recognition device 70 may be configured to be able to identify at least one of the type, position, shape, and the like of the object.
- the space recognition device 70 may be configured so as to be able to distinguish between a person and an object other than a person.
- the orientation detection device 71 is configured to detect information regarding the relative relationship between the orientation of the upper swing body 3 and the orientation of the lower traveling body 1.
- the orientation detection device 71 may be composed of, for example, a combination of a geomagnetic sensor attached to the lower traveling body 1 and a geomagnetic sensor attached to the upper rotating body 3.
- the orientation detection device 71 may be composed of a combination of a GNSS receiver attached to the lower traveling body 1 and a GNSS receiver attached to the upper rotating body 3.
- the orientation detection device 71 may be a rotary encoder, a rotary position sensor, or the like.
- the orientation detection device 71 may be configured by a resolver.
- the orientation detection device 71 may be attached to, for example, a center joint provided in connection with the swivel mechanism 2 that realizes relative rotation between the lower traveling body 1 and the upper swivel body 3.
- the orientation detection device 71 may be composed of a camera attached to the upper swing body 3. In this case, the orientation detection device 71 applies known image processing to the image (input image) acquired by that camera to detect the image of the lower traveling body 1 included in the input image, and identifies the longitudinal direction of the lower traveling body 1 using a known image recognition technique. The orientation detection device 71 then derives the angle formed between the direction of the front-rear axis of the upper swing body 3 and the longitudinal direction of the lower traveling body 1. The direction of the front-rear axis of the upper swing body 3 is derived from the input image.
- the orientation detection device 71 can specify the longitudinal direction of the lower traveling body 1 by detecting the image of the crawler 1C.
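The angle derivation described above reduces to a small geometric computation: given the detected longitudinal direction of the crawler 1C and the front-rear axis of the upper swing body, the relative orientation is the signed angle between the two vectors. The vector representation and function name below are illustrative assumptions, not the patent's implementation.

```python
import math

def relative_orientation(crawler_dir, body_axis=(1.0, 0.0)):
    """Signed angle (rad) from the upper swing body's front-rear axis to
    the detected longitudinal direction of the lower traveling body."""
    cx, cy = crawler_dir
    bx, by = body_axis
    cross = bx * cy - by * cx  # z component of the 2-D cross product
    dot = bx * cx + by * cy
    return math.atan2(cross, dot)

# Crawler detected pointing perpendicular to the body axis in the image:
print(round(math.degrees(relative_orientation((0.0, 1.0))), 1))  # → 90.0
```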
- the orientation detection device 71 may be integrated with the controller 30.
- the information input device 72 is configured so that the operator of the excavator can input information to the controller 30.
- the information input device 72 is a switch panel installed in the vicinity of the image display unit 41 of the display device D1.
- the information input device 72 may be a touch panel arranged on the image display unit 41 of the display device D1, a dial or a cross button provided at the tip of the operation lever, or the like, and may be inside the cabin 10. It may be a voice input device such as a microphone arranged in.
- the information input device 72 may be a communication device. In this case, the operator can input information to the controller 30 via a communication terminal such as a smartphone.
- the positioning device 73 is configured to measure the current position.
- the positioning device 73 is a GNSS receiver, detects the position of the upper swing body 3, and outputs the detected value to the controller 30.
- the positioning device 73 may be a GNSS compass. In this case, the positioning device 73 can detect the position and orientation of the upper swing body 3.
- the body tilt sensor S4 detects the tilt of the upper swivel body 3 with respect to a predetermined plane.
- the body tilt sensor S4 is an acceleration sensor that detects the tilt angle around the front-rear axis (roll angle) and the tilt angle around the left-right axis (pitch angle) of the upper swing body 3 with respect to the horizontal plane.
- each of the front-rear axis and the left-right axis of the upper swing body 3 passes through, for example, the excavator center point, which is one point on the swivel axis of the excavator 100, and the two axes are orthogonal to each other.
- the turning angular velocity sensor S5 detects the turning angular velocity of the upper swing body 3. In this embodiment, it is a gyro sensor.
- the turning angular velocity sensor S5 may be a resolver, a rotary encoder, or the like.
- the turning angular velocity sensor S5 may detect the turning velocity.
- the turning speed may be calculated from the turning angular velocity.
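The calculation mentioned above is a simple product of angular velocity and radius. A minimal sketch (function name and the working radius value are hypothetical):

```python
def turning_speed(angular_velocity, radius):
    """Tangential speed (m/s) at a point `radius` metres from the swivel
    axis, given the turning angular velocity in rad/s."""
    return angular_velocity * radius

print(turning_speed(0.5, 8.0))  # → 4.0 (m/s at an assumed 8 m radius)
```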
- At least one of the boom angle sensor S1, the arm angle sensor S2, the bucket angle sensor S3, the body tilt sensor S4, and the turning angular velocity sensor S5 is also referred to as an attitude detection device.
- the posture of the excavation attachment AT is detected based on, for example, the outputs of the boom angle sensor S1, the arm angle sensor S2, and the bucket angle sensor S3.
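The posture computation can be illustrated with planar forward kinematics: given three joint angles, the bucket tip position follows from the link lengths. The sign convention used here (each angle taken as the absolute inclination of its link from horizontal) and the link lengths are hypothetical; the actual sensors report the relative angles defined earlier.

```python
import math

def bucket_tip_position(theta1, theta2, theta3,
                        l_boom=5.7, l_arm=2.9, l_bucket=1.5):
    """Planar (x, z) position of the bucket tip relative to the boom foot.

    Assumed convention: each theta is the absolute inclination of its
    link from horizontal, in radians; link lengths are illustrative metres.
    """
    x = (l_boom * math.cos(theta1)
         + l_arm * math.cos(theta2)
         + l_bucket * math.cos(theta3))
    z = (l_boom * math.sin(theta1)
         + l_arm * math.sin(theta2)
         + l_bucket * math.sin(theta3))
    return x, z

# Boom raised 30°, arm inclined -40°, bucket pointing straight down.
x, z = bucket_tip_position(math.radians(30), math.radians(-40), math.radians(-90))
```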
- the display device D1 is an example of a notification device, and is configured to be able to display various information.
- the display device D1 is a liquid crystal display installed in the cabin 10.
- the display device D1 may be a display of a communication terminal such as a smartphone.
- the sound output device D2 is another example of the notification device, and is configured to be able to output sound.
- the sound output device D2 includes at least one device that outputs sound to an operator inside the cabin 10 and a device that outputs sound to an operator outside the cabin 10.
- the sound output device D2 may be a speaker attached to the communication terminal.
- the controller 30 is a control device for controlling the excavator 100.
- the controller 30 is composed of a computer including a CPU, a volatile storage device VM (see FIG. 3), a non-volatile storage device NM (see FIG. 3), and the like. Then, the controller 30 reads the program corresponding to each function from the non-volatile storage device NM, loads it into the volatile storage device VM, and causes the CPU to execute the corresponding processing.
- each function includes, for example, a machine guidance function for guiding the operator's manual operation of the excavator 100, and a machine control function for supporting the operator's manual operation of the excavator 100 or for operating the excavator 100 automatically or autonomously.
- the controller 30 may have a contact avoidance function for automatically or autonomously operating or stopping the excavator 100 in order to avoid contact between the excavator 100 and an object existing in the monitoring area around it. Monitoring of objects around the excavator 100 can be performed not only inside but also outside the monitoring area. In this case, the controller 30 may be configured to detect the type of the object and the position of the object.
- the mechanical power transmission line is indicated by a double line
- the hydraulic oil line is indicated by a thick solid line
- the pilot line is indicated by a broken line
- the power line is indicated by a fine solid line
- the electric control line is indicated by a long-dot chain line.
- the basic system mainly includes the engine 11, the main pump 14, the pilot pump 15, the control valve 17, the operating device 26, the operating pressure sensor 29, the controller 30, the switching valve 35, the engine control device 74, the engine speed adjustment dial 75, the storage battery 80, the display device D1, the sound output device D2, the information acquisition device E1, and the like.
- the engine 11 is a diesel engine that employs isochronous control that maintains the engine speed constant regardless of the increase or decrease in load.
- the fuel injection amount, fuel injection timing, boost pressure, and the like in the engine 11 are controlled by the engine control device 74.
- the rotating shaft of the engine 11 is connected to the rotating shafts of the main pump 14 and the pilot pump 15 as hydraulic pumps.
- the main pump 14 is connected to the control valve 17 via a hydraulic oil line.
- the pilot pump 15 is connected to the operating device 26 via a pilot line.
- the pilot pump 15 may be omitted.
- the function carried out by the pilot pump 15 may be realized by the main pump 14. That is, in addition to the function of supplying hydraulic oil to the control valve 17, the main pump 14 may have a function of reducing the pressure of the hydraulic oil with a throttle or the like and then supplying it to the operating device 26 or the like.
- the control valve 17 is a hydraulic control device that controls the hydraulic system of the excavator 100.
- the control valve 17 is connected to hydraulic actuators such as a left traveling hydraulic motor 2ML, a right traveling hydraulic motor 2MR, a boom cylinder 7, an arm cylinder 8, a bucket cylinder 9, and a swing hydraulic motor 2A.
- the control valve 17 includes a plurality of spool valves corresponding to the respective hydraulic actuators. Each spool valve is configured to be displaceable according to the pilot pressure so that the opening area of the PC port and the opening area of the CT port can be increased or decreased.
- the PC port is a port that forms a part of an oil passage that connects the main pump 14 and the hydraulic actuator.
- the CT port is a port that forms a part of an oil passage that connects the hydraulic actuator and the hydraulic oil tank.
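The spool valve behavior described here (PC and CT port opening areas varying with pilot pressure) can be sketched as a dead-band-plus-linear mapping. All numerical values, and the simplifying assumption that both ports open together, are illustrative only and not taken from the text.

```python
def spool_openings(pilot_pressure, p_min=0.5, p_max=3.0, a_max=100.0):
    """Illustrative PC/CT opening areas (mm^2) versus pilot pressure (MPa):
    zero inside a dead band, then linear up to a_max at full pressure."""
    if pilot_pressure <= p_min:
        return 0.0, 0.0  # spool stays in neutral
    frac = min((pilot_pressure - p_min) / (p_max - p_min), 1.0)
    area = frac * a_max
    return area, area  # simplification: PC and CT ports open together

print(spool_openings(0.3))  # → (0.0, 0.0): below the dead band
print(spool_openings(3.5))  # → (100.0, 100.0): spool fully shifted
```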
- the switching valve 35 is configured to be able to switch between the enabled state and the disabled state of the operating device 26.
- the effective state of the operating device 26 is a state in which the operator can operate the hydraulic actuator using the operating device 26.
- the invalid state of the operating device 26 is a state in which the operator cannot operate the hydraulic actuator using the operating device 26.
- the switching valve 35 is a gate lock valve as a solenoid valve configured to operate in response to a command from the controller 30.
- the switching valve 35 is arranged in the pilot line connecting the pilot pump 15 and the operating device 26, and is configured so that the shutoff / opening of the pilot line can be switched in response to a command from the controller 30.
- the operating device 26 is enabled when the gate lock lever (not shown) is pulled up and the gate lock valve is opened, and is disabled when the gate lock lever is pushed down and the gate lock valve is closed.
- the display device D1 has a control unit 40, an image display unit 41, and an operation unit 42 as an input unit.
- the control unit 40 is configured to be able to control the image displayed on the image display unit 41.
- the control unit 40 is composed of a computer including a CPU, a volatile storage device, a non-volatile storage device, and the like.
- the control unit 40 reads the program corresponding to each functional element from the non-volatile storage device, loads it into the volatile storage device, and causes the CPU to execute the corresponding process.
- each functional element may be composed of hardware or may be composed of a combination of software and hardware.
- the image displayed on the image display unit 41 may be controlled by the controller 30 or the space recognition device 70.
- the operation unit 42 is a panel including a hardware switch.
- the operation unit 42 may be a touch panel.
- the display device D1 operates by receiving power supplied from the storage battery 80.
- the storage battery 80 is charged with electricity generated by the alternator 11a, for example.
- the electric power of the storage battery 80 is also supplied to the controller 30 and the like.
- the starter 11b of the engine 11 is driven by the electric power from the storage battery 80 to start the engine 11.
- the engine control device 74 transmits data related to the state of the engine 11 such as the cooling water temperature to the controller 30.
- the regulator 14a of the main pump 14 transmits data regarding the tilt angle of the swash plate to the controller 30.
- the discharge pressure sensor 14b transmits data regarding the discharge pressure of the main pump 14 to the controller 30.
- the oil temperature sensor 14c provided in the oil passage between the hydraulic oil tank and the main pump 14 transmits data regarding the temperature of the hydraulic oil flowing through the oil passage to the controller 30.
- the controller 30 can store these data in the volatile storage device VM and transmit them to the display device D1 when necessary.
- the engine speed adjustment dial 75 is a dial for adjusting the speed of the engine 11.
- the engine speed adjustment dial 75 transmits data regarding the set state of the engine speed to the controller 30.
- the engine speed adjustment dial 75 is configured so that the engine speed can be switched in four stages of SP mode, H mode, A mode, and IDLE mode.
- the SP mode is a rotation speed mode selected when it is desired to prioritize the amount of work, and uses the highest engine speed.
- the H mode is a rotation speed mode selected when it is desired to achieve both work load and fuel consumption, and uses the second highest engine speed.
- the A mode is a rotation speed mode selected when it is desired to operate the excavator 100 with low noise while giving priority to fuel consumption, and uses the third highest engine speed.
- the IDLE mode is a rotation speed mode selected when the engine 11 is desired to be in an idling state, and uses the lowest engine speed.
- the engine 11 is controlled so as to be constant at the engine speed corresponding to the speed mode set by the engine speed adjustment dial 75.
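The four-stage dial described above can be modeled as a lookup from mode to a constant target speed, consistent with the isochronous control mentioned earlier. The rpm values below are hypothetical; the text only fixes their ordering (SP > H > A > IDLE), not the actual figures.

```python
# Hypothetical target speeds (rpm) for the four dial positions.
MODE_RPM = {"SP": 2000, "H": 1800, "A": 1600, "IDLE": 1000}

def target_engine_speed(mode):
    """Constant target speed for the selected rotation speed mode,
    which the engine is then held at regardless of load."""
    return MODE_RPM[mode]

print(target_engine_speed("SP"))  # → 2000
```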
- the sound output device D2 is configured to attract the attention of those involved in the work of the excavator 100.
- the sound output device D2 may be composed of, for example, a combination of an indoor alarm device and an outdoor alarm device.
- the indoor alarm device is a device for calling the attention of the operator of the excavator 100 in the cabin 10, and includes, for example, at least one of a speaker, a vibration generator, and a light emitting device provided in the cabin 10.
- the indoor alarm device may be a display device D1 which is an example of the notification device.
- the outdoor alarm device is a device for calling the attention of an operator working around the excavator 100, and includes, for example, at least one of a speaker and a light emitting device provided outside the cabin 10.
- the speaker as the outdoor alarm device includes, for example, a traveling alarm device attached to the bottom surface of the upper swing body 3. Further, the outdoor alarm device may be a light emitting device provided on the upper swing body 3. However, the outdoor alarm device may be omitted. For example, when the space recognition device 70 functioning as an object detection device detects a predetermined object, the sound output device D2 may notify a person involved in the work of the excavator 100 to that effect. Further, the outdoor alarm device may be a portable information terminal device carried by a worker outside the cabin 10. The portable information terminal device is, for example, a smartphone, a tablet terminal, a smart watch, a helmet with a speaker, or the like.
- the notification device may be installed outside the excavator 100.
- the notification device may be attached to, for example, a pole or a steel tower installed at a work site.
- the controller 30 is configured to receive signals output by at least one of the information acquisition devices E1, execute various operations, and output control commands to at least one of the display device D1 and the sound output device D2.
- the information acquisition device E1 is configured to be able to acquire information related to construction.
- the information acquisition device E1 includes a boom angle sensor S1, an arm angle sensor S2, a bucket angle sensor S3, a body tilt sensor S4, a turning angular velocity sensor S5, a boom rod pressure sensor, a boom bottom pressure sensor, and an arm rod pressure sensor.
- the information acquisition device E1 acquires, as information about the excavator 100, at least one of, for example: boom angle, arm angle, bucket angle, body tilt angle, turning angular velocity, boom rod pressure, boom bottom pressure, arm rod pressure, arm bottom pressure, bucket rod pressure, bucket bottom pressure, boom stroke amount, arm stroke amount, bucket stroke amount, discharge pressure of the main pump 14, operating pressure of the operating device 26, information on the three-dimensional space around the excavator 100, information on the relative relationship between the orientation of the upper swing body 3 and the orientation of the lower traveling body 1, information input to the controller 30, and information on the current position. Further, the information acquisition device E1 may obtain information from other construction machines, flying objects, or the like.
- The flying object is, for example, a multicopter or an airship that acquires information about the work site. Further, the information acquisition device E1 may acquire work environment information.
- The work environment information is, for example, information on at least one of sediment characteristics, weather, altitude, and the like.
- the controller 30 mainly has a danger determination unit 30A as a functional element.
- The danger determination unit 30A may be implemented in hardware or software. Specifically, the danger determination unit 30A is configured to be able to determine whether or not a dangerous situation occurs based on the information acquired by the information acquisition device E1 and the information stored in the danger information database DB.
- the danger information database DB is stored in, for example, the non-volatile storage device NM in the controller 30.
- the danger information database DB may be provided in the management device 200, which will be described later, and may be configured to be able to communicate with the excavator 100 via the communication network.
- the danger information database DB is a collection of information systematically configured so that information on dangerous situations that can occur at the work site can be searched.
- The danger information database DB stores, for example, information on a dangerous situation brought about by the position of a hole excavated by the excavator 100 and the temporary placement position of a gutter block to be embedded in the hole.
- In the danger information database DB, at least one of the conditions of a dangerous situation, the degree of danger, and the like is defined using, for example, the depth of the hole excavated by the excavator 100, the volume of the gutter block, and the distance from the edge of the hole to the gutter block.
- the danger determination unit 30A derives the relative positional relationship between a plurality of objects such as a hole excavated by the excavator 100 and a gutter block as input information.
- FIG. 4 is a conceptual diagram showing an example of the relationship between the danger determination unit 30A and the danger information database DB.
- the danger determination unit 30A collates the derived input information with the reference information corresponding to the input information stored in the danger information database DB.
- the reference information corresponding to the input information is, for example, the reference information associated with the hole excavated by the excavator 100 and the gutter block among the plurality of reference information. Then, when the danger determination unit 30A determines that the situation represented by the input information matches or is similar to the situation represented by the reference information, it determines that a dangerous situation occurs.
- The danger determination unit 30A derives, as input information, the depth of the hole excavated by the excavator 100, the volume of the gutter block, the distance from the edge of the hole to the gutter block, and the like, based on the information acquired by the information acquisition device E1. The danger determination unit 30A then collates the derived input information with the reference information representing a dangerous situation stored in the danger information database DB. When the danger determination unit 30A determines that the situation represented by the input information matches or is similar to the situation represented by the reference information, it determines that a dangerous situation occurs.
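The collation step described above can be sketched as a simple threshold check. The following is a minimal illustration only; the database layout, field names, and all numeric threshold values are hypothetical assumptions, not values from this specification:

```python
# Minimal sketch of collating derived input information against reference
# information in a danger information database. All names and numeric
# thresholds are illustrative assumptions.

danger_db = {
    "hole_and_gutter_block": {
        "min_depth_m": 0.5,          # holes shallower than this are ignored
        "min_block_volume_m3": 0.2,  # blocks smaller than this are ignored
        "safe_edge_distance_m": 1.0, # minimum safe block-to-edge distance
    }
}

def is_dangerous(hole_depth_m, block_volume_m3, edge_distance_m, db):
    """Return True when the derived input information matches the
    reference conditions for a 'risk of block fall' situation."""
    ref = db["hole_and_gutter_block"]
    return (hole_depth_m >= ref["min_depth_m"]
            and block_volume_m3 >= ref["min_block_volume_m3"]
            and edge_distance_m < ref["safe_edge_distance_m"])

print(is_dangerous(1.2, 0.4, 0.6, danger_db))  # prints True
print(is_dangerous(1.2, 0.4, 2.5, danger_db))  # prints False
```

In practice the reference information could also encode ranges or graded danger levels rather than a single pass/fail condition.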
- The danger determination unit 30A may instead collate the input information with reference information representing a non-dangerous situation, and determine that a dangerous situation will occur when the situation represented by the input information neither matches nor is similar to the situation represented by the reference information. Further, the danger determination unit 30A may determine whether or not a dangerous situation occurs by using work environment information such as information on sediment characteristics or weather.
- When the danger determination unit 30A recognizes the positional relationship shown in FIG. 5 based on the input image acquired by the front camera 70F, which is an example of the information acquisition device E1, it determines that a dangerous situation will occur.
- FIG. 5 shows an example of an input image displayed on the display device D1 and acquired by the front camera 70F.
- The displayed input image includes the message window G0, the image G1 of the arm 5, the image G2 of the bucket 6, the image G3 of the hole excavated by the excavator 100, the image G4 of the gutter block temporarily placed near the hole, and a frame image G4F surrounding the image G4.
- the message window G0 indicates that the current danger level is level 4 and the cause is "risk of block fall".
- By performing image processing on the input image, the danger determination unit 30A recognizes that the gutter block exists and that the hole excavated by the excavator 100 exists, and derives the distance between the gutter block and the edge of the hole. When the danger determination unit 30A determines that the distance between the gutter block and the edge of the hole is less than the threshold value stored in the danger information database DB, it determines that a dangerous situation occurs.
- the danger determination unit 30A activates the notification device to notify the outside that a dangerous situation may occur.
- The danger determination unit 30A operates the display device D1 and the indoor alarm device to notify the operator of the excavator 100 to that effect. Further, the danger determination unit 30A may activate the outdoor alarm device to notify workers working around the excavator 100 to that effect.
- The determination result of whether or not a dangerous situation occurs may further vary depending on the position of the center of gravity of the gutter block, the size (width, height, length) of the gutter block, the size (width, height, length) of the hole, and the like. Therefore, the danger determination unit 30A may change the degree of danger (the degree of the unsafe situation) in steps.
- the danger determination unit 30A may notify the content of the dangerous situation.
- The danger determination unit 30A may cause the sound output device D2 to output a voice message conveying the content of a possible situation, such as "the edge of the hole may collapse", or may cause the display device D1 to display a text message conveying that content.
- FIG. 6 shows another example of the input image displayed on the display device D1 and acquired by the front camera 70F.
- The displayed input image includes the message window G0, the image G1 of the arm 5, the image G2 of the bucket 6, the image G3 of the hole excavated by the excavator 100, the image G4 of the gutter block temporarily placed near the hole, a frame image G4F surrounding the image G4, an image G5 of a worker who has entered the hole, and a frame image G5F surrounding the image G5.
- the message window G0 indicates that the current danger level is level 5 and the cause is "danger of serious accident".
- By performing image processing on the input image, the danger determination unit 30A recognizes that the gutter block exists, that the hole excavated by the excavator 100 exists, and that a worker is present in the hole. The danger determination unit 30A then derives the distance between the gutter block and the edge of the hole and the distance between the gutter block and the worker. When the danger determination unit 30A determines that the distance between the gutter block and the edge of the hole is less than a first threshold value stored in the danger information database DB, and that the distance between the gutter block and the worker is less than a second threshold value stored in the danger information database DB, it determines that a dangerous situation occurs.
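The two-threshold determination just described can be sketched as follows; the threshold values and the mapping to danger-level numbers are illustrative assumptions:

```python
# Hypothetical sketch of the two-threshold check described above.
FIRST_THRESHOLD_M = 1.0   # gutter block to hole edge (illustrative)
SECOND_THRESHOLD_M = 3.0  # gutter block to worker (illustrative)

def danger_level(block_to_edge_m, block_to_worker_m=None):
    """Return 0 (no dangerous situation), 4 ('risk of block fall',
    as in FIG. 5), or 5 ('danger of serious accident', as in FIG. 6)."""
    if block_to_edge_m >= FIRST_THRESHOLD_M:
        return 0
    if block_to_worker_m is not None and block_to_worker_m < SECOND_THRESHOLD_M:
        return 5  # a worker in the hole is also within range of the block
    return 4
```

The higher level when a worker is involved mirrors the different notification behavior described for FIG. 6.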
- the danger determination unit 30A may change at least one of the first threshold value and the second threshold value based on the size of the gutter block, the size of the hole, and the work environment information.
- When the danger determination unit 30A determines that a dangerous situation occurs, it operates the notification device in a manner different from that used in the situation shown in FIG. 5. This is because in the situation shown in FIG. 5 no worker is involved in the dangerous situation, whereas in the situation shown in FIG. 6 a worker is involved. Specifically, the danger determination unit 30A operates the notification device so that the attention of the operator of the excavator 100 and of the worker can be called more reliably.
- The danger determination unit 30A may be configured to estimate the construction status after a lapse of a predetermined time based on the information acquired by the information acquisition device E1, and to determine whether or not a dangerous situation will occur after the lapse of the predetermined time based on the information on the estimated construction status and the information stored in the danger information database DB.
- the danger determination unit 30A estimates the shape of the hole TR after a lapse of a predetermined time based on the shape of the hole TR excavated by the excavator 100.
- FIG. 7 is a top view of the work site where the excavator 100 is located.
- The broken line in FIG. 7 represents the shape of the hole TR after the lapse of a predetermined time as estimated by the danger determination unit 30A, that is, the shape of the not-yet-excavated hole TRx.
- the danger determination unit 30A derives the relative positional relationship between the unexcavated hole TRx and the gutter block BL as input information.
- the danger determination unit 30A recognizes the position of the gutter block BL based on the input image acquired by the left camera 70L.
- the danger determination unit 30A collates the derived input information with the reference information corresponding to the input information stored in the danger information database DB.
- When the danger determination unit 30A determines that the situation represented by the input information matches or is similar to the situation represented by the reference information, it determines that a dangerous situation may occur after the lapse of the predetermined time.
- The danger determination unit 30A derives the current shape of the hole TR excavated by the excavator 100 based on the information acquired by the information acquisition device E1, and estimates, from that current shape, the shape of the hole TRx after the lapse of a predetermined time. The danger determination unit 30A then derives, as input information, the distance X1 and the like from the edge of the hole TRx after the lapse of the predetermined time to the gutter block BL, and collates the derived input information with the reference information representing a dangerous situation stored in the danger information database DB. When the danger determination unit 30A determines that the situation represented by the input information matches or is similar to the situation represented by the reference information, it determines that a dangerous situation may occur after the lapse of the predetermined time.
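The prediction described above can be sketched as extrapolating the hole edge forward in time and re-running the distance check. The linear dig-rate model and all names below are simplifying assumptions for illustration only:

```python
# Hypothetical sketch: estimate where the hole edge will be after a
# predetermined time, then check the distance X1 to the gutter block BL.

def estimate_future_edge(edge_x_m, dig_rate_m_per_min, minutes):
    """Linearly extrapolate how far the hole edge advances (a simplifying
    assumption; the real estimate could use the planned hole shape)."""
    return edge_x_m + dig_rate_m_per_min * minutes

def future_danger(block_x_m, edge_x_m, dig_rate_m_per_min, minutes, threshold_m):
    """Return True when the block will be too close to the future edge."""
    future_edge = estimate_future_edge(edge_x_m, dig_rate_m_per_min, minutes)
    x1 = block_x_m - future_edge  # distance from the future edge to the block
    return x1 < threshold_m
```

With, say, an edge at 2.0 m advancing 0.1 m per minute, a block at 4.0 m becomes dangerous within ten minutes for a 1.5 m threshold, even though the current distance is safe.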
- the danger determination unit 30A may be configured so that it can determine whether or not a dangerous situation will occur in the future before the hole is excavated by the excavator 100.
- The danger determination unit 30A may determine whether or not a dangerous situation may occur in the future when the gutter block BL is temporarily placed as shown in FIG. 7. Alternatively, the danger determination unit 30A may determine that a dangerous situation may occur in the future when excavation of the hole is started near the temporarily placed gutter block BL as shown in FIG. 7.
- FIG. 8 shows yet another example of the input image displayed on the display device D1 and acquired by the front camera 70F.
- The displayed input image includes the message window G0, the image G1 of the arm 5, the image G2 of the bucket 6, the image G4 of the temporarily placed gutter block BL, the frame image G4F surrounding the image G4, and an image G6 of the hole to be excavated by the excavator 100.
- the message window G0 indicates that the current danger level is level 4 and the cause is "risk of block fall".
- the image G6 is generated based on the information related to the construction plan such as the design data stored in advance in the non-volatile storage device NM included in the controller 30.
- the image G6 may be generated based on the data regarding the posture of the excavation attachment at the present time, the data regarding the orientation of the upper swing body 3, and the like.
- The danger determination unit 30A recognizes the existence of the gutter block BL by performing image processing on the input image, and derives the distance between the edge of the hole to be excavated in the future and the gutter block BL. When the danger determination unit 30A determines that this distance is less than the threshold value stored in the danger information database DB, it determines that a dangerous situation will occur in the future.
- The danger determination unit 30A may recognize, by performing image processing on the input image, that the gutter block BL exists at a position other than the area set as its temporary storage place. In this case, the danger determination unit 30A may identify the area set as the temporary storage place of the gutter block BL based on the design data, and may determine that a dangerous situation may occur in the future based on the fact that the gutter block BL is temporarily placed outside that area. In this way, the danger determination unit 30A may determine whether or not a dangerous situation may occur in the future based on information regarding the arrangement of materials such as the gutter block BL.
- The danger determination unit 30A may recognize, by performing image processing on the input image, that a hole excavated by the excavator 100 exists, and derive the distance between the temporary storage place of a material such as the gutter block and the edge of the hole. When the danger determination unit 30A determines that the distance between the temporary storage place where the material has not yet been placed and the edge of the hole is less than the threshold value stored in the danger information database DB, it may determine that a dangerous situation may occur in the future. This is because if the material is temporarily placed in the temporary storage place according to the construction plan after the hole is excavated, the edge of the hole may collapse.
- When the danger determination unit 30A recognizes the positional relationship shown in FIG. 9 based on the input image acquired by the front camera 70F, which is an example of the information acquisition device E1, it may determine that a dangerous situation occurs.
- FIG. 9 shows yet another example of the input image displayed on the display device D1 and acquired by the front camera 70F.
- The displayed input image includes the message window G0, the image G1 of the arm 5, the image G2 of the bucket 6, an image G7 of a dump truck, an image G8 of an iron plate loaded on the loading platform of the dump truck, a frame image G8F surrounding the image G8, and an image G9 of a crane wire (wire rope) for lifting the iron plate as a suspended load.
- the message window G0 indicates that the current danger level is level 4 and the cause is "risk of collapse of cargo".
- By performing image processing on the input image, the danger determination unit 30A recognizes that a dump truck loaded with iron plates exists and that an iron plate is being lifted by the excavator 100 operating in crane mode, and derives the shape of the iron plate to be lifted, the number and positions of the suspension points, the horizontal distance between the center of gravity of the iron plate and the center of the suspension points, and the like. When the danger determination unit 30A determines, for example, that the relationship between the shape of the iron plate to be lifted and the number and positions of the suspension points matches or is similar to a relationship stored in the danger information database DB, it determines that a dangerous situation will occur.
- When the danger determination unit 30A determines that the horizontal distance between the center of gravity of the iron plate and the center of the plurality of suspension points is equal to or greater than the threshold value stored in the danger information database DB, it may determine that a dangerous situation occurs.
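The check above can be computed directly from the derived positions. The sketch below works in a horizontal plane with hypothetical names; the threshold value is an assumption:

```python
# Hypothetical sketch of the suspended-load check: a large horizontal
# offset between the plate's center of gravity and the centroid of the
# suspension points suggests the load may tilt or swing when lifted.

def lift_is_dangerous(plate_cog, hang_points, threshold_m):
    """plate_cog and hang_points are (x, y) tuples in metres."""
    cx = sum(p[0] for p in hang_points) / len(hang_points)
    cy = sum(p[1] for p in hang_points) / len(hang_points)
    offset = ((plate_cog[0] - cx) ** 2 + (plate_cog[1] - cy) ** 2) ** 0.5
    return offset >= threshold_m  # at or beyond the stored threshold
```

For two suspension points at (0, 0) and (2, 0), a center of gravity at (1, 0) is balanced, while one at (2, 1) is offset by about 1.4 m and would be flagged.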
- the danger determination unit 30A activates the notification device to notify the outside that a dangerous situation may occur.
- The danger determination unit 30A operates the display device D1 and the indoor alarm device to notify the operator of the excavator 100 to that effect. Further, the danger determination unit 30A may activate the outdoor alarm device to notify workers working around the excavator 100 to that effect.
- the danger determination unit 30A may notify the content of a dangerous situation that may occur.
- The danger determination unit 30A may output a voice message or a text message conveying the content of a possible situation, such as "the suspended load may shake".
- Although the danger determination unit 30A is described here as a functional element of the controller 30 mounted on the excavator 100, it may be installed outside the excavator 100. When the danger determination unit 30A predicts that the iron plate will tilt because the positions of the suspension points are not appropriate, it may increase the degree of danger for a worker entering a place where the tilt of the iron plate is predicted to occur.
- the danger determination unit 30A may be realized as a functional element of the management device 200 installed in the management center or the like outside the excavator 100.
- FIG. 10 is a diagram showing a configuration example of the excavator support system.
- the excavator support system is mainly composed of one or a plurality of excavators 100, one or a plurality of management devices 200, one or a plurality of support devices 300, and one or a plurality of fixed point cameras 70X.
- the excavator support system of FIG. 10 is composed of one excavator 100, one management device 200, one support device 300, and three fixed-point cameras 70X.
- the support device 300 is a mobile terminal such as a smartphone or tablet PC carried by the worker WK.
- Each of the excavator 100, the management device 200, the support device 300, and the fixed point camera 70X is communicably connected to each other via at least one of a mobile phone communication network, a satellite communication network, a wireless LAN communication network, and the like.
- Each of the three fixed-point cameras 70X is attached to a structure PL such as a pole or a steel tower installed at the work site, and the cameras are arranged apart from each other so that the entire work site is included in their combined imaging range.
- The danger determination unit 30A is configured to be able to determine whether or not a dangerous situation occurs based on the information acquired by the information acquisition device E1 attached to the excavator 100, the structure PL, or the like, and the information stored in the danger information database DB.
- the information acquisition device E1 includes a fixed point camera 70X.
- the danger information database DB is stored in the non-volatile storage device NM included in the management device 200.
- When the danger determination unit 30A recognizes a positional relationship as shown in FIGS. 5 to 8 based on the input image acquired by the fixed-point camera 70X, which is an example of the information acquisition device E1, it determines that a dangerous situation may occur.
- The danger determination unit 30A and the danger information database DB may be mounted on the support device 300, or may be separately mounted on two of the excavator 100, the management device 200, and the support device 300.
- the danger determination unit 30A may be configured so that it can determine whether or not a dangerous situation may occur at the stage of construction planning.
- the danger determination unit 30A is typically mounted on the management device 200 or the support device 300 to form a construction system that supports the creation of a construction plan.
- FIG. 11 is a diagram showing a configuration example of a construction system.
- the construction system is, for example, a computer system installed in a management center or the like, and is mainly composed of a display device MD1, a sound output device MD2, an information input device MD3, and a controller MD4.
- the display device MD1 is an example of a notification device, and is configured to be able to display various information.
- the display device MD1 is a liquid crystal display installed in the management center.
- the sound output device MD2 is another example of the notification device, and is configured to be able to output sound.
- the sound output device MD2 is a speaker that outputs sound to a manager who uses the construction system.
- the information input device MD3 is configured so that the manager who creates the construction plan can input information to the controller MD4.
- the information input device MD3 is a touch panel arranged on the image display unit of the display device MD1.
- the information input device MD3 may be a digitizer, a stylus, a mouse, a trackball, or the like.
- the controller MD4 is a control device for controlling the construction system.
- the controller MD4 is composed of a computer including a CPU, a volatile storage device VM, a non-volatile storage device NM, and the like. Then, the controller MD4 reads the program corresponding to each function from the non-volatile storage device NM, loads it into the volatile storage device VM, and causes the CPU to execute the corresponding processing.
- the danger determination unit 30A is realized as a functional element of the controller MD4.
- The image display unit of the display device MD1 in FIG. 11 displays an image shown when the manager creates a construction plan for burying gutter blocks.
- The displayed image includes an image G10 representing the range in which a hole for burying gutter blocks is to be excavated, an image G11 representing a normal gutter block, an image G12 representing a corner gutter block, an image G13 representing a cursor, an image G14 representing a selected (drag-operated) gutter block, and an image G15 representing a pop-up window containing a text message.
- the administrator can determine, for example, the range in which the hole for burying the gutter block is formed by arranging the image G10 at a desired position in a desired size and a desired shape.
- the range represented by the image G10 represents the range excavated by the excavator 100.
- the administrator can determine the shape and size of the image G10 by designating a desired range in the image display unit using, for example, a digitizer or the like.
- The administrator can determine the temporary placement position of a normal gutter block by moving the image G11 displayed in the material display area R1, or a duplicate thereof, to a desired position in the work site display area R2 by a drag-and-drop operation. The same applies to the corner gutter block.
- The material display area R1 is an area for displaying images showing each type of the plurality of materials whose temporary placement positions are determined by the construction system, so that the manager can select them.
- the work site display area R2 is an area for displaying a top view of the work site.
- The manager may create a construction plan (material temporary placement plan) in which the gutter block is temporarily placed at a desired position before the hole is actually excavated by the excavator 100, or may create a construction plan in which the gutter block is temporarily placed at a desired position after the hole is actually excavated.
- the danger determination unit 30A derives the distance from the edge of the hole to be excavated to the gutter block temporarily placed as input information based on the information acquired by the information input device MD3 as the information acquisition device E1.
- The information acquired by the information input device MD3 includes, for example, information regarding the position of the hole to be excavated represented by the image G10, and information regarding the position of the temporarily placed gutter block represented by the image G14.
- The information regarding the position of the hole to be excavated is an example of information on the state planned after the lapse of a predetermined time.
- The danger determination unit 30A collates the derived input information with the reference information representing a dangerous situation stored in the danger information database DB. When the danger determination unit 30A determines that the situation represented by the input information matches or is similar to the situation represented by the reference information, it determines that a dangerous situation may occur in the future.
- the danger determination unit 30A activates the notification device and notifies the administrator to that effect.
- The danger determination unit 30A displays the image G15 including the warning message "too close to the groove" on the image display unit of the display device MD1 to call the administrator's attention. This is because, if work is carried out according to such a construction plan, the edge of the hole may collapse under the weight of the gutter block.
- the danger determination unit 30A may output a voice message from the sound output device MD2 to call the attention of the administrator.
- the construction system can prevent the manager from creating a construction plan that may cause a dangerous situation in the future.
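The plan-time check can be sketched as validating each placed block against the planned hole outline before any excavation starts. The one-dimensional geometry, function names, and clearance value below are illustrative assumptions:

```python
# Hypothetical sketch: validate a construction plan at creation time.
# Positions are simplified to one dimension (distance along the site).

def validate_plan(hole_edge_x_m, block_positions_x_m, min_clearance_m):
    """Return a warning string for every block placed too close to the
    planned hole, mirroring the 'too close to the groove' message."""
    warnings = []
    for i, x in enumerate(block_positions_x_m):
        if abs(x - hole_edge_x_m) < min_clearance_m:
            warnings.append(f"block {i}: too close to the groove")
    return warnings

# With a 1.0 m clearance, only the block 0.5 m from the edge is flagged.
print(validate_plan(0.0, [0.5, 3.0], 1.0))
```

Running such a check on every drag-and-drop operation lets the system warn the manager immediately, before the plan is handed to the excavator 100.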
- The danger determination unit 30A may be configured to recognize an input scene represented by the presence or absence of one or more specific objects, without quantitatively deriving the relative positional relationship between a plurality of specific objects such as the hole excavated by the excavator 100 and the gutter block, and then to determine whether or not the recognized input scene represents a dangerous situation.
- The input scene is, for example, a scene in which only the hole excavated by the excavator 100 exists, a scene in which the hole excavated by the excavator 100 and the gutter block exist, or a scene in which the hole excavated by the excavator 100, the gutter block, and a worker exist.
- FIG. 12 is a conceptual diagram showing another example of the relationship between the danger determination unit 30A and the danger information database DB.
- the danger determination unit 30A collates the recognized input scene with the reference scene representing the dangerous situation stored in the danger information database DB. Then, when the danger determination unit 30A determines that the input scene matches or is similar to the reference scene, it determines that a dangerous situation occurs.
- the reference scene representing a dangerous situation is, for example, information generated based on accumulated past accident cases, etc., and includes, for example, information based on an image of a work site immediately before an accident occurs.
- The danger determination unit 30A recognizes the input scene by identifying one or more objects using a neural network, without deriving numerical values such as the depth of the hole excavated by the excavator 100, the volume of the gutter block, or the distance from the edge of the hole to the gutter block.
- the danger determination unit 30A uses a neural network to determine whether or not the recognized input scene is a reference scene representing a dangerous situation.
- the danger determination unit 30A may determine whether the input scene matches or resembles a plurality of reference scenes having different degrees of danger by using an image classification technique using a neural network.
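Representing a scene only by which object classes are present reduces the collation to a lookup. The sketch below does that lookup with a plain table (the patent contemplates a neural-network classifier for the same role); the class names and level numbers are illustrative assumptions:

```python
# Hypothetical sketch: an input scene is the set of detected object
# classes; it is collated against reference scenes of known danger.

REFERENCE_SCENES = {
    frozenset({"hole"}): 0,                     # not dangerous by itself
    frozenset({"hole", "block"}): 4,            # risk of block fall
    frozenset({"hole", "block", "worker"}): 5,  # danger of serious accident
}

def classify_scene(detected_objects):
    """Return the danger level of the matching reference scene,
    or 0 when no reference scene matches."""
    return REFERENCE_SCENES.get(frozenset(detected_objects), 0)
```

A learned classifier would additionally generalize to scenes that are merely similar to a reference scene, rather than requiring an exact match of object sets.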
- FIG. 13 is a diagram showing a configuration example of the excavator support system.
- the excavator 100 includes a controller 30, a recording device 32, and a determination device 34.
- The controller 30 performs control to cause the determination device 34 to determine the type of an object to be monitored (for example, a person, a truck, another construction machine, an electric pole, a suspended load, a pylon, or a building) when the object is detected within a predetermined monitoring area around the excavator 100 (for example, a work area within 5 meters of the excavator 100), and to avoid contact between the object and the excavator 100 according to the type (hereinafter, "contact avoidance control").
- The controller 30 includes a notification unit 302 and an operation control unit 304 as functional units related to the contact avoidance control, realized by executing one or more programs stored in a ROM, an auxiliary storage device, or the like on a CPU.
- The avoidance control may not be executed depending on the type of the object. For example, in crane mode, even if the wire rope exists near the back surface of the bucket 6, the avoidance control is not executed for the wire rope because the wire rope is part of the work tool. In this way, whether to perform the avoidance control is determined according to the type and location of the object.
- When the controller 30 detects a temporarily placed mound of earth and sand, if the mound is to be loaded, the avoidance control is not executed for the mound during loading work, and the excavation operation is permitted. However, during traveling work, the excavator becomes unstable if it rides on the mound, so the avoidance control is performed for the mound. In this way, whether to perform the avoidance control (avoidance operation) may be determined according to the position, location, work content, and the like of the object. Further, not only whether to perform the avoidance control but also the content of the permitted operation may be determined according to the position, location, work content, and the like of the object.
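The rule that avoidance depends on both object type and work content can be sketched as a small decision function; the type and mode names below are illustrative assumptions:

```python
# Hypothetical sketch: whether to execute avoidance control depends on
# both the detected object's type and the current work content.

def should_avoid(obj_type, work_mode):
    if obj_type == "wire_rope" and work_mode == "crane":
        return False  # the wire rope is part of the work tool
    if obj_type == "soil_mound":
        # excavating/loading the mound is permitted; riding on it
        # while traveling would make the machine unstable
        return work_mode == "travel"
    return True  # persons, trucks, etc. are always avoided
```

A fuller implementation would also weigh the object's position and location, as the text notes, rather than type and work mode alone.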
- the recording device 32 records an image (input image) acquired by the camera as the space recognition device 70 at a predetermined timing.
- the recording device 32 may be realized by any hardware or a combination of any hardware and software.
- the recording device 32 may be configured around a computer similar to the controller 30.
- the recording device 32 includes, for example, a recording control unit 322 as a functional unit realized by executing one or more programs stored in a ROM or an auxiliary storage device on a CPU. Further, the recording device 32 includes a storage unit 324 as a storage area defined in the internal memory.
- the determination device 34 makes a determination regarding an object around the excavator 100 (for example, an object detection determination, an object classification determination, etc.) based on the input image.
- the determination device 34 may be realized by any hardware or a combination of any hardware and software.
- the determination device 34 may be configured around a computer that has the same configuration as the controller 30, that is, a CPU, RAM, ROM, auxiliary storage device, various input/output interfaces, and the like, and that additionally includes an arithmetic device for image processing that performs high-speed calculation by parallel processing in conjunction with the processing by the CPU.
- the control device 210 of the management device 200, which will be described later, has the same configuration.
- the arithmetic device for image processing may include a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), and an ASIC (Application Specific Integrated Circuit).
- the determination device 34 includes a display control unit 342 and a determination unit 344 as functional units realized by executing one or more programs stored in a ROM, an auxiliary storage device, or the like on the CPU. Further, the determination device 34 includes a storage unit 346 as a storage area defined in the non-volatile internal memory.
- part or all of the controller 30, the recording device 32, and the determination device 34 may be integrated into one unit.
- the display device D1 displays an image showing the state of the surroundings of the excavator 100 based on the input image under the control of the determination device 34 (display control unit 342). Specifically, the display device D1 displays the input image. Further, the display device D1 displays a converted image generated by the determination device 34 (display control unit 342) by applying a predetermined conversion process (for example, a viewpoint conversion process) to the input image.
- the converted image may be, for example, a viewpoint-converted image in which a bird's-eye view image viewed from directly above the excavator 100 and a horizontal image viewed from a distance from the excavator 100 in the horizontal direction are combined.
- the viewpoint-converted image may be a composite image obtained by converting the images acquired by each of the front camera 70F, the rear camera 70B, the left camera 70L, and the right camera 70R into viewpoint-converted images combining a bird's-eye view image and a horizontal image, and then compositing them.
- the communication device 90 is an arbitrary device that connects to a communication network and communicates with the outside such as a management device 200.
- the communication device 90 may be, for example, a mobile communication module conforming to a predetermined mobile communication standard such as LTE (Long Term Evolution), 4G (4th Generation), or 5G (5th Generation).
- When the determination device 34 (determination unit 344) detects an object to be monitored in the monitoring area around the excavator 100, the notification unit 302 notifies the operator or the like to that effect. As a result, even if an object to be monitored enters a relatively close range around the excavator 100 at a position in a blind spot as viewed from the cabin 10, the operator or the like can recognize the intrusion and ensure safety by stopping operation of the operating device 26.
- the notification unit 302 outputs a control signal to the sound output device D2 to notify the operator and the like that an object to be monitored has been detected in the monitoring area close to the excavator 100.
- the determination device 34 may notify that an object to be monitored has been detected in the monitoring area around the excavator 100.
- the operation control unit 304 limits the operation of the excavator 100 when the determination device 34 (determination unit 344) detects an object to be monitored in the monitoring area around the excavator 100.
- the operation control unit 304 limits the operation of the excavator 100 when an object to be monitored enters the monitoring area close to the excavator 100, thereby avoiding contact between the excavator 100 and the object to be monitored.
- the limitation of the operation of the excavator 100 may include slowing down the operation of various operation elements of the excavator 100 as an output with respect to the operation content (operation amount) of the operator or the like in the operation device 26.
- the restriction on the operation of the excavator 100 may include stopping the operation of the operating element of the excavator 100 regardless of the operation content of the operating device 26.
- the operating elements of the excavator 100 subject to the operation restriction may be all of the operating elements that can be operated by the operating device 26, or only the subset of operating elements required to avoid contact between the excavator 100 and the object to be monitored.
- the operation control unit 304 may output a control signal to a pressure reducing valve provided on the pilot line on the secondary side of the operating device 26 to reduce the pilot pressure corresponding to the operation content of the operator or the like on the operating device 26.
- the operation control unit 304 transmits to the solenoid valve a signal limited to an operation amount smaller than the operation amount corresponding to the signal input from the operating device 26, thereby controlling the solenoid valve and reducing the pilot pressure acting on the control valve from the solenoid valve.
- In this way, the pilot pressure corresponding to the operation content on the operating device 26, acting on the control valve that controls the hydraulic oil supplied to the hydraulic actuators, can be reduced, and the operation of the various operating elements can be restricted.
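The restriction described above has two modes: attenuating the output relative to the operator's lever input (slowing the operating element) or zeroing it regardless of the input (stopping it). A minimal sketch, where the mode names and the attenuation factor are illustrative assumptions:

```python
def limited_command(operator_amount: float, mode: str) -> float:
    """Command forwarded toward the solenoid valve / pilot line.
    mode: 'none' passes the operator input through,
          'slow' attenuates it, 'stop' ignores it entirely."""
    if mode == "stop":
        return 0.0                      # operation stopped regardless of input
    if mode == "slow":
        return operator_amount * 0.3    # attenuation factor is an assumption
    return operator_amount              # unrestricted operation
```

Lowering this command lowers the pilot pressure acting on the control valve, which in turn slows or stops the corresponding operating element.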
- the recording control unit 322 (an example of the recording unit) records the images acquired by the cameras (front camera 70F, rear camera 70B, left camera 70L, and right camera 70R) in the storage unit 324 at a predetermined timing (hereinafter, "recording timing").
- the required timing can be defined in advance and the input image can be recorded in the storage unit 324.
- the transmission capacity when the input image of the storage unit 324 is transmitted to the management device 200 can be suppressed, and the communication cost can be suppressed.
- At the recording timing, the recording control unit 322 acquires the input image corresponding to that timing, including its past portion, from the ring buffer defined in the RAM or the like, and records it in the storage unit 324.
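The ring-buffer scheme above — a small volatile buffer of recent frames that is flushed to persistent storage when a recording timing occurs, so that the past portion is preserved — can be sketched with a bounded deque. The class name, capacity, and trigger interface are illustrative assumptions.

```python
from collections import deque

class FrameRecorder:
    def __init__(self, capacity: int = 3):
        self.ring = deque(maxlen=capacity)  # volatile buffer (RAM)
        self.storage = []                   # stand-in for storage unit 324

    def on_frame(self, frame):
        # Newest frame in; the oldest is silently dropped when full.
        self.ring.append(frame)

    def on_recording_timing(self):
        # Persist the frames around the trigger, including the past portion.
        self.storage.extend(self.ring)
        self.ring.clear()
```

Only frames near a recording timing reach persistent storage, which is what keeps the transmitted data volume (and hence communication cost) down.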
- the recording timing may be, for example, a predetermined periodic timing. Further, the recording timing may be when a state of the excavator 100 occurs in which an erroneous determination is likely when the determination device 34 (determination unit 344) makes a determination regarding objects around the excavator 100 based on the input image. Specifically, the recording timing may be when the excavator 100 is traveling or turning. Further, the recording timing may be when the determination unit 344 determines that an object is detected in the monitoring area around the excavator 100. Further, recording may be started when the controller is turned on, when the gate lock lever is released, or when the operation lever is turned on.
- the determination result of the determination unit 344 is input to the recording device 32 (recording control unit 322); however, when the recording timing is defined regardless of the determination result of the determination unit 344, the determination result need not be input to the recording device 32.
- the input image IM1 is recorded in the storage unit 324 under the control of the recording control unit 322 from the completion of the initial processing after the start of the excavator 100 to the stop of the excavator 100.
- One or more input images IM1 recorded in the storage unit 324 are transmitted to the management device 200 through the communication device 90 (an example of the environment information transmission unit) at a predetermined timing (hereinafter, "image transmission timing").
- the image transmission timing may be, for example, when the excavator 100 is stopped. Further, the transmission timing may be when the free space of the storage unit 324 falls below a predetermined threshold value. This is because the total capacity of the input image IM1 recorded in the storage unit 324 may be relatively large between the start and stop of the excavator 100. Further, the image transmission timing may be, for example, after the completion of the initial processing after the start of the excavator 100.
- the storage unit 324 is a storage area defined in the non-volatile internal memory, and the input images IM1 recorded between the previous start and stop of the excavator 100 may remain there until they are transmitted to the management device 200.
- the input image IM1 may be sequentially transmitted to the management device 200 through the communication device 90 each time it is recorded in the storage unit 324.
- the display control unit 342 causes the display device D1 to display an image (hereinafter, “excavator peripheral image”) showing the surroundings of the excavator 100.
- the display control unit 342 causes the display device D1 to display the input image as the excavator surrounding image.
- the display control unit 342 may display, on the display device D1, the input images of some cameras selected from the front camera 70F, the rear camera 70B, the left camera 70L, and the right camera 70R.
- the display control unit 342 may switch the camera corresponding to the input image to be displayed on the display device D1 according to a predetermined operation by the operator or the like. Further, the display control unit 342 may display all the input images of the front camera 70F, the rear camera 70B, the left camera 70L, and the right camera 70R on the display device D1.
- the display control unit 342 generates a converted image obtained by subjecting the input image to a predetermined conversion process as the excavator surrounding image, and displays the generated converted image on the display device D1.
- the converted image may be, for example, a viewpoint-converted image in which a bird's-eye view image viewed from directly above the excavator 100 and a horizontal image viewed from a distance from the excavator 100 in the horizontal direction are combined.
- the viewpoint-converted image may be a composite image (hereinafter, "viewpoint-converted composite image") obtained by converting the input images of the front camera 70F, the rear camera 70B, the left camera 70L, and the right camera 70R into viewpoint-converted images combining a bird's-eye view image and a horizontal image, and compositing them by a predetermined method.
- When the determination unit 344 detects an object to be monitored in the predetermined monitoring area around the excavator 100, the display control unit 342 superimposes and displays the area corresponding to the detected object (hereinafter, "detected object area") on the excavator surrounding image. As a result, the operator or the like can easily confirm the detected object on the excavator surrounding image.
- the determination unit 344 makes determinations regarding objects around the excavator 100 based on the input image, using the machine-learned trained model LM stored in the storage unit 346. Specifically, the determination unit 344 loads the trained model LM from the storage unit 346 into a main storage device such as a RAM (path 344A) and causes the CPU to execute it, thereby making determinations regarding objects around the excavator 100 based on the input image. For example, as described above, the determination unit 344 determines the presence or absence of an object to be monitored in the monitoring area around the excavator 100 and detects the object to be monitored.
- Further, the determination unit 344 determines (specifies) the type of the detected object to be monitored, that is, classifies the detected object within a predetermined classification list of monitoring targets (hereinafter, "monitoring target list").
- the monitoring target list may include people, trucks, other construction machines, utility poles, suspended loads, pylons, buildings, and the like.
- the trained model LM is mainly composed of a neural network 401.
- the neural network 401 is a so-called deep neural network having one or more intermediate layers (hidden layers) between the input layer and the output layer.
- a weighting parameter representing the connection strength with the lower layer is defined for each of the plurality of neurons constituting each intermediate layer. The neurons in each layer output to the neurons in the lower layer, through a threshold function, the sum of the input values from the plurality of neurons in the upper layer, each multiplied by the weighting parameter defined for that upper-layer neuron. The neural network 401 is configured in this way.
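The per-neuron computation described above — a weighted sum of upper-layer outputs passed through a threshold function — can be sketched as follows. The sigmoid is an illustrative choice of threshold (activation) function; the text does not name a specific one.

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """One neuron: weighted sum of upper-layer outputs, then a
    threshold function (sigmoid here, for illustration)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

A full layer applies this to every neuron with its own weight vector, and the network stacks such layers between the input layer and the output layer.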
- the neural network 401 receives an input image as an input signal x and outputs, as an output signal y, a prediction for each item of a predetermined list of scenes (in this example, "scene 1 (for example, a scene (situation) of excavating the vicinity of a block) and its degree of risk", "scene 2 (for example, a scene (situation) in which a person enters a hole while the vicinity of a block is being excavated) and its degree of risk", and so on).
- the neural network 401 is, for example, a convolutional neural network (CNN).
- A CNN is a neural network to which existing image processing techniques (convolution processing and pooling processing) are applied. Specifically, a CNN extracts feature data (a feature map) smaller in size than the input image by repeating a combination of convolution processing and pooling processing on the input image.
- Then, the pixel value of each pixel of the extracted feature map is input to a neural network composed of a plurality of fully connected layers, and the output layer of that neural network can output, for example, the predicted probability of the presence of an object (a terrain shape or the like) for each type of object.
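One stage of the size reduction described above can be illustrated with 2x2 max pooling, which halves each spatial dimension of a feature map. The kernel size and stride are illustrative; the convolution stages are omitted for brevity.

```python
def max_pool_2x2(feature_map):
    """2x2 max pooling over a 2-D feature map (list of equal-length rows)."""
    h, w = len(feature_map), len(feature_map[0])
    return [[max(feature_map[i][j], feature_map[i][j + 1],
                 feature_map[i + 1][j], feature_map[i + 1][j + 1])
             for j in range(0, w - 1, 2)]
            for i in range(0, h - 1, 2)]
```

Repeating convolution and pooling in this way is what yields a feature map much smaller than the input image before the fully connected layers.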
- Further, the neural network 401 can output the predicted probability of a scene assumed based on the positional relationship for each type of object and on changes in that positional relationship. The neural network 401 can then output the scene with a high prediction probability and the degree of risk of that scene.
- the neural network 401 may be configured to receive the input image as the input signal x and to output, as the output signal y, the position and size of an object in the input image (that is, the occupied area of the object on the input image) and the type of the object. That is, the neural network 401 may be configured to detect an object on the input image (determine the occupied area portion of the object on the input image) and determine the classification of the object. Further, in this case, the output signal y may be configured in an image data format in which information regarding the occupied area of the object and its classification is superimposed on the input image serving as the input signal x.
- Based on the position and size of the occupied area of an object in the input image output from the trained model LM (neural network 401), the determination unit 344 can specify the relative position (distance and direction) of the object from the excavator 100, and can then identify the scene in which the object exists. The scene may also be identified based on changes in the position and size of the object. This is possible because the cameras (front camera 70F, rear camera 70B, left camera 70L, and right camera 70R) are fixed to the upper swing body 3 and their imaging ranges (angles of view) are predetermined (fixed).
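Because each camera is fixed to the upper swing body with a known angle of view, the horizontal position of a bounding box maps directly to a bearing relative to the camera axis. The linear pinhole approximation below, and the example pixel width and field of view, are illustrative assumptions, not the patent's actual calibration.

```python
def bearing_deg(box_center_x: float, image_width: float, fov_deg: float) -> float:
    """Bearing (degrees) of a bounding-box centre relative to the camera
    axis: 0 at image centre, +/- half the field of view at the edges."""
    half_width = image_width / 2.0
    return (box_center_x - half_width) / half_width * (fov_deg / 2.0)
```

Distance can similarly be estimated from the box's size or its vertical position under a flat-ground assumption; combining bearing and distance gives the object's relative position from the excavator 100.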
- The determination unit 344 can thereby determine that an object to be monitored has been detected in the monitoring area.
- the neural network 401 may have neural networks corresponding to each of a process of extracting an occupied area (window) in which an object exists in the input image and a process of specifying the type of the object in the extracted area. That is, the neural network 401 may be configured to perform object detection and object classification in stages.
- Further, the neural network 401 may have neural networks corresponding to a process of classifying an object and defining the occupied area (bounding box) of the object for each grid cell into which the entire area of the input image is divided as a predetermined number of partial areas, and a process of combining the occupied areas of the objects for each type based on the classification of the objects for each grid cell to determine the final occupied area of each object. That is, the neural network 401 may be configured to perform object detection and object classification in parallel.
- the determination result by the determination unit 344 is displayed on the display device D1 through, for example, the display control unit 342.
- the main screen 41V is displayed on the display device D1, and the input image is displayed in the camera image display area in the main screen 41V.
- the input image of the rear camera 70B is displayed in the camera image display area, and the input image shows the gutter block installed in front of the excavator 100 and the already excavated gutter.
- the determination unit 344 inputs the image data of the input image of the rear camera 70B into the trained model LM (neural network 401), acquires from the trained model LM the occupied area of each object in the input image, and determines the type and positional relationship of the objects in those occupied areas.
- the type of the scene can then be derived based on the determined object types and their positional relationship.
- Further, the degree of risk is calculated based on the derived scene type. In this example, a box icon 501 surrounding the occupied area of the object (block) classified as a "gutter block" output from the trained model LM, and a character information icon 502 indicating that the detected (classified) object is a gutter block, are superimposed on the input image and displayed.
- Similarly, a box icon 503 surrounding the occupied area of the object (groove) classified as an "excavation groove" output from the trained model LM, and a character information icon 504 indicating that the detected (classified) object is a groove, which is one of the terrain shapes, are superimposed on the input image and displayed.
- The prediction probabilities may also be displayed.
- the determination unit 344 classifies the scene in which the excavator 100 exists as a "scene of excavating the vicinity of the block" based on the types and positional relationship of those objects and on the scene acquired from the trained model LM.
- the prediction probability of the classification as "the scene of excavating the vicinity of the block" may also be displayed.
- a level display (for example, 5 levels) indicating the degree of risk may be displayed.
- As a result, the operator of the excavator 100 can easily confirm the classification determined to be a dangerous scene and its cause, and can quickly take action to reduce the degree of risk.
- the determination unit 344 can also determine the work content in the type of scene.
- For example, the determination unit 344 can determine that the work content in a scene is loading work based on the dump truck and its position, and further on the temporarily placed earth and sand and its position. In FIG. 9, the determination unit 344 can determine that the work content in the scene is crane work based on the position of the recognized image of the wire rope and the position of the recognized image of the bucket 6. In this way, the determination unit 344 can determine the work content based on the objects recognized by the learning model and their positions. As the trained model LM, a support vector machine (SVM) or the like may be applied in addition to the neural network 401.
- the display device D1 may display a converted image based on the input image (for example, the above-mentioned composite viewpoint converted image).
- the box icon or the character information icon may be superimposed and displayed on the portion corresponding to the occupied area of the object on the converted image.
- the trained model LM is stored in the storage unit 346.
- When the communication device 90 receives from the management device 200 an updated version of the trained model, that is, a trained model on which additional learning has been performed (hereinafter, "additionally trained model") as described later, the trained model LM in the storage unit 346 is updated to the received additionally trained model.
- As a result, the determination unit 344 can use the additionally trained model produced by additional learning in the management device 200, so the determination accuracy regarding objects around the excavator 100 can be improved as the trained model is updated.
- the management device 200 includes a control device 210, a communication device 220, a display device 230, an input device 240, and a computer graphics image generation device (hereinafter, "CG (Computer Graphics) image generation device") 250.
- the control device 210 controls various operations of the management device 200.
- the control device 210 includes, for example, a determination unit 2101, a teacher data generation unit 2102, and a learning unit 2103 as functional units realized by executing one or more programs stored in a ROM or a non-volatile auxiliary storage device on the CPU. Further, the control device 210 includes, for example, storage units 2104 and 2105 as storage areas defined in non-volatile internal memory such as an auxiliary storage device.
- the communication device 220 is an arbitrary device that connects to a communication network and communicates with the outside such as a plurality of excavators 100.
- the display device 230 is, for example, a liquid crystal display or an organic EL display, and displays various information images under the control of the control device 210.
- the input device 240 receives an operation input from the user.
- the input device 240 includes, for example, a touch panel mounted on a liquid crystal display or an organic EL display. Further, the input device 240 may include a touch pad, a keyboard, a mouse, a trackball, and the like. Information regarding the operating state of the input device 240 is taken into the control device 210.
- Using the trained model LM that is stored in the storage unit 2105 and machine-learned by the learning unit 2103, the determination unit 2101 makes determinations regarding objects around the excavators 100 based on the input images IM1 received from the plurality of excavators 100, that is, the input images IM1 read from the storage unit 2104 (path 2101A). Specifically, the determination unit 2101 loads the trained model LM from the storage unit 2105 into a main storage device such as a RAM (path 2101B), causes the CPU to execute it, and makes determinations regarding objects around the excavators 100 based on the input images IM1 read from the storage unit 2104.
- the determination unit 2101 sequentially inputs a plurality of input images IM1 stored in the storage unit 2104 into the trained model LM, and makes a determination regarding an object around the excavator 100.
- the determination result 2101D of the determination unit 2101 is input to the teacher data generation unit 2102.
- the determination results 2101D may be input to the teacher data generation unit 2102 sequentially for each input image IM1, or may be summarized, for example in a list, before being input to the teacher data generation unit 2102.
- the teacher data generation unit 2102 (an example of a teacher information generation unit) generates teacher data (an example of teacher information) for the learning unit 2103 to machine-learn a learning model, based on the plurality of input images IM1 received from the plurality of excavators 100.
- the teacher data represents a combination of an arbitrary input image IM1 and a correct answer to be output by the learning model when the input image IM1 is used as the input of the learning model.
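One record of such teacher data — an input image paired with the correct answer the learning model should output for it — could be shaped roughly as below. The field names, the (x, y, w, h) box convention, and the label strings are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TeacherSample:
    image_id: str                                    # identifies an input image IM1
    boxes: List[Tuple[int, int, int, int]] = field(default_factory=list)
    labels: List[str] = field(default_factory=list)  # correct class per box

    def add_annotation(self, box, label):
        """Record one correct occupied area and its classification."""
        self.boxes.append(box)
        self.labels.append(label)
```

A teacher data set is then simply a collection of such samples, built up through the annotation GUI described below.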
- the learning model is the object of machine learning and, as a matter of course, is configured around the same structure as the trained model LM, for example, the above-mentioned neural network 401.
- the teacher data generation unit 2102 reads the input images IM1 received from the plurality of excavators 100 from the storage unit 2104 (path 2102A), displays them on the display device 230, and displays a GUI (Graphical User Interface) for creating teacher data (hereinafter, "teacher data creation GUI") so that the administrator or worker of the management device 200 can create teacher data. The administrator, worker, or the like operates the teacher data creation GUI using the input device 240 and designates the correct answer corresponding to each input image IM1, thereby creating teacher data in a format conforming to the algorithm of the learning model. In other words, the teacher data generation unit 2102 can generate a plurality of teacher data (a teacher data set) according to operations (work) by the administrator, worker, or the like targeting the plurality of input images IM1.
- the teacher data generation unit 2102 generates teacher data for the learning unit 2103 to additionally learn the trained model LM based on the plurality of input images IM1 received from the plurality of excavators 100.
- the teacher data generation unit 2102 reads the plurality of input images IM1 from the storage unit 2104 (path 2102A), and displays each input image IM1 side by side on the display device 230 with the corresponding determination result (output result) 2101D of the determination unit 2101.
- Thereby, the administrator, operator, or the like of the management device 200 can select, through the input device 240, the combinations corresponding to erroneous determinations from among the combinations of input images IM1 and corresponding determination results displayed on the display device 230.
- The administrator, operator, or the like can then operate the teacher data creation GUI using the input device 240 to create teacher data for additional learning, each representing a combination of an input image IM1 for which the trained model LM made an erroneous determination and the correct answer that the trained model LM should output when that input image IM1 is input.
- In other words, the teacher data generation unit 2102 can generate a plurality of teacher data (a teacher data set) for additional learning according to operations (work) by the administrator, worker, or the like targeting the input images IM1, selected from the plurality of input images IM1, that correspond to erroneous determinations by the trained model LM.
- the teacher data generation unit 2102 generates teacher data for generating the first trained model LM from the plurality of input images IM1 received from the plurality of excavators 100. Then, at predetermined timings (hereinafter, "additional learning timings") after the latest trained model LM is mounted on the plurality of excavators 100, the teacher data generation unit 2102 generates teacher data for additional learning from the input images IM1, selected from the input images IM1 received from the plurality of excavators 100, for which the trained model LM makes erroneous determinations.
- a part of the input image IM1 received from each of the plurality of excavators 100 may be used as a base for the verification data set of the trained model LM. That is, the input image IM1 received from each of the plurality of excavators 100 may be divided into an input image IM1 for generating teacher data and an input image IM1 for generating a verification data set.
- the additional learning timing may be a regularly defined timing, for example, when one month has passed since the previous machine learning (additional learning).
- the additional learning timing may also be, for example, when the number of input images IM1 exceeds a predetermined threshold value, that is, when the number of input images IM1 required for additional learning by the learning unit 2103 has been collected.
- the learning unit 2103 causes the learning model to perform machine learning based on the teacher data 2102B (teacher data set) generated by the teacher data generation unit 2102, and generates the trained model LM. The generated trained model LM is then stored in the storage unit 2105 (path 2103B) after accuracy verification using a verification data set prepared in advance.
- Further, the learning unit 2103 generates an additionally trained model by causing the trained model LM read from the storage unit 2105 (path 2103A) to perform additional learning based on the teacher data (teacher data set) generated by the teacher data generation unit 2102. Then, after the additionally trained model undergoes accuracy verification using a verification data set prepared in advance, the trained model LM in the storage unit 2105 is updated with the accuracy-verified additionally trained model (path 2103B).
- the learning unit 2103 can generate the learning model by applying a known algorithm such as the error backpropagation method (backpropagation). Thereby, the weighting parameters are optimized so that the error with respect to the teacher data becomes small, and the trained model LM is generated. The same applies to the generation of the additionally trained model.
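The weight-update idea behind backpropagation — gradient descent shrinking the error between the model output and the teacher value — can be shown with a single parameter. A real network updates many weights layer by layer via the chain rule; this toy version (model output simply w * x) only illustrates the principle.

```python
def fit_weight(samples, lr=0.1, epochs=200):
    """Minimise the squared error between w*x and the teacher value t
    by gradient descent on the single weight w."""
    w = 0.0
    for _ in range(epochs):
        for x, t in samples:
            error = w * x - t          # difference from the teacher data
            w -= lr * 2.0 * error * x  # gradient of the squared error
    return w
```

Each pass over the teacher data set nudges the weight toward the value that reproduces the correct answers, which is exactly what optimizing "so that the error with respect to the teacher data becomes small" means.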
- the first trained model LM generated from the training model may be generated by an external device different from the management device 200.
- In this case, the teacher data generation unit 2102 may generate only the teacher data for additional learning, and the learning unit 2103 may generate only the additionally trained model.
- the storage unit 2104 stores (stores) the input image IM1 received from each of the plurality of excavators 100 through the communication device 220.
- the input image IM1 used for generating the teacher data by the teacher data generation unit 2102 may be stored in a storage device different from the storage unit 2104.
- The trained model LM is stored (held) in the storage unit 2105.
- The trained model LM updated with the additionally trained model generated by the learning unit 2103 is transmitted to each of the plurality of excavators 100 through the communication device 220 (an example of the model transmission unit) at a predetermined timing (hereinafter, "model transmission timing"). This makes it possible to share the same updated trained model LM, that is, the additionally trained model, among the plurality of excavators 100.
- The model transmission timing may be when the trained model LM in the storage unit 2105 is updated, that is, immediately after the update, or when a predetermined time has elapsed after the update. The model transmission timing may also be, for example, when the communication device 220 receives a confirmation reply to a notification of the update of the trained model LM that was transmitted to the plurality of excavators 100 through the communication device 220 after the update.
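The three transmission-timing options above can be summarized in a small decision function. This is a hypothetical sketch; the function and parameter names are not taken from the disclosure.

```python
def should_transmit(model_updated_at, now, confirmations, num_excavators,
                    mode="immediate", delay=600.0):
    """Decide whether the model transmission timing has arrived.

    mode="immediate": right after the trained model LM is updated.
    mode="delayed":   a predetermined time after the update.
    mode="confirmed": after confirmation replies to the update notification
                      have arrived from all excavators.
    """
    if model_updated_at is None:
        return False  # no updated trained model LM to send yet
    if mode == "immediate":
        return True
    if mode == "delayed":
        return now - model_updated_at >= delay
    if mode == "confirmed":
        return len(confirmations) >= num_excavators
    raise ValueError(f"unknown mode: {mode}")
```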
- FIG. 15 is a sequence diagram showing an example of the operation of the excavator support system.
- In step S10, the communication devices 90 of the plurality of excavators 100 transmit the input images IM1 to the management device 200 at their respective image transmission timings.
- the management device 200 receives the input images IM1 from each of the plurality of excavators 100 through the communication device 220 and stores them in the storage unit 2104 in a cumulative manner.
- In step S12, the determination unit 2101 of the management device 200 inputs the plurality of input images IM1 received from the plurality of excavators 100 and stored in the storage unit 2104 into the trained model LM, and performs the determination process.
- In step S14, the administrator, operator, or the like of the management device 200 verifies the determination results of the trained model LM and, through the input device 240, specifies (selects) from among the plurality of input images IM1 those input images IM1 for which the trained model LM made an erroneous determination.
- In step S16, the teacher data generation unit 2102 of the management device 200 generates a teacher data set for additional learning in response to operation of the teacher-data creation GUI through the input device 240 by the administrator, operator, or the like.
- In step S18, the learning unit 2103 of the management device 200 performs additional learning of the trained model LM using the teacher data set for additional learning, generates an additionally trained model, and updates the trained model LM in the storage unit 2105 with the additionally trained model.
- In step S20, the communication device 220 of the management device 200 transmits the updated trained model LM to each of the plurality of excavators 100.
- As described above, the timing at which the updated trained model LM is transmitted to the excavator 100 may differ for each of the plurality of excavators 100.
- each of the plurality of excavators 100 updates the trained model LM of the storage unit 346 to the updated trained model received from the management device 200.
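The cycle of steps S10 to S20 can be sketched as a toy simulation. The classes, the one-number "images", and the threshold-shifting "additional learning" below are stand-ins chosen only to make the flow runnable; they do not reflect the actual determination or learning algorithms of the disclosure.

```python
class Excavator:
    """Minimal stand-in for the excavator-side behavior in FIG. 15."""
    def __init__(self, images):
        self.images = images   # input images IM1 (toy scalar stand-ins)
        self.model = None
    def collect_input_images(self):
        return self.images     # S10: upload at the image transmission timing
    def update_model(self, model):
        self.model = model     # replace the on-board trained model LM

def judge(model, img):
    # toy determination: "object present" if the value exceeds a threshold
    return img > model["threshold"]

def support_cycle(excavators, model, correct_answer):
    # S10/S12: gather the input images and run the determination process
    images = [i for ex in excavators for i in ex.collect_input_images()]
    results = {img: judge(model, img) for img in images}
    # S14: pick out the images the model judged incorrectly
    misjudged = [img for img, r in results.items() if r != correct_answer(img)]
    # S16/S18: "additional learning" -- here, naively raise the threshold
    if misjudged:
        model = {"threshold": max(misjudged)}
    # S20: distribute the updated trained model LM to every excavator
    for ex in excavators:
        ex.update_model(model)
    return model

ex_a, ex_b = Excavator([0.2, 0.9]), Excavator([0.6])
model = support_cycle([ex_a, ex_b], {"threshold": 0.5}, lambda v: v > 0.8)
```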
- the CG image generation device 250 generates a computer graphics image (hereinafter, “CG image”) IM3 showing the surroundings of the excavator 100 at the work site in response to an operation by an operator or the like of the management device 200.
- The CG image generation device 250 is mainly composed of a computer including a CPU, a RAM, a ROM, an auxiliary storage device, and various input/output interfaces, on which application software that allows an operator or the like to create the CG image IM3 is pre-installed. The operator or the like then creates the CG image IM3 on the display screen of the CG image generation device 250 through a predetermined input device.
- The CG image generation device 250 can thus generate a CG image IM3 representing the surroundings of the excavator 100 at the work site in response to work (operations) by the operator or the like of the management device 200. Further, based on an image of the actual surroundings of the excavator 100 (for example, the input image IM1), the CG image generation device 250 can also generate a CG image IM3 corresponding to weather conditions or sunshine conditions different from those of the captured image, that is, a work environment under different conditions. The CG image IM3 generated by the CG image generation device 250 is taken into the control device 210.
- the CG image IM3 may be generated (created) outside the management device 200.
- The control device 210 includes a determination unit 2101, a teacher data generation unit 2102, a learning unit 2103, and storage units 2104 and 2105, as in the above example.
- The determination unit 2101 makes determinations regarding objects around the excavator 100 based on the plurality of input images IM1 (path 2101A) and the CG images IM3 (path 2101C) read from the storage unit 2104, using the trained model LM that is stored in the storage unit 2105 and has been machine-learned by the learning unit 2103. Specifically, the determination unit 2101 loads the trained model LM from the storage unit 2105 into a main storage device such as a RAM (path 2101B), has the CPU execute it, and makes determinations regarding objects around the excavator 100 based on the input images IM1 and CG images IM3 read from the storage unit 2104.
- The determination unit 2101 sequentially inputs the plurality of input images IM1 and CG images IM3 stored in the storage unit 2104 into the trained model LM, and makes determinations regarding objects around the excavator 100.
- the determination result 2101D of the determination unit 2101 is input to the teacher data generation unit 2102.
- The determination results 2101D may be input to the teacher data generation unit 2102 sequentially for each of the plurality of input images IM1 and CG images IM3, or may be input to the teacher data generation unit 2102 after being compiled, for example, into a list.
- The teacher data generation unit 2102 generates teacher data for the learning unit 2103 to machine-learn the learning model, based on the plurality of input images IM1 received from the plurality of excavators 100 and the CG images IM3 generated by the CG image generation device 250 (and stored in the storage unit 2104).
- Specifically, the teacher data generation unit 2102 reads the input images IM1 received from the plurality of excavators 100 and the CG images IM3 generated by the CG image generation device 250 from the storage unit 2104 (paths 2102A, 2102C), displays them on the display device 230, and displays the GUI for creating teacher data. The administrator, operator, or the like then operates the teacher-data creation GUI using the input device 240 and designates the correct answer corresponding to each input image IM1 or CG image IM3, thereby creating teacher data in a format that matches the algorithm of the learning model. In other words, the teacher data generation unit 2102 can generate a plurality of teacher data (a teacher data set) in response to operations (work) by the administrator, operator, or the like targeting the plurality of input images IM1 and CG images IM3.
- Similarly, the teacher data generation unit 2102 generates teacher data for the learning unit 2103 to additionally train the trained model LM, based on the plurality of input images IM1 received from the plurality of excavators 100 and the CG images IM3 generated by the CG image generation device 250 (stored in the storage unit 2104).
- Specifically, the teacher data generation unit 2102 reads the plurality of input images IM1 and CG images IM3 from the storage unit 2104 (paths 2102A, 2102C), and displays each input image IM1 or CG image IM3 on the display device 230 side by side with the determination result (output result) of the determination unit 2101 (trained model LM) corresponding to that image.
- This allows the administrator, operator, or the like of the management device 200 to select, through the input device 240, from among the combinations of the input images IM1 or CG images IM3 displayed on the display device 230 and the corresponding determination results of the trained model LM, those combinations corresponding to erroneous determinations.
- The administrator, operator, or the like then operates the teacher-data creation GUI using the input device 240, and can create teacher data for additional learning that represents the combination of the input image IM1 or CG image IM3 corresponding to the erroneous determination and the correct answer that the trained model LM should output when that image is input.
- That is, the teacher data generation unit 2102 generates teacher data for additional learning targeting at least one of the input images IM1 and CG images IM3 that is selected from the plurality of input images IM1 and CG images IM3 and that corresponds to an erroneous determination by the trained model LM.
- In this way, teacher data can be generated using the CG images IM3 in addition to the input images IM1 collected from the plurality of excavators 100, so the teacher data can be enriched.
- With the CG image IM3, various work-site situations, that is, various environmental conditions, can be created virtually and freely. Therefore, by generating the teacher data set using the CG images IM3, the trained model LM can achieve relatively high determination accuracy corresponding to various work sites at an earlier stage.
- Further, the CG image generation device 250 can output to the control device 210, together with the CG image IM3, data on the correct answer (hereinafter, "correct answer data") that the trained model LM should output when the CG image IM3 is input.
- Based on the correct answer data input from the CG image generation device 250, the control device 210 (teacher data generation unit 2102) can automatically extract erroneous determinations in the determination process by the trained model LM (determination unit 2101) that takes the CG images IM3 as input, and can automatically generate a plurality of teacher data (a teacher data set) for additional learning, each representing the combination of a CG image IM3 corresponding to an extracted erroneous determination and the correct answer that the trained model LM should output when that CG image IM3 is input.
- The learning unit 2103 can then perform additional learning of the trained model LM by, for example, the above-mentioned error backpropagation method, based on the teacher data automatically generated by the teacher data generation unit 2102.
- That is, the control device 210 can also automatically generate an additionally trained model based on the CG images IM3 and the correct answer data generated by the CG image generation device 250.
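The automatic generation described above amounts to comparing the model's output on each CG image with the correct answer data and keeping the mismatches. A minimal sketch follows; the function names and the dict-based interface are assumptions made for illustration.

```python
def build_additional_teacher_set(cg_images, correct_data, model_judge):
    """Pair each CG image on which the model errs with its correct answer.

    cg_images:    {image_id: image} generated by the CG image generator
    correct_data: {image_id: answer} output together with each CG image
    model_judge:  callable standing in for the trained model LM
    """
    teacher_set = []
    for img_id, img in cg_images.items():
        if model_judge(img) != correct_data[img_id]:  # erroneous determination
            teacher_set.append((img, correct_data[img_id]))
    return teacher_set

cg_images = {"scene_a": 2, "scene_b": 12}            # toy CG images
correct_data = {"scene_a": False, "scene_b": True}   # answers from the generator
teacher_set = build_additional_teacher_set(
    cg_images, correct_data, lambda img: img > 20)
```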
- FIG. 16 is a conceptual diagram showing another example of the determination process by the determination unit 2101.
- the trained model LM is mainly composed of the first neural network 401A and the second neural network 401B.
- A captured image is input as the input signal x, and, as the output signal y, the probability (prediction probability) that an object exists for each type of object in a predetermined monitoring target list, together with position information of the object, can be output.
- The input image is a captured image captured by the front camera 70F, and the objects in the monitoring target list include clay pipes, holes, and the like.
- the first neural network 401A estimates the existence of the clay pipe with high probability. Then, the first neural network 401A derives the position (for example, latitude, longitude, and altitude) of the clay pipe based on the information regarding the position of the front camera 70F.
- the information regarding the position of the front camera 70F is, for example, the latitude, longitude, and altitude of the front camera 70F, and is derived based on the output of the positioning device 73.
- the first neural network 401A can derive the position of the clay pipe based on the position and size of the clay pipe image in the captured image. In the example shown in FIG. 16, the first neural network 401A outputs the estimation result of the existence of the clay pipe at the east longitude e1, the north latitude n1, and the altitude h1 as the output signal y.
- Further, the first neural network 401A can output the probability (prediction probability) that an object exists for each type of object in the predetermined monitoring target list, together with the position information of the object, based on information about the construction plan.
- For example, based on information about the area where a hole for burying a gutter block is to be excavated, as shown in FIG. 11, the first neural network 401A can recognize the position (for example, latitude, longitude, and altitude) of the hole to be formed.
- the first neural network 401A can derive the position of the hole based on the information about the position included in the design data.
- the first neural network 401A outputs the recognition result of the hole to be formed at the east longitude e2, the north latitude n2, and the altitude h2 as the output signal y.
- The output signal y of the first neural network 401A is input to the second neural network 401B, which can output, as the output signal z, the current degree of danger for each scene (situation) based on the positional relationship of the objects whose existence has been estimated or recognized by the first neural network 401A.
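The two-stage structure of FIG. 16 can be sketched as a detection stage followed by a scene-scoring stage. The fixed detections, 2-D positions, and distance-based scene scores below are illustrative stand-ins for the neural networks' learned behavior, not the actual models.

```python
from math import dist

def first_stage(image):
    """Stand-in for the first neural network 401A: per detected object,
    return (class, prediction probability, position). 2-D positions are
    used here in place of latitude/longitude/altitude."""
    return [("clay_pipe", 0.97, (2.0, 1.0)),   # fixed toy output
            ("hole", 0.97, (2.5, 1.2)),
            ("worker", 0.90, (2.6, 1.1))]

def second_stage(detections):
    """Stand-in for the second neural network 401B: score each scene from
    the positional relationship of the detected objects."""
    pos = {cls: p for cls, prob, p in detections if prob > 0.5}
    scenes = {}
    if "worker" in pos and "hole" in pos:
        # the closer the worker is to the hole, the higher the danger
        scenes["worker_near_hole"] = 1.0 / (1.0 + dist(pos["worker"], pos["hole"]))
    if "clay_pipe" in pos and "hole" in pos:
        scenes["pipe_near_hole"] = 1.0 / (1.0 + dist(pos["clay_pipe"], pos["hole"]))
    return scenes

scores = second_stage(first_stage(None))
```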
- FIG. 17 is a conceptual diagram showing another example of the determination process by the determination unit 2101.
- the trained model LM is mainly composed of the first neural network 401A and the third neural network 401C.
- the degree of danger when the dump truck stops in front of the excavator 100 is determined.
- the dump truck is stopped in front of the excavator 100 so that the excavator 100 can load earth and sand on the loading platform of the dump truck.
- the dump truck has started to move away from the excavator 100, contrary to the intention of the driver of the dump truck, because the side brake is not properly applied.
- the loading platform of the dump truck has not yet been loaded with earth and sand.
- The first neural network 401A recognizes the dump truck when the captured image as shown in FIG. 17 is input as the input signal x at time t1. Then, the first neural network 401A derives the position (for example, latitude, longitude, and altitude) of the dump truck based on the information regarding the position of the front camera 70F. In the example shown in FIG. 17, the first neural network 401A outputs the recognition result of the dump truck at east longitude e1, north latitude n1, and altitude h1 as the output signal y at time t1.
- At time t2, the first neural network 401A recognizes the dump truck at a position farther from the excavator 100 than at time t1. Then, the first neural network 401A derives the position (for example, latitude, longitude, and altitude) of the dump truck based on the information regarding the position of the front camera 70F. In the example shown in FIG. 17, the first neural network 401A outputs the recognition result of the dump truck at east longitude e2, north latitude n2, and altitude h2 as the output signal y at time t2.
- At time t3, the first neural network 401A recognizes the dump truck at a position farther from the excavator 100 than at time t2. Then, the first neural network 401A derives the position (for example, latitude, longitude, and altitude) of the dump truck based on the information regarding the position of the front camera 70F. In the example shown in FIG. 17, the first neural network 401A outputs the recognition result of the dump truck at east longitude e3, north latitude n3, and altitude h3 as the output signal y at time t3.
- The third neural network 401C receives as input the output signal y of the first neural network 401A at a predetermined past time and the output signal y of the first neural network 401A at the present time, and can output, as the output signal z, the current degree of risk for each scene (situation) based on the positional relationship of the objects at each time recognized by the first neural network 401A.
- The third neural network 401C receives the output signal y of the first neural network 401A at time t1 and the output signal y of the first neural network 401A at time t2. Then, the third neural network 401C can output the degree of risk at time t2 for each scene (situation) based on the position of the dump truck at time t1 and the position of the dump truck at time t2 recognized by the first neural network 401A.
- Scene 1 is, for example, a scene (situation) in which a dump truck whose side brake is not properly applied moves forward.
- Scene 2 is, for example, a scene (situation) in which a dump truck whose side brake is not properly applied moves backward.
- In this case, the third neural network 401C determines that the dump truck is moving forward based on the position of the dump truck at time t1 and the position of the dump truck at time t2 recognized by the first neural network 401A, and can output a higher risk level for scene 1.
- The third neural network 401C also receives the output signal y of the first neural network 401A at time t2 and the output signal y of the first neural network 401A at time t3. Then, the third neural network 401C can output the degree of risk at time t3 for each scene (situation) based on the position of the dump truck at time t2 and the position of the dump truck at time t3 recognized by the first neural network 401A.
- In this case, the third neural network 401C determines that the dump truck is moving forward based on the position of the dump truck at time t2 and the position of the dump truck at time t3 recognized by the first neural network 401A, and can output an even higher risk level for scene 1.
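The temporal reasoning of FIG. 17 — comparing the dump truck's position at two times — can be sketched as follows. Placing the excavator at the origin and treating increasing distance as "moving forward" are assumptions made only for this sketch.

```python
def third_stage(prev_pos, cur_pos):
    """Stand-in for the third neural network 401C: compare the dump truck's
    positions at two times and raise the risk of the matching scene.
    Positions are (east, north) offsets from the excavator at the origin."""
    def dist_from_excavator(p):
        return (p[0] ** 2 + p[1] ** 2) ** 0.5

    moved = dist_from_excavator(cur_pos) - dist_from_excavator(prev_pos)
    risk = {"scene1_forward": 0.0, "scene2_backward": 0.0}
    if moved > 0:    # truck drifting away: scene 1 (moving forward)
        risk["scene1_forward"] = min(1.0, moved)
    elif moved < 0:  # truck drifting closer: scene 2 (moving backward)
        risk["scene2_backward"] = min(1.0, -moved)
    return risk

# positions at time t1 and time t2 (toy values)
risk = third_stage((3.0, 4.0), (6.0, 8.0))
```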
- FIG. 18 is a diagram showing another configuration example of the excavator support system, and corresponds to FIG. 13.
- FIG. 18 shows a configuration in which each of the three excavators 100 (excavator 100A, excavator 100B, and excavator 100C) is wirelessly connected to the communication device 220 of the management device 200 via the communication device 90. Further, FIG. 18 shows a configuration in which the support device 300 including the display unit 310, the input unit 320, and the communication unit 330 is wirelessly connected to the communication device 220 of the management device 200 via the communication unit 330.
- the management device 200 constituting the excavator support system shown in FIG. 18 is different from the management device 200 shown in FIG. 13 in that the control device 210 mainly has an operation control command generation unit 2106. Further, each of the excavators 100 shown in FIG. 18 is different from the excavator 100 shown in FIG. 13 mainly in that the determination device 34 is omitted.
- The operation control command generation unit 2106, which is a function of the control device 210 in the management device 200, operates in the same way as the determination unit 344, which is a function of the determination device 34 in the excavator 100 shown in FIG. 13.
- Specifically, the operation control command generation unit 2106 can generate an operation control command for the operation control unit 304, which is a function of the controller 30 mounted on the excavator 100, based on the determination result 2101E of the determination unit 2101.
- the determination result 2101E is, for example, the same as the determination result 2101D.
- In this way, the operation control command generation unit 2106 in the management device 200 can individually operate, via wireless communication, the operation control unit 304 in the controller 30 mounted on each of the plurality of excavators 100 (excavator 100A, excavator 100B, and excavator 100C).
- Further, in response to input by an operator using the support device 300 via the input unit 320 of the support device 300, the operation control command generation unit 2106 in the management device 200 can display, for example, an image of the surroundings of the excavator on the display unit 310 of the support device 300. The operation control command generation unit 2106 can also display the determination result by the determination unit 2101 on the display unit 310.
- FIG. 19 is a diagram showing another configuration example of the image display unit 41 and the operation unit 42 of the display device D1.
- the image display unit 41 shows a state in which the input image of FIG. 5 is displayed.
- The image display unit 41 includes a date/time display area 41a, a traveling mode display area 41b, an attachment display area 41c, a fuel consumption display area 41d, an engine control state display area 41e, an engine operating time display area 41f, a cooling water temperature display area 41g, a fuel remaining amount display area 41h, a rotation speed mode display area 41i, a urea water remaining amount display area 41j, a hydraulic oil temperature display area 41k, an air conditioner operation state display area 41m, an image display area 41n, and a menu display area 41p.
- the driving mode display area 41b, the attachment display area 41c, the engine control state display area 41e, the rotation speed mode display area 41i, and the air conditioner operation state display area 41m are areas for displaying the setting state information which is information on the setting state of the excavator 100.
- the fuel consumption display area 41d, the engine operating time display area 41f, the cooling water temperature display area 41g, the fuel remaining amount display area 41h, the urea water remaining amount display area 41j, and the hydraulic oil temperature display area 41k are information related to the operating state of the excavator 100. This is an area for displaying certain operating status information.
- the date and time display area 41a is an area for displaying the current date and time.
- the travel mode display area 41b is an area for displaying the current travel mode.
- the attachment display area 41c is an area for displaying an image representing the currently attached attachment.
- the fuel consumption display area 41d is an area for displaying fuel consumption information calculated by the controller 30.
- the fuel consumption display area 41d includes an average fuel consumption display area 41d1 for displaying the lifetime average fuel consumption or the section average fuel consumption, and an instantaneous fuel consumption display area 41d2 for displaying the instantaneous fuel consumption.
- the engine control status display area 41e is an area for displaying the control status of the engine 11.
- the engine operating time display area 41f is an area for displaying the cumulative operating time of the engine 11.
- the cooling water temperature display area 41g is an area for displaying the current temperature state of the engine cooling water.
- the fuel remaining amount display area 41h is an area for displaying the remaining amount state of the fuel stored in the fuel tank.
- the rotation speed mode display area 41i is an area for displaying an image of the current rotation speed mode set by the engine rotation speed adjustment dial 75.
- the urea water remaining amount display area 41j is an area for displaying the remaining amount state of the urea water stored in the urea water tank as an image.
- the hydraulic oil temperature display area 41k is an area for displaying the temperature state of the hydraulic oil in the hydraulic oil tank.
- The air conditioner operation status display area 41m includes an outlet display area 41m1 that displays the current outlet position, an operation mode display area 41m2 that displays the current operation mode, a temperature display area 41m3 that displays the current set temperature, and an air volume display area 41m4 that displays the current set air volume.
- the image display area 41n is an area for displaying an image output by the space recognition device 70 or the like. In the example shown in FIG. 19, the image display area 41n displays an image captured by the front camera. A bird's-eye view image or a rear image may be displayed in the image display area 41n.
- the bird's-eye view image is, for example, a virtual viewpoint image generated by the control unit 40, and is generated based on the images acquired by each of the rear camera 70B, the left camera 70L, and the right camera 70R. Further, an excavator figure corresponding to the excavator 100 may be arranged in the central portion of the bird's-eye view image.
- the rear image is an image that reflects the space behind the excavator 100, and includes an image of a counterweight.
- the rear image is, for example, a real viewpoint image generated by the control unit 40, and is generated based on the image acquired by the rear camera 70B.
- the image display area 41n is a vertically long area, but the image display area 41n may be a horizontally long area.
- the menu display area 41p has tabs 41p1 to 41p7.
- tabs 41p1 to 41p7 are arranged on the left and right sides at the bottom of the image display unit 41 at intervals. Icons for displaying various information are displayed on the tabs 41p1 to 41p7.
- On tab 41p1, a menu detail item icon for displaying menu detail items is displayed.
- When tab 41p1 is selected by the operator, the icons displayed on tabs 41p2 to 41p7 are switched to the icons associated with the menu detail items.
- An icon for displaying information about the digital level is displayed on tab 41p4.
- When tab 41p4 is selected by the operator, the currently displayed image switches to a screen showing information about the digital level.
- a screen showing information about the digital level may be displayed by superimposing it on the currently displayed image or reducing the currently displayed image.
- An icon for displaying information related to computerized construction is displayed on tab 41p6.
- When tab 41p6 is selected by the operator, the currently displayed image switches to a screen showing information on computerized construction.
- a screen showing information on computerized construction may be displayed by superimposing it on the currently displayed image or reducing the currently displayed image.
- An icon for displaying information on the crane mode is displayed on tab 41p7.
- When tab 41p7 is selected by the operator, the currently displayed image switches to a screen showing information about the crane mode. However, a screen showing information on the crane mode may be displayed by superimposing it on the currently displayed image or by reducing the currently displayed image.
- Icons are not displayed on tabs 41p2, 41p3, and 41p5. Therefore, even if tabs 41p2, 41p3, and 41p5 are operated by the operator, the image displayed on the image display unit 41 does not change.
- icons displayed on the tabs 41p1 to 41p7 are not limited to the above examples, and icons for displaying other information may be displayed.
- The operation unit 42 is composed of one or more button-type switches with which the operator selects tabs 41p1 to 41p7, inputs settings, and so on.
- the operation unit 42 includes seven switches 42a1 to 42a7 arranged in the upper stage and seven switches 42b1 to 42b7 arranged in the lower stage.
- the switches 42b1 to 42b7 are arranged below the switches 42a1 to 42a7, respectively.
- The number, form, and arrangement of the switches of the operation unit 42 are not limited to the above examples; for example, the functions of a plurality of button-type switches may be integrated into one by a jog wheel, a jog switch, or the like, or the operation unit 42 may be separate from the display device D1. Further, the tabs 41p1 to 41p7 may be operated directly on a touch panel in which the image display unit 41 and the operation unit 42 are integrated.
- The switches 42a1 to 42a7 are arranged below the tabs 41p1 to 41p7, corresponding to the tabs 41p1 to 41p7 respectively, and function as switches for selecting the corresponding tabs. Since the switches 42a1 to 42a7 are arranged directly below their corresponding tabs, the operator can intuitively select the tabs 41p1 to 41p7. In the example shown in FIG. 19, for example, when the switch 42a1 is operated, tab 41p1 is selected, the menu display area 41p is changed from a one-stage display to a two-stage display, and the icons corresponding to the first menu are displayed on tabs 41p2 to 41p7.
- the size of the currently displayed image is reduced in response to the change of the menu display area 41p from the one-stage display to the two-stage display. At this time, since the size of the bird's-eye view image is maintained without being changed, the visibility when the operator confirms the surroundings of the excavator 100 does not deteriorate.
- the switch 42b1 is a switch for switching the captured image displayed in the image display area 41n. Each time the switch 42b1 is operated, the captured image displayed in the image display area 41n is configured to switch between, for example, a rear image, a left image, a right image, and a bird's-eye view image.
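The behavior of switch 42b1 described above is a simple cyclic selection, which can be sketched as follows (the class name and view labels are illustrative, not taken from the disclosure):

```python
from itertools import cycle

class ImageSwitcher:
    """Each press of switch 42b1 cycles the image shown in the image
    display area 41n through the available views."""
    VIEWS = ["rear", "left", "right", "birds_eye"]

    def __init__(self):
        self._order = cycle(self.VIEWS)
        self.current = next(self._order)   # initial view

    def press(self):
        self.current = next(self._order)
        return self.current

sw = ImageSwitcher()
views = [sw.press() for _ in range(4)]      # four presses wrap around
```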
- the switches 42b2 and 42b3 are switches that adjust the air volume of the air conditioner.
- the air volume of the air conditioner is reduced when the switch 42b2 is operated, and the air volume of the air conditioner is increased when the switch 42b3 is operated.
- the switch 42b4 is a switch for switching ON / OFF of the cooling / heating function. In the example of FIG. 4, it is configured so that the cooling / heating function is switched ON / OFF each time the switch 42b4 is operated.
- the switches 42b5 and 42b6 are switches that adjust the set temperature of the air conditioner. In the example of FIG. 4, the set temperature is lowered when the switch 42b5 is operated, and the set temperature is raised when the switch 42b6 is operated.
- the switch 42b7 is a switch that can switch the display of the engine operating time display area 41f.
- switches 42a2 to 42a6 and 42b2 to 42b6 are configured so that the numbers displayed on the respective switches or in the vicinity of the switches can be input. Further, the switches 42a3, 42a4, 42a5, and 42b4 are configured to be able to move the cursor to the left, up, right, and down, respectively, when the cursor is displayed on the menu screen.
- switches 42a1 to 42a7 and 42b1 to 42b7 are examples, and may be configured so that other functions can be executed.
- In this way, the detailed items of the first menu are displayed on tabs 41p2 to 41p7 while the predetermined image remains displayed. Therefore, the operator can check the detailed items of the first menu while checking the predetermined image.
- Further, the bird's-eye view image is displayed without its size being changed before and after tab 41p1 is selected.
- Therefore, the visibility when the operator checks the surroundings of the excavator 100 does not deteriorate.
- FIG. 20 is a schematic view showing an example of the construction system SYS.
- the construction system SYS includes a shovel 100, a management device 200, and a support device 300.
- the construction system SYS is configured to support construction by one or a plurality of excavators 100.
- the information acquired by the excavator 100 may be shared with the manager, other excavator operators, and the like through the construction system SYS.
- Each of the excavator 100, the management device 200, and the support device 300 constituting the construction system SYS may be one unit or a plurality of units.
- the construction system SYS includes one excavator 100, one management device 200, and one support device 300.
- the management device 200 is typically a fixed terminal device, for example, a server computer (so-called cloud server) installed in a management center or the like outside the construction site. Further, the management device 200 may be, for example, an edge server set at the construction site. Further, the management device 200 may be a portable terminal device (for example, a laptop computer terminal, a tablet terminal, or a mobile terminal such as a smartphone).
- the support device 300 is typically a mobile terminal device, for example, a laptop-type computer terminal, a tablet terminal, a smartphone, or the like carried by a worker or the like at a construction site.
- the support device 300 may be a mobile terminal carried by the operator of the excavator 100.
- the support device 300 may be a fixed terminal device.
- At least one of the management device 200 and the support device 300 may include a monitor and an operation device for remote control.
- the operator using the management device 200 or the support device 300 may operate the excavator 100 while using the remote control operation device.
- the operation device for remote control is communicably connected to the controller 30 mounted on the excavator 100 through a wireless communication network such as a short-range wireless communication network, a mobile phone communication network, or a satellite communication network.
- the various information images displayed on the display device D1 installed in the cabin 10 may also be displayed on a display device connected to at least one of the management device 200 and the support device 300.
- the image information representing the surrounding state of the excavator 100 may be generated based on the image captured by the imaging device (for example, the imaging device as the space recognition device 70).
- the administrator using the management device 200, the worker using the support device 300, and the like can remotely control the excavator 100 while checking its surroundings, and can make various settings of the excavator 100.
- the controller 30 of the excavator 100 may transmit, to at least one of the management device 200 and the support device 300, information regarding at least one of the time and place at which a predetermined switch for starting autonomous operation was pressed, the target trajectory used when the excavator 100 operates autonomously, and the trajectory actually followed by a predetermined portion during autonomous operation.
- the controller 30 may transmit the image captured by the space recognition device 70 to at least one of the management device 200 and the support device 300.
- the image may be a plurality of images captured during autonomous operation.
- the controller 30 may transmit, to at least one of the management device 200 and the support device 300, at least one of data on the operation content of the excavator 100 during autonomous operation, data on the posture of the excavator 100, and data on the posture of the excavation attachment.
- the manager who uses the management device 200 or the worker who uses the support device 300 can obtain information about the excavator 100 during autonomous operation.
- the type and position of the monitoring target outside the monitoring area of the excavator 100 are stored in the storage unit in chronological order.
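The publication does not give an implementation for this chronological store; the following is a minimal Python sketch, in which `MonitoringLog`, `DetectionRecord`, and all field names are hypothetical illustrations of keeping detected target types and positions in time order.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class DetectionRecord:
    timestamp: float               # seconds since some reference
    target_type: str               # e.g. "person", "truck" (illustrative labels)
    position: Tuple[float, float]  # site coordinates (x, y) in meters


class MonitoringLog:
    """Chronological store of monitoring-target detections."""

    def __init__(self) -> None:
        self._records: List[DetectionRecord] = []

    def add(self, record: DetectionRecord) -> None:
        # Keep the store in chronological order even if records arrive late.
        self._records.append(record)
        self._records.sort(key=lambda r: r.timestamp)

    def history(self, target_type: str) -> List[DetectionRecord]:
        # Time-ordered history for one target type.
        return [r for r in self._records if r.target_type == target_type]


log = MonitoringLog()
log.add(DetectionRecord(12.0, "person", (4.1, -2.0)))
log.add(DetectionRecord(10.5, "truck", (15.0, 3.2)))
log.add(DetectionRecord(11.0, "person", (4.5, -1.8)))
print([r.timestamp for r in log.history("person")])  # timestamps in chronological order
```

Such a time-ordered history would let a manager replay where and when each monitoring target appeared around the machine.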
- the construction system SYS makes it possible to share information about the excavator 100 with the manager, other excavator operators, and the like.
- the communication device mounted on the excavator 100 may be configured to transmit and receive information to and from the communication device T2 installed in the remote control room RC via wireless communication.
- the communication device mounted on the excavator 100 and the communication device T2 are configured to transmit and receive information via a fifth-generation mobile communication line (5G line), an LTE (Long Term Evolution) line, a satellite line, or the like.
- In the remote control room RC, a remote controller 30R, a sound output device A2, an indoor image pickup device C2, a display device RP, a communication device T2, and the like are installed. Further, a driver's seat DS, on which the operator OP who remotely controls the excavator 100 sits, is installed in the remote control room RC.
- the remote controller 30R is an arithmetic unit that executes various arithmetic operations.
- the remote controller 30R, like the controller 30, is composed of a microcomputer including a CPU and a memory. Various functions of the remote controller 30R are realized by the CPU executing a program stored in the memory.
- the sound output device A2 is configured to output sound.
- the sound output device A2 is a speaker, and is configured to reproduce the sound collected by the sound collecting device (not shown) attached to the excavator 100.
- the indoor imaging device C2 is configured to image the inside of the remote control room RC.
- the indoor image pickup device C2 is a camera installed inside the remote control room RC, and is configured to take an image of the operator OP seated in the driver's seat DS.
- the communication device T2 is configured to control wireless communication with the communication device attached to the excavator 100.
- the driver's seat DS has the same structure as the driver's seat installed in the cabin of a normal excavator. Specifically, the left console box is arranged on the left side of the driver's seat DS, and the right console box is arranged on the right side of the driver's seat DS. A left operation lever is arranged at the front end of the upper surface of the left console box, and a right operation lever is arranged at the front end of the upper surface of the right console box. Further, a traveling lever and a traveling pedal are arranged in front of the driver's seat DS. Further, an engine speed adjusting dial 75 is arranged at the center of the upper surface of the right console box. Each of the left operating lever, the right operating lever, the traveling lever, the traveling pedal, and the engine speed adjusting dial 75 constitutes the operating device 26A.
- the operation device 26A is provided with an operation sensor 29A for detecting the operation content of the operation device 26A.
- the operation sensor 29A is, for example, an inclination sensor that detects the inclination angle of the operation lever, an angle sensor that detects the swing angle around the swing axis of the operation lever, and the like.
- the operation sensor 29A may be composed of other sensors such as a pressure sensor, a current sensor, a voltage sensor, or a distance sensor.
- the operation sensor 29A outputs information regarding the detected operation content of the operation device 26A to the remote controller 30R.
- the remote controller 30R generates an operation signal based on the received information, and transmits the generated operation signal to the excavator 100.
- the operation sensor 29A may be configured to generate an operation signal. In this case, the operation sensor 29A may output the operation signal to the communication device T2 without going through the remote controller 30R.
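As a rough illustration of the path from operation-sensor reading to transmitted operation signal, the sketch below normalizes hypothetical lever tilt angles into unit commands and serializes them for transmission; the scaling constant, actuator names, and JSON payload format are assumptions for this example, not details from the publication.

```python
import json

# Hypothetical scaling: lever tilt angle (degrees) -> normalized command in [-1.0, 1.0]
MAX_TILT_DEG = 25.0


def make_operation_signal(lever_angles_deg: dict) -> str:
    """Build an operation signal from raw operation-sensor readings.

    lever_angles_deg maps an actuator name (e.g. "boom", "arm", "swing")
    to the detected lever tilt angle in degrees.
    """
    commands = {}
    for actuator, angle in lever_angles_deg.items():
        # Clamp to the mechanical range, then normalize so the excavator
        # side receives a unit command regardless of lever geometry.
        clamped = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, angle))
        commands[actuator] = round(clamped / MAX_TILT_DEG, 3)
    return json.dumps({"type": "operation", "commands": commands})


# A half-tilted boom lever and an over-range swing lever reading:
signal = make_operation_signal({"boom": 12.5, "swing": -30.0})
print(signal)
```

In the arrangement described above, this conversion could live either in the remote controller 30R or, in the variant where the operation sensor 29A generates the signal itself, directly at the sensor before the signal is handed to the communication device T2.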
- the display device RP is configured to display information on the surrounding conditions of the excavator 100.
- the display device RP is a multi-display composed of nine monitors arranged in three rows and three columns, and is configured to display the state of the space in front of, to the left of, and to the right of the excavator 100.
- Each monitor is a liquid crystal monitor, an organic EL monitor, or the like.
- the display device RP may be composed of one or a plurality of curved surface monitors, or may be composed of a projector.
- the display device RP may be a display device that can be worn by the operator OP.
- the display device RP is a head-mounted display, and may be configured so that information can be transmitted and received to and from the remote controller 30R by wireless communication.
- the head-mounted display may be connected to the remote controller by wire.
- the head-mounted display may be a transmissive head-mounted display or a non-transparent head-mounted display.
- the head-mounted display may be a monocular head-mounted display or a binocular head-mounted display.
- the display device RP is configured to display an image that allows the operator OP in the remote control room RC to visually recognize the surroundings of the excavator 100. That is, the display device RP displays images so that the operator can check the situation around the excavator 100 as if the operator were in the cabin 10 of the excavator 100, even though the operator is in the remote control room RC.
- the construction system SYS is configured to support construction by the excavator 100.
- the construction system SYS has a communication device CD and a control device CTR that communicate with the excavator 100.
- the control device CTR is configured to determine a dangerous situation based on the information acquired by the information acquisition device E1.
- the control device CTR may be configured to estimate the construction status after a lapse of a predetermined time based on the information acquired by the information acquisition device E1, and to determine a dangerous situation based on the information regarding the estimated construction status.
- the control device CTR may be configured to determine the degree of danger based on the estimated construction status, and to determine that a dangerous situation will occur when the degree of danger exceeds a predetermined value.
- the control device CTR may be configured to determine the scene at the construction site based on the information acquired by the information acquisition device E1.
- the control device CTR may be configured to estimate the scene of the construction site based on the schedule information after a predetermined time.
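One way to read this scene estimation is as a lookup of schedule information at a future time. The sketch below is a hypothetical illustration; the schedule entries, scene labels, and the `estimate_scene` helper are invented for the example and are not taken from the publication.

```python
from datetime import datetime, timedelta

# Hypothetical schedule: (start, end, scene) entries for one work day.
schedule = [
    (datetime(2020, 3, 30, 8, 0), datetime(2020, 3, 30, 12, 0), "excavation"),
    (datetime(2020, 3, 30, 13, 0), datetime(2020, 3, 30, 15, 0), "dump_loading"),
    (datetime(2020, 3, 30, 15, 0), datetime(2020, 3, 30, 17, 0), "crane_work"),
]


def estimate_scene(now: datetime, ahead_minutes: float) -> str:
    """Estimate the construction-site scene a predetermined time from now."""
    t = now + timedelta(minutes=ahead_minutes)
    for start, end, scene in schedule:
        if start <= t < end:
            return scene
    return "idle"  # no scheduled work at that time


# 80 minutes after 11:50 falls inside the afternoon loading slot:
print(estimate_scene(datetime(2020, 3, 30, 11, 50), 80))  # "dump_loading"
```

An estimated scene of this kind could then feed the dangerous-situation determination, since which hazards matter (e.g. dump trucks entering, a crane swinging) depends on the scene.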
- the excavator 100 includes a lower traveling body 1, an upper rotating body 3 rotatably mounted on the lower traveling body 1, a non-volatile storage device NM provided on the upper rotating body 3, an information acquisition device E1 that acquires information about construction, and a controller 30 as a control device that controls a notification device that is at least one of a display device D1 and a sound output device D2. The controller 30 is configured to operate the notification device when it determines that a dangerous situation will occur, based on the information acquired by the information acquisition device E1 and the information stored in the danger information database DB, which is a database in the non-volatile storage device NM.
- the controller 30 may be configured to estimate the construction status after a lapse of a predetermined time based on the information acquired by the information acquisition device E1, and to activate the notification device when it determines that a dangerous situation will occur, based on the information regarding the estimated construction status and the danger information database DB stored in the non-volatile storage device NM. With this configuration, the excavator 100 can prevent a dangerous situation from actually occurring.
- the controller 30 may be configured to determine the degree of danger based on the estimated construction status and the danger information database DB stored in the non-volatile storage device NM, and to determine that a dangerous situation will occur when the degree of danger exceeds a predetermined value.
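The danger-degree logic described above can be illustrated as follows: estimate a nearby target's position after a predetermined time, look up a danger degree in a toy danger-information table, and trigger notification when the degree exceeds a threshold. All names, values, and the linear-extrapolation model below are assumptions made for this sketch, not details from the publication.

```python
DANGER_THRESHOLD = 0.7  # hypothetical "predetermined value"

# Toy danger-information "database": (target type, zone) -> danger degree.
danger_db = {
    ("person", "inside_swing_radius"): 0.9,
    ("truck", "inside_swing_radius"): 0.6,
    ("person", "outside_swing_radius"): 0.3,
}


def estimate_status(current_pos, velocity, dt):
    """Linearly extrapolate a target's position dt seconds ahead."""
    return (current_pos[0] + velocity[0] * dt,
            current_pos[1] + velocity[1] * dt)


def danger_degree(target_type, predicted_pos, swing_radius=8.0):
    # Distance from the machine origin decides the zone used for lookup.
    dist = (predicted_pos[0] ** 2 + predicted_pos[1] ** 2) ** 0.5
    zone = "inside_swing_radius" if dist < swing_radius else "outside_swing_radius"
    return danger_db.get((target_type, zone), 0.0)


def should_notify(target_type, pos, vel, dt=3.0):
    """Activate the notification device if the predicted situation is dangerous."""
    predicted = estimate_status(pos, vel, dt)
    return danger_degree(target_type, predicted) > DANGER_THRESHOLD


# A worker 12 m away walking toward the machine at 2 m/s:
print(should_notify("person", (12.0, 0.0), (-2.0, 0.0)))  # True
```

The point of acting on the *predicted* position rather than the current one matches the stated aim: the notification fires before the dangerous situation actually arises.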
- the excavator 100 may display information on the dangerous situation determined to occur on the display device D1. This is to more accurately convey to the operator the details of the dangerous situation that may occur.
- the information regarding the construction may include an image of the surroundings of the excavator 100, may include information regarding the construction plan, and may include information regarding the material arrangement.
- the construction system is a construction system that supports the creation of a construction plan.
- an information input device MD3 is provided, and a controller MD4 serves as a control device that controls at least one of the display device MD1 and the sound output device MD2.
- the controller MD4 is configured to operate the notification device when it determines that a dangerous situation will occur, based on the information acquired by the information input device MD3 and the danger information database DB, which is a database stored in the non-volatile storage device NM. With this configuration, the construction system can determine whether a dangerous situation will occur at the stage when the construction plan is created, making it possible to prevent the dangerous situation from actually occurring.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Structural Engineering (AREA)
- Civil Engineering (AREA)
- Mining & Mineral Resources (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- Mechanical Engineering (AREA)
- Evolutionary Biology (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Component Parts Of Construction Machinery (AREA)
- Operation Control Of Excavators (AREA)
Abstract
Description
Claims (17)
- 1. A shovel comprising: a lower traveling body; an upper swing body swingably mounted on the lower traveling body; a storage device provided on the upper swing body; an information acquisition device that acquires information regarding construction; and a control device, wherein the control device determines a dangerous situation based on the information acquired by the information acquisition device.
- 2. A shovel comprising: a lower traveling body; an upper swing body swingably mounted on the lower traveling body; a storage device provided on the upper swing body; an information acquisition device that acquires information regarding construction; and a control device, wherein the control device estimates a construction status after a lapse of a predetermined time based on the information acquired by the information acquisition device, and determines a dangerous situation based on information regarding the estimated construction status.
- 3. The shovel according to claim 2, wherein the control device determines a degree of danger based on the estimated construction status, and determines that a dangerous situation will occur when the degree of danger exceeds a predetermined value.
- 4. The shovel according to claim 1, wherein information regarding a dangerous situation determined to occur is displayed on a display device.
- 5. The shovel according to claim 1, wherein the information regarding construction includes an image of the surroundings of the shovel.
- 6. The shovel according to claim 1, wherein the information regarding construction includes information regarding a construction plan.
- 7. The shovel according to claim 1, wherein the information regarding construction includes information regarding material arrangement.
- 8. A construction system comprising: a storage device; an information acquisition device that acquires information regarding construction; and a control device, wherein the control device determines a dangerous situation based on the information acquired by the information acquisition device.
- 9. The construction system according to claim 8, wherein information regarding a dangerous situation determined to occur is displayed on a display device.
- 10. The construction system according to claim 8, wherein the information regarding construction includes an image of the surroundings of a shovel.
- 11. The construction system according to claim 8, wherein the information regarding construction includes information regarding a construction plan.
- 12. The construction system according to claim 8, wherein the information regarding construction includes information regarding material arrangement.
- 13. A shovel comprising: a lower traveling body; an upper swing body swingably mounted on the lower traveling body; a storage device provided on the upper swing body; an information acquisition device that acquires information regarding construction; and a control device, wherein the control device determines a scene of a construction site based on the information acquired by the information acquisition device.
- 14. The shovel according to claim 13, wherein the control device estimates the scene of the construction site based on schedule information for after a predetermined time.
- 15. A construction system comprising: a storage device; an information acquisition device that acquires information regarding construction; and a control device, wherein the control device determines a scene of a construction site based on the information acquired by the information acquisition device.
- 16. The construction system according to claim 15, wherein the control device estimates the scene of the construction site based on schedule information for after a predetermined time.
- 17. A shovel comprising: a lower traveling body; an upper swing body swingably mounted on the lower traveling body; a storage device provided on the upper swing body; and a control device, wherein the control device determines whether an operation is permissible according to the type and position of an object identified based on an output of a space recognition device.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021512140A JPWO2020204007A1 (ja) | 2019-03-30 | 2020-03-30 | |
EP20783644.6A EP3951089A4 (en) | 2019-03-30 | 2020-03-30 | EXCAVATOR AND CONSTRUCTION SYSTEM |
KR1020217031860A KR20210140737A (ko) | 2019-03-30 | 2020-03-30 | 쇼벨 및 시공시스템 |
CN202080024901.8A CN113631779B (zh) | 2019-03-30 | 2020-03-30 | 挖土机及施工系统 |
US17/449,317 US20220018096A1 (en) | 2019-03-30 | 2021-09-29 | Shovel and construction system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-069472 | 2019-03-30 | ||
JP2019069472 | 2019-03-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/449,317 Continuation US20220018096A1 (en) | 2019-03-30 | 2021-09-29 | Shovel and construction system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020204007A1 true WO2020204007A1 (ja) | 2020-10-08 |
Family
ID=72667854
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/014696 WO2020204007A1 (ja) | 2019-03-30 | 2020-03-30 | ショベル及び施工システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220018096A1 (ja) |
EP (1) | EP3951089A4 (ja) |
JP (1) | JPWO2020204007A1 (ja) |
KR (1) | KR20210140737A (ja) |
CN (1) | CN113631779B (ja) |
WO (1) | WO2020204007A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220136215A1 (en) * | 2019-07-17 | 2022-05-05 | Sumitomo Construction Machinery Co., Ltd. | Work machine and assist device to assist in work with work machine |
WO2023054245A1 (ja) * | 2021-09-30 | 2023-04-06 | 株式会社小松製作所 | 作業機械のための表示システムおよび作業機械のための表示方法 |
JP7365738B1 (ja) * | 2023-04-12 | 2023-10-20 | サン・シールド株式会社 | クレーン操作シミュレーションシステム、及び、クレーン操作シミュレーション方法 |
WO2024075670A1 (ja) * | 2022-10-03 | 2024-04-11 | 日立建機株式会社 | 作業機械 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020203596A1 (ja) * | 2019-04-04 | 2020-10-08 | 株式会社小松製作所 | 作業機械を含むシステム、コンピュータによって実行される方法、学習済みの姿勢推定モデルの製造方法、および学習用データ |
US11278361B2 (en) | 2019-05-21 | 2022-03-22 | Verb Surgical Inc. | Sensors for touch-free control of surgical robotic systems |
US11504193B2 (en) * | 2019-05-21 | 2022-11-22 | Verb Surgical Inc. | Proximity sensors for surgical robotic arm manipulation |
JP7503370B2 (ja) * | 2019-07-01 | 2024-06-20 | 株式会社小松製作所 | 学習済みの作業分類推定モデルの製造方法、コンピュータによって実行される方法、および作業機械を含むシステム |
WO2023153722A1 (ko) * | 2022-02-08 | 2023-08-17 | 현대두산인프라코어(주) | 투명 디스플레이 기반의 건설기계 작업 보조 방법 및 장치 |
US20240018746A1 (en) * | 2022-07-12 | 2024-01-18 | Caterpillar Inc. | Industrial machine remote operation systems, and associated devices and methods |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008248613A (ja) * | 2007-03-30 | 2008-10-16 | Hitachi Constr Mach Co Ltd | 作業機械周辺監視装置 |
JP2014183500A (ja) | 2013-03-19 | 2014-09-29 | Sumitomo Heavy Ind Ltd | 作業機械用周辺監視装置 |
US20140343820A1 (en) * | 2011-12-13 | 2014-11-20 | Volvo Construction Equipment Ab | All-round hazard sensing device for construction apparatus |
JP5667638B2 (ja) * | 2010-10-22 | 2015-02-12 | 日立建機株式会社 | 作業機械の周辺監視装置 |
JP2019002242A (ja) * | 2017-06-19 | 2019-01-10 | 株式会社神戸製鋼所 | 転倒防止装置及び作業機械 |
US20190069472A1 (en) | 2015-06-16 | 2019-03-07 | Ju Hyuk Baek | Smart seeder |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6662880B2 (en) * | 2000-06-09 | 2003-12-16 | Gedalyahu Manor | Traveling rolling digger for sequential hole drilling and for producing sequential cultivated spots in soil |
WO2016101003A1 (en) * | 2014-12-24 | 2016-06-30 | Cqms Pty Ltd | A system and method of estimating fatigue in a lifting member |
US10344450B2 (en) * | 2015-12-01 | 2019-07-09 | The Charles Machine Works, Inc. | Object detection system and method |
JP6819462B2 (ja) * | 2017-05-30 | 2021-01-27 | コベルコ建機株式会社 | 作業機械 |
-
2020
- 2020-03-30 EP EP20783644.6A patent/EP3951089A4/en active Pending
- 2020-03-30 CN CN202080024901.8A patent/CN113631779B/zh active Active
- 2020-03-30 KR KR1020217031860A patent/KR20210140737A/ko unknown
- 2020-03-30 WO PCT/JP2020/014696 patent/WO2020204007A1/ja unknown
- 2020-03-30 JP JP2021512140A patent/JPWO2020204007A1/ja active Pending
-
2021
- 2021-09-29 US US17/449,317 patent/US20220018096A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008248613A (ja) * | 2007-03-30 | 2008-10-16 | Hitachi Constr Mach Co Ltd | 作業機械周辺監視装置 |
JP5667638B2 (ja) * | 2010-10-22 | 2015-02-12 | 日立建機株式会社 | 作業機械の周辺監視装置 |
US20140343820A1 (en) * | 2011-12-13 | 2014-11-20 | Volvo Construction Equipment Ab | All-round hazard sensing device for construction apparatus |
JP2014183500A (ja) | 2013-03-19 | 2014-09-29 | Sumitomo Heavy Ind Ltd | 作業機械用周辺監視装置 |
US20190069472A1 (en) | 2015-06-16 | 2019-03-07 | Ju Hyuk Baek | Smart seeder |
JP2019002242A (ja) * | 2017-06-19 | 2019-01-10 | 株式会社神戸製鋼所 | 転倒防止装置及び作業機械 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3951089A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220136215A1 (en) * | 2019-07-17 | 2022-05-05 | Sumitomo Construction Machinery Co., Ltd. | Work machine and assist device to assist in work with work machine |
WO2023054245A1 (ja) * | 2021-09-30 | 2023-04-06 | 株式会社小松製作所 | 作業機械のための表示システムおよび作業機械のための表示方法 |
WO2024075670A1 (ja) * | 2022-10-03 | 2024-04-11 | 日立建機株式会社 | 作業機械 |
JP7365738B1 (ja) * | 2023-04-12 | 2023-10-20 | サン・シールド株式会社 | クレーン操作シミュレーションシステム、及び、クレーン操作シミュレーション方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020204007A1 (ja) | 2020-10-08 |
US20220018096A1 (en) | 2022-01-20 |
EP3951089A4 (en) | 2022-09-14 |
EP3951089A1 (en) | 2022-02-09 |
CN113631779A (zh) | 2021-11-09 |
CN113631779B (zh) | 2024-06-18 |
KR20210140737A (ko) | 2021-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020204007A1 (ja) | ショベル及び施工システム | |
JP7472034B2 (ja) | ショベル、ショベル支援システム | |
WO2020196874A1 (ja) | 建設機械、支援システム | |
WO2019189203A1 (ja) | ショベル | |
WO2021025123A1 (ja) | ショベル、情報処理装置 | |
US20240026654A1 (en) | Construction machine and support system of construction machine | |
EP3885495B1 (en) | Excavator and excavator control device | |
US20230009234A1 (en) | Information communications system for construction machine and machine learning apparatus | |
US20230008338A1 (en) | Construction machine, construction machine management system, and machine learning apparatus | |
WO2024111596A1 (ja) | 作業機械、情報処理装置、プログラム | |
JP2023093109A (ja) | 建設機械、及び情報処理装置 | |
JP2023063989A (ja) | ショベル | |
JP2023063990A (ja) | ショベル | |
JP2023063991A (ja) | ショベル | |
JP2023063993A (ja) | ショベル | |
JP2023063992A (ja) | ショベル | |
JP2023063988A (ja) | ショベル |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20783644 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021512140 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20217031860 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020783644 Country of ref document: EP Effective date: 20211102 |