WO2022209434A1 - Construction management system, data processing device, and construction management method - Google Patents
Construction management system, data processing device, and construction management method
- Publication number
- WO2022209434A1 (application PCT/JP2022/007262)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- machine
- work machine
- data
- unit
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/2054—Fleet management
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/205—Remotely operated machines, e.g. unmanned vehicles
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/08—Construction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- the present disclosure relates to a construction management system, a data processing device, and a construction management method.
- a remote control system as disclosed in Patent Document 1 is known in the technical field related to construction management systems.
- the purpose of this disclosure is to suppress a decline in construction efficiency at construction sites.
- a construction management system according to the present disclosure comprises an output unit for displaying a remotely operable work machine on a display device, a selection data acquisition unit for acquiring machine selection data indicating that the work machine has been specified, and a remote operation permitting unit for permitting the start of remote operation of the work machine based on the machine selection data.
- FIG. 1 is a schematic diagram showing a construction management system according to an embodiment.
- FIG. 2 is a perspective view showing the hydraulic excavator according to the embodiment.
- FIG. 3 is a perspective view showing the crawler dump according to the embodiment.
- FIG. 4 is a functional block diagram showing the construction management system according to the embodiment.
- FIG. 5 is a flow chart showing a construction management method according to the embodiment.
- FIG. 6 is a diagram showing current terrain data according to the embodiment.
- FIG. 7 is a diagram showing a display example of the display device according to the embodiment.
- FIG. 8 is a diagram showing a display example of the display device according to the embodiment.
- FIG. 9 is a diagram showing a display example of the display device according to the embodiment.
- FIG. 10 is a diagram showing a display example of the display device according to the embodiment.
- FIG. 11 is a diagram showing a display example of the display device according to the embodiment.
- FIG. 12 is a block diagram of a computer system according to the embodiment.
- FIG. 1 is a schematic diagram showing a construction management system 1 according to an embodiment.
- a construction management system 1 manages construction at a construction site 2 .
- a plurality of work machines 20 operate at the construction site 2 .
- the work machine 20 includes a hydraulic excavator 21 , a bulldozer 22 , and a crawler dump 23 .
- the work machine 20 may include a wheel loader.
- a person WM is present at the construction site 2 .
- a worker who works at the construction site 2 is exemplified as the person WM.
- the person WM may be a supervisor who manages construction.
- the person WM may be a spectator.
- the construction management system 1 includes a management device 3, a server 4, an information terminal 5, a detection device 9, and a detection device 12.
- the management device 3 includes a computer system located at the construction site 2.
- the management device 3 is supported by the travel device 6 .
- the management device 3 can travel on the construction site 2 by the travel device 6 .
- Examples of the traveling device 6 include an aerial work vehicle, a truck, and a traveling robot.
- the server 4 is a data processing device including a computer system.
- the server 4 may be located at the construction site 2 or may be located at a remote location from the construction site 2 .
- the information terminal 5 is a computer system placed at a remote location 13 away from the construction site 2 .
- a personal computer or a smart phone is exemplified as the information terminal 5 .
- the management device 3 , the server 4 and the information terminal 5 communicate with each other via the communication system 10 .
- Examples of the communication system 10 include the Internet, a local area network (LAN), a mobile phone communication network, and a satellite communication network.
- the detection device 9 detects the construction site 2.
- the detection device 9 acquires three-dimensional data of the construction site 2 .
- Examples of objects to be detected by the detection device 9 include the topography of the construction site 2 and objects present on the construction site 2 .
- An object includes one or both of a movable body and a stationary body.
- the work machine 20 and the person WM are exemplified as movable bodies.
- Lumber or construction materials are exemplified as stationary bodies.
- the three-dimensional data acquired by the detection device 9 includes image data of the construction site 2.
- the image data acquired by the detection device 9 may be moving image data or still image data.
- a stereo camera is exemplified as the detection device 9 .
- the detection device 9 may include a monocular camera and a three-dimensional measurement device.
- the three-dimensional measurement device may be a laser sensor (LIDAR: Light Detection and Ranging) that detects an object by emitting laser light, an infrared sensor that detects an object by emitting infrared light, or a radar sensor (RADAR: Radio Detection and Ranging) that detects an object by emitting radio waves.
- the detection device 9 is mounted on the flying object 8.
- an unmanned aerial vehicle (UAV) is exemplified as the flying object 8 .
- the detection device 9 detects the construction site 2 from above the construction site 2 .
- the flying object 8 and the management device 3 are connected by a cable 7.
- Detection data of the detection device 9 is transmitted to the management device 3 via the cable 7 .
- the detection data of the detection device 9 transmitted to the management device 3 is transmitted to the server 4 via the communication system 10 .
- the management device 3 includes a power supply or generator.
- the management device 3 can supply power to the flying object 8 via the cable 7 .
- the detection device 12 detects the construction site 2. Similar to the detection device 9 , the detection device 12 acquires three-dimensional data of the construction site 2 .
- the three-dimensional data acquired by the detection device 12 includes image data of the construction site 2 .
- the detection device 12 is mounted on the flying object 11.
- the detection device 12 detects the construction site 2 from above the construction site 2 . No cable is connected to the flying object 11 . Detection data of the detection device 12 is transmitted to the server 4 via the communication system 10 .
- the flying object 11 can fly higher than the flying object 8.
- the flying object 11 can fly over a wider range than the flying object 8.
- the detection device 12 can detect a wider area of the construction site 2 than the detection device 9 can.
- the detection device 12 detects the entire construction site 2 .
- a detection device 9 detects a portion of the construction site 2 .
- FIG. 2 is a perspective view showing the hydraulic excavator 21 according to the embodiment.
- the hydraulic excavator 21 includes a traveling body 24, a revolving body 25 supported by the traveling body 24, a work implement 26 supported by the revolving body 25, and a hydraulic cylinder 27 for driving the work implement 26.
- the running body 24 has a pair of crawler belts.
- the hydraulic excavator 21 can travel on the construction site 2 by the traveling body 24 .
- the revolving body 25 revolves while being supported by the traveling body 24 .
- Work implement 26 includes a boom 26A connected to revolving body 25, an arm 26B connected to boom 26A, and a bucket 26C connected to arm 26B.
- the hydraulic cylinders 27 include a boom cylinder 27A that operates the boom 26A, an arm cylinder 27B that operates the arm 26B, and a bucket cylinder 27C that operates the bucket 26C.
- the hydraulic excavator 21 operates. Examples of the motions of the hydraulic excavator 21 include the traveling motion of the traveling body 24, the turning motion of the revolving body 25, the raising and lowering motions of the boom 26A, the digging and dumping motions of the arm 26B, and the digging and dumping motions of the bucket 26C.
- FIG. 3 is a perspective view showing the crawler dump 23 according to the embodiment.
- the crawler dump 23 has a traveling body 28 , a vehicle body 29 and a dump body 30 .
- the running body 28 has a pair of crawler belts.
- the crawler dump 23 can travel on the construction site 2 by means of the traveling body 28 .
- the dump body 30 is a member on which cargo is loaded.
- the hydraulic excavator 21 can load cargo onto the dump body 30 using the work implement 26 .
- the dump body 30 can be lifted by a hoist cylinder (not shown) to discharge the cargo.
- the crawler dump 23 operates. Examples of the operations of the crawler dump 23 include the traveling operation of the traveling body 28 and the raising and lowering operations of the dump body 30 .
- FIG. 4 is a functional block diagram showing the construction management system 1 according to the embodiment.
- the construction management system 1 includes the flying object 8, the flying object 11, the management device 3 placed at the construction site 2, the server 4, and the information terminal 5 placed at the remote location 13 away from the construction site 2.
- the flying object 8 has a position sensor 14, an attitude sensor 15, and a detection device 9.
- the position sensor 14 detects the position of the flying object 8.
- the position sensor 14 detects the position of the flying object 8 using a global navigation satellite system (GNSS).
- the position sensor 14 includes a GNSS receiver (GNSS sensor) and detects the position of the flying object 8 in the global coordinate system.
- the attitude sensor 15 detects the attitude of the flying object 8 .
- an inertial measurement unit (IMU) is exemplified as the attitude sensor 15 .
- the flying object 11 has a position sensor 16, an attitude sensor 17, and a detection device 12.
- the position sensor 16 includes a GNSS receiver and detects the position of the flying object 11 in the global coordinate system.
- the attitude sensor 17 detects the attitude of the flying object 11 .
- an inertial measurement unit (IMU) is exemplified as the attitude sensor 17 .
- the information terminal 5 has an input device 51 and a display device 52 .
- the input device 51 is operated by an administrator at the remote location 13.
- the input device 51 generates input data based on the administrator's operation. Examples of the input device 51 include a touch panel, a computer keyboard, a mouse, and operation buttons.
- the input device 51 may be a non-contact input device including an optical sensor, or may be a voice input device.
- the display device 52 displays display data.
- An administrator at the remote location 13 can confirm the display data displayed on the display device 52 .
- the display device 52 is exemplified by a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OELD).
- the server 4 includes a current terrain data creation unit 41, a detection data acquisition unit 42, a recognition unit 43, a reflection unit 44, an output unit 45, a selection data acquisition unit 46, a remote operation permitting unit 47, a remote operation command unit 48, a current terrain storage unit 61, a work type storage unit 62, a work target storage unit 63, and a three-dimensional model storage unit 64.
- the current topography data creation unit 41 creates current topography data indicating the current topography of the construction site 2 where the work machine 20 operates.
- the current topography data is three-dimensional topography data representing the current topography of the construction site 2 .
- Existing terrain includes reference terrain before a given construction is started.
- the current topography data creation unit 41 creates current topography data based on the detection data of the detection device 12 . As described above, the detection device 12 detects the entire construction site 2 . The current topography data indicates the current topography of the construction site 2 as a whole.
- the detection device 12 detects the construction site 2 with a first frequency.
- the detection device 12 detects the construction site 2 only once, for example, before the start of work for the day.
- the detection device 12 detects the current landform indicating the reference landform before the start of the predetermined construction at the construction site 2 .
- the current topography data creation unit 41 creates current topography data with a first frequency.
- the current landform data created by the current landform data creation unit 41 is stored in the current landform storage unit 61 .
- the current terrain data stored in the current terrain storage unit 61 is updated at a first frequency.
- the timing at which the detection device 12 detects the construction site 2 is not limited to before the start of work of the day, and may be any timing.
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 that has detected the work machine 20 and the construction area around the work machine 20 .
- the detection device 9 detects a portion of the construction site 2.
- the portion of the construction site 2 detected by the detection device 9 includes a work machine 20 that performs work.
- the portion of the construction site 2 detected by the detection device 9 includes the construction area around the work machine 20 that performs the work.
- An example of a construction area detected by the detection device 9 is a construction area in which construction is being performed by the work machine 20 .
- the detection device 9 detects the construction site 2 with a second frequency higher than the first frequency.
- the detection device 9 continuously detects the construction site 2, for example, for a certain period of time.
- the detection device 9 continuously detects the construction site 2, for example, only while the working machine 20 is working. Note that the detection device 9 may always detect the construction site 2 .
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 at a second frequency.
- the detection data acquired by the detection data acquisition unit 42 is updated more frequently than the current terrain data.
- the recognition unit 43 recognizes objects on the construction site 2 based on the detection data acquired by the detection data acquisition unit 42 . As described above, the work machine 20 and the person WM are exemplified as objects.
- the recognition unit 43 recognizes objects using artificial intelligence (AI) that analyzes input data by an algorithm and outputs output data.
- the input data is the image data of the construction site 2 acquired by the detection device 9, and the output data is the object.
- the recognition unit 43 has a learning model generated by learning the feature amount of the object.
- the learning model includes a learning model generated by learning the feature amount of work machine 20 and a learning model generated by learning the feature amount of person WM.
- machine learning is performed using a learning image including an object as teacher data, thereby generating a learning model that takes as input the feature amount of the object and outputs the object.
- the recognition unit 43 can recognize the object by inputting the image data of the construction site 2 acquired by the detection device 9 into the learning model.
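- the recognition step above can be sketched in Python as follows. This is a minimal illustration, not the patent's implementation: the patent specifies no model architecture or API, so the model here is a hypothetical stub that stands in for the trained network, and all names (`Detection`, `recognize`, `stub_model`) are assumptions.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One recognized object: its class label and image position."""
    label: str        # e.g. "work_machine" or "person"
    position: tuple   # (row, col) pixel coordinates in the image


def stub_model(image):
    """Hypothetical stand-in for the learned model: flags any pixel
    whose value exceeds a threshold as a work machine (purely
    illustrative, not a real detector)."""
    return [Detection("work_machine", (r, c))
            for r, row in enumerate(image)
            for c, v in enumerate(row) if v > 0.9]


def recognize(image, model=stub_model):
    """Recognition unit 43 (sketch): feed construction-site image data
    into the learning model and return the recognized objects."""
    return model(image)


image = [[0.1, 0.95], [0.2, 0.3]]   # toy 2x2 "image"
objects = recognize(image)           # one detection at pixel (0, 1)
```

In a real system the stub would be replaced by inference on the learning model generated from the teacher data described above.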
- the reflection unit 44 generates reflection data in which the detection data acquired by the detection data acquisition unit 42 is reflected in the current terrain data.
- the current topographical data is three-dimensional topographical data of the entire construction site 2 detected by the detecting device 12 at the first frequency.
- the detection data includes three-dimensional terrain data of a construction area that is part of the construction site 2 detected by the detection device 9 at the second frequency.
- the detection data acquired by the detection data acquisition unit 42 are sequentially updated at the second frequency.
- the reflection unit 44 sequentially reflects the detection data acquired by the detection data acquisition unit 42 on a part of the current terrain data at a second frequency. At least part of the current terrain data is sequentially updated with the detection data acquired by the detection data acquisition unit 42 .
- the detection data acquired by the detection data acquisition unit 42 includes updated topography data indicating the updated topography of the construction area.
- Post-update topography includes the latest topography during or after construction.
- the updated terrain data is updated with a second frequency.
- the reflection unit 44 reflects the updated terrain data acquired by the detection data acquisition unit 42 on the current terrain data stored in the current terrain storage unit 61 .
- the detection data acquired by the detection data acquisition unit 42 includes non-terrain data indicating objects on the construction site 2 .
- the non-terrain data is three-dimensional data representing objects on the construction site 2 .
- Non-terrain data is updated at a second frequency.
- the reflection unit 44 reflects the non-terrain data acquired by the detection data acquisition unit 42 on the current terrain data stored in the current terrain storage unit 61 .
- the reflection unit 44 reflects, in the current topography data, the updated topography data of the construction area obtained by removing the non-topography data from the detection data acquired by the detection data acquisition unit 42.
- Non-terrain data representing an object is recognized by the recognition unit 43 .
- the reflecting unit 44 removes the non-terrain data from the detection data to generate updated terrain data.
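- the remove-and-merge behavior of the reflection unit 44 can be sketched as follows. The grid representation (a dict of cell coordinates to elevations) and all function names are assumptions for illustration; the patent does not specify a data structure.

```python
def reflect(current_terrain, detection, non_terrain_cells):
    """Sketch of reflection unit 44: remove non-terrain cells (objects)
    from the detection data, then overwrite the corresponding part of
    the current terrain with the updated terrain data."""
    # updated terrain data = detection data minus non-terrain data
    updated = {cell: z for cell, z in detection.items()
               if cell not in non_terrain_cells}
    merged = dict(current_terrain)
    merged.update(updated)   # only the detected construction area changes
    return merged


# Toy site: three grid cells at elevation 5.0 m.
current = {(0, 0): 5.0, (0, 1): 5.0, (1, 0): 5.0}
# The detection device 9 re-measures two cells; (0, 0) was excavated.
detected = {(0, 0): 4.2, (0, 1): 5.9}
# A work machine stands on (0, 1), so that measurement is non-terrain.
objects = {(0, 1)}
reflected = reflect(current, detected, objects)
```

Cell (0, 0) takes the new elevation, the object-occupied cell keeps its old terrain value, and the undetected cell is untouched, matching the second-frequency partial update described above.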
- the reflection unit 44 generates reflection data in which the object is reflected.
- the reflection data generated by the reflection unit 44 includes the updated terrain data of the construction area.
- the reflected data generated by the reflecting unit 44 includes the object recognized by the recognizing unit 43.
- the reflected data generated by the reflecting unit 44 includes the three-dimensional model of the working machine 20 recognized by the recognizing unit 43 .
- the three-dimensional model of work machine 20 includes computer graphics (CG) of work machine 20 .
- the three-dimensional model of the work machine 20 is a three-dimensional model that represents the work machine 20 and is constructed, for example, for each part of the work machine 20, such as the traveling body 24, the revolving body 25, and the work implement 26.
- a three-dimensional model of work machine 20 is stored in advance in three-dimensional model storage unit 64 .
- the reflected data generated by the reflecting unit 44 includes a symbol image indicating the position of the object.
- a symbol image is image data that emphasizes the position of an object.
- a reflecting unit 44 generates a symbol image based on the recognition result of the recognizing unit 43 .
- the output unit 45 outputs the reflection data generated by the reflection unit 44 to the information terminal 5 .
- the output unit 45 transmits the reflection data to the information terminal 5 via the communication system 10 .
- the output unit 45 causes the display device 52 to display the reflection data generated by the reflection unit 44 .
- the output unit 45 causes the display device 52 to display the work machine 20 that can be remotely controlled.
- the output unit 45 causes the display device 52 to display the reflection data reflecting the updated landform data together with the work machine 20 .
- the output unit 45 causes the display device 52 to display the three-dimensional model of the work machine 20 .
- the selection data acquisition unit 46 acquires machine selection data indicating that a work machine 20 to be remotely operated has been specified from among the work machines 20 displayed on the display device 52 .
- the administrator at the remote location 13 can operate the input device 51 to specify the work machine 20 to be remotely controlled.
- Machine selection data is generated by operating the input device 51 .
- the selection data acquisition unit 46 acquires from the input device 51 machine selection data indicating that the remote-controlled work machine 20 has been specified.
- the remote operation permitting unit 47 permits starting remote operation of the work machine 20 based on the machine selection data acquired by the selection data acquisition unit 46 .
- the remote control command unit 48 transmits a remote control command to the work machine 20 permitted to start remote control.
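- the flow through the output unit 45, selection data acquisition unit 46, remote operation permitting unit 47, and remote operation command unit 48 can be sketched as follows. The class, the permission rule, and the machine identifiers are assumptions; the patent states only the data passed between the units.

```python
class RemoteOperationFlow:
    """Sketch of units 45-48: display, selection, permission, command."""

    def __init__(self, displayed_machines):
        # Output unit 45: machines shown on the display device 52.
        self.displayed = set(displayed_machines)
        self.permitted = set()

    def acquire_selection(self, machine_id):
        """Selection data acquisition unit 46: machine selection data
        indicates that a displayed machine has been specified."""
        if machine_id not in self.displayed:
            raise ValueError("machine not displayed, so not selectable")
        return machine_id

    def permit(self, machine_id):
        """Remote operation permitting unit 47: permit the start of
        remote operation based on the machine selection data."""
        self.permitted.add(machine_id)

    def command(self, machine_id, action):
        """Remote operation command unit 48: commands go only to
        machines permitted to start remote operation."""
        if machine_id not in self.permitted:
            raise PermissionError("remote operation has not been permitted")
        return (machine_id, action)


flow = RemoteOperationFlow(["excavator-21", "crawler-dump-23"])
selected = flow.acquire_selection("excavator-21")
flow.permit(selected)
cmd = flow.command(selected, "swing_left")
```

The guard in `command` captures the ordering the patent describes: a remote control command is transmitted only after the permitting unit has allowed remote operation of that machine.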
- a work type corresponding to each of the plurality of work machines 20 is stored in the work type storage unit 62 .
- the work target storage unit 63 stores work targets corresponding to each of a plurality of work types.
- a three-dimensional model of the working machine 20 is stored in the three-dimensional model storage unit 64 .
- the flying object 8 is equipped with the position sensor 14 and the attitude sensor 15 .
- a position sensor 14 can detect the position of the detection device 9 .
- the attitude sensor 15 can detect the attitude of the detection device 9 .
- the attitude includes, for example, a roll angle, a pitch angle, and a yaw angle. The yaw angle may be calculated based on the detection data of two GNSS sensors provided on the flying object 8 .
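- the yaw-from-two-GNSS-sensors calculation can be sketched as follows. This is a minimal example under assumed conventions (a local east-north plane and yaw measured counterclockwise from east); the patent does not specify the axes or the antenna layout.

```python
import math


def yaw_from_gnss(front_antenna, rear_antenna):
    """Yaw angle (radians) of the rear-to-front antenna baseline,
    given two GNSS antenna positions as (east, north) coordinates."""
    dx = front_antenna[0] - rear_antenna[0]  # east component
    dy = front_antenna[1] - rear_antenna[1]  # north component
    return math.atan2(dy, dx)


# Baseline pointing northeast -> yaw of pi/4 under these conventions.
yaw = yaw_from_gnss((1.0, 1.0), (0.0, 0.0))
```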
- a detection device 9 detects three-dimensional data of the construction site 2 .
- the three-dimensional data of the construction site 2 includes relative distances and relative positions between the detection device 9 and each of a plurality of detection points defined as detection targets.
- the recognition unit 43 and the reflection unit 44 can calculate the position, in the global coordinate system, of the three-dimensional data to be detected based on, for example, the detection data of the position sensor 14, the detection data of the attitude sensor 15, and the detection data of the detection device 9. Further, the recognition unit 43 and the reflection unit 44 can perform a predetermined coordinate transformation to calculate the position of the three-dimensional data to be detected in, for example, a local coordinate system defined for the construction site 2.
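- one common form of this calculation (a sketch, not the patent's specified transform) rotates each sensor-relative detection point by the sensor attitude and translates it by the sensor position. The ZYX (yaw-pitch-roll) rotation order is an assumption, since the patent names only the three angles.

```python
import math


def rot_zyx(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]


def to_global(sensor_pos, attitude, point_rel):
    """Global position of a detection point given relative to the
    sensor: p_global = sensor_pos + R(attitude) @ point_rel."""
    R = rot_zyx(*attitude)
    return tuple(
        sensor_pos[i] + sum(R[i][j] * point_rel[j] for j in range(3))
        for i in range(3)
    )


# With zero attitude the transform reduces to a pure translation:
p = to_global((10.0, 20.0, 30.0), (0.0, 0.0, 0.0), (1.0, 2.0, -3.0))
```

A further transform of the same shape (rotation plus translation between coordinate frames) would convert such global positions into the local coordinate system defined for the construction site 2.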
- the detection target of the detection device 9 includes updated landforms and objects.
- the current topography data creation unit 41 can calculate the position, in a local coordinate system defined for the construction site 2, of the three-dimensional data to be detected based on, for example, the detection data of the position sensor 16, the detection data of the attitude sensor 17, and the detection data of the detection device 12.
- the detection target of the detection device 12 includes the current terrain.
- the recognition unit 43 can recognize the presence or absence of an object and the position of the object based on the detection data acquired by the detection data acquisition unit 42 .
- when recognizing the position of the person WM, the recognition unit 43 recognizes the person WM based on the two-dimensional image acquired by the monocular camera of the detection device 9 .
- the recognition unit 43 also acquires the three-dimensional topographical data of the construction site 2 .
- the recognition unit 43 can recognize the position of the person WM at the construction site 2 from the person WM recognized in the two-dimensional image.
- for example, the three-dimensional position of the person WM can be calculated by image processing, based on the principle of triangulation, of the person WM recognized in the two-dimensional images acquired by the two monocular cameras constituting the stereo camera; by associating this position with the three-dimensional topographical data of the construction site 2, the position of the person WM at the construction site 2 can be recognized.
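- the triangulation principle for the stereo-camera case can be sketched as follows, assuming an idealized rectified pin-hole stereo pair; the focal length and baseline values are illustrative assumptions, as the patent states only the principle.

```python
def depth_from_disparity(f_px, baseline_m, x_left_px, x_right_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d,
    where d = x_left - x_right is the disparity in pixels, f is the
    focal length in pixels, and B is the camera baseline in metres."""
    d = x_left_px - x_right_px
    if d <= 0:
        raise ValueError("no positive disparity; point not triangulable")
    return f_px * baseline_m / d


# Person WM seen at x = 420 px in the left image and x = 400 px in
# the right image (hypothetical values):
z = depth_from_disparity(f_px=800.0, baseline_m=0.5,
                         x_left_px=420.0, x_right_px=400.0)
# z == 20.0 (metres)
```

The resulting depth, combined with the camera pose, yields the three-dimensional position that is then associated with the topographical data of the construction site 2.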
- alternatively, the recognition unit 43 may recognize the position of the person WM at the construction site 2 by calculating the three-dimensional position of the person WM, recognized based on the two-dimensional image, using the detection values of the laser sensor or the radar sensor and associating it with the three-dimensional topography data of the construction site 2.
- the recognition unit 43 may recognize the position of the person WM from the three-dimensional data of the construction site 2 .
- the recognition unit 43 may recognize the position of the person WM based on the detection data of the position sensor possessed by the person WM.
- the recognition unit 43 can recognize the position of the person WM based on the detection data of the GNSS sensor of the smartphone.
- the position sensor possessed by the person WM may be a beacon.
- the position of the person WM at the construction site 2 may be estimated by geometric calculation from the coordinates of the person WM on the two-dimensional image, the three-dimensional position and attitude of the detection device 9, and the three-dimensional terrain data.
- the recognition unit 43 can recognize the operation of the working machine 20 based on the detection data acquired by the detection data acquisition unit 42 .
- the hydraulic excavator 21 can operate the traveling body 24 , the revolving body 25 , and the work implement 26 .
- the crawler dump 23 can operate the traveling body 28 and the dump body 30 .
- the reflecting unit 44 can move the three-dimensional model in synchronization with the work machine 20 .
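- keeping the three-dimensional model in step with the recognized motion can be sketched as follows. The per-part pose representation and all names are assumptions for illustration; the part names follow the hydraulic excavator 21 described above.

```python
class ExcavatorModel:
    """Sketch of the per-part state of the three-dimensional model
    (travel position, swing angle, boom/arm/bucket angles)."""

    def __init__(self):
        self.pose = {"position": (0.0, 0.0), "swing": 0.0,
                     "boom": 0.0, "arm": 0.0, "bucket": 0.0}

    def sync(self, recognized_pose):
        """Reflection unit 44 (sketch): overwrite the model pose with
        the pose recognized from the detection data, so the displayed
        model moves in synchronization with the real machine."""
        self.pose.update(recognized_pose)
        return self.pose


model = ExcavatorModel()
# The recognition unit reports a swing and boom motion:
model.sync({"swing": 0.5, "boom": -0.2})
```

Because each part (traveling body, revolving body, work implement) has its own entry, only the parts recognized as moving need to be updated on each detection cycle.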
- FIG. 5 is a flow chart showing a construction management method according to the embodiment.
- the current topography data creation unit 41 creates current topography data of the construction site 2 where the work machine 20 operates (step S1).
- FIG. 6 is a diagram showing current terrain data according to the embodiment.
- the current topography data is three-dimensional topography data representing the current topography of the entire construction site 2 .
- the current topography data creation unit 41 acquires the detection data of the detection device 12 .
- the current topography data creation unit 41 creates current topography data based on the detection data of the detection device 12 .
- the current topography data created by the current topography data creation unit 41 is stored in the current topography storage unit 61 .
- the output unit 45 can transmit the current terrain data to the information terminal 5 .
- the display device 52 can display the current terrain as shown in FIG.
- a detection device 9 mounted on the flying object 8 detects the construction site 2 .
- the detection device 9 detects, for example, the work machine 20 that is performing work and the construction area around the work machine 20 .
- Detection data of the detection device 9 is transmitted to the management device 3 via the cable 7 .
- the management device 3 transmits detection data of the detection device 9 to the server 4 .
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 that has detected the work machine 20 and the construction area around the work machine 20 (step S2).
- the recognition unit 43 recognizes that an object exists at the construction site 2 based on the detection data acquired by the detection data acquisition unit 42 (step S3).
- the reflection unit 44 reflects the detection data acquired by the detection data acquisition unit 42 in the current terrain data stored in the current terrain storage unit 61 to generate reflection data.
- the detection data acquired by the detection data acquisition unit 42 includes updated terrain data of the construction area indicating a portion of the construction site 2 .
- the reflecting unit 44 reflects the updated topographical data on the current topographical data.
- the detection data including the excavation location is acquired by the detection data acquisition unit 42 as updated terrain data.
- the updated terrain data includes excavation locations excavated by the hydraulic excavator 21 .
- the reflecting unit 44 synthesizes a part of the current terrain data and the updated terrain data.
- the reflecting unit 44 applies the updated terrain data to a part of the current terrain data. As a result, reflection data is generated in which the updated terrain data including the excavation location is reflected.
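The reflection of updated terrain data into the current terrain data can be sketched as a grid merge. The heightmap representation and the function name below are illustrative assumptions, not the patent's data format:

```python
def reflect_updated_terrain(current, updated, row0, col0):
    """Overwrite part of the current terrain grid with updated terrain data.

    current     : 2D list of heights for the whole construction site
    updated     : 2D list of heights for the detected construction area
    (row0, col0): where the construction area sits inside the site grid
    Returns the reflection data; the current terrain data is not mutated.
    """
    reflected = [row[:] for row in current]   # copy so the stored data survives
    for i, patch_row in enumerate(updated):
        for j, height in enumerate(patch_row):
            reflected[row0 + i][col0 + j] = height
    return reflected
```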
- the detection data acquired by the detection data acquisition unit 42 includes updated terrain data of the construction area.
- the reflecting unit 44 generates reflected data by reflecting the updated topographical data on the current topographical data.
- the reflection unit 44 generates reflection data in which the object is reflected.
- the output unit 45 outputs the reflection data generated by the reflection unit 44 to the information terminal 5 .
- the display device 52 displays the reflected data transmitted from the output unit 45 .
- FIG. 7 is a diagram showing a display example of the display device 52 according to the embodiment.
- the display device 52 can display the reflected data.
- the reflected data includes image data of the construction area. For example, as construction progresses in the construction area, at least a portion of the topography of the construction site 2 changes, as shown in FIG. In the example shown in FIG. 7, the excavation work of the hydraulic excavator 21 creates an excavation location in the construction area.
- the detection data acquired by the detection data acquisition unit 42 includes updated terrain data of the construction area.
- the reflecting unit 44 reflects changes in the topography of the construction site 2 on the current topography in real time.
- the reflecting unit 44 reflects the updated topographical data on the current topographical data in real time.
- the display device 52 can display the reflection data reflecting the excavation location. By checking the reflected data displayed on the display device 52, the administrator of the remote location 13 can recognize the progress of construction at the construction site 2 in real time.
- the reflected data includes three-dimensional terrain data in which the updated terrain data is reflected in part of the current terrain data.
- the reflected data includes the three-dimensional model of work machine 20 recognized by recognition unit 43 .
- the three-dimensional model of the work machine 20 includes a three-dimensional model 21D of the hydraulic excavator 21, a three-dimensional model 23D of the crawler dump 23, a three-dimensional model 3D of the management device 3, and a three-dimensional model 8D of the flying object 8.
- the recognition unit 43 calculates the three-dimensional position and posture of the work machine 20 based on the detection data of the detection device 12 acquired by the detection data acquisition unit 42 .
- the attitude of the work machine 20 includes the inclination of the revolving body 25 with respect to the horizontal plane and the revolving angle of the revolving body 25 with respect to the traveling body 24 .
- the attitude of work machine 20 includes the angle of work machine 26 .
- the angle of work implement 26 includes the angle of boom 26A, the angle of arm 26B, and the angle of bucket 26C.
- the detection data of the detection device 12 includes images acquired by the stereo cameras.
- the recognition unit 43 can calculate the three-dimensional position and orientation of the work machine 20 based on the detection data of the detection device 12 .
- the reflection unit 44 adjusts the three-dimensional model stored in the three-dimensional model storage unit 64 so that the three-dimensional model is placed at the position, and takes the posture, calculated by the recognition unit 43, and generates the reflection data.
- the three-dimensional model of the work machine 20 is constructed part by part, for example for the traveling body 24 , the revolving body 25 , and the work machine 26 .
- based on the angle of the boom 26A, the angle of the arm 26B, the angle of the bucket 26C, and the turning angle of the revolving body 25, the reflecting unit 44 adjusts the angles of the corresponding parts of the three-dimensional model.
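The part-wise angle adjustment behaves like a forward-kinematics chain over the swing, boom, arm, and bucket. The sketch below assumes illustrative link lengths and absolute pitch angles from horizontal; none of these values come from the description:

```python
import math

def bucket_tip_position(swing_deg, boom_deg, arm_deg, bucket_deg,
                        boom_len=5.7, arm_len=2.9, bucket_len=1.5):
    """Forward kinematics for a part-wise excavator model (illustrative).

    swing_deg rotates the whole working machine about the vertical axis;
    boom/arm/bucket angles are absolute pitches from horizontal.
    Returns (x, y, z) of the bucket tip relative to the swing center.
    """
    # planar chain in the boom plane: accumulate radial reach and height
    r = z = 0.0
    for ang, length in ((boom_deg, boom_len), (arm_deg, arm_len),
                        (bucket_deg, bucket_len)):
        a = math.radians(ang)
        r += length * math.cos(a)
        z += length * math.sin(a)
    # rotate the boom plane by the swing angle
    s = math.radians(swing_deg)
    return (r * math.cos(s), r * math.sin(s), z)
```

Applying the recognized angles to the model parts, then rendering, yields the synchronized three-dimensional model described above.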
- the output unit 45 outputs the reflection data including the three-dimensional model generated by the reflection unit 44 to the display device 52 .
- the recognition unit 43 may calculate the position of the work machine 20 based on the detection data of one GNSS sensor. Further, the recognition unit 43 may calculate the inclination and the turning angle of the revolving body 25 based on the detection data of each of the two GNSS sensors. Further, when a stroke sensor is provided for each of boom cylinder 27A, arm cylinder 27B, and bucket cylinder 27C, the recognition unit 43 may calculate the angle of the work machine 26 based on the detection data of the stroke sensors.
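The turning angle obtained from two GNSS sensors can be sketched as a heading computed from the two antenna positions. The coordinate convention (east/north components, heading clockwise from north) is an assumption, since the description does not fix one:

```python
import math

def swing_heading(antenna_front, antenna_rear):
    """Estimate the revolving body's heading from two GNSS antennas.

    antenna_front/antenna_rear: (east, north) positions of the two GNSS
    sensors mounted on the revolving body.
    Returns the heading in degrees, clockwise from north, in [0, 360).
    """
    de = antenna_front[0] - antenna_rear[0]   # east offset between antennas
    dn = antenna_front[1] - antenna_rear[1]   # north offset between antennas
    return math.degrees(math.atan2(de, dn)) % 360.0
```

The turning angle of the revolving body relative to the traveling body would then be the difference between this heading and the traveling body's heading.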
- the recognition unit 43 can calculate the position of the flying object 8 based on the detection data of at least one of the two GNSS sensors mounted on the flying object 8.
- the recognition unit 43 can also calculate the inclination of the flying object 8 based on the detection data of the two GNSS sensors.
- the recognition unit 43 may calculate the inclination of the flying object 8 based on the detection data of the inertial measurement unit.
- the recognition unit 43 stores a three-dimensional model of the flying object 8 .
- the recognition unit 43 adjusts the stored three-dimensional model so that the three-dimensional model is arranged at the calculated position and has the calculated inclination, and displays it on the display device 52 .
- the reflected data includes a symbol image that indicates the position of the object.
- the reflected data includes a symbol image 31 indicating the position of the hydraulic excavator 21, a symbol image 32 indicating the position of the crawler dump 23, a symbol image 33 indicating the position of the management device 3, and a symbol image 34 indicating the position of the person WM.
- the reflected data may include at least one of the symbol image 31, the symbol image 32, the symbol image 33, and the symbol image 34.
- the reflection unit 44 can generate the symbol image 31 based on the position of the hydraulic excavator 21 recognized by the recognition unit 43 .
- the symbol image 31 is frame-shaped (box-shaped) surrounding the three-dimensional model 21D.
- the reflection unit 44 generates reflection data so that the three-dimensional model 21D of the hydraulic excavator 21 and the symbol image 31 are superimposed and displayed.
- the symbol image 31 may be displayed adjacent to the three-dimensional model 21D.
- the symbol image 31 may have any shape as long as it can emphasize the hydraulic excavator 21. Note that the symbol image 31 may be displayed while the three-dimensional model 21D is not displayed.
- the reflection unit 44 can generate the symbol image 32 based on the position of the crawler dump 23 recognized by the recognition unit 43 .
- the reflecting unit 44 can generate the symbol image 33 based on the position of the management device 3 recognized by the recognizing unit 43 .
- the reflection unit 44 generates the symbol image 34 based on the position of the person WM recognized by the recognition unit 43 .
- each of the hydraulic excavator 21 and the crawler dump 23 is a work machine 20 that can be remotely controlled.
- the output unit 45 causes the display device 52 to display the work machine 20 that can be remotely controlled.
- the remotely controllable work machine 20 displayed on the display device 52 includes the hydraulic excavator 21 (first work machine) and the crawler dump 23 (second work machine) (step S4).
- the administrator of the remote location 13 operates the input device 51 to designate the work machine 20 to be remotely controlled from the work machines 20 displayed on the display device 52 .
- FIG. 8 is a diagram showing a display example of the display device 52 according to the embodiment.
- the input device 51 includes a mouse that operates a pointer that moves on the display screen of the display device 52 .
- the administrator designates a three-dimensional model 21D showing the hydraulic excavator 21 via the input device 51.
- the symbol image 31 is highlighted.
- the display form of the 3D model 21D may be changed.
- the color of the three-dimensional model 21D may be changed, or the line type of the frame showing the three-dimensional model 21D may be changed.
- the selection data acquisition unit 46 acquires machine selection data indicating that the hydraulic excavator 21 has been specified (step S5).
- the output unit 45 causes the display device 52 to display the work type corresponding to the specified hydraulic excavator 21 (step S6).
- the work type of the work machine 20 refers to the type of operation or work that the work machine 20 can perform.
- a work type corresponding to the work machine 20 is predetermined and stored in the work type storage unit 62 .
- the work type storage unit 62 stores work types corresponding to each of the plurality of work machines 20 .
- the work type storage unit 62 stores work types for each working machine, such as a work type corresponding to the hydraulic excavator 21, a work type corresponding to the bulldozer 22, and a work type corresponding to the crawler dump 23.
- the output unit 45 outputs a signal for displaying the work type on the display device 52 based on the machine selection data acquired by the selection data acquisition unit 46 and the data stored in the work type storage unit 62 .
- the machine selection data indicating that the hydraulic excavator 21 has been specified is acquired by the selection data acquisition unit 46 .
- the output unit 45 causes the display device 52 to display the work type corresponding to the excavator 21 among the work types of the plurality of work machines 20 stored in the work type storage unit 62 .
- “movement”, “turning”, “loading”, “excavation”, “recovery”, and “manual operation” are displayed as work types of the hydraulic excavator 21.
- the work machine 20 is capable of automatic operation.
- “Movement” refers to an operation in which the traveling body 24 automatically travels.
- “Turn” refers to an operation in which the turning body 25 automatically turns.
- “Loading” refers to an operation in which the work machine 26 automatically loads a load onto a loading object.
- “Excavation” refers to an operation in which the work machine 26 automatically excavates an object to be excavated.
- “Collecting” refers to an operation in which the work machine 26 automatically gathers the load dropped on the ground into one place.
- “Manual operation” means remote operation of the hydraulic excavator 21 using an operation lever provided at the remote location 13 .
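The lookup performed by the work type storage unit 62 in step S6 can be sketched as a mapping from machine to work types. The dictionary keys and the crawler dump's entries are illustrative assumptions; only the hydraulic excavator's work types are named in the description:

```python
# Hypothetical contents of the work type storage unit 62.
WORK_TYPES = {
    "hydraulic_excavator": ["movement", "turning", "loading",
                            "excavation", "collecting", "manual operation"],
    "crawler_dump": ["movement", "manual operation"],   # illustrative
}

def work_types_for(machine_selection):
    """Return the work types to display for the machine named in the
    machine selection data (step S6); empty if the machine is unknown."""
    return WORK_TYPES.get(machine_selection, [])
```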
- the administrator of the remote location 13 operates the input device 51 to specify the type of work to be performed by the hydraulic excavator 21 from the plurality of work types displayed on the display device 52 .
- FIG. 9 is a diagram showing a display example of the display device 52 according to the embodiment.
- the operator operates the input device 51 to designate "loading".
- the selection data acquisition unit 46 acquires type selection data indicating that "loading" is specified as the type of work to be performed by the hydraulic excavator 21 (step S7).
- the output unit 45 causes the display device 52 to display the work target corresponding to the specified "loading" (step S8).
- the work target of the work machine 20 refers to the target on which the work machine 20 performs work.
- a work target corresponding to the work type is determined in advance and stored in the work target storage unit 63 .
- the work target storage unit 63 stores work targets corresponding to each of a plurality of work types.
- the work target storage unit 63 stores a work target corresponding to "movement", a work target corresponding to "turning", a work target corresponding to "loading", a work target corresponding to "excavation", and a work target corresponding to "collecting".
- a “destination” is exemplified as a work target corresponding to "movement".
- a “turn destination” is exemplified as a work target corresponding to "turning".
- work targets corresponding to "loading" include “crawler dump” and “loading destination”.
- an “excavation destination” is exemplified as a work target corresponding to "excavation".
- a “collection destination” is exemplified as a work target corresponding to "collecting".
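Similarly, the work target storage unit 63 maps each work type to its predetermined work targets (step S8). The function name is illustrative; the target lists follow the examples given above:

```python
# Hypothetical contents of the work target storage unit 63.
WORK_TARGETS = {
    "movement": ["destination"],
    "turning": ["turn destination"],
    "loading": ["crawler dump", "loading destination"],
    "excavation": ["excavation destination"],
    "collecting": ["collection destination"],
}

def work_targets_for(type_selection):
    """Return the work targets to display for the work type named in the
    type selection data (step S8); empty if the type is unknown."""
    return WORK_TARGETS.get(type_selection, [])
```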
- the output unit 45 outputs a signal for displaying the work target on the display device 52 based on the type selection data acquired by the selection data acquisition unit 46 and the data stored in the work target storage unit 63 .
- the selection data acquisition unit 46 acquires type selection data indicating that "loading" has been specified.
- the output unit 45 causes the display device 52 to display the work target corresponding to “loading” among the plurality of work targets stored in the work target storage unit 63 .
- FIG. 10 is a diagram showing a display example of the display device 52 according to the embodiment.
- the administrator of the remote site 13 operates the input device 51 to specify a three-dimensional model 23D representing the crawler dump 23 from among the plurality of work targets displayed on the display device 52 .
- the symbol image 32 is highlighted.
- the selection data acquisition unit 46 acquires target selection data indicating that the crawler dump 23 has been specified as the work target (step S9).
- after specifying the hydraulic excavator 21, the work type, and the work target, the administrator at the remote site 13 starts remote control of the hydraulic excavator 21.
- FIG. 11 is a diagram showing a display example of the display device 52 according to the embodiment.
- the display device 52 displays the "execute” symbol.
- the operator operates the “execute” symbol via the input device 51 .
- the remote operation permitting unit 47 determines whether or not to permit the start of remote operation based on the designated work type and the surrounding conditions of the hydraulic excavator 21 (step S10).
- when the surrounding conditions of the hydraulic excavator 21 are inappropriate for the designated work type, the remote operation permitting unit 47 does not permit the start of remote operation.
- the recognition unit 43 can recognize obstacles based on the detection data acquired by the detection data acquisition unit 42 .
- the recognition unit 43 can recognize obstacles by using, for example, artificial intelligence. If a recognized obstacle exists around the work machine 20, the remote operation permitting unit 47 does not permit the start of remote operation. Further, when a recognized obstacle is between the work machine 20 and the work target, the remote operation permitting unit 47 may not permit the start of remote operation.
- the recognition unit 43 can recognize the loading target, which is the work target, by using artificial intelligence, for example.
- when the loading target is not recognized, the remote operation permitting unit 47 does not permit the start of remote operation.
- the recognition unit 43 can recognize the current load amount of the dump body 30 and the load amount of the work machine 20 by using, for example, artificial intelligence. When the recognition unit 43 recognizes, based on the current load amount of the dump body 30 and the load amount of the work machine 20, a situation in which the dump body 30 cannot be loaded, the remote operation permitting unit 47 does not permit the start of remote operation.
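The permission decision of step S10 can be sketched as a predicate over the recognized surroundings. The `surroundings` dictionary and its keys are hypothetical summaries of what the recognition unit 43 produces from the detection data:

```python
def permit_remote_operation(work_type, surroundings):
    """Decide whether to permit the start of remote operation (step S10).

    surroundings: hypothetical recognition results — obstacles near the
    machine or on the path to the work target, and load amounts for a
    loading job. Returns (permitted, reason).
    """
    if surroundings.get("obstacle_near_machine"):
        return False, "obstacle around the work machine"
    if surroundings.get("obstacle_on_path"):
        return False, "obstacle between the machine and the work target"
    if work_type == "loading":
        free = (surroundings.get("dump_body_capacity", 0)
                - surroundings.get("dump_body_load", 0))
        if surroundings.get("machine_load", 0) > free:
            return False, "dump body cannot take the load"
    return True, "permitted"
```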
- if the start of remote control is not permitted in step S10 (step S10: No), remote control of the hydraulic excavator 21 is not performed, and no remote operation command is output from the remote operation command unit 48 .
- when the start of remote operation is permitted in step S10 (step S10: Yes), a remote operation command is transmitted from the remote operation command unit 48 to the hydraulic excavator 21 (step S11).
- the hydraulic excavator 21 starts loading the crawler dump 23 based on the remote control command.
- the reflected data displayed on the display device 52 includes the three-dimensional model 21D that moves in synchronization with the hydraulic excavator 21.
- the recognition unit 43 can recognize the motion of the work machine 20 .
- the reflection unit 44 can move the three-dimensional model 21D on the display device 52 based on the motion of the hydraulic excavator 21 recognized by the recognition unit 43 so as to synchronize with the motion of the hydraulic excavator 21 .
- the three-dimensional model 21D of the excavator 21 moves in synchronization with the excavator 21 on the display device 52 .
- FIG. 11 shows an example in which the boom of the three-dimensional model 21D is raised in synchronization with the boom 26A of the hydraulic excavator 21 being raised.
- the remote operation permission unit 47 determines whether or not the remote operation has ended (step S12).
- if it is determined in step S12 that the remote operation has ended (step S12: Yes), the construction management method according to the embodiment ends. If it is determined in step S12 that the remote operation has not ended (step S12: No), the processing from step S2 to step S11 is repeated.
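The overall flow of steps S1 to S12 can be sketched as a loop with injected stand-ins for the units of the server 4; all names and signatures below are illustrative, not the patent's API:

```python
def run_construction_management(create_terrain, acquire, recognize,
                                display_machines, get_selections,
                                permit, send_command, operation_ended):
    """One possible sequencing of steps S1-S12 described above."""
    create_terrain()                                   # step S1
    while True:
        data = acquire()                               # step S2
        recognize(data)                                # step S3
        display_machines()                             # step S4
        machine, work_type, target = get_selections()  # steps S5-S9
        if permit(work_type, data):                    # step S10
            send_command(machine, work_type, target)   # step S11
        if operation_ended():                          # step S12
            return
```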
- the administrator at the remote site 13 operates the input device 51 to switch the remotely controlled work machine 20 from the hydraulic excavator 21 to the crawler dump truck 23 .
- the administrator of the remote site 13 designates the three-dimensional model 23D of the crawler dump 23 displayed on the display device 52 via the input device 51 .
- the selection data acquiring unit 46 acquires machine selection data indicating that the crawler dump truck 23 has been specified (step S5).
- the work type corresponding to the crawler dump 23 is displayed on the display device 52 (step S6).
- the selection data acquisition unit 46 acquires target selection data indicating that the "first unloading position" has been specified (step S9).
- the "first unloading position” is specified, it is determined whether or not to permit the start of remote control (step S10). For example, if there is an obstacle between the loading position and the first unloading position, the start of remote control is not permitted.
- the crawler dump truck 23 leaves the loading position and starts traveling toward the first dumping position.
- a remote operation command may be transmitted to another work machine 20 during the period after the remote operation command is transmitted from the remote operation command unit 48 to the work machine 20 and before the remote operation of that work machine 20 is completed.
- after the remote control of the work machine 20 is completed, another work machine 20 displayed on the display device 52 may be designated, and the work type and work target of that work machine 20 may be designated. Further, it may be determined whether or not to permit the start of remote control of the other work machine 20 .
- FIG. 12 is a block diagram showing a computer system 1000 according to an embodiment.
- the server 4 described above includes a computer system 1000 .
- the computer system 1000 has a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including a non-volatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory), a storage 1003, and an interface 1004 including an input/output circuit.
- the functions of the server 4 described above are stored in the storage 1003 as computer programs.
- the processor 1001 reads a computer program from the storage 1003, develops it in the main memory 1002, and executes the above-described processing according to the program. Note that the computer program may be distributed to the computer system 1000 via a network.
- the computer program or the computer system 1000 can cause the display device 52 to display the remotely operable work machine 20, obtain machine selection data indicating that the work machine 20 has been designated, and permit initiation of remote control of the work machine 20 based on the machine selection data.
- the remotely controllable working machine 20 is displayed on the display device 52 .
- the administrator of the remote location 13 can designate the working machine 20 to be remotely controlled via the input device 51 .
- the manager of the remote site 13 can arbitrarily select the work machine 20 to be remotely controlled according to the situation of the construction site 2 .
- machine selection data is generated indicating that the work machine 20 to be remotely controlled has been designated. Once the machine selection data is generated, remote control of the designated work machine 20 is permitted to begin. If work machine 20 is not specified, remote control of work machine 20 is not permitted.
- the manager of the remote site 13 can arbitrarily select the work machine 20 that he wishes to remotely control according to the situation of the construction site 2 . Therefore, a decrease in construction efficiency at the construction site 2 is suppressed.
- if the hydraulic excavator 21 is specified, the start of remote control of the hydraulic excavator 21 is permitted. If the crawler dump 23 is specified, the start of remote control of the crawler dump 23 is permitted.
- a manager at the remote location 13 can remotely operate the plurality of work machines 20 in sequence.
- the work type corresponding to the work machine 20 designated as a remote control target is displayed on the display device 52 . This allows the administrator of the remote site 13 to select the type of work.
- a work type corresponding to each of the plurality of work machines 20 is stored in the work type storage unit 62 in advance. Therefore, the output unit 45 can display the work type on the display device 52 based on the machine selection data and the data stored in the work type storage unit 62 .
- a work target corresponding to the type of work to be performed by the work machine 20 designated as a remote control target is displayed on the display device 52 . This allows the administrator at the remote location 13 to select a work target.
- Work targets corresponding to each of a plurality of work types are stored in the work target storage unit 63 in advance. Therefore, the output unit 45 can display the work target on the display device 52 based on the type selection data and the data stored in the work target storage unit 63 .
- the remote operation permitting unit 47 determines whether or not to permit the start of remote operation based on the specified work type and the surrounding conditions of the work machine 20 . As a result, remote control is not started in situations where remote control is inappropriate.
- another work machine 20 displayed on the display device 52 can be specified, and the work type and work target of that work machine 20 can be specified. As a result, remote control of the plurality of work machines 20 can be performed efficiently.
- only one remotely controllable work machine 20 may be displayed on the display device 52 .
- if the work machine 20 displayed on the display device 52 is specified via the input device 51, the start of remote control of the work machine 20 is permitted. If the work machine 20 displayed on the display device 52 is not specified via the input device 51, the start of remote control of the work machine 20 is not permitted.
- the administrator at the remote site 13 can remotely operate the work machine 20 after being permitted to start remote operation of the work machine 20 .
- in the embodiment described above, the administrator at the remote site 13 operates the mouse as the input device 51 to designate the work machine 20 to be remotely controlled, select the work type, and designate the work target.
- in another embodiment, the manager at the remote location 13 may specify the work machine 20 to be remotely controlled, the work type, or the work target by voice.
- the output unit 45 causes the display device 52 to display the work type corresponding to the designated work machine 20, or causes the display device 52 to display the work target corresponding to the designated work type.
- the output unit 45 may cause a voice output device to output the work type corresponding to the designated work machine 20, or may cause the voice output device to output the work target corresponding to the designated work type.
- the work machine 20 may have a driver on board.
- the administrator at the remote site 13 can operate the input device 51 to specify the work type and work target.
- the type selection data and object selection data specified by the input device 51 are transmitted to the hydraulic excavator 21 via the communication system 10 .
- An output device for outputting the type selection data and the object selection data specified by the input device 51 is arranged in the driver's cab of the hydraulic excavator 21 .
- a monitor and a speaker are exemplified as an output device.
- a driver of the hydraulic excavator 21 may operate the hydraulic excavator 21 based on the type selection data and the target selection data designated by the input device 51 . For example, when "loading" is specified and the "first dump truck" is specified as the loading target, the driver of the hydraulic excavator 21 can operate the hydraulic excavator 21 to carry out the loading operation onto the "first dump truck".
- the recognition unit 43 recognizes the position of the work machine 20 based on the detection data of the detection device 9.
- work machine 20 may be provided with a position sensor that detects the position of work machine 20, and recognition unit 43 may recognize the position of work machine 20 based on the detection data of the position sensor.
- the recognition unit 43 recognizes the operation of the work machine 20 based on the detection data of the detection device 9.
- a motion sensor that detects the motion of work machine 20 may be provided in work machine 20, and recognition unit 43 may recognize the motion of work machine 20 based on detection data of the motion sensor.
- motion sensors include an angle sensor that detects the motion of the work machine 26 and a stroke sensor that detects the amount of expansion and contraction of the hydraulic cylinder 27 .
- the recognition unit 43 may recognize objects based on pattern matching, for example, without using artificial intelligence.
- the recognition unit 43 can recognize the object by matching the template representing the person WM with the image data of the construction site 2 .
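The template matching mentioned above can be sketched as a brute-force sum-of-absolute-differences search over the image; real implementations would use normalized correlation or a vision library, so this is only a stand-in:

```python
def match_template(image, template):
    """Locate a template (e.g. of the person WM) in construction-site
    image data by exhaustive sum-of-absolute-differences.

    image/template: 2D lists of grayscale values.
    Returns (row, col, score) of the best match; lower score is better.
    """
    H, W = len(image), len(image[0])
    h, w = len(template), len(template[0])
    best = None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            # accumulate the pixel-wise absolute difference at this offset
            score = sum(abs(image[r + i][c + j] - template[i][j])
                        for i in range(h) for j in range(w))
            if best is None or score < best[2]:
                best = (r, c, score)
    return best
```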
- the detection device 9 does not have to be mounted on the flying object 8.
- the detection device 9 may be mounted, for example, on the work machine 20 or may be attached to a structure present at the construction site 2 . The same applies to the detection device 12 as well.
- the work type storage unit 62 pre-stores work types corresponding to each of the plurality of work machines 20 . Further, the work target storage unit 63 pre-stores work targets corresponding to each of a plurality of work types. In another embodiment, the work machine 20 may transmit information indicating its work types and the work targets corresponding to each of those work types to the server 4 via the communication system 10, and the output unit 45 may display the work type and the work target on the display device 52 based on the received information.
- each of the current terrain storage unit 61, the work type storage unit 62, and the work target storage unit 63 may be configured by separate hardware.
- the function of the current terrain data creation unit 41, the function of the detection data acquisition unit 42, the function of the recognition unit 43, the function of the reflection unit 44, the function of the output unit 45, the function of the selection data acquisition unit 46, the function of the remote operation permission unit 47, the function of the remote operation command unit 48, the function of the current terrain storage unit 61, the function of the work type storage unit 62, and the function of the work target storage unit 63 may be provided in the management device 3, or may be provided in a server different from the server 4 .
- the management device 3 is supported by the traveling device 6 and can travel on the construction site 2.
- the management device 3 may be mounted on the work machine 20 or installed at a predetermined position on the construction site 2 .
- the detection device 12 detects the entire construction site 2, and the detection device 9 detects a part of the construction site 2.
- the detection device 9 may detect the entire construction site 2 .
- the detection targets of the detection device 9 are not limited to the topography of the construction site 2, the work machine 20, and the person WM. In other embodiments, the detection device 9 may detect construction materials.
- the information terminal 5 does not have to be located at the remote location 13 of the construction site 2.
- the information terminal 5 may be mounted on the work machine 20, for example. Also, the information terminal 5 may be omitted.
- the progress of construction may be output to a monitor of the work machine 20 .
- a monitor may include an input device as well as a display device.
- the output unit 45 causes the display device 52 to display the work machine 20 that can be remotely controlled.
- the display device 52 may display the remotely operable work machine 20 and the non-remotely operable work machine 20 .
- the work machine 20 is not limited to including the hydraulic excavator 21, the bulldozer 22, and the crawler dump 23.
- the work machine 20 may include only some of the hydraulic excavator 21 , the bulldozer 22 , and the crawler dump 23 , and may also include other types of work machines.
- in the embodiment described above, the position of the flying object 8 is detected using a global navigation satellite system (GNSS), and the attitude of the flying object 8 is detected using an inertial measurement unit.
- GNSS global navigation satellite system
- SLAM Simultaneous Localization and Mapping
- SLAM may be used to detect the position and attitude of the aircraft 8 .
- SLAM may be used to detect the position and orientation of the flying object 11 and work machine 20 .
- SYMBOLS: 1... Construction management system, 2... Construction site, 3... Management device, 3D... Three-dimensional model, 4... Server (data processing device), 5... Information terminal, 6... Traveling device, 7... Cable, 8... Flying object, 8D... Three-dimensional model, 9... Detection device, 10... Communication system, 11... Flying object, 12... Detection device, 13... Remote location, 14... Position sensor, 15... Attitude sensor, 16... Position sensor, 17... Attitude sensor, 20... Work machine, 21... Hydraulic excavator, 21D... Three-dimensional model, 22... Bulldozer, 23... Crawler dump, 23D... Three-dimensional model, 24... Traveling body, 25... Rotating body, 26... Work implement, 26A... Boom, 26B... Arm, 26C... Bucket, 27... Hydraulic cylinder, 27A...
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Human Resources & Organizations (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Structural Engineering (AREA)
- Economics (AREA)
- General Engineering & Computer Science (AREA)
- Civil Engineering (AREA)
- Strategic Management (AREA)
- Mining & Mineral Resources (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Tourism & Hospitality (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Entrepreneurship & Innovation (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Quality & Reliability (AREA)
- Development Economics (AREA)
- Educational Administration (AREA)
- Operations Research (AREA)
- Game Theory and Decision Science (AREA)
- Component Parts Of Construction Machinery (AREA)
- Traffic Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Conveying And Assembling Of Building Elements In Situ (AREA)
Abstract
Description
FIG. 1 is a schematic diagram showing a construction management system 1 according to an embodiment. The construction management system 1 manages construction at a construction site 2. A plurality of work machines 20 operate at the construction site 2. In the embodiment, the work machines 20 include a hydraulic excavator 21, a bulldozer 22, and a crawler dump 23. The work machines 20 may also include a wheel loader. A person WM is also present at the construction site 2. An example of the person WM is a worker working at the construction site 2. The person WM may instead be a supervisor who manages the construction, or a visitor.
FIG. 2 is a perspective view showing the hydraulic excavator 21 according to the embodiment. As shown in FIG. 2, the hydraulic excavator 21 includes a traveling body 24, a rotating body 25 supported by the traveling body 24, a work implement 26 supported by the rotating body 25, and hydraulic cylinders 27 that drive the work implement 26.
FIG. 4 is a functional block diagram showing the construction management system 1 according to the embodiment. As shown in FIG. 4, the construction management system 1 includes a flying object 8, a flying object 11, a management device 3 arranged at the construction site 2, a server 4, and an information terminal 5 arranged at a remote location 13 away from the construction site 2.
As described above, the position sensor 14 and the attitude sensor 15 are mounted on the flying object 8. The position sensor 14 can detect the position of the detection device 9, and the attitude sensor 15 can detect the attitude of the detection device 9. The attitude includes, for example, a roll angle, a pitch angle, and a yaw angle. The yaw angle may be calculated based on detection data from two GNSS sensors provided on the flying object 8. The detection device 9 detects three-dimensional data of the construction site 2. The three-dimensional data of the construction site 2 includes the relative distance and relative position between the detection device 9 and each of a plurality of detection points defined on the detection target. Based on the detection data of the position sensor 14, the detection data of the attitude sensor 15, and the detection data of the detection device 9, the recognition unit 43 and the reflection unit 44 can calculate the position of the three-dimensional data of the detection target in, for example, a global coordinate system. The recognition unit 43 and the reflection unit 44 can also perform a predetermined coordinate transformation to calculate the position of the three-dimensional data of the detection target in, for example, a local coordinate system defined for the construction site 2. The detection targets of the detection device 9 include the updated topography and objects.
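The coordinate calculation described above — combining the sensor's detected position and attitude (roll, pitch, yaw) to place sensor-relative detection points in a global frame — can be sketched as follows. This is purely an illustrative sketch, not part of the patent disclosure; the function names and the Z-Y-X Euler convention are assumptions for illustration.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Rotation matrix from roll, pitch, and yaw angles in radians (Z-Y-X convention assumed)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def to_global(points_sensor, sensor_position, roll, pitch, yaw):
    """Convert N x 3 points detected relative to the sensor into global-frame coordinates."""
    R = rotation_matrix(roll, pitch, yaw)
    return (R @ np.asarray(points_sensor, dtype=float).T).T + np.asarray(sensor_position, dtype=float)
```

A further transformation of the same form (a site-specific rotation and translation) would map these global coordinates into a local coordinate system defined for the construction site.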
FIG. 5 is a flowchart showing a construction management method according to the embodiment. The current terrain data creation unit 41 creates current terrain data of the construction site 2 where the work machines 20 operate (step S1).
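One common way to represent current terrain data like that created in step S1 is to rasterize detected 3D points into a per-cell height grid. The sketch below is illustrative only and is not taken from the patent; the function name, the dictionary-based grid, and the max-height aggregation are assumptions.

```python
import numpy as np

def build_height_grid(points, cell_size=1.0):
    """Rasterize an N x 3 point cloud (x, y, z) into a grid keyed by cell index,
    keeping the maximum detected height per cell."""
    pts = np.asarray(points, dtype=float)
    ix = np.floor(pts[:, 0] / cell_size).astype(int)
    iy = np.floor(pts[:, 1] / cell_size).astype(int)
    grid = {}
    for cx, cy, z in zip(ix, iy, pts[:, 2]):
        key = (int(cx), int(cy))
        grid[key] = max(grid.get(key, -np.inf), z)
    return grid
```

Updated terrain data from later detections could then be reflected in the current terrain simply by overwriting the affected cells.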
FIG. 12 is a block diagram showing a computer system 1000 according to the embodiment. The server 4 described above includes the computer system 1000. The computer system 1000 includes a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including nonvolatile memory such as ROM (Read Only Memory) and volatile memory such as RAM (Random Access Memory), a storage 1003, and an interface 1004 including input/output circuitry. The functions of the server 4 described above are stored in the storage 1003 as a computer program. The processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the processing described above in accordance with the program. The computer program may also be distributed to the computer system 1000 via a network.
As described above, according to the embodiment, remotely operable work machines 20 are displayed on the display device 52. An administrator at the remote location 13 can designate, via the input device 51, the work machine 20 to be remotely operated. When a plurality of work machines 20 are present at the construction site 2, the administrator at the remote location 13 can freely select the work machine 20 to be remotely operated according to the conditions at the construction site 2. When a work machine 20 to be remotely operated is designated, machine selection data indicating that the work machine 20 to be remotely operated has been designated is generated. When the machine selection data is generated, the start of remote operation of the designated work machine 20 is permitted. If no work machine 20 is designated, remote operation of the work machine 20 is not permitted. Since the administrator at the remote location 13 can freely select the work machine 20 to be remotely operated according to the conditions at the construction site 2, a decrease in construction efficiency at the construction site 2 is suppressed.
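The selection-then-permission flow described above — remote operation may start only for a machine for which machine selection data has been generated — can be sketched as a small gating class. This sketch is illustrative and not part of the patent disclosure; the class name, machine identifiers, and method names are assumptions.

```python
class RemoteOperationGate:
    """Permits the start of remote operation only for machines explicitly designated
    via machine selection data."""

    def __init__(self, operable_machines):
        self.operable = set(operable_machines)  # machines shown on the display device
        self.permitted = set()                  # machines whose remote operation may start

    def select(self, machine_id):
        """Acquire machine selection data for a displayed machine and permit
        the start of its remote operation."""
        if machine_id not in self.operable:
            raise ValueError(f"{machine_id} is not a remotely operable machine")
        self.permitted.add(machine_id)

    def may_start(self, machine_id):
        """Remote operation may start only after the machine has been designated."""
        return machine_id in self.permitted
```

Under this model, an undesignated machine is never remotely operated, and several machines (e.g. a first and a second work machine) can each be designated in turn according to conditions at the site.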
In the embodiment described above, the number of remotely operable work machines 20 displayed on the display device 52 may be one. When the work machine 20 displayed on the display device 52 is designated via the input device 51, the start of remote operation of that work machine 20 is permitted. When the work machine 20 displayed on the display device 52 is not designated via the input device 51, the start of remote operation of the work machine 20 is not permitted. After the start of remote operation of the work machine 20 has been permitted, the administrator at the remote location 13 can remotely operate the work machine 20.
Claims (20)
- An output unit that causes a display device to display a remotely operable work machine;
a selection data acquisition unit that acquires machine selection data indicating that the work machine has been designated; and
a remote operation permission unit that permits the start of remote operation of the work machine based on the machine selection data,
A construction management system comprising the above. - The work machine displayed on the display device includes a first work machine and a second work machine, and
the remote operation permission unit permits the start of remote operation of the first work machine when first machine selection data indicating that the first work machine has been designated is acquired, and permits the start of remote operation of the second work machine when second machine selection data indicating that the second work machine has been designated is acquired,
The construction management system according to claim 1. - The output unit causes an output device to output a work type corresponding to the designated work machine,
The construction management system according to claim 1 or claim 2. - A work type storage unit that stores a work type corresponding to the work machine, wherein
the output unit causes the work type to be output based on the machine selection data and the data stored in the work type storage unit,
The construction management system according to claim 3. - The selection data acquisition unit acquires type selection data indicating that a work type to be performed by the work machine has been designated, and
the output unit causes an output device to output a work target corresponding to the designated work type,
The construction management system according to claim 3 or claim 4. - A work target storage unit that stores a work target corresponding to the work type, wherein
the output unit causes the work target to be output based on the type selection data and the data stored in the work target storage unit,
The construction management system according to claim 5. - The remote operation permission unit determines whether to permit the start of the remote operation based on the designated work type and the conditions around the work machine,
The construction management system according to any one of claims 4 to 6. - A current terrain data creation unit that creates current terrain data of a construction site where the work machine operates;
a detection data acquisition unit that acquires detection data from a detection device that has detected the work machine and a construction area around the work machine; and
a reflection unit that generates reflected data in which the detection data is reflected in the current terrain data, wherein
the detection data includes updated terrain data of the construction area, and
the output unit causes the display device to display the reflected data in which the updated terrain data is reflected, together with the work machine,
The construction management system according to any one of claims 1 to 7. - A recognition unit that recognizes the work machine based on the detection data, wherein
the reflected data includes a three-dimensional model of the work machine recognized by the recognition unit, and
the output unit causes the display device to display the three-dimensional model,
The construction management system according to claim 8. - The recognition unit recognizes the motion of the work machine, and
the reflection unit moves the three-dimensional model in synchronization with the work machine,
The construction management system according to claim 9. - The detection device is mounted on a flying object,
The construction management system according to any one of claims 8 to 10. - After the start of remote operation of the first work machine is permitted and before the remote operation of the first work machine is completed, the second machine selection data is acquired, and the start of remote operation of the second work machine is permitted,
The construction management system according to claim 2. - A selection data acquisition unit that acquires machine selection data indicating that a remotely operable work machine displayed on a display device has been designated; and
an output unit that outputs a signal for displaying a work type of the work machine based on the machine selection data,
A construction management system comprising the above. - An output unit that causes a display device to display a remotely operable work machine;
a selection data acquisition unit that acquires machine selection data indicating that the work machine has been designated; and
a remote operation permission unit that permits the start of remote operation of the work machine based on the machine selection data,
A data processing device comprising the above. - Causing a display device to display a remotely operable work machine;
acquiring machine selection data indicating that the work machine has been designated; and
permitting the start of remote operation of the work machine based on the machine selection data,
A construction management method comprising the above. - The work machine displayed on the display device includes a first work machine and a second work machine, and the method includes
permitting the start of remote operation of the first work machine when first machine selection data indicating that the first work machine has been designated is acquired, and permitting the start of remote operation of the second work machine when second machine selection data indicating that the second work machine has been designated is acquired,
The construction management method according to claim 15. - Causing an output device to output a work type corresponding to the designated work machine,
The construction management method according to claim 15 or claim 16. - Storing a work type corresponding to the work machine, and
causing the work type to be output based on the machine selection data and the stored data of the work type,
The construction management method according to claim 17. - Acquiring type selection data indicating that a work type to be performed by the work machine has been designated, and
causing an output device to output a work target corresponding to the designated work type,
The construction management method according to claim 17 or claim 18. - Storing a work target corresponding to the work type, and
causing the work target to be output based on the type selection data and the stored data of the work target,
The construction management method according to claim 19.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/279,075 US20240127372A1 (en) | 2021-03-31 | 2022-02-22 | Construction management system, data processing device, and construction management method |
CN202280020883.5A CN117015790A (zh) | 2021-03-31 | 2022-02-22 | 施工管理系统、数据处理装置、以及施工管理方法 |
AU2022247649A AU2022247649A1 (en) | 2021-03-31 | 2022-02-22 | Construction management system, data processing device, and construction management method |
DE112022000845.8T DE112022000845T5 (de) | 2021-03-31 | 2022-02-22 | Bauverwaltungssystem, Datenverarbeitungsvorrichtung und Bauverwaltungsverfahren |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-061692 | 2021-03-31 | ||
JP2021061692A JP2022157459A (ja) | 2021-03-31 | 2021-03-31 | 施工管理システム、データ処理装置、及び施工管理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022209434A1 true WO2022209434A1 (ja) | 2022-10-06 |
Family
ID=83455950
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/007262 WO2022209434A1 (ja) | 2021-03-31 | 2022-02-22 | 施工管理システム、データ処理装置、及び施工管理方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240127372A1 (ja) |
JP (1) | JP2022157459A (ja) |
CN (1) | CN117015790A (ja) |
AU (1) | AU2022247649A1 (ja) |
DE (1) | DE112022000845T5 (ja) |
WO (1) | WO2022209434A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1088624A (ja) * | 1996-09-13 | 1998-04-07 | Taisei Corp | Gps無人施工管理システム |
JP2020180451A (ja) * | 2019-04-24 | 2020-11-05 | 株式会社小松製作所 | 作業機械を制御するためのシステムおよび方法 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6777375B2 (ja) | 2015-03-05 | 2020-10-28 | 株式会社小松製作所 | 作業機械の画像表示システム、作業機械の遠隔操作システム及び作業機械 |
-
2021
- 2021-03-31 JP JP2021061692A patent/JP2022157459A/ja active Pending
-
2022
- 2022-02-22 US US18/279,075 patent/US20240127372A1/en active Pending
- 2022-02-22 DE DE112022000845.8T patent/DE112022000845T5/de active Pending
- 2022-02-22 CN CN202280020883.5A patent/CN117015790A/zh active Pending
- 2022-02-22 AU AU2022247649A patent/AU2022247649A1/en active Pending
- 2022-02-22 WO PCT/JP2022/007262 patent/WO2022209434A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1088624A (ja) * | 1996-09-13 | 1998-04-07 | Taisei Corp | Gps無人施工管理システム |
JP2020180451A (ja) * | 2019-04-24 | 2020-11-05 | 株式会社小松製作所 | 作業機械を制御するためのシステムおよび方法 |
Also Published As
Publication number | Publication date |
---|---|
JP2022157459A (ja) | 2022-10-14 |
CN117015790A (zh) | 2023-11-07 |
US20240127372A1 (en) | 2024-04-18 |
DE112022000845T5 (de) | 2023-11-16 |
AU2022247649A1 (en) | 2023-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9322148B2 (en) | System and method for terrain mapping | |
AU2018333191B2 (en) | Display system, display method, and display apparatus | |
KR101695914B1 (ko) | 토공공사 수행시 굴삭기의 형상정보를 실시간 제공하는 굴삭기 3d 토공 bim 시스템 | |
CN110741124B (zh) | 用于工程机械的信息系统 | |
JPH11211473A (ja) | 地形形状計測装置 | |
US20200149248A1 (en) | System and method for autonomous operation of heavy machinery | |
JP2020002718A (ja) | 表示制御装置、および表示制御方法 | |
JP7203616B2 (ja) | 作業機械 | |
JP7372029B2 (ja) | 表示制御装置、表示制御システムおよび表示制御方法 | |
US20210140147A1 (en) | A working machine provided with an image projection arrangement | |
US20240068202A1 (en) | Autonomous Control Of Operations Of Powered Earth-Moving Vehicles Using Data From On-Vehicle Perception Systems | |
WO2022209434A1 (ja) | 施工管理システム、データ処理装置、及び施工管理方法 | |
WO2022209437A1 (ja) | 施工管理システム、データ処理装置、及び施工管理方法 | |
JP7449314B2 (ja) | ショベル、遠隔操作支援装置 | |
AU2020320149B2 (en) | Display system, remote operation system, and display method | |
US20240135469A1 (en) | Construction management system, data processing device, and construction management method | |
JP7107792B2 (ja) | 建設機械 | |
WO2023106324A1 (ja) | 表示システム及び表示方法 | |
WO2023106323A1 (ja) | 施工管理システム及び施工管理方法 | |
WO2023136326A1 (ja) | 作業機械 | |
US20230407605A1 (en) | Design generation for earth-moving operations | |
JP2023086534A (ja) | 表示システム及び表示方法 | |
EA042868B1 (ru) | Мониторинг автономных транспортных средств |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22779659 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18279075 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022247649 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280020883.5 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2022247649 Country of ref document: AU Date of ref document: 20220222 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022000845 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22779659 Country of ref document: EP Kind code of ref document: A1 |