WO2022209437A1 - Construction management system, data processing device, and construction management method - Google Patents
Construction management system, data processing device, and construction management method
- Publication number
- WO2022209437A1 (PCT/JP2022/007307)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- detection
- construction
- construction site
- unit
- Prior art date
Classifications
- G06Q10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q50/08: Construction
- G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06V20/17: Terrestrial scenes taken from planes or by drones
- G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V2201/08: Detecting or categorising vehicles
Definitions
- The present disclosure relates to a construction management system, a data processing device, and a construction management method.
- Construction management systems such as the one disclosed in Patent Document 1 are known.
- An object of the present disclosure is to make it possible to confirm the progress of construction.
- According to an aspect of the present disclosure, a construction management system is provided that includes: a current terrain data creation unit that creates current terrain data of a construction site where a work machine operates; a detection data acquisition unit that acquires detection data of a detection device that detects the construction site; a reflection unit that generates reflection data in which the detection data is reflected in the current terrain data; and an output unit that outputs the reflection data to a display device.
- FIG. 1 is a schematic diagram showing a construction management system according to an embodiment.
- FIG. 2 is a perspective view showing the hydraulic excavator according to the embodiment.
- FIG. 3 is a perspective view showing the crawler dump according to the embodiment.
- FIG. 4 is a functional block diagram showing the construction management system according to the embodiment.
- FIG. 5 is a flow chart showing a construction management method according to the embodiment.
- FIG. 6 is a diagram showing current terrain data according to the embodiment.
- FIG. 7 is a diagram showing reflection data according to the embodiment.
- FIG. 8 is a diagram showing reflection data according to the embodiment.
- FIG. 9 is a diagram showing reflection data according to the embodiment.
- FIG. 10 is a diagram showing reflection data according to the embodiment.
- FIG. 11 is a diagram showing reflection data according to the embodiment.
- FIG. 12 is a diagram showing reflection data according to the embodiment.
- FIG. 13 is a block diagram of a computer system according to the embodiment.
- FIG. 1 is a schematic diagram showing a construction management system 1 according to an embodiment.
- a construction management system 1 manages construction at a construction site 2 .
- a plurality of work machines 20 operate at the construction site 2 .
- The work machines 20 include the hydraulic excavator 21, the bulldozer 22, and the crawler dump 23.
- the work machine 20 may include a wheel loader.
- a person WM is present at the construction site 2 .
- a worker who works at the construction site 2 is exemplified as the person WM.
- the person WM may be a supervisor who manages construction.
- the person WM may be a spectator.
- the construction management system 1 includes a management device 3, a server 4, an information terminal 5, a detection device 9, and a detection device 12.
- the management device 3 includes a computer system located at the construction site 2.
- the management device 3 is supported by the travel device 6 .
- the management device 3 can travel on the construction site 2 by the travel device 6 .
- Examples of the traveling device 6 include an aerial work vehicle, a truck, and a traveling robot.
- the server 4 is a data processing device including a computer system.
- the server 4 may be located at the construction site 2 or may be located at a remote location from the construction site 2 .
- The information terminal 5 is a computer system placed at a remote location 13 away from the construction site 2.
- a personal computer or a smart phone is exemplified as the information terminal 5 .
- the management device 3 , the server 4 and the information terminal 5 communicate with each other via the communication system 10 .
- Examples of the communication system 10 include the Internet, a local area network (LAN), a mobile phone communication network, and a satellite communication network.
- the detection device 9 detects the construction site 2.
- the detection device 9 acquires three-dimensional data of the construction site 2 .
- Examples of objects to be detected by the detection device 9 include the topography of the construction site 2 and objects present on the construction site 2 .
- An object includes one or both of a movable body and a stationary body.
- the work machine 20 and the person WM are exemplified as movable bodies.
- Timber and other materials are exemplified as stationary bodies.
- the three-dimensional data acquired by the detection device 9 includes image data of the construction site 2.
- the image data acquired by the detection device 9 may be moving image data or still image data.
- a stereo camera is exemplified as the detection device 9 .
- The detection device 9 may instead include a monocular camera and a three-dimensional measurement device.
- The three-dimensional measurement device may be a laser sensor (LIDAR: Light Detection and Ranging) that detects an object by emitting laser light, an infrared sensor that detects an object by emitting infrared light, or a radar sensor (RADAR: Radio Detection and Ranging) that detects an object by emitting radio waves.
- the detection device 9 is mounted on the flying object 8.
- An unmanned aerial vehicle (UAV) is exemplified as the flying object 8.
- the detection device 9 detects the construction site 2 from above the construction site 2 .
- the flying object 8 and the management device 3 are connected by a cable 7.
- Detection data of the detection device 9 is transmitted to the management device 3 via the cable 7 .
- the detection data of the detection device 9 transmitted to the management device 3 is transmitted to the server 4 via the communication system 10 .
- The management device 3 includes a power supply or a generator.
- The management device 3 can supply power to the flying object 8 via the cable 7.
- the detection device 12 detects the construction site 2. Similar to the detection device 9 , the detection device 12 acquires three-dimensional data of the construction site 2 .
- the three-dimensional data acquired by the detection device 12 includes image data of the construction site 2 .
- the detection device 12 is mounted on the flying object 11.
- the detection device 12 detects the construction site 2 from above the construction site 2 .
- Detection data of the detection device 12 is transmitted to the server 4 via the communication system 10 .
- the flying object 11 can fly higher than the flying object 8.
- the flying object 11 can fly over a wider range than the flying object 8.
- the detection device 12 can detect a wider area of the construction site 2 than the detection device 9 can.
- the detection device 12 detects the entire construction site 2 .
- a detection device 9 detects a portion of the construction site 2 .
- FIG. 2 is a perspective view showing the hydraulic excavator 21 according to the embodiment.
- The hydraulic excavator 21 includes a traveling body 24, a revolving body 25 supported by the traveling body 24, a work implement 26 supported by the revolving body 25, and hydraulic cylinders 27 that drive the work implement 26.
- the running body 24 has a pair of crawler belts.
- the hydraulic excavator 21 can travel on the construction site 2 by the traveling body 24 .
- the revolving body 25 revolves while being supported by the traveling body 24 .
- Work implement 26 includes a boom 26A connected to revolving body 25, an arm 26B connected to boom 26A, and a bucket 26C connected to arm 26B.
- the hydraulic cylinders 27 include a boom cylinder 27A that operates the boom 26A, an arm cylinder 27B that operates the arm 26B, and a bucket cylinder 27C that operates the bucket 26C.
- The hydraulic excavator 21 performs various motions. Examples of the motions of the hydraulic excavator 21 include the traveling motion of the traveling body 24, the swing motion of the revolving body 25, the raising and lowering motions of the boom 26A, the excavating and dumping motions of the arm 26B, and the excavating and dumping motions of the bucket 26C.
- FIG. 3 is a perspective view showing the crawler dump 23 according to the embodiment.
- the crawler dump 23 has a traveling body 28 , a vehicle body 29 and a dump body 30 .
- the running body 28 has a pair of crawler belts.
- The crawler dump 23 can travel on the construction site 2 by means of the traveling body 28.
- the dump body 30 is a member on which cargo is loaded.
- the hydraulic excavator 21 can load a cargo onto the dump body 30 using the working machine 26 .
- the dump body 30 can be lifted by a hoist cylinder (not shown) to discharge the cargo.
- The crawler dump 23 performs various operations. Examples of the operations of the crawler dump 23 include the traveling operation of the traveling body 28 and the raising and lowering operations of the dump body 30.
- FIG. 4 is a functional block diagram showing the construction management system 1 according to the embodiment.
- As shown in FIG. 4, the construction management system 1 includes the flying object 8, the flying object 11, the management device 3 placed at the construction site 2, the server 4, and the information terminal 5 placed at the remote location 13 away from the construction site 2.
- the flying object 8 has a position sensor 14, an attitude sensor 15, and a detection device 9.
- The position sensor 14 detects the position of the flying object 8.
- The position sensor 14 detects the position of the flying object 8 using the Global Navigation Satellite System (GNSS).
- The position sensor 14 includes a GNSS receiver (GNSS sensor) and detects the position of the flying object 8 in a global coordinate system.
- The attitude sensor 15 detects the attitude of the flying object 8.
- An inertial measurement unit (IMU) is exemplified as the attitude sensor 15.
- the flying object 11 has a position sensor 16, an attitude sensor 17, and a detection device 12.
- The position sensor 16 includes a GNSS receiver and detects the position of the flying object 11 in the global coordinate system.
- The attitude sensor 17 detects the attitude of the flying object 11.
- An inertial measurement unit (IMU) is exemplified as the attitude sensor 17.
- the server 4 has a current terrain data creation unit 41 , a detection data acquisition unit 42 , a recognition unit 43 , a reflection unit 44 , an output unit 45 and a storage unit 46 .
- the current topography data creation unit 41 creates current topography data indicating the current topography of the construction site 2 where the work machine 20 operates.
- the current topography data is three-dimensional topography data representing the current topography of the construction site 2 .
- The current terrain includes the reference terrain before predetermined construction is started.
- the current topography data creation unit 41 creates current topography data based on the detection data of the detection device 12 . As described above, the detection device 12 detects the entire construction site 2 . The current topography data indicates the current topography of the construction site 2 as a whole.
- the detection device 12 detects the construction site 2 with a first frequency.
- the detection device 12 detects the construction site 2 only once, for example, before the start of work for the day.
- The detection device 12 detects the current terrain, which indicates the reference terrain before the predetermined construction starts at the construction site 2.
- the current topography data creation unit 41 creates current topography data with a first frequency.
- the current landform data created by the current landform data creation unit 41 is stored in the storage unit 46 .
- the current terrain data stored in the storage unit 46 is updated at a first frequency.
- the timing at which the detection device 12 detects the construction site 2 is not limited to before the start of work of the day, and may be any timing.
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 that has detected the work machine 20 and the construction area around the work machine 20 .
- the detection device 9 detects a portion of the construction site 2.
- the portion of the construction site 2 detected by the detection device 9 includes a work machine 20 that performs work.
- the portion of the construction site 2 detected by the detection device 9 includes the construction area around the work machine 20 that performs the work.
- An example of a construction area detected by the detection device 9 is a construction area in which construction is being performed by the work machine 20 .
- the detection device 9 detects the construction site 2 with a second frequency higher than the first frequency.
- the detection device 9 continuously detects the construction site 2, for example, for a certain period of time.
- the detection device 9 continuously detects the construction site 2, for example, only while the working machine 20 is working. Note that the detection device 9 may always detect the construction site 2 .
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 at a second frequency.
- the detection data acquired by the detection data acquisition unit 42 is updated more frequently than the current terrain data.
- the recognition unit 43 recognizes objects on the construction site 2 based on the detection data acquired by the detection data acquisition unit 42 . As described above, the work machine 20 and the person WM are exemplified as objects.
- The recognition unit 43 recognizes objects using artificial intelligence (AI), which analyzes input data with an algorithm and outputs output data.
- In the embodiment, the input data is the image data of the construction site 2 acquired by the detection device 9, and the output data is the recognized object.
- the recognition unit 43 has a learning model generated by learning the feature amount of the object.
- the learning model includes a learning model generated by learning the feature amount of work machine 20 and a learning model generated by learning the feature amount of person WM.
- machine learning is performed using a learning image including an object as teacher data, thereby generating a learning model that takes as input the feature amount of the object and outputs the object.
- the recognition unit 43 can recognize the object by inputting the image data of the construction site 2 acquired by the detection device 9 into the learning model.
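- For illustration only (the disclosure does not name a model architecture), the recognition step of the recognition unit 43 can be sketched with an off-the-shelf object detector; in the sketch below, torchvision's Faster R-CNN stands in for the learning model, and the class-index mapping is hypothetical:

```python
# Minimal sketch of the recognition step (recognition unit 43), assuming a
# torchvision detection model stands in for the patent's learning model.
import torch
import torchvision

# COCO-pretrained weights stand in for a model trained on construction-site
# imagery; the class-index mapping below is hypothetical.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

CLASS_NAMES = {1: "person", 2: "work_machine"}  # hypothetical mapping

def recognize_objects(image_tensor, score_threshold=0.5):
    """Detect objects in one site image (a CxHxW float tensor in [0, 1]) and
    return (label, score, box) tuples whose score clears the threshold."""
    with torch.no_grad():
        detections = model([image_tensor])[0]
    results = []
    for box, label, score in zip(detections["boxes"],
                                 detections["labels"],
                                 detections["scores"]):
        if float(score) >= score_threshold and int(label) in CLASS_NAMES:
            results.append((CLASS_NAMES[int(label)], float(score), box.tolist()))
    return results
```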
- the reflection unit 44 generates reflection data in which the detection data acquired by the detection data acquisition unit 42 is reflected in the current terrain data.
- the current topographical data is three-dimensional topographical data of the entire construction site 2 detected by the detecting device 12 at the first frequency.
- the detection data includes three-dimensional terrain data of a construction area that is part of the construction site 2 detected by the detection device 9 at the second frequency.
- the detection data acquired by the detection data acquisition unit 42 are sequentially updated at the second frequency.
- the reflection unit 44 sequentially reflects the detection data acquired by the detection data acquisition unit 42 on a part of the current terrain data at a second frequency. At least part of the current terrain data is sequentially updated with the detection data acquired by the detection data acquisition unit 42 .
- the detection data acquired by the detection data acquisition unit 42 includes updated topography data indicating the updated topography of the construction area.
- The updated terrain includes the latest terrain during construction or after construction.
- the updated terrain data is updated with a second frequency.
- the reflection unit 44 reflects the updated terrain data acquired by the detection data acquisition unit 42 on the current terrain data stored in the storage unit 46 .
- the detection data acquired by the detection data acquisition unit 42 includes non-terrain data indicating objects on the construction site 2 .
- the non-terrain data is three-dimensional data representing objects on the construction site 2 .
- Non-terrain data is updated at a second frequency.
- The reflection unit 44 reflects the non-terrain data acquired by the detection data acquisition unit 42 in the current terrain data stored in the storage unit 46.
- The reflection unit 44 reflects, in the current terrain data, the updated terrain data of the construction area obtained by removing the non-terrain data from the detection data acquired by the detection data acquisition unit 42.
- The non-terrain data representing an object is recognized by the recognition unit 43.
- The reflection unit 44 removes the non-terrain data from the detection data to generate the updated terrain data, as in the sketch below.
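- As a concrete illustration only, if the terrain is kept as a 2.5D heightmap grid, the removal-and-reflection step can be sketched as masking out grid cells covered by recognized objects and overwriting the remaining cells of the current terrain data with the newer patch; all array names here are hypothetical:

```python
import numpy as np

def reflect_patch(current_terrain, patch, patch_origin, object_mask):
    """Overwrite part of a site-wide heightmap with newer detection data.

    current_terrain : 2D height array for the whole site (first frequency)
    patch           : 2D height array for the detected construction area
                      (second, higher frequency)
    patch_origin    : (row, col) of the patch's top-left cell in the site grid
    object_mask     : bool array, True where the patch is covered by a
                      recognized object (non-terrain data) and must be ignored
    """
    r0, c0 = patch_origin
    rows, cols = patch.shape
    region = current_terrain[r0:r0 + rows, c0:c0 + cols]
    # Keep the old heights under objects; take the new heights elsewhere.
    current_terrain[r0:r0 + rows, c0:c0 + cols] = np.where(object_mask, region, patch)
    return current_terrain
```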
- the reflection unit 44 generates reflection data in which the object is reflected.
- the reflection data generated by the reflection unit 44 includes the updated terrain data of the construction area.
- the reflected data generated by the reflecting unit 44 includes the object recognized by the recognizing unit 43.
- the reflected data generated by the reflecting unit 44 includes the three-dimensional model of the working machine 20 recognized by the recognizing unit 43 .
- the three-dimensional model of work machine 20 includes computer graphics (CG) of work machine 20 .
- The three-dimensional model of the work machine 20 is a three-dimensional model representing the work machine 20, and is constructed for each part of the work machine 20, such as the traveling body 24, the revolving body 25, and the work implement 26.
- a three-dimensional model of the work machine 20 is pre-stored in the storage unit 46 .
- the reflected data generated by the reflecting unit 44 includes a symbol image indicating the position of the object.
- a symbol image is image data that emphasizes the position of an object.
- a reflecting unit 44 generates a symbol image based on the recognition result of the recognizing unit 43 .
- the output unit 45 outputs the reflection data generated by the reflection unit 44 to the information terminal 5 .
- the output unit 45 transmits the reflection data to the information terminal 5 via the communication system 10 .
- the information terminal 5 has an input device 51 and a display device 52 .
- the input device 51 is operated by an administrator at the remote location 13.
- the input device 51 generates input data based on the administrator's operation. Examples of the input device 51 include a touch panel, a computer keyboard, a mouse, or operation buttons.
- the input device 51 may be a non-contact input device including an optical sensor, or may be a voice input device.
- the display device 52 displays display data.
- An administrator at the remote location 13 can confirm the display data displayed on the display device 52 .
- the output section 45 outputs the reflection data generated by the reflection section 44 to the display device 52 .
- the display device 52 displays the reflection data generated by the reflection unit 44 as display data.
- the display device 52 is exemplified by a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OELD).
- the flying object 8 is equipped with the position sensor 14 and the attitude sensor 15 .
- a position sensor 14 can detect the position of the detection device 9 .
- the orientation sensor 15 can detect the orientation of the detection device 9 .
- Attitude includes, for example, roll angle, pitch angle, and yaw angle. The yaw angle may be calculated based on detection data from two GNSS sensors provided on the aircraft 8 .
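- For instance, with two GNSS antennas mounted a known baseline apart on the flying object 8, the heading can be derived from the horizontal offset between their fixes; a minimal sketch in local east/north coordinates (the antenna layout is an assumption):

```python
import math

def yaw_from_two_gnss(front_en, rear_en):
    """Heading in radians, clockwise from north, of the baseline running from
    the rear antenna to the front antenna; inputs are (east, north) pairs."""
    d_east = front_en[0] - rear_en[0]
    d_north = front_en[1] - rear_en[1]
    return math.atan2(d_east, d_north)
```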
- a detection device 9 detects three-dimensional data of the construction site 2 .
- the three-dimensional data of the construction site 2 includes relative distances and relative positions between the detection device 9 and each of a plurality of detection points defined as detection targets.
- The recognition unit 43 and the reflection unit 44 can calculate the position of the detected three-dimensional data in, for example, a global coordinate system based on the detection data of the position sensor 14, the detection data of the attitude sensor 15, and the detection data of the detection device 9. They can also perform a predetermined coordinate transformation to calculate the position of the detected three-dimensional data in, for example, a local coordinate system defined for the construction site 2, as sketched below.
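- A minimal sketch of that georeferencing step, assuming the detection points are expressed in the sensor frame, SciPy is available, and the local site transform is a simple origin shift plus planar rotation (a simplification not spelled out in the disclosure):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def sensor_points_to_site(points_sensor, sensor_pos_world, roll, pitch, yaw,
                          site_origin_world, site_yaw):
    """Transform Nx3 detection points from the sensor frame into a local
    site coordinate system."""
    # Sensor frame -> world (global) frame, using the attitude sensor angles.
    r_world = Rotation.from_euler("xyz", [roll, pitch, yaw])
    points_world = r_world.apply(points_sensor) + np.asarray(sensor_pos_world)
    # World frame -> site frame (planar rotation about the vertical axis).
    r_site = Rotation.from_euler("z", -site_yaw)
    return r_site.apply(points_world - np.asarray(site_origin_world))
```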
- the detection target of the detection device 9 includes updated landforms and objects.
- Similarly, the current terrain data creation unit 41 can calculate the position of the detected three-dimensional data in, for example, the local coordinate system defined for the construction site 2 based on the detection data of the position sensor 16, the detection data of the attitude sensor 17, and the detection data of the detection device 12.
- the detection target of the detection device 12 includes the current terrain.
- the recognition unit 43 can recognize the presence or absence of an object and the position of the object based on the detection data acquired by the detection data acquisition unit 42 .
- When recognizing the position of the person WM, the recognition unit 43 recognizes the person WM based on a two-dimensional image acquired by the monocular camera of the detection device 9.
- the recognition unit 43 also acquires the three-dimensional topographical data of the construction site 2 .
- The recognition unit 43 can recognize the position of the person WM at the construction site 2 based on the person WM recognized in the two-dimensional image and the three-dimensional terrain data.
- For example, the three-dimensional position of the person WM recognized in the two-dimensional images acquired by the two monocular cameras constituting the stereo camera is calculated by image processing based on the principle of triangulation, and the position of the person WM at the construction site 2 can be recognized by associating that position with the three-dimensional terrain data of the construction site 2.
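- That triangulation can be sketched with OpenCV, assuming calibrated, rectified 3x4 projection matrices P1 and P2 for the two monocular cameras and a matched pixel pair for the person in both images (the calibration inputs are assumptions):

```python
import cv2
import numpy as np

def triangulate_person(P1, P2, px_left, px_right):
    """Return the person's 3D position, in the frame of the projection
    matrices, from one matched pixel pair of a calibrated stereo camera."""
    pts1 = np.asarray(px_left, dtype=float).reshape(2, 1)
    pts2 = np.asarray(px_right, dtype=float).reshape(2, 1)
    hom = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous point
    return (hom[:3] / hom[3]).ravel()
```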
- Alternatively, the recognition unit 43 may recognize the position of the person WM at the construction site 2 by calculating the three-dimensional position of the person WM recognized in the two-dimensional image using the detection values of a laser sensor or a radar sensor and associating it with the three-dimensional terrain data of the construction site 2.
- the recognition unit 43 may recognize the position of the person WM from the three-dimensional data of the construction site 2 .
- the recognition unit 43 may recognize the position of the person WM based on the detection data of the position sensor possessed by the person WM.
- For example, when the person WM carries a smartphone, the recognition unit 43 can recognize the position of the person WM based on the detection data of the smartphone's GNSS sensor.
- the position sensor possessed by the person WM may be a beacon.
- The position of the person WM at the construction site 2 may also be estimated by geometric calculation from the coordinates of the person WM in the two-dimensional image, the three-dimensional position and attitude of the detection device 9, and the three-dimensional terrain data, as sketched below.
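- A minimal sketch of that geometric calculation: cast the pixel's viewing ray from the camera pose and march along it until it meets the terrain surface. The intrinsics matrix K, the step size, and the terrain_height callback are assumptions:

```python
import numpy as np

def person_position_on_terrain(pixel_uv, K, cam_rotation, cam_position,
                               terrain_height, step=0.5, max_range=500.0):
    """Estimate a ground position by intersecting a pixel's viewing ray with
    the terrain; terrain_height(x, y) returns the surface height at (x, y),
    cam_rotation is a 3x3 camera-to-world matrix, K the 3x3 intrinsics."""
    u, v = pixel_uv
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_world = cam_rotation @ ray_cam
    ray_world /= np.linalg.norm(ray_world)
    t = 0.0
    while t < max_range:
        p = np.asarray(cam_position) + t * ray_world
        if p[2] <= terrain_height(p[0], p[1]):
            return p  # first sample at or below the surface
        t += step
    return None  # the ray left the site without hitting the terrain
```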
- the recognition unit 43 can recognize the operation of the working machine 20 based on the detection data acquired by the detection data acquisition unit 42 .
- the hydraulic excavator 21 can operate the traveling body 24 , the revolving body 25 , and the working machine 26 .
- the crawler dump 23 can operate the traveling body 28 and the dump body 30 .
- the reflecting unit 44 can move the three-dimensional model in synchronization with the work machine 20 .
- FIG. 5 is a flow chart showing a construction management method according to the embodiment.
- the current topography data creation unit 41 creates current topography data of the construction site 2 where the work machine 20 operates (step S1).
- FIG. 6 is a diagram showing current terrain data according to the embodiment.
- the current topography data is three-dimensional topography data representing the current topography of the entire construction site 2 .
- the current topography data creation unit 41 acquires the detection data of the detection device 12 .
- the current topography data creation unit 41 creates current topography data based on the detection data of the detection device 12 .
- the current topography data created by the current topography data creation unit 41 is stored in the storage unit 46 .
- the output unit 45 can transmit the current terrain data to the information terminal 5 .
- The display device 52 can display the current terrain as shown in FIG. 6.
- a detection device 9 mounted on the aircraft 8 detects the construction site 2 .
- the detection device 9 detects, for example, the work machine 20 that is performing work and the construction area around the work machine 20 .
- Detection data of the detection device 9 is transmitted to the management device 3 via the cable 7 .
- the management device 3 transmits detection data of the detection device 9 to the server 4 .
- the detection data acquisition unit 42 acquires the detection data of the detection device 9 that has detected the work machine 20 and the construction area around the work machine 20 (step S2).
- the recognition unit 43 determines whether an object exists at the construction site 2 based on the detection data acquired by the detection data acquisition unit 42 . In the embodiment, the recognition unit 43 determines whether or not an object exists in the construction area (step S3).
- When it is determined that no object exists in the construction area (step S3: No), the reflection unit 44 reflects the detection data acquired by the detection data acquisition unit 42 in the current terrain data stored in the storage unit 46 to generate reflection data.
- the detection data acquired by the detection data acquisition unit 42 includes updated terrain data of the construction area indicating a portion of the construction site 2 .
- the reflecting unit 44 reflects the updated topographical data in the current topographical data (step S4).
- the detection data including the excavation location is acquired by the detection data acquisition unit 42 as updated terrain data.
- the updated terrain data includes excavation locations excavated by the hydraulic excavator 21 .
- the reflecting unit 44 synthesizes a part of the current terrain data and the updated terrain data.
- the reflecting unit 44 applies the updated topographic data to a part of the current topographic data. As a result, reflection data is generated in which the updated landform data including the excavation location is reflected.
- If it is determined in step S3 that an object exists in the construction area (step S3: Yes), the reflection unit 44 removes the non-terrain data indicating the object recognized by the recognition unit 43 from the detection data and generates updated terrain data (step S5).
- After the updated terrain data is generated, the reflection unit 44 reflects the updated terrain data in the current terrain data to generate reflection data. When an object exists, the reflection unit 44 also generates reflection data in which the object is reflected (step S6).
- the reflection unit 44 may generate reflection data in which the object is not reflected even when the object exists.
- the output unit 45 outputs the reflection data generated in at least one of steps S4 and S6 to the information terminal 5.
- the display device 52 displays the reflected data transmitted from the output unit 45 (step S7).
- FIG. 7 is a diagram showing reflected data according to the embodiment.
- FIG. 7 shows reflected data when there is no object in the construction area.
- the reflected data includes image data of the construction area. For example, as construction progresses in the construction area, at least a portion of the topography of the construction site 2 changes, as shown in FIG. In the example shown in FIG. 7, the excavation work of the hydraulic excavator 21 creates an excavation location in the construction area.
- the detection data acquired by the detection data acquisition unit 42 includes updated terrain data of the construction area.
- the reflecting unit 44 reflects changes in the topography of the construction site 2 on the current topography in real time.
- the reflecting unit 44 reflects the updated topographical data on the current topographical data in real time.
- the display device 52 can display the reflection data reflecting the excavation location. By checking the reflected data displayed on the display device 52, the administrator of the remote location 13 can recognize the progress of construction at the construction site 2 in real time.
- FIG. 8 is a diagram showing reflected data according to the embodiment.
- FIG. 8 shows the reflection data when the hydraulic excavator 21, the crawler dump 23, and the person WM are present as objects in the construction area.
- the reflected data includes three-dimensional terrain data in which the updated terrain data is reflected in part of the current terrain data.
- the reflected data includes the three-dimensional model of work machine 20 recognized by recognition unit 43 .
- the three-dimensional model of the working machine 20 includes a three-dimensional model 21D of the hydraulic excavator 21 and a three-dimensional model 23D of the crawler dump 23 .
- The recognition unit 43 calculates the position (three-dimensional position) and the posture of the work machine 20 based on the detection data of the detection device 9 acquired by the detection data acquisition unit 42.
- the attitude of the work machine 20 includes the inclination of the revolving body 25 with respect to the horizontal plane and the revolving angle of the revolving body 25 with respect to the traveling body 24 .
- the attitude of work machine 20 includes the angle of work machine 26 .
- the angle of work implement 26 includes the angle of boom 26A, the angle of arm 26B, and the angle of bucket 26C.
- The detection data of the detection device 9 includes the images acquired by the stereo camera.
- The recognition unit 43 can calculate the three-dimensional position and posture of the work machine 20 based on the detection data of the detection device 9.
- The reflection unit 44 adjusts the three-dimensional model stored in the storage unit 46 so that the three-dimensional model is placed at the position calculated by the recognition unit 43 and takes the posture calculated by the recognition unit 43, and thereby generates the reflection data.
- As described above, the three-dimensional model of the work machine 20 is constructed for each part of the work machine 20, such as the traveling body 24, the revolving body 25, and the work implement 26.
- Based on the angle of the boom 26A, the angle of the arm 26B, the angle of the bucket 26C, and the swing angle of the revolving body 25, the reflection unit 44 adjusts the three-dimensional model by changing the angles of the corresponding parts.
- the output unit 45 outputs the reflection data including the three-dimensional model generated by the reflection unit 44 to the display device 52 .
- When the work machine 20 is provided with GNSS sensors, the recognition unit 43 may calculate the position of the work machine 20 based on the detection data of one GNSS sensor, and may calculate the inclination and swing angle of the revolving body 25 based on the detection data of two GNSS sensors. Further, when a stroke sensor is provided for each of the boom cylinder 27A, the arm cylinder 27B, and the bucket cylinder 27C, the recognition unit 43 may calculate the angle of the work implement 26 based on the detection data of the stroke sensors.
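- Once the boom, arm, and bucket angles are known, posing the parts of the three-dimensional model reduces to a planar forward-kinematics chain; a sketch with hypothetical link lengths (the disclosure does not give dimensions):

```python
import math

# Hypothetical link lengths in metres for a planar boom-arm-bucket chain.
BOOM_LEN, ARM_LEN, BUCKET_LEN = 5.7, 2.9, 1.4

def work_implement_joints(boom_angle, arm_angle, bucket_angle):
    """Return the boom-tip, arm-tip, and bucket-tip positions in the vertical
    plane of the revolving body; angles are absolute, in radians measured
    from horizontal."""
    bx = BOOM_LEN * math.cos(boom_angle)
    bz = BOOM_LEN * math.sin(boom_angle)
    ax = bx + ARM_LEN * math.cos(arm_angle)
    az = bz + ARM_LEN * math.sin(arm_angle)
    tx = ax + BUCKET_LEN * math.cos(bucket_angle)
    tz = az + BUCKET_LEN * math.sin(bucket_angle)
    return (bx, bz), (ax, az), (tx, tz)
```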
- FIG. 9 is a diagram showing reflected data according to the embodiment.
- the reflected data includes a symbolic image that indicates the position of the object.
- the reflected data includes a symbol image 31 indicating the position of the person WM.
- the reflection unit 44 generates the symbol image 31 based on the position of the person WM recognized by the recognition unit 43 .
- the reflection unit 44 generates reflection data so that the person WM and the symbol image 31 are superimposed and displayed.
- the symbol image 31 is frame-shaped (box-shaped) surrounding the person WM.
- the symbol image 31 may have any shape as long as it can emphasize the person WM.
- the symbol image 31 may be displayed adjacent to the person WM.
- FIG. 10 is a diagram showing reflected data according to the embodiment.
- the reflected data includes the symbol image 32 indicating the position of the hydraulic excavator 21 .
- the reflection unit 44 can generate the symbol image 32 based on the position of the excavator 21 recognized by the recognition unit 43 .
- the symbol image 32 is frame-shaped (box-shaped) surrounding the three-dimensional model 21D.
- the reflection unit 44 generates reflection data such that the three-dimensional model 21D of the hydraulic excavator 21 and the symbol image 32 are displayed in a superimposed manner.
- the symbol image 32 may have any shape as long as the hydraulic excavator 21 can be emphasized.
- FIG. 11 is a diagram showing reflected data according to the embodiment.
- the reflected data includes a three-dimensional model 21D that moves in synchronization with the excavator 21.
- the recognition unit 43 can recognize the motion of the work machine 20 .
- the reflection unit 44 can move the three-dimensional model 21D on the display device 52 based on the motion of the hydraulic excavator 21 recognized by the recognition unit 43 so as to synchronize with the motion of the hydraulic excavator 21 .
- FIG. 11 shows an example in which the boom of the three-dimensional model 21D is raised in synchronization with the boom 26A of the hydraulic excavator 21 being raised.
- the reflection unit 44 can move the three-dimensional model 23D on the display device 52 based on the motion of the crawler dump 23 recognized by the recognition unit 43 so as to synchronize with the motion of the crawler dump 23 .
- FIG. 12 is a diagram showing reflected data according to the embodiment.
- the reflected data includes a symbol image 31 indicating the position of the person WM, a symbol image 32 indicating the position of the excavator 21 and a symbol image 33 indicating the position of the crawler dump 23 .
- the reflection unit 44 generates reflection data such that the three-dimensional model 23D of the crawler dump 23 and the symbol image 33 are superimposed and displayed, for example.
- the reflection unit 44 determines whether or not construction of the construction area has been completed (step S8).
- If it is determined in step S8 that construction has not been completed (step S8: No), the processing from step S2 to step S7 is repeated. If it is determined in step S8 that construction has been completed (step S8: Yes), the construction management method according to the embodiment ends.
- FIG. 13 is a block diagram showing a computer system 1000 according to an embodiment.
- the server 4 described above includes a computer system 1000 .
- The computer system 1000 includes a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including non-volatile memory such as ROM (Read Only Memory) and volatile memory such as RAM (Random Access Memory), a storage 1003, and an interface 1004 including an input/output circuit.
- the functions of the server 4 described above are stored in the storage 1003 as computer programs.
- The processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the processing described above according to the program. The computer program may be distributed to the computer system 1000 via a network.
- According to the above-described embodiment, the computer program or the computer system 1000 can store the current terrain data of the construction site 2 where the work machine 20 operates, acquire the detection data of the detection device 9 that detects the construction site 2, generate reflection data in which the detection data is reflected in the current terrain data, and display the reflection data on the display device 52.
- current topography data representing the reference topography of the construction site 2 before construction is created.
- the work machine 20 and the construction area around the work machine 20 are detected by the detection device 9 .
- Reflected data in which the detected data is reflected in the current terrain data is displayed on the display device 52 .
- the administrator of the remote site 13 can check the progress of construction in the construction area in real time.
- a manager at the remote site 13 can remotely monitor the progress of construction.
- the detection data reflected in the current terrain data is updated more frequently than the current terrain data. Therefore, the administrator of the remote site 13 can check the latest status of the construction area.
- the detection data reflected in the current terrain data includes the updated terrain data of the construction area. This allows the administrator of the remote site 13 to check the latest topography of the construction area during or after construction.
- the detection data reflected in the current terrain data includes non-terrain data indicating objects on the construction site 2.
- the reflection unit 44 reflects the updated topography data obtained by removing the non-topography data from the detection data acquired by the detection data acquisition unit 42 in the current topography data.
- the administrator of the remote site 13 can confirm the latest topography of the construction area in which the effects of objects are suppressed.
- the detection data reflected in the current terrain data includes non-terrain data indicating objects on the construction site 2.
- the administrator of the remote site 13 can confirm not only the latest topography of the construction site, but also the status of objects on the construction site 2 in real time.
- the reflected data displayed on the display device 52 includes objects recognized by the recognition unit 43 .
- the display device 52 displays the reflected data in which the latest terrain and objects are distinguished. As a result, the administrator of the remote site 13 can appropriately confirm both the latest topography of the construction site and the status of objects on the construction site 2 in real time.
- the reflected data displayed on the display device 52 includes the three-dimensional model of the work machine 20.
- the administrator at the remote location 13 can properly check the status of the work machine 20 in real time.
- the recognition unit 43 recognizes the operation of the working machine 20 based on the detection data acquired by the detection data acquisition unit 42 .
- the reflection unit 44 can move the three-dimensional model in synchronization with the working machine 20 based on the recognition result of the recognition unit 43 .
- a manager at the remote location 13 can properly confirm the status of the work machine 20 in real time by confirming the three-dimensional model that moves in synchronization with the actual work machine 20 .
- the recognition unit 43 recognizes the person WM based on the detection data acquired by the detection data acquisition unit 42 . By checking the person WM displayed on the display device 52, the administrator of the remote location 13 can appropriately check the situation of the person WM in real time.
- the reflected data displayed on the display device 52 includes a symbol image indicating the position of the object.
- the administrator at the remote location 13 can properly confirm the position of the object by confirming the symbol image displayed on the display device 52 .
- the detection device 9 is mounted on the flying object 8. Thereby, the detection device 9 can collectively detect the work machine 20 and the construction area from above.
- the recognition unit 43 recognizes the position of the working machine 20 based on the detection data of the detection device 9 .
- work machine 20 may be provided with a position sensor that detects the position of work machine 20, and recognition unit 43 may recognize the position of work machine 20 based on the detection data of the position sensor.
- the recognition unit 43 recognizes the operation of the work machine 20 based on the detection data of the detection device 9.
- a motion sensor that detects the motion of work machine 20 may be provided in work machine 20, and recognition unit 43 may recognize the motion of work machine 20 based on detection data of the motion sensor.
- Examples of such motion sensors include an angle sensor that detects the motion of the work implement 26 and a stroke sensor that detects the amount of extension and contraction of the hydraulic cylinder 27.
- the recognition unit 43 may recognize objects based on pattern matching, for example, without using artificial intelligence.
- the recognition unit 43 can recognize the object by matching the template representing the person WM with the image data of the construction site 2 .
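- The pattern-matching alternative can be sketched with OpenCV's normalized cross-correlation; the match threshold is an assumption:

```python
import cv2

def find_person_by_template(site_image_gray, person_template_gray,
                            threshold=0.7):
    """Return the best-match bounding box (x, y, w, h) of the template in the
    site image, or None if the correlation stays below the threshold."""
    result = cv2.matchTemplate(site_image_gray, person_template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None
    h, w = person_template_gray.shape[:2]
    return (max_loc[0], max_loc[1], w, h)
```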
- the detection device 9 does not have to be mounted on the aircraft 8.
- the detection device 9 may be mounted, for example, on the work machine 20 or may be attached to a structure present at the construction site 2 . The same applies to the detection device 12 as well.
- each of the current terrain data creation unit 41, the detection data acquisition unit 42, the recognition unit 43, the reflection unit 44, the output unit 45, and the storage unit 46 may be configured by separate hardware.
- At least one of the functions of the current terrain data creation unit 41, the detection data acquisition unit 42, the recognition unit 43, the reflection unit 44, the output unit 45, and the storage unit 46 may be provided in the management device 3, or may be provided in a server different from the server 4.
- the management device 3 is supported by the traveling device 6 and can travel on the construction site 2.
- the management device 3 may be mounted on the work machine 20 or installed at a predetermined position on the construction site 2 .
- the detection device 12 detects the entire construction site 2, and the detection device 9 detects a part of the construction site 2.
- the detection device 9 may detect the entire construction site 2 .
- The detection targets of the detection device 9 are not limited to the topography of the construction site 2, the work machine 20, and the person WM. In other embodiments, the detection device 9 may detect construction materials.
- the information terminal 5 does not have to be located at the remote location 13 of the construction site 2.
- the information terminal 5 may be mounted on the work machine 20, for example. Also, the information terminal 5 may be omitted.
- For example, the progress of construction may be output from a monitor mounted on the work machine 20.
- Such a monitor may include an input device as well as a display device.
- the working machine 20 is not limited to including the hydraulic excavator 21, the bulldozer 22, and the crawler dump 23.
- The work machines 20 may include only some of the hydraulic excavator 21, the bulldozer 22, and the crawler dump 23, and may also include other types of work machines.
- In the embodiment described above, the position of the flying object 8 is detected using the Global Navigation Satellite System (GNSS), and the attitude of the flying object 8 is detected using the inertial measurement unit. In other embodiments, SLAM (Simultaneous Localization and Mapping) may be used to detect the position and attitude of the flying object 8. SLAM may likewise be used to detect the positions and attitudes of the flying object 11 and the work machine 20.
- REFERENCE SIGNS: 1... construction management system, 2... construction site, 3... management device, 4... server (data processing device), 5... information terminal, 6... traveling device, 7... cable, 8... flying object, 9... detection device, 10... communication system, 11... flying object, 12... detection device, 13... remote location, 14... position sensor, 15... attitude sensor, 16... position sensor, 17... attitude sensor, 20... work machine, 21... hydraulic excavator, 21D... three-dimensional model, 22... bulldozer, 23... crawler dump, 23D... three-dimensional model, 24... traveling body, 25... revolving body, 26... work implement, 26A... boom, 26B... arm, 26C... bucket, 27... hydraulic cylinder
Description
FIG. 1 is a schematic diagram showing a construction management system 1 according to an embodiment. The construction management system 1 manages construction at a construction site 2. A plurality of work machines 20 operate at the construction site 2. In the embodiment, the work machines 20 include a hydraulic excavator 21, a bulldozer 22, and a crawler dump 23. The work machines 20 may include a wheel loader. A person WM is present at the construction site 2. A worker who works at the construction site 2 is exemplified as the person WM. The person WM may be a supervisor who manages the construction, or may be a spectator.
FIG. 2 is a perspective view showing the hydraulic excavator 21 according to the embodiment. As shown in FIG. 2, the hydraulic excavator 21 includes a traveling body 24, a revolving body 25 supported by the traveling body 24, a work implement 26 supported by the revolving body 25, and hydraulic cylinders 27 that drive the work implement 26.
FIG. 4 is a functional block diagram showing the construction management system 1 according to the embodiment. As shown in FIG. 4, the construction management system 1 includes the flying object 8, the flying object 11, the management device 3 placed at the construction site 2, the server 4, and the information terminal 5 placed at the remote location 13 away from the construction site 2.
As described above, the flying object 8 is equipped with the position sensor 14 and the attitude sensor 15. The position sensor 14 can detect the position of the detection device 9. The attitude sensor 15 can detect the attitude of the detection device 9. The attitude includes, for example, a roll angle, a pitch angle, and a yaw angle. The yaw angle may be calculated based on the detection data of two GNSS sensors provided on the flying object 8. The detection device 9 detects three-dimensional data of the construction site 2. The three-dimensional data of the construction site 2 includes the relative distance and relative position between the detection device 9 and each of a plurality of detection points defined on the detection target. The recognition unit 43 and the reflection unit 44 can calculate the position of the detected three-dimensional data in, for example, the global coordinate system based on the detection data of the position sensor 14, the detection data of the attitude sensor 15, and the detection data of the detection device 9. The recognition unit 43 and the reflection unit 44 can also perform a predetermined coordinate transformation to calculate the position of the detected three-dimensional data in, for example, a local coordinate system defined for the construction site 2. The detection targets of the detection device 9 include the updated terrain and objects.
FIG. 5 is a flowchart showing the construction management method according to the embodiment. The current terrain data creation unit 41 creates the current terrain data of the construction site 2 where the work machine 20 operates (step S1).
FIG. 13 is a block diagram showing a computer system 1000 according to the embodiment. The server 4 described above includes the computer system 1000. The computer system 1000 includes a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including non-volatile memory such as ROM (Read Only Memory) and volatile memory such as RAM (Random Access Memory), a storage 1003, and an interface 1004 including an input/output circuit. The functions of the server 4 described above are stored in the storage 1003 as a computer program. The processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the processing described above according to the program. The computer program may be distributed to the computer system 1000 via a network.
As described above, according to the embodiment, the current terrain data indicating the reference terrain of the construction site 2 before construction is created. During construction, the work machine 20 and the construction area around the work machine 20 are detected by the detection device 9. The reflection data in which the detection data is reflected in the current terrain data is displayed on the display device 52. By checking the display device 52, the administrator at the remote location 13 can confirm the progress of construction in the construction area in real time and can remotely monitor the progress of construction.
In the embodiment described above, the recognition unit 43 recognizes the position of the work machine 20 based on the detection data of the detection device 9. Alternatively, a position sensor that detects the position of the work machine 20 may be provided on the work machine 20, and the recognition unit 43 may recognize the position of the work machine 20 based on the detection data of that position sensor.
Claims (20)
1. A construction management system comprising: a current terrain data creation unit that creates current terrain data of a construction site where a work machine operates; a detection data acquisition unit that acquires detection data of a detection device that detects the construction site; a reflection unit that generates reflection data in which the detection data is reflected in the current terrain data; and an output unit that outputs the reflection data to a display device.
2. The construction management system according to claim 1, wherein the detection data is updated more frequently than the current terrain data.
3. The construction management system according to claim 1 or claim 2, wherein the detection data includes updated terrain data of the construction site.
4. The construction management system according to claim 1 or claim 2, wherein the detection data includes non-terrain data indicating an object at the construction site, and the reflection unit reflects, in the current terrain data, updated terrain data of the construction site obtained by removing the non-terrain data from the detection data.
5. The construction management system according to claim 1 or claim 2, wherein the detection data includes non-terrain data indicating an object at the construction site.
6. The construction management system according to claim 1 or claim 2, further comprising a recognition unit that recognizes an object at the construction site based on the detection data, wherein the reflection data includes the object recognized by the recognition unit.
7. The construction management system according to claim 6, wherein the object includes the work machine, and the reflection data includes a three-dimensional model of the work machine.
8. The construction management system according to claim 7, wherein the recognition unit recognizes a motion of the work machine, and the reflection unit moves the three-dimensional model in synchronization with the work machine.
9. The construction management system according to any one of claims 6 to 8, wherein the object includes a person.
10. The construction management system according to any one of claims 6 to 9, wherein the recognition unit recognizes a position of the object, and the reflection data includes a symbol image indicating the position of the object.
11. The construction management system according to any one of claims 1 to 10, wherein the detection device is mounted on a flying object.
12. The construction management system according to any one of claims 1 to 11, wherein the detection device detects the work machine and a construction area around the work machine.
13. A data processing device comprising: a current terrain data creation unit that creates current terrain data of a construction site where a work machine operates; a detection data acquisition unit that acquires detection data of a detection device that detects the construction site; a reflection unit that generates reflection data in which the detection data is reflected in the current terrain data; and an output unit that outputs the reflection data to a display device.
14. A construction management method comprising: storing current terrain data of a construction site where a work machine operates; acquiring detection data of a detection device that detects the construction site; generating reflection data in which the detection data is reflected in the current terrain data; and displaying the reflection data on a display device.
15. The construction management method according to claim 14, wherein the detection data is updated more frequently than the current terrain data.
16. The construction management method according to claim 14 or claim 15, wherein the detection data includes updated terrain data of the construction site.
17. The construction management method according to claim 14 or claim 15, wherein the detection data includes non-terrain data indicating an object at the construction site, and updated terrain data of the construction site obtained by removing the non-terrain data from the detection data is reflected in the current terrain data.
18. The construction management method according to claim 14 or claim 15, wherein the detection data includes non-terrain data indicating an object at the construction site.
19. The construction management method according to claim 14 or claim 15, further comprising recognizing an object at the construction site based on the detection data, wherein the reflection data includes the recognized object.
20. The construction management method according to claim 19, wherein the object includes the work machine, and the reflection data includes a three-dimensional model of the work machine.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/278,465 US20240233048A9 (en) | 2021-03-31 | 2022-02-22 | Construction management system, data processing device, and construction management method |
AU2022248060A AU2022248060A1 (en) | 2021-03-31 | 2022-02-22 | Construction management system, data processing device, and construction management method |
DE112022000560.2T DE112022000560T5 (de) | 2021-03-31 | 2022-02-22 | Bauverwaltungssystem, datenverarbeitungsvorrichtung und bauverwaltungsverfahren |
CN202280013211.1A CN116868223A (zh) | 2021-03-31 | 2022-02-22 | 施工管理系统、数据处理装置、以及施工管理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021061691A JP2022157458A (ja) | 2021-03-31 | 2021-03-31 | Construction management system, data processing device, and construction management method |
JP2021-061691 | 2021-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022209437A1 (ja) | 2022-10-06 |
Family
ID=83455898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/007307 WO2022209437A1 (ja) | 2021-03-31 | 2022-02-22 | 施工管理システム、データ処理装置、及び施工管理方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240233048A9 (ja) |
JP (1) | JP2022157458A (ja) |
CN (1) | CN116868223A (ja) |
AU (1) | AU2022248060A1 (ja) |
DE (1) | DE112022000560T5 (ja) |
WO (1) | WO2022209437A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003253661A (ja) * | 2002-02-28 | 2003-09-10 | Hazama Gumi Ltd | Precise construction support system and program for construction work, and precise construction method using them |
WO2017170651A1 (ja) * | 2016-03-31 | 2017-10-05 | 住友重機械工業株式会社 | Work management system for construction machinery, and construction machinery |
JP2018059268A (ja) * | 2016-09-30 | 2018-04-12 | 株式会社小松製作所 | Detection processing device of work machine, and detection processing method of work machine |
JP2021038571A (ja) * | 2019-09-03 | 2021-03-11 | 日立建機株式会社 | Site management system and work machine |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2019012993A1 (ja) | 2017-07-14 | 2019-11-07 | 株式会社小松製作所 | Operation information transmission device, construction management system, operation information transmission method, and program |
- 2021-03-31 JP JP2021061691A patent/JP2022157458A/ja active Pending
- 2022-02-22 WO PCT/JP2022/007307 patent/WO2022209437A1/ja active Application Filing
- 2022-02-22 DE DE112022000560.2T patent/DE112022000560T5/de active Pending
- 2022-02-22 CN CN202280013211.1A patent/CN116868223A/zh active Pending
- 2022-02-22 US US18/278,465 patent/US20240233048A9/en active Pending
- 2022-02-22 AU AU2022248060A patent/AU2022248060A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240233048A9 (en) | 2024-07-11 |
US20240135469A1 (en) | 2024-04-25 |
DE112022000560T5 (de) | 2023-11-23 |
AU2022248060A1 (en) | 2023-09-14 |
CN116868223A (zh) | 2023-10-10 |
JP2022157458A (ja) | 2022-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11230825B2 (en) | 2022-01-25 | Display system, display method, and display apparatus |
US9322148B2 (en) | 2016-04-26 | System and method for terrain mapping |
US9142063B2 (en) | 2015-09-22 | Positioning system utilizing enhanced perception-based localization |
US20210209799A1 (en) | 2021-07-08 | Position measurement system, work machine, and position measurement method |
JP6867132B2 (ja) | 2021-04-28 | Detection processing device of work machine, and detection processing method of work machine |
WO2022209434A1 (ja) | 2022-10-06 | Construction management system, data processing device, and construction management method |
JPH11211473A (ja) | 1999-08-06 | Topographic shape measuring device |
US12043989B2 (en) | 2024-07-23 | Working machine |
US20220316188A1 (en) | 2022-10-06 | Display system, remote operation system, and display method |
JP6887229B2 (ja) | 2021-06-16 | Construction management system |
WO2022209437A1 (ja) | 2022-10-06 | Construction management system, data processing device, and construction management method |
WO2023106324A1 (ja) | 2023-06-15 | Display system and display method |
JP7107792B2 (ja) | 2022-07-27 | Construction machine |
WO2023106323A1 (ja) | 2023-06-15 | Construction management system and construction management method |
WO2023106076A1 (ja) | 2023-06-15 | Display system and display method |
WO2024214297A1 (ja) | 2024-10-17 | Terrain measurement device, terrain measurement system, work machine, terrain measurement method, reference point setting and registration method, and program |
JP2023167539A (ja) | 2023-11-24 | Environment data generation device, environment data generation method, and program |
JP2022186143A (ja) | 2022-12-15 | Environment data generation device, construction machine, environment data generation method, and program |
KR100910203B1 (ko) | 2009-07-30 | Terrain image output device for an autonomous mobile platform, autonomous mobile platform having the same, and terrain image output method for an autonomous mobile platform |
JP2023006464A (ja) | 2023-01-18 | Information processing device, and control program for information processing device |
CA3226958A1 (en) | 2023-02-09 | Work management device, work management method, and work-in-progress estimation model |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22779662; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 202280013211.1; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 18278465; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2022248060; Country of ref document: AU |
WWE | Wipo information: entry into national phase | Ref document number: 112022000560; Country of ref document: DE |
ENP | Entry into the national phase | Ref document number: 2022248060; Country of ref document: AU; Date of ref document: 20220222; Kind code of ref document: A |
122 | Ep: pct application non-entry in european phase | Ref document number: 22779662; Country of ref document: EP; Kind code of ref document: A1 |