EP4405567B1 - Method and a control node for controlling a mining rig - Google Patents

Method and a control node for controlling a mining rig

Info

Publication number
EP4405567B1
EP4405567B1 (application EP22787018.5A)
Authority
EP
European Patent Office
Prior art keywords
mining
rig
environment
virtual
sensors
Prior art date
Legal status
Active
Application number
EP22787018.5A
Other languages
English (en)
French (fr)
Other versions
EP4405567A1 (de)
EP4405567C0 (de)
Inventor
Björn SYSE
Tony EHRNSTRÖM
Peter ÖBERG
Magnus KARLBERG
Current Assignee
Epiroc Rock Drills AB
Original Assignee
Epiroc Rock Drills AB
Priority date
Filing date
Publication date
Application filed by Epiroc Rock Drills AB
Publication of EP4405567A1
Application granted
Publication of EP4405567B1
Publication of EP4405567C0

Classifications

    • E FIXED CONSTRUCTIONS
    • E21 EARTH OR ROCK DRILLING; MINING
    • E21C MINING OR QUARRYING
    • E21C35/00 Details of, or accessories for, machines for slitting or completely freeing the mineral from the seam, not provided for in groups E21C25/00 - E21C33/00, E21C37/00 or E21C39/00
    • E21C35/282 Autonomous machines; Autonomous operations
    • E21C35/08 Guiding the machine
    • E21C35/24 Remote control specially adapted for machines for slitting or completely freeing the mineral
    • E21C35/302 Measuring, signaling or indicating specially adapted for machines for slitting or completely freeing the mineral

Definitions

  • Embodiments herein relate to a method and a control node for controlling a mining rig in a mining environment. Furthermore, a computer program and a carrier are also provided herein.
  • Mining rigs are used to perform operations in a mining environment.
  • the operations performed by the mining rigs may relate, for example, directly to drilling operations, or to supporting drilling operations in the mining environment.
  • the mining rigs operating in the mining environment may be of many various types used for different purposes, such as:
  • In order to control the mining rig, the mining rig typically has a cabin for an operator to sit in, which cabin comprises controls for the operator to use when operating the mining rig.
  • the mining rig may sometimes be controlled remotely.
  • the mining rig is equipped with cameras enabling the operator to sit at a remote operating station and see live video feeds from the cameras and remotely operate the mining rig using remote controls. This creates an operating environment similar to that of actually operating the mining rig from the cabin.
  • While controlling a mining rig remotely may be like controlling the mining rig as if sitting inside the cabin, differences and problems of how to perform mining operations remotely may still arise.
  • the operators may thus first be trained using a digital twin of the mining rig.
  • the digital twin is a simulation of how the mining rig would operate in the mining environment based on inputs from the operator. In this way, the operator can be trained to be able to control the mining rig remotely for most operations.
  • the operator may need to be present in the mining environment to be able to perform these certain complex mining operations.
  • One example of such a complex mining operation is performing a drill plan comprising a series of drilling operations at different locations, since the drill plan requires the operator to see whether it is possible to drill a hole at an exact location according to the drill plan or whether the location has to be changed slightly due to detailed characteristics of the rock surface. These slight details may be difficult to see when the operator is only provided with present technology. Furthermore, vision provided by cameras on mining rigs may in some cases be obstructed by various parts of the mining rig, and in these cases, the camera vision does not provide enough information for the operator to complete a certain operation remotely.
  • WO2020/200835 A1 discloses a method and device for providing a model-based real-time simulation of an operation of a mobile mining machine at a face of a material extraction site that enables operator support.
  • An object of embodiments herein is to improve remote control of mining rigs performing operations in a mining environment.
  • In particular, it is an object to improve operational safety and to reduce the need for physical inspections.
  • the object is achieved by a method for controlling a mining rig in a mining environment as defined in claim 1.
  • The object is further achieved by a control node configured to control a mining rig in a mining environment as defined in claim 12.
  • triggering the one or more operations may comprise triggering one or more real-time operations.
  • operational and safety improvements of remotely controlled mining operations are further increased. This is since triggering the one or more real-time operations enables controlling the mining rig to quickly avoid potential accidents and/or potential performance degrading events occurring in real time.
  • Fig. 2 is a schematic overview depicting a mining environment 100, a mining rig 20, a virtual mining model 20', and an operator 60.
  • the mining environment 100 may typically be part of an underground mine and may comprise a series of tunnels and spaces covered by rock surface 27.
  • the mining environment 100 may further comprise one or more infrastructure objects for supporting mining operations such as lighting elements, supporting beams, etc.
  • various mining operations may be performed, such as drilling and blasting holes, reinforcing tunnels, and extracting materials.
  • the mining rig 20 may be any mining rig suitable to perform any of the above-mentioned mining operations in the mining environment 100.
  • the mining rig 20 may be a drill rig, such as any of a face drill rig, a tunnelling rig, a rock reinforcement rig, a production drill rig, a scaling rig, or a shotcrete rig.
  • the mining rig 20 may be mobile and may be capable of driving, operating, and moving within the mining environment 100.
  • the mining rig 20 may comprise a cabin from which an operator may be able to control the mining rig 20.
  • the mining rig 20 may alternatively be cabin-free, and only be capable of being remotely controlled.
  • the mining rig 20 may comprise one or more booms 22, 23, for example, telescopic booms, controllable by the mining rig 20.
  • the one or more equipment may comprise any suitable mining equipment needed to perform mining operations in the mining environment 100, such as any one or more of the above-mentioned mining operations.
  • the one or more equipment may be, e.g., a feeder for feeding a rock drill, a rock drill for drilling in the mining environment 100, a bolting equipment for reinforcing rock strata by installing rock bolts or cables in the mining environment 100, a scaling hammer to hammer loose rock, or a shotcrete rig consolidating the rock surface by spraying layers of concrete onto the rock surface.
  • the equipment may be attached to any part of the mining rig such as attached to the one or more booms 22, 23.
  • the virtual mining model 20' is a real-time representation of the mining rig 20 in the mining environment 100.
  • the virtual mining model 20' may be a digital twin of the mining rig 20 in the mining environment 100.
  • the virtual mining model 20' comprises a combined three-dimensional real-time representation 21.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 comprises a coordinate system for establishing a positioning relationship of the mining rig 20 relative to the mining environment 100 in real time.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 is used to keep track of a current location of the mining rig 20, relative to the mining environment 100 in real time.
  • the combined three-dimensional real-time representation 21 enables a way to represent the current state and location of the mining rig 20 and the mining environment 100 in three dimensions in real time. In this way it may be possible to observe the mining rig 20 and the mining environment 100 in angles and/or perspectives not possible by present technology.
  • Using the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100, it is possible to safely, at a remote location, perform complex operations otherwise not possible to perform remotely due to poor or otherwise obstructed vision.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 is generated based on sensor data from a plurality of sensors.
  • the plurality of sensors comprises one or more first sensors 24a-c for sensing the mining environment, and one or more second sensors 25a-c for sensing the mining rig in the mining environment.
  • One or more sensors out of the plurality of sensors may be attached to the mining rig 20, or be part of an infrastructure in the mining environment 100.
  • a wide range of configurations for how to position the plurality of sensors are possible.
  • the sensors are placed such that the sensors, at least in combination, may be able to have a complete or near-complete view surrounding the mining rig 20, e.g. no, or few, parts should obstruct the combined view of the sensors.
  • any one or more sensors out of the plurality of sensors may respectively be attached to any one or more out of: a roof of the mining rig 20, the one or more booms 22, 23, a cabin of the mining rig 20 such as on top of the roof of the cabin, a rear position of the mining rig 20, a front position of the mining rig 20, the highest position of the mining rig 20, e.g. sensing at least partially downwards, and/or the lowest position of the mining rig, e.g. sensing at least partially upwards.
  • one or more sensors out of the plurality of sensors may be arranged at various locations in the mining environment 100 and/or attached to other vehicles operating in the mining environment 100.
  • These sensors may independently or in collaboration pre-scan an area of the mining environment 100 in advance of the rest of the sensors out of the plurality of sensors.
  • the mining rig 20 may in these scenarios obtain other sensor data to position the mining rig 20 in the pre-scanned area of the mining environment 100.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 may be subject to one or more real-time conditions.
  • the combined three-dimensional real-time representation 21 may not represent a state of the mining rig 20 and/or the mining environment 100 older than a certain period of time, e.g., at least some of the sensor data used for generating the virtual mining model 20' may not be older than 100 milliseconds (ms).
  • the real-time condition may for example pertain to sensor data used for monitoring real-time movements of the mining rig 20.
  • the operator 60 may remotely operate the mining rig 20 in the mining environment 100, observing the combined three-dimensional real-time representation 21 by, for example, using at least one display 61, an Augmented Reality (AR) or Virtual Reality (VR) headset or glasses 62, or any other suitable tool for AR or VR.
  • the operator 60 may, using the combined three-dimensional real-time representation 21, observe, from any perspective, how and where the mining rig 20 is positioned in the mining environment 100.
  • the operator 60 may further be able to observe operations performed by the mining rig 20 in the mining environment 100.
  • the operator 60 may be able to observe the mining rig 20 and/or the mining environment 100 from angles and locations which would have otherwise been difficult or impossible to observe, e.g., due to being obscured by parts of the mining rig 20 or by parts of, and/or infrastructure in, the mining environment 100.
  • the operator 60 may freely move around a visual representation of the combined three-dimensional real-time representation 21 and observe the mining rig 20 and/or the mining environment 100 from any suitable perspective, even from normally impossible perspectives such as a perspective from inside the rock foundation of the mining environment 100.
  • the operator 60 may then provide one or more control inputs for controlling the mining rig 20 e.g., in real time, and continue to observe the mining rig 20 and/or the mining environment from any suitable perspective while operating the mining rig 20 e.g., in real time.
  • Real time as used herein may indicate one or more real-time conditions to be fulfilled.
  • any real-time condition mentioned herein may relate to ensuring a quick enough update such that it is possible for the operator 60 to observe changes to the mining rig 20 and/or the mining environment 100 in the combined three-dimensional real-time representation 21, and by remote control, have time to control the mining rig 20 to avoid an unwanted situation, e.g. to avoid accidents.
  • Real-time conditions may depend on a speed of the mining rig 20, and/or computational speed relating to performing the embodiments herein.
  • Real-time conditions herein may relate to soft real-time conditions and/or hard real-time conditions.
  • the hard real-time conditions may relate to ensuring that an operation is performed within a hard time limit, e.g. within 80ms or 100ms, wherein it may further be verified that the hard time limit cannot be missed.
  • a missed hard time limit may trigger an emergency halt of operations in the mining environment 100.
  • the soft real-time conditions may relate to attempting to perform an operation within a soft time limit for ensuring a high performance. For soft real-time conditions, performance degradation may be associated with missing the soft time limit.
  • the one or more real-time conditions may relate to some specific sensor data, e.g., from one or more sensors out of the plurality of sensors or may relate to sensor data obtained from one or more sensors out of the plurality of sensors which senses specific parts of the mining rig 20 and/or the mining environment 100.
  • sensor data used for monitoring real-time movements of the mining rig 20 may not be older than a certain period of time, e.g., 100ms.
  • the one or more real-time conditions may relate to real-time operations to be performed by the mining rig 20 needing to be performed within a certain period of time, e.g. not more than 100 ms after being triggered.
  • the one or more real-time conditions in embodiments herein may be used to ensure safety of operating the mining rig 20 in the mining environment 100. For example, if any of the one or more real-time conditions is not fulfilled, a halt of the mining rig 20 may be triggered and/or an alert or an alarm may be triggered for indicating to the operator 60 that a real-time condition has not been fulfilled.
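The supervision described above can be sketched as follows. This is an illustrative sketch only, not taken from the patent: the names `Reading` and `supervise`, and the 100 ms default limit, are assumptions chosen to match the examples in the text.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    timestamp_ms: int   # capture time of the sensor sample
    value: object

def check_realtime_conditions(readings, now_ms, max_age_ms=100):
    """Return ids of sensors whose data violates the age condition."""
    return [r.sensor_id for r in readings if now_ms - r.timestamp_ms > max_age_ms]

def supervise(readings, now_ms):
    """Trigger a halt/alert when any real-time condition is not fulfilled."""
    stale = check_realtime_conditions(readings, now_ms)
    if stale:
        # A violated condition triggers a halt of the rig and/or an
        # alert to the operator, as described above.
        return ("HALT", stale)
    return ("OK", [])
```

A supervisor like this would run continuously, comparing each incoming sample's timestamp against the wall clock.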
  • Methods herein may be performed by a control node 40 or any other suitable device, e.g., the mining rig 20 or a suitable drill rig.
  • the control node 40 is part of the mining rig 20.
  • the control node 40 may be a remote node in communication with the mining rig 20, for example, in a control room remotely located from the mining rig 20.
  • Embodiments herein relate to combining sensor data from the plurality of sensors and applying the sensor data to a virtual mining model 20' of the mining rig 20 and the mining environment 100.
  • Using the virtual mining model 20', a three-dimensional VR or AR representation may be generated which may augment, to the operator 60, visual information needed to complete a complex mining operation in real time, such as an entire rock bolting or drilling operation, from a remote location.
  • the presented visual information further makes new functionalities possible to be performed remotely with greater efficiency, such as improved collision avoidance. This is since it is, due to the embodiments herein, possible, e.g., for the operator 60, to observe in three dimensions with depth-perception, the mining environment 100 and the mining rig 20 operating in the mining environment 100.
  • one or more operations may be triggered based on the combined three-dimensional real-time representation 21.
  • the mining rig 20 may further be taught how to perform a series of mining operations, for example a drill plan, and thus increase the efficiency of mining operations in the mining environment 100. These operations may be performed without any real-time conditions and hence may be planned in advance. In this way, it is possible to advantageously utilize the combined three-dimensional real-time representation 21 to in advance plan the one or more operations to be performed by the mining rig 20.
  • the one or more operations may first be performed virtually using the combined three-dimensional real-time representation 21, recorded, and then performed by the mining rig 20.
  • this may be performed by planning the series of mining operations by first inspecting the locations in the mining environment 100 of each of the mining operations to be performed in various angles in the coordinate system of the combined three-dimensional real-time representation 21. In this way, it is possible to determine how and where to perform each of the respective mining operations. The series of mining operations may then be triggered and performed automatically with minimal or no supervision and/or further control by the operator 60.
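The plan-then-execute idea above, where operations are first performed virtually, recorded, and then replayed on the physical rig, might be sketched like this. All class and method names (`OperationRecorder`, `rehearse`, `replay`) are illustrative assumptions, not from the patent.

```python
class OperationRecorder:
    def __init__(self):
        self.recorded = []

    def rehearse(self, op, virtual_model):
        """Simulate op on the virtual model; record it only if it succeeds."""
        if virtual_model.simulate(op):
            self.recorded.append(op)
            return True
        return False

    def replay(self, rig):
        """Execute the recorded series on the rig with no further input."""
        for op in self.recorded:
            rig.execute(op)

# Minimal stand-ins so the sketch is self-contained.
class FakeModel:
    def simulate(self, op):
        return op != "bad"

class FakeRig:
    def __init__(self):
        self.done = []
    def execute(self, op):
        self.done.append(op)
```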
  • Fig. 3 shows example embodiments of a method for controlling the mining rig 20 in the mining environment 100.
  • the method may be performed by the control node 40, or any suitable device such as the mining rig 20.
  • the method comprises the following actions, which actions may be taken in any suitable order. Optional actions are shown as dashed boxes in Fig. 3.
  • the method may comprise obtaining a three-dimensional representation of the mining rig 20.
  • the obtained three-dimensional representation of the mining rig 20 is a Computer-Aided Design (CAD) drawing of the mining rig 20.
  • the method comprises obtaining the sensor data from the plurality of sensors.
  • the plurality of sensors comprises the one or more first sensors 24a-c for sensing the mining environment 100 and the one or more second sensors 25a-c for sensing the mining rig 20 in the mining environment 100. Any one or more of the sensors may be attached to, or part of the mining rig 20. Alternatively, one or more of the sensors may be attached to the rock surface 27 or infrastructure of the mining environment 100.
  • the sensor data may be obtained from each respective sensor out of the plurality of sensors based on a respective sensing frequency, i.e., refresh rate, of each respective sensor. For example, some sensor data may be obtained at a low frequency, and some sensor data may be obtained at a high frequency, i.e., more often.
  • the sensor data of different sensors may be obtained in batches, i.e., where several types of sensor data are obtained at the same time.
  • the sensor data are pre-processed before being obtained.
  • the mining rig 20 may be associated with a computing unit, e.g. attached to, coupled with, or connected to the mining rig 20, which first pre-processes the sensor data.
  • Pre-processing the sensor data may involve filtering out anomalies of the sensor data, and/or preparing the sensor data for transmission, e.g. to reduce the required networking capacity, e.g. bandwidth.
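As a concrete illustration of such pre-processing, a simple median-based filter can drop anomalous readings before transmission. The threshold value and the function name `filter_anomalies` are assumptions for the sketch, not from the patent.

```python
from statistics import median

def filter_anomalies(samples, max_dev=5.0):
    """Keep only samples within max_dev of the median value.

    A single stuck or spiking sensor reading deviates strongly from
    the median of its neighbours and is removed before the data is
    transmitted, saving bandwidth and improving model quality.
    """
    m = median(samples)
    return [s for s in samples if abs(s - m) <= max_dev]
```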
  • Obtaining the sensor data may involve receiving the sensor data from the plurality of sensors.
  • the sensor data may be transmitted from the plurality of sensors, e.g. orchestrated by a communication unit which is connected to, coupled with, or attached to, the mining rig 20.
  • Obtaining the sensor data may thus further involve receiving the sensor data at the control node 40.
  • the plurality of sensors comprises at least one high resolution sensor and at least one low resolution sensor.
  • the at least one high resolution sensor captures high resolution sensor data of the mining environment 100.
  • the at least one low resolution sensor may capture low resolution data for establishing real-time position and/or status of the mining rig 20 and the mining environment 100.
  • the high resolution sensor data captured by the at least one high resolution sensor is of a higher resolution than the low resolution sensor data captured by the low resolution sensor.
  • High resolution sensor data may comprise more than a first quantity of sensor data points. Additionally or alternatively, high resolution sensor data may comprise sensor data with accuracy and/or precision higher than a second threshold.
  • Low resolution sensor data may comprise less than a third quantity of sensor data points.
  • low resolution sensor data may comprise sensor data with accuracy and/or precision lower than a fourth threshold.
  • the at least one low resolution sensor may capture sensor data at a higher rate than the at least one high resolution sensor.
  • the low resolution sensor data may be captured at e.g. 20 Hz, while the high resolution sensor data may be captured at a much lower frequency, e.g. once per day or every 15 minutes, depending on the technology used and the demand for high resolution sensor data.
  • the high resolution sensor data may only be captured rarely and provide a basis for detailed data of the mining rig 20 and/or the mining environment 100.
  • the low resolution data may instead be captured often, and while not as detailed as the high resolution sensor data, the low resolution sensor data may provide a basis for establishing real-time positions and status of the mining rig 20 and/or the mining environment 100.
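The dual-rate capture described above might be scheduled as follows. The sensor names and intervals are illustrative assumptions; only the 20 Hz / 15 minute figures come from the example in the text.

```python
def due_sensors(sensors, now_s):
    """Return the names of sensors whose sampling interval has elapsed."""
    due = []
    for s in sensors:
        if now_s - s["last_s"] >= s["interval_s"]:
            due.append(s["name"])
            s["last_s"] = now_s
    return due

sensors = [
    # Low resolution pose sensor sampled at 20 Hz for real-time tracking.
    {"name": "low_res_pose", "interval_s": 0.05, "last_s": 0.0},
    # High resolution scan taken rarely, e.g. every 15 minutes.
    {"name": "high_res_scan", "interval_s": 900.0, "last_s": 0.0},
]
```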
  • the at least one high resolution sensor may comprise any one or more out of:
  • the rotating laser scanner may be a Light Detection and Ranging (Lidar) sensor, for example, for scanning the mining environment 100.
  • the rotating laser scanner may scan the mining environment 100 and provide information of the features of the mining environment 100, for example, represented by a set of positions of scanned parts of the mining environment 100 also referred to as a point cloud.
  • Images captured by the plurality of two-dimensional cameras may be combined to form a high resolution three-dimensional representation of the mining rig 20 and the mining environment 100 by using photogrammetry.
  • the time-of-flight camera may be a camera which captures one or more images or video and may further estimate, for every pixel in the one or more images or video, a distance from the mining rig 20 to different positions in the mining environment 100.
  • the time-of-flight camera may provide two-dimensional and/or three-dimensional information for each pixel of the one or more images or video captured.
  • the time-of-flight camera is a low-cost camera typically suitable for dark areas and may work over long distances. Furthermore, the time-of-flight camera may provide distance information at high accuracy and/or precision.
  • the solid state Lidar may be a Lidar embedded in a micro-chip.
  • the solid-state Lidar may further scan the mining environment 100 without any moving parts.
  • the structured light three-dimensional scanner is a scanner which can scan a recognizable pattern. Based on the recognizable pattern, e.g. a striped pattern, the scanner may estimate the distance and three-dimensional space between the scanner and the recognizable pattern. This is since the scanner may know and/or may be able to derive how the pattern looks at different distances and/or angles.
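To illustrate how per-pixel distances from a time-of-flight camera become 3D geometry, the following sketch converts a depth image into points using a simple pinhole camera model. The focal length and principal point values are assumptions for the example, not taken from the patent.

```python
def depth_to_points(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0):
    """Convert a 2D grid of per-pixel distances (metres) to (x, y, z) points.

    depth[v][u] is the distance at pixel column u, row v; fx/fy are
    assumed focal lengths in pixels, (cx, cy) an assumed principal point.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:  # skip invalid (zero) pixels
                x = (u - cx) * z / fx
                y = (v - cy) * z / fy
                points.append((x, y, z))
    return points
```

The resulting point set is the kind of raw geometry that can feed into the combined three-dimensional real-time representation 21.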
  • the method further comprises generating the virtual mining model 20' based on the obtained sensor data.
  • the virtual mining model 20' comprises the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 comprises the coordinate system for establishing the positioning relationship of the mining rig 20 relative to the mining environment 100 in real time.
  • the virtual mining model 20' provides a digital twin which represents the mining rig 20 in relation to the mining environment 100. This enables observing the mining rig 20 and the mining environment 100 from any perspective in real time. Further, when controlling the mining rig 20, e.g.
  • generating the virtual mining model 20' comprises generating the coordinate system comprised in the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • Generating the coordinate system may comprise obtaining one or more relative distances between one or more of the plurality of sensors and a reference point of the mining rig 20.
  • generating the coordinate system may further comprise estimating one or more distances between the mining rig 20 and the mining environment 100.
  • the one or more distances between the mining rig 20 and the mining environment 100 may be one or more distances between the mining rig 20 and different parts of the rock surface 27.
  • the estimation of the one or more distances may be based on the obtained sensor data and the one or more relative distances between the one or more of the plurality of sensors and the reference point of the mining rig 20.
  • Generating the coordinate system may further comprise determining one or more positions associated with the mining rig 20 and the mining environment 100. The one or more positions may be determined relative to the reference point of the mining rig 20. Determining the one or more positions associated with the mining rig 20 may be based on the one or more distances between the mining rig 20 and the mining environment 100. To estimate the one or more distances between the mining rig 20 and the mining environment 100, the sensor data obtained by the plurality of sensors may be compared with the reference point of the mining rig 20.
  • the plurality of sensors may comprise many different sensors, and thus by using all of the sensors, it may be possible to estimate the one or more distances between the mining rig 20 and the mining environment 100 with high precision and/or accuracy. Determining one or more positions associated with the mining rig 20 and the mining environment 100 may further comprise identifying one or more positions of parts of any one or more out of the one or more booms 22, 23.
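The coordinate-system step above can be sketched minimally: each sensor has a known offset from the rig's reference point, so a measurement in the sensor's frame can be expressed in the rig-centred coordinate system. This sketch assumes translation only (sensor rotation is ignored for brevity), and all names are illustrative.

```python
def to_rig_frame(sensor_offset, measurement):
    """Express a sensor-frame measurement in the rig's coordinate system.

    sensor_offset: (x, y, z) of the sensor relative to the rig's
    reference point; measurement: (x, y, z) of a sensed point relative
    to the sensor. Rotation is deliberately omitted in this sketch.
    """
    return tuple(o + m for o, m in zip(sensor_offset, measurement))
```

In a full implementation each sensor would also carry an orientation, and the transform would include a rotation before the offset is added.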
  • the one or more static locations may further be used as a reference point for determining the position of the mining rig 20 in relation to the mining environment 100.
  • the number of sensors used, and/or the number of types of sensors used, may in some scenarios relate to the increase in accuracy and/or precision of the estimated one or more distances between the mining rig 20 and the mining environment 100. This is since, by using sensor data of many different sensors, it may be possible to verify first sensor data from a first sensor and/or to detect anomalies in the first sensor data by comparing the first sensor data with other sensor data of other sensors which sense the same or related objects as the first sensor.
  • sensor data obtained from a Lidar sensor may indicate the positions of one or more parts of the one or more booms 22, 23. In this scenario, the sensor data obtained from the Lidar sensor may be verified based on sensor data obtained from one or more sensors sensing the state, e.g. angles, of one or more joints in the one or more booms 22, 23.
  • the relative distances between the plurality of sensors and the reference point of the mining rig 20 may in some scenarios at least partially be derived based on sensor data from one or more sensors attached to the mining rig 20 with a known position on the mining rig 20, e.g. relative the reference point of the mining rig 20.
  • one or more angle sensors and/or one or more extension sensors may measure the extension and/or angles of the one or more booms 22, 23. In this way, it may be possible to determine distances and positions of at least parts of the one or more respective booms in relation to the mining rig 20 based on the sensor data of the one or more angle sensors and/or the one or more extension sensors.
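The cross-check between joint sensors and a scanner, as described above, might look like the following for a single planar boom. The geometry (one joint, one extension) and the names `boom_tip` and `verify_against_lidar` are illustrative assumptions.

```python
import math

def boom_tip(base_xy, angle_rad, extension_m):
    """Boom tip position in the rig's (x, y) plane from an angle sensor
    and an extension sensor (simple forward kinematics, one joint)."""
    return (base_xy[0] + extension_m * math.cos(angle_rad),
            base_xy[1] + extension_m * math.sin(angle_rad))

def verify_against_lidar(kinematic_xy, lidar_xy, tol_m=0.05):
    """Flag disagreement between two independent position estimates."""
    return math.dist(kinematic_xy, lidar_xy) <= tol_m
```

If the two estimates disagree beyond tolerance, the discrepancy can be treated as a sensor anomaly, as the text suggests.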
  • Generating the virtual mining model 20' may further comprise deriving information of any one or both of the mining rig 20 and the mining environment 100 based on a combination of sensor data obtained from at least two differing sensors in the plurality of sensors. This may also be referred to as sensor fusion.
  • the plurality of sensors may comprise a set of sensors which senses the current state of the mining rig 20, for example if a particular part of the mining rig 20 is extended or not and by which angle.
  • Another set of sensors of the plurality of sensors may be used for scanning the front-view of the mining rig 20, and yet another set of sensors of the plurality of sensors may be used for scanning the rear-view of the mining rig 20.
  • the combination of the sensor data of the above-mentioned sets of sensors may be used to derive information of the position and state of both the mining rig 20 and the mining environment 100. This information may further be used to generate the virtual mining model 20'.
  • the same part of the mining environment 100 may be scanned and/or sensed by several different sensors, e.g. of same or of differing sensor type.
  • generating the virtual mining model 20' may further comprise deriving the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100, based on a plurality of two-dimensional images captured by the plurality of two-dimensional cameras.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 may be generated based on photogrammetry of two-dimensional images of the mining rig 20 and the mining environment 100.
  • generating the virtual mining model 20' is further based on the obtained three-dimensional representation of the mining rig 20, e.g. as obtained in action 301. In this way, it is possible to more accurately represent the mining rig 20 in the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • generating the virtual mining model 20' is further based on a planned model of the mining environment 100.
  • the planned model may be obtained from a suitable server, database, or a memory.
  • the planned model may comprise information of how the mining environment 100 is planned to be after one or more mining operations, e.g. which resulting physical attributes and/or spatial arrangements the mining environment 100 will have after the one or more mining operations.
  • the planned model may comprise a bolt pattern to be installed in the rock surface 27 of the mining environment 100.
  • the planned model may comprise a tunnel to be drilled in the rock surface 27 of the mining environment 100.
  • the resulting virtual mining model 20' may comprise a combination of the sensed mining environment 100 based on the obtained sensor data, and a planned model of the mining environment 100 based on the obtained planned model.
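Combining the sensed environment with the planned model can be sketched as follows. This is illustrative only; `overlay_planned_bolts` and its data layout are assumptions, not the patent's interface. Each planned bolt of a bolt pattern is matched to the nearest sensed surface point, so the virtual mining model 20' can show the plan overlaid on the actual rock surface 27.

```python
import math

def overlay_planned_bolts(sensed_points, planned_bolts):
    """Overlay a planned bolt pattern on the sensed rock surface: match
    each planned bolt position to the nearest sensed surface point so the
    virtual model can show where on the actual surface each bolt lands."""
    overlay = []
    for bolt in planned_bolts:
        nearest = min(sensed_points, key=lambda p: math.dist(p, bolt))
        overlay.append({
            "planned": bolt,        # position from the planned model
            "on_surface": nearest,  # closest sensed surface point
            "deviation_m": round(math.dist(nearest, bolt), 3),
        })
    return overlay
```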
  • the method may comprise producing a visual representation of the virtual mining model 20'.
  • the visual representation may comprise a viewing position and a viewing orientation in the coordinate system which will further be exemplified in Fig. 4 .
  • the viewing position and viewing orientation in the coordinate system corresponds to a location and orientation in the mining environment 100.
  • the visual representation may be a snapshot of the current state of the mining rig 20 and the mining environment as seen from a location in the mining environment 100 which corresponds to the viewing position and viewing orientation in the coordinate system.
  • Producing the visual representation of the virtual mining model 20' may fulfil the one or more real-time conditions.
  • the one or more real-time conditions may further relate to that the visual representation is not representing a state of the mining rig 20 older than a certain time period.
  • the visual representation of the virtual mining model 20' is updated at a high enough frequency, e.g. 10 Hz, 60 Hz, or 120 Hz, to ensure safe remote operation of the mining rig 20 in relation to the operational speed of parts of the mining rig 20, the time for signaling and signal processing, the time for generating the virtual mining model 20', and the time to visualize the virtual mining model, such that the operator 60 is enabled and/or has time to trigger the mining rig 20 to perform time-critical operations.
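Such real-time conditions reduce to simple arithmetic checks. The sketch below is a minimal illustration; the 100 ms staleness budget and the function names are assumptions for the example, not values fixed by the text.

```python
def fulfils_real_time_condition(model_timestamp_s, now_s, max_age_s=0.1):
    """The visual representation must not represent a state of the
    mining rig older than a certain time period (assumed 100 ms here)."""
    return (now_s - model_timestamp_s) <= max_age_s

def update_frequency_ok(update_hz, max_age_s=0.1):
    """An update frequency is high enough when one refresh period
    fits within the allowed staleness budget."""
    return (1.0 / update_hz) <= max_age_s
```

For instance, at 60 Hz the refresh period is about 16.7 ms, well inside an assumed 100 ms budget, while 5 Hz (a 200 ms period) would violate it.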
  • the visual representation may further illustrate parts of the mining rig 20 and/or the mining environment as at least partially transparent. In this way, it may be possible for the operator to see through these parts of the mining rig 20 and/or the mining environment 100.
  • the operator 60 may observe the visual representation of the virtual mining model 20'.
  • the method may comprise obtaining one or more control inputs from the operator 60.
  • the one or more control inputs may be used to control the mining rig 20 and/or to control what to observe in the visual representation of the virtual mining model 20'.
  • the one or more control inputs may be obtained from the operator 60 by means of the operator 60 inputting the one or more control inputs by using a remote operator station comprising a joystick and a plurality of buttons.
  • the remote operator station may be any workstation suitable for the operator to use for inputting the one or more control inputs, e.g., a computer, a laptop, mobile phone, tablet, or similar.
  • the virtual representation of the virtual mining model 20' may be presented to the operator 60 on the at least one display 61.
  • the one or more control inputs may also be obtained from the operator 60 by means of the operator 60 inputting the one or more control inputs by using a VR and/or AR environment.
  • the virtual representation of the virtual mining model 20' may be presented to the operator 60 in the VR and/or AR environment, e.g. by using VR headset or glasses 62.
  • the operator may in some of these embodiments operate the mining rig by physical or virtual operator controllers or a combination of both.
  • a wide range of controllers are therefore applicable, for example VR gloves, dedicated VR controllers, game station controllers, or a tele-remote operating panel.
  • the operator 60 may point at a position in the mining environment 100 wherein a certain mining operation shall be performed.
  • the one or more obtained control inputs from the operator 60 may be associated with the operator 60 performing one or more virtual operations in the virtual mining model 20'.
  • the one or more virtual operations may correspond to the one or more operations to be performed by the mining rig 20 in the mining environment 100.
  • the operator 60 may perform the virtual operations and control the mining rig 20 as observed from the visual representation of the virtual mining model 20'. These virtual operations may then translate to one or more operations triggered to be performed by the mining rig 20 in the mining environment 100. This may be a seamless experience such that the operator 60 is in this way experiencing controlling the mining rig 20 in the mining environment 100 e.g., in real time.
  • the operator 60 may also use the virtual mining model 20' to plan and to perform a series of operations.
  • the operator 60 may then input one or more control inputs to trigger the mining rig 20 in the mining environment 100 to perform a series of mining operations, for example a drill plan.
  • the series of mining operations may then be performed automatically without further input from the operator 60.
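A planned series of operations of this kind could be dispatched as a simple queue. The sketch below uses hypothetical names, not the patent's interface: each planned hole of a drill plan is handed to a rig-side trigger function in order, with no further operator input per hole.

```python
def execute_drill_plan(drill_plan, trigger_operation):
    """Perform a planned series of mining operations automatically:
    each entry of the drill plan is triggered in order, without
    further operator input per hole."""
    results = []
    for hole in drill_plan:
        results.append(trigger_operation(hole))
    return results

# Example: a two-hole drill plan and a stand-in trigger function.
plan = [{"hole": 1, "x": 0.5, "y": 1.2}, {"hole": 2, "x": 0.9, "y": 1.2}]
log = execute_drill_plan(plan, lambda hole: ("drilled", hole["hole"]))
```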
  • Action 306 The method comprises triggering one or more operations to be performed by the mining rig 20 in the mining environment 100 based on the generated virtual mining model 20'. The one or more operations may then be performed based on vision not otherwise provided for by present technology. Triggering the one or more operations to be performed by the mining rig 20 may further be based on the one or more control inputs obtained from the operator 60, e.g., as in Action 305.
  • the one or more operations triggered to be performed by the mining rig 20 in the mining environment 100 may comprise one or more real-time operations.
  • the real-time operations may be triggered based on a real-time condition.
  • the one or more operations need to be performed, or started to be performed, within a set amount of time, e.g. 100ms.
  • when the one or more operations are real-time operations, it is possible to remotely control the mining rig 20 in a safer and/or more efficient manner, as it is possible to react, within a short amount of time, to potential dangers and/or other performance degrading events that may occur in the mining environment 100.
  • triggering the one or more operations to be performed by the mining rig 20 may comprise operating each of the one or more booms 22, 23 in the mining environment 100, e.g., in real time, respectively.
  • the mining rig 20 may, based on the one or more operations, be triggered to operate a first boom of the one or more booms 22, 23. This may be to activate a drill attached to the first boom. In this way, using the drill of the first boom, the mining rig 20 may then commence a drilling operation in the mining environment 100.
  • the method may comprise triggering an update of the visual representation of the virtual mining model 20' such that the updated visual representation comprises any one or both of an updated viewing position and an updated viewing orientation in the coordinate system.
  • the operator may use control inputs to observe the mining rig 20 and the mining environment in the virtual mining model 20' from any suitable perspective.
  • the operator 60 may move around freely in the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • Fig. 4 illustrates an example scenario for observing the virtual mining model 20' in different perspectives.
  • Fig. 4 illustrates the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 and the corresponding produced visual representation.
  • the visual representation is observed by the operator 60 on the at least one display 61, in a VR environment, or in an AR environment in the AR/VR headset or glasses 62.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 comprises two different viewing positions, a first viewing position 51, and a second viewing position 52 in the coordinate system.
  • the first viewing position 51 corresponds to a first location in the mining environment 100
  • the second viewing position 52 corresponds to a second location in the mining environment 100.
  • Both of the first and second viewing positions 51, 52 have respective viewing orientations in the coordinate system which correspond to respective viewing orientation in the mining environment 100.
  • the visual representation may be produced based on the first viewing position 51 and its respective orientation. This visual representation may then visualize the mining rig 20 and the mining environment 100 for the operator 60 in real-time, as if the operator 60 was observing the mining rig 20 and mining environment 100 from an initial location and orientation in the mining environment 100.
  • the initial location and orientation in the mining environment 100 may correspond to the first viewing position 51 and the viewing orientation of the first viewing position 51.
  • the visual information provided to the operator 60 may not comprise the necessary information for the operator 60 to be able to safely and/or efficiently perform the one or more operations. This may be since the visual representation may not show the operator 60 the mining rig 20 from the perspective necessary to determine how to perform the one or more operations.
  • the operator 60 may then input one or more control inputs for moving the first viewing position 51 to a second viewing position 52 in the coordinate system, e.g. as in action 307 above.
  • the visual representation may then be updated to visualize the mining rig 20 and the mining environment 100 for the operator 60 in real-time, as if the operator 60 was observing the mining rig 20 and mining environment 100 from an updated location and orientation in the mining environment 100.
  • the updated location and orientation may correspond to the second viewing position 52 and the viewing orientation of the second viewing position 52.
  • the operator 60 may, using the perspective of the updated visual representation, have visual information of high enough quality to perform the one or more operations.
  • the visual representation may change and be updated to be produced based on any number of different viewing positions in a seamless real-time manner, e.g. updating several times a second.
  • the produced visual representation may appear to the operator 60 as a video representation of the mining rig 20 and the mining environment 100, wherein the operator 60 may choose from which viewing position and which orientation the mining rig 20 and the mining environment should be observed from.
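Moving between viewing positions 51 and 52 amounts to re-expressing the model's points in a new view frame. A minimal 2D version of that transform is sketched below; the names and the planar simplification are assumptions for the example.

```python
import math

def to_view_frame(point, viewing_position, viewing_orientation_rad):
    """Express a point of the combined 3D real-time representation in
    the frame of a chosen viewing position and orientation (planar case)."""
    dx = point[0] - viewing_position[0]
    dy = point[1] - viewing_position[1]
    # Rotate by the negative viewing orientation so the view direction
    # becomes the local x axis.
    c = math.cos(-viewing_orientation_rad)
    s = math.sin(-viewing_orientation_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

Updating the visual representation for a new viewing position is then just re-running this transform with the updated position and orientation, which is cheap enough to repeat many times a second.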
  • the at least one high resolution sensor in some embodiments herein may be a high resolution scanner used to capture detailed data of the mining environment 100 surrounding the mining rig 20.
  • the high resolution scanner may generally be too slow and require too much computing power to capture live motion in real time and may therefore mostly be used for scanning static parts of the mining environment 100.
  • the high resolution scanner may be a rotating scanner which scans the environment 360 degrees around the mining rig 20.
  • the sensor data obtained by the high-resolution scanner may comprise information about the mining environment 100 which may be used to derive a precise and accurate positioning of the mining rig 20 and the one or more equipment which may be attached to the mining rig 20.
  • the high resolution scanner may be a laser scanner, but may also be any other type of suitable scanner, such as a Radio Detection and Ranging (Radar) sensor.
  • the at least one low resolution sensor in some embodiments herein may be a fast scanner used to capture real time movements, i.e. dynamics, of the mining rig 20 and equipment attached to the mining rig 20 such as the one or more booms 22, 23.
  • the fast scanner may scan the one or more booms 22, 23 when for example the operator positions and moves the respective one or more booms 22, 23.
  • the fast scanner will further be able to capture real time movements using less bandwidth than the high resolution scanner as it may capture the mining rig 20 and the mining environment 100 using low resolution.
  • the fast scanner may further detect if there are changes to the mining environment, e.g. if the surrounding rock from the high resolution scan has changed in any major way, e.g. more than a threshold.
  • a new high resolution scan may be needed and may in some embodiments be triggered to be performed at least at the area of change.
  • a camera may be used instead of, or in combination with the fast scanner to capture real time movements.
  • the fast scanner may be a laser scanner, but may also be any other type of suitable scanner, such as a Radar sensor.
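The fast-scan change check that may trigger a new high resolution scan can be sketched as follows. The per-region depth arrays and the 0.2 m threshold are illustrative assumptions, not values from the patent.

```python
def regions_needing_rescan(high_res_depths, fast_scan_depths, threshold_m=0.2):
    """Compare the latest fast (low resolution) scan against the stored
    high resolution scan and return the indices of regions whose measured
    depth changed by more than the threshold; those areas can then be
    re-scanned in high resolution."""
    return [i for i, (h, f) in enumerate(zip(high_res_depths, fast_scan_depths))
            if abs(h - f) > threshold_m]
```

An empty result means the stored high resolution scan is still valid and no rescan needs to be triggered.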
  • the one or more second sensors 25a-c of the plurality of sensors may be used to capture the mining rig 20 real-time movements in the mining environment 100.
  • these sensors may capture movements of the one or more equipment and/or the one or more booms 22, 23, which may be attached to the mining rig 20 in various embodiments herein.
  • the sensor data obtained from these sensors may be used to reposition the mining rig 20, and/or the parts of the mining rig 20 in the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • the one or more second sensors 25a-c out of the plurality of sensors may additionally or alternatively comprise any one or more out of the following sensors for sensing the mining rig 20 in the mining environment 100:
  • a position and movement of joints in each respective boom of the one or more booms 22, 23 of the mining rig 20 may be calculated and/or controlled.
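Computing joint positions from angle sensors is standard forward kinematics. A minimal planar sketch follows; the segment lengths and the 2D simplification are assumptions for the example.

```python
import math

def boom_joint_positions(joint_angles_rad, segment_lengths_m, base=(0.0, 0.0)):
    """Planar forward kinematics: from each joint's angle sensor and the
    (possibly extended) length of each boom segment, compute the position
    of every joint relative to the mining rig reference point."""
    positions = [base]
    x, y, heading = base[0], base[1], 0.0
    for angle, length in zip(joint_angles_rad, segment_lengths_m):
        heading += angle  # each joint angle is relative to the previous segment
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        positions.append((x, y))
    return positions
```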
  • the control node 40 is configured to control the mining rig 20 in the mining environment 100.
  • the method actions above may also be performed by the mining rig 20 or any suitable drill rig capable of performing the method actions above.
  • the control node may be located in a remotely located control room or at the mining rig 20.
  • the control node 40 may comprise an arrangement depicted in Figs. 5a and 5b .
  • the control node 40 may comprise an input and output interface 500 e.g. for communicating with the mining rig 20 and/or the plurality of sensors.
  • the input and output interface 500 may also visualise the visual representation to the operator 60, and further provide the means for the operator 60 to enter the one or more control inputs.
  • the input and output interface 500 may comprise a receiver (not shown) and a transmitter (not shown), both which may be wired or wireless.
  • the control node 40 may further be configured to, e.g. by means of an obtaining unit 501 in the control node 40, obtain a three-dimensional representation of the mining rig 20.
  • the obtained three-dimensional representation of the mining rig 20 is a CAD drawing of the mining rig 20.
  • the control node 40 may further be configured to, e.g. by means of the obtaining unit 501 in the control node 40, obtain sensor data from a plurality of sensors.
  • the plurality of sensors comprises one or more first sensors 24a-c for sensing the mining environment 100 and one or more second sensors 25a-c for sensing the mining rig in the mining environment 100.
  • the control node 40 may further be configured to, e.g. by means of a generating unit 502 in the control node 40, based on the obtained sensor data, generate a virtual mining model 20'.
  • the virtual mining model 20' comprises a combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100.
  • the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 comprises a coordinate system for establishing the positioning relationship of the mining rig 20 relative to the mining environment 100 in real time.
  • control node 40 may further be configured to, e.g. by means of the generating unit 502 in the control node 40, generate the virtual mining model 20' further based on the three-dimensional representation of the mining rig 20.
  • the control node 40 may further be configured to, e.g. by means of the generating unit 502 in the control node 40, generate the virtual mining model 20' by deriving information of any one or both of the mining rig 20 and the mining environment 100 based on a combination of sensor data obtained from at least two differing sensors in the plurality of sensors.
  • the control node 40 may further be configured to, e.g. by means of the generating unit 502 in the control node 40, generate the virtual mining model 20'.
  • generating the virtual mining model 20' comprises generating the coordinate system comprised in the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100 by:
  • the plurality of sensors comprises at least one high resolution sensor and at least one low resolution sensor, wherein the at least one high resolution sensor captures high resolution sensor data of the mining environment 100, and wherein the at least one low resolution sensor captures low resolution data for monitoring real-time movements of the mining rig 20.
  • the high resolution sensor data captured by the at least one high resolution sensor is of a higher resolution than the low resolution sensor data captured by the low resolution sensor, and wherein the at least one low resolution sensor captures sensor data at a higher rate than the at least one high resolution sensor.
  • the high resolution sensor data may be sensed at a higher frequency than low resolution sensor data.
  • the high resolution sensor data may comprise sensor data sensed by one or more sensors that are more accurate and/or precise than the one or more sensors sensing the low resolution sensor data.
  • the high resolution sensor data may comprise images and/or videos comprising more pixels than low resolution sensor data. Additionally or alternatively, the high resolution sensor data may comprise more accurate and/or precise distance measuring than the low resolution sensor data.
  • the at least one high resolution sensor comprises any one or more out of:
  • the at least one high resolution sensor comprises the plurality of two-dimensional cameras.
  • the control node 40 may further be configured to, e.g. by means of the generating unit 502 in the control node 40, generate the virtual mining model 20' further by deriving the combined three-dimensional real-time representation 21 of the mining rig 20 and the mining environment 100, based on a plurality of two-dimensional images captured by the plurality of two-dimensional cameras.
  • the control node 40 may further be configured to, e.g. by means of a producing unit 503 in the control node 40, produce a visual representation of the virtual mining model 20', wherein the visual representation comprises a viewing position 51, 52 and a viewing orientation in the coordinate system.
  • the viewing position 51, 52 and viewing orientation in the coordinate system may correspond to a location and orientation in the mining environment 100.
  • producing the visual representation of the virtual mining model 20' may fulfil a real-time condition.
  • the control node 40 may further be configured to, e.g. by means of a triggering unit 504 in the control node 40, trigger one or more operations to be performed by the mining rig 20 in the mining environment 100 based on the generated virtual mining model 20'.
  • the control node 40 may further be configured to, e.g. by means of the obtaining unit 501 in the control node 40, in response to an operator 60 observing the visual representation of the virtual mining model 20', obtain one or more control inputs from the operator 60.
  • the one or more control inputs is obtained from the operator 60 by the operator 60 inputting the one or more control inputs by using a remote operator station comprising a joystick and a plurality of buttons.
  • the virtual representation of the virtual mining model 20' is presented to the operator 60 on at least one display.
  • the one or more control inputs is obtained from the operator 60 by the operator 60 inputting the one or more control inputs by using a VR environment.
  • the virtual representation of the virtual mining model 20' is presented to the operator 60 in the VR environment.
  • the one or more control inputs is obtained from the operator 60 by the operator 60 inputting the one or more control inputs by using an AR environment.
  • the virtual representation of the virtual mining model 20' is presented to the operator 60 in the AR environment.
  • the one or more obtained control inputs from the operator 60 may further be associated with the operator 60 performing one or more virtual operations in the virtual mining model 20'.
  • the one or more virtual operations may correspond to the one or more operations to be performed by the mining rig 20 in the mining environment 100.
  • the control node 40 may further be configured to, e.g. by means of the triggering unit 504 in the control node 40, trigger the one or more operations to be performed by the mining rig 20 further based on the one or more control inputs obtained from the operator 60.
  • the mining rig 20 comprises a boom.
  • the control node 40 may further be configured to, e.g. by means of the triggering unit 504 in the control node 40, trigger the one or more operations to be performed by the mining rig 20 to operate the boom in the mining environment 100, e.g., in real time.
  • the control node 40 may further be configured to, e.g. by means of the triggering unit 504 in the control node 40, based on the one or more control inputs, trigger an update of the visual representation of the virtual mining model 20' such that the updated visual representation comprises any one or both of an updated viewing position and an updated viewing orientation in the coordinate system.
  • the embodiments herein may be implemented through a respective processor or one or more processors, such as the processor 560 of a processing circuitry in the control node 40 depicted in Fig. 5a , together with respective computer program code for performing the functions and actions of the embodiments herein.
  • the program code mentioned above may also be provided as a computer program product, for instance in the form of a data carrier carrying computer program code for performing the embodiments herein when being loaded into the control node 40.
  • One such carrier may be in the form of a CD ROM disc. It is however feasible with other data carriers such as a memory stick.
  • the computer program code may furthermore be provided as pure program code on a server and downloaded to the control node 40.
  • the control node 40 may further comprise a memory 570 comprising one or more memory units.
  • the memory 570 comprises instructions executable by the processor in the control node 40.
  • the memory 570 is arranged to be used to store e.g. information, indications, data, configurations, sensor data, three-dimensional representations, and applications to perform the methods herein when being executed in the control node 40.
  • a computer program 580 comprises instructions, which when executed by the respective at least one processor 560, cause the at least one processor of the control node 40 to perform the actions above.
  • a respective carrier 590 comprises the respective computer program 580, wherein the carrier 590 is one of an electronic signal, an optical signal, an electromagnetic signal, a magnetic signal, an electric signal, a radio signal, a microwave signal, or a computer-readable storage medium.
  • control node 40 may refer to a combination of analog and digital circuits, and/or one or more processors configured with software and/or firmware, e.g. stored in the control node 40, that, when executed by the respective one or more processors, such as the processors described above, perform the methods herein.
  • processors, as well as the other digital hardware, may be included in a single Application-Specific Integrated Circuitry (ASIC), or several processors and various digital hardware may be distributed among several separate components, whether individually packaged or assembled into a system-on-a-chip (SoC).


Claims (15)

  1. Method for controlling a mining rig (20) in a mining environment (100), the method comprising:
    - obtaining (302) sensor data from a plurality of sensors, wherein the plurality of sensors comprises one or more first sensors (24a-c) for sensing the mining environment (100) and one or more second sensors (25a-c) for sensing the mining rig in the mining environment (100),
    - based on the obtained sensor data, generating (303) a virtual mining model (20'), wherein the virtual mining model (20') comprises a combined three-dimensional real-time representation (21) of the mining rig (20) and the mining environment (100),
    wherein the combined three-dimensional real-time representation (21) of the mining rig (20) and the mining environment (100) comprises a coordinate system for establishing a positioning relationship of the mining rig (20) relative to the mining environment (100) in real time, and wherein generating (303) the virtual mining model (20') comprises producing (304) a visual representation of the virtual mining model (20'), wherein the visual representation comprises a viewing position (51, 52) and a viewing orientation in the coordinate system, wherein the viewing position (51, 52) and the viewing orientation in the coordinate system correspond to a location and an orientation in the mining environment (100), and wherein producing the visual representation of the virtual mining model (20') fulfils a real-time condition, and
    - triggering (306) one or more operations to be performed by the mining rig (20) in the mining environment (100) based on the generated virtual mining model (20'), wherein the triggering (306) of the one or more operations comprises triggering one or more real-time operations.
  2. The method according to claim 1, wherein the method further comprises:
    - in response to an operator (60) observing the visual representation of the virtual mining model (20'), obtaining (305) one or more control inputs from the operator (60), and
    wherein triggering (306) the one or more operations to be performed by the mining rig (20) is further based on the one or more control inputs obtained from the operator (60).
  3. The method according to claim 2, wherein the one or more control inputs are obtained from the operator (60) by means of any one of:
    - the operator (60) inputting the one or more control inputs by using a remote operator station comprising a joystick and a plurality of buttons, wherein the virtual representation of the virtual mining model (20') is presented to the operator (60) on at least one display (61),
    - the operator (60) inputting the one or more control inputs by using a Virtual Reality, VR, environment, wherein the virtual representation of the virtual mining model (20') is presented to the operator (60) in the VR environment, and
    - the operator (60) inputting the one or more control inputs by using an Augmented Reality, AR, environment, wherein the virtual representation of the virtual mining model (20') is presented to the operator (60) in the AR environment.
  4. The method according to any one of claims 2-3, wherein the method further comprises:
    - based on the one or more control inputs, triggering (307) an update of the visual representation of the virtual mining model (20') such that the updated visual representation comprises any one or both of an updated viewing position and an updated viewing orientation in the coordinate system.
  5. The method according to any one of claims 2-4, wherein the one or more control inputs obtained from the operator (60) are associated with the operator (60) performing one or more virtual operations in the virtual mining model (20'), and wherein the one or more virtual operations correspond to the one or more operations to be performed by the mining rig (20) in the mining environment (100).
  6. The method according to any one of claims 1-5, wherein generating (303) the virtual mining model (20') comprises generating the coordinate system comprised in the combined three-dimensional real-time representation (21) of the mining rig (20) and the mining environment (100) by:
    - obtaining one or more relative distances between one or more of the plurality of sensors and a reference point of the mining rig (20),
    - based on the obtained sensor data and based on the one or more relative distances between the one or more of the plurality of sensors and the reference point of the mining rig (20), estimating one or more distances of the mining rig (20) and the mining environment (100), and
    - based on the one or more distances of the mining rig and the mining environment (100), determining one or more positions associated with the mining rig (20) and the mining environment (100), wherein the one or more positions are determined relative to the reference point of the mining rig (20).
  7. The method according to any one of claims 1-6, wherein the plurality of sensors comprises at least one high resolution sensor and at least one low resolution sensor, wherein the at least one high resolution sensor captures high resolution sensor data of the mining environment (100), and wherein the at least one low resolution sensor captures low resolution data for monitoring real-time movements of the mining rig (20).
  8. The method according to claim 7, wherein the high resolution sensor data captured by the at least one high resolution sensor is of a higher resolution than the low resolution sensor data captured by the low resolution sensor, and wherein the at least one low resolution sensor captures sensor data at a higher rate than the at least one high resolution sensor.
  9. The method according to any of claims 1-8, wherein the method further comprises:
    - obtaining (301) a three-dimensional representation of the mining rig (20), and wherein generating (303) the virtual mining model (20') is further based on the three-dimensional representation of the mining rig (20).
  10. The method according to claim 9, wherein the obtained three-dimensional representation of the mining rig (20) is a Computer-Aided Design, CAD, drawing of the mining rig (20).
  11. The method according to any of claims 1-10, wherein generating (303) the virtual mining model (20') further comprises deriving information of one or both of the mining rig (20) and the mining environment (100) based on a combination of sensor data obtained from at least two different sensors in the plurality of sensors.
  12. A control node (40) configured to control a mining rig (20) in a mining environment (100), wherein the control node (40) is further configured to:
    - obtain sensor data from a plurality of sensors, wherein the plurality of sensors comprises one or more first sensors (24a-c) for sensing the mining environment (100) and one or more second sensors (25a-c) for sensing the mining rig in the mining environment (100),
    - based on the obtained sensor data, generate a virtual mining model (20'), wherein the virtual mining model (20') comprises a combined three-dimensional real-time representation (21) of the mining rig (20) and the mining environment (100),
    wherein the combined three-dimensional real-time representation (21) of the mining rig (20) and the mining environment (100) comprises a coordinate system for establishing a positional relation of the mining rig (20) relative to the mining environment (100) in real time,
    - generate (304) a visual representation of the virtual mining model (20'), wherein the visual representation comprises a viewing position (51, 52) and a viewing orientation in the coordinate system,
    wherein the viewing position (51, 52) and the viewing orientation in the coordinate system correspond to a location and an orientation in the mining environment (100), and wherein generating the visual representation of the virtual mining model (20') fulfils a real-time condition, and
    - trigger one or more operations to be performed by the mining rig (20) in the mining environment (100) based on the generated virtual mining model (20'), wherein triggering one or more operations comprises triggering one or more real-time operations.
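The control-node steps of claim 12 form one cycle: fuse sensor data into a virtual model with a shared coordinate frame, render a view for a chosen viewing position and orientation, and trigger operations from the model. A hypothetical Python outline (none of these names come from the patent, and the decision rule is a placeholder):

```python
# Hypothetical sketch of one control-node cycle per claim 12; every name and
# the trigger rule are illustrative, not taken from the patent.

def control_cycle(env_sensors, rig_sensors, view_pose):
    # 1. Obtain sensor data from environment-facing and rig-facing sensors.
    data = {"environment": [read() for read in env_sensors],
            "rig": [read() for read in rig_sensors]}
    # 2. Generate the virtual mining model: combined data in a shared,
    #    rig-relative coordinate frame.
    model = {"frame": "rig-relative", "data": data}
    # 3. Generate a visual representation for a viewing position/orientation.
    view = {"pose": view_pose, "model": model}
    # 4. Trigger operations based on the model (placeholder decision rule).
    operations = ["advance drill"] if data["rig"] else []
    return model, view, operations

model, view, ops = control_cycle(
    env_sensors=[lambda: "wall scan"],
    rig_sensors=[lambda: "boom angle"],
    view_pose=(0.0, 0.0, 0.0),
)
print(ops)  # ['advance drill']
```

In a real deployment each step would run under the real-time condition the claim states, i.e. the cycle must complete quickly enough that the rendered view and the triggered operations track the live state of the rig.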
  13. The control node (40) according to claim 12, wherein the control node (40) is further configured to perform the method according to claims 2-11.
  14. A computer program (580) comprising instructions which, when executed by a processor (560), cause the processor (560) to perform actions according to any of claims 1-11.
  15. A carrier (590) comprising the computer program (580) according to claim 14, wherein the carrier (590) is one of an electronic signal, an optical signal, an electromagnetic signal, a magnetic signal, an electrical signal, a radio signal, a microwave signal, or a computer-readable storage medium.
EP22787018.5A 2021-09-22 2022-09-22 Method and a control node for controlling a mining rig Active EP4405567B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2151155 2021-09-22
PCT/SE2022/050835 WO2023048625A1 (en) 2021-09-22 2022-09-22 Method and a control node for controlling a mining rig

Publications (3)

Publication Number Publication Date
EP4405567A1 (de) 2024-07-31
EP4405567B1 (de) 2025-12-10
EP4405567C0 (de) 2025-12-10

Family

ID=83689954

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22787018.5A Active EP4405567B1 (de) 2021-09-22 2022-09-22 Verfahren und ein kontrollknoten zum steuern einer bergbau-anlage

Country Status (6)

Country Link
US (1) US20240401479A1 (de)
EP (1) EP4405567B1 (de)
CN (1) CN117940650A (de)
AU (1) AU2022350996A1 (de)
CA (1) CA3227036A1 (de)
WO (1) WO2023048625A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117627528B (zh) * 2024-01-25 2024-05-03 中建五局第三建设有限公司 Construction device for an anchor-rod drilling rig for deep foundation pits and construction method thereof
CN121257228A (zh) * 2025-12-05 2026-01-02 深圳大学 Evaluation model for disturbance effects of fluidized mining of deep resources under multi-field coupling

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2016200784B1 (en) * 2015-05-28 2016-06-16 Commonwealth Scientific And Industrial Research Organisation System and method for controlling a mining machine
BE1027207B1 (de) * 2019-04-03 2020-11-23 Thyssenkrupp Ind Solutions Ag Method and device for the automatable operation of a material extraction plant at the mining face of a material extraction site
CN109989751B (zh) * 2019-05-06 2020-10-16 西安科技大学 Cross-platform remote real-time motion tracking method for the three machines of a fully mechanized mining face
CN112780275B (zh) * 2019-11-08 2025-05-16 三一重型装备有限公司 Roadheader working system and method
CN113128109B (zh) * 2021-04-08 2022-11-29 太原理工大学 Test and evaluation method for an intelligent fully mechanized mining robot production system

Also Published As

Publication number Publication date
CN117940650A (zh) 2024-04-26
WO2023048625A1 (en) 2023-03-30
CA3227036A1 (en) 2023-03-30
EP4405567A1 (de) 2024-07-31
US20240401479A1 (en) 2024-12-05
AU2022350996A1 (en) 2024-02-29
EP4405567C0 (de) 2025-12-10

Similar Documents

Publication Publication Date Title
KR101917937B1 (ko) Mining vehicle and method of initiating mining work task
US10544567B2 (en) Method and system for monitoring a rotatable implement of a machine
EP3094806B1 (de) Underground vehicle and method for initiating underground work tasks
CA3053100C (en) A display system and method for remote operation using acquired three-dimensional data of an object and viewpoint position data of a worker
EP4405567B1 (de) Method and a control node for controlling a mining rig
EP4050892B1 (de) Work assisting server, work assisting method, and work assisting system
KR101944816B1 (ko) Automated excavation work inspection system
US20220245856A1 (en) Position identification system for construction machinery
US12054915B2 (en) Work assisting server and work assisting system
EP3094807B1 (de) Minenkontrollsystem
CN118226796A (zh) Intelligent mine operation method based on a 5G network
JP7437930B2 (ja) Mobile body and imaging system
US11586225B2 (en) Mobile device, mobile body control system, mobile body control method, and program
JP2021025231A (ja) Excavation management device and method
CN119526374B (zh) Remote control method and device for a cantilever-type wellhead resetting robot
JP6368503B2 (ja) Obstacle monitoring system and program
JP7351478B2 (ja) Display system, remote operation system, and display method
CN115766817A (zh) Remote control system and method for a work machine, work machine, and electronic device
JP6454524B2 (ja) Area monitoring sensor
CN218780297U (zh) Drilling rig equipment
JP2020170293A (ja) Image display method and remote control system
CN117590779B (zh) Intelligent auxiliary operation method and device for construction machinery equipment
CN121547566A (zh) Video stitching method for a tunneling cutting face based on laser point clouds

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240109

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: E21C 35/24 20060101AFI20250715BHEP

INTG Intention to grant announced

Effective date: 20250808

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: F10

Free format text: ST27 STATUS EVENT CODE: U-0-0-F10-F00 (AS PROVIDED BY THE NATIONAL OFFICE)

Effective date: 20251210

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602022026674

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

U01 Request for unitary effect filed

Effective date: 20251210

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT RO SE SI

Effective date: 20251216