DK201800123A1 - Augmented Reality Maintenance System
- Publication number: DK201800123A1 (application DKPA201800123A)
- Authority: DK (Denmark)
Classifications
- G06T19/00 - Manipulating 3D models or images for computer graphics (G06T - Image data processing or generation, in general)
- G02B27/01 - Head-up displays (G02B27/00 - Optical systems or apparatus not provided for by groups G02B1/00 - G02B26/00, G02B30/00)
- G06F18/00 - Pattern recognition (G06F - Electric digital data processing)
Abstract
Disclosed herein are various aspects of an Augmented Reality maintenance system.
Description
The present invention relates to an Augmented Reality Maintenance System.
BACKGROUND
In many application domains, such as offshore, manufacturing and healthcare, inspection is an important part of daily operation, as are other types of maintenance and service.
For example, many industrial facilities require regular inspection and other types of maintenance: factories, off-shore installations such as drilling rigs and oil production platforms, vessels such as ships, airplanes, large machines, etc. Other types of work sites, such as healthcare facilities, may also require inspection or other types of maintenance tasks to be performed.
Generally, the term maintenance task refers to inspection tasks as well as tasks that require repair or other types of manipulation of an item, e.g. a partial disassembly of an item so as to add lubricants, replace parts, etc.
Maintenance tasks often follow an organized examination of the objects to be inspected or maintained in accordance with specified guidelines, which may be performed regularly or at specific intervals: for instance, a scheduled inspection of a drilling rig for maintenance purposes. The technician may also need to perform some checks and is expected to accurately record the observations on site. These inspection guidelines and related information are typically well specified by application engineers. However, the data is often not accessible where it is really needed, i.e. on the site of the inspection itself. For example, in a maintenance situation, a number of inspection checks are performed before the operation is re-commissioned, and the technician gets only limited time to complete the applicable task.
It is thus desirable that maintenance personnel have immediate access to reference materials, documentation, checklists, and the like. Often, repairs and other maintenance activities must be performed in small areas with limited space for storage of such materials.
In addition, many maintenance or repair tasks require the use of both hands, eliminating the capability of one person to simultaneously perform the task and carry all required materials. Generally, two personnel are needed to perform a maintenance task, so that one person may complete the task while a second person carries the required materials to do so.
Augmented-reality based maintenance systems have been proposed in order to limit the amount of material the user is required to carry, and to allow the user easy, preferably hands-free access to the object requiring maintenance.
In an Augmented-Reality (AR) system, a live view of a real-world environment is overlaid with generated content such as sound, text, graphics, etc. The live view may be viewed directly by a user or may be integrated with the generated content and presented to the user. This is in contrast with Virtual Reality (VR) systems, in which all visual sensory input and some or all audible sensory input is generated.
The AR environment may be viewed through conventional fixed displays viewed at a distance, portable displays, or semi-immersive to fully immersive wearable displays such as head-mounted displays, eyeglasses, contact lenses, and the like. An AR user experience may be enhanced by tracking the movement and orientation of the display device, thus allowing a shifting view of the real-world environment to be accompanied by AR content kept in correct position and orientation with respect to the real-world view.
In addition, an AR system may allow the user to interact with the generated AR content such as by manipulating generated elements, showing or hiding individual generated elements, and the like. An AR system also may allow the user to add generated elements such as drawings or text annotations to the AR environment.
Performing maintenance tasks of work sites poses additional challenges:
Work sites are often large and complex structures including many individual items that require maintenance. Many of the items are similar or even identical with each other but still require individual maintenance. Moreover the maintenance for the individual item often needs to be verifiable.
Work sites often change over time as items may be added, moved, altered, etc. Moreover, many items, e.g. machines, may move between different special configurations during normal operation.
Work sites are often constructed from steel or other materials that create a difficult environment for wireless data communication.
Work sites often involve a rough environment, as items may be exposed to varying temperatures, dirt, dust, humidity, chemicals, vibrations, etc.
Visibility in many work sites may be impaired by e.g. bad lighting conditions, steam, smoke, a generally cluttered environment, etc.
Work sites may include hazardous areas, e.g. due to the risk of dropped objects, chemicals, working at heights, slippery floors, confined spaces, etc.
A specific example of work sites where some or all of the above apply is offshore or onshore facilities for the exploration or exploitation of oil fields. For the purpose of the present description, these facilities will be referred to as oil rigs.
The various aspects of the present disclosure are directed to overcoming one or more limitations or other problems in the art or at least to provide a useful alternative.
SUMMARY
Generally, an Augmented Reality maintenance system comprises functionality for assisting a user, e.g. an inspector, by means of an Augmented Reality enabled user device. The augmented reality enabled user device is configured to provide an augmented reality user interface, e.g. using AR glasses or a camera enabled tablet computer. Depending on the use case, the user of the user device may be a maintenance technician, an engineer, a security guard, a doctor, a nurse, or the like.
Aspect 1 - use of additional sensor data
US 2013/0010068 discloses an augmented reality system which collects a first image of a three-dimensional environment with a camera; automatically identifies a situation; automatically determines an action to be performed from the identified situation; performs the determined action or guides a user to perform the action; collects a second image of the three-dimensional environment with the camera; and determines a response to the performed action. It may be difficult to reliably determine appropriate completion of the action, which may lead to erroneous determinations. This in turn may result in system failure or potentially hazardous situations.
According to one aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
display generated content overlaid a live view of a part of a physical environment, the part of the physical environment including a target item on which a maintenance task is to be performed, the generated content being indicative of the maintenance task;
receive, via a wireless or wired data communication interface, a sensor signal from a sensor external to the user device, the sensor signal being indicative of an operational condition of the target item;
determine, at least in part based on the received sensor signal, that the maintenance task has been completed;
responsive to said determination, automatically record completion of the maintenance task.
Hence, as the determination that the maintenance task is completed is based at least in part on external sensor signals, i.e. not only on images recorded by a camera of the maintenance system, a more reliable determination may be performed. Hence, the item to be maintained is not required to comprise any visual indicators that indicate the operation state. Moreover, the determination may also be made in poor lighting conditions or when visibility is otherwise impaired.
For example, when the maintenance task was initiated because a pressure sensor of the work site has detected a pressure inside a vessel, pipe or the like to be outside of a predetermined target range, the completion of the maintenance task may be determined based on the pressure sensor again detecting a pressure within the target range.
In some embodiments the determination that the maintenance task has been completed is based on a combination of the sensor signal and additional information. For example, the additional information may include the user providing an input, e.g. by interacting with the displayed generated content, such as by checking a check box or otherwise indicating the completion of the task. Alternatively or additionally the additional information may include position information of the AR enabled user device, e.g. by detecting that the AR enabled user device has been in a predetermined proximity of the target item on which the maintenance task had to be performed. Yet further alternatively or additionally, the additional information may include one or more images captured by the AR enabled device while the user performs the maintenance task, e.g. as described below.
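Purely as an illustration of such a combined determination, the following Python sketch fuses an external pressure reading with device proximity and a user acknowledgement; all names, thresholds and the pressure range are assumptions and not part of the present disclosure:

    from dataclasses import dataclass

    @dataclass
    class CompletionEvidence:
        pressure_bar: float          # latest reading from the external sensor
        distance_to_item_m: float    # distance of the AR device from the target item
        user_acknowledged: bool      # e.g. a checked check box in the AR overlay

    TARGET_RANGE = (2.0, 3.5)        # assumed acceptable pressure range in bar
    PROXIMITY_M = 2.0                # assumed required proximity to the target item

    def task_completed(ev: CompletionEvidence) -> bool:
        # Completion requires the sensor reading to be back in range AND at least
        # one piece of additional evidence (proximity or user acknowledgement).
        in_range = TARGET_RANGE[0] <= ev.pressure_bar <= TARGET_RANGE[1]
        supported = ev.distance_to_item_m <= PROXIMITY_M or ev.user_acknowledged
        return in_range and supported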
According to another aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
display generated content overlaid a view of a part of a physical environment, the part of the physical environment including a target item on which a maintenance task is to be performed, the generated content being indicative of the maintenance task;
capture one or more images of the target item while a user performs the maintenance task;
determine, at least in part based on the captured images, that the maintenance task is completed;
responsive to said determination, automatically record completion of the maintenance task.
Hence, as the system bases the determination on an actual recognition that the required maintenance task is being performed, a more reliable determination of the actual performance of the maintenance task is possible, rather than a mere detection that the secondary symptoms that had led to the initiation of the maintenance task have disappeared.
For example, the maintenance system may be configured to detect a gesture, movement or other action that is associated with the performance of the maintenance task, e.g. the turning of a knob, a valve or the like. Alternatively or additionally, the maintenance system may recognize a predetermined tool or instrument used for the performance of the maintenance task. The gesture, tool, instrument, action, or the like may be recognized per se or in a predetermined spatial association with the target item, e.g. in a predetermined proximity or relative position. Yet further, the maintenance system may determine completion of the maintenance task only based on the recognition of a predetermined plurality, e.g. a predetermined sequence, of multiple gestures, tools, instruments, actions and/or the like.
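A minimal Python sketch of such a sequence check is given below; the action labels are hypothetical outputs of whatever gesture or tool recognizer is used:

    from typing import Iterable

    def sequence_performed(recognized: Iterable[str], required: list) -> bool:
        # True if `required` occurs as an ordered subsequence of `recognized`,
        # i.e. other detected actions may be interleaved between the required ones.
        it = iter(recognized)
        return all(step in it for step in required)

    # Example with hypothetical recognizer labels:
    observed = ["approach_item", "grip_wrench", "turn_valve", "release_wrench"]
    assert sequence_performed(observed, ["grip_wrench", "turn_valve"])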
The sensor signal may be received by the AR enabled user device directly from the sensor, e.g. via a short-range wireless connection or via a wired connection. Alternatively or additionally, the data processing system may receive the sensor signal, e.g. from one or more sensors distributed across the work site and communicatively connectable via a wired or wireless communications network to the data processing system. The data processing system or the AR enabled user device may receive the sensor signal directly from the sensor or from another processing system, such as from a Supervisory Control and Data Acquisition (SCADA) system which in turn receives the signals from the sensors.
In one embodiment the maintenance system includes or is connected to a work site control system (e.g. an IoT/Big Data system), e.g. a data system collecting data from multiple sensors within the work site, such as more than 100 sensors, e.g. more than 1,000 sensors, e.g. more than 10,000 sensors. The data processing system of the maintenance system may thus verify the maintenance task based on data from or about the user device and based on the data from the sensor. In other words, the sensor is connected to a server, such as a server different from the data processing system, rather than being connected to the user device.
Examples of sensors include a pressure sensor, a temperature sensor, a voltmeter, a gas sensor, a flow sensor, a particle sensor, etc.
According to one aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
detect a current position of the AR enabled user device;
receive a sensor signal from one or more user-worn sensors for sensing a physical property of the environment;
log sensor data indicative of the received sensor signal in association with the detected current position.
The user-worn sensors may be integrated into the AR enabled user device or communicatively coupled to the AR enabled user device, e.g. via a short range wireless connection, e.g. based on RF communication, or via a wired connection.
Examples of user-worn sensors include a pressure sensor, a voltmeter, a gas sensor, a chemical sensor, a thermal camera/thermal sensor, e.g. an IR sensor optionally with an aiming laser, a radioactivity sensor, a particle sensor, e.g. based on light scattering, allowing localization of the source of smoke or the like, a ranging device, and a spatial microphone to localize the source of noises (optionally with a user interface for zooming in on particular frequencies or particular noises).
In some embodiments, sensor data from user-worn sensors or sensors that are installed as part of the work site may also be used by the maintenance system to initiate a maintenance task, e.g. by adding the maintenance task to a check list or work order. Examples include an abnormal sensor reading causing the maintenance system to add a to-be-checked maintenance task to a check list. Another example includes an output from a data analytics module of a SCADA or similar system indicating a condition that requires checking (e.g. based on multiple sensor data or based on the development of sensor data over time, etc.). The maintenance system may thus create predictive maintenance flags indicating items to be checked/replaced or otherwise maintained. The added items on a checklist may be indicated to the user via the AR enabled user device, e.g. by highlighting the target item and/or the maintenance task when the AR enabled user device is in a proximity of the target item and/or by providing AR content directing the user towards the item to be checked, e.g. as described in US 2012/0249588.
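As a sketch of how an abnormal reading could add a to-be-checked task to a check list, consider the following Python; the sensor identifier, the normal band and the data structures are assumptions:

    from dataclasses import dataclass, field

    @dataclass
    class MaintenanceTask:
        item_id: str
        description: str

    @dataclass
    class Checklist:
        tasks: list = field(default_factory=list)

    NORMAL_BANDS = {"pump_7.bearing_temp_C": (10.0, 70.0)}  # assumed configuration

    def ingest_reading(sensor_id: str, value: float, checklist: Checklist) -> None:
        # Add a to-be-checked task when a reading leaves its normal band.
        low, high = NORMAL_BANDS[sensor_id]
        if not low <= value <= high:
            checklist.tasks.append(MaintenanceTask(
                item_id=sensor_id.split(".")[0],
                description=f"Check {sensor_id}: reading {value} outside [{low}, {high}]"))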
Aspect 2: interaction with separate display
EP 2 942 717 discloses an AR maintenance system where the user wears a head-mounted AR enabled device.
However, the display of complex information on a head-mounted display may be difficult to read, e.g. when the physical environment is complex, includes varying lighting conditions, etc.
According to one aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device, e.g. a head-mounted device, communicatively connectable to the data processing system; wherein the user device is configured to:
display generated content overlaid a live view of the physical environment of the user;
detect a display of an associated data processing device in a field of view of the head-mounted device;
responsive to said detection, enter a reading mode in which the amount of displayed generated content is at least reduced or completely suppressed.
Accordingly, the user may view additional information on a separate data processing device, different from the user device, with no or only limited visual impairment by the generated content displayed by the user device. For example, the user may use a tablet for easier study of information, e.g. drawings or documentation, and/or for easier text input.
The associated data processing device may be a portable data processing device, such as a tablet computer, a smart phone, a laptop computer, or the like.
The detection of the display may e.g. be based on an AR tag or QR code attached to the associated data processing device, on feature detection, and/or on any other computer vision technology. Alternatively, or additionally, the detection of the device may be done by a signal from the associated data processing device, e.g. an RF signal, an IR signal or the like. The signal may e.g. be sent responsive to a user input to the associated data processing device.
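For example, a QR-code-based detection could look roughly like the following Python sketch using OpenCV's QR detector; the renderer object and its set_overlay_opacity method are hypothetical stand-ins for the device's actual display API:

    import cv2  # OpenCV, assumed available on the user device

    detector = cv2.QRCodeDetector()

    def update_reading_mode(frame, renderer) -> None:
        # frame: camera image (BGR) from the head-mounted device.
        # Suppress AR overlays while the tablet's QR tag is visible.
        found, _points = detector.detect(frame)
        renderer.set_overlay_opacity(0.0 if found else 1.0)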
The associated processing device may have access to similar information as the user device. In some embodiments, the associated data processing device may have access to additional information.
In some embodiments, the associated processing device may be communicatively coupled to the user device, e.g. via a short-range RF connection or another wireless connection, or via a wired connection. Hence, the user device and the associated processing device may be configured to exchange information, e.g. information about the current position, the current maintenance task etc.
In some embodiments, during the reading mode, the user device may cause the associated data processing device to display some or all of the generated content which otherwise would be displayed by the user device overlaid the live view of the environment.
In some embodiments, the user may select certain content to be displayed on the associated device, e.g. by a corresponding hand gesture.
Aspect 3: safety aspects
EP 2 942 717 discloses an AR maintenance system where the user wears a head-mounted AR enabled device.
However, the display of generated content overlaid a live view of the physical environment may distract from the physical environment. Such distraction may be dangerous in certain situations.
According to one aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a portable augmented-reality enabled user device, e.g. a head-mounted device, configured to display generated content overlaid a live view of the physical environment of the user; wherein the user device is configured to:
detect a potentially hazardous situation;
responsive to said detection, automatically enter a safe mode during which the displayed generated content is at least reduced or completely suppressed.
Accordingly, potential risks related to the operation of the maintenance system in potentially hazardous environments are reduced.
In some embodiments, entering the safe mode may include one or more of the following: displaying a warning, playing an audible warning sound, displaying information directing the attention of the user towards the source of danger, turning off the display, etc.
In some embodiments, the user device may be configured, when operated in safe mode, to display evacuation directions or otherwise directing the user towards safety or less hazardous surroundings.
The detection of a potentially hazardous situation may be based on a variety of mechanisms, e.g. one or more of the following (a combined check is sketched after the list):
a signal from a user worn detector as described herein; e.g. a signal from a gas detector, a radioactivity detector, a thermal detector, a proximity detector, etc;
a signal from a server computer (e.g. a control system of the work site, a SCADA system, etc.) indicating malfunction or another hazardous situation within the work site (e.g. based on sensor inputs as described herein);
camera vision / object recognition configured to detect potentially hazardous situations;
a current position, e.g. when the user device detects that it enters into a zone of the work site that is marked as a danger zone; these markings may be dynamically updated in the data processing system of the maintenance system.
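A minimal Python sketch combining such triggers could be as follows; the inputs and the gas limit are assumptions:

    from dataclasses import dataclass

    @dataclass
    class HazardInputs:
        gas_ppm: float          # reading from a user-worn gas detector
        site_alarm: bool        # alarm flag pushed by the work site control system
        in_danger_zone: bool    # current position lies inside a marked danger zone

    GAS_LIMIT_PPM = 50.0        # assumed exposure limit

    def should_enter_safe_mode(h: HazardInputs) -> bool:
        # Any single trigger is sufficient to enter safe mode.
        return h.gas_ppm > GAS_LIMIT_PPM or h.site_alarm or h.in_danger_zone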
In some embodiments, the user device may signal the detected hazardous situation to the data processing system of the maintenance system. Accordingly, other users may be informed and/or certain operations of the work site may be stopped, reduced, etc.
Aspect 4: update 3D model / detect differences / UI for creation of points of interest
US9846972B2 discloses an AR maintenance system including an authoring tool for configuring workflows. The system further uses vision technology to match a captured point cloud with a stored CAD model.
However, many sites undergo changes due to the normal operation as equipment may be moved, added, altered, or removed. Hence, a stored CAD model may not accurately reflect the real-world environment.
According to one aspect, disclosed herein are embodiments of an augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
create, by the user device, a digital 3D model representation of the physical environment within a field of view of the user device;
maintain, by the data processing system, a digital 3D model of a work site based on received digital 3D models from one or more user devices.
Creating the 3D model may be based on one or more images of the physical environment captured by a camera of the user device. The camera may e.g. be a depth camera, thus allowing depth data to be obtained from a depth image captured by the depth camera. The creation of the 3D model may be based on known computer vision techniques, e.g. including structure from motion or the like, such as techniques implemented in a suitable object reconstruction pipeline. The 3D model may e.g. be a surface model of items within the field of view of the user device, e.g. in the form of a mesh representation.
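As an illustration of the first step of such a pipeline, the following Python sketch back-projects a depth image into a point cloud using assumed pinhole camera intrinsics (fx, fy, cx, cy); fusing such clouds into a mesh is left to the reconstruction pipeline:

    import numpy as np

    def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                        cx: float, cy: float) -> np.ndarray:
        # depth_m: HxW array of metric depths; returns an Nx3 point cloud,
        # dropping pixels without a valid depth reading.
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        pts = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]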
In at least some embodiments, the user device may create 3D models of parts of the work site as the user moves about the work site. In some embodiments the creation of 3D models may be performed automatically, e.g. continuously, as the user moves about the work site. Alternatively, the creation of a 3D model may be performed responsive to a user input. For example, the user may view a part of the work site from which a 3D model is to be created.
In particular, this may be useful when the user wishes to create a new point of interest. The point of interest may thus be represented by a macroposition and a microposition. For example, the macroposition may be the position of the user device when the point of interest is created, or of another suitable anchor point, e.g. a reference position defining the position of a part of the work site, e.g. of a machine or other object. The microposition may be a location relative to the anchor point, e.g. a position relative to the position of the machine or object. For example, the microposition may represent the position of a display of the machine relative to a reference position of the machine as a whole. In some embodiments a macroposition may be defined relative to a global coordinate system of the work site, while the microposition may be defined relative to a local coordinate system of a machine or other object (the position of an origin of the local coordinate system relative to the global coordinate system may thus be the macroposition).
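In such a scheme, resolving a point of interest to site coordinates is a simple frame composition, e.g. (all values illustrative):

    import numpy as np

    def global_position(anchor_pos: np.ndarray, anchor_rot: np.ndarray,
                        micro_offset: np.ndarray) -> np.ndarray:
        # anchor_pos: (3,) macroposition in the site frame; anchor_rot: (3, 3)
        # rotation of the local frame; micro_offset: (3,) microposition.
        return anchor_pos + anchor_rot @ micro_offset

    # Example: a display 0.2 m in front of and 0.5 m above a machine's reference point
    machine_pos = np.array([12.0, 4.0, 0.0])   # macroposition in the site frame
    machine_rot = np.eye(3)                    # machine aligned with the site axes
    display_pos = global_position(machine_pos, machine_rot, np.array([0.2, 0.0, 0.5]))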
The user device may transfer the created 3D models to the data processing system of the maintenance system. The data processing system may use the 3D models to create a consolidated 3D model of the entire work site or at least of parts thereof.
In particular, the data processing system may use the received 3D models to maintain an updated 3D model of the work site, as changes to the work site may be represented in the consolidated 3D model.
Alternatively or additionally, the data processing system may use the received 3D models to detect changes in the work site, e.g. new objects having been placed, objects having been removed, objects having been moved, etc. Some of these detected changes may trigger the data processing system to automatically create a point of interest and/or a maintenance task to be performed, e.g. an inspection of an object that has appeared, as this object may have been left unintentionally or may represent a security risk, etc.
Hence, in some embodiments, the maintenance system may be configured to detect changes to the physical environment based on the created 3D model, e.g. by comparing a created 3D model with a reference model, such as a previously created 3D model. The system may further be configured to identify changes that require attention, such as dropped objects, activities without work permit, etc. The system may automatically create points of interest and/or maintenance tasks responsive to identifying changes that require attention.
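One simple way to compare a created 3D model against a reference is a voxel occupancy diff, sketched below in Python; the voxel size is an assumption, and a real system may use more robust registration and comparison:

    import numpy as np

    def occupied_voxels(points: np.ndarray, voxel_size: float = 0.1) -> set:
        # points: Nx3 array; returns the set of occupied voxel indices.
        return set(map(tuple, np.floor(points / voxel_size).astype(int)))

    def detect_changes(reference: np.ndarray, current: np.ndarray,
                       voxel_size: float = 0.1) -> dict:
        ref = occupied_voxels(reference, voxel_size)
        cur = occupied_voxels(current, voxel_size)
        # Newly occupied voxels suggest added/moved objects; vacated voxels
        # suggest removed objects.
        return {"appeared": cur - ref, "disappeared": ref - cur}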
In some embodiments, the user device is configured to create a new point of interest associated to a user-selected part of the created 3D representation.
It will be appreciated that at least some embodiments of an AR maintenance system may combine two or more, e.g. all of the above aspects.
The present disclosure relates to different aspects including the various aspects of an augmented reality maintenance system described above and in the following, corresponding apparatus, systems, methods, and/or products, each yielding one or more of the benefits and advantages described in connection with the first mentioned aspects, and each having one or more embodiments corresponding to the embodiments described in connection with the first mentioned aspect and/or disclosed in the appended claims.
In particular, the present disclosure further relates to computer-implemented methods performed by a maintenance system as described herein.
The present disclosure further relates to a data processing system for a maintenance system as described herein and configured to perform the steps of an embodiment of one or more of the methods disclosed herein. The present disclosure further relates to an augmented reality enabled user device for an AR maintenance system as described herein.
To this end, the data processing system and the user device may each comprise or be connectable to a computer-readable medium from which a computer program can be loaded into a processor, such as a CPU, for execution. The computer-readable medium may thus have stored thereon program code means adapted to cause, when executed on the data processing system/user device, the data processing system/user device to perform the steps of a method described herein. The data processing system may comprise a suitably programmed computer such as a portable computer, a tablet computer, a smartphone, a PDA or another programmable computing device having a graphical user-interface. In some embodiments, the data processing system may include a client system, e.g. including a user interface, and a host or server system. The client and the host system may be connected via a suitable communications network.
Generally, here and in the following the term processor is intended to comprise any circuit and/or device and/or system suitably adapted to perform the functions described herein. In particular, the above term comprises general- or special-purpose programmable microprocessors, such as a central processing unit (CPU) of a computer or other data processing system, Digital Signal Processors (DSP), Application Specific Integrated Circuits (ASIC), Programmable Logic Arrays (PLA), Field Programmable Gate Arrays (FPGA), special purpose electronic circuits, etc., or a combination thereof. The processor may be implemented as a plurality of processing units.
The present disclosure further relates to a computer program product comprising program code means adapted to cause, when executed on a data processing system, said data processing system to perform the steps of one or more of the methods described herein.
The computer program product may be provided as a computer-readable medium, such as a CD-ROM, DVD, optical disc, memory card, flash memory, magnetic storage device, floppy disk, hard disk, etc.
Additional features and advantages will be made apparent from the following detailed description of embodiments that proceeds with reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Preferred embodiments of the invention will be described in more detail in connection with the appended drawings, where
FIG. 1 schematically illustrates an embodiment of a maintenance system disclosed herein.
FIG. 2 schematically illustrates a more detailed view of an embodiment of a maintenance system disclosed herein.
FIG. 3 schematically illustrates another embodiment of a maintenance system disclosed herein.
FIG. 4 schematically illustrates another embodiment of a maintenance system disclosed herein.
FIG. 5 schematically illustrates yet another embodiment of a maintenance system disclosed herein.
FIG. 6 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein.
FIG. 7 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein.
FIG. 8 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein.
FIG. 9 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein.
DETAILED DESCRIPTION
FIG. 1 schematically illustrates an embodiment of a system disclosed herein. The system comprises an AR enabled user device 100 and a data processing system 110 and, optionally, a database 120. Alternatively, the data processing system 110 may comprise or have access to another suitable data storage for storing data pertaining to the work site and/or maintenance tasks. While only one user device 100 is shown in FIG. 1, it will be appreciated that a typical embodiment will comprise multiple user devices.
Generally, some embodiments of the maintenance system described herein may provide a variety of maintenance assistance features for use during performance of maintenance activities. In certain embodiments, the AR enabled user device is a wearable device, e.g. a head-mounted device, worn by a user that is tasked with maintenance tasks associated with the work site, thus allowing hands-free operation. The AR enabled user device may be configured to determine a current position of the user wearing the device. To this end, the AR enabled user device 100 may be configured to scan and analyze a visual field or to use another means of position detection.
The user device 100 may e.g. be a head-mounted AR enabled display device communicatively coupled to the data processing system 110. The user device is preferably a set of AR glasses or a head-mounted device with a gesture interface. Using the hands-free interface, the technician may send his inputs (images, text, and others) to the data processing system 110 for archiving and reporting purposes. The AR enabled user device 100 may be a user device that provides the user with a direct view of the environment, e.g. through glasses, a transparent display, or the like. Alternatively, the user device may be configured to capture a live view, e.g. by a camera, to integrate the captured live view with the generated content, and to present the integrated view to the user. Examples of AR enabled user devices include a head-mounted display, such as a Microsoft HoloLens, a tablet computer such as an iOS tablet using ARKit for tracking 3D space or an Android tablet using ARCore for tracking 3D space, ODG AR glasses, or other suitable AR hardware.
The AR enabled user device 100 may further be configured to scan and analyze a visual field so as to detect a target item or other point of interest within the visual field. Generally, the detection of the current position and/or the detection and/or recognition of target items or other points of interest may be based on any suitable computer vision technology, e.g. based on the detection of AR markers, on object recognition, feature detection, etc.
Generally, the AR enabled user device 100 may be configured to capture one or more images of the physical environment, detect one or more events or operational conditions (e.g. using image processing or other computer vision techniques), and log the detected event or operational condition, optionally associated with the detected position. Examples of detectable conditions may include a detection of a door being open or a knob being in the wrong position. The detected event or condition may be correlated with the operational mode of the work site, as some conditions/events may be acceptable during some operational modes but not in others.
The AR enabled user device 100 may detect the current position of the user device (i.e. of the user carrying/wearing the device). The detection of the current position of the AR enabled user device within the work site may be based on one or more images captured by the AR enabled user device, e.g. based on the recognition of one or more AR tags, on feature recognition, etc. Alternatively or additionally, the detection of a current position of the AR enabled device may be based on motion sensors, e.g. an accelerometer, or based on another suitable positioning technique.
The AR enabled user device 100 may provide generated content overlaid a live view of the environment, e.g. so as to provide position-sensitive information to the user. The information may include one or more of the following:
travel instructions to a target item to be maintained;
positioning or movement instructions for the user to locate the target apparatus in the visual field;
information pertaining to the target item, such as technical data, instructions about the maintenance task to be performed, such as check-lists, animated 3D models, videos, parts of a user manual, etc.
The AR enabled user device 100 may further allow the user to interact with the generated AR content, e.g. so as to select parts of the AR content, enter data, mark items on a check list as being complete, create additional maintenance tasks, etc.
The AR enabled user device 100 may allow the user to interact with the generated content in a variety of ways, e.g. by gesture detection, voice input, gaze detection, etc.
For example, the user may request repair instructions or request a clarification of certain details. The user may relay a task status for recording/logging, create a new point of interest and/or a new maintenance task, etc.
The AR enabled user device 100 may comprise a suitable data storage medium for storing client software and a processor, e.g. a CPU, for executing the client software. The client software may provide the AR functionality, including functionality for presenting generated content overlaid a live view of the environment. The client software may further provide functionality for providing a visual interface for interacting with points of interest, e.g. for checking checklists, filling out data input fields, reading data output fields, viewing a log associated with a point of interest, and/or the like.
The data processing system 110 may be a server computer. The data processing system may be implemented as a suitably programmed computer or as a plurality of suitably programmed computers, e.g. one or more server computers and one or more client computers, as a cloud-based architecture, as a virtual machine or in any other suitable hardware architecture.
Generally, some embodiments of the maintenance system maintain a digital model of at least a part of the work site, e.g. a 2D or 3D model. The digital model may be stored by the data processing system, e.g. in a database 120, a file or another suitable form on a suitable data storage medium. The digital model may be a CAD model or another suitable 3D model, e.g. comprising a surface representation, e.g. a mesh representation, or a volume representation, e.g. a voxel representation, of elements of the work site.
The digital model may be used by an authoring tool to associate maintenance tasks with respective target items of the work site.
The digital model of the work site may be represented or obtained in a variety of ways, e.g. as:
a 2D drawing of a work site, e.g. in PDF, PNG or JPEG format;
a CAD drawing or CAD model of a work site;
a polygon model of a work site, e.g. in FBX or OBJ format;
a 3D model obtained through photogrammetry;
a 3D point cloud scan of the site, e.g. by a 3D camera scan or by 3D laser scanning.
It will be appreciated that a model may be converted by the system into a different format than the original, e.g. into an FBX or OBJ format.
The data processing system 110 may have stored information relating to the work site, e.g. in a suitable database 120 or other suitable data structure. Such information may include one or more of the following:
a digital model or other suitable digital representation of the work site;
historical information of changes to the work site;
points of interest associated with the work site;
user information.
Some or all of the above information may be transferred to one or more AR enabled user devices 100 and/or to other data processing systems.
User information may include one or more of the following: a user name, a rank, a role, contact information such as an e-mail address, authentication information such as a password, and/or the like. For example, the maintenance system may be communicatively coupled or otherwise be integrated with a user management system, such as Active Directory.
The data processing system 110 of the maintenance system may be configured to create data items indicative of lists of maintenance tasks to be performed by a user and to send information indicative of the maintenance tasks to the AR enabled user device of the user. Accordingly, the user device may provide guidance to the user pertaining to the maintenance tasks to be performed. This guidance may include information about which maintenance tasks to perform, where to perform them and how to perform them.
The creation of lists of maintenance tasks may be based on predefined maintenance tasks; however, the lists of maintenance tasks may be dynamic, e.g. based on sensor inputs as described herein, based on previous maintenance tasks (e.g. based on a history of previous checks) performed and/or based on results of previous maintenance tasks.
For example, the system may provide functionality that allows a user, e.g. via the AR enabled user device, to define a subsequent maintenance task to be performed by another user.
In another use scenario where a large number of identical or similar items are to be inspected (e.g. a large number of bolts), the system may select randomized subsets of these items and include them in respective lists of maintenance tasks, so as to spread the inspections out evenly over time.
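A minimal Python sketch of such randomized batching could be as follows; the cycling policy is an assumption, and batch_size is assumed not to exceed the number of items:

    import random

    def next_inspection_batch(item_ids: list, batch_size: int,
                              already_inspected: set) -> list:
        # Prefer items not yet inspected in the current cycle; start a new
        # cycle once the remaining items no longer fill a batch.
        remaining = [i for i in item_ids if i not in already_inspected]
        if len(remaining) < batch_size:
            already_inspected.clear()
            remaining = list(item_ids)
        batch = random.sample(remaining, batch_size)
        already_inspected.update(batch)
        return batch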
At least some embodiments of the maintenance system described herein may provide various benefits, e.g. one or more of the following:
The system may be configured to indicate target items or other object(s) of interest overlaid the view of the object.
The system may be configured to document that the inspector has spent time physically at the object.
The system may be configured to provide the inspector with checklists to fill out and/or with status information for the object.
The system may be configured to provide instructions and allow the inspector to access documentation or other information regarding the object such as drawings, manuals, medical history or the like.
The system may be configured to provide real-time in situ assistance to the technician/inspector in the form of instructions and live data readouts of relevant data e.g. from sensors.
The system may be configured to provide intelligent recommendations to the technician based on visual recognition and deep learning algorithms.
The system may be configured to allow the inspector to setup new data/checks relevant for the object.
The system may be configured to allow the inspector to define new objects of interest by setting a new record on top of the view of the real object.
At least some embodiments of the maintenance system provide functionality for defining points of interest to which maintenance tasks may be associated. The functionality may provide a user interface allowing a user to create, edit and/or delete points of interest. Generally, a point of interest may be indicative of a target item positioned within the work site on which a maintenance task is to be performed. For example, a point of interest may represent: a display on an apparatus, such as a pump, that needs to be read; a piece of equipment that needs visual inspection; a door that needs to be checked as to whether it is locked or unlocked; a patient that needs to be checked up on; and/or the like.
The defined points of interest may be stored as respective data items in a suitable data storage structure, e.g. as respective database records of a database.
A point of interest may include some or all of the following information (a possible record layout is sketched after the list):
an identifier, preferably a unique identifier allowing the point of interest to be uniquely identified within a work site;
information indicative of a relation of the point of interest to the work site;
a name of the point of interest, such as door, pump display, bolt, etc.;
position information indicative of a position of the point of interest within the work site; for example, the position information may include position coordinates and, optionally, rotation coordinates, relative to a suitable coordinate system, e.g. relative to a 3D model; the position information may include an association to an anchor point defined within the work site (or relative to a digital model thereof), e.g. a relative position and/or rotation relative to an anchor point. For example, the anchor point may define a stationary or movable position of a target item within the work site;
additional information, e.g. as described below.
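By way of example only, such a record could be laid out as follows in Python; all field names are assumptions, not part of the disclosure:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class PointOfInterest:
        poi_id: str                                # unique within the work site
        site_id: str                               # relation to the work site
        name: str                                  # e.g. "door", "pump display", "bolt"
        anchor_id: Optional[str] = None            # optional anchor point reference
        position: tuple = (0.0, 0.0, 0.0)          # coordinates, relative to the anchor if set
        rotation: tuple = (0.0, 0.0, 0.0)          # optional rotation coordinates
        extra: dict = field(default_factory=dict)  # checklists, media, logs, etc.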
Generally, the functionality for creating a point of interest may be provided by the data processing system 110, e.g. using a suitable authoring tool. The authoring tool may allow the creation of points of interest relative to a suitable digital model or suitable representation of the work site. Alternatively or additionally, the AR enabled user device 100 may provide functionality that allows a user to create points of interest relative to the current physical environment of the user device. For example, this may be done using the AR enabled visual interface provided by the user device, e.g. as described below. For example, a user may create a new point of interest by simply looking at the item and identifying the corresponding coordinates as a new point of interest.
At least some embodiments of the maintenance system further provide functionality for manually entering and/or for automatically receiving additional information associated with a point of interest, and for associating the entered information with said point of interest. Examples of such additional information may include a description of a maintenance task, a checklist, media content, a log of interactions with or changes to the point of interest, and/or the like. Such additional information may be entered into or received by the system in a variety of ways, such as via suitable data input fields, e.g. for entering text, numerical values, etc., or for uploading images, video, etc. Alternatively or additionally, the system may receive real-time information from a database, e.g. of a SCADA system, and/or historical information from a database.
The system may provide functionality to output additional information associated with a point of interest. For example, the system may output such additional information via the AR enabled user device, e.g. responsive to the user device being in a proximity of the position of a point of interest, such as when the point of interest is within the current field of view of the user device. Examples of additional information presented by the system may include text, check lists, videos or other media content, a graph based on historical data, output of a prediction model based on current and historical data, etc.
To this end, the maintenance system may comprise a database 120 or other data structure for storing or identifying information to be transferred between the data processing system and the AR enabled user devices.
Examples of information to be transferred may include one or more of the following:
data pertaining to one or more points of interest, e.g. of all points of interest associated with a work site or with a part of the work site, or a selected subset of points of interest, e.g. the points of interest associated with a list of maintenance tasks to be performed by a user, etc.;
user information;
information pertaining to interactions with or to changes to points of interest, such as the registration of an inspection and/or other input/output data that is to be output by the user device or that has been input to the user device.
FIG. 2 schematically illustrates a more detailed view of a user device and a data processing system of an embodiment of a maintenance system disclosed herein, e.g. of any of the embodiments of FIGs. 1, 3-5 or of another embodiment. The system comprises an AR enabled user device 100 and a data processing system 110 and, optionally, a database 120, all as described in connection with FIG. 1. While only one user device 100 is shown in FIG. 2, it will be appreciated that a typical embodiment will comprise multiple user devices.
The user device 100 comprises a location identification module 101, an image capture module 102, a processor 103, a display module 104, a memory module 105 and a communications module 106.
The location identification module 101 comprises circuitry, devices and/or computer programs operable to determine a current location of the user device within a work site. The location identification may be based on signals from a communications network, e.g. Wi-Fi, on a global positioning system, on the visual detection of AR markers or QR codes within the physical environment of the user device, on the detection of RFID/NFC tags, or on any other suitable mechanism for detecting the current position of the user device within the work site.
The image capture module 102 may comprise a digital camera or another suitable device for capturing still images and/or videos of the physical environment of the user device. The image capture device may comprise a depth camera, a stereo camera or another device operable to capture depth information in addition or as an alternative to 2D image information.
The processor 103 may comprise a general- or special-purpose programmable microprocessor, such as a central processing unit (CPU). The location identification module, the image capture module, the display module, the memory module and the communications module may be communicatively coupled to the processor. In some embodiments one or more of the modules may be integrated into a single module.
The display module 104 comprises circuitry and/or a device for displaying generated content, such as graphics, text, media content overlaid a live view of the physical environment.
The memory module 105 may comprise a suitable computer-readable medium operable to store data and computer program code to be executed by the processor. The stored data may include information about the work site, e.g. a 3D model or other representation of the work site, and/or information about maintenance tasks to be performed and/or data input by the user.
The communications module 106 comprises circuitry or a device for communicating with a data processing system and/or other devices, e.g. via RF communications, such as short range RF communications, or via a cellular communications network. For example, the communications module may include a network adapter or other interface for establishing communication via a wireless communications network, such as a WLAN, or via a wired network.
The data processing system comprises a user interface module 111, an interface module 112, a processor 113, a database module 114 and a memory module 115.
The user interface module 111 may comprise one or more user terminals or other devices allowing a user to enter information and to view information.
The interface module 112 comprises circuitry or a device for communicating with one or more user devices and/or other devices, such as other data processing systems, e.g. a site control system, a user management system, a SCADA system, etc. The communication may be wired or wireless, e.g. via RF communications, such as short range RF communications, or via a cellular communications network. For example, the interface module may include a network adapter or other interface for establishing communication via a wireless communications network, such as a WLAN, or via a wired network.
The processor 113 may comprise a general- or special-purpose programmable microprocessor, such as a central processing unit (CPU). The user interface module, the interface module, the database module and the memory module may be communicatively coupled to the processor. In some embodiments one or more of the modules may be integrated into a single module.
The database module 114 is configured to provide an interface to a suitable database 120.
The memory module 115 may comprise suitable data storage media for storing information pertaining to the work site and/or maintenance tasks to be performed, such as information about points of interest, a 3D model or other digital representation of the work site, work orders, lists of maintenance tasks to be performed, log data about performed maintenance tasks, etc. Moreover, the memory module may comprise a suitable storage medium for storing one or more computer programs to be executed by the processor 113. The computer programs may include an authoring tool and/or other software applications for providing the functionality described herein. Some or all of the functionality may be implemented to be executed in a variety of client and/or server execution environments, e.g. on a Windows PC, a Mac, an iOS tablet, an Android tablet, etc.
FIG. 3 schematically illustrates another embodiment of a maintenance system disclosed herein. The system comprises an AR enabled user device 100 and a data processing system 110 and, optionally, a database 120, all as described in connection with FIGs. 1-2. While only one user device 100 is shown in FIG. 3, it will be appreciated that a typical embodiment will comprise multiple user devices.
The system of FIG. 3 further comprises a tablet computer 130 or other portable data processing device, such as a laptop, a smartphone, etc. The tablet computer may be communicatively connected to the AR enabled user device 100, e.g. via a short-range RF link, so as to allow data to be exchanged between the user device 100 and the tablet computer 130.
Preferably, the user device detects when the user looks at the display of the tablet computer 130 and, responsive to such detection, enters a reading mode. In reading mode, the user device may reduce the amount of generated AR content displayed overlaid the live view of the environment, or completely suppress display of generated AR content. This way, the user's view on the tablet's display is not (or at least less) impaired.
The user device 100 may detect that the user looks at the display of the tablet 130 in a variety of ways. For example, the detection may be based on an AR tag or QR code attached to the tablet, on feature detection, and/or on any other computer vision technology. Alternatively, or additionally, the detection of the tablet may be done by a signal from the associated data processing device, e.g. an RF signal, an IR signal or the like. The signal may e.g. be sent responsive to a user input to the associated data processing device.
FIG. 4 schematically illustrates another embodiment of a maintenance system disclosed herein. The system comprises an AR enabled user device 100 and a data processing system 110 and, optionally, a database 120, all as described in connection with FIGs. 1-2. While only one user device 100 is shown in FIG. 4, it will be appreciated that a typical embodiment will comprise multiple user devices.
The system of FIG. 4 further comprises a sensor 140 communicatively coupled to, or even integrated into, the user device. The sensor may be a sensor installed within the work site or a user-worn sensor. The sensor may be communicatively coupled to the user device via a wired or wireless connection, e.g. via a short-range RF connection or via a communications network. While only one sensor 140 is shown in FIG. 4, it will be appreciated that some embodiments may comprise multiple sensors. It will further be appreciated that some embodiments may additionally include a tablet computer or similar portable processing device as described in connection with FIG. 3. Moreover, in addition or as an alternative to a sensor communicating directly with the user device, some embodiments may include one or more sensors that communicate with the data processing system 110, either directly or via another system, e.g. as described with reference to FIG. 5 below.
FIG. 5 schematically illustrates yet another embodiment of a maintenance system disclosed herein. The system comprises an AR enabled user device 100 and a data processing system 110 and, optionally, a database 120, all as described in connection with FIGs. 1-2. While only one user device 100 is shown in FIG. 5, it will be appreciated that a typical embodiment will comprise multiple user devices.
The system of FIG. 5 further comprises a number of sensors 151 communicatively coupled to, or even integrated into, a work site control system 150. The sensors may be communicatively coupled to the control system 150 via wired or wireless connections, e.g. via a short-range RF connection or via a communications network. The work site control system may be a SCADA system or other suitable computerised control system. The control system 150 is communicatively coupled to, or even integrated into, the maintenance system, e.g. via a suitable computer network. The control system receives sensor data from the sensors, processes the received sensor data, e.g. to record trends, log data, detect abnormalities, etc. The control system may forward the sensor data and/or processed results pertaining to the sensor data to the maintenance system. The maintenance system may detect situations that require attention and create corresponding maintenance tasks as described herein. This may be done automatically or in a user assisted manner.
FIG. 6 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein, e.g. by the maintenance system described in connection with FIGs. 4 or 5.
In step S61, the user device displays AR content overlaid a live view of the physical environment. The displayed AR content may include instructions or other information pertaining to a maintenance task to be performed by a user wearing or carrying the user device, e.g. a maintenance task to be performed on a target item within a field of view of the user. For example, the maintenance task may include adjusting a valve or dial so as to ensure that a pressure is within the required range.
In step S62, the maintenance system receives sensor data from one or more sensors, e.g. a sensor carried or worn by the user or a sensor of the item on which the maintenance task is performed. For example, the sensor may be a pressure sensor.
In step S63, the maintenance system receives additional information, such as information about the current position of the user device within the work site, e.g. as
recorded by a position detection module of the user device. Other examples of additional information may include a video recorded by the user device of the performance of the maintenance task, or a user input received by the user device indicative of an acknowledgement by the user that the task has been performed.
In step S64, the maintenance system determines that the maintenance task has been completed. This determination may be based on a combination of the sensor data and the additional information. For example, the maintenance system may determine that the maintenance task has been completed only if a pressure sensor indicates that a target pressure is within a prescribed range and if the additional information also confirms completion of the task: for example, if the position information of the user device indicates that the user has been within a proximity of the item at which maintenance was to be performed and/or if computer vision processing of a recorded video indicates that the user has interacted with the target item (e.g. turned a dial or valve) and/or if the user has manually acknowledged completion of the task.
Finally, in step S65, the maintenance system logs completion of the maintenance task.
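As a minimal sketch of the determination in steps S62 through S64 — assuming a pressure-based task and hypothetical parameter names — the completion check could combine the sensor reading with at least one corroborating signal:

```python
# Hedged sketch of the completion determination: the task counts as
# completed only if the pressure is in range AND at least one piece of
# additional information corroborates it. All parameter names are
# assumptions for illustration.

def task_completed(pressure: float,
                   target_range: tuple[float, float],
                   user_near_item: bool,
                   vision_confirms: bool,
                   user_acknowledged: bool) -> bool:
    in_range = target_range[0] <= pressure <= target_range[1]
    corroborated = user_near_item or vision_confirms or user_acknowledged
    return in_range and corroborated

# Example: in-range pressure plus a manual acknowledgement -> completed
assert task_completed(4.2, (3.0, 5.0), False, False, True)
```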
FIG. 7 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein, e.g. by the maintenance system described in connection with FIG. 3.
In step S71, the user device displays AR content overlaid a live view of the physical environment. The displayed AR content may include instructions or other information pertaining to a maintenance task to be performed by a user wearing or carrying the user device, e.g. a maintenance task to be performed on a target item within a field of view of the user. For example, the maintenance task may include adjusting a valve or dial so as to ensure that a pressure is within the required range.
In step S72, the user device detects that the user looks at a display of a secondary device, different from the user device, e.g. of a tablet computer carried by the user.
If the user device detects that the user looks at the display of the secondary device, the user device automatically enters a reading mode (step S73) in which reduced or no generated AR content is displayed by the user device.
Otherwise, if the user device detects that the user no longer looks at the display of the secondary device, the user device resumes normal operation (step S74) in which generated AR content is displayed.
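The loop of steps S72 through S74 amounts to a two-state machine; a minimal sketch, with a boolean placeholder standing in for whichever detection technique the embodiment uses, might look like this:

```python
from enum import Enum, auto

# Sketch only: the S72-S74 loop as a two-state machine. The boolean
# input stands in for whichever detection technique (QR tag, feature
# detection, RF/IR signal) the embodiment uses.

class Mode(Enum):
    NORMAL = auto()
    READING = auto()

def next_mode(looking_at_secondary_display: bool) -> Mode:
    if looking_at_secondary_display:
        return Mode.READING   # step S73: reduce or suppress AR content
    return Mode.NORMAL        # step S74: resume normal AR display
```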
FIG. 8 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein, e.g. by the maintenance system described in connection with any of FIGs. 1 - 5.
In step S81, the user device displays AR content overlaid a live view of the physical environment. The displayed AR content may include instructions or other information pertaining to a maintenance task to be performed by a user wearing or carrying the user device, e.g. a maintenance task to be performed on a target item within a field of view of the user. For example, the maintenance task may include adjusting a valve or dial so as to ensure that a pressure is within the required range.
In step S82, the user device detects a potentially hazardous situation, e.g. based on computer vision technology, based on received sensor inputs, based on a message from the data processing system and/or based on a current position of the user device.
If the user device detects a potentially hazardous situation, the user device automatically enters a safe mode (step S83) in which reduced or no generated AR content is displayed by the user device and/or in which information pertaining to the detected hazardous situation is displayed, such as evacuation directions.
Otherwise, if the user device detects that the potentially hazardous situation has been resolved, the user device resumes normal operation (step S84) in which generated AR content is displayed.
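Combining the example hazard inputs of step S82 into a single safe-mode decision could, as a hedged sketch with assumed input names, be as simple as a disjunction over the sources named above:

```python
# Illustrative only: the safe-mode decision of steps S82-S83 as a
# disjunction over the hazard sources named in the text. HAZARD_ZONES
# and all parameter names are hypothetical.

HAZARD_ZONES = {"zone-7", "zone-12"}  # assumed restricted areas

def should_enter_safe_mode(vision_hazard: bool,
                           sensor_alarm: bool,
                           server_alert: bool,
                           current_zone: str) -> bool:
    return (vision_hazard or sensor_alarm or server_alert
            or current_zone in HAZARD_ZONES)
```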
FIG. 9 shows a flow chart of an embodiment of a process performed by an embodiment of a maintenance system disclosed herein, e.g. by the maintenance system described in connection with any of FIGs. 1 - 5.
In step S91, the user device displays AR content overlaid a live view of the physical environment. The displayed AR content may include instructions or other information pertaining to a maintenance task to be performed by a user wearing or carrying the user device, e.g. a maintenance task to be performed on a target item within a field of view of the user. For example, the maintenance task may include adjusting a valve or dial so as to ensure that a pressure is within the required range.
In step S92, the user device scans the local physical environment of the user device, e.g. within a field of view of a digital camera or depth camera of the user device.
In step S93, the user device creates a local 3D model of the scanned physical environment. The user device communicates the created local 3D model to the data processing system of the maintenance system.
In step S94, the data processing system maintains an updated global 3D model of the work site.
In step S95, the data processing system detects changes in the work site, e.g. a new object, a removed object or a displaced object. Moreover, the data processing system detects, e.g. based on predetermined criteria, based on artificial intelligence, and/or the
like, whether the change requires attention. In some embodiments, the determination is user-assisted. For example, the data processing system may display a detected change and a user may determine whether the change requires attention.
If a change that requires attention is detected, the process proceeds to step S96, where the data processing system creates a data record indicative of a maintenance task, which is forwarded to a user device.
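One way to picture the change detection of steps S93 through S95 is a voxel-occupancy comparison between the local scan and the global model. The following is a minimal sketch under that assumption; the voxel size and function names are illustrative, not the disclosed method:

```python
import numpy as np

# Hedged sketch: reduce both point clouds to occupied-voxel sets and
# diff them. New or displaced objects appear in `added`; removed or
# displaced objects appear in `removed`. The 10 cm voxel is an assumption.

VOXEL = 0.10  # assumed voxel edge length in metres

def voxelise(points: np.ndarray) -> set:
    """Map an (N, 3) point cloud to the set of voxel indices it occupies."""
    return set(map(tuple, np.floor(points / VOXEL).astype(int)))

def detect_changes(local_scan: np.ndarray, global_model: np.ndarray):
    local, model = voxelise(local_scan), voxelise(global_model)
    added = local - model      # candidate new or displaced objects
    removed = model - local    # candidate removed or displaced objects
    return added, removed
```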
Embodiments of the methods described herein can be implemented by means of hardware comprising several distinct elements, and/or at least in part by means of a suitably programmed microprocessor.
In the claims enumerating several means, several of these means can be embodied by one and the same element, component or item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
It should be emphasized that the term comprises/comprising when used in this specification is taken to specify the presence of stated features, elements, steps or components but does not preclude the presence or addition of one or more other features, elements, steps, components or groups thereof.
Claims (19)
1. An augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
display generated content overlaid a live view of a part of a physical environment, the part of the physical environment including a target item on which a maintenance task is to be performed, the generated content being indicative of the maintenance task;
receive, via a wireless or wired data communication interface, a sensor signal from a sensor external to the user device, the sensor signal being indicative of an operational condition of the target item;
determine, at least in part based on the received sensor signal, that the maintenance task has been completed;
responsive to said determination, automatically record completion of the maintenance task.
2. An augmented-reality maintenance system according to claim 1; wherein the determination that the maintenance task has been completed is based on a combination of the received sensor signal and on additional information, in particular on a user input indicative of the completion of the maintenance task and/or on position information indicative of a current position of the user device.
3. An augmented-reality maintenance system according to claim 2; wherein the maintenance system is configured to:
capture one or more images of the target item while a user performs the maintenance task;
determine, at least in part based on the captured images, that the maintenance task is completed;
responsive to said determination, automatically record completion of the maintenance task.
4. An augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
display generated content overlaid a view of a part of a physical environment, the part of the physical environment including a target item on which a
maintenance task is to be performed, the generated content being indicative of the maintenance task;
capture one or more images of the target item while a user performs the maintenance task;
determine, at least in part based on the captured images, that the maintenance task is completed;
responsive to said determination, automatically record completion of the maintenance task.
5. An augmented-reality maintenance system according to any one of the preceding claims; wherein the maintenance system is configured to:
detect a current position of the AR enabled user device;
receive a sensor signal from one or more user-worn sensors for sensing a physical property of the environment;
log sensor data indicative of the received sensor signal in association with the detected current position.
6. An augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
detect a current position of the AR enabled user device;
receive a sensor signal from one or more user-worn sensors for sensing a physical property of the environment;
log sensor data indicative of the received sensor signal in association with the detected current position.
7. An augmented-reality maintenance system according to any one of the preceding claims; wherein the maintenance system is configured to:
- receive sensor data from user-worn sensors or sensors that are installed as part of the work site;
- use the received sensor data to create a data record indicative of a maintenance task to be performed; and
- display generated content indicative of the created data record by the user device.
8. An augmented-reality maintenance system according to any one of the preceding claims; wherein the user device is configured to:
detect a display of an associated data processing device in a field of view of the user device;
responsive to said detection, enter a reading mode in which the amount of displayed generated content is at least reduced or completely suppressed.
9. An augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device, e.g. a head-mounted device, communicatively connectable to the data processing system; wherein the user device is configured to:
display generated content overlaid a live view of the physical environment of the user;
detect a display of an associated data processing device in a field of view of the head-mounted device;
responsive to said detection, enter a reading mode in which the amount of displayed generated content is at least reduced or completely suppressed.
10. An augmented-reality maintenance system according to claim 8 or 9; wherein the associated data processing device is communicatively coupled to the user device; and the user device and the associated data processing device are configured to exchange information.
11. An augmented-reality maintenance system according to any one of claims 8 through 10; wherein, when operated in reading mode, the user device is configured to cause the associated data processing device to display some or all of the generated content which, when not operated in reading mode, is displayed by the user device overlaid the live view of the environment.
12. An augmented-reality maintenance system according to any one of the preceding claims; wherein the user device is configured to:
detect a potentially hazardous situation;
responsive to said detection, automatically enter a safe mode during which the displayed generated content is at least reduced or completely suppressed.
13. An augmented-reality maintenance system comprising a portable augmented-reality enabled user device, e.g. a head-mounted device, configured to display generated
content overlaid a live view of the physical environment of the user; wherein the user device is configured to:
detect a potentially hazardous situation;
responsive to said detection, automatically enter a safe mode during which the displayed generated content is at least reduced or completely suppressed.
14. An augmented-reality maintenance system according to any one of claims 12 through 13; wherein entering the safe mode includes one or more of the following: displaying a warning, playing an audible warning sound, displaying information directing the attention of the user towards the source of danger, turning off the display.
15. An augmented-reality maintenance system according to any one of claims 12 through 14; wherein the user device is configured, when operated in safe mode, to display evacuation directions or otherwise direct the user towards safety or less hazardous surroundings.
16. An augmented-reality maintenance system according to any one of the preceding claims; wherein the maintenance system is configured to:
create, by the user device, a digital 3D model representation of the physical environment within a field of view of the user device;
maintain, by the data processing system, a digital 3D model of a work site based on received digital 3D models from one or more user devices.
17. An augmented-reality maintenance system comprising a data processing system and a portable augmented-reality enabled user device communicatively connectable to the data processing system; wherein the maintenance system is configured to:
create, by the user device, a digital 3D model of the physical environment within a field of view of the user device;
maintain, by the data processing system, a digital 3D model of a work site based on received digital 3D models from one or more user devices.
18. An augmented-reality maintenance system according to any one of claims 16 through 17; wherein the data processing system is configured to detect changes in the work site, based at least in part on the created digital 3D model of the physical environment within a field of view of the user device.
19. An augmented-reality maintenance system according to claim 18; wherein the data processing system is configured to:
- identify changes in the work site that require attention;
- responsive to identifying a change that requires attention, create a data record indicative of a maintenance task to be performed; and
- forward the created data record to the user device for display by the user device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201800123A DK180665B1 (en) | 2018-03-18 | 2018-03-18 | Augmented Reality Maintenance System |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201800123A DK180665B1 (en) | 2018-03-18 | 2018-03-18 | Augmented Reality Maintenance System |
Publications (2)
Publication Number | Publication Date |
---|---|
DK201800123A1 (en) | 2019-10-01 |
DK180665B1 DK180665B1 (en) | 2021-11-11 |
Family
ID=69156124
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DKPA201800123A DK180665B1 (en) | Augmented Reality Maintenance System | 2018-03-18 | 2018-03-18 |
Country Status (1)
Country | Link |
---|---|
DK (1) | DK180665B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4064006A1 (en) * | 2021-03-22 | 2022-09-28 | Siemens Aktiengesellschaft | Identifying a place of interest on a physical object through its 3d model in augmented reality view |
WO2022199995A1 (en) * | 2021-03-22 | 2022-09-29 | Siemens Aktiengesellschaft | Identifying a place of interest on a physical object through its 3d model in augmented reality view |
EP4439206A1 (en) * | 2023-03-29 | 2024-10-02 | FUJIFILM Business Innovation Corp. | Information processing system, program, and information processing method |
Also Published As
Publication number | Publication date |
---|---|
DK180665B1 (en) | 2021-11-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20190919 | PAT | Application published | Effective date: 20190919 |
20211111 | PME | Patent granted | Effective date: 20211111 |