WO2018183179A1 - Method and apparatus for in-situ querying support for industrial environments - Google Patents

Publication number
WO2018183179A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
semantic
physical
query
physical system
Application number
PCT/US2018/024311
Other languages
French (fr)
Inventor
Simon Mayer
John HODGES, JR.
Dan Yu
Original Assignee
Siemens Aktiengesellschaft
Application filed by Siemens Aktiengesellschaft
Publication of WO2018183179A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying

Definitions

  • FIG. 1 is an isometric view of a visualization of an aspect of a manufacturing line according to embodiments of the present disclosure.
  • FIG. 2 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
  • FIG. 3 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
  • FIG. 4 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
  • FIG. 5 is a diagram illustrating the creation of a semantic model according to embodiments of the present disclosure.
  • FIG. 6 is a diagram illustrating the creation of a semantic database that may be queried by a user according to an embodiment of the present disclosure.
  • FIG. 7 is a process flow diagram for a method of in-situ querying of a semantic model according to an embodiment of the present disclosure.
  • FIG. 8 is a block diagram of a computer system that may be used to implement a system for in-situ querying of an industrial system according to an embodiment of the present disclosure.
  • FIG. 1 is a simplified view of a manufacturing line 100. Given the line's components and component types (foundation 101, skid rail 103, skid 109, wheels 105, sensors 107), their functional properties, the connectedness relations between components, and the spatial dimensions of the components, a system may produce a visual rendering of the manufacturing line 100.
  • a semantic model may be created based on the properties of the manufacturing line 100 and its components.
  • An operator uses a cursor 111 in combination with a speech interface (not shown) to interact with the manufacturing line 100 and pose queries. The queries are answered using the underlying semantic model of the manufacturing line 100 and can be as complex as the underlying model's details allow.
  • the commands issued by the user may be provided in natural language.
  • the command will be parsed and the words in the command analyzed for meaning and context to attach meaning to the command.
  • the system may be configured to understand specific words arranged in exact sequences. Accordingly, the system receives a verbal command and attempts to provide a meaningful output based on the meaning derived in view of the system being observed. Examples of different complexity levels are shown with regard to FIG. 2, FIG. 3 and FIG. 4:
  • a command of "Show all proximity sensors" is issued by a user; in response, the system highlights all components of type "Proximity Sensor".
  • a visual representation 200 of the manufacturing line is provided in which the proximity sensors located on the manufacturing line are shown as highlighted blocks 207.
  • a user issues the command, "Show all components that are directly connected to the selected component"; in response, the system highlights all components that are directly connected to the skid rail. Highlighted components may include the skid 309, foundation 301 and proximity sensors 307.
  • a user may issue the command, "Show all sensors that can measure vibration of the selected component"; in response, the system highlights those acceleration sensors 411 that are suitable for measuring the vibration of the skid rail 103. While in the example of FIG. 4 all accelerometers may be shown, it may be the case that some of the accelerometers are not capable of measuring vibration at a particular location. However, the semantic model of the system may support evaluation of the ability of a sensor placed in a first location to measure vibration or another property at a second, different location. Thus, according to some embodiments, the system may return a query result that includes all accelerometers as shown in FIG. 4.
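The three query types illustrated in FIGS. 2-4 can be sketched against a toy semantic model. This is an illustrative stand-in only: the disclosure does not prescribe a storage format or query language, and every component name and predicate below is hypothetical.

```python
# Hypothetical triple store for the manufacturing line; all names are invented.
TRIPLES = {
    ("sensor_1", "type", "ProximitySensor"),
    ("sensor_2", "type", "ProximitySensor"),
    ("accel_1", "type", "AccelerationSensor"),
    ("accel_1", "canMeasure", "vibration"),
    ("accel_1", "monitors", "skid_rail"),
    ("skid_1", "connectedTo", "skid_rail"),
    ("foundation_1", "connectedTo", "skid_rail"),
    ("sensor_1", "connectedTo", "skid_rail"),
}

def components_of_type(type_name):
    # FIG. 2: "Show all proximity sensors" -> all components of a given type.
    return {s for (s, p, o) in TRIPLES if p == "type" and o == type_name}

def directly_connected(component):
    # FIG. 3: components directly connected to the selected component.
    return {s for (s, p, o) in TRIPLES if p == "connectedTo" and o == component}

def sensors_measuring(quantity, component):
    # FIG. 4: sensors that can measure a given quantity of the selected component.
    capable = {s for (s, p, o) in TRIPLES if p == "canMeasure" and o == quantity}
    attached = {s for (s, p, o) in TRIPLES if p == "monitors" and o == component}
    return capable & attached
```

The answer a query returns is only as rich as the underlying model, which mirrors the point above that query complexity is bounded by the model's details.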
  • Embodiments of the invention combine knowledge models to describe complex systems with the ability to query these models using an intuitive modal [speech] interface in combination with an Augmented Reality (AR) solution.
  • FIG. 5 shows a high-level diagram of a basic semantic model.
  • a real-world physical system 501 includes interrelated components that generate data representative of the physical system 501. The data is stored as instances 510.
  • a semantic model 520 is built based on the instance data 510. The semantic model 520 describes the meaning of each instance datum 510 as well as relationships between instances corresponding to relationships between components of the physical system 501.
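As a rough sketch of the relationship between instance data 510 and the semantic model 520: each instance datum is given a meaning (which component produced it and what it represents), and relations between components mirror the structure of the physical system 501. All class, method, and relation names below are assumptions for illustration, not taken from the disclosure.

```python
class SemanticModel:
    """Toy stand-in for the semantic model 520 built over instance data 510."""

    def __init__(self):
        self.instances = {}     # instance id -> what that datum means
        self.relations = set()  # (component, relation, component) triples

    def describe_instance(self, instance_id, component, quantity):
        # Attach meaning to a raw datum: which component produced it,
        # and which physical quantity it represents.
        self.instances[instance_id] = {"component": component,
                                       "quantity": quantity}

    def relate(self, a, relation, b):
        # Record a relationship mirroring the physical system's structure.
        self.relations.add((a, relation, b))

    def meaning_of(self, instance_id):
        return self.instances[instance_id]


model = SemanticModel()
model.describe_instance("reading_042", component="sensor_1", quantity="proximity")
model.relate("sensor_1", "mountedOn", "skid_rail")
```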
  • FIG. 6 shows a block diagram of a system 600 for in-situ querying of an industrial system according to an embodiment of the present disclosure.
  • a real world physical system 501 produces data based on components of the physical system 501.
  • the physical system 501 may be composed of multiple sub-systems, and each sub-system may contain other sub-systems or components that are semantically related in some way. Accordingly, the physical system 501 may be viewed as a system of systems, where each sub-system may be modeled to represent the components and relationships of that sub-system. Similarly, the relationships between different sub-systems may also be modeled to create different semantic models of differing scope.
  • instance data 510 is captured and stored.
  • a number of semantic models 620, 630 and 640 are created to describe and model relationships of some aspect of physical system 501.
  • the semantic information captured in models 620, 630 and 640 represents knowledge of the underlying sub-system being modeled by the semantic model 620, 630, 640.
  • the semantic models 620, 630, 640 are combined into a semantic database 615 that may be queried by a user 660.
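The combination of per-subsystem models into the queryable database 615 could be sketched as a union of triple sets that preserves each triple's source model, so that results can still be traced back to the sub-system they came from. The representation and provenance scheme here are assumptions for illustration.

```python
def combine_models(models):
    """Union several per-subsystem triple sets, keeping per-model provenance."""
    database = set()
    for name, triples in models.items():
        for (s, p, o) in triples:
            # Tag each triple with the name of the model it came from.
            database.add((s, p, o, name))
    return database


model_620 = {("skid_1", "runsOn", "skid_rail")}       # e.g. conveyance sub-system
model_630 = {("sensor_1", "monitors", "skid_rail")}   # e.g. sensing sub-system
database_615 = combine_models({"model_620": model_620,
                               "model_630": model_630})
```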
  • the user 660 may issue queries 662 to the semantic database 615 and the user 660 may receive results 664 of the queries 662.
  • the user 660 may issue queries 662 through a human machine interface device, by way of example, a microphone associated with a head mounted display may be used.
  • the head mounted display also provides the user 660 with a visualization of the surrounding environment and allows the user 660 to interact with the environment through mixed reality (MR) or augmented reality (AR).
  • the head mounted device may include processing capabilities allowing the user to interact with the visualized environment including providing a cursor that can be manipulated by the user 660 to select or highlight objects in the visualization display.
  • software in communication with the head mounted display may receive a voice command from the user 660, parse the command, and generate computer executable instructions representative of the voice command issued by the user 660.
  • Computer executable instructions may include structured queries 662 that are issued to a database management system associated with semantic database 615.
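One plausible way to turn a parsed voice command into a structured query 662 is template matching: a recognized phrase is matched against known patterns and rewritten as a query string. The disclosure does not name a query language; SPARQL-style syntax is assumed here, and both templates are hypothetical.

```python
import re

# Hypothetical phrase templates mapping recognized speech to query strings.
TEMPLATES = [
    (re.compile(r"show all (\w+) sensors", re.IGNORECASE),
     "SELECT ?c WHERE {{ ?c rdf:type :{0}Sensor }}"),
    (re.compile(r"show all components .* connected to the selected component",
                re.IGNORECASE),
     "SELECT ?c WHERE {{ ?c :connectedTo :{sel} }}"),
]


def to_structured_query(command, selected=None):
    """Translate a recognized voice command into a structured query string.

    `selected` is the component currently highlighted by the user's cursor.
    """
    for pattern, template in TEMPLATES:
        match = pattern.search(command)
        if match:
            groups = [g.capitalize() for g in match.groups()]
            return template.format(*groups, sel=selected)
    raise ValueError("command not understood: " + command)
```

A natural-language front end would of course be far more forgiving than two regular expressions; this only illustrates the command-to-query step.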
  • a processor in the head mounted display may receive results 664 to queries 662 issued to semantic database 615 and generate a visualization to the user that is viewable through the head mounted display.
  • the visualization may be projected on top of physical objects being viewed by the user, to provide an augmented reality experience.
  • the visualization may include other physical objects required to illustrate the results 664 of the queries 662.
  • the rendered physical objects may be spatially arranged within the viewed environment of the user providing a mixed reality experience.
  • a user 660 is provided with a visualized view 665 of the physical system 501.
  • requests for information presented as queries 662 are shown within the visualized view 665 to provide the user 660 with additional information about the system being viewed. Any service operation could benefit from this kind of intuitive query to assist in problem analysis of the underlying system.
  • the system provides a powerful tool for on-site and remote engineers.
  • FIG. 7 is a process flow diagram for in-situ querying of an industrial system according to aspects of an embodiment of the present disclosure.
  • the process begins by importing data from one or more knowledge models 750 to a semantic database 755.
  • the knowledge models may represent one or more aspects of a physical system.
  • a user issues a query command 701.
  • the command may be issued as a voice command which is spoken by the user into a microphone.
  • the microphone may be associated with a head-mounted display.
  • An example of a head mounted display is HOLOLENS available from MICROSOFT CORPORATION of Redmond, Washington.
  • the voice command is received by the system 710 and voice recognition is performed on the received voice command 720.
  • the translated command is received in a processor 730 and a structured query is generated based on the translated command 740.
  • the query is issued 745 to the semantic database 755 and a result of the query is received in the processor 760.
  • the processor is in communication with the head mounted display being worn by the user.
  • the processor generates a visualization of the query result 770 and displays the generated visualization to the user 780.
  • the visualization may be overlaid on the environment being viewed by the user to provide an AR experience, or the visualization may include rendered physical objects arranged spatially within the user's field of view to provide an MR experience.
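The FIG. 7 flow can be summarized end to end as a chain of small steps. In this minimal sketch, voice recognition and the head-mounted display are stubbed out, the semantic database 755 is an in-memory triple set, and every name is illustrative rather than drawn from the disclosure.

```python
DATABASE = {  # stand-in for semantic database 755
    ("sensor_1", "type", "ProximitySensor"),
    ("sensor_2", "type", "ProximitySensor"),
    ("accel_1", "type", "AccelerationSensor"),
}


def recognize(audio_text):
    # Steps 710/720: receive the voice command and perform voice recognition (stub).
    return audio_text.lower()


def translate(command):
    # Steps 730/740: derive a structured query pattern from the translated command.
    if "proximity sensors" in command:
        return ("type", "ProximitySensor")
    raise ValueError("unsupported command: " + command)


def run_query(pattern):
    # Steps 745/760: issue the query to the database and receive the result.
    pred, obj = pattern
    return sorted(s for (s, p, o) in DATABASE if p == pred and o == obj)


def visualize(result):
    # Steps 770/780: generate highlight overlays for the head-mounted display.
    return ["highlight:" + component for component in result]


overlay = visualize(run_query(translate(recognize("Show all proximity sensors"))))
```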
  • FIG. 8 illustrates an exemplary computing environment 800 within which embodiments of the invention may be implemented.
  • Computers and computing environments such as computer system 810 and computing environment 800, are known to those of skill in the art and thus are described briefly here.
  • the computer system 810 may include a communication mechanism such as a system bus 821 or other communication mechanism for communicating information within the computer system 810.
  • the computer system 810 further includes one or more processors 820 coupled with the system bus 821 for processing the information.
  • the processors 820 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art. More generally, a processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks, and may comprise any one or combination of hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device.
  • a processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and be conditioned using executable instructions to perform special purpose functions not performed by a general-purpose computer.
  • a processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between.
  • a user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof.
  • a user interface comprises one or more display images enabling user interaction with a processor or other device.
  • the computer system 810 also includes a system memory 830 coupled to the system bus 821 for storing information and instructions to be executed by processors 820.
  • the system memory 830 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 831 and/or random-access memory (RAM) 832.
  • the RAM 832 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM).
  • the ROM 831 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM).
  • system memory 830 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 820.
  • a basic input/output system 833 (BIOS) containing the basic routines that help to transfer information between elements within computer system 810, such as during start-up, may be stored in the ROM 831.
  • RAM 832 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 820.
  • System memory 830 may additionally include, for example, operating system 834, application programs 835, other program modules 836 and program data 837.
  • the computer system 810 also includes a disk controller 840 coupled to the system bus 821 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 841 and a removable media drive 842 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid-state drive).
  • Storage devices may be added to the computer system 810 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
  • the computer system 810 may also include a display controller 865 coupled to the system bus 821 to control a display or monitor 866, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user.
  • the computer system includes an input interface 860 and one or more input devices, such as a keyboard 862 and a pointing device 861, for interacting with a computer user and providing information to the processors 820.
  • the pointing device 861, for example, may be a mouse, a light pen, a trackball, or a pointing stick for communicating direction information and command selections to the processors 820 and for controlling cursor movement on the display 866.
  • the display 866 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 861.
  • an augmented reality device 867 that is wearable by a user, may provide input/output functionality allowing a user to interact with both a physical and virtual world.
  • the augmented reality device 867 is in communication with the display controller 865 and the user input interface 860 allowing a user to interact with virtual items generated in the augmented reality device 867 by the display controller 865.
  • the user may also provide gestures that are detected by the augmented reality device 867 and transmitted to the user input interface 860 as input signals.
  • the computer system 810 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 820 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 830.
  • a memory such as the system memory 830.
  • Such instructions may be read into the system memory 830 from another computer readable medium, such as a magnetic hard disk 841 or a removable media drive 842.
  • the magnetic hard disk 841 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security.
  • the processors 820 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 830.
  • hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • the computer system 810 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein.
  • the term "computer readable medium” as used herein refers to any medium that participates in providing instructions to the processors 820 for execution.
  • a computer readable medium may take many forms including, but not limited to, non-transitory, non-volatile media, volatile media, and transmission media.
  • Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as magnetic hard disk 841 or removable media drive 842.
  • Non-limiting examples of volatile media include dynamic memory, such as system memory 830.
  • Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the system bus 821.
  • Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • the computing environment 800 may further include the computer system 810 operating in a networked environment using logical connections to one or more remote computers, such as remote computing device 880.
  • Remote computing device 880 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 810.
  • computer system 810 may include modem 872 for establishing communications over a network 871, such as the Internet. Modem 872 may be connected to system bus 821 via user network interface 870, or via another appropriate mechanism.
  • Network 871 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 810 and other computers (e.g., remote computing device 880).
  • the network 871 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-6, or any other wired connection generally known in the art.
  • Wireless connections may be implemented using Wi-Fi, WiMAX, and Bluetooth, infrared, cellular networks, satellite or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 871.
  • An executable application comprises code or machine-readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input.
  • An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
  • a graphical user interface comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
  • the GUI also includes an executable procedure or executable application.
  • the executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user.
  • the processor under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
  • the functions and process steps herein may be performed automatically or wholly or partially in response to user command.
  • An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.

Abstract

A system for providing in situ querying of an industrial system includes a computer processor in communication with a head mounted display device. A voice capture device is in communication with the head mounted display and allows a user to issue voice commands to the system. A semantic database containing contextual information about a physical system is queried by the computer processor in the form of a structured query command. The computer processor receives a voice command from the user and translates the voice command into a query that is presented to the semantic database. The computer processor receives a result from the query and visualizes the result to the user via the head mounted display. In an embodiment, one or more semantic models each corresponding to an aspect of the physical system are imported to the semantic database. According to an embodiment, the physical system is a manufacturing line.

Description

METHOD AND APPARATUS FOR IN-SITU QUERYING SUPPORT FOR
INDUSTRIAL ENVIRONMENTS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application Serial No. 62/477,066, filed on March 27, 2017, entitled "In-Situ Querying Support for Industrial Environments", which is incorporated by reference herein in its entirety.
TECHNICAL FIELD
[0002] This application relates to industrial systems. More particularly, this application relates to interaction of virtual and physical components within industrial systems. Presently, information about a system is usually located within the experience of the operators. Based on their knowledge, information about devices or the system may be accessed through various software applications. These applications may be accessed by many methods, e.g., a desktop or laptop computer, a mobile device including a tablet computer, or through physical documentation. The retrieval of information from multiple remote sources typically takes a considerable amount of time and is prone to human error.
[0003] Aspects of a system are often modeled to allow for computer-run simulations that represent the actual operation of the system. Complex systems may include multiple semantic models, each semantic model containing information representing knowledge of some aspect of the system. However, persons required to interact with or service a system may not possess the requisite knowledge of the system or the supporting documentation to make proper decisions. It would be desirable to allow a user on site with a system to request information about the system and receive useful information about the system in a manner that is intuitive and helpful to the user.
BACKGROUND
[0004] Devices and systems often have physical and virtual components that interact with each other in unexpected and non-obvious ways.
SUMMARY
[0005] A system for providing in situ querying of an industrial system includes a computer processor in communication with a human machine interface device. In some embodiments, a voice capture device is in communication with a head mounted display and allows a user to issue voice commands to the system. A semantic database containing contextual information about a physical system is queried by the computer processor in the form of a structured query command. The computer processor receives a voice command from the user and translates the voice command into a query that is presented to the semantic database. The computer processor receives a result from the query and visualizes the result to the user via the human machine interface device. In an embodiment, the visualization may be provided to the user in a head mounted display device. In an embodiment, one or more semantic models each corresponding to an aspect of the physical system are imported to the semantic database. According to an embodiment, the physical system is a manufacturing line. According to aspects of embodiments described herein, the human machine interface may include a voice capture device including a microphone integrated into a head mounted device worn by the user.

[0006] According to aspects of various embodiments, the visualized query result may be displayed to the user as an overlay to an environment being viewed by the user, or the visualized query result may be displayed to the user as visualized objects that are spatially arranged in the user's field of view. According to embodiments, the voice command may be to identify all components of a given type within the physical system. In other embodiments, the voice command may be to highlight all components of the physical system that are in physical contact with an identified component and/or to identify all components in the physical system that can perform a specified function.
[0007] According to aspects of embodiments, the semantic database contains information about a plurality of semantic models, each of the plurality of semantic models being representative of an aspect of the physical system. Further, one of the plurality of semantic models may represent a sub-system of the physical system. In other embodiments, one of the plurality of semantic models represents a sub-system of the physical system that contains a second sub-system of the physical system.
[0008] According to an exemplary method of in situ querying of an industrial system, semantic information from a plurality of semantic models representing a physical system is imported into a semantic database. A voice command from a user relating to the physical system is received and translated into a query. The query is presented to the semantic database and a result of the query is received. The result of the query is visualized, and the visualization is displayed to the user. The visualization may be displayed to a user in a head mounted display device worn by the user. The visualization may be displayed as an overlay to the environment being viewed by the user. In other embodiments, the visualization is displayed as a visualized object that is arranged spatially within the field of view of the user. The physical system may be a manufacturing line. The voice command may be a request from a user to display all components of the physical system of a selected type, or may be a request from the user to display all components of the physical system that are in physical contact with a second identified component.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The foregoing and other aspects of the present invention are best understood from the following detailed description when read in connection with the accompanying drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the specific instrumentalities disclosed. Included in the drawings are the following Figures:
[0010] FIG. 1 is an isometric view of a visualization of an aspect of a manufacturing line according to embodiments of the present disclosure.
[0011] FIG. 2 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
[0012] FIG. 3 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
[0013] FIG. 4 is an isometric view of a visualization of an aspect of a manufacturing line highlighting a query result according to embodiments of the present disclosure.
[0014] FIG. 5 is a diagram illustrating the creation of a semantic model according to embodiments of the present disclosure.

[0015] FIG. 6 is a diagram illustrating the creation of a semantic database that may be queried by a user according to an embodiment of the present disclosure.
[0016] FIG. 7 is a process flow diagram for a method of in-situ querying of a semantic model according to an embodiment of the present disclosure.
[0017] FIG. 8 is a block diagram of a computer system that may be used to implement a system for in-situ querying of an industrial system according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0018] Currently, operators routinely rely on their experience and access device and system information through different software applications. Access is typically gained using a desktop or laptop computer, a tablet, or physical documentation. The information retrieval process from multiple remote sources can take a considerable amount of time and is prone to errors. According to aspects of embodiments described in this disclosure, a specific example of a manufacturing line is presented. The concepts involved may apply to any complex system of any granularity and are not limited only to the examples provided herein.
[0019] FIG. 1 is a simplified view of a manufacturing line 100. Given a semantic description of the line that includes its components and component types (foundation 101, skid rail 103, skid 109, wheels 105, sensors 107), their functional properties, the connectedness relations between components, and the spatial dimensions of the components, a system may produce a visual rendering of the manufacturing line 100. A semantic model may be created based on the properties of the manufacturing line 100 and its components.

[0020] An operator uses a cursor 111 in combination with a speech interface (not shown) to interact with the manufacturing line 100 and pose queries. The queries are answered using the underlying semantic model of the manufacturing line 100 and can be as complex as the underlying model's details allow. The commands issued by the user may be provided in natural language. The command is parsed, and the words in the command are analyzed to attach meaning and context to the command. In some embodiments, the system may be configured to understand specific words arranged in exact sequences. Accordingly, the system receives a verbal command and attempts to provide a meaningful output based on the meaning derived in view of the system being observed. Examples of different complexity levels are shown with regard to FIG. 2, FIG. 3 and FIG. 4:
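The semantic description of the manufacturing line can be sketched as a small set of (subject, predicate, object) triples. This is a minimal illustration only: the component names follow the reference numerals of FIG. 1, while the predicate names (`rdf:type`, `connectedTo`) are assumptions chosen for the sketch, not terms defined by the disclosure.

```python
# Semantic model of manufacturing line 100 as subject-predicate-object
# triples. Predicate vocabulary is an illustrative assumption.
TRIPLES = {
    ("foundation_101", "rdf:type", "Foundation"),
    ("skid_rail_103", "rdf:type", "SkidRail"),
    ("skid_109", "rdf:type", "Skid"),
    ("wheels_105", "rdf:type", "Wheel"),
    ("sensors_107", "rdf:type", "ProximitySensor"),
    ("skid_rail_103", "connectedTo", "foundation_101"),
    ("skid_109", "connectedTo", "skid_rail_103"),
    ("sensors_107", "connectedTo", "skid_rail_103"),
}

def components_of_type(triples, rdf_type):
    """Return every subject declared with the given type."""
    return {s for (s, p, o) in triples if p == "rdf:type" and o == rdf_type}
```

A query such as "Show all proximity sensors" then reduces to `components_of_type(TRIPLES, "ProximitySensor")`.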
• Referring to FIG. 2, a user issues the command, "Show all proximity sensors." In response, the system highlights all components of type "Proximity Sensor." As shown in FIG. 2, a visual representation 200 of the manufacturing line is provided in which the proximity sensors located on the manufacturing line are shown as highlighted blocks 207.
• Referring to FIG. 3, a user issues the command, "Show all components that are directly connected to the selected component." In response, the system highlights all components that are directly connected to the skid rail. Highlighted components may include the skid 309, foundation 301 and proximity sensors 307.
• Referring to FIG. 4, a user may issue the command, "Show all sensors that can measure vibration of the selected component." In response, the system highlights those acceleration sensors 411 that are suitable for measuring the vibration of the skid rail 103. While in the example of FIG. 4 all accelerometers may be shown, it may be the case that some of the accelerometers are not capable of measuring vibration at a particular location. However, the semantic model of the system may support evaluation of the ability of a sensor placed in a first location to measure vibration or another property at a second, different location. Thus, according to some embodiments, the system may return a query result that includes all accelerometers as shown in FIG. 4.
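The "directly connected" query of FIG. 3 reduces to a symmetric lookup over the connectedness relation in the semantic model. A minimal sketch, assuming hypothetical component names and treating physical contact as symmetric:

```python
# Hypothetical connectedness relation for the manufacturing line.
CONNECTED = {
    ("skid_rail_103", "foundation_101"),
    ("skid_109", "skid_rail_103"),
    ("sensors_107", "skid_rail_103"),
}

def directly_connected(edges, component):
    """All components sharing a connection with `component`, in either
    direction, since physical contact is a symmetric relation."""
    out = set()
    for a, b in edges:
        if a == component:
            out.add(b)
        elif b == component:
            out.add(a)
    return out
```

Selecting the skid rail and issuing the FIG. 3 command would then return its foundation, skid, and attached sensors.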
[0021] Embodiments of the invention combine knowledge models that describe complex systems with the ability to query these models using an intuitive speech interface in combination with an augmented reality (AR) solution. This enables the system to render responses to complex queries directly on top of a physical device or system and makes them intuitively usable. In this way, the wealth of information stored in complex, potentially cross-domain knowledge models becomes accessible to average users in an intuitive and efficient way.
[0022] FIG. 5 shows a high-level diagram of a basic semantic model. A real-world physical system 501 includes interrelated components that generate data representative of the physical system 501. The data is stored as instances 510. A semantic model 520 is built based on the instance data 510. The semantic model 520 describes the meaning of each instance datum 510 as well as relationships between instances corresponding to relationships between components of the physical system 501.
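The FIG. 5 relationship between instance data 510 and the semantic model 520 can be sketched as a lookup that attaches meaning to raw data. The field names and type assignments below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical instance data 510 produced by the physical system 501.
instances_510 = [
    {"id": "sensor_a", "value": 0.12},
    {"id": "skid_b", "value": 4.5},
]

# Semantic model 520: the meaning of each instance datum and its
# relationships to other instances (types here are assumptions).
semantic_model_520 = {
    "sensor_a": {"type": "ProximitySensor", "partOf": "manufacturing_line"},
    "skid_b": {"type": "Skid", "partOf": "manufacturing_line"},
}

def describe(datum):
    """Attach the model's meaning to a raw instance datum."""
    meaning = semantic_model_520.get(datum["id"], {"type": "Unknown"})
    return {**datum, **meaning}
```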
[0023] Referring now to FIG. 6, a block diagram of a system 600 for in-situ querying of an industrial system according to an embodiment of the present disclosure is shown. A real-world physical system 501 produces data based on components of the physical system 501. The physical system 501 may be composed of multiple sub-systems, and each sub-system may contain other sub-systems or components that are semantically related in some way. Accordingly, the physical system 501 may be viewed as a system of systems, where each sub-system may be modeled to represent the components and relationships of that sub-system. Similarly, the relationships between different sub-systems may also be modeled to create semantic models of differing scope. Referring again to FIG. 6, instance data 510 is captured and stored. A number of semantic models 620, 630 and 640 are created to describe and model relationships of some aspect of the physical system 501. The semantic information captured in models 620, 630 and 640 represents knowledge of the underlying sub-system being modeled by the semantic model 620, 630, 640. According to an embodiment, the semantic models 620, 630, 640 are combined into a semantic database 615 that may be queried by a user 660. The user 660 may issue queries 662 to the semantic database 615 and receive results 664 of the queries 662. The user 660 may issue queries 662 through a human machine interface device; by way of example, a microphone associated with a head mounted display may be used. While the examples described herein depict a head mounted device for providing queries to the system and displaying the results of the queries, the invention is not so limited. Other human machine interfaces allowing a user to communicate queries to the system and to receive results of the queries fall within the scope and spirit of this disclosure.
For example, speech-based input/output systems may be used, as well as other systems that may be contemplated by one of skill in the art. The head mounted display also provides the user 660 with a visualization of the surrounding environment and allows the user 660 to interact with the environment through mixed reality (MR) or augmented reality (AR). The head mounted device may include processing capabilities allowing the user to interact with the visualized environment, including providing a cursor that can be manipulated by the user 660 to select or highlight objects in the visualization display. Similarly, software in communication with the head mounted display may receive a voice command from the user 660, parse the command, and generate computer executable instructions representative of the voice command issued by the user 660. Computer executable instructions may include structured queries 662 that are issued to a database management system associated with the semantic database 615.
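The translation of a parsed voice command into a structured query can be sketched with simple pattern matching. This is a minimal illustration under stated assumptions: the alias table, the "Show all ..." pattern, and the SPARQL-style output string are all hypothetical; a production system would use a fuller natural-language pipeline.

```python
import re

# Hypothetical alias table from spoken phrases to model type names.
TYPE_ALIASES = {"proximity sensors": "ProximitySensor", "skids": "Skid"}

def command_to_query(command):
    """Translate a recognized "Show all <type>" utterance into a
    SPARQL-style structured query string. Returns None for utterances
    outside the understood pattern."""
    m = re.match(r"show all (.+)$", command.strip().lower())
    if m is None:
        return None
    rdf_type = TYPE_ALIASES.get(m.group(1))
    if rdf_type is None:
        return None
    return f"SELECT ?c WHERE {{ ?c rdf:type :{rdf_type} }}"
```

The resulting query string would then be issued to the database management system associated with the semantic database.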
[0024] A processor in the head mounted display may receive results 664 of queries 662 issued to the semantic database 615 and generate a visualization that is viewable by the user through the head mounted display. The visualization may be projected on top of physical objects being viewed by the user to provide an augmented reality experience. In other embodiments, the visualization may include renderings of other physical objects required to illustrate the results 664 of the queries 662. The rendered objects may be spatially arranged within the user's viewed environment, providing a mixed reality experience.
[0025] Through the system of FIG. 6, a user 660 is provided with a visualized view 665 of the physical system 501. In addition, responses to requests for information presented as queries 662 are shown within the visualized view 665 to provide the user 660 with additional information about the system being viewed. Any service operation could benefit from this kind of intuitive querying to assist problem analysis of the underlying system. The system provides a powerful tool for site and remote engineers in order to:
• shorten the time spent on asset survey, assessment and analytics
• improve quality of the service by allowing site engineers to gain more insight into the target system and/or
• record the service process for future tracing / improvement.
[0026] FIG. 7 is a process flow diagram for in-situ querying of an industrial system according to aspects of an embodiment of the present disclosure. The process begins by importing data from one or more knowledge models 750 to a semantic database 755. The knowledge models may represent one or more aspects of a physical system. Once the semantic database 755 is established, a user issues a query command 701. The command may be issued as a voice command which is spoken by the user into a microphone. The microphone may be associated with a head-mounted display. An example of a head mounted display is HOLOLENS available from MICROSOFT CORPORATION of Redmond, Washington. The voice command is received by the system 710 and voice recognition is performed on the received voice command 720. The translated command is received in a processor 730 and a structured query is generated based on the translated command 740. The query is issued 745 to the semantic database 755 and a result of the query is received in the processor 760. The processor is in communication with the head mounted display being worn by the user. The processor generates a visualization of the query result 770 and displays the generated visualization to the user 780. The visualization may be overlaid on the environment being viewed by the user to provide an AR experience, or the visualization may include rendered physical objects arranged spatially within the user's field of view to provide an MR experience.
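The FIG. 7 flow can be collapsed into a minimal end-to-end sketch: the semantic database is populated (750, 755), a recognized voice command is translated into a query (710 through 745), and the query result becomes the set of components for the display to highlight (760 through 780). All names and data below are illustrative assumptions.

```python
# Hypothetical semantic database 755 populated from knowledge models 750.
SEMANTIC_DB = {
    ("sensor_207a", "rdf:type", "ProximitySensor"),
    ("sensor_207b", "rdf:type", "ProximitySensor"),
    ("skid_109", "rdf:type", "Skid"),
}

def handle_voice_command(command):
    """Parse a 'Show all <type>' command, run the query, and return the
    set of components to highlight in the visualization."""
    words = command.lower().rstrip(".").split()
    if words[:2] != ["show", "all"]:
        return set()  # command not understood; nothing to highlight
    wanted = {"proximity": "ProximitySensor", "skids": "Skid"}.get(words[2])
    return {s for (s, p, o) in SEMANTIC_DB if p == "rdf:type" and o == wanted}
```

In the actual system the returned set would drive the AR or MR rendering step rather than being returned to the caller.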
[0027] FIG. 8 illustrates an exemplary computing environment 800 within which embodiments of the invention may be implemented. Computers and computing environments, such as computer system 810 and computing environment 800, are known to those of skill in the art and thus are described briefly here.
[0028] As shown in FIG. 8, the computer system 810 may include a communication mechanism such as a system bus 821 or other communication mechanism for communicating information within the computer system 810. The computer system 810 further includes one or more processors 820 coupled with the system bus 821 for processing the information.
[0029] The processors 820 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art. More generally, a processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and be conditioned using executable instructions to perform special purpose functions not performed by a general-purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between. A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof. A user interface comprises one or more display images enabling user interaction with a processor or other device.
[0030] Continuing with reference to FIG. 8, the computer system 810 also includes a system memory 830 coupled to the system bus 821 for storing information and instructions to be executed by processors 820. The system memory 830 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 831 and/or random-access memory (RAM) 832. The RAM 832 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM). The ROM 831 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM). In addition, the system memory 830 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 820. A basic input/output system 833 (BIOS) containing the basic routines that help to transfer information between elements within computer system 810, such as during start-up, may be stored in the ROM 831. RAM 832 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 820. System memory 830 may additionally include, for example, operating system 834, application programs 835, other program modules 836 and program data 837.

[0031] The computer system 810 also includes a disk controller 840 coupled to the system bus 821 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 841 and a removable media drive 842 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid-state drive). Storage devices may be added to the computer system 810 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
[0032] The computer system 810 may also include a display controller 865 coupled to the system bus 821 to control a display or monitor 866, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user. The computer system includes an input interface 860 and one or more input devices, such as a keyboard 862 and a pointing device 861, for interacting with a computer user and providing information to the processors 820. The pointing device 861, for example, may be a mouse, a light pen, a trackball, or a pointing stick for communicating direction information and command selections to the processors 820 and for controlling cursor movement on the display 866. The display 866 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 861. In some embodiments, an augmented reality device 867 that is wearable by a user may provide input/output functionality allowing a user to interact with both a physical and virtual world. The augmented reality device 867 is in communication with the display controller 865 and the user input interface 860, allowing a user to interact with virtual items generated in the augmented reality device 867 by the display controller 865. The user may also provide gestures that are detected by the augmented reality device 867 and transmitted to the user input interface 860 as input signals.
[0033] The computer system 810 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 820 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 830. Such instructions may be read into the system memory 830 from another computer readable medium, such as a magnetic hard disk 841 or a removable media drive 842. The magnetic hard disk 841 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security. The processors 820 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 830. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
[0034] As stated above, the computer system 810 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein. The term "computer readable medium" as used herein refers to any medium that participates in providing instructions to the processors 820 for execution. A computer readable medium may take many forms including, but not limited to, non-transitory, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as magnetic hard disk 841 or removable media drive 842. Non-limiting examples of volatile media include dynamic memory, such as system memory 830. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the system bus 821. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
[0035] The computing environment 800 may further include the computer system 810 operating in a networked environment using logical connections to one or more remote computers, such as remote computing device 880. Remote computing device 880 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 810. When used in a networking environment, computer system 810 may include modem 872 for establishing communications over a network 871, such as the Internet. Modem 872 may be connected to system bus 821 via user network interface 870, or via another appropriate mechanism.
[0036] Network 871 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 810 and other computers (e.g., remote computing device 880). The network 871 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-6, or any other wired connection generally known in the art. Wireless connections may be implemented using Wi-Fi, WiMAX, Bluetooth, infrared, cellular networks, satellite or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 871.
[0037] An executable application, as used herein, comprises code or machine-readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine-readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
[0038] A graphical user interface (GUI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions. The GUI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user. The processor, under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
[0039] The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.
[0040] The system and processes of the figures are not exclusive. Other systems, processes and menus may be derived in accordance with the principles of the invention to accomplish the same objectives. Although this invention has been described with reference to particular embodiments, it is to be understood that the embodiments and variations shown and described herein are for illustration purposes only. Modifications to the current design may be implemented by those skilled in the art, without departing from the scope of the invention. As described herein, the various systems, subsystems, agents, managers and processes can be implemented using hardware components, software components, and/or combinations thereof. No claim element herein is to be construed under the provisions of 35 U.S.C. 112, sixth paragraph, unless the element is expressly recited using the phrase "means for."

Claims

What is claimed is:
1. A system for in situ querying of an industrial system comprising: a computer processor; a human machine interface device in communication with the computer processor; and a semantic database containing contextual information about a physical system; wherein the computer processor receives a voice command from a user and translates the voice command into a query that is presented to the semantic database, and the computer processor receives a result from the query and visualizes the result to the user via the human machine interface device.
2. The system of claim 1, further comprising: one or more semantic models each corresponding to an aspect of the physical system, wherein knowledge stored as information in the one or more semantic models is imported to the semantic database.
3. The system of claim 1, wherein the physical system is a manufacturing line.
4. The system of claim 1, wherein the visualized query result is displayed to the user as an overlay to an environment being viewed by the user.
5. The system of claim 1, wherein the visualized query result is displayed to the user as visualized objects that are spatially arranged in the user's field of view.
6. The system of claim 1, wherein the voice command is to identify all components of a given type within the physical system.
7. The system of claim 1, wherein the voice command is to highlight all components of the physical system that are in physical contact with an identified component.
8. The system of claim 1, wherein the voice command is to identify all components in the physical system that can perform a specified function.
9. The system of claim 1, wherein the semantic database contains information about a plurality of semantic models, each of the plurality of semantic models being representative of an aspect of the physical system.
10. The system of claim 9, wherein one of the plurality of semantic models represents a sub-system of the physical system.
11. The system of claim 9, wherein one of the plurality of semantic models represents a sub-system of the physical system that contains a second sub-system of the physical system.
12. The system of claim 1, further comprising: a voice recognition circuit, in communication with the computer processor, for receiving a voice command from the user and translating the voice command to a translated computer command.
13. The system of claim 1, wherein the human machine interface comprises: a head mounted display device; and a microphone integrated in the head mounted display.
14. A method of in situ querying of an industrial system comprising: importing semantic information from a plurality of semantic models representing a physical system into a semantic database; receiving a voice command from a user relating to the physical system; translating the voice command into a query; presenting the query to the semantic database; receiving a result of the query; visualizing the result of the query; and displaying the visualization to the user.
15. The method of claim 14, further comprising: displaying the visualization to the user in a head mounted display worn by the user.
16. The method of claim 15, further comprising: displaying the visualization as an overlay on the environment being viewed by the user in the head mounted display.
17. The method of claim 15, further comprising: displaying the visualization as a visualized object and arranging the visualized object spatially within the field of view of the user.
18. The method of claim 14, wherein the physical system is a manufacturing line.
19. The method of claim 18, further comprising: receiving a request from a user to display all components of the physical system of a selected type.
20. The method of claim 18, further comprising: receiving a request from the user to display all components of the physical system that are in physical contact with a second identified component.
PCT/US2018/024311 2017-03-27 2018-03-26 Method and apparatus for in-situ querying support for industrial environments WO2018183179A1 (en)

Applications Claiming Priority (2)

US201762477066P, priority date 2017-03-27, filed 2017-03-27
US62/477,066, priority date 2017-03-27

Publications (1)

WO2018183179A1, published 2018-10-04

Family

ID=62063163

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/024311 WO2018183179A1 (en) 2017-03-27 2018-03-26 Method and apparatus for in-situ querying support for industrial environments

Country Status (1)

Country Link
WO (1) WO2018183179A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347353A (en) * 2019-06-27 2019-10-18 西安理工大学 A kind of interactive mode 3D printing system and its Method of printing
WO2023230902A1 (en) * 2022-05-31 2023-12-07 西门子股份公司 Human-machine interaction method and apparatus, electronic device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140247283A1 (en) * 2012-08-28 2014-09-04 Geun Sik Jo Unifying augmented reality and big data
WO2016179248A1 (en) * 2015-05-05 2016-11-10 Ptc Inc. Augmented reality system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140247283A1 (en) * 2012-08-28 2014-09-04 Geun Sik Jo Unifying augmented reality and big data
WO2016179248A1 (en) * 2015-05-05 2016-11-10 Ptc Inc. Augmented reality system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347353A (en) * 2019-06-27 2019-10-18 西安理工大学 A kind of interactive mode 3D printing system and its Method of printing
CN110347353B (en) * 2019-06-27 2022-09-30 西安理工大学 Interactive 3D printing system and printing method thereof
WO2023230902A1 (en) * 2022-05-31 2023-12-07 西门子股份公司 Human-machine interaction method and apparatus, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
US9442475B2 (en) Method and system to unify and display simulation and real-time plant data for problem-solving
US8799796B2 (en) System and method for generating graphical dashboards with drill down navigation
US8843883B2 (en) System and method for model-driven dashboard for business performance management
US10771350B2 (en) Method and apparatus for changeable configuration of objects using a mixed reality approach with augmented reality
JP2018014130A (en) Business activity monitoring runtime
US20130073969A1 (en) Systems and methods for web based application modeling and generation
US20210241893A1 (en) Dashboard Usage Tracking and Generation of Dashboard Recommendations
Guo et al. Research and development of monitoring system and data acquisition of CNC machine tool in intelligent manufacturing
US11126938B2 (en) Targeted data element detection for crowd sourced projects with machine learning
CN112269799A (en) Data query method, device, equipment and medium
US11681961B2 (en) Flexible work breakdown structure
Schumann et al. Evaluation of augmented reality supported approaches for product design and production processes
US20190129832A1 (en) System and method for test data generation for use in model based testing using source code test annotations and constraint solving
WO2018183179A1 (en) Method and apparatus for in-situ querying support for industrial environments
US20110087614A1 (en) System for representing an organization
JP2022006178A (en) Processing method, device, and electronic device of deep model visualization data
Pavlov et al. Case study of using virtual and augmented reality in industrial system monitoring
US8095229B2 (en) Three-dimensional (3D) manufacturing process planning
US20150199105A1 (en) Automatic selection of center of rotation for graphical scenes
Fernando et al. Constraint-based immersive virtual environment for supporting assembly and maintenance tasks
Esengün et al. Development of an augmented reality-based process management system: The case of a natural gas power plant
Götze et al. Context awareness and augmented reality in facility management
CN113794604A (en) Network security situation display method, device, equipment and storage medium
CN110781226B (en) Data analysis method, device, storage medium, equipment and system
US20130138690A1 (en) Automatically identifying reused model artifacts in business process models

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18720425
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18720425
    Country of ref document: EP
    Kind code of ref document: A1