WO2019194697A1 - Instrumentation overview device and method for a machine instrumented with a plurality of sensors - Google Patents

Instrumentation overview device and method for a machine instrumented with a plurality of sensors

Info

Publication number
WO2019194697A1
Authority
WO
WIPO (PCT)
Prior art keywords
entity
sensor
sensors
coordinates
machine
Prior art date
Application number
PCT/RU2018/000215
Other languages
English (en)
Inventor
Bogdan Viktorivich CHADKIN
Egor Sergeevich GOLOSHCHAPOV
Mikhail Alexandrovich KALINKIN
Vitaly Sergeevich MELNIKOV
Sergey Sergeevich ZOBNIN
Anton Borisovich KORELYAKOV
Alexander Vladimirovich LOGINOV
Maria Andreevna KURBATOVA
Martin KEGALJ
Jerry Klopf
Lukas LIESS
Jochen LÜTCHE
Mark Schmitt
Tristan TRAORE
Michael Zidorn
Andreas Kalmbach
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft
Priority to PCT/RU2018/000215
Publication of WO2019194697A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors

Definitions

  • the present invention relates to the field of industrial machinery, more particularly to the field of prototyping and monitoring of such machinery, and specifically, to an instrumentation overview device and method for a machine instrumented with a plurality of sensors.
  • an instrumentation overview device for a machine instrumented with a plurality of sensors.
  • the instrumentation overview device comprises: a first entity configured to acquire a 3D model of the machine, a second entity configured to acquire 3D coordinates of each of the plurality of sensors, a third entity configured to acquire a sensor value for each of the plurality of sensors and a fourth entity configured to generate an interactive screen comprising a plurality of interactive components.
  • the fourth entity is configured to generate the interactive screen based on the 3D model of the machine, the 3D coordinates of the plurality of sensors and the acquired sensor values.
  • the instrumentation overview device may advantageously visualize the acquired 3D model, the acquired sensor values, and the acquired 3D sensor coordinates (sensor positions), in a correlated fashion on a single interactive screen.
  • a user may navigate through the interactive screen to access the acquired sensor values.
  • a necessity to refer to, and manually correlate, 2D drawings and measurement lists, is obviated.
  • any change made by an engineer to the 3D model may automatically and immediately be reflected on the interactive screen during prototyping. Prototyping and monitoring the machine can be made more efficient and may be achieved with a higher level of automation.
  • the instrumentation overview device is configured to perform automated processing for acquiring and visualizing the 3D model, the 3D coordinates and the sensor values on the interactive screen. That is, beneficially, manual intervention may not be necessary, any calculations and conversions may be automatically performed by the instrumentation overview device, and an interactive visualization may be achieved through automated processing directly from raw data, i.e., directly from the acquired 3D model, 3D coordinates and sensor values.
  • the machine may be any piece of industrial machinery, such as a large gas turbine, and may be instrumented with a plurality of sensors that are arranged inside and/or alongside the machine so as to provide readings relating to an operating condition of the machine, like temperature, pressure, flow, etc.
  • the 3D model may be, for example, acquired from a 3D CAD model, such as a 3D CAD file in a file format such as DWG or a comparable CAD file format.
  • the 3D coordinates may be coordinates in a Cartesian, polar, cylindrical or any other coordinate system, and may be acquired from a table, such as a data file in XLS, CSV or XML or any other file format.
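  • As a non-authoritative illustration of acquiring such a coordinate table, the following minimal TypeScript sketch parses sensor 3D coordinates from a CSV file; the column names (name, x, y, z) are assumptions made for this sketch, not part of the disclosure.

```typescript
// Hypothetical sketch: parsing sensor 3D coordinates from a CSV table.
// The column names (name, x, y, z) are assumptions for illustration only.
interface SensorCoordinate {
  name: string;
  x: number;
  y: number;
  z: number;
}

function parseSensorCoordinates(csv: string): SensorCoordinate[] {
  const [header, ...rows] = csv.trim().split("\n");
  const cols = header.split(",").map((c) => c.trim().toLowerCase());
  const col = (name: string) => cols.indexOf(name);
  return rows.map((row) => {
    const cells = row.split(",").map((c) => c.trim());
    return {
      name: cells[col("name")],
      x: Number(cells[col("x")]),
      y: Number(cells[col("y")]),
      z: Number(cells[col("z")]),
    };
  });
}

// Example: parseSensorCoordinates("name,x,y,z\nT-101,1.2,0.4,3.9")
// yields [{ name: "T-101", x: 1.2, y: 0.4, z: 3.9 }].
```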
  • the sensor value may refer to any type of acquired sensor data.
  • the sensor value may comprise at least a sensor reading and/or a sensor alarm state.
  • An interactive screen may refer to any combination of code and data suitable to cause a display device, which may be internal or external to the instrumentation overview device, to display an interactive graphical visualization of the machine, the sensors and the sensor values, and/or may refer to the actual graphical content of said interactive graphical visualization.
  • the interactive screen may be bitmap data, vector data, HTML, XML, Flash, Java or JavaScript code and/or data, or any combination thereof.
  • An interactive component may refer to any portion of the interactive screen that may be manipulated by a user input such as a user interaction using a mouse, keyboard, touch panel, voice control or other periphery connected directly or indirectly to the fourth entity.
  • the fourth entity may be configured to generate, update, display, partially or fully redraw the interactive screen in response to such user input.
  • the first entity may acquire the 3D model and the second entity may acquire the 3D coordinates by a user entering the 3D model and/or the 3D coordinates and/or by receiving the 3D model and/or the 3D coordinates from an external computing or storage unit.
  • the third entity may acquire the sensor value for each of the plurality of sensors by reading the sensor value from the machine, such as from an interface of the machine, and/or by directly reading the sensor value from each sensor of the machine. Alternatively, the third entity may acquire the sensor value by receiving the sensor value from an external computing unit or an external storage unit. According to a variant, the sensor value may further comprise meta data. According to another variant, meta data for a respective sensor may be acquired along with the 3D coordinates for each sensor. According to yet a further variant, meta data for a plurality of sensors may be acquired separately by a meta data acquisition entity of the instrumentation overview device.
  • Meta data may refer to information about a respective sensor.
  • examples of meta data include an acceptable sensor value range, a sensor location, a sensor name, information indicative of a person or entity that requested the sensor to be installed and/or queried (a so-called requestor), information indicative of a design drawing in which the sensor is included or of a design document in which further information about the sensor may be found, etc.
  • the fourth entity may be configured to generate the interactive screen based on the 3D model of the machine, the 3D coordinates of the plurality of sensors, the acquired sensor values and the meta data.
  • a user operating the machine may advantageously be provided with easy access to meta data such as design information and the like directly on the interactive screen.
  • the plurality of interactive components comprises a graphical map of the machine instrumented with the plurality of sensors and, for each of the plurality of sensors, a sensor tag associated with the respective sensor and positioned on the graphical map in accordance with the 3D coordinates of the respective sensor.
  • a graphical map may refer to a portion of the interactive screen in which a graphical visualization of a portion of the machine or the full machine is displayed.
  • the graphical visualization may be 2D or 3D.
  • the fourth entity may be configured to allow a user to navigate the map, for example, to pan and zoom the map, and to access the sensor readings acquired for each sensor.
  • a sensor tag may refer to a graphical representation of a sensor installed inside or alongside the machine.
  • a sensor tag may be an icon such as a square, circle, triangle or diamond displayed on the graphical map.
  • a specific graphical appearance of the sensor tag such as a sensor tag shape may depend upon sensor meta data such as sensor type.
  • the instrumentation overview device may aid a user in quickly gaining an overview of the positions of the respective sensors inside the machine. A time required to locate a specific sensor is reduced, and an efficiency of prototyping of the machine and/or of error localization is increased.
  • the fourth entity is configured to change a graphical appearance of each sensor tag on the interactive screen dependent upon the acquired sensor value for the sensor corresponding to the respective sensor tag.
  • the fourth entity may color sensor tags of sensors with an alarm state in a different color than sensor tags without an alarm state, and/or may color sensor tags with high readings in a different color than sensor tags with low readings, wherein "high" and "low" may be defined in relation to a user-definable threshold and/or a respective acceptable sensor reading range.
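  • A minimal sketch of such a coloring rule follows; the alarm flag, the acceptable range and the color names are illustrative assumptions, not part of the disclosure.

```typescript
// Sketch of a tag-coloring rule as described above. The alarm flag,
// the acceptable range and the color names are illustrative assumptions.
interface SensorValue {
  reading: number;
  alarm: boolean;
}

interface AcceptableRange {
  min: number;
  max: number;
}

function tagColor(value: SensorValue, range: AcceptableRange): string {
  if (value.alarm) return "red"; // alarm state overrides reading-based coloring
  const threshold = (range.min + range.max) / 2; // stands in for a user-definable threshold
  return value.reading > threshold ? "lightgrey" : "darkgrey"; // high vs. low reading
}
```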
  • the instrumentation overview device may advantageously provide an automated visualization of the instrumentation of the machine and the specific conditions prevailing inside the machine.
  • the plurality of interactive components comprises, for each of the plurality of sensors, a sensor list item associated with the respective sensor and indicative of at least the acquired sensor value for the respective sensor, and the fourth entity is configured to highlight, in response to a user input selecting one of the sensor list items, the corresponding sensor tag and/or to highlight, in response to a user input selecting one of the sensor tags, the corresponding sensor list item on the interactive screen.
  • to highlight may refer to any change of a graphical appearance of the interactive screen or portions thereof in a way to potentially draw a user's attention to a highlighted component, such as changing a color, a brightness, or a background color of the highlighted component and/or zooming in and centering on the highlighted component.
  • the plurality of sensor list items may be part of a sensor list component on the interactive screen.
  • the sensor list component may be scrollable using a scroll bar, a mouse wheel or similar.
  • Each sensor list item may be indicative of further information about the respective sensor in addition to the sensor value acquired for the respective sensor, such as the respective sensor's name, description, position, reading, alarm state and meta data of the sensor.
  • the instrumentation overview device highlights a sensor list item when a user selects a corresponding sensor tag in the graphical map, and/or highlights a sensor tag in the graphical map when a user selects a corresponding sensor list item.
  • This may provide the advantage of an automated and visual correlation between a sensor reading - as displayed in the sensor list item - and a sensor position, as visualized by the corresponding sensor tag being positioned in accordance with the 3D coordinates of the respective sensor on the graphical map.
  • An efficiency in identifying a position of a sensor corresponding to a certain reading and/or in identifying a sensor reading corresponding to a certain sensor tag on the map may be increased.
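  • A minimal sketch of this bidirectional highlighting in browser-side TypeScript; the element-id scheme ("tag-<id>", "item-<id>") and the CSS class name are illustrative assumptions, not part of the disclosure.

```typescript
// Sketch of the bidirectional highlighting: selecting a sensor list item
// highlights the matching sensor tag and vice versa. The element-id scheme
// and the CSS class are illustrative assumptions.
function wireHighlighting(sensorIds: string[]): void {
  for (const id of sensorIds) {
    const tag = document.getElementById(`tag-${id}`);
    const item = document.getElementById(`item-${id}`);
    if (!tag || !item) continue;
    const highlightBoth = () => {
      // Clear any previous highlight, then highlight both representations.
      document
        .querySelectorAll(".highlighted")
        .forEach((el) => el.classList.remove("highlighted"));
      tag.classList.add("highlighted");
      item.classList.add("highlighted");
    };
    tag.addEventListener("click", highlightBoth);
    item.addEventListener("click", highlightBoth);
  }
}
```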
  • the fourth entity is configured, in response to a user input selecting one of the interactive components associated with a sensor of the plurality of sensors, to display the acquired sensor value for the respective sensor on the interactive screen.
  • the fourth entity may display the corresponding sensor value, such as a sensor reading or sensor alarm state, on the interactive screen, e.g. next to the sensor tag and/or as part of a dialog component.
  • the fourth entity may also display meta data associated with the corresponding sensor next to the sensor tag and/or as part of the dialog component.
  • the instrumentation overview device facilitates easy access to sensor values, such as readings and alarm states, and to meta data, such as design information, through selection of corresponding sensor tags or sensor list items on an interactive screen with a graphical map of the machine.
  • the fourth entity is configured, in response to a user input selecting one of the interactive components associated with a sensor of the plurality of sensors, to display a dialog component configured to receive a user input indicative of updated 3D coordinates of the respective sensor on the interactive screen, and the second entity is configured to update the acquired 3D coordinates of the respective sensor in accordance with the updated 3D coordinates indicated by the user input received by the dialog component.
  • the fourth entity may be configured to display a dialog component, such as a dialog box, for receiving a user input.
  • the interactive screen may then reflect the updated coordinates, e.g. by moving the corresponding sensor tag to a new position corresponding to the updated 3D coordinates.
  • the instrumentation overview device may provide assistance during a prototyping stage, wherein an engineer may frequently decide to change the positions of certain sensors.
  • a necessity to acquire new 3D coordinates for each of the plurality of sensors upon every change is obviated, and prototyping speed may be improved.
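  • A minimal sketch of the coordinate-update path just described, assuming the dialog delivers the coordinates as a comma-separated string and the second entity exposes an update callback (both assumptions for illustration).

```typescript
// Sketch of the coordinate-update path: the dialog's text input is parsed,
// validated and handed to an update callback standing in for the second
// entity. The input format and callback signature are assumptions.
function onCoordinatesSubmitted(
  sensorId: string,
  input: string, // e.g. "1.2, 0.4, 3.9" from the coordinate input field
  update: (id: string, xyz: [number, number, number]) => void,
): void {
  const parts = input.split(",").map((p) => Number(p.trim()));
  if (parts.length !== 3 || parts.some((p) => Number.isNaN(p))) {
    throw new Error("expected three numeric coordinates");
  }
  update(sensorId, [parts[0], parts[1], parts[2]]);
  // The caller would then redraw the interactive screen so that the
  // corresponding sensor tag moves to the updated position.
}
```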
  • the plurality of interactive components comprises a filter selection component for receiving a user input indicative of a filter criterion
  • the fourth entity is configured to display an interactive component associated with a sensor of the plurality of sensors on the interactive screen only for a sensor of the plurality of sensors that matches the filter criterion.
  • the fourth entity may be configured not to display any interactive components that are associated with a sensor of the plurality of sensors that does not match the filter criterion.
  • the filter selection component may be configured to allow a user to indicate the filter criterion by selecting from one or more predefined filter criteria and/or by entering a user-defined filter criterion, such as a search request.
  • the instrumentation overview device may advantageously provide filtering and/or fast searching capabilities for restricting, limiting or filtering the amount of sensor tags and/or sensor list items (interactive components associated with a sensor) displayed on the interactive screen, and may therefore facilitate understanding and analysis of the instrumentation overview provided by the instrumentation overview device by filtering information not of interest to a user.
  • the filter criterion is a criterion based on at least one of a sensor value, a 3D coordinate and/or meta data.
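  • A minimal sketch of such a filter criterion, assuming illustrative field names for the sensor record and the criterion.

```typescript
// Sketch of a filter criterion over sensor value, coordinates and meta
// data. All field names are illustrative assumptions.
interface SensorRecord {
  name: string;
  type: string; // e.g. "pressure" or "temperature"
  reading: number;
  x: number;
  y: number;
  z: number;
}

interface FilterCriterion {
  nameContains?: string;
  types?: string[];
  minReading?: number;
  maxReading?: number;
}

function matches(s: SensorRecord, f: FilterCriterion): boolean {
  if (f.nameContains && !s.name.includes(f.nameContains)) return false;
  if (f.types && !f.types.includes(s.type)) return false;
  if (f.minReading !== undefined && s.reading < f.minReading) return false;
  if (f.maxReading !== undefined && s.reading > f.maxReading) return false;
  return true;
}

// Only sensors for which matches(...) is true would get their tags and
// list items rendered: sensors.filter((s) => matches(s, criterion)).
```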
  • the third entity is configured to repeatedly acquire the sensor value for each of the plurality of sensors
  • the fourth entity is configured to update the interactive screen when a change is detected in any of the 3D model of the machine, the 3D coordinates of the plurality of sensors or the acquired sensor values.
  • the instrumentation overview device may advantageously provide an online or live visualization of the state of the machine instrumented with the plurality of sensors, automatically reflecting changes detected in the sensor values, the 3D model or the 3D coordinates on the map displayed on the interactive screen.
  • a live visualization may be of particular benefit in situations with quick temporal changes, such as during ignition of a gas turbine. That is, a user operating the machine may quickly grasp any changes in sensor readings or sensor alarm states.
  • the instrumentation overview device further comprises a fifth entity configured to store, for at least some of the acquisitions performed by the first entity, the second entity and/or the third entity, a history of the acquired 3D models, 3D coordinates and/or sensor values in association with an acquisition time of the respective acquisition.
  • the plurality of interactive components further comprises a history selection component configured to receive a user input that selects an acquisition time from the acquisition times stored in the fifth entity.
  • the fourth entity is further configured to update the interactive screen based on the 3D model of the machine, the 3D coordinates of the plurality of sensors and the sensor values stored in the fifth entity in association with the acquisition time selected by the user input received by the history selection component.
  • the fifth entity may store an acquisition of at least the sensor values in predetermined intervals.
  • the fifth entity may store an acquisition of at least the sensor values in response to an external signal, or to a user input commanding that an acquisition of the sensor values be stored.
  • An acquisition, as used herein, may refer to a plurality of sensor values acquired at a given time or in a given time period.
  • the acquisition may further include the 3D model and/or the 3D coordinates acquired at the given time or in the given time period.
  • An acquisition may thus be representative of the state of the machine as reported by the plurality of sensors and as described by the 3D model and the 3D coordinates at the given time or in the given time period.
  • the instrumentation overview device advantageously provides automated support for recording and analyzing historical states of the machine, for time-series analysis and the like.
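  • A minimal sketch of a history store in the spirit of the fifth entity, keyed by acquisition time; the types are illustrative assumptions, not the disclosed implementation.

```typescript
// Sketch of a history store in the spirit of the fifth entity:
// acquisitions are stored with their acquisition time and can be looked
// up for a selected time. The types are illustrative assumptions.
interface Acquisition {
  time: Date;
  sensorValues: Map<string, number>; // sensor name -> reading
  coordinates?: Map<string, [number, number, number]>; // optional snapshot
  modelVersion?: string; // optional reference to the 3D model acquired
}

class HistoryStore {
  private readonly acquisitions: Acquisition[] = [];

  store(acquisition: Acquisition): void {
    this.acquisitions.push(acquisition);
  }

  // Acquisition times offered to the history selection component.
  times(): Date[] {
    return this.acquisitions.map((a) => a.time);
  }

  // State of the machine at the selected acquisition time, if stored.
  at(time: Date): Acquisition | undefined {
    return this.acquisitions.find(
      (a) => a.time.getTime() === time.getTime(),
    );
  }
}
```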
  • the instrumentation overview device further comprises: a sixth entity configured to extract a 2D model from the 3D model acquired by the first entity; a seventh entity configured to create a plurality of raster bitmaps from the 2D model at a plurality of levels of detail, and an eighth entity configured to store a library of the created raster bitmaps in association with the respective level of detail.
  • the plurality of interactive components further comprises a navigation component configured to receive a user input indicative of at least a desired level of detail from the plurality of levels of detail.
  • the fourth entity is configured to generate the map displayed on the interactive screen using portions of the raster bitmap stored in the eighth entity in association with the desired level of detail indicated by the user input received by the navigation component.
  • the instrumentation overview device may be configured to generate or calculate and store a plurality of raster bitmaps of one or more cross-sections through the machine in response to acquiring the 3D model.
  • the instrumentation overview device may comprise a pre-stored library of raster bitmaps corresponding to various parts of the machine at various levels of detail, and thereby may provide support for navigating the graphical map displayed on the interactive screen with high speed.
  • navigation may comprise zooming in and out, panning, rotating, selecting a level of a cross-section and similar operations performed in response to a user input.
  • the pre-stored library obviates a necessity for the fourth entity to perform complex three-dimensional calculations while the user navigates the map, providing smoother and more responsive navigation.
  • the fourth entity may thus be embodied, at least in part, by a unit with limited processing power such as a mobile device, smartphone, tablet, or a web browser executed on a remote PC or thin client.
  • the instrumentation overview device is configured as a distributed device distributed over at least a client-side unit and a server-side unit located remotely with respect to each other and communicatively coupled to each other via a communications network.
  • the client-side unit comprises at least a portion of the fourth entity
  • the server-side unit comprises at least one other of the entities of the instrumentation overview device.
  • entities of the instrumentation device configured to acquire sensor values may be located in proximity to the machine, e.g. at an installation site, entities of the instrumentation device configured to perform processing may be located at a central installation site, and the fourth entity configured to generate the interactive screen may be, at least in part, located at a site where a user operates the instrumentation overview device.
  • the instrumentation overview device of the present embodiment may advantageously enable remote monitoring of the machine.
  • the instrumentation overview device comprises a ninth entity having stored therein downloadable code that is configured, when downloaded and executed by a web browser executed on the client-side unit, to cause the web browser to embody the fourth entity.
  • the downloadable code may be Java, JavaScript or HTML 5 code.
  • the instrumentation overview device may advantageously provide simple web-based access to the instrumentation overview.
  • the interactive screen may be displayed on any device having installed thereon a standard web browser. Thereby, the need to install special application software on workstations of expert users is avoided, and administration of a network of workstations of expert users charged with prototyping and/or monitoring the machine may be simplified.
  • a respective entity e.g. any of the first to ninth entity, may be implemented in hardware and/or in software. If said entity is implemented in hardware, it may be embodied as a device, e.g. as a computer or as a processor or as a part of a system, e.g. a computer system. If said entity is implemented in software it may be embodied as a computer program product, as a function, as a routine, as a program code or as an executable object.
  • Any embodiment of the first aspect may be combined with any embodiment of the first aspect to obtain another embodiment of the first aspect.
  • an instrumentation overview method for a machine instrumented with a plurality of sensors comprises: acquiring a 3D model of the machine, acquiring 3D coordinates of each of the plurality of sensors, acquiring a sensor value for each of the plurality of sensors, and generating an interactive screen comprising a plurality of interactive components based on the 3D model of the machine, the 3D coordinates of the plurality of sensors and the acquired sensor values.
  • the invention relates to a computer program product comprising a program code for executing the above-described instrumentation overview method when run on at least one computer.
  • a computer program product such as a computer program entity, may be embodied as a memory card, USB stick, CDROM, DVD or as a file which may be downloaded from a server in a network.
  • a file may be provided by transferring the file comprising the computer program product from a wireless communication network.
  • Fig. 1 shows a block diagram of an instrumentation overview device according to a first exemplary embodiment.
  • Fig. 2 shows an instrumentation overview method according to the first exemplary embodiment.
  • Fig. 3 shows the instrumentation overview device according to the first exemplary embodiment, a gas turbine and further periphery.
  • Fig. 4 shows a block diagram of an instrumentation overview device according to the second exemplary embodiment.
  • Fig. 5 shows a schematic view of an interactive screen generated by the instrumentation overview device according to the second exemplary embodiment.
  • the instrumentation overview device 10 of the first exemplary embodiment comprises a first entity 1, a second entity 2, a third entity 3 and a fourth entity 4.
  • the first entity 1 acquires a 3D model of a machine 15 instrumented with a plurality of sensors.
  • the machine is a large gas turbine 15.
  • the 3D model of the machine 15 is stored as 3D CAD data in the form of a DWG file on a workstation 14, which is used by expert users to design and develop the gas turbine 15, and is acquired therefrom by the first entity 1, which is connected to the workstation 14 via a communications network such as the Internet or a Local Area Network.
  • the first entity 1 provides the fourth entity 4 access to the 3D model.
  • the first entity 1 may locally store the 3D model or may acquire the 3D model, or portions thereof, from the workstation 14 each time the fourth entity 4 requires access to a portion of the 3D model.
  • the second entity 2 acquires 3D coordinates of each of a plurality of sensors (not shown) that are arranged inside and alongside the gas turbine 15.
  • the 3D coordinates are stored, for example, as an XML table in the workstation 14 and are acquired therefrom by the second entity 2, which is connected to the workstation via the communications network.
  • the second entity 2 provides the fourth entity 4 with access to the 3D coordinates.
  • the second entity 2 may locally store the 3D coordinates or may acquire the 3D coordinates, or portions thereof, from the workstation 14 each time the fourth entity 4 requires access to a portion of the 3D coordinates .
  • the third entity 3 acquires a sensor value for each of the plurality of sensors (not shown) that are arranged inside and alongside the gas turbine 15.
  • the third entity 3 is connected to the gas turbine 15 via a bus connection and acquires the sensor values directly from the gas turbine 15 by reading each individual sensor.
  • the sensor values may also be offline sensor values and may be acquired from the workstation 14 or any other computing or storage device.
  • the third entity 3 provides the fourth entity 4 with access to the acquired sensor values.
  • the third entity 3 may locally store the acquired sensor values, or it may acquire the sensor values from the gas turbine 15 each time the fourth entity requires access to the sensor values.
  • the fourth entity 4 is connected to the first entity 1, the second entity 2, the third entity 3, a display 16, a keyboard 17, and a mouse 18.
  • In a fourth step S4, the fourth entity 4 generates an interactive screen 20 based on the 3D model acquired by the first entity 1, the 3D coordinates acquired by the second entity 2 and the sensor values acquired by the third entity 3.
  • the interactive screen 20 is transmitted, for example via a cable such as an HDMI, VGA or DVI cable, to the display 16, causing the display 16 to display the interactive screen 20 to a user (not shown).
  • the user may use the keyboard 17 and the mouse 18 to provide a user input to the fourth entity 4.
  • the fourth entity 4 changes the interactive screen 20 in response to the user input.
  • the fourth entity 4 provides functionality for a user to easily navigate through and access various aspects of the acquired 3D model of the gas turbine 15, the sensor positions defined by the acquired 3D coordinates, and the sensor values acquired from the gas turbine.
  • the instrumentation overview device 10 and method according to the first exemplary embodiment therefore advantageously achieve an interactive graphical instrumentation overview screen 20 visualizing the 3D model of the gas turbine 15 instrumented with the plurality of sensors and the sensor values.
  • the interactive screen 20 is generated in an automated way based on the 3D model, the 3D coordinates of the sensors and the sensor values acquired from the gas turbine 15 without manual user interaction. Changes made to the 3D model, the 3D coordinates as well as changing sensor values may thus be automatically reflected on the interactive screen 20, thereby enabling rapid prototyping, comprehensive monitoring and efficient decision-taking in case of any events.
  • Fig. 4 shows a block diagram of an instrumentation overview device 19 according to a second exemplary embodiment.
  • the instrumentation overview device 19 of the second exemplary embodiment is distributed over a server-side unit 11 and a client-side unit 12.
  • the server-side unit 11 comprises the first entity 1, the second entity 2, the third entity 3, a fifth entity 5, a sixth entity 6, a seventh entity 7, an eighth entity 8, a ninth entity 9 and a first portion 41 of the fourth entity 4.
  • the server-side unit 11 is connected to the client-side unit 12 via a communications network 13 such as the Internet.
  • the client-side unit 12 comprises a second portion 42 of the fourth entity and is configured to display the interactive screen 20 generated by the fourth entity 41, 42.
  • the functionality of the first entity 1, the second entity 2, and the third entity 3, and also the general operation principle of the fourth entity 41, 42 of the instrumentation overview device of the second exemplary embodiment correspond to the functionalities of the respective entities of the first exemplary embodiment, and the description thereof will not be repeated. Differences to the first exemplary embodiment are described below.
  • the first entity 1 provides the acquired 3D model of the gas turbine (15 in Fig. 3) to a sixth entity 6, which extracts a 2D model from the 3D model by performing 3D calculation so as to generate one or more cross-sections of the 3D model.
  • the 2D model is provided by the sixth entity 6 to a seventh entity 7, which creates a plurality of raster bitmaps from the 2D model at a plurality of resolutions.
  • the seventh entity 7 creates a raster bitmap that constitutes a graphical map of the entire machine based on the 2D model by rasterizing the one or more cross-sections at a predetermined display resolution so that a resulting raster bitmap, when displayed as part of the interactive screen 20, covers an entire area provided on the interactive screen 20 for the graphical map 21 (Fig. 5).
  • the resulting raster bitmap corresponds to the lowest level of detail.
  • the seventh entity 7 divides each cross-section into, for example, four tiles and rasterizes each tile at the same predetermined display resolution, so that each of the raster bitmaps created for each tile covers an entire area provided on the interactive screen 20 for the graphical map 21 (Fig. 5).
  • the resulting raster bitmaps correspond to a higher level of detail. This process may be repeated until a desired maximum level of detail is achieved. The achievable maximum level of detail may be limited by a maximum resolution of the 2D model. All resulting raster bitmaps are then stored in the eighth entity 8.
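  • A minimal sketch of the quadtree-style tile addressing implied by this repeated four-way division (level 0 being the single whole-map bitmap); the addressing scheme is an illustrative assumption.

```typescript
// Sketch of the quadtree-style tile addressing implied by the repeated
// four-way division: level 0 is one bitmap of the whole cross-section,
// and each further level splits every tile into four.
interface Tile {
  level: number;
  col: number;
  row: number;
}

function tilesAtLevel(level: number): Tile[] {
  const n = 2 ** level; // 1, 2, 4, ... tiles per axis
  const tiles: Tile[] = [];
  for (let row = 0; row < n; row++) {
    for (let col = 0; col < n; col++) {
      tiles.push({ level, col, row });
    }
  }
  return tiles;
}

// Every tile is rasterized at the same display resolution, so a deeper
// level shows a smaller region of the cross-section in more detail.
```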
  • the client-side unit 12 is a personal computer running a web browser.
  • the personal computer 12 is connected to the server-side unit 11 via a communication network 13 such as the Internet.
  • the web browser transmits an HTTP request to a web server embodied by the ninth entity 9 of the server-side unit 11.
  • the web server 9 responds by transmitting JavaScript code to the web browser running on the personal computer 12.
  • the web browser executes the transmitted JavaScript code, which causes the web browser to function as the second portion 42 of the fourth entity 41, 42.
  • the second portion 42 of the fourth entity 41, 42 transmits a request for a raster bitmap corresponding to the lowest level of detail to the first portion 41 of the fourth entity 41, 42 comprised by the server-side unit 11.
  • the first portion 41 of the fourth entity 41, 42 fetches the requested raster bitmap from the eighth entity 8 and transmits the requested raster bitmap back to the second portion 42 of the fourth entity 41, 42.
  • the second portion 42 of the fourth entity 41, 42 also communicates with the first portion 41 of the fourth entity 41, 42 to request 3D coordinates and sensor values from the first portion 41 of the fourth entity 41, 42.
  • the second portion 42 of the fourth entity 41, 42 then generates the interactive screen 20, and displays the interactive screen 20 in a web browser window of the web browser running on the personal computer 12.
  • the first and second portions 41, 42 of the fourth entity 41, 42, that communicate with each other via the communications network 13, will henceforth collectively be referred to as the fourth entity 41, 42, and further description of internal communications between the first and second portions 41, 42 of the fourth entity 41, 42 will be omitted.
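  • A minimal sketch of the client-side portion 42 requesting a tile, the 3D coordinates and the sensor values from the server-side portion 41; the endpoint paths are illustrative assumptions, not the actual protocol.

```typescript
// Sketch of the client-side portion 42 requesting a tile, the sensor
// coordinates and the sensor values from the server-side portion 41.
// The endpoint paths are assumptions, not the actual protocol.
async function loadOverview(
  base: string,
  level: number,
  col: number,
  row: number,
) {
  const [tile, coordinates, values] = await Promise.all([
    fetch(`${base}/tiles/${level}/${col}/${row}.png`).then((r) => r.blob()),
    fetch(`${base}/sensors/coordinates`).then((r) => r.json()),
    fetch(`${base}/sensors/values`).then((r) => r.json()),
  ]);
  // The caller draws the tile as the graphical map and positions the
  // sensor tags on it according to the coordinates and values.
  return { tile, coordinates, values };
}
```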
  • the interactive screen 20 generated by the fourth entity 41, 42 of the instrumentation overview device 19 according to the second exemplary embodiment is shown in more detail in Fig. 5.
  • the interactive screen 20 of Fig. 5 comprises a main area configured to display the graphical map 21.
  • the fourth entity 41, 42 displays the raster bitmap corresponding to the default level of detail, which has been received from the eighth entity 8, as the graphical map 21 on the interactive screen 20.
  • the graphical map 21 is a rasterized bitmap generated from a 2D section through the 3D model of the machine 15, and may be appreciated by a skilled user as a design drawing to scale of the machine.
  • the fourth entity 41, 42 receives the 3D coordinates of the plurality of sensors from the second entity 2 and displays, for each of the plurality of sensors, a sensor tag 22, 23 on the graphical map 21.
  • the sensor tags 22, 23 are displayed in accordance with the 3D coordinates of the respective sensors such that a skilled user may appreciate the true positions of the respective sensors of the machine by looking at the graphical map 21 and the sensor tags 22, 23.
  • the fourth entity 41, 42 also receives a sensor value for each of the plurality of sensors from the third entity 3.
  • Each sensor value comprises a sensor reading and a sensor alarm state.
  • the fourth entity 41, 42 changes an appearance of sensor tag 22, which has an active sensor alarm state, by coloring sensor tag 22 in red, and changes an appearance of sensor tag 23, which has no active sensor alarm state (is in a normal state), by coloring sensor tag 23 in white.
  • alternatively or additionally, the fourth entity 41, 42 changes an appearance of sensor tag 22, which has a low sensor reading, by coloring sensor tag 22 in a dark shade of grey, and changes an appearance of sensor tag 23, which has a high sensor reading, by coloring sensor tag 23 in a light shade of grey.
  • information to support a determination whether a respective sensor reading is "high" or "low", such as an acceptable sensor value range, may be received e.g. as part of sensor meta data from the second entity 2 together with the 3D coordinates for each sensor.
  • a "high" sensor reading may be a reading that is nearer to an acceptable maximum value than to an acceptable minimum value of the acceptable sensor value range.
  • a user may easily get an overview of the instrumentation of the machine 15, and may, in particular, easily appreciate individual sensors having an active alarm state and their individual positions on the graphical map 21, and/or appreciate regions of the machine with high and/or low sensor readings.
  • the fourth entity 41, 42 generates a sensor list 24 and displays the sensor list 24 on the interactive screen 20.
  • the sensor list is a scrollable list and comprises one sensor list item 25, 26 for each of the plurality of sensors.
  • Each sensor list item 25, 26 displays (not shown) information about the sensor value (such as the sensor reading and/or the sensor alarm state) for the respective sensor.
  • Each sensor list item 25, 26 further displays (not shown) meta data such as a sensor name and a sensor type for the respective sensor.
  • the fourth entity 41, 42 updates the interactive screen 20 so as to highlight the selected sensor list item 25 and to highlight the sensor tag 22 that is associated with the same sensor as the selected sensor list item 25.
  • the fourth entity 41, 42 updates the interactive screen 20 so as to highlight the selected sensor tag 22 and to highlight the sensor list item 25 that is associated with the same sensor as the selected sensor tag 22.
  • highlighted components are shown in grey. Generally, highlighting is achieved by displaying the highlighted item (sensor list item 25 or sensor tag 22) in a different color.
  • a user may scrutinize the sensor list 24 and, upon identifying a sensor of interest, e.g. a sensor with an unusual reading, which is represented by (associated with) a sensor list item 25, the user may click on the sensor list item 25 and, then, localize the highlighted sensor tag 22 on the graphical map 21. Since the highlighted sensor tag 22 is associated with the sensor of interest and is positioned on the graphical map 21 in accordance with the 3D coordinates of the sensor of interest, it reflects the position of the sensor of interest in the gas turbine (15 in Fig. 3). A user may thus easily identify a position of a sensor of interest in the gas turbine (15 in Fig. 3).
  • a user may visually identify a sensor of interest, such as a sensor with an active alarm state, or a group of sensors that show similar behavior or similar values, by looking at the graphical map 21, and may easily identify additional information about the respective sensor, such as its name and type, by clicking on the sensor tag 22 and, then, looking at the sensor list 24 and the highlighted sensor list item 25, which may contain additional meta data about the respective sensor.
  • the interactive screen 20 generated by the fourth entity 41, 42 may further be configured to accept a user input selecting a sensor tag 22 and/or a sensor list item 25 in a different fashion, e.g. with a right mouse button.
  • a dialog component 27 is displayed on the interactive screen 20.
  • the dialog component 27 comprises a sensor value display field 28 showing the sensor value received by the fourth entity 41, 42 from the third entity 3 for the sensor associated with the selected sensor tag 22 and/or the selected sensor list item 25.
  • the dialog component 27 further comprises a coordinate input field 29 showing the 3D coordinates received by the fourth entity 41, 42 from the second entity 2 for the sensor associated with the selected sensor tag 22 and/or the selected sensor list item 25.
  • a user may enter updated 3D coordinates into the coordinate input field 29.
  • the fourth entity 41, 42 transmits the updated 3D coordinates to the second entity 2, and the second entity updates the acquired 3D coordinates for the sensor associated with the sensor tag 22/sensor list item 25.
  • the second entity 2 may update 3D coordinates stored internally in the second entity 2, and/or may transmit the updated 3D coordinates to the workstation 14 (Fig. 3).
  • the fourth entity 41, 42 updates (re-generates, re-draws) the interactive screen 20 so as to reflect the updated 3D coordinates.
  • the sensor tag 22 is repositioned (moved) on the map 21 in accordance with the updated 3D coordinates.
  • the fourth entity 41, 42 may further generate a filter selection component 30 and display the filter selection component 30 as a further interactive component on the interactive screen 20.
  • the filter selection component 30 displays a plurality of filter criteria (schematically shown as horizontal lines inside filter selection component 30).
  • a user may select one of the filter criteria by clicking on it e.g. with the left mouse button.
  • a user may then be offered the option to further configure the filter criterion.
  • one of the filter criteria may be a search field configured to receive a user input indicative of a sensor name or a part of a sensor name or a sensor location.
  • one of the filter criteria may be a value range field configured to receive a user input indicative of a minimum sensor value and a maximum sensor value.
  • one of the filter criteria may be configured to receive a user selection indicative of one or more sensor types, such as pressure sensor or temperature sensor.
  • the fourth entity 41, 42 is configured to update the interactive screen 20 so as to display only sensor tags 22, 23 and sensor list items 25, 26 associated with sensors that match the selected filter criteria, and not to display (to hide, to blank out) sensor tags 22, 23 that are associated with sensors that do not match the selected filter criteria.
  • a user may quickly search for a specific sensor, may easily view only sensors of a certain type, or may easily narrow down a specific issue by filtering for sensors with specific readings.
  • prototyping and monitoring of the gas turbine 15 is improved.
  • the third entity 3 acquires sensor values for each of the plurality of sensors repeatedly, e.g. every second.
  • the second entity 2 acquires the 3D coordinates repeatedly, e.g. every minute, or alternatively, every time a prototyping engineer updates the 3D coordinates stored on the workstation (14 in Fig. 3).
  • When the fourth entity 41, 42 receives updated sensor values from the third entity 3 and/or detects a change in the 3D coordinates acquired by the second entity 2, it updates the interactive screen 20 and the interactive components 21-32 thereon, in particular the sensor tags 22, 23 and/or the sensor list elements 25, 26.
  • a user may therefore advantageously get a live or online overview of the gas turbine 15 instrumented with the plurality of sensors and their current sensor readings, as represented by the updated sensor values, and current sensor positions as represented by the updated 3D coordinates.
  • the first entity 1 acquires the 3D model repeatedly, e.g. every time a prototyping engineer updates the 3D model stored on the workstation (14 in Fig. 3).
  • When the sixth entity 6 detects a change in the acquired 3D model, it extracts an updated 2D model from the changed 3D model and provides the updated 2D model to the seventh entity 7.
  • the seventh entity 7 then creates an updated plurality of raster bitmaps from the updated 2D model at a plurality of resolutions and stores them in the eighth entity 8.
  • the fourth entity 41, 42 updates the interactive screen 20 based on the updated raster bitmaps .
  • the instrumentation overview device 19 of the second exemplary embodiment further comprises a fifth entity 5 configured to store a history of the sensor values, the 3D coordinates and the 3D model acquired by the first entity 1, the second entity 2 and the third entity 3 in association with time.
  • the history may be described as an archive of previous sensor values, i.e., sensor readings, sensor alarm states, and sensor meta data, previous 3D coordinates, that is, previous sensor positions, and previous 3D model acquisitions that may be accessed at a later time for offline analysis, e.g. so as to review changes in sensor values, sensor positions and the 3D model itself over time.
  • the fourth entity 41, 42 further generates a history selection component 31 and displays the history selection component 31 as a further interactive component on the interactive screen 20.
  • the history selection component 31 is configured, for example, as a dropdown-menu 31.
  • the dropdown-menu 31 comprises one menu item representing a current acquisition, and a plurality of menu items representing previous acquisitions.
  • When the history selection component 31 receives a user input selecting the current acquisition, the fourth entity 41, 42 generates the interactive screen 20 based on the sensor values, 3D coordinates and 3D model currently acquired by the first entity 1, the second entity 2 and the third entity 3.
  • When the history selection component 31 receives a user input selecting one of the past acquisitions (past acquisition times), the fourth entity 41, 42 generates the interactive screen 20 based on the sensor values, the 3D model and the 3D coordinates stored in the fifth entity 5 in association with the selected past acquisition time. I.e., the graphical map 21, the positions and the graphical appearances of the sensor tags 22, 23, the contents of the sensor value display field 28, and the sensor value information displayed by the sensor list elements 25, 26 are each updated in accordance with the sensor values, the 3D coordinates and the 3D model acquired by the first entity 1, the second entity 2 and the third entity 3 at the selected previous acquisition time.
  • the instrumentation overview device 19 may therefore beneficially be used for historical analysis of operation states of the gas turbine 15 and their change over time.
  • a plurality of raster bitmaps corresponding to a plurality of levels of detail is stored for each of one or more cross-sections of the 3D model of the gas turbine (15 in Fig. 3) .
  • the fourth entity 41, 42 further generates a navigation component 32 and displays the navigation component 32 as one of the interactive components on the interactive screen 20.
  • the navigation component 32 is configured to receive a user input indicative of a desired level of detail and/or a desired tile of a cross-section (a desired cross-section) of the acquired 3D model. Specifically, by clicking on "+" and "-" in the navigation component 32, a user may "zoom", i.e. select a desired level of detail, and by clicking on the arrows in the navigation component 32, a user may "pan", i.e. move left, right, forward, and backward, thereby selecting a desired tile from multiple tiles forming the cross-section of the 3D model.
  • the navigation component 32 may comprise further icons allowing a user to "pan vertically", i.e. to move up and down, thereby selecting one of multiple cross-sections through the 3D model.
  • the fourth entity 41, 42 acquires a raster bitmap corresponding to the selected level of detail, the selected tile and/or the selected cross-section from the seventh entity 7, displays the acquired raster bitmap as the graphical map 21 on the interactive screen 20, and displays, on the graphical map 21, sensor tags 22, 23 only for those of the plurality of sensors that are arranged in an area of the gas turbine 15 that is covered by the graphical map 21.
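  • A minimal sketch of this culling step, which draws tags only for sensors inside the area covered by the graphical map; mapping the 3D coordinates onto the two axes of the cross-section plane is an illustrative assumption.

```typescript
// Sketch of the culling step: tags are drawn only for sensors whose
// coordinates fall inside the region of the cross-section covered by the
// currently displayed tile. Mapping the 3D coordinates onto the two axes
// of the cross-section plane is an illustrative assumption.
interface MapArea {
  xMin: number;
  xMax: number;
  yMin: number;
  yMax: number;
}

function visibleSensors<T extends { x: number; y: number }>(
  sensors: T[],
  area: MapArea,
): T[] {
  return sensors.filter(
    (s) =>
      s.x >= area.xMin &&
      s.x <= area.xMax &&
      s.y >= area.yMin &&
      s.y <= area.yMax,
  );
}
```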
  • the instrumentation overview device 19 may advantageously achieve high responsivity when zooming and panning, may reduce a computational load on the client-side unit 12 by obviating a necessity for complex 3D computations on the client-side unit 12, and may therefore enable a user to use a standard web browser to get an overview of a large industrial machine such as the gas turbine (15 in Fig. 3) instrumented with a large quantity of sensors, as well as to zoom down and navigate through particular areas of the large industrial machine (15 in Fig. 3) so as to gain more detailed insights.
  • the interactive screen 20 may or may not comprise any of the history selection component 31, the filter selection component 30, the dialog component 27, the sensor list component 24, the navigation component 32, and/or may comprise any combination thereof, and the instrumentation overview device 10, 19 may or may not comprise the associated features, entities, and functionality for each of these components.
  • the navigation functionality described for the second exemplary embodiment in connection with the navigation component 32 may alternatively be implemented without any navigation component being displayed on the interactive screen 20.
  • the fourth entity 4, 41, 42 may also enable a user to use a mouse wheel to navigate the interactive screen 20.
  • the keyboard 17 and mouse 18 may not be provided, and touch gestures, voice input, interactive 3D glasses or the like may be used to navigate the interactive screen 20.
  • File formats such as XML, DWG, .prt, .catpart, and/or technical standards such as Ethernet, HTTP, JavaScript, HDMI, and the like, that are mentioned in the description and the various exemplary embodiments are merely examples, and any file format or technical standard may be used instead.
  • the present invention has been described in connection with an instrumentation overview device and method for a gas turbine.
  • the instrumentation overview device and method of the present invention are also applicable to other pieces of industrial machinery.
  • the teachings disclosed herein may further be used to provide an instrumentation overview of various other 3D structures instrumented with a large number of sensors, such as a vehicle, a vessel, an airplane, a building or an entire city.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

The invention relates to an instrumentation overview device (10) for a machine instrumented with a plurality of sensors. The instrumentation overview device (10) is configured to acquire a 3D model of the machine (15), to acquire 3D coordinates of each of the plurality of sensors, to acquire a sensor value for each of the plurality of sensors, and to generate an interactive screen (20) comprising a plurality of interactive components (21-32) based on the 3D model of the machine (15), the 3D coordinates of the plurality of sensors and the acquired sensor values. The instrumentation overview device advantageously visualizes the acquired 3D model, the sensor values and the sensor coordinates on a single interactive screen, enabling navigation, fast searching, and historical or online analysis of time-dependent sensor and model data when monitoring and/or prototyping a machine such as a large gas turbine.
PCT/RU2018/000215 2018-04-04 2018-04-04 Instrumentation overview device and method for a machine instrumented with a plurality of sensors WO2019194697A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/RU2018/000215 WO2019194697A1 (fr) 2018-04-04 2018-04-04 Instrumentation overview device and method for a machine instrumented with a plurality of sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2018/000215 WO2019194697A1 (fr) 2018-04-04 2018-04-04 Instrumentation overview device and method for a machine instrumented with a plurality of sensors

Publications (1)

Publication Number Publication Date
WO2019194697A1 (fr) 2019-10-10

Family

ID=62186511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2018/000215 WO2019194697A1 (fr) 2018-04-04 2018-04-04 Dispositif et procédé de vue d'ensemble d'instrumentation pour machine instrumentée avec une pluralité de capteurs

Country Status (1)

Country Link
WO (1) WO2019194697A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113987102A (zh) * 2021-12-23 2022-01-28 睿明(武汉)科技有限公司 一种交互式电力数据可视化方法和系统
CN116451447A (zh) * 2023-03-31 2023-07-18 北京瑞风协同科技股份有限公司 一种矢量模型的传感器布局设计方法和系统

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120251996A1 (en) * 2011-03-31 2012-10-04 Korea Electronics Technology Institute Method and system for plant management by augmentation reality
EP3076253A1 (fr) * 2015-03-27 2016-10-05 Rockwell Automation Technologies, Inc. Systèmes et procédés de présentation d'une réalité augmentée

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18725328

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18725328

Country of ref document: EP

Kind code of ref document: A1