CN114332340A - Workshop visualization monitoring method and computer device - Google Patents


Info

Publication number
CN114332340A
Authority
CN
China
Prior art keywords
object model
module
scene
virtual scene
monitoring method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011019926.7A
Other languages
Chinese (zh)
Inventor
郑思浩
何光华
王敏
林家军
高子和
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fulian Fugui Precision Industry Co Ltd
Original Assignee
Shenzhen Fugui Precision Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fugui Precision Industrial Co Ltd filed Critical Shenzhen Fugui Precision Industrial Co Ltd
Priority to CN202011019926.7A
Publication of CN114332340A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a workshop visualization monitoring method, which comprises the following steps: establishing a virtual scene for the workshop, wherein the virtual scene comprises an object model; associating the object model with a physical device of the workshop; acquiring real-time data of the physical device; presenting an operating state of the physical device with the object model based on the real-time data of the physical device; and, in response to an input signal received from the object model, correspondingly controlling the physical device. The invention also provides a computer device implementing the workshop visualization monitoring method. The invention can visually monitor the dynamic conditions of a production workshop.

Description

Workshop visualization monitoring method and computer device
Technical Field
The invention relates to the technical field of monitoring, in particular to a workshop visualization monitoring method and a computer device.
Background
With the emergence and development of Internet of Things (IoT) technology, industrial production has developed rapidly. How to effectively monitor the production process of equipment is a technical problem to be solved.
Disclosure of Invention
In view of the above, it is necessary to provide a workshop visualization monitoring method and a computer device that can visually monitor the dynamic conditions of a production workshop.
The workshop visualization monitoring method comprises the following steps: establishing a virtual scene for the workshop, wherein the virtual scene comprises an object model; associating the object model with a physical device of the workshop; acquiring real-time data of the physical device; presenting an operating state of the physical device with the object model based on the real-time data of the physical device; and, in response to an input signal received from the object model, correspondingly controlling the physical device.
Preferably, the controlling of the physical device in response to the input signal received from the object model comprises: when the input signal is a first signal and the physical device associated with the object model is in a powered-off state, controlling the physical device associated with the object model to power on; and when the input signal is a second signal and the physical device associated with the object model is in a powered-on state, controlling the physical device associated with the object model to power off, wherein the first signal is the same as or different from the second signal.
Preferably, the controlling of the physical device in response to the input signal received from the object model comprises: when a third signal is received from the object model, displaying a setting interface for a user to set operating parameters of the physical device associated with the object model; receiving, from the setting interface, the operating parameters set by the user for the physical device associated with the object model; and controlling the physical device associated with the object model to operate according to the set operating parameters.
Preferably, a display area is divided into a first display area and a second display area in response to a user input, wherein the first display area is used to display the virtual scene, the second display area is used to display a scene editing module, and the scene editing module is used to modify the virtual scene according to user input.
Preferably, the scene editing module comprises a scene sub-module, and the method further comprises: displaying, with the scene sub-module, index information of the virtual scene in the first display area.
Preferably, the scene editing module further comprises an object sub-module, and the method further comprises: displaying, with the object sub-module, the object models included in the virtual scene and their attributes; and deleting object models from, and/or adding object models to, the virtual scene with the object sub-module.
Preferably, the scene editing module further comprises a resource sub-module, and the method further comprises: displaying various pre-stored object models with the resource sub-module; and adding new object models with the resource sub-module.
Preferably, the object sub-module is used to edit the attributes of the newly added object models.
Preferably, the scene editing module further comprises a setting sub-module, and the method further comprises performing related settings on the virtual scene with the setting sub-module, wherein the related settings comprise setting a display mode of the virtual scene and setting a style of a panel of an object model in the virtual scene.
The computer device comprises a memory and a processor, wherein the memory is used for storing at least one instruction, and the processor is used for realizing the workshop visualization monitoring method when executing the at least one instruction.
Compared with the prior art, the workshop visualization monitoring method and the computer device provided by the invention can visually monitor the dynamic conditions of a production workshop.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
FIG. 1 is a flow chart of a workshop visualization monitoring method according to a preferred embodiment of the present invention.
Fig. 2A-2E illustrate editing of a virtual scene, respectively.
FIG. 3 is a functional block diagram of a visual monitoring system for a workshop according to a preferred embodiment of the present invention.
FIG. 4 is a block diagram of a computer device according to a preferred embodiment of the present invention.
Description of the main elements
Computer device 3
Workshop visualization monitoring system 300
Execution module 301
Acquisition module 302
Memory 30
Processor 32
Display screen 33
Virtual scene 100
Object model 101
Display area 31
A first display region 311
A second display region 312
Scene editing module 5
Scene submodule 51
Object submodule 52
Resource sub-module 53
Setting sub-module 54
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments of the present invention and features of the embodiments may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention, and the described embodiments are merely a subset of the embodiments of the present invention, rather than a complete embodiment. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
FIG. 1 is a flow chart of a visual monitoring method for a workshop according to a preferred embodiment of the present invention.
In this embodiment, the workshop visualization monitoring method may be applied to a computer device. For a computer device that needs to perform workshop visualization monitoring, the functions provided by the method of the present invention may be directly integrated on the computer device, or may run on the computer device in the form of a Software Development Kit (SDK).
As shown in FIG. 1, the workshop visualization monitoring method specifically comprises the following steps. The order of the steps in the flow chart may be changed according to different requirements, and certain steps may be omitted.
Step S1: establishing a virtual scene for the workshop, wherein the virtual scene comprises an object model.
In one embodiment, the object model may be a three-dimensional model or a two-dimensional model. The object model can be an equipment model, a personnel model, a workpiece model, a storage model, a workshop environment model and the like.
For example, referring to fig. 2A, a virtual scene 100 of a plant is shown, the virtual scene 100 including an object model 101.
Step S2: establishing an association between the object model and a physical device of the workshop.
For example, the object model 101 is associated with a certain production machine of the plant.
It should be noted that "associating" in this embodiment means establishing a corresponding relationship between the object model and the physical device of the plant. For example, the name of the object model may be associated with the name of the physical device.
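The name-based correspondence described above can be illustrated with a minimal sketch. The patent does not prescribe an implementation; the class and function names below (`ObjectModel`, `PhysicalDevice`, `associate`) are illustrative assumptions.

```python
# Minimal sketch of step S2: associating object models with physical
# devices by matching names. All names here are illustrative assumptions;
# the patent does not prescribe an implementation.

class ObjectModel:
    def __init__(self, name):
        self.name = name
        self.device = None  # the associated physical device, if any

class PhysicalDevice:
    def __init__(self, name):
        self.name = name

def associate(models, devices):
    """Establish the model-to-device correspondence by matching names."""
    by_name = {d.name: d for d in devices}
    for m in models:
        m.device = by_name.get(m.name)  # None if no matching device exists
    return models

models = associate([ObjectModel("gl")], [PhysicalDevice("gl")])
print(models[0].device.name)  # -> gl
```

A model with no matching device simply keeps `device = None`, which mirrors the fact that an object model only becomes a live monitoring proxy once the association step succeeds.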
In one embodiment, a display area of a display screen of a computer device may be divided into a first display area and a second display area in response to a user input, where the first display area is used to display the virtual scene, the second display area is used to display a scene editing module, and the scene editing module is used to edit/modify the virtual scene according to the user input.
For example, referring to FIG. 2B, after receiving login information of a user, such as a login account, the display area 31 may be divided into a first display area 311 and a second display area 312 in response to a selection signal of the user for the virtual scene 100. The first display area 311 is configured to display the virtual scene 100, and the second display area 312 is configured to display a scene editing module 5, which edits/modifies the virtual scene 100 according to user input.
In one embodiment, the scene editing module includes a scene sub-module. The index information of the virtual scene located in the first display region may be displayed using the scene sub-module in response to a user input. The index information includes, but is not limited to, a floor map used in the virtual scene, and all object models included in the virtual scene.
In one embodiment, basic information of each object model included in the virtual scene, such as a name of the model, a creation time, a modification time, a size of the model, and a name of a physical device corresponding to the object model, may also be displayed in response to an input signal, such as a click signal, from a user in the scene submodule.
For example, still referring to fig. 2B, the scene editing module 5 includes a scene sub-module 51. The scene sub-module 51 may be used to display index information of the virtual scene 100 located in the first display region 311. The index information includes a floor map "floor 1" used in the virtual scene 100, and the object model "gl" included in the virtual scene 100.
In addition, basic information of the object model "gl" included in the virtual scene 100, such as creation time, modification time, model size, and a name of a physical device corresponding to the object model, may be displayed in response to an input signal of the user in the scene sub-module 51, for example, a click signal of the name "gl" of the object model.
In one embodiment, the scene editing module further comprises an object sub-module.
In one embodiment, the object submodule may be used to display an object model included in the virtual scene and attributes of the object model in response to a user input; and responding to the input of a user, deleting the object models and/or adding the object models included in the virtual scene to the virtual scene by using the object submodule.
For example, referring to FIG. 2C, the scene editing module 5 further includes an object sub-module 52. The object sub-module 52 may be utilized to display the object models included in the virtual scene currently displayed in the first display area, such as "AGV1" and "CART_DEMO", and the attributes of the object models, such as name, size, the name of the virtual scene where the object model is located, and the position coordinates of the object model in the virtual scene. In addition, still referring to FIG. 2C, one or more object models included in the virtual scene may be deleted by using the "delete" button included in the object sub-module 52, and one or more object models may be added to the virtual scene currently displayed in the first display area by using the "new" button.
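The "new" and "delete" operations of the object sub-module amount to inserting and removing entries in the scene's model list. A minimal sketch follows; the class and method names are hypothetical, not terms from the patent.

```python
# Sketch of the object sub-module's "new" and "delete" operations on the
# virtual scene's model list. Class and method names are hypothetical.

class VirtualScene:
    def __init__(self):
        self.models = {}  # model name -> attribute dict

    def add_model(self, name, attrs):
        """'new' button: add an object model to the scene."""
        self.models[name] = attrs

    def delete_model(self, name):
        """'delete' button: remove an object model, ignoring unknown names."""
        self.models.pop(name, None)

scene = VirtualScene()
scene.add_model("AGV1", {"position": (0, 0, 0)})
scene.add_model("CART_DEMO", {"position": (1, 2, 0)})
scene.delete_model("AGV1")
print(sorted(scene.models))  # ['CART_DEMO']
```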
In one embodiment, the scene editing module further comprises a resource sub-module.
In one embodiment, the resource sub-module may be responsive to user input to display various pre-stored object models; and responding to the input of the user and newly adding other object models by using the resource submodule.
For example, referring to FIG. 2D, various pre-stored object models may be displayed in response to user input using the resource sub-module 53; and adding other object models by using the resource submodule 53. The additional object model may be a three-dimensional model (3D model) or a two-dimensional model such as a planar map.
In one embodiment, the object sub-module may be further configured to edit the attributes of the additional object model in response to a user input.
In one embodiment, the scene editing module further comprises a setting sub-module.
In one embodiment, the setting submodule may be used to perform relevant settings on the virtual scene in response to user input, and the relevant settings include, but are not limited to, setting a display mode of the virtual scene, and setting a style of a panel of an object model in the virtual scene.
For example, referring to FIGS. 2D and 2E, the setting sub-module 54 may be used to perform related settings on the virtual scene in response to user input. The related settings include setting the display mode of the virtual scene (e.g., "grid display"); setting the style of the panel of an object model in the virtual scene, such as the background color of the panel and the size of the displayed font; and configuring a customized panel, such as a panel for the state of the entire virtual scene, including humidity, temperature, illumination intensity, and the like.
Step S3: acquiring real-time data of the physical device.
In one embodiment, real-time data of the physical device may be received from another computer apparatus. The real-time data may include, for example, the temperature of the physical device, whether it is operating in a normal state, and the like.
Step S4: presenting the operating state of the physical device with the object model based on the real-time data of the physical device.
It should be noted that, since the object model is associated with the physical device of the workshop, changes in the operating state of the object model can be driven by the acquired real-time data of the physical device, thereby implementing a mapping between the virtual object model and the physical device of the workshop.
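Steps S3-S4, driving the model's displayed state from acquired device data, can be sketched as a simple mapping function. The field names ("temperature", "running") and the alarm threshold below are illustrative assumptions, not values from the patent.

```python
# Sketch of steps S3-S4: real-time device data drives the displayed state
# of the associated object model. The field names ("temperature",
# "running") and the alarm threshold are illustrative assumptions.

def update_model_state(model_state, real_time_data):
    """Map acquired real-time data onto the virtual model's display state."""
    model_state["temperature"] = real_time_data["temperature"]
    model_state["status"] = "running" if real_time_data["running"] else "stopped"
    model_state["alarm"] = real_time_data["temperature"] > 80  # assumed limit
    return model_state

state = update_model_state({}, {"temperature": 45, "running": True})
print(state)  # {'temperature': 45, 'status': 'running', 'alarm': False}
```

Calling this on every data update keeps the virtual model in step with the physical device, which is the mapping the paragraph above describes.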
Step S5: in response to an input signal received from the object model, correspondingly controlling the physical device.
In one embodiment, the corresponding control of the physical device in response to the input signal received from the object model comprises (a1)-(a2):
(a1) when the input signal is a first signal and the physical device associated with the object model is in a powered-off state, controlling the physical device associated with the object model to power on.
(a2) when the input signal is a second signal and the physical device associated with the object model is in a powered-on state, controlling the physical device associated with the object model to power off.
In one embodiment, the first signal is the same as or different from the second signal.
In one embodiment, the first signal and the second signal may be a single click signal or a double click signal generated by clicking any position of the object model.
In other embodiments, the first signal is a signal generated by clicking a first designated position of the object model, and the second signal is a signal generated by clicking a second designated position of the object model. The first designated location is different from the second designated location.
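The (a1)/(a2) logic above reduces to a small state-dependent dispatch: the same click surface yields power-on or power-off depending on the device's current state. In the sketch below, the signal names and the returned action strings are illustrative assumptions.

```python
# Sketch of (a1)/(a2): interpret an input signal on the object model as a
# power command depending on the device's current state. Signal and action
# names are illustrative assumptions, not terms from the patent.

def handle_signal(signal, device_powered_on):
    """Return the control action for an input signal received on the model."""
    if signal == "first" and not device_powered_on:
        return "power_on"   # (a1): device is off, first signal powers it on
    if signal == "second" and device_powered_on:
        return "power_off"  # (a2): device is on, second signal powers it off
    return "ignore"         # signal does not match the device state

print(handle_signal("first", False))  # power_on
print(handle_signal("second", True))  # power_off
```

When the first and second signals are the same (e.g., a single click anywhere on the model), this collapses into a toggle keyed only on `device_powered_on`.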
In one embodiment, the corresponding control of the physical device in response to the input signal received from the object model further comprises (b1)-(b3):
(b1) when a third signal is received from the object model, displaying a setting interface, wherein the setting interface is used for the user to set operating parameters (such as operating power) of the physical device associated with the object model.
In one embodiment, the third signal is different from the first signal and different from the second signal.
(b2) receiving, from the setting interface, the operating parameters set by the user for the physical device associated with the object model.
(b3) controlling the physical device associated with the object model to operate according to the set operating parameters.
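Steps (b1)-(b3) form a collect-validate-apply flow: the setting interface collects parameters, which are checked before being sent to the device. The parameter names below are assumptions; the patent only gives operating power as an example.

```python
# Sketch of (b1)-(b3): operating parameters submitted through the setting
# interface are validated before being applied to the associated device.
# The parameter names ("power", "speed") are illustrative assumptions.

def apply_settings(submitted, allowed_keys=("power", "speed")):
    """Filter user-submitted operating parameters to the allowed set."""
    params = {k: v for k, v in submitted.items() if k in allowed_keys}
    if not params:
        raise ValueError("no valid operating parameters submitted")
    return params  # a real system would now drive the device with these

print(apply_settings({"power": 75, "speed": 2}))  # {'power': 75, 'speed': 2}
```

Rejecting an empty or all-invalid submission before touching the device keeps a malformed form input from becoming a control command.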
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
Fig. 3 is a block diagram of a visual monitoring system for a workshop according to a preferred embodiment of the present invention.
In some embodiments, the plant visualization monitoring system 300 is run in the computer device 3. The plant visualization monitoring system 300 may include a plurality of functional modules comprised of program code segments. Program code for various program segments of the plant visualization monitoring system 300 may be stored in the memory 30 of the computer device 3 and executed by the at least one processor 32 to perform the plant visualization monitoring function (described in detail with reference to fig. 1).
In this embodiment, the plant visualization monitoring system 300 may be divided into a plurality of functional modules according to the functions performed by the system. The functional module may include: an execution module 301 and an acquisition module 302. The module referred to herein is a series of computer program segments capable of being executed by at least one processor and capable of performing a fixed function and is stored in memory. In the present embodiment, the functions of the modules will be described in detail in the following embodiments.
Specifically, the execution module 301 may establish a virtual scene for the workshop, where the virtual scene includes an object model, and establish an association between the object model and a physical device of the workshop. The obtaining module 302 obtains real-time data of the physical device. The execution module 301 may present the operating state of the physical device with the object model based on the real-time data of the physical device, and, in response to an input signal received from the object model, correspondingly control the physical device.
Fig. 4 is a schematic structural diagram of a computer device according to a preferred embodiment of the invention. In the preferred embodiment of the present invention, the computer device 3 comprises a memory 30, at least one processor 32, and a display screen 33. It will be appreciated by those skilled in the art that the configuration shown in FIG. 4 does not constitute a limitation of the embodiments of the present invention; the configuration may be bus-type or star-type, and the computer device 3 may include more or fewer hardware or software components than shown, or a different arrangement of components. The computer device 3 may be a computer, a mobile phone, a tablet computer, a server, or the like.
In some embodiments, the computer device 3 includes a terminal capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and the hardware includes but is not limited to a microprocessor, an application specific integrated circuit, a programmable gate array, a digital processor, an embedded device, and the like.
It should be noted that the computer device 3 is only an example; other electronic products that currently exist or may come into existence in the future and that can be adapted to the present invention should also be included in the protection scope of the present invention and are incorporated herein by reference.
In some embodiments, the memory 30 is used for storing program code and various data, such as the plant visualization monitoring system 300 installed in the computer device 3, and realizes high-speed, automatic access to programs or data during the operation of the computer device 3. The memory 30 includes a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, magnetic disk memory, tape memory, or any other computer-readable storage medium capable of carrying or storing data.
In some embodiments, the at least one processor 32 may be composed of a single packaged integrated circuit, or of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital processing chips, graphics processors, and combinations of various control chips. The at least one processor 32 is the control unit of the computer device 3: it connects the various components of the entire computer device 3 using various interfaces and lines, and executes the various functions of the computer device 3 and processes its data, such as performing the workshop visualization monitoring function, by running or executing the programs or modules stored in the memory 30 and calling the data stored in the memory 30.
Although not shown, the computer device 3 may further include a power supply (such as a battery) for supplying power to each component. Preferably, the power supply may be logically connected to the at least one processor 32 through a power management device, so as to implement charging, discharging, and power-consumption management through the power management device. The power supply may also include one or more DC or AC power sources, a recharging device, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components. The computer device 3 may further include various sensors, a Bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The integrated unit implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes instructions for causing a computer device (which may be a server, a personal computer, etc.) or a processor (processor) to perform parts of the methods according to the embodiments of the present invention.
In a further embodiment, in conjunction with FIG. 3, the at least one processor 32 may execute the operating system of the computer device 3 as well as various installed applications (e.g., the plant visualization monitoring system 300), program code, and the like, such as the modules described above.
The memory 30 has program code stored therein, and the at least one processor 32 can call the program code stored in the memory 30 to perform related functions. For example, the various modules illustrated in FIG. 3 are program code stored in the memory 30 and executed by the at least one processor 32 to perform the functions of the various modules for plant visualization monitoring purposes.
In one embodiment of the present invention, the memory 30 stores one or more instructions (i.e., at least one instruction) that are executed by the at least one processor 32 for purposes of plant visualization monitoring as shown in FIGS. 1 and 2.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from its spirit or essential attributes. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, the word "comprising" does not exclude other elements, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A visual monitoring method for a workshop, the method comprising:
establishing a virtual scene for the workshop, wherein the virtual scene comprises an object model;
associating the object model with a physical device of the workshop;
acquiring real-time data of the physical device;
presenting an operating state of the physical device with the object model based on the real-time data of the physical device; and
in response to an input signal received from the object model, correspondingly controlling the physical device.
2. The visual monitoring method of claim 1, wherein the controlling of the physical device in response to the input signal received from the object model comprises:
when the input signal is a first signal and the physical device associated with the object model is in a powered-off state, controlling the physical device associated with the object model to power on; and
when the input signal is a second signal and the physical device associated with the object model is in a powered-on state, controlling the physical device associated with the object model to power off, wherein the first signal is the same as or different from the second signal.
3. The visual monitoring method of claim 1, wherein the controlling of the physical device in response to the input signal received from the object model comprises:
when a third signal is received from the object model, displaying a setting interface for a user to set operating parameters of the physical device associated with the object model;
receiving, from the setting interface, the operating parameters set by the user for the physical device associated with the object model; and
controlling the physical device associated with the object model to operate according to the set operating parameters.
4. The visual monitoring method of claim 1, further comprising:
dividing a display area into a first display area and a second display area in response to a user input, wherein the first display area is used to display the virtual scene, the second display area is used to display a scene editing module, and the scene editing module is used to modify the virtual scene according to user input.
5. The workshop visualization monitoring method of claim 4, wherein the scene editing module comprises a scene sub-module, the method further comprising:
displaying index information of the virtual scene in the first display area by using the scene sub-module.
6. The workshop visualization monitoring method of claim 5, wherein the scene editing module further comprises an object sub-module, the method further comprising:
displaying, by using the object sub-module, the object models included in the virtual scene and attributes of the object models; and
deleting object models from and/or adding object models to the virtual scene by using the object sub-module.
7. The workshop visualization monitoring method of claim 6, wherein the scene editing module further comprises a resource sub-module, the method further comprising:
displaying pre-stored object models by using the resource sub-module; and
adding other object models by using the resource sub-module.
8. The workshop visualization monitoring method of claim 7, further comprising:
editing attributes of the newly added object models by using the object sub-module.
9. The workshop visualization monitoring method of claim 7, wherein the scene editing module further comprises a setting sub-module, the method further comprising:
performing related settings on the virtual scene by using the setting sub-module, wherein the related settings comprise setting a display mode of the virtual scene and setting a style of a panel of an object model in the virtual scene.
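Claims 5 through 9 describe four cooperating sub-modules of the scene editing module. The sketch below groups them into one class to show how the responsibilities divide; the class, its methods, and the resource names are all hypothetical, chosen only to mirror the claim language.

```python
class SceneEditor:
    """Hypothetical grouping of the scene, object, resource and setting
    sub-modules described in claims 5-9."""

    def __init__(self, scene: dict):
        self.scene = scene                       # {"name": ..., "objects": [...]}
        self.resources = ["conveyor", "robot"]   # pre-stored object models (claim 7)

    # Scene sub-module: index information of the virtual scene (claim 5).
    def index(self) -> list:
        return [obj["name"] for obj in self.scene["objects"]]

    # Object sub-module: add/delete object models in the scene (claim 6).
    def add_object(self, name: str):
        if name not in self.resources:
            raise ValueError(f"unknown model: {name}")
        self.scene["objects"].append({"name": name})

    def delete_object(self, name: str):
        self.scene["objects"] = [o for o in self.scene["objects"]
                                 if o["name"] != name]

    # Resource sub-module: register a new reusable object model (claim 7).
    def register_resource(self, name: str):
        self.resources.append(name)

    # Setting sub-module: related settings of the virtual scene (claim 9).
    def set_display_mode(self, mode: str):
        self.scene["display_mode"] = mode

editor = SceneEditor({"name": "workshop", "objects": []})
editor.register_resource("press")   # new model added via the resource sub-module
editor.add_object("press")          # placed in the scene via the object sub-module
editor.set_display_mode("3d")       # configured via the setting sub-module
print(editor.index())               # ['press']
```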
10. A computer device, characterized in that the computer device comprises a memory configured to store at least one instruction and a processor configured to implement the workshop visualization monitoring method according to any one of claims 1 to 9 when executing the at least one instruction.
CN202011019926.7A 2020-09-24 2020-09-24 Workshop visualization monitoring method and computer device Pending CN114332340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011019926.7A CN114332340A (en) 2020-09-24 2020-09-24 Workshop visualization monitoring method and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011019926.7A CN114332340A (en) 2020-09-24 2020-09-24 Workshop visualization monitoring method and computer device

Publications (1)

Publication Number Publication Date
CN114332340A true CN114332340A (en) 2022-04-12

Family

ID=81010891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011019926.7A Pending CN114332340A (en) 2020-09-24 2020-09-24 Workshop visualization monitoring method and computer device

Country Status (1)

Country Link
CN (1) CN114332340A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116740301A (en) * 2023-08-11 2023-09-12 深圳麦格米特电气股份有限公司 Three-dimensional virtual monitoring system and method and monitoring equipment
CN116740301B (en) * 2023-08-11 2024-01-30 深圳麦格米特电气股份有限公司 Three-dimensional virtual monitoring system and method and monitoring equipment

Similar Documents

Publication Publication Date Title
CN105404525B (en) Manage the method and device of the basic input output system configuration in computer system
CN113282795B (en) Data structure diagram generation and updating method and device, electronic equipment and storage medium
CN112486491A (en) Page generation method and device, computer equipment and storage medium
CN110781067A (en) Method, device, equipment and storage medium for calculating starting time consumption
CN114237676A (en) FPGA (field programmable Gate array) logic updating method, device, equipment and readable storage medium
CN114332340A (en) Workshop visualization monitoring method and computer device
CN115392501A (en) Data acquisition method and device, electronic equipment and storage medium
CN115185496A (en) Service arrangement method based on Flowable workflow engine
CN115291856A (en) Flow establishing method and device and electronic equipment
CN117472704A (en) Machine room management method, equipment and storage medium based on three-dimensional model
CN111158827A (en) Method and device for graphic configuration tool correlation calculation value information
CN111128357A (en) Monitoring method and device for hospital logistics energy consumption target object and computer equipment
CN115271821A (en) Dot distribution processing method, dot distribution processing device, computer equipment and storage medium
CN115903678A (en) Intelligent production line modeling method and related equipment
CN107018160B (en) Manufacturing resource and clouding method based on layering
CN114265914A (en) Robot display method, device, equipment and storage medium
CN111176644A (en) Automatic layout method and device of operation interface and response method and device thereof
CN111930240A (en) Motion video acquisition method and device based on AR interaction, electronic equipment and medium
CN111459471B (en) Information processing method, device and storage medium
CN112579144A (en) Data processing method and device
CN114584570A (en) Digital mirroring method, server and storage medium
CN113867725A (en) Method, device and equipment for configuration development of intelligent building management system
CN114637564B (en) Data visualization method and device, electronic equipment and storage medium
CN113687813A (en) AI image recognition-based product prototype processing method and device and related equipment
CN112631237A (en) Detection method, device and system of wire controller and nonvolatile storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination