CN114237192B - Digital factory intelligent control method and system based on Internet of things - Google Patents

Digital factory intelligent control method and system based on Internet of things

Info

Publication number
CN114237192B
Authority
CN
China
Prior art keywords
scene
scenes
visual
self
inspection
Prior art date
Legal status
Active
Application number
CN202210186316.9A
Other languages
Chinese (zh)
Other versions
CN114237192A (en)
Inventor
陶雄杰
李建强
肖江
Current Assignee
Guangzhou Likong Yuanhai Information Technology Co ltd
Original Assignee
Guangzhou Likong Yuanhai Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Likong Yuanhai Information Technology Co., Ltd.
Priority to CN202210186316.9A
Publication of CN114237192A
Application granted
Publication of CN114237192B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41865: Total factory control characterised by job scheduling, process planning, material flow
    • G05B2219/00: Program-control systems
    • G05B2219/30: Nc systems
    • G05B2219/32: Operator till task planning
    • G05B2219/32252: Scheduling production, machining, job shop
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses an Internet of things-based intelligent control method and system for a digital factory, wherein the method comprises the following steps: obtaining scene construction data; transmitting the scene construction data to a scene classifier in a first cloud processor, and classifying a plurality of campus scenes sharing the same attribute to obtain a first scene classification result; performing, according to the first scene classification result, three-dimensional modeling of the first manufacturing plant with a three-dimensional visual modeling technique to obtain a first digital factory; and obtaining the real-time workshop operating state and workshop equipment parameters of the first manufacturing plant through a unified standardized interface, performing simulation optimization of the first digital factory, and outputting a first visual digital factory for alarm-based visual control. The method addresses the technical problems in the prior art that most three-dimensional visualization systems are closed architectures with data islands, that the interoperability of visualization data is low, and that control intelligence is imperfect.

Description

Digital factory intelligent control method and system based on Internet of things
Technical Field
The invention relates to the field of digital factories, in particular to a digital factory intelligent control method and system based on the Internet of things.
Background
With the continuous advance of factory informatization and automation and the rapid development of big data and Internet of things technology, the security and stability of data have drawn great attention from society. Current research at home and abroad mainly simulates production-line processes through tool coordinate calibration and workpiece coordinate system calibration, together with Witness-based simulation analysis and optimization. However, much of the domestic and foreign research on three-dimensional visualization of industrial production lines remains relatively isolated and lacks interconnection; effective three-dimensional modeling and intelligent control can improve the quality of digital management in manufacturing plants.
However, in the prior art, most three-dimensional visualization systems are closed architectures: their three-dimensional visual modeling is imperfect, the interoperability of visualization data is low, and control intelligence is limited.
Disclosure of Invention
Aiming at the defects of the prior art, the Internet of things-based intelligent control method and system for a digital factory provided herein solve the technical problems that most three-dimensional visualization systems are closed architectures, that three-dimensional visual modeling is imperfect, that the interoperability of visualization data is low, and that control intelligence is limited. By combining Internet of things and cloud computing technology and unifying space planning and data transmission, the technical effects of optimizing three-dimensional real-time visualization and improving control intelligence are achieved.
In one aspect, the present application provides an Internet of things-based intelligent control method for a digital factory. The method is applied to an Internet of things-based intelligent control system for a digital factory, the system is in communication connection with a first cloud processor, and the method comprises: obtaining a plurality of campus scenes of a first manufacturing plant; obtaining scene construction data from the plurality of campus scenes, wherein the scene construction data comprises scene structure data and scene size data; transmitting the scene structure data and the scene size data to the first cloud processor, wherein the first cloud processor comprises a scene classifier; classifying, according to the scene classifier, similar scenes with the same attribute among the plurality of campus scenes to obtain a first scene classification result; performing, according to the first scene classification result, three-dimensional modeling of the first manufacturing plant with a three-dimensional visual modeling technique to obtain a first digital factory; obtaining the real-time workshop operating state and workshop equipment parameters of the first manufacturing plant through a unified standardized interface; performing simulation optimization of the first digital factory according to the real-time workshop operating state and the workshop equipment parameters, and outputting a first visual digital factory; and performing alarm visual control according to the first visual digital factory.
In another aspect, the present application further provides an Internet of things-based intelligent control system for a digital factory, the system comprising: a first obtaining unit for obtaining a plurality of campus scenes of a first manufacturing plant; a second obtaining unit for obtaining scene construction data from the plurality of campus scenes, where the scene construction data includes scene structure data and scene size data; a first transmission unit for transmitting the scene structure data and the scene size data to a first cloud processor, where the first cloud processor includes a scene classifier; a first classification unit for classifying similar scenes among the plurality of campus scenes according to the scene classifier to obtain a first scene classification result; a first modeling unit for performing three-dimensional modeling of the first manufacturing plant with a three-dimensional visual modeling technique according to the first scene classification result to obtain a first digital factory; a third obtaining unit for obtaining the real-time workshop operating state and workshop equipment parameters of the first manufacturing plant through a unified standardized interface; a first optimization unit for performing simulation optimization of the first digital factory according to the real-time workshop operating state and the workshop equipment parameters and outputting a first visual digital factory; and a first control unit for performing alarm visual control according to the first visual digital factory.
In a third aspect, the present application provides an Internet of things-based intelligent control system for a digital factory, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method of any one of the first aspect when executing the program.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
1. A plurality of campus scenes of a first manufacturing plant are obtained, and the construction characteristics of the scenes are determined from their scene structure data and scene size data. All scene data are transmitted to a first cloud processor for cloud computing, where a scene classifier groups the scenes of the first manufacturing plant into classes of similar attributes and outputs a scene classification result. Three-dimensional modeling is then performed according to the connection relations of the scenes, and a first digital factory is output. The operating states and equipment attribute parameters of the workshop equipment in each campus scene are further obtained through a unified standardized interface, the modeled digital factory is optimized by data simulation, and a first visual digital factory is output, whereupon intelligent control of the monitoring function unit is completed. Full three-dimensional virtual-reality browsing and monitoring and full mouse-driven virtual-reality operation are realized through a three-dimensionally intuitive, interactive, easy-to-use, and real-time data docking mode, achieving the technical effects of optimizing three-dimensional real-time visualization and improving control intelligence by combining Internet of things and cloud computing technology and unifying space planning and data transmission.
The foregoing is only an overview of the technical solutions of the present application. In order to make the technical means of the application clearer so that it can be implemented according to the description, and to make the above and other objects, features, and advantages of the application easier to understand, a detailed description of the application follows.
Drawings
Other features, objects, and advantages of the invention will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings:
fig. 1 is a schematic flow chart of a digital factory intelligent control method based on the internet of things according to an embodiment of the present application;
fig. 2 is a schematic flow chart of operation and maintenance authority hierarchical management of an intelligent control method for a digital factory based on the internet of things according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a method for obtaining a first digital plant in an intelligent control system of a digital plant based on the internet of things according to an embodiment of the present application;
fig. 4 is a schematic view illustrating a scene topology splicing process of a digital factory intelligent control method based on the internet of things according to an embodiment of the present application;
fig. 5 is a schematic flow chart of automatic inspection archiving of the intelligent control method of the digital factory based on the internet of things according to the embodiment of the present application;
fig. 6 is a schematic structural diagram of a digital factory intelligent control system based on the internet of things according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an exemplary electronic device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application solve the technical problems that, in the prior art, most three-dimensional visualization systems are closed architectures, three-dimensional visual modeling is imperfect, the interoperability of visualization data is low, and control intelligence is limited, achieving the technical effects of optimizing three-dimensional real-time visualization and improving control intelligence by combining Internet of things and cloud computing technology and unifying space planning and data transmission.
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are merely some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited to the example embodiments described herein.
In the technical solution of the present application, the acquisition, storage, use, and processing of data comply with the relevant provisions of national laws and regulations.
Existing three-dimensional modeling mostly realizes three-dimensional visualization on foreign platforms, which demand a high technical level from users; operation is therefore complex, and enterprises incur high staffing costs to train users. Meanwhile, efficiency cannot be improved effectively because users cannot adjust and apply scenes in time, and because industrial cloud data are consumed on site in a local-client mode, the scope of scenes and applications is largely confined to the operations room. As a result, digital modeling is under-utilized and intelligent control is imperfect.
In view of the above technical problems, the technical solution provided by the present application has the following general idea:
the application provides a digital factory intelligent control method and system based on the Internet of things, and solves the technical problems that in the prior art, most of three-dimensional visual systems belong to closed architectures, three-dimensional visual modeling is not perfect enough, visual data interoperability is not high, and control intelligence is low. The method comprises the steps of obtaining a plurality of garden scenes of a first manufacturing factory, determining the construction characteristics of the scenes according to scene structure data and scene size data of the garden scenes, transmitting all data of the garden scenes to a first cloud processor for data cloud calculation, classifying the scenes in the first manufacturing factory by similar scenes with the same attribute according to a scene classifier in the first cloud processor, outputting a scene classification result, carrying out three-dimensional modeling according to the connection relation of the scenes, outputting a first digital factory, further obtaining the operation states and the equipment attribute parameters of workshop equipment of the garden scenes in the first manufacturing factory, carrying out data simulation optimization on the digital factory after three-dimensional modeling through a unified standardized interface, outputting the first visual digital factory, and further completing the intelligent control of a monitoring function unit, through the modes of three-dimensional intuition, easy interaction and real-time data docking, the full three-dimensional virtual reality browses and monitors and the full mouse virtual reality operation, and the control intelligence is improved.
For better understanding of the above technical solutions, the following detailed descriptions will be provided in conjunction with the drawings and the detailed description of the embodiments.
Example one
As shown in fig. 1, an embodiment of the present application provides an internet of things-based intelligent control method for a digital factory, where the method is applied to an internet of things-based intelligent control system for a digital factory, the system is communicatively connected to a first cloud processor, and the method includes:
step S100: obtaining a plurality of campus scenarios of a first manufacturing plant;
step S200: obtaining scene construction data according to the plurality of garden scenes, wherein the scene construction data comprises scene structure data and scene size data;
specifically, the plurality of garden scenes of the first manufacturing plant are garden scenes with different functions in the manufacturing plant, and garden scene units, such as power, water treatment, warehousing management, chemical engineering, production manufacturing and the like, are determined according to the industrial type of the manufacturing plant, and since the visualization of the garden scenes requires accurate collection of data of the first manufacturing plant, firstly, scene data collection is performed according to the plurality of garden scenes to obtain scene structure data and scene size data, wherein the scene structure data is the building structure, space design, arrangement design and the like of the scenes, and the scene size data is the building size, geometric size and the like of the scenes, so that the plurality of garden scenes can be subjected to building analysis, and basic scene data can be provided for the subsequent construction of the digital plant.
Step S300: transmitting the scene structure data and the scene size data to the first cloud processor, wherein the first cloud processor comprises a scene classifier;
specifically, the first cloud processor is in communication connection with the system, so that the data volume is large in the process of constructing the digital factory, the computer processing space is increased, the collected scene data are transmitted to the first cloud processor, and the first cloud processor is used for performing cloud computing processing. The first cloud processor comprises a scene classifier, and the scene classifier can divide scenes of building structures with the same attribute, so that the construction efficiency of a digital factory is improved.
Further, the scene structure data and the scene size data are transmitted to the first cloud processor, representative scene features and representative size data are extracted from them, and the classifier is trained on the extracted features, which improves the performance of the scene classifier. Cloud-based processing guarantees the data processing effect while improving the efficiency of the workflow.
Step S400: classifying, according to the scene classifier, similar scenes with the same attribute among the plurality of campus scenes to obtain a first scene classification result;
specifically, according to the scene classifier, the similar scenes with the same attribute are classified in the multiple garden scenes, wherein the scene classifier can perform machine learning according to the extracted scene structure and size features, so that the scene classification performance of the scene classifier is improved, further, analysis is performed according to the similarity of the scene attributes in the classification process, the scene buildings with the same structure are classified into the same attribute category, the similarity of the scenes with the same attribute category is analyzed, further, the first scene classification result is obtained, and the similar scene modeling is performed based on the first scene classification result.
Step S500: according to the first scene classification result, performing three-dimensional modeling on the first manufacturing factory by adopting a three-dimensional visual modeling technology to obtain a first digital factory;
specifically, according to the first scene classification result, a three-dimensional visualization modeling technology is adopted for three-dimensional modeling, wherein the system provides core three-dimensional components for driving graphic display according to a three-dimensional engine, supports most three-dimensional model formats according to a graphic interface, and meets the requirements of different manufacturing modes, and on the basic functional unit, the three-dimensional visualization modeling technology is adopted for three-dimensional modeling of the first manufacturing factory, wherein the three-dimensional visualization modeling technology can also be combined with a rapid modeling technology, an integrated factory real-time monitoring equipment sensor technology, a camera monitoring technology and other means to provide visual display of the first manufacturing factory.
Furthermore, because the design and composition of the first manufacturing plant are uniform and the building structures of scenes serving the same function are similar, different scenes with similar building structures are digitally modeled together according to the first scene classification result, in order to improve modeling efficiency. The system can also provide a visual configuration function: the operation menu, buttons, monitoring panel, layer options, and three-dimensional scene angle can be adjusted, and the three-dimensional effect can be set according to user preference, improving the modeling effect of the first digital factory.
Step S600: obtaining the real-time workshop operation state and workshop equipment parameters of the first manufacturing factory through a unified standardized interface;
step S700: performing simulation optimization on the first digital factory according to the real-time workshop running state and the workshop equipment parameters, and outputting a first visual digital factory;
specifically, in order to improve the visual control effect of the first digital factory, the production line of the first manufacturing factory is obtained through a unified standardized interface, and the equipment parameters, the equipment attributes and the operation state of the equipment in the manufacturing workshop are analyzed according to the production flow line, so that the first digital factory is subjected to simulation optimization according to the real-time operation state, and the first visual digital factory combined with the production equipment is output.
Furthermore, when the first digital factory is simulated and optimized according to the real-time workshop operating state and the workshop equipment parameters, the process flows of all production procedures and the key actions of production equipment are simulated and displayed as animations, so that every production flow can be presented rapidly. At the same time, deeper development is carried out for the different processes: production data are fused with the three-dimensional model equipment, linkage and control between the three-dimensional model equipment and the actual physical production equipment can be realized, real-time operating data of field equipment are displayed, and digital controllability is improved.
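The fusion of production data with the three-dimensional model equipment can be pictured as a binding object that overlays the latest sensor reading on a model node; the class and attribute names below are purely illustrative assumptions, not part of the patent's system.

```python
class DeviceModelBinding:
    """Bind one 3D model node to live readings from its physical device."""

    def __init__(self, node_id, unit):
        self.node_id = node_id  # hypothetical identifier of the model node
        self.unit = unit        # engineering unit of the sensor reading
        self.latest = None

    def update(self, reading):
        """Record a reading and return the overlay to draw on the model."""
        self.latest = reading
        return {"node": self.node_id, "label": f"{reading:.1f} {self.unit}"}

binding = DeviceModelBinding("press-01-model", "bar")
overlay = binding.update(87.5)
```

A real system would call `update` whenever the unified standardized interface delivers a new reading, keeping the model display in step with the field equipment.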
Step S800: and performing alarm visual control according to the first visual digital factory.
Specifically, real-time data display on the three-dimensional model equipment realizes three-dimensional visualization of dynamic data and supports the business management functions of the system. Detection data continuously returned by the sensors are read from the database and drawn in real time above the corresponding equipment stations and sensors in the three-dimensional scene, so that the operating state of the equipment is monitored promptly. Alarm management of the first visual digital factory is performed according to the configured real-time data display, achieving the technical effects of optimizing three-dimensional real-time visualization and improving control intelligence by combining Internet of things and cloud computing technology and unifying space planning and data transmission.
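The alarm management of step S800 can be sketched as a threshold check over the readings drawn into the scene; the limit table and the alarm record layout are assumptions for illustration only.

```python
def check_alarms(readings, limits):
    """Return an alarm record for every reading outside its (low, high) range."""
    alarms = []
    for station, value in readings.items():
        low, high = limits[station]
        if not low <= value <= high:
            alarms.append({"station": station, "value": value,
                           "limits": (low, high)})
    return alarms

# Hypothetical latest readings and configured limits for two stations.
alarms = check_alarms(
    readings={"press-01": 87.5, "oven-02": 251.0},
    limits={"press-01": (0.0, 100.0), "oven-02": (0.0, 250.0)},
)
```

Each returned record would drive a visual alarm above the offending station in the three-dimensional scene.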
Further, as shown in fig. 2, in order to improve the management performance of multiple scenarios, step S100 in this embodiment of the present application further includes:
step S110: obtaining scene scale data and equipment distribution data of the plurality of campus scenes;
step S120: performing park grade identification on the plurality of park scenes according to the scene scale data and the equipment distribution data to obtain grade identification information of the plurality of park scenes;
step S130: acquiring configuration information of operation and maintenance personnel based on the grade identification information of the plurality of garden scenes;
step S140: and realizing the operation and maintenance authority hierarchical management of the plurality of park scenes according to the configuration information of the operation and maintenance personnel.
Specifically, since the first manufacturing plant includes a plurality of campus scenarios, in order to implement intelligent management of the multi-campus scenarios, it is necessary to perform operation and maintenance staff configuration and scenario hierarchical management for the plurality of campus scenarios.
Further, the operation and maintenance staff are configured as follows. First, scene management grade analysis is performed according to the scene scale data and the equipment distribution data, and the staff for each management level are determined according to the analyzed grade, so that hierarchical management of operation and maintenance authority over the plurality of campus scenes is realized. Because all scenes are managed on the same platform, a manager does not need to switch systems or log into a new one to check the remote monitoring state; the system further provides hierarchical authority management from scenes down to equipment and operation and maintenance functions, so that each operator's responsibility is clear. The multiple scenes are integrated in one system, giving overall control of the actual projects at each site. Management and switching of the multiple campus scenes can be performed by operating the three-dimensional scene with the mouse: the scene can be zoomed in or out, translated up, down, left, and right, and rotated to any angle, and browsing can progress level by level through the campus, building, workshop, production line, equipment, and part levels.
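One way to picture the grade identification and hierarchical authority of steps S110 through S140; the scoring formula, grade letters, and permission sets below are invented for illustration and are not specified by the patent.

```python
def identify_grade(scale_m2, device_count):
    """Derive a campus management grade from scene scale and device count."""
    score = scale_m2 / 1000 + device_count / 10  # illustrative weighting
    if score >= 10:
        return "A"
    if score >= 5:
        return "B"
    return "C"

# Hypothetical grade -> operation and maintenance authority mapping.
AUTHORITY = {
    "A": {"view", "operate", "configure", "dispatch"},
    "B": {"view", "operate", "configure"},
    "C": {"view", "operate"},
}

grade = identify_grade(scale_m2=8000, device_count=40)  # score 12.0
allowed_operations = AUTHORITY[grade]
```

Staff assigned to a scene would then receive only the operations allowed for that scene's grade, which is the sense in which authority is managed hierarchically.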
Further, the system includes a scene editor, and the embodiment of the present application further includes step S900:
step S910: acquiring a first newly added scene according to the scene editor, wherein the first newly added scene is scene information newly added by an editing user in real time;
step S920: determining a first scene category by classifying and identifying the first newly added scene;
step S930: judging whether the first scene type is in the first scene classification result, and if the first scene type is not in the first scene classification result, obtaining a first adding instruction;
step S940: and updating the first scene classification result according to the first scene category.
Specifically, by providing highly flexible scene editing, a user can freely add three-dimensional campus scenes, and the addition, switching, and display of scenes at multiple physical sites are supported, meeting the requirements of multi-center management. A first newly added scene is therefore obtained through the scene editor and classified and identified by the scene classifier. Whether the newly added scene belongs to an existing classification unit is judged from the identification result: if its category is already in the first scene classification result, the next processing flow continues; if it is not, the classification result of the new scene is added to the first scene classification result according to a first adding instruction. Intelligent identification and dynamic updating of newly added scenes are thus realized, the sampling of the classification result is enriched, and scene modeling efficiency is improved.
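Steps S920 through S940 amount to a membership test followed by an update; a minimal sketch, with the category-to-scenes mapping assumed for illustration:

```python
def update_classification(classification, category, scene_name):
    """Add a newly edited scene; create the category if it is not yet known."""
    if category not in classification:
        # The 'first adding instruction': register a brand-new category.
        classification[category] = []
    classification[category].append(scene_name)
    return classification

# Hypothetical existing first scene classification result.
first_scene_classification = {"workshop": ["workshop-A", "workshop-B"]}
updated = update_classification(first_scene_classification, "warehouse", "W-1")
```

A new scene whose category already exists simply joins that class; an unseen category extends the classification result, which is the dynamic update described above.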
Furthermore, the three-dimensional scene editor is a basic supporting tool for the three-dimensional visual Internet of things and can provide a mature commercial solution covering everything from three-dimensional scene creation to three-dimensional visual management, operation, and maintenance. The system offers managers a convenient self-defined drawing tool that supports modifying the structure of the production workshop, adding and deleting monitoring equipment, and editing the monitored objects and the display styles of their monitored attributes, providing a flexible solution for practical engineering management and enabling rapid visual deployment.
Further, as shown in fig. 3, according to the first scene classification result, performing three-dimensional modeling on the first manufacturing plant by using a three-dimensional visualization modeling technique to obtain a first digital plant, where step S500 in this embodiment of the present application further includes:
step S510: modeling each scene in the first scene classification result by adopting a three-dimensional visual modeling technology, and outputting a plurality of visual scenes;
step S520: performing scene splicing relation analysis on the plurality of visual scenes to generate a first splicing topological structure;
step S530: performing scene splicing self-check on the plurality of visual scenes based on the first splicing topological structure to obtain a first self-check result, wherein the first self-check result comprises a self-check passing state and a self-check failing state;
step S540: and when the first self-checking result is that the self-checking is passed, connecting the plurality of visual scenes according to the first splicing topological structure to obtain the first digital factory.
Specifically, each scene in the first scene classification result is modeled with the three-dimensional visual modeling technology, so that scene models of all garden scenes in the first manufacturing plant are output as the plurality of visual scenes. The plurality of visual scenes are then analyzed for their scene space structure. Because the garden scenes are collected independently when the first manufacturing plant data are gathered, the visual scenes output after each garden scene is modeled are independently distributed; the garden scenes therefore need to be connected and analyzed according to the arrangement relationship of the scene space structure to generate the first splicing topological structure, which records the splicing relationships among the garden scenes. All the garden scenes are then spliced according to the first splicing topological structure to generate the first digital factory.
Further, before the plurality of visual scenes are connected according to the first splicing topological structure, a scene data self-check needs to be performed on the generated visual scenes to obtain a first self-check result. The first self-check result comprises a first result and a second result: the first result is that the self-check passes, namely the output visual scenes reach the data completeness required for splicing; the second result is that the self-check fails, namely the output visual scenes do not reach that completeness, and scene correction must be carried out until the self-check passes. The plurality of visual scenes are then connected according to the first splicing topological structure to obtain the first digital factory.
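Steps S510-S540 can be illustrated by the following sketch. All names (`self_check`, `connect_scenes`, the `mesh`/`extent` completeness keys) are invented for illustration; the real self-check would test whatever data completeness the splicing engine requires.

```python
def self_check(scenes, required_keys=("mesh", "extent")):
    """Step S530: a scene passes the self-check if it carries all the
    data fields that splicing execution needs; otherwise it fails."""
    failed = [name for name, data in scenes.items()
              if not all(key in data for key in required_keys)]
    return ("pass", []) if not failed else ("fail", failed)

def connect_scenes(scenes, topology):
    """Step S540: only when the self-check passes are the visual scenes
    connected along the splicing topology (here, an edge list)."""
    status, failed = self_check(scenes)
    if status != "pass":
        raise ValueError(f"self-check failed for scenes: {failed}")
    # Represent the digital factory as the scenes plus their connections.
    return {"scenes": scenes, "edges": list(topology)}

# Usage: two modeled scenes spliced along one topological edge.
scenes = {
    "workshop":  {"mesh": "...", "extent": (0, 0, 50, 30)},
    "warehouse": {"mesh": "...", "extent": (50, 0, 90, 30)},
}
factory = connect_scenes(scenes, [("workshop", "warehouse")])
```

A scene lacking a required field would put the result into the fail state, routing the flow to the correction process of steps S531-S533 rather than to splicing.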
Further, as shown in fig. 4, based on the first splicing topology, performing scene splicing self-inspection on the multiple visual scenes to obtain a first self-inspection result, where step S530 in this embodiment of the present application further includes:
step S531: if the first self-checking result is that the self-checking fails, outputting N topological nodes in the first splicing topological structure, wherein the N topological nodes are nodes with abnormal self-checking, and N is a positive integer greater than or equal to 1;
step S532: obtaining abnormal branch scenes of the N topological nodes;
step S533: and performing three-dimensional scene correction calibration on the abnormal branch scene connected with each topological node in the N topological nodes, and performing topological splicing according to the corrected and calibrated scene to obtain the first digital factory.
Specifically, since the first self-inspection result comprises the first result and the second result, if the self-inspection result is the second result, that is, the scene splicing self-inspection fails because the output visual scenes do not reach the data completeness required for splicing, scene correction must be carried out until the self-inspection passes. The correction process is as follows.
The first splicing topological structure represents the spatial relationship of all garden scenes, so the scenes at the splicing connection points are self-checked on the basis of that topology. When the self-check fails, the N abnormal nodes in the first splicing topological structure are obtained and the abnormal branch scenes are determined. For example, if three garden scenes are topologically spliced at one node and a splicing abnormality occurs during the self-check at that node, the scene among the three whose splicing does not match is identified, three-dimensional scene correction and calibration are performed on that abnormal branch scene, and topology splicing is then carried out to obtain the first digital factory. The scene splicing quality, and therefore the quality of the first digital factory, is thereby improved.
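The correction loop of steps S531-S533 might look like the following. This is a hedged illustration under invented assumptions: the seam test (`same_grid`) and the correction (`snapping every branch scene to a common grid`) stand in for whatever real mismatch test and three-dimensional calibration the modeling tool applies.

```python
def find_abnormal_nodes(topology, seam_matches):
    """Step S531: return the N topology nodes whose connected branch
    scenes fail the seam match test (the self-check abnormality)."""
    return [node for node, scenes in topology.items()
            if not seam_matches(scenes)]

def correct_and_splice(topology, seam_matches, correct_scene):
    """Steps S532-S533: correct every branch scene at each abnormal
    node, then the (now consistent) topology can be re-spliced."""
    abnormal = find_abnormal_nodes(topology, seam_matches)
    for node in abnormal:
        topology[node] = [correct_scene(s) for s in topology[node]]
    return topology, abnormal

# Toy example: three scenes join at node n1 but one uses a different
# grid size, so their seams cannot match; n2's two scenes agree.
topo = {"n1": [("A", 10), ("B", 10), ("C", 12)],
        "n2": [("D", 8), ("E", 8)]}
same_grid = lambda scenes: len({grid for _, grid in scenes}) == 1
snap_to_common = lambda scene: (scene[0], 10)  # illustrative calibration
topo, bad = correct_and_splice(topo, same_grid, snap_to_common)
```

After the pass, only the abnormal node `n1` was touched, matching the text's point that correction is confined to the abnormal branch scenes rather than remodeling everything.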
Further, as shown in fig. 5, the step S800 of performing alarm visualization control according to the first visualization digital factory further includes:
step S810: acquiring a first preset routing inspection route and first preset routing inspection time;
step S820: triggering a first inspection instruction according to the first preset inspection time;
step S830: according to the first routing inspection instruction, routing inspection is carried out on the first visual digital factory according to the first preset routing inspection route, and first routing inspection equipment parameters are output;
step S840: and archiving the first inspection equipment parameters to realize monitoring tracing management and obtain first early warning information.
Specifically, the first preset routing inspection route is an automatic inspection route set in advance; route planning can be performed automatically according to the equipment safety and fault points of the first visual digital factory. The first preset inspection time is an inspection time that is continuously updated from the historical inspection records or set manually, and the first inspection instruction is the instruction that triggers the inspection. When the first preset inspection time triggers a first inspection instruction, the first visual digital factory is inspected along the first preset inspection route, a first inspection equipment parameter is output, and that parameter is archived. Furthermore, the automatic inspection function simulates operation and maintenance personnel through an automatic inspection animation: the inspection line is edited in the scene in advance, the monitoring information of the corresponding equipment is bound, and the inspection animation is designed for first-person inspection. The inspection information can be archived during inspection and report information can be generated for convenient query, so that real-time monitoring and tracing of the automatic inspection are realized.
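A minimal sketch of steps S810-S840 follows. Every name here (`run_inspection`, the record layout, the stand-in telemetry reader) is an assumption for illustration; the real system would drive the inspection animation and read live device monitoring data.

```python
from datetime import datetime

def run_inspection(route, read_device, now, preset_time, archive):
    """Fire the inspection instruction (step S820) once the preset time
    is reached, walk the preset route (step S830), and archive every
    device's parameters for tracing (step S840)."""
    if now < preset_time:          # preset time not yet reached: no trigger
        return None
    records = []
    for device in route:           # step S830: inspect along the route
        records.append({"device": device,
                        "params": read_device(device),
                        "time": now.isoformat()})
    archive.extend(records)        # step S840: archive for trace management
    return records

# Usage: an inspection triggered at 08:00 against a 07:30 preset time.
archive = []
route = ["pump-1", "conveyor-2"]
read = lambda device: {"temp_c": 41.5}   # stand-in for live telemetry
out = run_inspection(route, read,
                     datetime(2022, 3, 1, 8, 0),
                     datetime(2022, 3, 1, 7, 30), archive)
```

Because the archive accumulates one timestamped record per device per pass, later queries can reconstruct what was inspected and when, which is the monitoring-tracing property the text describes.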
Furthermore, to improve the real-time performance of visual monitoring and to support intelligent alarm display, a workshop data processing method based on complex event processing is provided, together with definitions of atomic events, simple events and complex events. The real-time performance and alarm data of the equipment can be displayed in the three-dimensional virtual factory, so that the position of a fault alarm is correctly grasped and the alarm can be responded to quickly. Alarms of different levels are distinguished by different colors, the alarm colors can be changed through configuration, and the alarm content can be checked by clicking the alarm icon. The system is integrated and docked with the existing network monitoring system, real-time performance and alarm data are displayed on the three-dimensional virtual factory respectively, and an alarm icon appears and flickers on any equipment in the alarm state, thereby improving the intelligence of scene control.
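The level-to-color mapping with configurable overrides can be sketched as below. The level names and default colors are illustrative assumptions, not values from the patent.

```python
# Assumed default mapping from alarm level to display color; the text
# states only that levels use distinct, configuration-changeable colors.
DEFAULT_ALARM_COLORS = {"info": "blue", "warning": "yellow", "critical": "red"}

def render_alarm(level, config=None):
    """Return the display state of an alarm icon: its color (defaults
    overridden by configuration) and its blinking flag, since any
    equipment in the alarm state shows a flickering icon."""
    colors = {**DEFAULT_ALARM_COLORS, **(config or {})}
    if level not in colors:
        raise KeyError(f"unknown alarm level: {level}")
    return {"color": colors[level], "blinking": True}

# Usage: default coloring, then a configuration override for one level.
state = render_alarm("critical")
custom = render_alarm("warning", config={"warning": "orange"})
```

The dictionary merge keeps the defaults intact while letting a site configuration replace individual colors, matching the "alarm colors can be changed in a configuration mode" behavior.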
Further, the system further includes an indexing unit, and step S1000 in the embodiment of the present application further includes:
step S1010: acquiring first index equipment information according to a user, wherein the first index equipment information comprises equipment name information and equipment position information;
step S1020: inputting the first indexing equipment information into a first indexing unit, and obtaining the visual index positioning of the first visual digital factory according to the first indexing unit, wherein the visual index positioning is the position of the visualized equipment and the production line to which the equipment belongs.
Specifically, the first indexing unit can intelligently position equipment according to the indexing equipment information input by the user. Through the equipment index, the user can directly check the condition of all equipment in the whole three-dimensional virtual factory, and the position of each piece of equipment and the production line it belongs to can be seen at a glance; clicking the name of a piece of equipment positions the viewing angle to the equipment at the corresponding level, so that equipment can be found quickly. Based on the real-time data display function in the first visual digital factory, equipment can be positioned quickly and its operating state queried; furthermore, the physical factory sensor data are associated with the equipment of the virtual factory through the Internet of things.
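The indexing unit of steps S1010-S1020 can be sketched as a simple name-to-location lookup. The class name, fields, and the example device are all invented for illustration; the real unit would also drive the camera to the returned position.

```python
class DeviceIndex:
    """Hypothetical first indexing unit: maps a device name to its
    visual index positioning (device position plus production line)."""
    def __init__(self):
        self._by_name = {}

    def register(self, name, position, production_line):
        self._by_name[name] = {"position": position, "line": production_line}

    def locate(self, name):
        """Step S1020: return the device's position and the production
        line it belongs to, for viewing-angle positioning."""
        entry = self._by_name.get(name)
        if entry is None:
            raise KeyError(f"device not indexed: {name}")
        return entry

# Usage: register a device, then resolve it by name as a user would
# by clicking the device name in the index.
index = DeviceIndex()
index.register("press-3", position=(12.0, 4.5, 0.0),
               production_line="stamping")
loc = index.locate("press-3")
```

Keyed lookup by device name is what makes the "know the position and production line at a glance" behavior cheap: the index answers without scanning the whole scene graph.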
Compared with the prior art, the invention has the following beneficial effects:
1. A plurality of garden scenes of a first manufacturing plant are obtained, and the construction characteristics of the scenes are determined from the scene structure data and scene size data of the garden scenes. All data of the garden scenes are transmitted to a first cloud processor for cloud computing, where a scene classifier classifies the scenes of the first manufacturing plant into classes with similar attributes and outputs a scene classification result. Three-dimensional modeling is performed according to the connection relationship of the scenes, and a first digital plant is output. The operating states and equipment attribute parameters of the workshop equipment of the garden scenes in the first manufacturing plant are then obtained, data simulation optimization is performed on the digital plant through a unified standardized interface, and a first visual digital plant is output, after which the intelligent control of the monitoring function unit is completed. Full three-dimensional virtual-reality browsing monitoring and full-mouse virtual-reality operation are realized in a three-dimensional, visual, interactive, easy-to-use and real-time data docking mode, achieving the technical effects of combining the Internet of things technology and the cloud computing technology, unifying space planning and data transmission, optimizing three-dimensional real-time visualization and improving control intelligence.
2. The plurality of garden scenes are connected and analyzed according to the arrangement relationship of the scene space structures to generate a first splicing topological structure; the splicing relations among the garden scenes are connected according to that structure, and three-dimensional scene correction and calibration are performed on abnormal branch scenes, so that the scene splicing quality and therefore the quality of the first digital factory are improved.
3. Hierarchical authority management from scene to equipment to operation and maintenance function assigns clear responsibility to each operation and maintenance member; integrating multiple scenes in one system realizes overall control of the actual scenes of all sites and improves data interoperability.
Example two
Based on the same inventive concept as the digital plant intelligent control method based on the internet of things in the foregoing embodiment, the present invention further provides a digital plant intelligent control system based on the internet of things, as shown in fig. 6, the system includes:
a first obtaining unit 11, the first obtaining unit 11 being configured to obtain a plurality of campus scenarios of a first manufacturing plant;
a second obtaining unit 12, where the second obtaining unit 12 is configured to obtain scene construction data according to the plurality of campus scenes, where the scene construction data includes scene structure data and scene size data;
a first transmission unit 13, where the first transmission unit 13 is configured to transmit the scene structure data and the scene size data to a first cloud processor, where the first cloud processor includes a scene classifier therein;
a first classification unit 14, where the first classification unit 14 is configured to perform similar scene classification on the multiple garden scenes according to the scene classifier, and obtain a first scene classification result;
a first modeling unit 15, where the first modeling unit 15 is configured to perform three-dimensional modeling on the first manufacturing plant by using a three-dimensional visualization modeling technology according to the first scene classification result, so as to obtain a first digital plant;
a third obtaining unit 16, where the third obtaining unit 16 is configured to obtain a real-time workshop operating state and workshop equipment parameters of the first manufacturing plant through a unified standardized interface;
a first optimization unit 17, where the first optimization unit 17 is configured to perform simulation optimization on the first digital plant according to the real-time workshop operation state and the workshop equipment parameters, and output a first visual digital plant;
a first control unit 18, where the first control unit 18 is configured to perform alarm visualization control according to the first visual digital plant.
Further, the system further comprises:
a fourth obtaining unit configured to obtain scene scale data and device distribution data of the plurality of campus scenes;
a fifth obtaining unit, configured to perform campus level identification on the multiple campus scenes according to the scene scale data and the device distribution data, and obtain level identification information of the multiple campus scenes;
a sixth obtaining unit, configured to obtain configuration information of the operation and maintenance staff based on the level identification information of the plurality of campus scenes;
and the first configuration unit is used for realizing the operation and maintenance authority hierarchical management of the plurality of garden scenes according to the operation and maintenance personnel configuration information.
Further, the system further comprises:
a seventh obtaining unit, configured to obtain a first newly added scene according to the scene editor, where the first newly added scene is scene information newly added in real time by an editing user;
a first determining unit, configured to determine a first scene category by performing classification recognition on the first newly added scene;
a first judging unit, configured to judge whether the first scene category is in the first scene classification result, and if the first scene category is not in the first scene classification result, obtain a first adding instruction;
a first updating unit, configured to update the first scene classification result according to the first scene category.
Further, the system further comprises:
the second modeling unit is used for modeling each scene in the first scene classification result by adopting a three-dimensional visual modeling technology and outputting a plurality of visual scenes;
the first generation unit is used for carrying out scene splicing relation analysis on the plurality of visual scenes to generate a first splicing topological structure;
the first self-inspection unit is used for carrying out scene splicing self-inspection on the plurality of visual scenes based on the first splicing topological structure to obtain a first self-inspection result, wherein the first self-inspection result comprises a self-inspection passing result and a self-inspection failing result;
an eighth obtaining unit, configured to, when the first self-inspection result is that the self-inspection passes, connect the multiple visualization scenes according to the first splicing topology, to obtain the first digital factory.
Further, the system further comprises:
a first output unit, configured to output N topology nodes in the first splicing topology structure if the first self-inspection result is that the self-inspection fails, where the N topology nodes are nodes with abnormal self-inspection, and N is a positive integer greater than or equal to 1;
a ninth obtaining unit, configured to obtain an abnormal branch scenario of the N topology nodes;
a tenth obtaining unit, configured to perform three-dimensional scene correction calibration on an abnormal branch scene connected to each of the N topology nodes, and perform topology splicing according to the corrected and calibrated scenes to obtain the first digital factory.
Further, the system further comprises:
an eleventh obtaining unit, configured to obtain a first preset inspection route and a first preset inspection time;
the first trigger unit is used for triggering a first inspection instruction according to the first preset inspection time;
the first routing inspection unit is used for routing inspection of the first visual digital factory according to the first routing inspection instruction and the first preset routing inspection line and outputting a first routing inspection equipment parameter;
and the twelfth obtaining unit is used for archiving the first inspection equipment parameters to realize monitoring tracing management and obtain first early warning information.
Further, the system further comprises:
a thirteenth obtaining unit configured to obtain first index device information according to a user, wherein the first index device information includes device name information and device location information;
a fourteenth obtaining unit, configured to input the first indexing device information into a first indexing unit, and obtain, according to the first indexing unit, a visualization index location of the first visualization digital factory, where the visualization index location is the position of the visualized device and the production line to which the device belongs.
Various changes and specific examples of the digital plant intelligent control method based on the Internet of things in the first embodiment of fig. 1 are also applicable to the digital plant intelligent control system based on the Internet of things of this embodiment. Through the foregoing detailed description of the method, those skilled in the art can clearly know how the system of this embodiment is implemented, so for brevity of the description the details are not repeated here.
EXAMPLE III
The electronic device of the present application is described below with reference to fig. 7.
Fig. 7 illustrates a schematic structural diagram of an electronic device according to the present application.
Based on the same inventive concept as the digital plant intelligent control method based on the Internet of things in the foregoing embodiments, the invention further provides an electronic device on which a computer program is stored; when the program is executed by a processor, the steps of any one of the foregoing methods for digital plant intelligent control based on the Internet of things are implemented.
In fig. 7, a bus architecture is represented by bus 300. Bus 300 may include any number of interconnected buses and bridges linking together various circuits, including one or more processors, represented by processor 302, and memory, represented by memory 304. The bus 300 may also link together various other circuits such as peripherals, voltage regulators and power management circuits, which are well known in the art and are therefore not described further herein. A bus interface 305 provides an interface between the bus 300 and the receiver 301 and transmitter 303. The receiver 301 and the transmitter 303 may be the same element, i.e. a transceiver, providing a means for communicating with various other systems over a transmission medium. The processor 302 is responsible for managing the bus 300 and general processing, and the memory 304 may be used for storing data used by the processor 302 in performing operations.
The embodiment of the application provides a digital factory intelligent control method based on the Internet of things, which is applied to a digital factory intelligent control system based on the Internet of things, wherein the system is in communication connection with a first cloud processor, and the method comprises the following steps: obtaining a plurality of campus scenarios of a first manufacturing plant; obtaining scene construction data according to the plurality of garden scenes, wherein the scene construction data comprises scene structure data and scene size data; transmitting the scene structure data and the scene size data to the first cloud processor, wherein the first cloud processor comprises a scene classifier; according to the scene classifier, similar scene classification with the same attribute is carried out on the plurality of garden scenes, and a first scene classification result is obtained; according to the first scene classification result, performing three-dimensional modeling on the first manufacturing factory by adopting a three-dimensional visual modeling technology to obtain a first digital factory; obtaining the real-time workshop operation state and workshop equipment parameters of the first manufacturing factory through a unified standardized interface; performing simulation optimization on the first digital factory according to the real-time workshop running state and the workshop equipment parameters, and outputting a first visual digital factory; and performing alarm visual control according to the first visual digital factory. 
The technical problems in the prior art that most three-dimensional visualization systems belong to closed architectures, three-dimensional visualization modeling is imperfect, visualization data interoperability is low and control intelligence is low are thereby solved, and the technical effects of optimizing three-dimensional real-time visualization and improving control intelligence by combining the Internet of things technology and the cloud computing technology and unifying space planning and data transmission are achieved.
Those of ordinary skill in the art will understand that: the various numbers of the first, second, etc. mentioned in this application are only used for the convenience of description and are not used to limit the scope of the embodiments of this application, nor to indicate the order of precedence. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one" means one or more. At least two means two or more. "at least one," "any," or similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one (one ) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable system. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
The various illustrative logical units and circuits described in this application may be implemented or operated upon by general purpose processors, digital signal processors, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic systems, discrete gate or transistor logic, discrete hardware components, or any combination thereof. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing systems, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations may be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the present application and its equivalent technology, it is intended that the present application include such modifications and variations.

Claims (7)

1. A digital factory intelligent control method based on the Internet of things, wherein the method is applied to a digital factory intelligent control system based on the Internet of things, the system is in communication connection with a first cloud processor, and the method comprises the following steps:
obtaining a plurality of campus scenarios of a first manufacturing plant;
obtaining scene construction data according to the plurality of garden scenes, wherein the scene construction data comprises scene structure data and scene size data;
transmitting the scene structure data and the scene size data to the first cloud processor, wherein the first cloud processor comprises a scene classifier;
according to the scene classifier, similar scene classification with the same attribute is carried out on the plurality of garden scenes, and a first scene classification result is obtained;
according to the first scene classification result, performing three-dimensional modeling on the first manufacturing factory by adopting a three-dimensional visual modeling technology to obtain a first digital factory;
obtaining the real-time workshop operation state and workshop equipment parameters of the first manufacturing factory through a unified standardized interface;
performing simulation optimization on the first digital factory according to the real-time workshop running state and the workshop equipment parameters, and outputting a first visual digital factory;
performing alarm visual control according to the first visual digital factory;
according to the first scene classification result, performing three-dimensional modeling on the first manufacturing plant by adopting a three-dimensional visualization modeling technology to obtain a first digital plant, wherein the method further comprises the following steps:
modeling each scene in the first scene classification result by adopting a three-dimensional visual modeling technology, and outputting a plurality of visual scenes;
carrying out scene splicing relation analysis on the plurality of visual scenes to generate a first splicing topological structure;
performing scene splicing self-check on the plurality of visual scenes based on the first splicing topological structure to obtain a first self-check result, wherein the first self-check result comprises a self-check passing state and a self-check failing state;
when the first self-checking result is that the self-checking is passed, connecting the plurality of visual scenes according to the first splicing topological structure to obtain the first digital factory;
if the first self-checking result is that the self-checking fails, outputting N topological nodes in the first splicing topological structure, wherein the N topological nodes are nodes with abnormal self-checking, and N is a positive integer greater than or equal to 1;
obtaining abnormal branch scenes of the N topological nodes;
and performing three-dimensional scene correction calibration on the abnormal branch scene connected with each topological node in the N topological nodes, and performing topological splicing according to the corrected and calibrated scene to obtain the first digital factory.
2. The method of claim 1, wherein the method further comprises:
obtaining scene scale data and equipment distribution data of the plurality of campus scenes;
performing park grade identification on the plurality of park scenes according to the scene scale data and the equipment distribution data to obtain grade identification information of the plurality of park scenes;
acquiring configuration information of operation and maintenance personnel based on the grade identification information of the plurality of garden scenes;
and realizing the operation and maintenance authority hierarchical management of the plurality of park scenes according to the configuration information of the operation and maintenance personnel.
3. The method of claim 1, wherein the system includes a scene editor, the method further comprising:
acquiring a first newly added scene according to the scene editor, wherein the first newly added scene is scene information newly added by an editing user in real time;
determining a first scene category by classifying and identifying the first newly added scene;
judging whether the first scene category is in the first scene classification result, and if the first scene category is not in the first scene classification result, obtaining a first adding instruction;
and updating the first scene classification result according to the first scene category.
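The scene-editor update path of claim 3 amounts to: classify the newly added scene, and if its category is not yet in the classification result, add it. A minimal sketch, with the classifier stubbed out and all names invented:

```python
# Sketch of claim 3: classify a newly added scene and update the scene
# classification result when its category is unseen. Names are illustrative.
def classify_scene(scene):
    # Stand-in for the real scene classifier: category = declared type.
    return scene["type"]

def update_classification(classification, new_scene):
    """classification: dict mapping category -> list of scene names."""
    category = classify_scene(new_scene)
    if category not in classification:    # the "first adding instruction"
        classification[category] = []
    classification[category].append(new_scene["name"])
    return classification
```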
4. The method of claim 1, wherein said performing alarm visual control according to said first visual digital factory further comprises:
acquiring a first preset inspection route and a first preset inspection time;
triggering a first inspection instruction according to the first preset inspection time;
inspecting the first visual digital factory along the first preset inspection route according to the first inspection instruction, and outputting a first inspection equipment parameter;
and archiving the first inspection equipment parameters to realize monitoring tracing management and obtain first early warning information.
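The timed virtual inspection of claim 4 can be sketched as a walk along a preset route with archival and early warning. The route contents, the parameter reader, and the warning threshold are invented for illustration:

```python
# Sketch of claim 4: trigger at the preset time, walk the preset route,
# archive each reading for traceability, and flag early warnings.
def should_trigger(now_minutes, preset_minutes):
    return now_minutes >= preset_minutes      # fire at the preset time

def inspect(route, read_parameter, threshold=100.0):
    """Read each device on the route; return (archive, warning list)."""
    archive, warnings = [], []
    for device in route:
        value = read_parameter(device)
        archive.append((device, value))       # archival -> monitoring tracing
        if value > threshold:
            warnings.append(device)           # first early-warning information
    return archive, warnings
```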
5. The method of claim 1, wherein the system further comprises an indexing unit, the method further comprising:
obtaining first index equipment information input by a user, wherein the first index equipment information comprises equipment name information and equipment position information;
and inputting the first index equipment information into a first indexing unit, and obtaining the visual index positioning of the first visual digital factory according to the first indexing unit, wherein the visual index positioning is the position of the visualized equipment and the production line to which the equipment belongs.
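The indexing unit of claim 5 is essentially a lookup from a device name to its position and production line in the visual factory. A minimal sketch, with a hypothetical registry format:

```python
# Sketch of claim 5's indexing unit. The registry layout
# (name -> (position, production_line)) is an assumption.
def index_device(registry, name):
    """Return the visual index positioning for a device, or None."""
    if name not in registry:
        return None
    position, line = registry[name]
    return {"name": name, "position": position, "line": line}
```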
6. A digital factory intelligent control system based on the Internet of things is characterized by comprising:
a first obtaining unit for obtaining a plurality of park scenes of a first manufacturing plant;
a second obtaining unit, configured to obtain scene construction data according to the plurality of park scenes, where the scene construction data includes scene structure data and scene size data;
a first transmission unit, configured to transmit the scene structure data and the scene size data to a first cloud processor, where the first cloud processor includes a scene classifier;
the first classification unit is used for carrying out similar scene classification on the plurality of park scenes according to the scene classifier to obtain a first scene classification result;
the first modeling unit is used for carrying out three-dimensional modeling on the first manufacturing factory by adopting a three-dimensional visualization modeling technology according to the first scene classification result to obtain a first digital factory;
a third obtaining unit, configured to obtain a real-time workshop operating state and workshop equipment parameters of the first manufacturing plant through a unified standardized interface;
the first optimization unit is used for carrying out simulation optimization on the first digital factory according to the real-time workshop running state and the workshop equipment parameters and outputting a first visual digital factory;
the first control unit is used for performing alarm visual control according to the first visual digital factory;
the second modeling unit is used for modeling each scene in the first scene classification result by adopting a three-dimensional visual modeling technology and outputting a plurality of visual scenes;
the first generation unit is used for carrying out scene splicing relation analysis on the plurality of visual scenes to generate a first splicing topological structure;
the first self-check unit is used for carrying out scene splicing self-check on the plurality of visual scenes based on the first splicing topological structure to obtain a first self-check result, wherein the first self-check result comprises a self-check passing state and a self-check failing state;
an eighth obtaining unit, configured to, when the first self-check result is that the self-check is passed, connect the plurality of visual scenes according to the first splicing topological structure to obtain the first digital factory;
a first output unit, configured to output N topological nodes in the first splicing topological structure if the first self-check result is that the self-check fails, where the N topological nodes are nodes with abnormal self-check, and N is a positive integer greater than or equal to 1;
a ninth obtaining unit, configured to obtain abnormal branch scenes of the N topological nodes;
a tenth obtaining unit, configured to perform three-dimensional scene correction calibration on the abnormal branch scene connected to each of the N topological nodes, and perform topological splicing according to the corrected and calibrated scene to obtain the first digital factory.
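The units of claim 6 form a pipeline from park scenes to the optimized visual factory. The wiring below is a hypothetical sketch in which every unit is a stub callable; the stage names and data shapes are invented, not from the patent.

```python
# Hypothetical end-to-end wiring of claim 6's units as a pipeline of
# callables: obtain -> classify -> model -> optimize.
def pipeline(park_scenes, classify, model, optimize):
    # Second obtaining unit: scene construction data per park scene.
    construction = [{"scene": s, "structure": None, "size": None}
                    for s in park_scenes]
    classified = classify(construction)      # first classification unit
    digital_factory = model(classified)      # first/second modeling units
    return optimize(digital_factory)         # first optimization unit
```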
7. An electronic device, comprising a processor and a memory, wherein: the memory is configured to store instructions; and the processor is configured to execute the method of any one of claims 1-5 by calling the instructions stored in the memory.
CN202210186316.9A 2022-02-28 2022-02-28 Digital factory intelligent control method and system based on Internet of things Active CN114237192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210186316.9A CN114237192B (en) 2022-02-28 2022-02-28 Digital factory intelligent control method and system based on Internet of things

Publications (2)

Publication Number Publication Date
CN114237192A CN114237192A (en) 2022-03-25
CN114237192B true CN114237192B (en) 2022-05-06

Family

ID=80748278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210186316.9A Active CN114237192B (en) 2022-02-28 2022-02-28 Digital factory intelligent control method and system based on Internet of things

Country Status (1)

Country Link
CN (1) CN114237192B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114859744B (en) * 2022-05-07 2023-06-06 内蒙古云科数据服务股份有限公司 Intelligent application visual control method and system based on big data
CN114626835B (en) * 2022-05-16 2022-08-26 腾云互联(浙江)科技有限公司 Visual scheduling method and system for big data of manufacturing plant
CN116258314A (en) * 2022-11-23 2023-06-13 东土科技(宜昌)有限公司 Scene management method and system for production workshop, electronic equipment and storage medium
CN115994674B (en) * 2023-03-22 2023-05-30 广州力控元海信息科技有限公司 Scheduling management method based on digital twin comprehensive energy system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7995055B1 (en) * 2007-05-25 2011-08-09 Google Inc. Classifying objects in a scene
CN103106700B (en) * 2012-12-27 2015-09-30 德讯科技股份有限公司 Automatic data center inspection method based on 3D technology
CN104239431B (en) * 2014-08-27 2017-11-14 广东威创视讯科技股份有限公司 Three-dimension GIS model display methods and device
CN106846460B (en) * 2016-12-30 2020-07-14 译筑信息科技(上海)有限公司 Method and device for realizing three-dimensional BIM image
CN110659778B (en) * 2019-09-26 2022-07-01 山东鲁能软件技术有限公司 Inspection method and system based on three-dimensional model
CN111047689B (en) * 2019-12-20 2023-09-08 合肥卓瑞信息技术有限公司 Computer lab 3D management system
CN111429579A (en) * 2020-04-13 2020-07-17 北京中岩大地科技股份有限公司 Three-dimensional visual automatic monitoring system platform with BIM technology as carrier
CN112581617A (en) * 2020-10-23 2021-03-30 维坤智能科技(上海)有限公司 Three-dimensional scene equipment model of transformer substation and electrical data connection method
CN113343330A (en) * 2021-03-19 2021-09-03 刘昌宏 BIM-based 3D video monitoring and roaming and information equipment operation and maintenance system and method thereof
CN113140037B (en) * 2021-05-13 2022-11-18 天讯方舟(北京)信息科技有限公司 Building information model lightweight and three-dimensional scene visualization system

Similar Documents

Publication Publication Date Title
CN114237192B (en) Digital factory intelligent control method and system based on Internet of things
US8321806B2 (en) Visualization of process control data
TWI662440B (en) System and method for machine tool maintenance and repair
US8655830B2 (en) Systems and methods for reporting a cause of an event or equipment state using causal relationship models in a building management system
US9043003B2 (en) Graphical view sidebar for a process control system
WO2016090929A1 (en) Method, server and system for software system fault diagnosis
CN107958337A (en) A kind of information resources visualize mobile management system
JP2017167889A (en) Data management device, data management system, and data management method
CN112084385B (en) Part-process-equipment association relationship topological view generation method based on digital twinning
CN107169611A (en) A kind of patterned way planning AGV travel regions and the method for monitoring its operation
CN105573224A (en) Monitoring method, monitoring device, and monitoring system based on abstract model
CN115423278A (en) MIXBASE general digital twin visual monitoring platform
CN115809302A (en) Metadata processing method, device, equipment and storage medium
Vasyliuk et al. Construction Features of the Industrial Environment Control System.
CN108959391A (en) Show the equipment, system, method, storage medium of data-base cluster architecture diagram
CN113050501A (en) Workshop virtual monitoring system and service terminal
CN116643542A (en) Oil field station digital twin process configuration method, system and device based on low-code development platform
JP6792670B2 (en) Data management device, data management system and data management method
CN115345078B (en) Cable management method and device based on cable iteration model
CN115118578B (en) SCADA system based on WEB
CN115859689A (en) Panoramic visualization digital twin application method
CN115546435A (en) Communication resource monitoring and early warning system and method based on three-dimensional model, electronic device and storage medium
CN103595819A (en) Method for online testing service usability of web system
CN108664370B (en) Distributed industrial on-line configuration monitoring system and method
CN117272684B (en) Method and device for constructing production equipment operation management and control information model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant