CN103019702B - Visual three-dimensional display control editing system and method - Google Patents

Visual three-dimensional display control editing system and method

Info

Publication number
CN103019702B
CN103019702B (application CN201210490835.0A)
Authority
CN
China
Prior art keywords
instruction
information
logic
module
user
Prior art date
Legal status
Active
Application number
CN201210490835.0A
Other languages
Chinese (zh)
Other versions
CN103019702A (en)
Inventor
刘涛 (Liu Tao)
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201210490835.0A
Publication of CN103019702A
Application granted
Publication of CN103019702B

Classifications

    • Y — General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02 — Technologies or applications for mitigation or adaptation against climate change
    • Y02P — Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention provides a visual three-dimensional display control editing system and method. The system comprises an input/output device, a visual editing device, a control editing device, and a device management device. By applying the invention, various user schemes suitable for different scenarios can be formed with a single unified visual three-dimensional display control editing system, reducing the difficulty of developing user schemes based on three-dimensional display control technology.

Description

Visual three-dimensional display control editing system and method
Technical Field
The present application relates to the field of computer technology, and in particular to a visual three-dimensional display control editing system and method.
Background
With the rapid development of computer software and hardware, applications of computer graphics across industries are quickly becoming more widespread and more sophisticated. Computer graphics has now entered the three-dimensional era, and three-dimensional graphics are everywhere. Scientific computing visualization, computer animation and virtual reality have become three major topics of computer graphics in recent years, and the technical core of all three is three-dimensional graphics. Because three-dimensional graphics involves many algorithms and much specialized expertise, developing three-dimensional applications quickly is difficult.
With the development of hardware, the popularization of multi-core CPU platforms, and advances in parallel theory, improving the performance of application systems through parallel technology has become an important trend in system development. To truly benefit from a multi-core CPU, however, the application system must be designed carefully. The system itself must be able to divide a work task into several sub-tasks that can be executed in parallel; with the support of the operating system (or a specific system operating platform), these sub-tasks are allocated to the multiple CPU processing cores of the computer and executed in parallel; and after the parallel run finishes, the sub-task results are combined into the final processing result. Only then does the application system gain the capability of "parallel computing".
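To make the divide / run-in-parallel / merge pattern described above concrete, the following minimal sketch uses Python's standard concurrent.futures module; the task (summing chunks of a list) and the function names are placeholders, not anything defined by the patent.

```python
# Minimal sketch of the divide / run-in-parallel / merge pattern described above.
from concurrent.futures import ProcessPoolExecutor

def process_subtask(chunk):
    # Each sub-task runs independently, so it can be scheduled on its own CPU core.
    return sum(chunk)

def run_in_parallel(data, workers=4):
    # 1. Divide the work task into sub-tasks.
    chunk_size = max(1, len(data) // workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # 2. Execute the sub-tasks in parallel on multiple cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partial_results = list(pool.map(process_subtask, chunks))
    # 3. Combine the sub-task results into the final processing result.
    return sum(partial_results)

if __name__ == "__main__":
    print(run_in_parallel(list(range(1_000_000))))
```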
In addition, the prior art proposes the Computer Control System (CCS). A computer control system is a system in which a computer participates in control and is connected to the controlled object through auxiliary components in order to achieve a given control objective. The computer is typically a digital computer and may be of various sizes, ranging from a microcomputer to a large general-purpose or special-purpose computer. The auxiliary components mainly include input/output interfaces, detection devices, execution devices, and the like. However, such a computer control system generally provides only a simple user interface; it cannot intuitively reflect the specific situation and process flow of the field devices, its interactivity is weak, and the user experience is poor.
However, in the prior art, because the related technologies span a wide range, are relatively difficult, and are costly to develop, a technician typically produces a user scheme for one specific scenario, and that user scheme can be used only for that scenario. If a corresponding user scheme is desired for another scenario, the technician must independently produce a new user scheme for that other scenario from scratch. In other words, the control editing system a technician uses to produce one user scheme in the prior art can generally be used only for that user scheme and cannot produce user schemes applicable to other scenarios. A technician who wishes to produce different user schemes for multiple scenarios must therefore use different control editing systems, and cannot form user schemes suitable for different scenarios through one unified control editing system.
In summary, because the control editing system in the prior art has the disadvantages described above, how to provide a better control editing system and method, so that various user schemes suitable for different scenarios can be formed through a unified control editing system, is a problem that remains to be solved in the art.
Disclosure of Invention
In view of this, the present invention provides a system and a method for controlling and editing a visual three-dimensional display, so that various user schemes suitable for different scenes can be formed by a unified visual three-dimensional display control and editing system, and the difficulty in developing the user schemes of the three-dimensional display control technology is reduced.
The technical scheme of the invention is realized as follows:
a visual three-dimensional display control editing system, the system comprising: an input/output device, a visual editing device, a control editing device and a device management device;
the input and output device is used for receiving a plurality of user instructions input by a user and sending the received user instructions to the visual editing device; and is further used for outputting the received user scheme;
the visual editing device is used for acquiring a resource file from the outside according to a user instruction and sending the acquired resource file to the control editing device; it is further used for sending a resource reading instruction, a device indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device according to a user instruction; it is further used for converting and displaying the received resource file information, logic information, object information, scene information, rendered objects and rendered final rendering set of the scene model; and it compiles the received scene model into a user scheme and sends the user scheme to the input and output device;
the control editing device is used for converting the format of the received resource file and then storing it, and sending the resource file information to the visual editing device; reading a required resource file from the stored, format-converted resource files according to a resource reading instruction; sending the received device indication instruction to the device management device, and receiving the device information of the required field devices returned by the device management device according to the device indication instruction; sending the logic information carried in the received adjustment instruction, frame instruction and control instruction to the visual editing device; sending a device control instruction to the device management device according to the control instruction, and receiving updated device information returned by the device management device according to the device control instruction; synthesizing the received device information, the updated device information and the read format-converted resource file into a required object according to the adjustment instruction, the frame instruction and the control instruction, rendering the synthesized object and sending the rendered object to the visual editing device; combining the synthesized object and scene information into a required scene model according to a scene combination instruction, and sending the scene model to the visual editing device; and performing parallel computation on the received scene model, filtering out objects that do not need to be displayed in the scene model, generating a final rendering set of the scene model, rendering the final rendering set, and sending the rendered final rendering set to the visual editing device;
the device management device is connected with various field devices and is used for acquiring the parameters of the required field devices according to the device indication instruction and sending the acquired parameters of the field devices to the control editing device as device information; it is further used for sending control commands to each field device that needs to be controlled according to the device control instruction, thereby correspondingly controlling each such field device, and sending the parameters of the controlled field devices to the control editing device as updated device information.
Preferably, the input/output device includes: an input module and an output module; wherein,
the input module is used for receiving a plurality of user instructions input by a user and sending the received user instructions to the visual editing device;
and the output module is used for outputting the received user scheme.
Preferably, the visual editing apparatus includes: an editing module, a display module and a compiling module; wherein,
the editing module is used for receiving a user instruction; acquiring a resource file from the outside according to the user instruction and sending the acquired resource file to the control editing device; it is further used for sending a resource reading instruction, a device indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device according to a user instruction; sending a compiling instruction to the compiling module according to the user instruction; and converting the received resource file information, logic information, object information, scene information, rendered objects and rendered final rendering set of the scene model and then sending them to the display module;
the display module is used for displaying the received resource file information, the logic information, the object information, the scene information, the rendered object and the rendered final rendering set of the scene model after conversion;
and the compiling module is used for compiling the received scene model into a user scheme according to the compiling instruction and sending the user scheme to the input and output device.
Preferably, the editing module is further configured to send the received user instruction to the display module;
the compiling module is also used for sending the compiled user scheme to the display module.
Preferably, the control editing apparatus includes: a storage module, a resource management module, a logic management module, an object management module, a scene management module and a rendering management module; wherein,
the resource management module is used for converting the format of the received resource file and then sending the resource file to the storage module; reading a required resource file from a storage module according to a resource reading instruction, and sending the read resource file subjected to format conversion to the object management module; sending the resource file information to the visual editing device;
the logic management module is used for sending the adjustment logic, frame logic and control logic carried in the received adjustment instruction, frame instruction and control instruction to the storage module, and sending the logic information of the adjustment logic, frame logic and control logic to the visual editing device; it is further used for reading the required adjustment logic, frame logic and control logic from the storage module according to a logic calling instruction sent by the object management module, and sending the read adjustment logic, frame logic and control logic to the object management module; and it is further used for sending a device control instruction to the device management device according to the control logic carried in the control instruction and the list of field devices to be controlled;
the object management module is used for sending the equipment indication information carried in the received equipment indication instruction to the storage module, sending the equipment indication instruction to the equipment management device, and receiving the equipment information of the field equipment returned by the equipment management device according to the equipment indication instruction; sending a resource reading instruction to the resource management module according to the received resource reading instruction; sending a logic calling instruction to the logic management module according to the received object synthesis instruction, receiving an adjusting logic, a framework logic and a control logic returned by the logic management module, receiving updated equipment information returned by the equipment management device according to the equipment control instruction, and synthesizing the received equipment information, the updated equipment information and the resource file after format conversion into a required object according to the adjusting logic, the framework logic and the control logic; sending the synthesized object to the storage module and the rendering management module; reading an object from the storage module according to the received object calling instruction, and sending the read object to the scene management module;
the scene management module is used for receiving a scene combination instruction carrying scene information, sending an object calling instruction to the object management module according to the scene information in the scene combination instruction, and receiving an object returned by the object management module according to the object calling instruction; the received object and scene information are combined into a required scene model, and the scene model is sent to the rendering management module, the storage module and the visual editing device;
the rendering management module is used for rendering the received object and then sending the rendered object to the visual editing device; it is further used for performing parallel computation on the received scene model, filtering out objects that do not need to be displayed in the scene model, generating a final rendering set of the scene model, rendering the final rendering set and sending the rendered final rendering set to the visual editing device;
the storage module is used for storing the received resource file after format conversion, the adjusting logic, the frame logic, the control logic, the equipment indicating information, the object and the scene model.
Preferably, the object management module is further configured to send the object information of the synthesized object to an editing module in the visual editing apparatus; wherein the object information includes: various attribute information of the object;
and the editing module is also used for converting the received object information and then sending the converted object information to the display module for display.
Preferably, the object management module is further configured to store the object information of the synthesized object in the storage module; and the storage module is further configured to read the corresponding object information according to an object information reading instruction sent by the editing module and send the read object information to the editing module.
Preferably, the scene management module is further configured to store the scene information carried in the scene combination instruction in a storage module, and read the corresponding scene information from the storage module according to the scene information reading instruction sent by the editing module, and send the read scene information to the editing module.
Preferably, the editing module is further configured to send a rendering configuration instruction carrying rendering parameters to the rendering management module;
the rendering management module is further configured to configure rendering parameters according to the received rendering configuration instruction.
Preferably, the device management apparatus includes: a device management primary sub-module and a plurality of device management secondary sub-modules; wherein,
the equipment management primary sub-module is respectively connected with each equipment management secondary sub-module;
and each equipment management secondary submodule is respectively connected with various field equipment.
The invention also provides a visual three-dimensional display control editing method, which comprises the following steps:
receiving a plurality of user instructions input by a user;
acquiring a resource file from the outside according to a user instruction, and performing format conversion and storage on the acquired resource file;
reading a required resource file from the stored resource file after format conversion according to a user instruction;
acquiring required equipment information of each field equipment from equipment management devices connected with various field equipment according to a user instruction;
synthesizing the acquired equipment information and the resource file after format conversion into a required object according to a user instruction;
combining the synthesized object and scene information into a required scene model according to a user instruction;
performing parallel computation on the scene model, filtering objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, and rendering the final rendering set;
and compiling the scene model into a user scheme according to a user instruction and outputting the user scheme.
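The claimed sequence of steps can be pictured, under heavy simplification, as the following Python sketch; every name and data layout here is an illustrative assumption rather than the patent's actual implementation.

```python
def convert_format(raw):
    # Stand-in for converting an external resource file into the internal format.
    return {"converted": raw}

def editing_pipeline(resource_files, device_info, scene_info):
    # Acquire the resource files, convert their format, and store them.
    store = {name: convert_format(data) for name, data in resource_files.items()}
    # Synthesize objects from device information and the converted resources.
    objects = [{"resource": res, "devices": device_info, "visible": True}
               for res in store.values()]
    # Combine the synthesized objects with scene information into a scene model.
    scene_model = {"objects": objects, "scene": scene_info}
    # Filter out objects that need not be displayed to get the final rendering set.
    final_rendering_set = [obj for obj in scene_model["objects"] if obj["visible"]]
    # Compile the scene model into a user scheme (here just a dictionary).
    return {"scene_model": scene_model, "final_rendering_set": final_rendering_set}

scheme = editing_pipeline({"pump": "mesh data"}, {"pump-01": {"rpm": 0}}, {"name": "plant"})
```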
Preferably, the converting the format of the acquired resource file includes:
and converting the format of the received resource file through parallel computation.
Preferably, the reading the required resource file from the stored resource file after format conversion according to the user instruction includes:
when the user instruction is a resource reading instruction, reading a required resource file from the stored resource file after format conversion according to the resource reading instruction;
the resource reading instruction carries information of the required resource file.
Preferably, the acquiring, according to a user instruction, device information of each required field device from a device management apparatus connected to a plurality of types of field devices includes:
when the user instruction is a device indication instruction, acquiring the device information of each required field device from the device management apparatus according to the device indication instruction;
and when the user instruction is a control instruction, acquiring updated equipment information from the equipment management device according to the control instruction.
Preferably, the device indication instruction carries a device list of the required field devices;
the control command also carries control logic and a list of field devices to be controlled.
Preferably, the acquiring updated device information from the device management apparatus according to the control instruction includes:
sending a device control instruction to a device management device according to the control logic carried in the control instruction and a field device list to be controlled;
and the equipment management device performs control operation on each field equipment to be controlled according to the equipment control instruction and returns updated equipment information.
Preferably, before sending the device control instruction to the device management apparatus, the method further includes:
the adjustment logic, the framework logic, and the control logic in the received user instruction are stored.
Preferably, the synthesizing of the acquired device information and the format-converted resource file into a required object according to the user instruction includes:
when the user instruction is an object synthesis instruction, acquiring an adjusting logic, a framework logic and a control logic according to the object synthesis instruction;
and synthesizing the acquired equipment information and the resource file after format conversion into a required object according to the acquired adjusting logic, the frame logic and the control logic.
Preferably, the adjustment logic is logic for changing the attribute and state of the object;
the frame logic is used for providing a framework for the type of object that needs to be created; wherein the framework is a collection of basic attributes describing the object;
and the control logic is used for controlling and operating each field device to be controlled.
Preferably, the combining the combined object and scene information into the desired scene model according to the user instruction includes:
when the user instruction is a scene combination instruction, combining the synthesized object and scene information into a required scene model according to the scene combination instruction;
the scene combination instruction carries scene information.
Preferably, the compiling the scene model into the user scheme and outputting according to the user instruction includes:
when the user instruction is a compiling instruction, compiling the scene model into a user scheme according to the compiling instruction;
and outputting the compiled user scheme.
As can be seen from the above technical solutions, the device management device in the present invention can be connected not only to one particular field device but to all of the required field devices; it can obtain the various parameters of each connected field device according to user instructions, and can send corresponding control instructions to the connected field devices for corresponding control. A user can therefore establish various scene models and form various user schemes according to the needs of the practical application, so that a plurality of user schemes, each applicable to a different practical scenario, can be created and output with the above visual three-dimensional display control editing system alone. This integrates three-dimensional display with device control technology and reduces the difficulty of developing user schemes based on three-dimensional display control technology.
Drawings
Fig. 1 is a schematic structural diagram of a visualized three-dimensional display control editing system in an embodiment of the present invention.
Fig. 2 is a flowchart of a method for controlling editing of a visual three-dimensional display according to an embodiment of the present invention.
Detailed Description
In order to make the technical scheme and advantages of the invention more apparent, the invention is further described in detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic structural diagram of a visualized three-dimensional display control editing system in an embodiment of the present invention. As shown in fig. 1, the system includes: an input/output device 11, a visualization editing device 12, a control editing device 13, and a device management device 14.
The input and output device 11 is configured to receive a plurality of user instructions input by a user and send the received user instructions to the visual editing device 12; and is further configured to output the received user scheme;
the visual editing device 12 is configured to obtain a resource file from the outside according to a user instruction and send the obtained resource file to the control editing device 13; it is further configured to send a resource reading instruction, a device indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device 13 according to a user instruction; it is further configured to convert and display the received resource file information, logic information, object information, scene information, rendered objects and rendered final rendering set of the scene model; and it compiles the received scene model into a user scheme and sends the user scheme to the input and output device 11;
the control editing device 13 is configured to perform format conversion on the received resource file and store the resource file, and send resource file information to the visual editing device 12; reading a required resource file from the stored resource file after format conversion according to a resource reading instruction; sending the received device indication instruction to the device management apparatus 14, and receiving device information of the required field device returned by the device management apparatus 14 according to the device indication instruction; sending the logic information carried in the received adjusting instruction, frame instruction and control instruction to the visual editing device 12; sending a device control instruction to the device management apparatus 14 according to the control instruction, and receiving updated device information returned by the device management apparatus 14 according to the device control instruction; synthesizing the received device information, the updated device information and the read resource file after format conversion into a required object according to the adjustment instruction, the frame instruction and the control instruction, rendering the synthesized object, and sending the rendered object to the visual editing device 12; combining the synthesized object and scene information into a required scene model according to a scene combination instruction, and sending the scene model to the visual editing device 12; performing parallel computation on the scene model, filtering out objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, rendering the final rendering set, and sending the rendered final rendering set to the visual editing device 12;
the device management apparatus 14 is connected to the various field devices 100, and is configured to acquire the parameters of the required field devices according to the device indication instruction and send the acquired parameters of the field devices to the control editing apparatus 13 as device information; it is further configured to send control commands to each field device that needs to be controlled according to the device control instruction, thereby correspondingly controlling each such field device, and to send the parameters of the controlled field devices to the control editing apparatus 13 as updated device information.
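As a rough, hedged illustration of how user instructions might flow from the input/output device 11 through the visual editing device 12 and the control editing device 13 to the device management apparatus 14, consider the following Python sketch; all class, method and field names are assumptions introduced only for this example.

```python
class DeviceManagementApparatus:
    def get_device_info(self, device_list):
        # Stand-in for querying the parameters of the required field devices.
        return {device_id: {"status": "idle"} for device_id in device_list}

class ControlEditingApparatus:
    def __init__(self, device_manager):
        self.device_manager = device_manager
        self.resources = {}
    def store_resource(self, name, raw):
        # Stand-in for format conversion before storage.
        self.resources[name] = {"format": "internal", "payload": raw}
    def compose_object(self, resource_name, device_list):
        device_info = self.device_manager.get_device_info(device_list)
        return {"resource": self.resources[resource_name], "devices": device_info}

class VisualEditingApparatus:
    def __init__(self, control_editor):
        self.control_editor = control_editor
    def handle(self, instruction):
        # User instructions arriving from the input/output device are translated
        # into calls on the control editing apparatus.
        if instruction["type"] == "import_resource":
            self.control_editor.store_resource(instruction["name"], instruction["data"])
        elif instruction["type"] == "object_synthesis":
            return self.control_editor.compose_object(
                instruction["resource"], instruction["devices"])

editor = VisualEditingApparatus(ControlEditingApparatus(DeviceManagementApparatus()))
editor.handle({"type": "import_resource", "name": "fan.obj", "data": b"mesh"})
fan_object = editor.handle(
    {"type": "object_synthesis", "resource": "fan.obj", "devices": ["fan-01"]})
```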
Preferably, in an embodiment of the present invention, the input/output device 11 may further include: an input module 111 and an output module 112;
the input module 111 is configured to receive a plurality of user instructions input by a user, and send the received user instructions to the visual editing apparatus 12;
the output module 112 is configured to output the received user scheme.
Preferably, in an embodiment of the present invention, the visual editing apparatus 12 further includes: an editing module 121, a display module 122, and a compiling module 123;
the editing module 121 is configured to receive a user instruction; obtain a resource file from the outside according to the user instruction and send the obtained resource file to the control editing device 13; it is further configured to send a resource reading instruction, a device indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device 13 according to a user instruction; send a compiling instruction to the compiling module 123 according to the user instruction; and convert the received resource file information, logic information, object information, scene information, rendered object, and rendered final rendering set of the scene model and then send them to the display module 122;
the display module 122 is configured to display the received converted resource file information, logic information, object information, scene information, rendered object, and rendered final rendering set of the scene model;
the compiling module 123 is configured to compile the received scene model into a user scenario according to the compiling instruction, and send the user scenario to the input/output device 11.
Preferably, in an embodiment of the present invention, the resource file information includes: resource attribute information and resource image information.
Preferably, in an embodiment of the present invention, the control editing apparatus 13 further includes: a storage module 131, a resource management module 132, a logic management module 133, an object management module 134, a scene management module 135, and a rendering management module 136.
The resource management module 132 is configured to perform format conversion on the received resource file and send the resource file to the storage module 131; reading a required resource file from the storage module 131 according to the resource reading instruction, and sending the read resource file after format conversion to the object management module 134; sending the resource file information to the visual editing device 12;
the logic management module 133 is configured to send the adjustment logic, frame logic and control logic carried in the received adjustment instruction, frame instruction and control instruction to the storage module 131, and to send the logic information of the adjustment logic, frame logic and control logic to the visual editing apparatus 12; it is further configured to read the required adjustment logic, framework logic and control logic from the storage module 131 according to a logic call instruction sent by the object management module 134, and to send the read adjustment logic, framework logic and control logic to the object management module 134; and it may further be configured to send a device control instruction to the device management apparatus 14 according to the control logic carried in the control instruction and the list of field devices to be controlled;
the object management module 134 is configured to send the device indication information carried in the received device indication instruction to the storage module 131, send the device indication instruction to the device management apparatus 14, and receive device information of the field device returned by the device management apparatus 14 according to the device indication instruction; sending a resource reading instruction to the resource management module 132 according to the received resource reading instruction; sending a logic call instruction to the logic management module 133 according to the received object synthesis instruction, receiving the adjustment logic, the framework logic, and the control logic returned by the logic management module 133, receiving updated device information returned by the device management apparatus 14 according to the device control instruction, and synthesizing the received device information, the updated device information, and the resource file after format conversion into a required object according to the adjustment logic, the framework logic, and the control logic; sending the synthesized object to the storage module 131 and the rendering management module 136; reading an object from the storage module 131 according to the received object calling instruction, and sending the read object to the scene management module 135;
the scene management module 135 is configured to receive a scene combination instruction carrying scene information, send an object calling instruction to the object management module 134 according to the scene information in the scene combination instruction, and receive an object returned by the object management module 134 according to the object calling instruction; combining the received object and scene information into a required scene model, and sending the scene model to the rendering management module 136, the storage module 131 and the visual editing apparatus 12;
the rendering management module 136 is configured to render the received object and send the rendered object to the visual editing apparatus 12; it is further configured to perform parallel computation on the received scene model, filter out the objects that do not need to be displayed in the scene model, generate a final rendering set of the scene model, render the final rendering set, and send the rendered final rendering set to the visual editing apparatus 12. The user can therefore inspect the rendered objects and the rendered final rendering set through the visual editing apparatus 12 and confirm whether the objects are the required objects and whether the combined scene model is the required scene model.
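A minimal sketch of how a final rendering set might be produced by evaluating object visibility in parallel and filtering out objects that need not be displayed; the visibility test and the use of a thread pool are illustrative assumptions, not the patent's method.

```python
from concurrent.futures import ThreadPoolExecutor

def is_visible(obj, view_distance=100.0):
    # Placeholder visibility test: keep objects inside the viewing distance.
    return obj["distance"] <= view_distance

def final_rendering_set(scene_objects):
    # Evaluate visibility of all objects concurrently, then keep the visible ones.
    with ThreadPoolExecutor() as pool:
        flags = list(pool.map(is_visible, scene_objects))
    return [obj for obj, keep in zip(scene_objects, flags) if keep]

scene = [{"name": "fan", "distance": 12.0}, {"name": "silo", "distance": 480.0}]
print(final_rendering_set(scene))   # only the fan remains in the rendering set
```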
The storage module 131 is configured to store the received resource file, the adjustment logic, the framework logic, the control logic, the device indication information, the object, and the scene model after being subjected to the format conversion.
Preferably, in an embodiment of the present invention, the editing module 121 may further send the received user instruction to the display module 122, so that the user can view the user instruction received by the editing module 121 from the display module 122.
Preferably, in an embodiment of the present invention, the compiling module 123 may further send the compiled user schema to the display module 122, so that the user can view the compiled user schema from the display module 122.
Since the object management module 134 has sent the device indication information carried in the received device indication instruction to the storage module 131 for storage, in an embodiment of the present invention, the object management module 134 may further read the stored device indication information from the storage module 131 for subsequent operation of synthesizing the object.
Preferably, in an embodiment of the present invention, the object management module 134 is further configured to send the object information of the synthesized object to the editing module 121 in the visual editing apparatus 12. The editing module 121 may convert the received object information and then send the converted object information to the display module 122 for display. Wherein the object information includes: various attribute information of the object. Such as the size, dimension, color, shape, weight, speed, etc. of the object.
Preferably, in an embodiment of the present invention, the object management module 134 is further configured to store object information of the synthesized object in the storage module 131; and the storage module 131 is further configured to read corresponding object information according to an object information reading instruction sent by the editing module 121, and send the read object information to the editing module 121.
Preferably, in an embodiment of the present invention, the scene management module 135 is further configured to store the scene information carried in the scene combination instruction in the storage module 131, and read the corresponding scene information from the storage module 131 according to the scene information reading instruction sent by the editing module 121, and send the read scene information to the editing module 121.
Preferably, in an embodiment of the present invention, the editing module 121 is further configured to send a rendering configuration instruction carrying rendering parameters to the rendering management module 136;
the rendering management module 136 is configured to configure rendering parameters according to the received rendering configuration instruction.
Preferably, in an embodiment of the present invention, each rendering parameter in the rendering management module 136 is set with a default value. Thus, when no rendering configuration instructions are received, the rendering management module 136 will use default values for the various rendering parameters.
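One simple way to model rendering parameters that have default values and are selectively overridden by a rendering configuration instruction is sketched below; the parameter names (resolution, antialiasing, shadows) are assumptions, not parameters named by the patent.

```python
from dataclasses import dataclass, replace

@dataclass
class RenderingParameters:
    resolution: tuple = (1920, 1080)   # defaults used when no configuration arrives
    antialiasing: int = 4
    shadows: bool = True

def apply_configuration(params, configuration_instruction):
    # Only the fields carried in the instruction are overridden;
    # everything else keeps its default value.
    return replace(params, **configuration_instruction)

params = RenderingParameters()
params = apply_configuration(params, {"antialiasing": 8})
```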
Preferably, in an embodiment of the present invention, the input/output device 11 may include: a keyboard, a mouse, a joystick, a microphone, a touch screen, a writing pad, data gloves, output ports, and the like.
The input module 111 in the input/output device 11 may be an input device such as a keyboard, a mouse, a joystick, a microphone, a touch screen, a writing pad and/or data gloves; the output module 112 in the input/output device may be an output device such as an output port (e.g., a USB interface, a network output interface, and/or a wireless output interface).
Preferably, in an embodiment of the present invention, the display module 122 in the visual editing apparatus 12 may be: a display, or the like.
Preferably, in embodiments of the present invention, the field devices may include various electromechanical devices and various controllers.
For example, the electromechanical device may be: fans, valves and/or water pumps, etc.;
the controller may be: programmable Logic Controller (PLC), remote terminal control system (RTU), and the like.
Preferably, in an embodiment of the present invention, the adjusting logic is: logic for changing the attributes and state of the object; the framework logic is: logic for providing a framework for the type of object that needs to be created, wherein the framework is a collection of basic attributes that describe the object. The control logic is configured to send a device control instruction to the device management apparatus 14, so as to perform a control operation on each field device to be controlled.
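The following hedged sketch illustrates the three kinds of logic as simple data and callables; this representation is an assumption chosen only to mirror the definitions above, not the patent's internal form.

```python
frame_logic = {                     # framework: the set of basic attributes describing an object type
    "type": "valve",
    "attributes": ["size", "color", "state"],
}

def adjustment_logic(obj, **changes):
    # Adjustment logic changes an object's attributes and state.
    obj.update(changes)
    return obj

def control_logic(device_id):
    # Control logic yields a device control instruction for a field device.
    return {"device": device_id, "command": "open"}

valve = {attr: None for attr in frame_logic["attributes"]}   # object created from the framework
valve = adjustment_logic(valve, color="red", state="closed")
instruction = control_logic("valve-07")
```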
Preferably, in an embodiment of the present invention, the resource file includes: model resources, material resources, chartlet resources, video resources, audio resources, Graphical User Interface (GUI) resources, animation resources, and special effects resources.
Preferably, in an embodiment of the present invention, the device management apparatus 14 may further include: a device management primary sub-module and a plurality of device management secondary sub-modules (not shown in fig. 1). The device management primary sub-module is connected with each device management secondary sub-module, and each device management secondary sub-module is in turn connected with a plurality of field devices 100, so that a layered structure is formed. The device management primary sub-module constitutes a primary device management layer, and the plurality of device management secondary sub-modules constitute a secondary device management layer.
Preferably, in an embodiment of the present invention, the layered structure in the device management apparatus 14 may be a layered structure composed of the above two device management layers, or a layered structure composed of multiple device management layers. The layered structure composed of multiple device management layers is similar to the layered structure composed of two device management layers described above, and thus will not be described in detail herein.
The device management apparatus 14 can perform hierarchical management on each field device connected to the device management apparatus 14 through the above-described layered structure, so as to realize remote control on each field device.
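A hedged sketch of the layered device management structure described above, in which a primary sub-module routes requests down to secondary sub-modules, each connected to its own field devices; the class names and the area-based routing key are illustrative assumptions.

```python
class SecondaryDeviceManager:
    def __init__(self, field_devices):
        self.field_devices = field_devices          # field devices connected to this sub-module
    def read_parameters(self, device_id):
        return {"device": device_id, "parameters": self.field_devices[device_id]}

class PrimaryDeviceManager:
    def __init__(self, secondaries):
        self.secondaries = secondaries              # secondary sub-modules, keyed by area
    def read_parameters(self, area, device_id):
        # Requests are routed down the hierarchy to the responsible secondary sub-module.
        return self.secondaries[area].read_parameters(device_id)

manager = PrimaryDeviceManager({
    "pump_house": SecondaryDeviceManager({"pump-01": {"rpm": 1450}}),
    "ventilation": SecondaryDeviceManager({"fan-03": {"speed": 0.6}}),
})
print(manager.read_parameters("pump_house", "pump-01"))
```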
Through the above visual three-dimensional display control editing system, a user can create various user schemes with the control editing device, then perform operations such as visual configuration, visual editing and visual debugging on the created user schemes, and finally generate and output the packaged user schemes.
In the technical solution of the present invention, the output user scheme may be an executable file. After the user scheme is executed, corresponding functional modules can be automatically generated, and the user can perform corresponding operations according to the generated functional modules, so that the controllable equipment in the actual scene corresponding to the scene model included in the user scheme can be actually controlled.
Therefore, a user can establish various scene models and form various user schemes according to the needs of the practical application, so that a plurality of user schemes, each applicable to a different practical scenario, can be created and output using only this visual three-dimensional display control editing system. This integrates three-dimensional display with device control technology, reduces the difficulty of developing user schemes based on three-dimensional display control technology, improves their development and operating efficiency, and reduces the user's development cost.
In addition to the above visual three-dimensional display control editing system, the present invention further provides a corresponding visual three-dimensional display control editing method; please refer to fig. 2.
Fig. 2 is a flowchart of a method for controlling editing of a visual three-dimensional display according to an embodiment of the present invention.
As shown in fig. 2, the method for controlling and editing a visual three-dimensional display according to an embodiment of the present invention includes the following steps:
step 201, a plurality of user instructions input by a user are received.
In this step, the user can input various user instructions through the input/output device 11 (for example, through its input module 111) of the visual three-dimensional display control editing system according to the present invention, thereby performing the various operations the user requires. The visual three-dimensional display control editing system of the present invention can therefore receive, through the input/output device 11, the various user instructions input by the user.
Preferably, in an embodiment of the present invention, the input module 111 in the input/output device 11 can receive a user instruction input by a user, and send the received user instruction to the editing module 121 in the visual editing apparatus 12.
Preferably, in an embodiment of the present invention, the user command may include, but is not limited to: resource reading instructions, device indication instructions, adjustment instructions, framework instructions, control instructions, object composition instructions, scene composition instructions, and compilation instructions.
Accordingly, the editing module 121 may send a resource reading instruction, a device indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction, a scene combination instruction, and a compiling instruction to the control editing apparatus 13 according to the received user instruction.
Preferably, in an embodiment of the present invention, the editing module 121 may further send the received user instruction to the display module 122, so that the user can view the user instruction received by the editing module 121 from the display module 122.
Step 202, acquiring a resource file from the outside according to a user instruction, and performing format conversion and storage on the acquired resource file.
Before a required user scheme can be created, the various resource files needed for creating it must first be acquired from the outside. Therefore, in this step, the resource file is obtained from the outside according to a user instruction, its format is converted, and the format-converted resource file is stored.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the above-mentioned visual three-dimensional display control editing system may obtain a resource file from the outside according to the received user instruction, and then send the obtained resource file to the resource management module 132 in the control editing apparatus 13; the resource management module 132 performs format conversion on the received resource file and stores the resource file in the storage module 131 in the control editing apparatus 13.
Preferably, in this step, in an embodiment of the present invention, the resource management module 132 may further send the resource file information to the editing module 121 in the visual editing apparatus 12, and the editing module 121 converts the received resource file information and sends the converted resource file information to the display module 122, so that the resource file information can be displayed to the user through the display module 122, which is convenient for the user to view and perform subsequent operations.
The resource file information includes the resource attribute information and resource image information of the acquired resource file, as well as the resource attribute information and resource image information of the format-converted resource file.
Preferably, in an embodiment of the present invention, the resource management module 132 performs format conversion on the received resource file through parallel computing.
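Format conversion of resource files in parallel might look like the following minimal sketch, which maps a placeholder conversion function over the resource files with a process pool; the "conversion" here is a stand-in for real model or texture decoding, and the names are assumptions.

```python
from concurrent.futures import ProcessPoolExecutor

def convert_resource(item):
    # Placeholder for converting one external resource file into the internal format.
    name, raw = item
    return name, {"format": "internal", "payload": raw}

def convert_all(resource_files):
    # Convert all resource files concurrently across CPU cores.
    with ProcessPoolExecutor() as pool:
        return dict(pool.map(convert_resource, resource_files.items()))

if __name__ == "__main__":
    converted = convert_all({"fan.obj": b"...", "pump.fbx": b"..."})
```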
Step 203, reading the required resource file from the stored resource file after format conversion according to the user instruction.
After the resource file is acquired from the outside, the required resource file can be read from the stored resource file after format conversion according to the user instruction input by the user, so as to be used for the subsequent creation and completion of the user scheme.
Preferably, in an embodiment of the present invention, the implementation manner of step 203 may be:
and when the user instruction is a resource reading instruction, reading the required resource file from the stored resource file after format conversion according to the resource reading instruction.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the visual three-dimensional display control editing system may send a resource reading instruction to the object management module 134 in the control editing apparatus 13 according to a received user instruction, where the object management module 134 sends the resource reading instruction to the resource management module 132; the resource management module 132 reads the required resource file after format conversion from the storage module 131 according to the resource reading instruction, and sends the read resource file after format conversion to the object management module 134, so that the object management module 134 synthesizes the required object in the subsequent steps.
Preferably, in an embodiment of the present invention, the resource reading instruction carries information of the required resource file, so that the required resource file can be read from the stored resource file after format conversion according to the information of the required resource file carried in the resource reading instruction.
Because the stored, format-converted resource files differ from one another in type, nature, content and the like, in a preferred embodiment of the present invention a unique identifier may be set for each stored, format-converted resource file in order to distinguish the resource files. Accordingly, in a preferred embodiment of the present invention, the information of the required resource file is the identifier of the required resource file.
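A minimal sketch of a store that assigns each format-converted resource file a unique identifier and serves reads keyed by the identifier carried in a resource reading instruction; the use of UUIDs and the field names are assumptions made only for this illustration.

```python
import uuid

class ResourceStore:
    def __init__(self):
        self._files = {}
    def store(self, converted_file):
        resource_id = str(uuid.uuid4())          # unique identifier for the stored file
        self._files[resource_id] = converted_file
        return resource_id
    def read(self, read_instruction):
        # The resource reading instruction carries the identifier of the required file.
        return self._files[read_instruction["resource_id"]]

store = ResourceStore()
rid = store.store({"name": "conveyor.obj", "format": "internal"})
resource = store.read({"type": "resource_read", "resource_id": rid})
```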
Step 204, acquiring the required device information of each field device from the device management apparatus connected with the various field devices, according to the user instruction.
Since various field devices generally need to be designed for and controlled within a user scheme, in this step the device information of each required field device can be acquired from the device management apparatus according to a user instruction. The device management apparatus can be connected with various field devices and can therefore obtain the device information of all kinds of field devices.
Preferably, in an embodiment of the present invention, the implementation manner of the step 204 may be:
when the user instruction is a device indication instruction, acquiring the device information of each required field device from the device management apparatus according to the device indication instruction;
and when the user instruction is a control instruction, acquiring updated equipment information from the equipment management device according to the control instruction.
Preferably, in an embodiment of the present invention, the device indication instruction carries device indication information, where the device indication information may be: a device list of required field devices. The device list records information of each required field device (for example, identification information for identifying each required field device), so that the device information of each required field device can be acquired from the device management apparatus according to the device indication information of the required field device carried in the device indication instruction.
Preferably, in an embodiment of the present invention, the control instruction further carries control logic and a list of field devices to be controlled. The field device list to be controlled records information of each field device to be controlled (for example, identification information for identifying each field device to be controlled). Therefore, the corresponding control operation can be performed on each field device to be controlled according to the field device list to be controlled and the control logic carried in the control instruction.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 of the visual three-dimensional display control editing system may send a device indication instruction to the object management module 134 in the control editing apparatus 13 according to a user instruction; the object management module 134 sends the device indication information carried in the received device indication instruction to the storage module 131, also sends the received device indication instruction to the device management apparatus 14, and receives the device information of the field devices returned by the device management apparatus 14 according to the device indication instruction.
In addition, the logic management module 133 in the control editing apparatus 13 in the above-mentioned visual three-dimensional display control editing system can send a device control instruction to the device management apparatus 14 according to the control logic carried in the control instruction and the field device list to be controlled; the device management apparatus 14 can perform control operation on each field device to be controlled according to the device control instruction, and return updated device information to the object management module 134.
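As a rough illustration, the payloads of a device indication instruction and a control instruction, and the derivation of per-device control instructions from the latter, might be modelled as below; all field names are assumptions chosen only to mirror the text.

```python
device_indication_instruction = {
    "type": "device_indication",
    "device_list": ["fan-01", "valve-07"],       # identifiers of the required field devices
}

control_instruction = {
    "type": "control",
    "control_logic": {"command": "open", "target_state": 1.0},
    "device_list": ["valve-07"],                 # field devices to be controlled
}

def to_device_control_instructions(control_instruction):
    # One device control instruction is derived per field device in the list,
    # according to the control logic carried in the control instruction.
    return [{"device": dev, **control_instruction["control_logic"]}
            for dev in control_instruction["device_list"]]

print(to_device_control_instructions(control_instruction))
```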
Preferably, in an embodiment of the present invention, before the logic management module 133 sends the device control instruction to the device management apparatus 14 according to the control instruction, the method for controlling and editing a visual three-dimensional display may further include:
step 204a, storing the adjustment logic, the frame logic and the control logic in the received user instruction.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the visual three-dimensional display control editing system may send an adjusting instruction, a frame instruction, and a control instruction, which respectively carry an adjusting logic, a frame logic, and a control logic, to the logic management module 133 in the control editing apparatus 13 according to a user instruction; the logic management module 133 may send the adjusting logic, the frame logic, and the control logic carried in the received adjusting instruction, the frame instruction, and the control instruction to the storage module 131, and store the adjusting logic, the frame logic, and the control logic in the storage module 131.
In addition, in a preferred embodiment of the present invention, the logic management module 133 may further send the logic information of the adjustment logic, the framework logic and the control logic to the editing module 121 in the visual editing apparatus 12, and the editing module 121 converts the received logic information and sends the converted logic information to the display module 122, so that the logic information may be displayed to the user, and the user may conveniently view and perform subsequent operations.
In the technical solution of the present invention, the execution sequence between step 203 and step 204 is not limited. For example, in the preferred embodiment of the present invention, step 203 may be performed first and then step 204 may be performed, step 204 may be performed first and then step 203 may be performed, or step 203 and step 204 may be performed simultaneously. The specific execution sequence may be preset according to the actual application condition, and is not described herein again.
Step 205, synthesizing the acquired device information and the resource file after format conversion into a required object according to a user instruction.
Since the required device information and the corresponding resource file have been acquired in steps 203 and 204, respectively, in this step, the acquired device information and the format-converted resource file can be combined into a required object according to a user instruction.
Preferably, in an embodiment of the present invention, the implementation manner of the step 205 may be:
When the user instruction is an object synthesis instruction, acquiring adjustment logic, framework logic and control logic according to the object synthesis instruction, and synthesizing the acquired device information and the format-converted resource file into the required object according to the acquired adjustment logic, framework logic and control logic.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the visual three-dimensional display control editing system may send an object composition instruction to the object management module 134 in the control editing apparatus 13 according to a user instruction; the object management module 134 will send a logic call instruction to the logic management module 133 in the control editing apparatus 13 according to the received object composition instruction; the logic management module 133 reads the required adjustment logic, frame logic and control logic from the storage module 131 according to the logic call instruction, and sends the read adjustment logic, frame logic and control logic to the object management module 134; then, the object management module 134 synthesizes the received device information, the updated device information, and the resource file after format conversion into a required object according to the adjustment logic, the framework logic, and the control logic.
Preferably, in an embodiment of the present invention, the adjustment logic is logic for changing the attributes and state of the object; the framework logic is logic for providing a framework for the type of object that needs to be created, wherein the framework is a collection of basic attributes describing the object; and the control logic is used for sending a device control instruction to the device management apparatus 14 so as to perform a control operation on each field device to be controlled.
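As one concrete and purely illustrative reading of these three kinds of logic, the sketch below synthesizes an object from device information and a format-converted resource file: the framework logic supplies the basic attribute set, the adjustment logic mutates attributes and state, and the control logic is kept as a callable used later to issue device control instructions. None of these names, structures or example attribute values appear in the patent; they are assumptions.

```python
# Hypothetical sketch of object synthesis (step 205). Framework logic supplies
# the basic attribute set, adjustment logic changes attributes/state, and
# control logic is retained for later device control. Names are illustrative.
def framework_logic(object_type: str) -> dict:
    """Return the collection of basic attributes describing the object type."""
    frameworks = {"valve": {"size": 1.0, "color": "grey", "state": "closed"}}
    return dict(frameworks.get(object_type, {}))


def adjustment_logic(obj: dict) -> dict:
    """Change the attributes and state of the object."""
    obj["color"] = "red"
    obj["state"] = "open"
    return obj


def control_logic(device_id: str) -> dict:
    """Stand-in for issuing a device control instruction to apparatus 14."""
    return {"device": device_id, "command": "open"}


def synthesize_object(device_info: dict, resource_file: dict) -> dict:
    obj = framework_logic(resource_file["object_type"])
    obj.update({"device_info": device_info, "mesh": resource_file["mesh"]})
    obj = adjustment_logic(obj)
    obj["controller"] = control_logic(device_info["device_id"])
    return obj


obj = synthesize_object({"device_id": "valve_07", "pressure": 2.1},
                        {"object_type": "valve", "mesh": "valve.obj"})
print(obj["state"], obj["controller"])   # open {'device': 'valve_07', 'command': 'open'}
```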
In addition, in the preferred embodiment of the present invention, the object management module 134 may send the synthesized object to the scene management module 135 after synthesizing the object, so as to facilitate the subsequent combination of the scene models. Meanwhile, the object management module 134 may also send the synthesized object to the storage module 131 for storage, so as to facilitate subsequent invocation. In addition, the object management module 134 may further send the synthesized object to the rendering management module 136, the rendering management module 136 renders the received object and sends the rendered object to the editing module 121 in the visual editing apparatus 12, and the editing module 121 converts the received rendered object and sends the converted rendered object to the display module 122, so that the user may view the synthesized object from the display module 122.
Preferably, in an embodiment of the present invention, the object management module 134 may further send the object information of the synthesized object to the editing module 121 in the visual editing apparatus 12. The editing module 121 may convert the received object information and then send the converted object information to the display module 122 for display.
Wherein the object information includes various attribute information of the object, such as its size, dimensions, color, shape, weight, and speed.
Preferably, in an embodiment of the present invention, the object management module 134 may further store the object information of the synthesized object in the storage module 131, and may read the corresponding object information from the storage module 131 according to the object information reading instruction sent by the editing module 121, and send the read object information to the editing module 121.
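The object-information round trip described above can be sketched as follows; again, the keyed dictionary used as the storage module and the method names are assumptions for illustration only.

```python
# Hypothetical sketch of storing and reading back object information (step 205).
class ObjectManagementModule:
    def __init__(self, storage: dict):
        self.storage = storage                     # stands in for storage module 131

    def store_object_info(self, object_id: str, info: dict) -> None:
        self.storage[f"object_info:{object_id}"] = info

    def read_object_info(self, read_instruction: dict) -> dict:
        """Serve an object-information reading instruction from the editing module."""
        return self.storage.get(f"object_info:{read_instruction['object_id']}", {})


store = {}
mgr = ObjectManagementModule(store)
mgr.store_object_info("valve_07", {"size": 1.0, "color": "red", "weight": 3.5})
print(mgr.read_object_info({"object_id": "valve_07"}))
```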
Step 206, combining the synthesized object and scene information into a required scene model according to a user instruction.
Since the desired object has been synthesized in step 205, in this step, the synthesized object and scene information can be combined into a desired scene model according to a user instruction.
Preferably, in an embodiment of the present invention, the implementation manner of the step 206 may be:
When the user instruction is a scene combination instruction, combining the synthesized object and scene information into the required scene model according to the scene combination instruction.
Preferably, in an embodiment of the present invention, the scene combination instruction carries scene information. Since the scene information includes the scene attribute information (e.g., information such as position, size, connection relationship, etc.) of each object and each field device and the frame information of the scene itself (e.g., information such as type, size, structure, etc. of the scene), the synthesized object and scene information can be combined into a required scene model according to the scene information carried in the scene combination instruction.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the visual three-dimensional display control editing system may send a scene combination instruction carrying scene information to the scene management module 135 in the control editing apparatus 13 according to a user instruction; the scene management module 135 sends an object calling instruction to the object management module 134 according to the scene information in the scene combination instruction, and the object management module 134 reads an object from the storage module 131 according to the object calling instruction and sends the read object to the scene management module 135; the scene management module 135 combines the received object and scene information into a desired scene model.
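The sketch below gives one hypothetical shape for the scene combination of step 206: the scene information carries per-object placement attributes (position, size, connection relationships) and the scene's own frame information (type, size, structure), and the scene management module joins them with the objects called back from the object management module. The structures and names shown are assumptions, not the patent's specification.

```python
# Hypothetical sketch of combining synthesized objects and scene information
# into a scene model (step 206). All structures are illustrative assumptions.
def combine_scene(objects: dict, scene_info: dict) -> dict:
    """objects: object_id -> synthesized object.
    scene_info: frame info of the scene plus placement info per object."""
    scene_model = {
        "frame": scene_info["frame"],             # type, size, structure of the scene
        "nodes": [],
    }
    for object_id, placement in scene_info["placements"].items():
        scene_model["nodes"].append({
            "object": objects[object_id],
            "position": placement["position"],
            "size": placement.get("size", 1.0),
            "connections": placement.get("connections", []),
        })
    return scene_model


scene_info = {
    "frame": {"type": "workshop", "size": (50, 30, 10), "structure": "single_floor"},
    "placements": {"valve_07": {"position": (1.0, 2.0, 0.0), "connections": ["pipe_03"]}},
}
model = combine_scene({"valve_07": {"mesh": "valve.obj"}}, scene_info)
print(len(model["nodes"]), model["frame"]["type"])   # 1 workshop
```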
Preferably, after the desired scene model is combined, the scene management module 135 may further send the scene model to the rendering management module 136 to perform the subsequent step 207. In addition, the scene management module 135 may also send the scene model to the storage module 131 for storage.
Preferably, in an embodiment of the present invention, the scene management module 135 may further store the scene information carried in the scene combination instruction in the storage module 131, and may read the corresponding scene information from the storage module 131 according to the scene information reading instruction sent by the editing module 121, and send the read scene information to the editing module 121.
Step 207, performing parallel computation on the scene model, filtering out objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, and rendering the final rendering set.
In this step, after the scene model is generated, the scene model may be subjected to parallel computation, a final rendering set of the scene model is generated after objects that are not required to be displayed in the scene model are filtered, and the final rendering set is rendered.
For example, in a preferred embodiment of the present invention, the rendering management module 136 in the control editing apparatus 13 in the above-mentioned visual three-dimensional display control editing system may perform parallel computation on the scene model, filter out objects that are not required to be displayed in the scene model, generate a final rendering set of the scene model, render the final rendering set, and send the final rendering set to the editing module 121 in the visual editing apparatus 12, where the editing module 121 converts the received final rendering set and sends the converted final rendering set to the display module 122, so that a user may view whether the combined scene model is a required scene model from the display module 122.
Preferably, in an embodiment of the present invention, each rendering parameter in the rendering management module 136 is set with a default value. Therefore, the editing module 121 may also send a rendering configuration instruction carrying rendering parameters to the rendering management module 136 in advance; the rendering management module 136 configures rendering parameters according to the received rendering configuration instructions. In addition, when the rendering management module 136 does not receive the rendering configuration instruction, the rendering management module 136 will perform rendering using default values of the rendering parameters.
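Step 207 can be read as a visibility-filtering pass executed in parallel over the scene model, followed by rendering of the surviving objects with either configured or default rendering parameters. The sketch below uses a thread pool for the parallel filter and a simple distance-to-camera test as the visibility criterion; both choices, and all parameter names and default values, are assumptions rather than the patent's method.

```python
# Hypothetical sketch of step 207: filter out objects that need not be displayed
# (here, by a simple distance test) in parallel, build the final rendering set,
# and render it with default or configured rendering parameters.
from concurrent.futures import ThreadPoolExecutor
from math import dist

DEFAULT_RENDER_PARAMS = {"resolution": (1920, 1080), "shadows": True, "lod": "high"}


def visible(node, camera_pos, max_distance=100.0) -> bool:
    """Visibility criterion (an assumption): keep nodes within range of the camera."""
    return dist(node["position"], camera_pos) <= max_distance


def build_final_render_set(scene_model, camera_pos):
    nodes = scene_model["nodes"]
    with ThreadPoolExecutor() as pool:                 # parallel visibility computation
        flags = list(pool.map(lambda n: visible(n, camera_pos), nodes))
    return [n for n, keep in zip(nodes, flags) if keep]


def render(render_set, render_params=None):
    # Default values are used unless a rendering configuration instruction arrives.
    params = {**DEFAULT_RENDER_PARAMS, **(render_params or {})}
    return f"rendered {len(render_set)} objects at {params['resolution']}"


scene_model = {"nodes": [{"position": (1, 2, 0)}, {"position": (500, 0, 0)}]}
final_set = build_final_render_set(scene_model, camera_pos=(0, 0, 0))
print(render(final_set))   # rendered 1 objects at (1920, 1080)
```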
Step 208, compiling the scene model into a user scheme according to the user instruction and outputting the user scheme.
Since the scene model has been combined in step 206 and the user can check whether the combined scene model is the desired scene model through the visual editing apparatus after step 207, when the user confirms that the combined scene model is the desired scene model, the user can send a user instruction to output the corresponding user scheme. Therefore, in this step, the scene model may be compiled into a user scheme and output according to a user instruction.
Preferably, in an embodiment of the present invention, the implementation manner of the step 208 may be:
When the user instruction is a compiling instruction, compiling the scene model into a user scheme according to the compiling instruction; and outputting the compiled user scheme.
For example, in a preferred embodiment of the present invention, the editing module 121 in the visual editing apparatus 12 in the above-mentioned visual three-dimensional display control editing system can receive a user instruction input by a user through the input module 111 in the input/output device 11, and send a compiling instruction to the compiling module 123 according to the user instruction; the compiling module 123 compiles the received scene model into a user scheme according to the compiling instruction, and sends the user scheme to the output module 112 in the input/output device 11. The output module 112 may then output the user scheme.
Preferably, in an embodiment of the present invention, before the output module 112 outputs the user scheme, the compiling module 123 may further send the compiled user scheme to the display module 122, so that the user can view the compiled user scheme from the display module 122.
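Finally, the compile-and-output path of step 208 might look like the following sketch: the compiling module serializes the confirmed scene model into a self-contained user scheme that the output module writes out. The JSON packaging, file naming and the "entry point" field shown here are hypothetical choices for illustration and are not specified by the patent.

```python
# Hypothetical sketch of step 208: compile the scene model into a user scheme
# and output it. JSON serialization is an illustrative choice, not the patent's.
import json
from pathlib import Path


def compile_user_scheme(scene_model: dict, scheme_name: str) -> dict:
    return {
        "name": scheme_name,
        "scene_model": scene_model,
        "entry_point": "run_scene",       # hypothetical hook used when the scheme is executed
    }


def output_user_scheme(scheme: dict, directory: Path) -> Path:
    path = directory / f"{scheme['name']}.scheme.json"
    path.write_text(json.dumps(scheme, indent=2))
    return path


scheme = compile_user_scheme({"frame": {"type": "workshop"}, "nodes": []}, "workshop_demo")
print(output_user_scheme(scheme, Path(".")))       # workshop_demo.scheme.json
```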
Through the above visual three-dimensional display control editing system and method, a user can create various user schemes through the control editing device, then perform operations such as visual configuration, visual editing and visual debugging on the created user schemes, and finally generate and output the packaged user schemes.
Preferably, in an embodiment of the present invention, the output user scheme may be compiled into an executable file. Therefore, after the desired user scheme is obtained with the above-mentioned visual three-dimensional display control editing system, the user scheme can be copied to a platform of the desired application (e.g., a personal computer, an automatic industrial control device, etc.) in various manners (e.g., wired transmission, wireless transmission, copying through various storage devices, etc.). After the user scheme is executed, corresponding function modules are automatically generated, and a user executing the user scheme can perform corresponding operations according to the generated function modules, so that actual control can be performed on the actual scene corresponding to the scene model included in the user scheme.
In the embodiment of the present invention, the device management apparatus can be connected to a plurality of required field devices (either directly or through various indirect connections) rather than to only one field device; it can acquire various parameters of each connected field device according to user instructions, and can also send corresponding control instructions to the connected field devices for corresponding control. A user can therefore establish various scene models and form various user schemes according to the requirements of practical applications, and a plurality of user schemes respectively applicable to different practical scenes can be created and output by using the visual three-dimensional display control editing system. This realizes the integration of three-dimensional display and device control technologies, reduces the development difficulty of user schemes based on three-dimensional display control technology, facilitates the modification and maintenance of the generated user schemes, improves the development efficiency and operation efficiency of such user schemes, and reduces the development cost for users. In addition, the visual three-dimensional display control editing system has a wide application range, and can be applied to many fields such as Internet of Things applications, industrial control and simulation, military and aerospace, urban planning, stage performance, games, emergency deduction, education and teaching, and virtual reality.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (19)

1. A visual three-dimensional display control editing system, characterized in that the system comprises: an input/output device, a visual editing device, a control editing device and an equipment management device;
the input and output device is used for receiving a plurality of user instructions input by a user and sending the received user instructions to the visual editing device; and further for outputting the received user scheme;
the visual editing device is used for acquiring a resource file from the outside according to a user instruction and sending the acquired resource file to the control editing device; and is also used for sending a resource reading instruction, an equipment indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device according to a user instruction; and is also used for converting and displaying the received resource file information, logic information, object information, scene information, rendered objects and rendered final rendering set of the scene model; and for compiling the received scene model into a user scheme and sending the user scheme to the input and output device;
the control editing device is used for converting the format of the received resource file and then storing the resource file, and sending the resource file information to the visual editing device; reading a required resource file from the stored resource file after format conversion according to a resource reading instruction; sending the received equipment instruction to the equipment management device, and receiving equipment information of the required field equipment, which is returned by the equipment management device according to the equipment instruction; sending the logic information carried in the received adjusting instruction, the frame instruction and the control instruction to the visual editing device; sending a device control instruction to the device management apparatus according to the control instruction, and receiving updated device information returned by the device management apparatus according to the device control instruction; synthesizing the received equipment information, the updated equipment information and the read resource file subjected to format conversion into a required object according to the adjusting instruction, the frame instruction and the control instruction, rendering the synthesized object and then sending the rendered object to the visual editing device; combining the synthesized object and scene information into a required scene model according to a scene combination instruction, and sending the scene model to a visual editing device; performing parallel computation on the received scene model, filtering objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, rendering the final rendering set, and sending the rendered final rendering set to a visual editing device;
the device management device is connected with various field devices and is used for acquiring parameters of the required field devices according to the device indication instruction and sending the acquired parameters of the field devices to the control editing device as device information; and is also used for sending a control instruction to each field device needing to be controlled according to the device control instruction, so as to correspondingly control each field device needing to be controlled, and for sending the parameters of the controlled field devices to the control editing device as updated device information.
2. The system of claim 1, wherein the input-output device comprises: an input module and an output module; wherein,
the input module is used for receiving a plurality of user instructions input by a user and sending the received user instructions to the visual editing device;
and the output module is used for outputting the received user scheme.
3. The system according to claim 1 or 2, wherein the visual editing apparatus comprises: the system comprises an editing module, a display module and a compiling module; wherein,
the editing module is used for receiving a user instruction; acquiring a resource file from the outside according to a user instruction, and sending the acquired resource file to the control editing device; and is also used for sending a resource reading instruction, an equipment indication instruction, an adjustment instruction, a frame instruction, a control instruction, an object synthesis instruction and a scene combination instruction to the control editing device according to a user instruction; sending a compiling instruction to the compiling module according to the user instruction; and converting the received resource file information, logic information, object information, scene information, rendered objects and rendered final rendering set of the scene model and then sending them to the display module;
the display module is used for displaying the received resource file information, the logic information, the object information, the scene information, the rendered object and the rendered final rendering set of the scene model after conversion;
and the compiling module is used for compiling the received scene model into a user scheme according to the compiling instruction and sending the user scheme to the input and output device.
4. The system of claim 3,
the editing module is also used for sending the received user instruction to the display module;
the compiling module is also used for sending the compiled user scheme to the display module.
5. The system according to claim 3, wherein the control editing means comprises: the system comprises a storage module, a resource management module, a logic management module, an object management module, a scene management module and a rendering management module; wherein,
the resource management module is used for converting the format of the received resource file and then sending the resource file to the storage module; reading a required resource file from a storage module according to a resource reading instruction, and sending the read resource file subjected to format conversion to the object management module; sending the resource file information to the visual editing device;
the logic management module is used for sending the adjusting logic, the frame logic and the control logic carried in the received adjusting instruction, the frame instruction and the control instruction to the storage module and sending the logic information of the adjusting logic, the frame logic and the control logic to the visual editing device; the system is also used for reading required adjusting logic, frame logic and control logic from the storage module according to a logic calling instruction sent by the object management module and sending the read adjusting logic, frame logic and control logic to the object management module; the device management device can also be used for sending a device control instruction to the device management device according to the control logic carried in the control instruction and the field device list needing to be controlled;
the object management module is used for sending the equipment indication information carried in the received equipment indication instruction to the storage module, sending the equipment indication instruction to the equipment management device, and receiving the equipment information of the field equipment returned by the equipment management device according to the equipment indication instruction; sending a resource reading instruction to the resource management module according to the received resource reading instruction; sending a logic calling instruction to the logic management module according to the received object synthesis instruction, receiving an adjusting logic, a framework logic and a control logic returned by the logic management module, receiving updated equipment information returned by the equipment management device according to the equipment control instruction, and synthesizing the received equipment information, the updated equipment information and the resource file after format conversion into a required object according to the adjusting logic, the framework logic and the control logic; sending the synthesized object to the storage module and the rendering management module; reading an object from the storage module according to the received object calling instruction, and sending the read object to the scene management module;
the scene management module is used for receiving a scene combination instruction carrying scene information, sending an object calling instruction to the object management module according to the scene information in the scene combination instruction, and receiving an object returned by the object management module according to the object calling instruction; the received object and scene information are combined into a required scene model, and the scene model is sent to the rendering management module, the storage module and the visual editing device;
the rendering management module is used for rendering the received object and then sending the rendered object to the visual editing device; the system is also used for carrying out parallel computation on the received scene model, filtering objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, rendering the final rendering set and then sending the rendered final rendering set to a visual editing device;
the storage module is used for storing the received resource file after format conversion, the adjusting logic, the frame logic, the control logic, the equipment indicating information, the object and the scene model;
wherein the adjustment logic is logic to change the attribute and state of the object;
the frame logic is logic for providing a frame for the object type required to be created; wherein the frame is a collection of basic attributes describing the object;
and the control logic is used for controlling and operating each field device to be controlled.
6. The system of claim 5, wherein:
the object management module is further configured to send the object information of the synthesized object to an editing module in the visual editing apparatus; wherein the object information includes: various attribute information of the object;
and the editing module is also used for converting the received object information and then sending the converted object information to the display module for display.
7. The system of claim 6, wherein:
the object management module is further used for storing the object information of the synthesized object in the storage module; and the storage module is also used for reading corresponding object information from the storage module according to the object information reading instruction sent by the editing module and sending the read object information to the editing module.
8. The system of claim 5, wherein:
the scene management module is further configured to store scene information carried in the scene combination instruction in a storage module, read a corresponding scene information from the storage module according to the scene information reading instruction sent by the editing module, and send the read scene information to the editing module.
9. The system of claim 5, wherein:
the editing module is further configured to send a rendering configuration instruction carrying rendering parameters to the rendering management module;
the rendering management module is further configured to configure rendering parameters according to the received rendering configuration instruction.
10. The system according to claim 1, wherein the device management apparatus comprises: the device management system comprises a device management primary submodule and a plurality of device management secondary submodules; wherein,
the equipment management primary sub-module is respectively connected with each equipment management secondary sub-module;
and each equipment management secondary submodule is respectively connected with various field equipment.
11. A visual three-dimensional display control editing method is characterized by comprising the following steps:
receiving a plurality of user instructions input by a user;
acquiring a resource file from the outside according to a user instruction, and performing format conversion and storage on the acquired resource file;
reading a required resource file from the stored resource file after format conversion according to a user instruction;
acquiring required equipment information of each field equipment from equipment management devices connected with various field equipment according to a user instruction;
synthesizing the acquired equipment information and the resource file after format conversion into a required object according to a user instruction;
combining the synthesized object and scene information into a required scene model according to a user instruction;
performing parallel computation on the scene model, filtering objects which do not need to be displayed in the scene model, generating a final rendering set of the scene model, and rendering the final rendering set;
compiling the scene model into a user scheme according to a user instruction and outputting the user scheme;
the synthesizing of the acquired device information and the resource file with the converted format into a required object according to the user instruction includes:
when the user instruction is an object synthesis instruction, acquiring an adjusting logic, a framework logic and a control logic according to the object synthesis instruction;
synthesizing the acquired equipment information and the resource file after format conversion into a required object according to the acquired adjusting logic, the frame logic and the control logic;
the adjustment logic is logic for changing the attribute and state of the object;
the frame logic is logic for providing a frame for the object type required to be created; wherein the frame is a collection of basic attributes describing the object;
and the control logic is used for controlling and operating each field device to be controlled.
12. The method of claim 11, wherein converting the format of the obtained resource file comprises:
and converting the format of the received resource file through parallel computation.
13. The method of claim 11, wherein reading the required resource file from the stored format-converted resource files according to the user instruction comprises:
when the user instruction is a resource reading instruction, reading a required resource file from the stored resource file after format conversion according to the resource reading instruction;
the resource reading instruction carries information of the required resource file.
14. The method of claim 11, wherein obtaining the device information of each required field device from the device management apparatus connected to the plurality of field devices according to the user command comprises:
when the user instruction is an equipment instruction, acquiring equipment information of each required field equipment from the equipment management device according to the equipment instruction;
and when the user instruction is a control instruction, acquiring updated equipment information from the equipment management device according to the control instruction.
15. The method of claim 14, wherein:
the device indication instruction carries a device list of the required field devices;
the control command also carries control logic and a list of field devices to be controlled.
16. The method of claim 15, wherein the obtaining updated device information from the device management apparatus according to the control instruction comprises:
sending a device control instruction to a device management device according to the control logic carried in the control instruction and a field device list to be controlled;
and the equipment management device performs control operation on each field equipment to be controlled according to the equipment control instruction and returns updated equipment information.
17. The method of claim 16, wherein before sending the device control instruction to the device management apparatus, the method further comprises:
the adjustment logic, the framework logic, and the control logic in the received user instruction are stored.
18. The method of claim 11, wherein combining the synthesized object and scene information into the desired scene model according to the user instruction comprises:
when the user instruction is a scene combination instruction, combining the synthesized object and scene information into a required scene model according to the scene combination instruction;
the scene combination instruction carries scene information.
19. The method of claim 11, wherein the compiling the scene model into a user schema and outputting according to a user instruction comprises:
when the user instruction is a compiling instruction, compiling the scene model into a user scheme according to the compiling instruction;
and outputting the compiled user scheme.
CN201210490835.0A 2012-11-27 2012-11-27 A kind of visualization of 3 d display and control editing system and method Active CN103019702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210490835.0A CN103019702B (en) 2012-11-27 2012-11-27 A kind of visualization of 3 d display and control editing system and method

Publications (2)

Publication Number Publication Date
CN103019702A CN103019702A (en) 2013-04-03
CN103019702B true CN103019702B (en) 2016-02-03

Family

ID=47968341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210490835.0A Active CN103019702B (en) 2012-11-27 2012-11-27 A kind of visualization of 3 d display and control editing system and method

Country Status (1)

Country Link
CN (1) CN103019702B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104715500A (en) * 2015-03-26 2015-06-17 金陵科技学院 3D animation production development system based on three-dimensional animation design
CN106681730B (en) * 2016-12-30 2020-02-11 当家移动绿色互联网技术集团有限公司 System and method for creating VR content conveniently by common designers
CN108268434A (en) * 2016-12-30 2018-07-10 粉迷科技股份有限公司 Hyperlink edit methods and system in stereo scene
CN108334019A (en) * 2018-01-23 2018-07-27 安徽杰智智能科技有限公司 A kind of more edit mode intelligent controllers
CN108469779A (en) * 2018-01-23 2018-08-31 安徽杰智智能科技有限公司 A kind of ad hoc network intelligent controller
CN109389662B (en) * 2018-10-16 2019-11-19 成都四方伟业软件股份有限公司 A kind of three-dimensional scenic visual configuration method and device
CN112307534A (en) * 2020-11-02 2021-02-02 武汉光谷联合集团有限公司 Park weak current scheme online design method and device based on electronic map

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3762139B2 (en) * 1999-05-17 2006-04-05 三菱電機株式会社 3D equipment management system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1710607A (en) * 2005-07-08 2005-12-21 北京航空航天大学 3-D scene organization method facing to virtual reality 3-D picture engine
CN101694615A (en) * 2009-09-30 2010-04-14 成都九门科技有限公司 Browser-based construction system of three-dimensional ultra-large scene

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on 3D Landscape Visualization Based on an Object-Oriented Spatial Database; Ren Min et al.; Computer Engineering and Applications (《计算机工程与应用》); 2002-10-15; pp. 209-211 *
Research on Methods for 3D Visualization Information Systems in a Network Environment; Zhang Liqiang et al.; Science in China (Series D: Earth Sciences) (《中国科学(D辑:地球科学)》); 2005-06-30; Vol. 35, No. 6; pp. 511-518 *

Also Published As

Publication number Publication date
CN103019702A (en) 2013-04-03

Similar Documents

Publication Publication Date Title
CN103019702B (en) A kind of visualization of 3 d display and control editing system and method
CN106794581B (en) System and method for flexible human-machine collaboration
CN106200983A (en) A kind of combined with virtual reality and BIM realize the system of virtual reality scenario architectural design
CN110573992B (en) Editing augmented reality experiences using augmented reality and virtual reality
CN103984818A (en) AUV (autonomous underwater vehicle) design flow visualization modeling method based on Flex technology
US12026350B2 (en) Configuring remote devices through volumetric video telepresence interfaces
CN103093034A (en) Product collaborative design method based on cloud computing
CN115495069B (en) Model-driven coal industry software process implementation method, device and equipment
Yu et al. A virtual reality simulation for coordination and interaction based on dynamics calculation
JP6318500B2 (en) Simulation apparatus and simulation program
CN105556570A (en) Generating screen data
KR101460794B1 (en) Method and system for generating media art contents
Mohamed Deep learning for spatial computing: augmented reality and metaverse “the Digital Universe”
CN107273398B (en) Human interface system and method for operating the same
CN112233208B (en) Robot state processing method, apparatus, computing device and storage medium
CN113436320A (en) 3D model generation system and method based on IFC model file
CN112363856A (en) Method for realizing interoperation of deep learning framework and application program based on DDS
CN110312990A (en) Configuration method and system
KR20160051582A (en) Platform structure making support services of manufacturing enterprises based on 3 dimensional data
CN112685494A (en) Data visualization method, device, equipment and medium
Ulmer et al. Generic integration of VR and AR in product lifecycles based on CAD models
Ma et al. Innovative Applications of Digital Art and Augmented Reality in the Construction Industry through Building Information Modeling
KR102385381B1 (en) Method and system for generating script forcamera effect
Liu et al. An Intelligent Cockpit System HMI Engine Based on COMO
US20230343038A1 (en) Method and system for creating augmented reality filters on mobile devices

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant