CN114859744A - Intelligent application visualization control method and system based on big data - Google Patents

Intelligent application visualization control method and system based on big data

Info

Publication number
CN114859744A
CN114859744A (application CN202210489768.4A; granted as CN114859744B)
Authority
CN
China
Prior art keywords
visual
data
control
external environment
modeling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210489768.4A
Other languages
Chinese (zh)
Other versions
CN114859744B (en)
Inventor
Jing Wei (景伟)
Current Assignee
Inner Mongolia Yunke Data Service Co., Ltd.
Original Assignee
Inner Mongolia Yunke Data Service Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Inner Mongolia Yunke Data Service Co., Ltd.
Priority to CN202210489768.4A priority Critical patent/CN114859744B/en
Publication of CN114859744A publication Critical patent/CN114859744A/en
Application granted granted Critical
Publication of CN114859744B publication Critical patent/CN114859744B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 - Systems controlled by a computer
    • G05B15/02 - Systems controlled by a computer electric
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/26 - Pc applications
    • G05B2219/2642 - Domotique, domestic, home control, automation, smart house
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses an intelligent application visualization control method and system based on big data, relating to the field of visualization control. The method comprises: obtaining first modeling data of a first visual object through a data fitting device; performing environmental impact analysis on information of a first external environment of the first visual object to obtain a first relevance; obtaining second modeling data if the first relevance is greater than a preset relevance; inputting the first modeling data and the second modeling data into a three-dimensional scene model and outputting a visualized three-dimensional scene; deriving visual controls from the visualized three-dimensional scene; and building a visual control layer from the visual controls to carry out visual control. This solves the technical problems that existing visual modeling is not accurate enough and can hardly provide adaptive feedback based on the located environment, and achieves the technical effect of improving control accuracy and user comfort by performing environment positioning and relevance analysis for the smart home.

Description

Intelligent application visualization control method and system based on big data
Technical Field
The invention relates to the field of visualization control, and in particular to an intelligent application visualization control method and system based on big data.
Background
With the progress of science and technology, people's requirements for quality of life and living environment keep rising, which has pushed the smart home into everyday life and work. Because the smart home offers a safe and comfortable mode of living, it brings a better life experience, and the demands placed on it have diversified accordingly. As big data technology matures, visual control of the home has become a current research hotspot; however, existing visual control for the smart home is still imperfect and insufficiently accurate, which affects smart-home quality.
At present, visual modeling of the smart home in the prior art is not accurate enough, and adaptive feedback based on the located environment is difficult to perform, which affects the visualization accuracy and user comfort of the smart home.
Disclosure of Invention
Aiming at the defects in the prior art, the method and system for intelligent application visualization control based on big data solve the technical problems that, in the prior art, visual modeling of the smart home is not accurate enough and adaptive feedback based on the located environment is difficult to perform, which affects the visualization accuracy and user comfort of the smart home. They achieve the technical effects of performing environment positioning and relevance analysis for the smart home, carrying out intelligent control through a visual control layer, and improving accuracy and user comfort.
In one aspect, the present application provides a big-data-based intelligent application visualization control method, applied to a big-data-based intelligent application visualization control system that is communicatively connected to a data fitting device. The method includes: acquiring data of a first visual object through the data fitting device to obtain first modeling data; obtaining information of a first external environment of the first visual object; performing environmental impact analysis on the information of the first external environment to obtain a first relevance, where the first relevance is the degree of influence of the first external environment on the first visual object; obtaining second modeling data if the first relevance is greater than a preset relevance; inputting the first modeling data and the second modeling data into a three-dimensional scene model, performing three-dimensional modeling of the first visual object, and outputting a visualized three-dimensional scene; obtaining visual controls by analyzing the workflow of the visualized three-dimensional scene; and constructing a visual control layer from the visual controls and carrying out visual control through it.
On the other hand, the application also provides an intelligent application visualization control system based on big data, and the system comprises: the first obtaining unit is used for carrying out data acquisition on the first visual object according to the data fitting device to obtain first modeling data; a second obtaining unit configured to obtain information of a first external environment of the first visual object; a first analysis unit, configured to perform environmental impact analysis on the information of the first external environment to obtain a first correlation, where the first correlation is a correlation impact degree of the first external environment on the first visualization object; a third obtaining unit, configured to obtain second modeling data if the first correlation is greater than a preset correlation; a first input unit, configured to input the first modeling data and the second modeling data into a three-dimensional scene model, perform three-dimensional modeling on the first visualization object, and output a visualized three-dimensional scene; a fourth obtaining unit, configured to obtain a visual control by performing workflow analysis on the visual three-dimensional scene; and the first control unit is used for building a visual control layer according to the visual control and carrying out visual control according to the visual control layer.
In a third aspect, the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of any one of the above methods when executing the program.
In a fourth aspect, the present application also provides a computer-readable storage medium having a computer program stored thereon, which when executed by a processor, performs the steps of any of the methods described above.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
The method comprises: acquiring data of a first visual object through a data fitting device to obtain first modeling data; acquiring external environment information of the first visual object to generate information of a first external environment; performing modeling-association impact analysis of the acquired first external environment on the visual object and outputting a first relevance; judging whether the first relevance is greater than a preset relevance and, if so, obtaining second modeling data from the information of the first external environment; inputting the first and second modeling data into a three-dimensional scene model, performing three-dimensional modeling of the first visual object, and outputting a visualized three-dimensional scene; performing workflow analysis on the controllable devices in the generated scene to obtain visual controls; and building a visual control layer from the visual controls to realize visual control. In this way, environment positioning and relevance analysis are performed for the smart home, achieving the technical effects of improved accuracy and user comfort.
The foregoing is only an overview of the technical solutions of the present application. To make its technical means more clearly understood, the application can be implemented according to the content of the description; and to make the above and other objects, features and advantages more comprehensible, a detailed description of the application follows.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a schematic flowchart of an intelligent application visualization control method based on big data according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an environmental impact analysis of an intelligent application visualization control method based on big data according to an embodiment of the present application;
fig. 3 is a schematic flowchart of the external environment feedback control of the intelligent application visualization control method based on big data according to the embodiment of the present application;
FIG. 4 is a schematic structural diagram of an intelligent application visualization control system based on big data according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an exemplary electronic device according to an embodiment of the present application.
Description of reference numerals: a first obtaining unit 11, a second obtaining unit 12, a first analyzing unit 13, a third obtaining unit 14, a first input unit 15, a fourth obtaining unit 16, a first control unit 17, a bus 300, a receiver 301, a processor 302, a transmitter 303, a memory 304, and a bus interface 305.
Detailed Description
The embodiment of the application provides an intelligent application visualization control method and system based on big data, which solve the technical problems that, in the prior art, visual modeling of the smart home is not accurate enough and adaptive feedback based on the located environment is difficult to perform, affecting the visualization accuracy and user comfort of the smart home, and achieve the technical effects of performing environment positioning and relevance analysis for the smart home, carrying out intelligent control through a visual control layer, and improving accuracy and user comfort.
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are merely some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited to the example embodiments described herein.
In the technical solution of the present application, the acquisition, storage, use and processing of data comply with the relevant provisions of national laws and regulations.
As an important realization of family informatization, the smart home has become an important component of social informatization. The maturing of big data technology provides data acquisition and management capabilities for further realizing the smart home: by deeply analyzing smart-home data, a visual control mode can be more convenient, safer and more efficient, and makes smart-home management more intuitive.
However, the existing smart-home space model is not perfect enough and cannot combine environment positioning with modeling; visual modeling of the smart home is therefore not accurate enough and adaptive feedback based on the located environment is difficult to perform, which affects the visualization accuracy and user comfort of the smart home.
In view of the above technical problems, the technical solution provided by the present application has the following general idea:
the application provides an intelligent application visualization control method and system based on big data, data acquisition is carried out on a first visualization object according to a data fitting device, so as to obtain first modeling data, on the other hand, external environment information of the first visualization object is acquired, information of a first external environment is generated, visualization object modeling association influence analysis is carried out on the acquired first external environment, first association is output, further, whether the first association is larger than preset association is judged, if the first association is larger than the preset association, second modeling data is obtained according to the information of the first external environment, the first modeling data and the second modeling data are input into a three-dimensional scene model, three-dimensional modeling is carried out on the first visualization object, a visualization three-dimensional scene is output, and workflow analysis is carried out on controllable equipment in the generated visualization three-dimensional scene, the method for achieving the intelligent home environment positioning and correlation analysis has the advantages that the visual control is obtained, further, the visual control layer is built according to the visual control to achieve the visual control mode, the intelligent home environment positioning and correlation analysis are conducted on the visual control layer, and the technical effects that accuracy and user comfort are improved are achieved.
For better understanding of the above technical solutions, the following detailed descriptions will be provided in conjunction with the drawings and the detailed description of the embodiments.
Example one
As shown in fig. 1, an embodiment of the present application provides a big data based intelligent application visualization control method, which is applied to a big data based intelligent application visualization control system, which is communicatively connected to a data fitting device, and the method includes:
step S100: acquiring data of the first visual object according to the data fitting device to obtain first modeling data;
In particular, as an important realization of family informatization, the smart home has become an important component of social informatization, and the maturing of big data technology provides data acquisition and management capabilities for it: deep analysis of smart-home data makes a visual control mode more convenient, safer and more efficient, and home management more intuitive. However, the existing smart-home space model is not perfect enough and cannot combine environment positioning with modeling, so visual modeling is not accurate enough and adaptive feedback based on the located environment is difficult, which affects the visualization accuracy and user comfort of the smart home.
Further, in the provided method, a first visual object is obtained for accurate visual control, where the first visual object is any home space under visual control. Data acquisition is performed on the first visual object through the data fitting device, including but not limited to geometric data acquisition, image acquisition, video acquisition and other acquisition modes; the data acquisition devices are fitted according to requirements and the data fitting device is output, ensuring flexible data acquisition for the first visual object. The data collected and output by the data fitting device serves as the first modeling data for modeling the home space scene.
Step S200: obtaining information of a first external environment of the first visual object;
step S300: obtaining a first relevance by performing environmental impact analysis on the information of the first external environment, wherein the first relevance is an impact degree of the first external environment on the first visual object;
further, as shown in fig. 2, the obtaining of the first correlation by performing environmental impact analysis on the information of the first external environment further includes:
step S310: obtaining spatial photosensitivity, spatial temperature sensitivity and air exchange of the first visual object;
step S320: configuring a first weight, a second weight, and a third weight based on the spatial light sensitivity, the spatial temperature sensitivity, and the air exchange;
step S330: performing weight calculation on the space photosensitivity, the space temperature sensitivity and the air exchange performance according to the first weight, the second weight and the third weight, and outputting a first calculation result;
step S340: and obtaining the first relevance according to the first calculation result.
Specifically, information of a first external environment of the first visual object is obtained, where the first external environment is the environment outside the location area of the first visual object. When that location area differs, the geographic environment of the first visual object changes accordingly, so, to ensure accurate modeling of the home environment, information is collected on the external environment of the first visual object to obtain the information of the first external environment; for example, the influence of the external environment differs between a residential area, a suburban area, a traffic artery, and so on. For further accurate analysis, environmental impact analysis is performed on this information to obtain a first relevance, which characterizes the degree of associated influence of the first external environment on the first visual object.
Further, the environmental impact analysis of the information of the first external environment proceeds as follows: first, the regional characteristics of the first visual object are obtained so that the analysis can be carried out according to the influence those characteristics produce. This realizes accurate positioning of the external environment, improves the modeling accuracy of the home scene, and thereby improves the accuracy of visual control.
Specifically, the spatial photosensitivity, spatial temperature sensitivity and air exchange performance of the first visual object are analyzed, weights are assigned, a weighted calculation is performed, and the calculation result is taken as the first relevance. The spatial photosensitivity analyzes the first visual object based on light transmission between the spatial structure and the external environment, using data such as glass size and home layout; the spatial temperature sensitivity is based on thermal insulation between the spatial building materials and the external environment, for example analyzing different floor heights or the insulating performance of the materials; the air exchange performance is based on ventilation between the spatial geographic position and the external environment with respect to the wind direction, for example analyzing whether a chemical plant lies in the ventilation wind direction or how green the surroundings are. A first weight, a second weight and a third weight are configured in one-to-one correspondence with the spatial photosensitivity, the spatial temperature sensitivity and the air exchange performance, and the first relevance is output, ensuring comprehensive consideration of factors and satisfying the data analysis required for external environment positioning.
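The weighted calculation of steps S310 to S340 can be sketched as a simple weighted sum. The 0-to-1 factor scale, the function name, and the example weights below are illustrative assumptions; the patent specifies no concrete formula or values.

```python
# Hypothetical sketch of steps S310-S340: factors scored on an assumed 0-1
# scale, combined with the configured first/second/third weights (S330), and
# the weighted result taken as the first relevance (S340).

def first_relevance(light_sensitivity: float,
                    temperature_sensitivity: float,
                    air_exchange: float,
                    weights=(0.4, 0.35, 0.25)) -> float:
    """Weighted combination of the three environmental factors."""
    w1, w2, w3 = weights
    assert abs(w1 + w2 + w3 - 1.0) < 1e-9, "weights should sum to 1"
    return (w1 * light_sensitivity
            + w2 * temperature_sensitivity
            + w3 * air_exchange)

# Example: a bright, moderately insulated, poorly ventilated space
r = first_relevance(0.8, 0.6, 0.4)   # approximately 0.63
```

The weights themselves come from the weight configuration model described later; here they are fixed only to keep the sketch self-contained.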
Step S400: if the first relevance is larger than the preset relevance, second modeling data are obtained;
step S500: inputting the first modeling data and the second modeling data into a three-dimensional scene model, performing three-dimensional modeling on the first visual object, and outputting a visual three-dimensional scene;
specifically, since the first correlation is a degree of influence of the first external environment on the correlation between the spatial light sensitivity, the spatial temperature sensitivity, and the air exchange performance of the first visual object, when the external environment of the first visual object has no significant influence feature, it indicates that the modeling influence of the external environment on the first visual object is small, and when the external environment of the first visual object has a significant influence feature, it indicates that the modeling influence of the external environment on the first visual object is large, that is, if the first correlation is greater than a preset correlation, second modeling data is obtained, where the second modeling data is external environment scene modeling data generated based on sample data collection of information of the first external environment.
Further, the first modeling data and the second modeling data are input into a three-dimensional scene model, and the first visual object is modeled in a three-dimensional mode, that is, the first visual object is modeled by analyzing the home scene modeling data of the first visual object and the external environment scene modeling data of the first external environment, so that the first visual object can be further modeled accurately by analyzing the external environment influence association in combination with external environment positioning, and the visual three-dimensional scene is output.
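The threshold decision of steps S400 and S500 amounts to gating the external-environment data on the computed relevance. The function and data-set shapes below are assumptions made for illustration; only the greater-than comparison comes from the description.

```python
# Sketch of S400-S500 (assumed names): external-environment modeling data is
# included in the 3-D scene model's input only when the first relevance
# exceeds the preset relevance.

def assemble_modeling_inputs(first_modeling_data: dict,
                             first_relevance: float,
                             preset_relevance: float,
                             collect_external) -> list:
    """Return the list of data sets fed to the three-dimensional scene model."""
    inputs = [first_modeling_data]
    if first_relevance > preset_relevance:
        # The external environment matters: sample it to build second modeling data
        inputs.append(collect_external())
    return inputs

inputs = assemble_modeling_inputs({"kind": "home"}, 0.63, 0.5,
                                  lambda: {"kind": "external"})
# Both data sets are included, because 0.63 > 0.5
```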
Step S600: obtaining a visual control by analyzing the workflow of the visual three-dimensional scene;
step S700: and constructing a visual control layer according to the visual control, and carrying out visual control according to the visual control layer.
Further, as shown in fig. 3, the system includes a visual control feedback module, and the steps of the embodiment of the present application further include S800:
step S810: acquiring real-time data of the first external environment according to the data fitting device to obtain real-time data of the external environment;
step S820: inputting the real-time data of the external environment into the visual control feedback module, and outputting first feedback control data according to the visual control feedback module, wherein the first feedback control data is data for controlling the first visual object according to the first external environment;
step S830: and inputting the first feedback control data into the visual control layer for control.
Specifically, after the visualized three-dimensional scene is successfully modeled, the corresponding visual control modules are refined: data entry is performed for the controllable electronic devices in the scene to generate home application controls. All controllable devices in the home scene are subjected to usage analysis, different control logic is generated according to each device's control type, and the generated visual controls serve as the display interface of visual control, so that the device connected to each control can be intelligently controlled through its visual control.
Furthermore, a visual control layer is built from the visual controls. The visual control layer resides in the display interface and is classified hierarchically according to the complexity of each control's control mode, realizing logical layering of the visual controls and further improving the accuracy of visual control.
Since the external environment modeling data of the first visual object is included in the comprehensive model, a visual control feedback module is connected: real-time data of the external environment is acquired by the data fitting device and comprehensively analyzed by the feedback module, so that high quality and stability of the home environment can be guaranteed according to user needs. For example, when the external environment of the first visual object includes a chemical plant, its collection-and-discharge intervals are used to automatically control air purification in the home. By this method, the real-time external environment data is input into the visual control feedback module, first feedback control data is output, and that data is input into the visual control layer built for the visual space to perform visual control, achieving the technical effects of environment positioning and relevance analysis for the smart home, intelligent feedback control through the visual control layer, and improved accuracy and user comfort.
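The feedback loop of steps S810 to S830 can be sketched as a mapping from real-time external readings to device commands. The thresholds, reading names and device names below are illustrative assumptions, not values given in the patent.

```python
# Hedged sketch of S810-S830: the visual control feedback module turns a
# real-time external-environment sample into first feedback control data
# (device commands) that the visual control layer then applies.

def feedback_control(external_sample: dict) -> dict:
    """Derive feedback control data from one real-time external sample."""
    commands = {}
    if external_sample.get("air_quality_index", 0) > 100:
        commands["air_purifier"] = "on"    # poor outside air: purify indoors
    if external_sample.get("noise_db", 0) > 70:
        commands["window"] = "close"       # heavy traffic noise: close windows
    return commands

cmds = feedback_control({"air_quality_index": 150, "noise_db": 55})
# Only the purifier is triggered for this sample
```

In a deployment, this function would run each time the data fitting device delivers a new external sample, and the returned commands would be routed through the corresponding visual controls in the control layer.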
Further, step S320 in the embodiment of the present application further includes:
step S321: building a weight configuration model, wherein the weight configuration model comprises a data input layer, a data processing layer, a data judging layer and a data output layer, and the weight configuration model is embedded in a cloud processor;
step S322: the data input layer receives the space photosensitivity, the space temperature sensitivity and the air exchange performance, and after data receiving is finished, data standardization processing is carried out according to the data processing layer to obtain a first sequencing result;
step S323: the data judgment layer judges according to the first sequencing result and a preset database, and when the logical relation of the preset database is met, a weight configuration result is obtained;
step S324: and outputting the weight configuration result according to the data output layer.
Specifically, the weight configuration model is a data model that performs comprehensive weight configuration based on the spatial photosensitivity, spatial temperature sensitivity and air exchange performance of the first visual object. The model configures a weight for each parameter through further data analysis, so that the output weight configuration result is more accurate. The weight configuration result comprises a first weight, a second weight and a third weight, corresponding respectively to the spatial photosensitivity, the spatial temperature sensitivity and the air exchange performance.
Further, the weight configuration model comprises a data input layer for receiving the spatial photosensitivity, spatial temperature sensitivity and air exchange performance, and a data processing layer that performs data standardization after reception is complete; because the input data are not on the same reference scale, standardization improves the effectiveness of subsequent processing. The standardized spatial photosensitivity, spatial temperature sensitivity and air exchange performance are then sorted to obtain a first sorting result. The data judgment layer makes a judgment according to the first sorting result and a preset database, where the preset database comprises a preset spatial photosensitivity, a preset spatial temperature sensitivity and a preset air exchange performance; the values are compared in the order of the first sorting result, and when a value is greater than its counterpart in the preset database, its weight is increased to obtain the weight configuration result. The weight configuration result is output by the data output layer; this weight adjustment effectively guarantees the reliability and logical consistency of the external-environment relevance analysis.
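The four layers of the weight configuration model can be sketched as one function. The normalization, the sorting, and the "increase the weight when a value exceeds its preset" rule follow the description above; the base weight, the boost amount and the final renormalization are illustrative assumptions.

```python
# Hedged sketch of steps S321-S324: input -> standardization (processing
# layer) -> sorting -> judgment against a preset database -> weight output.

def configure_weights(factors: dict, presets: dict,
                      base: float = 1.0, boost: float = 0.5) -> dict:
    # Data processing layer: min-max standardization onto a common 0-1 scale
    lo, hi = min(factors.values()), max(factors.values())
    span = (hi - lo) or 1.0
    norm = {k: (v - lo) / span for k, v in factors.items()}
    # First sorting result: factors ordered by standardized magnitude
    ranked = sorted(norm, key=norm.get, reverse=True)
    # Data judgment layer: boost any factor exceeding its preset value
    raw = {k: base + (boost if factors[k] > presets[k] else 0.0) for k in ranked}
    # Data output layer: weights renormalized to sum to 1 (an assumption)
    total = sum(raw.values())
    return {k: w / total for k, w in raw.items()}

w = configure_weights(
    {"light": 0.9, "temperature": 0.4, "air": 0.2},
    {"light": 0.5, "temperature": 0.5, "air": 0.5},
)
# "light" exceeds its preset, so it receives the largest weight
```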
Further, step S200 in the embodiment of the present application further includes:
step S210: obtaining first external environment characteristics according to the information of the first external environment, wherein the first external environment characteristics comprise noise environment characteristics and air environment characteristics;
step S220: obtaining a first external environment influence coefficient according to the noise environment characteristic and the air environment characteristic;
step S230: and applying a gain to the first relevance according to the first external environment influence coefficient, and outputting a second relevance.
Specifically, after the environmental impact analysis on the information of the first external environment yields the first relevance, an external environment feature analysis is further performed on that information. The first external environment features include a noise environment feature, i.e., traffic noise generated by the external environment, and an air environment feature, i.e., the air quality impact generated by the external environment. A first external environment influence coefficient is obtained from the noise environment feature and the air environment feature. When this coefficient is large, a gain adjustment must be applied to the first relevance: a strong external environment feature has a large impact on control within the home environment. Therefore, when the first external environment influence coefficient is larger than the preset external environment influence coefficient, the influence of the association between the first visual object and the first external environment is increased, and the second relevance is output.
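A minimal Python sketch of this gain adjustment follows. How the two features combine into the influence coefficient, the threshold, and the gain rate are all illustrative assumptions, not values from the patent.

```python
def adjust_relevance(first_relevance, noise_feature, air_feature,
                     preset_coefficient=1.0, gain_rate=0.1):
    """Apply a gain to the first relevance when the external
    environment influence coefficient exceeds a preset threshold."""
    # Assumed combination: a simple sum of the two feature strengths.
    influence = noise_feature + air_feature
    if influence > preset_coefficient:
        # Gain proportional to how far the coefficient exceeds the preset.
        return first_relevance * (1 + gain_rate * (influence - preset_coefficient))
    # Otherwise the relevance passes through unchanged.
    return first_relevance
```

With a quiet, clean environment the second relevance equals the first; a noisy or polluted environment amplifies it.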
Further, the step S500 of inputting the first modeling data and the second modeling data into a three-dimensional scene model, performing three-dimensional modeling on the first visual object, and outputting a visual three-dimensional scene further includes:
step S510: obtaining a plurality of visualization perspectives of the first visualization object;
step S520: generating a plurality of external environment modeling view angles according to the plurality of visual view angles;
step S530: inputting the three-dimensional scene model according to the plurality of visualization view angles and the plurality of external environment modeling view angles, wherein the three-dimensional scene model comprises a scene calibration layer;
step S540: carrying out three-dimensional space calibration according to the scene calibration layer in the three-dimensional scene model, and outputting a space calibration result;
step S550: and outputting the visual three-dimensional scene according to the space calibration result.
Specifically, home furnishing visual modeling is performed based on the first modeling data and the second modeling data, the first modeling data is obtained by data acquisition based on the first visual object, and the second modeling data is obtained by data acquisition based on the first external environment of the first visual object.
Further, an accurate visual three-dimensional scene is output by analyzing and checking different viewing angles: a plurality of visualization viewing angles of the first visual object are obtained, a plurality of external environment modeling viewing angles are generated from them, and the three-dimensional scene is then simulated from these viewing angles, which improves the accuracy of the spatial simulation.
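One plausible way to derive external environment modeling viewing angles from interior visualization viewing angles is sketched below: each camera pose is pushed outward along its azimuth and turned back toward the object. The pose format, the offset geometry, and the function name are assumptions for illustration only.

```python
import math

def external_viewpoints(visual_viewpoints, radius_scale=2.0):
    """For each visualization viewpoint (x, y, z, azimuth_deg),
    generate a matching external-environment modeling viewpoint by
    moving the camera outward along its azimuth. Illustrative geometry."""
    external = []
    for x, y, z, azimuth in visual_viewpoints:
        rad = math.radians(azimuth)
        external.append((x + radius_scale * math.cos(rad),
                         y + radius_scale * math.sin(rad),
                         z,
                         (azimuth + 180) % 360))  # look back at the object
    return external
```

The paired interior and exterior viewpoints can then be fed to the scene calibration layer for three-dimensional space calibration.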
Further, the step S600 of the embodiment of the present application further includes:
step S610: performing controllable equipment connection based on the visual three-dimensional scene to obtain a plurality of control terminals;
step S620: performing primary screening according to the plurality of control terminals, and outputting an identification control terminal;
step S630: generating a plurality of visual controls according to the identification control terminal;
step S640: obtaining the use frequency and the use order of the plurality of visual controls according to a first user;
step S650: and performing step-layer screening on the plurality of visual controls according to the use frequency and the use order, and outputting the visual control.
Specifically, the controllable devices in the visual three-dimensional scene are connected, that is, the devices that realize intelligent control within the first visual object are connected, and a plurality of corresponding control terminals are obtained. The control terminals are screened once, and an identification control terminal is output; the identification control terminal is a terminal that can be used for visual control after this primary screening. A plurality of visual controls are then generated from the identification control terminal. Based on the generated visual controls, the first user's use frequency and use order are recorded, that is, the workflow and number of the visual controls are analyzed. The visual controls are further screened into step layers according to the use frequency and use order, an intelligent control sequence is continuously generated as the user operates, and the results are displayed in the visual control layer through step-layer screening, on which intelligent control is then performed.
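The step-layer screening just described can be sketched as ranking the visual controls by use frequency (primary) and use order (tie-breaker) and splitting them into display tiers. The layer size, key names, and function name are illustrative assumptions.

```python
def step_layer_screening(controls, use_frequency, use_order, layer_size=4):
    """Rank visual controls by how often and how early the first user
    invokes them, then split them into step layers for display."""
    ranked = sorted(
        controls,
        key=lambda c: (-use_frequency.get(c, 0),          # most used first
                       use_order.get(c, float("inf"))))   # earlier use breaks ties
    # Chunk the ranked list into fixed-size step layers.
    return [ranked[i:i + layer_size] for i in range(0, len(ranked), layer_size)]
```

The first layer then holds the controls the user reaches for most, which is what the visual control layer surfaces first.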
Compared with the prior art, the invention has the following beneficial effects:
1. The method collects data of the first visual object through the data fitting device to obtain first modeling data, collects external environment information of the first visual object to generate the information of the first external environment, performs a visual object modeling association impact analysis on the first external environment, and outputs the first association. It then judges whether the first association is greater than the preset association; if so, second modeling data are obtained from the information of the first external environment. The first modeling data and the second modeling data are input into the three-dimensional scene model, the first visual object is modeled in three dimensions, and the visual three-dimensional scene is output. A workflow analysis of the controllable devices in the generated visual three-dimensional scene yields the visual controls, from which the visual control layer is built to realize the visual control mode. By performing environment positioning and correlation analysis for the smart home and intelligent control on the visual control layer, the technical effects of improving accuracy and user comfort are achieved.
2. Since the method obtains the weight configuration result through the weight configuration model, the weight adjustment performed by the model effectively ensures the reliability and logical consistency of the external environment relevance analysis.
3. The first visual object is modeled by analyzing the home scene modeling data of the first visual object and the external environment scene modeling data of the first external environment, so that accurate modeling of the first visual object can be further performed by combining external environment positioning and analyzing the external environment influence association.
Example two
Based on the same inventive concept as the intelligent application visualization control method based on big data in the foregoing embodiment, the present invention further provides an intelligent application visualization control system based on big data, as shown in fig. 4, where the system includes:
the first obtaining unit 11 is configured to perform data acquisition on a first visual object according to a data fitting device to obtain first modeling data;
a second obtaining unit 12, where the second obtaining unit 12 is configured to obtain information of a first external environment of the first visualization object;
a first analysis unit 13, where the first analysis unit 13 is configured to perform an environmental impact analysis on the information of the first external environment to obtain a first relevance, where the first relevance is a relevance impact degree of the first external environment on the first visualization object;
a third obtaining unit 14, where the third obtaining unit 14 is configured to obtain second modeling data if the first relevance is greater than a preset relevance;
a first input unit 15, where the first input unit 15 is configured to input the first modeling data and the second modeling data into a three-dimensional scene model, perform three-dimensional modeling on the first visual object, and output a visual three-dimensional scene;
a fourth obtaining unit 16, where the fourth obtaining unit 16 is configured to obtain a visual control by performing workflow analysis on the visual three-dimensional scene;
and a first control unit 17, where the first control unit 17 is configured to build a visual control layer according to the visual control, and perform visual control according to the visual control layer.
Further, the system further comprises:
a fifth obtaining unit for obtaining a spatial photosensitivity, a spatial temperature sensitivity, and an air exchange property of the first visual object;
a first arrangement unit for arranging a first weight, a second weight, and a third weight based on the space photosensitivity, the space temperature sensitivity, and the air exchange property;
a first output unit configured to perform weight calculation of the space photosensitivity, the space temperature sensitivity, and the air exchange performance based on the first weight, the second weight, and the third weight, and output a first calculation result;
a sixth obtaining unit, configured to obtain the first relevance according to the first calculation result.
Further, the system further comprises:
the system comprises a first building unit, a second building unit and a third building unit, wherein the first building unit is used for building a weight configuration model, the weight configuration model comprises a data input layer, a data processing layer, a data judging layer and a data output layer, and the weight configuration model is embedded in a cloud processor;
a seventh obtaining unit, configured to receive the space photosensitivity, the space temperature sensitivity, and the air exchange property by the data input layer, and perform data standardization processing according to the data processing layer after data reception is completed, so as to obtain a first ordering result;
an eighth obtaining unit, configured to, by the data judgment layer, perform judgment on the preset database according to the first ordering result, and obtain a weight configuration result when a logical relationship of the preset database is satisfied;
a second output unit, configured to output the weight configuration result according to the data output layer.
Further, the system further comprises:
a ninth obtaining unit, configured to obtain a first external environment characteristic according to the information of the first external environment, where the first external environment characteristic includes a noise environment characteristic and an air environment characteristic;
a tenth obtaining unit configured to obtain a first external environment influence coefficient according to the noise environment characteristic and the air environment characteristic;
a third output unit, configured to perform a gain on the first correlation according to the first external environment influence coefficient, and output a second correlation.
Further, the system further comprises:
an eleventh obtaining unit configured to obtain a plurality of visualization perspectives of the first visualization object;
a first generating unit configured to generate a plurality of external environment modeling viewing angles according to the plurality of visualization viewing angles;
a second input unit, configured to input the three-dimensional scene model according to the plurality of visualization viewing angles and the plurality of external environment modeling viewing angles, where the three-dimensional scene model includes a scene calibration layer;
a fourth output unit, configured to perform three-dimensional space calibration according to the scene calibration layer in the three-dimensional scene model, and output a space calibration result;
and the fifth output unit is used for outputting the visual three-dimensional scene according to the space calibration result.
Further, the system further comprises:
a twelfth obtaining unit, configured to perform controllable device connection based on the visual three-dimensional scene, and obtain a plurality of control terminals;
a sixth output unit, configured to perform primary screening according to the plurality of control terminals, and output an identifier control terminal;
the second generation unit is used for generating a plurality of visual controls according to the identification control terminal;
a thirteenth obtaining unit, configured to obtain, according to the first user, usage frequency and usage order of the plurality of visualization controls;
and the seventh output unit is used for performing step-layer screening on the plurality of visual controls according to the use frequency and the use order, and outputting the visual controls.
Further, the system further comprises:
a fourteenth obtaining unit, configured to perform real-time data acquisition on the first external environment according to the data fitting device, so as to obtain real-time data of the external environment;
a third input unit, configured to input the real-time external environment data into the visual control feedback module, and output first feedback control data according to the visual control feedback module, where the first feedback control data is data for controlling the first visual object according to the first external environment;
and the second control unit is used for inputting the first feedback control data into the visual control layer for control.
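The feedback path handled by these three units (real-time acquisition, feedback module, control layer) can be sketched as a simple rule mapping from environment readings to control commands. The sensor keys, thresholds, and command names are all hypothetical assumptions, not values from the patent.

```python
def visual_control_feedback(realtime_env, rules=None):
    """Sketch of the visual control feedback module: map real-time
    external-environment data to first feedback control data for the
    first visual object. Keys and thresholds are illustrative."""
    rules = rules or {
        "noise_db": (60.0, "close_windows"),
        "pm25":     (75.0, "enable_air_purifier"),
        "temp_c":   (30.0, "lower_ac_setpoint"),
    }
    commands = []
    for key, (threshold, command) in rules.items():
        # Emit a command whenever the live reading exceeds its threshold.
        if realtime_env.get(key, 0.0) > threshold:
            commands.append(command)
    return commands  # fed into the visual control layer for execution
```

The second control unit would then pass this command list into the visual control layer for execution.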
Various changes and specific examples of the intelligent application visualization control method based on big data in the first embodiment of fig. 1 are also applicable to the intelligent application visualization control system based on big data in this embodiment, and through the foregoing detailed description of the intelligent application visualization control method based on big data, a person skilled in the art can clearly know the implementation method of the intelligent application visualization control system based on big data in this embodiment, so details are not described herein for the sake of brevity of the description.
Example three
The electronic device of the present application is described below with reference to fig. 5.
Fig. 5 illustrates a schematic structural diagram of an electronic device according to the present application.
Based on the inventive concept of the big data based intelligent application visualization control method in the foregoing embodiments, the present invention further provides an electronic device on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any one of the foregoing big data based intelligent application visualization control methods.
In fig. 5, a bus architecture is represented by bus 300. Bus 300 may include any number of interconnected buses and bridges and links together various circuits, including one or more processors, represented by processor 302, and memory, represented by memory 304. The bus 300 may also link together various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore are not described further herein. A bus interface 305 provides an interface between the bus 300 and the receiver 301 and transmitter 303. The receiver 301 and the transmitter 303 may be the same element, i.e., a transceiver, providing a means for communicating with various other systems over a transmission medium. The processor 302 is responsible for managing the bus 300 and general processing, and the memory 304 may be used for storing data used by the processor 302 in performing operations.
The embodiment of the application provides an intelligent application visualization control method based on big data, applied to an intelligent application visualization control system based on big data that is in communication connection with a data fitting device. The method comprises the following steps: acquiring data of the first visual object through the data fitting device to obtain first modeling data; obtaining information of a first external environment of the first visual object; obtaining a first relevance by performing an environmental impact analysis on the information of the first external environment, the first relevance being the degree of impact of the first external environment on the first visual object; obtaining second modeling data if the first relevance is greater than the preset relevance; inputting the first modeling data and the second modeling data into a three-dimensional scene model, performing three-dimensional modeling on the first visual object, and outputting a visual three-dimensional scene; obtaining a visual control through a workflow analysis of the visual three-dimensional scene; and building a visual control layer according to the visual control, and performing visual control through it. This solves the technical problems in the prior art that visual modeling of the smart home is not accurate enough and that adaptive feedback according to the positioning environment is difficult, problems which affect the visualization accuracy of the smart home and the comfort of the user experience. By performing environment positioning and correlation analysis for the smart home and intelligent control on the visual control layer, accuracy and user comfort are improved.
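The full flow recapped above can be condensed into a short, self-contained Python sketch. The data shapes, the relevance score (a simple mean of environment readings), and the workflow analysis (one control per controllable device) are all illustrative assumptions standing in for the patent's unspecified internals.

```python
def environmental_impact_analysis(env_info):
    # Assumed scoring: mean of the environment readings (illustrative).
    return sum(env_info.values()) / len(env_info)

def run_pipeline(first_modeling, env_info, preset_relevance=0.6):
    """End-to-end sketch of the claimed flow under assumed data shapes."""
    # Environmental impact analysis yields the first relevance.
    first_relevance = environmental_impact_analysis(env_info)
    # Second modeling data is collected only above the preset relevance.
    second_modeling = env_info if first_relevance > preset_relevance else None
    # Three-dimensional scene model: object data plus (optional) environment data.
    scene = {"object": first_modeling, "environment": second_modeling}
    # Workflow analysis: expose one visual control per controllable device.
    controls = sorted(first_modeling.get("devices", []))
    return {"scene": scene, "control_layer": controls}
```

A strongly influential environment (mean reading above the preset) pulls the external modeling data into the scene; otherwise only the object itself is modeled.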
Those of ordinary skill in the art will understand that: the various numbers of the first, second, etc. mentioned in this application are only used for convenience of description and are not used to limit the scope of the embodiments of this application, nor to indicate an order of precedence. "And/or" describes the association relationship of the associated objects, meaning that there may be three relationships; e.g., A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one" means one or more; "at least two" means two or more. "At least one," "any," or similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. The computer may be a general purpose computer, a special purpose computer, a network of computers, or another programmable system. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The various illustrative logical units and circuits described in this application may be implemented or operated upon by general purpose processors, digital signal processors, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic systems, discrete gate or transistor logic, discrete hardware components, or any combination thereof. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing systems, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the present application and its equivalent technology, it is intended that the present application include such modifications and variations.

Claims (10)

1. The intelligent application visualization control method based on big data is applied to an intelligent application visualization control system based on big data, the system is in communication connection with a data fitting device, and the method comprises the following steps:
acquiring data of the first visual object according to the data fitting device to obtain first modeling data;
obtaining information of a first external environment of the first visual object;
obtaining a first relevance by performing environmental impact analysis on the information of the first external environment, wherein the first relevance is an impact degree of the first external environment on the first visual object;
if the first relevance is larger than the preset relevance, second modeling data are obtained;
inputting the first modeling data and the second modeling data into a three-dimensional scene model, performing three-dimensional modeling on the first visual object, and outputting a visual three-dimensional scene;
obtaining a visual control by analyzing the workflow of the visual three-dimensional scene;
and constructing a visual control layer according to the visual control, and carrying out visual control according to the visual control layer.
2. The method of claim 1, wherein the first correlation is obtained by performing an environmental impact analysis on information of the first external environment, the method further comprising:
obtaining spatial photosensitivity, spatial temperature sensitivity and air exchange of the first visual object;
configuring a first weight, a second weight, and a third weight based on the spatial light sensitivity, the spatial temperature sensitivity, and the air exchange;
performing weight calculation on the space photosensitivity, the space temperature sensitivity and the air exchange performance according to the first weight, the second weight and the third weight, and outputting a first calculation result;
and obtaining the first relevance according to the first calculation result.
3. The method of claim 2, wherein the method further comprises:
building a weight configuration model, wherein the weight configuration model comprises a data input layer, a data processing layer, a data judging layer and a data output layer, and the weight configuration model is embedded in a cloud processor;
the data input layer receives the space photosensitivity, the space temperature sensitivity and the air exchange performance, and after data receiving is finished, data standardization processing is carried out according to the data processing layer to obtain a first sequencing result;
the data judgment layer judges according to the first sequencing result and a preset database, and when the logical relation of the preset database is met, a weight configuration result is obtained;
and outputting the weight configuration result according to the data output layer.
4. The method of claim 1, wherein the method further comprises:
obtaining first external environment characteristics according to the information of the first external environment, wherein the first external environment characteristics comprise noise environment characteristics and air environment characteristics;
obtaining a first external environment influence coefficient according to the noise environment characteristic and the air environment characteristic;
and applying a gain to the first relevance according to the first external environment influence coefficient, and outputting a second relevance.
5. The method of claim 1, wherein inputting the first modeling data and the second modeling data into a three-dimensional scene model, three-dimensionally modeling the first visualization object, and outputting a visualized three-dimensional scene, the method further comprises:
obtaining a plurality of visualization perspectives of the first visualization object;
generating a plurality of external environment modeling view angles according to the plurality of visual view angles;
inputting the three-dimensional scene model according to the plurality of visualization view angles and the plurality of external environment modeling view angles, wherein the three-dimensional scene model comprises a scene calibration layer;
carrying out three-dimensional space calibration according to the scene calibration layer in the three-dimensional scene model, and outputting a space calibration result;
and outputting the visual three-dimensional scene according to the space calibration result.
6. The method of claim 1, wherein the visualization controls are obtained by workflow analysis of the visualized three-dimensional scene, the method further comprising:
performing controllable equipment connection based on the visual three-dimensional scene to obtain a plurality of control terminals;
performing primary screening according to the plurality of control terminals, and outputting an identification control terminal;
generating a plurality of visual controls according to the identification control terminal;
obtaining the use frequency and the use order of the plurality of visual controls according to a first user;
and performing step-layer screening on the plurality of visual controls according to the use frequency and the use order, and outputting the visual control.
7. The method of claim 4, wherein the system comprises a visual control feedback module, the method further comprising:
acquiring real-time data of the first external environment according to the data fitting device to obtain real-time data of the external environment;
inputting the real-time data of the external environment into the visual control feedback module, and outputting first feedback control data according to the visual control feedback module, wherein the first feedback control data is data for controlling the first visual object according to the first external environment;
and inputting the first feedback control data into the visual control layer for control.
8. An intelligent application visualization control system based on big data, the system comprising:
the first obtaining unit is used for carrying out data acquisition on the first visual object according to the data fitting device to obtain first modeling data;
a second obtaining unit configured to obtain information of a first external environment of the first visual object;
a first analysis unit, configured to perform environmental impact analysis on the information of the first external environment to obtain a first correlation, where the first correlation is a correlation impact degree of the first external environment on the first visualization object;
a third obtaining unit, configured to obtain second modeling data if the first correlation is greater than a preset correlation;
a first input unit, configured to input the first modeling data and the second modeling data into a three-dimensional scene model, perform three-dimensional modeling on the first visualization object, and output a visualized three-dimensional scene;
a fourth obtaining unit, configured to obtain a visual control by performing workflow analysis on the visual three-dimensional scene;
and the first control unit is used for building a visual control layer according to the visual control and carrying out visual control according to the visual control layer.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any of claims 1 to 7 are implemented when the program is executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-7.
CN202210489768.4A 2022-05-07 2022-05-07 Intelligent application visual control method and system based on big data Active CN114859744B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210489768.4A CN114859744B (en) 2022-05-07 2022-05-07 Intelligent application visual control method and system based on big data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210489768.4A CN114859744B (en) 2022-05-07 2022-05-07 Intelligent application visual control method and system based on big data

Publications (2)

Publication Number Publication Date
CN114859744A true CN114859744A (en) 2022-08-05
CN114859744B CN114859744B (en) 2023-06-06

Family

ID=82634763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210489768.4A Active CN114859744B (en) 2022-05-07 2022-05-07 Intelligent application visual control method and system based on big data

Country Status (1)

Country Link
CN (1) CN114859744B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080307365A1 (en) * 2007-06-08 2008-12-11 Apple Inc. Object transitions
US20100082678A1 (en) * 2008-09-30 2010-04-01 Rockwell Automation Technologies, Inc. Aggregation server with industrial automation control and information visualization placeshifting
US20100277468A1 (en) * 2005-08-09 2010-11-04 Total Immersion Method and devices for visualising a digital model in a real environment
US20140253553A1 (en) * 2012-06-17 2014-09-11 Spaceview, Inc. Visualization of three-dimensional models of objects in two-dimensional environment
CN105224605A (en) * 2015-09-02 2016-01-06 东北大学秦皇岛分校 Data under a kind of large data environment store and lookup method
CN105376125A (en) * 2015-12-08 2016-03-02 深圳众乐智府科技有限公司 Control method and device for intelligent home system
US20160379415A1 (en) * 2015-06-23 2016-12-29 Paofit Holdings Pte Ltd Systems and Methods for Generating 360 Degree Mixed Reality Environments
CN108388142A (en) * 2018-04-10 2018-08-10 百度在线网络技术(北京)有限公司 Methods, devices and systems for controlling home equipment
CN109116812A (en) * 2017-06-22 2019-01-01 上海智建电子工程有限公司 Intelligent power distribution cabinet, energy conserving system and method based on SparkStreaming
CN109272155A (en) * 2018-09-11 2019-01-25 郑州向心力通信技术股份有限公司 A kind of corporate behavior analysis system based on big data
CN109976296A (en) * 2019-05-08 2019-07-05 西南交通大学 A kind of manufacture process visualization system and construction method based on virtual-sensor
CN112465189A (en) * 2020-11-04 2021-03-09 上海交通大学 Method for predicting number of court settlement plans based on time-space correlation analysis
CN113434483A (en) * 2021-06-29 2021-09-24 无锡四维时空信息科技有限公司 Visual modeling method and system based on space-time big data
CN114237192A (en) * 2022-02-28 2022-03-25 广州力控元海信息科技有限公司 Digital factory intelligent control method and system based on Internet of things
CN114266167A (en) * 2021-12-27 2022-04-01 卡斯柯信号有限公司 Visual modeling method, medium and electronic device for train operation basic environment
US11314493B1 (en) * 2021-02-19 2022-04-26 Rockwell Automation Technologies, Inc. Industrial automation smart object inheritance

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MENGYA ZHENG et al.: "STARE: Augmented Reality Data Visualization for Explainable Decision Support in Smart Environments", IEEE Access *
LIU Ge et al.: "An Environmental Quality Data Visualization and Visual Analysis System", Computer and Modernization, no. 01
SUN Xiaohua et al.: "IoT Data Presentation and Interaction Based on Augmented Reality Technology", Packaging Engineering, no. 20
YIN Hong et al.: "Improvement of the LANDMARC Algorithm for a Train Inspection Yard Visual Monitoring System", Journal of Lanzhou Jiaotong University, no. 04
HUANG Jingli: "Application of 3D Digitization Technology in the Digital Factory", China Management Informationization, no. 01

Also Published As

Publication number Publication date
CN114859744B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
DE112005001761T5 (en) A system, method and apparatus for determining and using a location of wireless devices or a wireless network enhancement infrastructure
AU2010337288A1 (en) Method and system for enterprise building automation
KR20140077361A (en) Apparatus and method for suporting floor plan of a building
CN110634014A (en) Method, device, equipment and medium for determining house source price
CN104881022A (en) Monitoring system based on substation digital model
CN116467788A (en) Barrier-free environment construction management method and system
CN114554503B (en) Networking information confirmation method and device and user equipment
CN114859744B (en) Intelligent application visual control method and system based on big data
CN115174416B (en) Network planning system, method and device and electronic equipment
US9568502B2 (en) Visual analytics of spatial time series data using a pixel calendar tree
Chen et al. Smart camera placement for building surveillance using OpenBIM and an efficient Bi-level optimization approach
CN212112557U (en) Manufacturing management integrated information system
CN115271475A (en) Intelligent park business standard model design framework based on Internet of things
CN114091133A (en) City information model modeling method and device, terminal equipment and storage medium
CN113192178B (en) House source information processing method, device and system
CN111062633A (en) Power transmission and transformation line and equipment state evaluation system based on multi-source heterogeneous data
CN113141570A (en) Underground scene positioning method and device, computing equipment and computer storage medium
CN111128357A (en) Monitoring method and device for hospital logistics energy consumption target object and computer equipment
CN114509043B (en) Spatial object coding method, device, equipment and medium
CN111859503A (en) Drawing review method, electronic equipment and graphic server
US20230180019A1 (en) Automated design, installation and validation of a wireless network
CN114937207A (en) Method and device for detecting residential base compliance
CN114418140A (en) 5G-fused power infrastructure co-construction sharing support method and system
CN116881861A (en) Determination method, medium and equipment for establishing constraint parameters of associated event
de Oliveira Virtual digital twin validation in terms of thermal comfort: a case study

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant