CN116091739A - Method, device, equipment and medium for intelligent enterprise management based on mixed reality

Method, device, equipment and medium for intelligent enterprise management based on mixed reality

Info

Publication number: CN116091739A
Application number: CN202310368865.2A
Authority: CN (China)
Prior art keywords: enterprise, information, mode, intelligent management, mixed reality
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 蔡辉
Current and Original Assignee: Sanluo Technology Beijing Co ltd
Application filed by Sanluo Technology Beijing Co ltd
Priority to CN202310368865.2A
Publication of CN116091739A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Abstract

The invention discloses an enterprise intelligent management method, device, equipment and medium based on mixed reality, belonging to the technical field of mixed reality. The method for intelligent enterprise management based on mixed reality comprises the following steps: establishing an enterprise intelligent management database, and establishing audit relations among the elements in the enterprise intelligent management database; acquiring multimodal information of enterprise personnel and/or enterprise scenes through a multimodal sensor; performing association analysis on the multimodal information and the enterprise intelligent management database to form mixed reality output content; and transmitting the output content to the corresponding multimedia terminal. Because the method forms mixed reality output content from multimodal information, enterprise staff can conveniently and efficiently acquire information and interact with it at the multimedia terminal, which improves the working efficiency and user experience of enterprise employees, enterprise founders, stakeholders, managers, third-party consulting companies and the like, and thereby improves the intelligent management and overall efficiency of the enterprise.

Description

Method, device, equipment and medium for intelligent enterprise management based on mixed reality
Technical Field
The invention relates to the technical field of mixed reality, in particular to an enterprise intelligent management method, device, equipment and medium based on mixed reality.
Background
Mixed Reality (MR) creates a visual world in which the real world and the virtual world are combined, physical entities and digital objects coexist and can interact in real time, enhancing the acquisition of information about real objects. In other words, mixed reality is a synthesis of Virtual Reality (VR) and Augmented Reality (AR). By presenting virtual scene information in the real scene, mixed reality creates an interactive feedback information loop between the real world, the virtual world and the user. Properly applied, mixed reality can empower the working schemes of enterprise scenes and the office modes of enterprise employees. How to realize intelligent enterprise management through mixed reality has therefore become a problem to be solved in the industry.
Disclosure of Invention
The invention provides a method and a device for intelligent enterprise management based on mixed reality, which are used to improve enterprise efficiency and reduce enterprise cost while providing a lean and personalized technical solution for enterprise management.
According to a first aspect of an embodiment of the present invention, there is provided a method for enterprise intelligent management based on mixed reality, including: establishing an enterprise intelligent management database, and establishing an audit relation among elements in the enterprise intelligent management database;
acquiring multi-mode information of enterprise personnel and/or enterprise scenes through a multi-mode sensor;
performing association analysis on the multi-mode information and the enterprise intelligent management database to form mixed reality output content;
and sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal can exchange multimedia information through an interactive interface of the output content.
In one embodiment, the building an enterprise intelligent management database and establishing an audit relation between elements in the enterprise intelligent management database includes: establishing an enterprise intelligent management database in a static mode or a dynamic mode, wherein the enterprise intelligent management database comprises any one or more of a technical archive, an enterprise personalized knowledge management graph, an industry knowledge graph, an industry case management database and an industry talent archive; and establishing an audit relation and/or cross-element keywords among all elements in the enterprise intelligent management database.
In one embodiment, the acquiring, by the multi-modal sensor, multi-modal information of the enterprise personnel and/or the enterprise scenario includes: the multi-mode sensor comprises any one or more of a positioning device, a navigation device, an inertial measurement device, a touch sensing device, a visual sensing device, a millimeter wave radar device, an auditory sensing device, a flexible sensing device and a temperature sensing device; determining a personnel multi-mode acquisition mode of a multi-mode sensor according to position information of enterprise personnel, wherein the personnel multi-mode acquisition mode comprises any one or more of physical state information, psychological state information, emotional state information and fatigue state information; determining a scene multi-mode acquisition mode of the multi-mode sensor according to environmental information of an enterprise scene; and acquiring the multi-mode information of enterprise personnel and/or enterprise scenes by a personnel multi-mode acquisition mode and/or a scene multi-mode acquisition mode of the multi-mode sensor.
In one embodiment, the performing association analysis on the multimodal information and the enterprise intelligent management database to construct output content of mixed reality includes: performing association analysis on the multi-mode information and the enterprise intelligent management database to obtain analysis contents in a text information mode; converting the analysis content of the text information mode into the analysis content of the multimedia mode; the method comprises the steps of fusing analysis contents in a multimedia mode with real scenes of enterprise scenes through mixed reality to form output contents in mixed reality, wherein the output contents are provided with at least one display interface, options capable of realizing real-time interaction are arranged in the display interface, and the number of the options is at least one.
In one embodiment, further comprising:
obtaining enterprise information, the enterprise information including any one or more of an enterprise type, an enterprise industry, and an enterprise development stage; and adjusting the output content of the mixed reality according to the enterprise information.
In one embodiment, the performing association analysis on the multimodal information and the enterprise intelligent management database to form output content of mixed reality includes:
W_i = x_i · f(a, b, Q), where f denotes the association analysis
wherein the symbol W_i is the i-th output content; the symbol a is the multimodal information of enterprise personnel; the symbol b is the multimodal information of the enterprise scene; the symbol Q is the enterprise intelligent management database; and the symbol x_i is the correction coefficient of the i-th output content.
In one embodiment, further comprising:
the real-time interactions include any one or more of text, touch operations, speech, limb movements, eye tracking, brain waves, and facial expressions.
According to a second aspect of the embodiment of the present invention, there is provided an apparatus for enterprise intelligent management based on mixed reality, including:
the building module is used for building an enterprise intelligent management database and building an audit relation among all elements in the enterprise intelligent management database;
The acquisition module is used for acquiring multi-mode information of enterprise personnel and/or enterprise scenes through the multi-mode sensor;
the construction module is used for carrying out association analysis on the multi-mode information and the enterprise intelligent management database and constructing output content of mixed reality;
and the sending module is used for sending the output content to a corresponding multimedia terminal, and the multimedia terminal can interact multimedia information through the interaction interface of the output content.
In one embodiment, further comprising: the building module, the obtaining module, the constructing module and the sending module are controlled to execute the method for intelligent enterprise management based on mixed reality in any of the above embodiments.
According to a third aspect of embodiments of the present invention there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method as provided in the first aspect when the program is executed.
According to a fourth aspect of embodiments of the present invention there is also provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method as provided by the first aspect.
In summary, the invention provides a method and a device for enterprise intelligent management based on mixed reality, wherein the method comprises the following steps: establishing an enterprise intelligent management database, and establishing audit relations among the elements in the enterprise intelligent management database; acquiring multimodal information of enterprise personnel and/or enterprise scenes through a multimodal sensor; performing association analysis on the multimodal information and the enterprise intelligent management database to form mixed reality output content; and sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal can exchange multimedia information through an interactive interface of the output content. Because the invention forms mixed reality output content in multimedia form from the multimodal information, enterprise staff can conveniently and efficiently acquire information and interact through the interactive interface at the multimedia terminal, which improves the working efficiency and user experience of enterprise employees, enterprise founders, stakeholders, managers, third-party consulting companies and the like.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of a method for intelligent enterprise management based on mixed reality according to an exemplary embodiment of the invention;
FIG. 2 is a flowchart illustrating a method for intelligent enterprise management based on mixed reality according to an exemplary embodiment of the present invention at step S11;
FIG. 3 is a flowchart illustrating a method for enterprise intelligent management based on mixed reality according to an exemplary embodiment of the present invention at step S12;
FIG. 4 is a flowchart illustrating a method for intelligent enterprise management based on mixed reality according to an exemplary embodiment of the present invention at step S13;
FIG. 5 is a flow chart of another method for intelligent enterprise management based on mixed reality, according to an exemplary embodiment of the invention;
FIG. 6 is a block diagram of an apparatus for enterprise intelligent management based on mixed reality, according to an exemplary embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present application are described in detail below to make the objects, technical solutions and advantages of the present application more apparent, and to further describe the present application in conjunction with the accompanying drawings and the detailed embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative of the application and are not intended to be limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by showing examples of the present application.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
Mixed Reality (MR) creates new environments and visualizations by combining the real world and the virtual world, in which physical entities and digital objects coexist and can interact in real time to simulate real objects. Mixed reality is a composite of Virtual Reality (VR) and Augmented Reality (AR). It is a further development of virtual reality technology that enhances the realism of the user experience by presenting virtual scene information in the real scene and by building an interactive feedback information loop between the real world, the virtual world and the user. Mixed reality directly affects the working schemes of enterprise scenes and the office modes of enterprise employees, and the overall efficiency of an enterprise can be significantly improved with its support. In many enterprise scenarios, the progress of business is hampered because enterprise employees lack a way to obtain business-related information in a timely manner. Moreover, in the conventional mode, enterprise employees obtain information only in text form and cannot instantly screen out the content associated with the current business from a mass of complicated information. The method and the device of the present application use mixed reality to display the content associated with the current business vividly and efficiently; that is, information in multimedia form is fused with the real scene by means of mixed reality. For example, a menu bar containing a plurality of virtual keys, each corresponding to a function of a product, may be displayed at the right edge of the product. For another example, the product may be described in text, picture or video form in an area in front of or beside it, suspended semi-transparently over the surface of the object. Through the interactive operation function in mixed reality, enterprise employees can obtain further detailed content after an interactive operation such as a touch operation.
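For illustration only, the following minimal Python sketch shows one way such a floating menu could be positioned beside a detected product; the bounding-box format, the offset value and the class names are assumptions made for this example and are not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    # Axis-aligned box of the detected product in scene coordinates (metres).
    x_min: float
    y_min: float
    z_min: float
    x_max: float
    y_max: float
    z_max: float

@dataclass
class PanelPose:
    x: float
    y: float
    z: float
    opacity: float  # panel is rendered semi-transparently

def place_menu_panel(product: BoundingBox, margin: float = 0.05) -> PanelPose:
    """Place a virtual menu panel just outside the product's right edge,
    vertically and depth-wise centred, so it floats beside the real object."""
    return PanelPose(
        x=product.x_max + margin,                 # right edge side of the product
        y=(product.y_min + product.y_max) / 2.0,  # vertical centre
        z=(product.z_min + product.z_max) / 2.0,  # depth centre
        opacity=0.5,
    )

if __name__ == "__main__":
    box = BoundingBox(0.0, 0.0, 0.0, 0.4, 0.3, 0.2)
    print(place_menu_panel(box))
```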
Fig. 1 is a flowchart illustrating a method for mixed reality based enterprise intelligent management, as shown in fig. 1, according to an exemplary embodiment, comprising the following steps S11-S14:
in step S11, an enterprise intelligent management database is built, and an audit relation among all elements in the enterprise intelligent management database is built;
in step S12, acquiring multi-modal information of enterprise personnel and/or enterprise scenes by a multi-modal sensor;
in step S13, performing association analysis on the multimodal information and the enterprise intelligent management database to form output content of mixed reality;
in step S14, the output content is sent to a corresponding multimedia terminal, and the multimedia terminal can exchange multimedia information through the interactive interface of the output content.
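A minimal procedural sketch of steps S11 to S14 is given below for orientation only; the function names, data shapes and placeholder values are illustrative assumptions and do not describe the patented implementation.

```python
from typing import Any, Dict, List

def build_management_database() -> Dict[str, Any]:
    """S11: assemble the enterprise intelligent management database and record
    audit relations between its elements (illustrative structure only)."""
    elements = {"technology_archive": {}, "industry_knowledge_graph": {}}
    audit_relations = [("technology_archive", "industry_knowledge_graph")]
    return {"elements": elements, "audit_relations": audit_relations}

def acquire_multimodal_info(sensors: List[str]) -> Dict[str, Any]:
    """S12: collect multimodal information of personnel and/or scene (stubbed)."""
    return {"personnel": {"fatigue": "low"}, "scene": {"type": "meeting"}}

def association_analysis(info: Dict[str, Any], db: Dict[str, Any]) -> Dict[str, Any]:
    """S13: correlate the multimodal information with the database and wrap the
    result as mixed-reality output content with an interactive interface."""
    return {"analysis": {"matched_elements": list(db["elements"])},
            "interface": {"options": ["details", "history"]}}

def send_to_terminal(output: Dict[str, Any], terminal: str) -> None:
    """S14: push the output content to the multimedia terminal."""
    print(f"sending {output['interface']['options']} to {terminal}")

if __name__ == "__main__":
    db = build_management_database()
    info = acquire_multimodal_info(["camera", "microphone"])
    output = association_analysis(info, db)
    send_to_terminal(output, "smart-glasses-01")
```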
In one embodiment, the enterprise intelligent management database includes any one or more of a technology archive, an enterprise personalized knowledge management profile, an industry knowledge profile, an industry case management library, an industry talent archive. In the enterprise intelligent management database, all elements are stored and managed in a database form, and in order to better provide comprehensive and multiple reference information for users, an audit relation and cross-element keywords are required to be established among all elements.
Multimodal information of the enterprise personnel is acquired by the multimodal sensor, and the multimodal information of the enterprise personnel comprises any one or more of physical state information, emotional state information, body temperature data, heart rate information, psychological state information, fatigue state information and physical disease conditions. The multimodal information of the enterprise personnel can provide the enterprise with important information about its personnel beyond working ability and work experience, so that intelligent management of the enterprise is better realized. Before the output content is displayed, it needs to be desensitized; in particular, when sensitive information of enterprise personnel is involved, the sensitive information is desensitized through a preset desensitization algorithm and a desensitization key before being displayed using mixed reality. For example, the age, diseases, birthday, marital status and confidential work information of enterprise employees are all sensitive information and need to be desensitized in advance. In addition, to acquire the body-related part of the multimodal information of enterprise personnel, a flexible sensor that can be firmly attached to the human body is used. The flexible sensor can stably and continuously detect body data despite the limb movements and multi-joint motions of enterprise employees. For example, when an enterprise employee makes a large limb movement, a rigid sensor cannot perform effective body temperature monitoring and may even fall off.
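The desensitization step could, for instance, be pictured with the following sketch; the field names, the masking rule and the key handling are assumptions chosen for illustration and stand in for the preset desensitization algorithm and key mentioned above.

```python
import hashlib
from typing import Dict

# Illustrative list of personnel fields treated as sensitive information.
SENSITIVE_FIELDS = {"age", "disease", "birthday", "marital_status", "confidential_work"}

def desensitize(record: Dict[str, str], key: str) -> Dict[str, str]:
    """Mask sensitive fields with a keyed hash before the record is rendered
    in mixed reality; only an opaque token is ever displayed."""
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            digest = hashlib.sha256((key + value).encode("utf-8")).hexdigest()
            masked[field] = f"***{digest[:8]}"
        else:
            masked[field] = value
    return masked

if __name__ == "__main__":
    employee = {"name": "employee_017", "age": "42", "fatigue_state": "normal"}
    print(desensitize(employee, key="demo-desensitization-key"))
```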
The multimode sensor is used for acquiring the multimode information of the enterprise scene, and the multimode information of the enterprise scene is classified into specific types. For example, the multimodal information of the conference scenario of one software development team includes any one or more of progress status of the task to be processed, respective work tasks of participants of the task to be processed, resources required for the task to be processed, history of similar tasks of the task to be processed, future progress plans of the task to be processed. For another example, the multi-modal information of the production scenario of one plant line includes any one or more of a process link of a product, a commodity parameter of the product, a yield of the product, a process level of the product, and an associated process of the product. In addition, the multi-mode information can be applied to multiple scenes such as a research and development scene, a factory management scene, a financial scene, a supply chain scene, a law scene, a business management scene and the like of an enterprise. The multi-mode information of the scene can provide important reference information for the enterprise scene, so that the intellectualization and efficiency of enterprise intelligent management are improved.
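To picture how scene multimodal information might be grouped into scene-specific categories, a small lookup sketch follows; the category names simply restate the examples above, and the dictionary structure itself is an assumption for illustration.

```python
from typing import Dict, List

# Illustrative mapping from enterprise scene type to the categories of
# multimodal information collected and displayed for that scene.
SCENE_CATEGORIES: Dict[str, List[str]] = {
    "meeting": [
        "task_progress_status",
        "participant_work_tasks",
        "required_resources",
        "similar_task_history",
        "future_progress_plan",
    ],
    "production_line": [
        "process_links",
        "commodity_parameters",
        "product_yield",
        "process_level",
        "associated_processes",
    ],
}

def categories_for_scene(scene_type: str) -> List[str]:
    """Return the information categories relevant to a given enterprise scene."""
    return SCENE_CATEGORIES.get(scene_type, [])

if __name__ == "__main__":
    print(categories_for_scene("meeting"))
```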
The method comprises the steps that cross-element keywords are arranged in an enterprise intelligent management database, and link relations are arranged among elements of the enterprise intelligent management database. And carrying out association analysis on the multi-mode information and the enterprise intelligent management database, and combining with the mixed reality technology to form mixed reality output content.
The multimodal sensor can obtain any one or more of location information, intent information and task performance information of an enterprise employee. A suitable position around a target object in the real scene is selected, a virtual interface is generated there, and special effects are displayed to achieve a prompting effect or an emphasis effect. The content in the virtual interface can be displayed statically or with dynamic effects. The mixed reality output content can be used to guide enterprise employees in analysis and decision making. A preset intelligent algorithm makes predictions based on historical data and big data patterns. When the predicted result reaches a preset warning condition, the system actively gives an early-warning prompt in multimedia form in the mixed reality and offers a better response scheme, so that enterprise employees can perform the corresponding operation.
In one embodiment, according to the specific job position information of enterprise employees, the method can feed back different mixed reality output content to employees holding different positions in the same enterprise scene. For example, sales personnel focus more on sales volume information, customer information and the like; managers focus more on engineering progress, materials and the like; engineers focus more on construction technology, construction schemes and the like; and the general manager focuses more on contract value, overall arrangement and the like. In addition, the mixed reality output content can be modified according to the intention of the enterprise employee. For example, if a certain enterprise employee intends to obtain the schedule information of a project's progress, the mixed reality output content of that project is displayed prominently in the multimedia information.
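The position-dependent feedback described above can be illustrated with the following sketch, where the mapping from job position to content priorities is an assumption restating the examples in this paragraph rather than part of the disclosure.

```python
from typing import Dict, List

# Illustrative priorities per job position, following the examples above.
ROLE_PRIORITIES: Dict[str, List[str]] = {
    "sales": ["sales_volume", "customer_info"],
    "manager": ["engineering_progress", "materials"],
    "engineer": ["construction_technology", "construction_scheme"],
    "general_manager": ["contract_value", "overall_arrangement"],
}

def select_output(all_content: Dict[str, str], position: str) -> Dict[str, str]:
    """Keep only the items of the mixed-reality output content that match the
    priorities of the given job position."""
    wanted = ROLE_PRIORITIES.get(position, list(all_content))
    return {key: all_content[key] for key in wanted if key in all_content}

if __name__ == "__main__":
    content = {"sales_volume": "120 units", "engineering_progress": "65%",
               "contract_value": "USD 1.2M"}
    print(select_output(content, "sales"))
```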
The output content is sent to the corresponding multimedia terminal, and the multimedia terminal can exchange multimedia information through the interactive interface of the output content. The multimedia terminal includes smart glasses, smart contact lenses, smartphones, smart bracelets, personal computers, tablet computers, smart 3D projectors and the like, which can visually display the output content. Smart glasses, smart contact lenses or other smart devices with a human-eye detection function can judge whether the user has selected a piece of interactive content in the mixed reality according to the focusing condition of the user's eyes. When human vision focuses on a near object, the sharpness of distant objects decreases, so the sharpness of virtual content at different distances in the mixed reality can be automatically adjusted according to the current focal distance, keeping the virtual content consistent with real objects. The method can also simulate the actual environment by means of a digital twin and present it in virtual form, and the mixed reality output content can be displayed together with the digital-twin virtual environment. The mixed reality output content contains at least one interactive interface, and the interactive interface contains at least one piece of interactive multimedia content, for example a menu containing a plurality of options, a clickable button, a draggable slider, a scalable image display frame, and the like. The specific interaction mode can be determined according to the hardware of the multimedia terminal, and includes voice, touch, eye tracking, gesture recognition, expression analysis, body posture recognition, brain wave analysis and the like. Based on deep learning and big data analysis, the system can identify the specific meaning of an instruction issued by the user and carry out the corresponding interaction. In addition to the subjective intention of the enterprise user, the system can also, based on historical related data and the predictions of the intelligent algorithm, infer the user's potential intention and the information the user needs to be aware of, and present them as brief prompts. If the user selects any one of the prompts, the detailed multimedia content of that prompt is further displayed.
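One way to picture the focus-dependent sharpness adjustment mentioned above is the sketch below; the blur model and the parameter values are illustrative assumptions, not the algorithm disclosed in this application.

```python
def blur_radius(content_distance_m: float, eye_focus_distance_m: float,
                sensitivity: float = 2.0) -> float:
    """Return a blur radius (in pixels) for a piece of virtual content so that
    its sharpness roughly matches real objects at the same distance: content far
    from the user's current focal distance is rendered progressively softer."""
    # Defocus is taken to grow with the difference in dioptres (1 / distance).
    defocus = abs(1.0 / content_distance_m - 1.0 / eye_focus_distance_m)
    return sensitivity * defocus

if __name__ == "__main__":
    # Eyes focused at 0.5 m: nearby virtual labels stay sharp, distant ones blur.
    for d in (0.5, 1.0, 3.0):
        print(f"content at {d} m -> blur {blur_radius(d, 0.5):.2f} px")
```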
In one embodiment, content of different dimensions may also be displayed on different screens by way of a multi-screen display. And displaying the content of the core requirements of the enterprise users on the main screen, and displaying the content in a targeted manner on other sub-screens aiming at each relevant content focused by the enterprise users. The content corresponding to the intent of the enterprise user is displayed in a significant manner, regardless of the screen.
And sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal can interact with the multimedia information through an interactive interface. Multimedia includes any one or more of text, sound, planar image, three-dimensional image, moving picture, video.
According to the above technical scheme, multimodal information is obtained through the multimodal sensor, the multimodal information is combined with the enterprise intelligent management database, output content is obtained through analysis, the output content is displayed on the multimedia terminal in mixed reality form, and corresponding multimedia information can be obtained through interaction with the interactive interface in the output content. Through this technical scheme, the multimedia content that an enterprise user is interested in can be obtained efficiently and conveniently, which improves the working efficiency of enterprise employees, enterprise founders, stakeholders, managers, third-party consulting companies and the like, and also significantly improves the working experience of enterprise employees. Meanwhile, the improved working efficiency of many enterprise employees, enterprise founders, stakeholders, managers, third-party consulting companies and the like also benefits the overall operation of the enterprise.
In one embodiment, as shown in FIG. 2, step S11 includes the following steps S21-S22:
in step S21, an enterprise intelligent management database is established in a static manner or a dynamic manner, and the enterprise intelligent management database comprises any one or more of a technical archive, an enterprise personalized knowledge management graph, an industry knowledge graph, an industry case management library and an industry talent archive;
in step S22, an audit relation and/or cross-element keywords between elements in the enterprise intelligent management database are established.
In one embodiment, the enterprise realizes the multidimensional audit relations of the enterprise intelligent management database by means of the enterprise's historical data and by interfacing with external information, where the interfacing with external information can be realized through an application programming interface. The enterprise intelligent management database assembled in the static mode comprises a technology-level database, a customer-level database, a supply-chain database, an industry-level database, a financial database, a legal database and the like. The enterprise intelligent management database assembled in the dynamic mode comprises an enterprise growth database, an enterprise merger-and-acquisition database, an enterprise branch/subsidiary database, an enterprise milestone database, an enterprise business database, an enterprise production database, an enterprise technology database, an enterprise testing database and the like. The industry talent archive is realized by dynamically updating, over a long period, any one or more of periodic work summaries, KPI completion, job position changes, work post changes, personal illness information, mental state information, personal heart information summaries and personal multidimensional dynamic graphs (personal career growth, personal physical and mental health in multiple modalities, personal work achievements and the like).
Regarding the update mode, the static mode updates the enterprise intelligent management database with a large data volume at a time, at intervals of fixed length; the dynamic mode updates the enterprise intelligent management database in real time with small data volumes. The enterprise intelligent management database can comprise a plurality of elements, each element being a database, and at least comprises any one or more of a technology archive, an enterprise personalized knowledge management graph, an industry knowledge graph, an industry case management library and an industry talent archive. Each element in the enterprise intelligent management database is analysed to obtain the audit relations among the elements, and common keywords shared by some elements, namely cross-element keywords, can also be obtained; the cross-element keywords can be used to associate all the elements of the whole intelligent management database.
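A minimal sketch of the two update modes is given below; the batch interval, the record shapes and the method names are assumptions made for illustration only.

```python
import time
from typing import Dict, List

class ManagementDatabase:
    """Toy container standing in for the enterprise intelligent management database."""

    def __init__(self) -> None:
        self.records: List[Dict[str, str]] = []

    def bulk_update(self, batch: List[Dict[str, str]]) -> None:
        """Static mode: one large batch is ingested at each fixed interval."""
        self.records.extend(batch)

    def realtime_update(self, record: Dict[str, str]) -> None:
        """Dynamic mode: small records are ingested as soon as they arrive."""
        self.records.append(record)

def run_static_mode(db: ManagementDatabase, interval_s: float,
                    batches: List[List[Dict[str, str]]]) -> None:
    for batch in batches:
        db.bulk_update(batch)
        time.sleep(interval_s)  # wait for the next fixed-length interval

if __name__ == "__main__":
    db = ManagementDatabase()
    run_static_mode(db, interval_s=0.01,
                    batches=[[{"element": "industry_case", "id": "1"}],
                             [{"element": "industry_case", "id": "2"}]])
    db.realtime_update({"element": "talent_archive", "kpi": "updated"})
    print(len(db.records), "records")
```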
In one embodiment, as shown in FIG. 3, step S12 includes the following steps S31-S33:
in step S31, a person multi-mode acquisition mode of the multi-mode sensor is determined according to job position information of enterprise personnel, wherein the person multi-mode acquisition mode includes any one or more of physical state information, psychological state information, emotional state information and fatigue state information;
In step S32, determining a scene multi-mode acquisition mode of the multi-mode sensor according to environmental information of an enterprise scene;
in step S33, the multi-mode information of the enterprise personnel and/or the enterprise scene is obtained by the personnel multi-mode acquisition mode and/or the scene multi-mode acquisition mode of the multi-mode sensor.
In one embodiment, the plurality of sensors includes any one or more of a temperature sensor, a flexible sensor, a humidity sensor, a millimeter wave sensor, a brain wave kit, a muscle sensor, a cardiac sensor, a barometric sensor, a gravity sensor.
And determining a personnel multi-mode acquisition mode of the multi-mode sensor according to the position information of the enterprise personnel.
The multimodal information of enterprise personnel is acquired through the acquisition mode of the multimodal sensor, wherein the multimodal information of the enterprise personnel comprises any one or more of physical state information, emotional state information, body temperature data, heart rate information, psychological state information, fatigue state information and physical disease conditions. Occupational event information, periodic work report summaries, skill characteristic information, comprehensive evaluation information and the like of the enterprise employee are acquired from the enterprise intelligent management database, a multidimensional analysis of the work task to be executed by the enterprise employee is carried out in combination with the multimodal information of the enterprise employee, and an evaluation is fed back, wherein the evaluation comprises the matching degree of the enterprise employee to the task, task execution suggestions and warnings, the expected task progress, work task quality evaluation, resource input conditions and the like. The multimodal information of enterprise personnel can provide the enterprise with important information beyond working ability and work experience, so that intelligent management of the enterprise is better realized. Before the output content is displayed, it needs to be desensitized; in particular, when sensitive information of enterprise personnel is involved, the sensitive information is desensitized through a preset desensitization algorithm and a desensitization key before being displayed using mixed reality. For example, the age, diseases, birthday, marital status and confidential work information of enterprise employees are all sensitive information and need to be desensitized in advance. Sensitive information of the enterprise also needs to be desensitized, so that privacy identification at the system level and privacy boundary management are realized. In addition, to acquire the body-related parts of the multimodal information of enterprise personnel, a flexible sensor needs to be firmly attached to the human body; only a flexible sensor can stably and continuously detect body data despite the limb movements and multi-joint motions of enterprise employees. For example, for a body temperature monitor that measures the temperature of the armpit, when an enterprise employee makes a large limb movement, a rigid sensor cannot perform effective body temperature monitoring and may even fall off.
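The multidimensional evaluation fed back for a pending task could look roughly like the following sketch; the scoring scheme, thresholds and feature names are purely illustrative assumptions and stand in for the analysis described above.

```python
from typing import Dict

def evaluate_task_assignment(person: Dict[str, float], task: Dict[str, float]) -> Dict[str, object]:
    """Combine archive information (skill level) with multimodal state (fatigue)
    into a simple matching score, a warning and an expected progress, standing
    in for the multidimensional analysis described above."""
    skill_match = 1.0 - abs(person["skill_level"] - task["required_skill"])
    fatigue_penalty = 0.5 * person["fatigue"]   # tired employees score lower
    score = max(0.0, skill_match - fatigue_penalty)
    return {
        "matching_degree": round(score, 2),
        "warning": "consider reassignment or rest" if score < 0.4 else "",
        "expected_progress": "on schedule" if score >= 0.6 else "at risk",
    }

if __name__ == "__main__":
    person = {"skill_level": 0.8, "fatigue": 0.3}
    task = {"required_skill": 0.7}
    print(evaluate_task_assignment(person, task))
```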
A scene multimodal acquisition mode of the multimodal sensor is determined according to the environmental information of the enterprise scene. The multimodal information of the enterprise scene is acquired through the acquisition mode of the multimodal sensor, and the multimodal information of the enterprise scene is classified into specific types. For example, the multimodal information of the meeting scene of a software development team includes any one or more of the progress status of the task to be processed, the respective work tasks of the participants in the task to be processed, the resources required for the task to be processed, the history of similar tasks, and the future progress plan of the task to be processed. For another example, the multimodal information of the production scene of a factory line includes any one or more of the process links of a product, the commodity parameters of the product, the yield of the product, the process level of the product and the associated processes of the product. In addition, the multimodal information can be applied to multiple enterprise scenes such as a research and development scene, a factory management scene, a financial scene, a supply chain scene, a legal scene and a business management scene. The scene multimodal information can provide important reference information for the enterprise scene, thereby improving the intelligence and efficiency of enterprise intelligent management.
In one embodiment, as shown in FIG. 4, step S13 includes the following steps S41-S43:
in step S41, performing association analysis on the multimodal information and the enterprise intelligent management database to obtain analysis content in a text information manner;
in step S42, the text information type analysis content is converted into the multimedia type analysis content;
in step S43, the analysis content of the multimedia mode and the real scene of the enterprise scene are fused through mixed reality to form an output content of mixed reality, the output content has at least one display interface, the display interface has options capable of real-time interaction, and the number of the options is at least one.
In one embodiment, before the multimodal information and the enterprise intelligent management database are subjected to association analysis, data cleaning, data federation, data stream integration, data planning, data presentation and data privacy analysis need to be performed on the multimodal information through big data analysis tools and a deep learning model. The data in the database are stored in text form. For the storage of a picture, in addition to the pixel data of the picture, the image features, keywords and semantics of the picture are stored, all of which are in text form. The storage of other multimedia information is similar to the case of pictures and will not be repeated. In order to suit the display mode of mixed reality, analysis content in multimedia form needs to be formed, so the analysis content in text form is converted through a preset intelligent algorithm. The conversion covers not only the text and data of the text-form analysis content but also the multimedia content associated with it. The display scene of the enterprise scene is analysed to obtain a region suitable for displaying the multimedia analysis content, and the multimedia analysis content is displayed in that region. The region includes not only the real object itself in the display scene but also the edge area of the real object, where semi-transparent multimedia analysis content can be suitably superimposed.
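The conversion from text-form analysis content to a multimedia record, and the choice of a display region, might be pictured as in the following sketch; the record fields, the keyword heuristic and the region rule are assumptions for illustration, not the preset intelligent algorithm itself.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MultimediaContent:
    """Analysis content after conversion: the original text plus the multimedia
    assets associated with it, all indexed by text as described above."""
    text: str
    keywords: List[str]
    image_refs: List[str] = field(default_factory=list)
    video_refs: List[str] = field(default_factory=list)

def to_multimedia(analysis_text: str) -> MultimediaContent:
    """Stand-in for the preset intelligent algorithm that enriches text-form
    analysis content with associated multimedia assets."""
    keywords = [w for w in analysis_text.split() if len(w) > 4][:5]
    return MultimediaContent(text=analysis_text, keywords=keywords,
                             image_refs=[f"img://{k}" for k in keywords[:2]])

def choose_display_region(object_edge_x: float, panel_width: float = 0.3) -> Tuple[float, float]:
    """Pick a strip along the edge of the real object where a semi-transparent
    panel can be shown without covering the object itself."""
    return (object_edge_x, object_edge_x + panel_width)

if __name__ == "__main__":
    content = to_multimedia("progress report indicates delayed procurement of materials")
    print(content.keywords, choose_display_region(0.4))
```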
And fusing the analysis content of the multimedia mode with the reality scene of the enterprise scene to form the output content of the mixed reality. Meanwhile, in order to distinguish reality and virtual, the output content comprises at least one display interface, wherein the display interface is provided with options capable of realizing real-time interaction, and the number of the options is at least one. The edges and fill portions of the presentation interface may be integrated with the environment without affecting the resolution of the output content. The real-time interactions include any one or more of text, touch operations, speech, limb movements, eye tracking, brain waves, automatic capturing of polynary nerves, and facial expressions. The real-time interactive monitoring hardware comprises an infrared monitoring device, a millimeter wave/centimeter wave radar, a temperature monitoring device, a humidity monitoring device, a positioning monitoring device, a nerve sensing device, a gravity monitoring device, a deep learning camera, a brain wave monitoring device, a muscle sensing device, a heart rate/electrocardio sensing device, a barometric pressure sensing device, a numerical control system device, an industrial software excavating device, a numerical value integrating device, a navigation monitoring device, an inertia monitoring device, a touch sensing device, a sound sensing device, a flexible sensing device and the like.
Performing association analysis on the multi-modal information and the enterprise intelligent management database to form output content of mixed reality, wherein the method comprises the following steps:
W_i = x_i · f(a, b, Q), where f denotes the association analysis
wherein the symbol W_i is the i-th output content; the symbol a is the multimodal information of enterprise personnel; the symbol b is the multimodal information of the enterprise scene; the symbol Q is the enterprise intelligent management database; and the symbol x_i is the correction coefficient of the i-th output content. The above formula expresses that the multimodal information of enterprise personnel is combined with the multimodal information of the enterprise scene, association analysis is performed with the enterprise intelligent management database, and a correction coefficient x_i is set in consideration of the influence of the enterprise information, so as to obtain the mixed reality output content.
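Reading the formula as the correction coefficient x_i scaling the result of an association-analysis function over a, b and Q, a minimal numerical sketch could look as follows; the concrete form of the association function is an assumption, since the application only defines the symbols.

```python
from typing import Callable, Dict, List

def output_contents(a: Dict[str, float], b: Dict[str, float], q: Dict[str, float],
                    corrections: List[float],
                    analyse: Callable[[Dict[str, float], Dict[str, float], Dict[str, float]], float]) -> List[float]:
    """Compute W_i = x_i * f(a, b, Q) for each output item i, where f is the
    association analysis of personnel info a, scene info b and database Q."""
    base = analyse(a, b, q)
    return [x_i * base for x_i in corrections]

if __name__ == "__main__":
    # Toy association analysis: overlap between the combined (a, b) keys and
    # the database keys, used purely to make the example runnable.
    def toy_analysis(a, b, q):
        combined = set(a) | set(b)
        return len(combined & set(q)) / max(len(q), 1)

    a = {"fatigue": 0.2}
    b = {"task_progress": 0.7}
    q = {"fatigue": 1.0, "task_progress": 1.0, "materials": 1.0}
    print(output_contents(a, b, q, corrections=[1.0, 0.8], analyse=toy_analysis))
```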
In one embodiment, as shown in FIG. 5, the method further comprises the following steps S51-52:
in step S51, enterprise information is acquired, the enterprise information including any one or more of enterprise type, enterprise industry, and enterprise development stage;
in step S52, the output content of the mixed reality is adjusted according to the enterprise information.
In one embodiment, enterprise information is acquired, the enterprise information including any one or more of enterprise type, enterprise industry and enterprise development stage. Enterprise types include monopoly enterprises, red-ocean enterprises, blue-ocean enterprises and so on. Monopoly enterprises are characterised by irreplaceability and high profit; red-ocean enterprises have lower gross margins and limited room for growth; and blue-ocean enterprises value high-speed development more than current profit, so the mixed reality output content also needs to be adapted to the different enterprise types. Enterprise industries include internet enterprises, energy enterprises, biotechnology enterprises, pharmaceutical enterprises, electric vehicle enterprises, chip enterprises and the like. The operating strategies of different types of enterprises differ greatly, and the mixed reality output content needs to be adjusted accordingly. The development stage of an enterprise is divided into an introduction stage, a growth stage, a saturation stage and a decline stage. The introduction stage refers to a new product being put on the market: customers know nothing about the product, only a few customers who pursue novelty may purchase it, and sales are low. To expand sales, a large amount of promotional expense is required to promote the product. At this stage the product cannot be mass-produced for technical reasons, so the cost is high, sales grow slowly, and the enterprise cannot obtain profit and may even lose money. The growth stage means that customers are familiar with the product, a large number of new customers begin to buy it, and the market gradually expands. The product is produced in batches, the production cost falls relatively, the enterprise's sales grow rapidly, and profit also grows rapidly. Competitors, seeing the profitability, enter the market and compete, the supply of the same product increases, prices fall, the enterprise's profit growth gradually slows, and finally the highest profit point of the life cycle is reached. The saturation stage refers to market demand tending towards saturation: there are few potential customers, sales grow slowly and then turn downward, indicating that the product has entered maturity. Competition gradually intensifies, the selling price of the product falls, promotion expenses rise, and the enterprise's profit declines. The decline stage refers to the situation in which, with the development of science and technology, new products or substitutes appear, customers' consumption habits change and they turn to other products, and the sales and profit of the original product fall rapidly; the product then enters the decline stage. It can be seen that for different stages of enterprise development the business strategy of the enterprise also has different tendencies, and the mixed reality output content needs to be adjusted adaptively.
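To illustrate how the output content could be adapted to the development stage, the following sketch weights a few content categories per stage; the weights and category names are assumptions chosen for this example and are not taken from the disclosure.

```python
from typing import Dict

# Illustrative emphasis of content categories per enterprise development stage.
STAGE_WEIGHTS: Dict[str, Dict[str, float]] = {
    "introduction": {"promotion_cost": 0.9, "production_cost": 0.7, "profit": 0.2},
    "growth":       {"sales_growth": 0.9, "competitor_entry": 0.6, "profit": 0.7},
    "saturation":   {"price_pressure": 0.8, "promotion_cost": 0.7, "profit": 0.5},
    "decline":      {"substitute_products": 0.9, "exit_planning": 0.8, "profit": 0.3},
}

def adjust_output(content: Dict[str, str], stage: str) -> Dict[str, str]:
    """Reorder mixed-reality output items so that categories emphasised for the
    current development stage are displayed first."""
    weights = STAGE_WEIGHTS.get(stage, {})
    ordered = sorted(content, key=lambda k: weights.get(k, 0.0), reverse=True)
    return {k: content[k] for k in ordered}

if __name__ == "__main__":
    items = {"profit": "-2%", "promotion_cost": "high", "sales_growth": "slow"}
    print(list(adjust_output(items, "introduction")))
```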
Mixed reality can present the entire content of the process in virtual form. The process includes the enterprise life cycle, product types, product procedures, product upgrade requirements, enterprise management requirements and the like. The enterprise life cycle covers factors external and internal to the enterprise, specifically external factors such as policy changes, market conditions and competing enterprises, and internal factors such as internal driving force, performance, supply chain and customers. The product types comprise the type and model of each enterprise product and the internally independent and cross-cutting attributes of the product processes. The product procedures comprise the bill of materials, processes, capacity, sales, inventory, cost and the like. The product upgrade requirements comprise product technology, research and development testing, function upgrades, higher-level replacement and the like. Enterprise management requirements include cost, efficiency, environmental management, talent configuration planning, employee training, legal management, digital and automated management, industrial upgrading and the like.
The mixed reality output content may also be adjusted according to product lifecycle, technology lifecycle, and enterprise lifecycle. The life cycle of the product needs to be analyzed by considering factors such as customer cognition and education cost, self product development ecology, domestic and foreign competitive products, new materials and the like; the technology life cycle needs to consider the factors such as technology globalization development, enterprise technology realization mode and cost, bottom chain, technology upgrading, research and development, test and the like; the enterprise lifecycle needs to consider factors such as hardware lifecycle, maintenance lifecycle, site resource management, environment, lean n+ management mode, etc.
Further, in-depth analysis needs to consider enterprise lifecycle, product type, product process, product upgrade requirements, and enterprise management requirements. Specifically, the enterprise lifecycle is affected by external policies and environments, enterprise internal drive, enterprise performance, industry competition, upstream and downstream supply chains, customers, and other factors; the product type comprises all types of products and the technological condition of each type of product, and whether the technological condition is internal independent or crossed attribute or not; the product procedures comprise bill of materials, procedures, corresponding productivity, sales and inventory maximization configuration and cost; the product upgrading requirement refers to a research and development stage, a test stage, an upgrade version stage and a superior substitution stage; enterprise management requirements include cost, efficiency, environmental management, talent configuration planning and training, legal management, digital and automated management, industrial upgrades, and the like.
In one embodiment, FIG. 6 is a block diagram illustrating an apparatus for enterprise intelligent management based on mixed reality, according to an exemplary embodiment. As shown in fig. 6, the apparatus includes a building module 61, an acquiring module 62, a construction module 63, and a sending module 64.
The building module 61 is configured to build an enterprise intelligent management database, and build an audit relation between elements in the enterprise intelligent management database;
The acquiring module 62 is configured to acquire multi-modal information of enterprise personnel and/or enterprise scenes through multi-modal sensors;
the construction module 63 is configured to perform association analysis on the multimodal information and the enterprise intelligent management database to form the mixed reality output content;
the sending module 64 is configured to send the output content to a corresponding multimedia terminal, where the multimedia terminal may perform interaction of multimedia information through an interaction interface of the output content.
The building module 61, the acquiring module 62, the construction module 63 and the sending module 64 included in the apparatus for enterprise intelligent management based on mixed reality are controlled to execute the method for enterprise intelligent management based on mixed reality described in any of the above embodiments.
As shown in fig. 7, the present invention provides an electronic device 700, including: a processor 701 and a memory 702 storing computer program instructions;
when the processor 701 executes the computer program instructions, an enterprise intelligent management database is built, and an audit relation among all elements in the enterprise intelligent management database is built; acquiring multi-mode information of enterprise personnel and/or enterprise scenes through a multi-mode sensor; performing association analysis on the multi-mode information and the enterprise intelligent management database to form mixed reality output content; and sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal can interact multimedia information through an interactive interface of the output content.
The invention provides a computer readable storage medium, wherein computer program instructions are stored on the computer readable storage medium, and when the computer program instructions are executed by a processor, an enterprise intelligent management database is built, and a auditing relation among elements in the enterprise intelligent management database is built; acquiring multi-mode information of enterprise personnel and/or enterprise scenes through a multi-mode sensor; performing association analysis on the multi-mode information and the enterprise intelligent management database to form mixed reality output content; and sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal can interact multimedia information through an interactive interface of the output content.
It is to be understood that the specific features, operations and details described herein before with respect to the method of the invention may also be similarly applied to the apparatus and system of the invention, or vice versa. In addition, each step of the method of the present invention described above may be performed by a corresponding component or unit of the apparatus or system of the present invention.
It is to be understood that the various modules/units of the apparatus of the invention may be implemented in whole or in part by software, hardware, firmware, or a combination thereof. Each module/unit may be embedded in the processor of the computer device in hardware or firmware form or independent of the processor, or may be stored in the memory of the computer device in software form for the processor to call to perform the operations of each module/unit. Each module/unit may be implemented as a separate component or module, or two or more modules/units may be implemented as a single component or module.
In one embodiment, a computer device is provided that includes a memory and a processor, the memory having stored thereon computer instructions executable by the processor, the computer instructions, when executed by the processor, directing the processor to perform the steps of the method of the embodiments of the invention. The computer device may be broadly a server, a terminal, or any other electronic device having the necessary computing and/or processing capabilities. In one embodiment, the computer device may include a processor, memory, network interface, communication interface, etc. connected by a system bus. The processor of the computer device may be used to provide the necessary computing, processing and/or control capabilities. The memory of the computer device may include a non-volatile storage medium and an internal memory. The non-volatile storage medium may have an operating system, computer programs, etc. stored therein or thereon. The internal memory may provide an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface and communication interface of the computer device may be used to connect and communicate with external devices via a network. Which when executed by a processor performs the steps of the method of the invention.
The present invention may be implemented as a computer readable storage medium having stored thereon a computer program which, when executed by a processor, causes steps of a method of an embodiment of the present invention to be performed. In one embodiment, a computer program is distributed over a plurality of computer devices or processors coupled by a network such that the computer program is stored, accessed, and executed by one or more computer devices or processors in a distributed fashion. A single method step/operation, or two or more method steps/operations, may be performed by a single computer device or processor, or by two or more computer devices or processors. One or more method steps/operations may be performed by one or more computer devices or processors, and one or more other method steps/operations may be performed by one or more other computer devices or processors. One or more computer devices or processors may perform a single method step/operation or two or more method steps/operations.
Those of ordinary skill in the art will appreciate that the method steps of the present invention may be implemented by a computer program, which may be stored on a non-transitory computer readable storage medium, to instruct related hardware such as a computer device or a processor, which when executed causes the steps of the present invention to be performed. Any reference herein to memory, storage, database, or other medium may include non-volatile and/or volatile memory, as the case may be. Examples of nonvolatile memory include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), flash memory, magnetic tape, floppy disk, magneto-optical data storage, hard disk, solid state disk, and the like. Examples of volatile memory include Random Access Memory (RAM), external cache memory, and the like.
The technical features described above may be arbitrarily combined. Although not all possible combinations of features are described, any combination of features should be considered to be covered by the description provided that such combinations are not inconsistent.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (10)

1. A method for enterprise intelligent management based on mixed reality, characterized by comprising the following steps:
establishing an enterprise intelligent management database, and establishing an audit relation among elements in the enterprise intelligent management database;
acquiring multi-mode information of enterprise personnel and/or enterprise scenes through a multi-mode sensor;
performing association analysis on the multi-mode information and the enterprise intelligent management database to form mixed reality output content;
and sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal is capable of interacting with multimedia information through an interactive interface of the output content.
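By way of non-limiting illustration, the following minimal Python sketch outlines one possible arrangement of the four steps of claim 1; every name in it (EnterpriseDatabase, build_database, collect_multimodal, associate, send_to_terminal) is hypothetical, and the keyword matching stands in for whatever association analysis the claim actually covers.

    # Illustrative only: hypothetical names, toy keyword matching in place of
    # the claimed association analysis.
    from dataclasses import dataclass, field

    @dataclass
    class EnterpriseDatabase:
        elements: dict = field(default_factory=dict)        # element name -> text content
        audit_relations: list = field(default_factory=list) # (element, audited element) pairs

    def build_database() -> EnterpriseDatabase:
        # Step 1: establish the database and an audit relation between two elements.
        db = EnterpriseDatabase()
        db.elements = {"technical_archive": "welding process spec v3",
                       "industry_knowledge_graph": "welding standards graph"}
        db.audit_relations = [("industry_knowledge_graph", "technical_archive")]
        return db

    def collect_multimodal(sensors: dict) -> dict:
        # Step 2: read every configured multi-mode sensor once.
        return {name: read() for name, read in sensors.items()}

    def associate(info: dict, db: EnterpriseDatabase) -> dict:
        # Step 3: keep database elements that share a word with the sensor readings.
        words = set(str(info).replace("'", " ").replace(":", " ").split())
        hits = {k: v for k, v in db.elements.items() if words & set(v.split())}
        return {"overlay": hits, "options": list(hits) or ["dismiss"]}

    def send_to_terminal(output: dict, terminal) -> None:
        # Step 4: push the mixed-reality output content to the multimedia terminal.
        terminal(output)

    sensors = {"camera": lambda: {"activity": "welding"},
               "gps": lambda: {"zone": "workshop"}}
    send_to_terminal(associate(collect_multimodal(sensors), build_database()), print)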
2. The method of claim 1, wherein building the enterprise intelligent management database and establishing the audit relation among elements in the enterprise intelligent management database comprises:
establishing an enterprise intelligent management database in a static mode or a dynamic mode, wherein the enterprise intelligent management database comprises any one or more of a technical archive, an enterprise personalized knowledge management graph, an industry knowledge graph, an industry case management database and an industry talent archive;
and establishing an audit relation and/or cross-element keywords among all elements in the enterprise intelligent management database.
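A minimal sketch of the data structures named in claim 2, assuming a simple dictionary-based layout; the element names, the audit-relation pairs, and the cross-element keyword index are illustrative only.

    # Hypothetical layout: elements, audit relations, cross-element keyword index.
    from collections import defaultdict

    elements = {
        "technical_archive": "welding process specification v3",
        "industry_case_database": "welding defect case 2022",
        "industry_talent_archive": "certified welding engineer roster",
    }

    # Audit relation: which element is reviewed against which other element.
    audit_relations = [("industry_case_database", "technical_archive")]

    # Cross-element keywords: keyword -> elements in which it occurs.
    keyword_index = defaultdict(list)
    for name, text in elements.items():
        for word in text.split():
            keyword_index[word].append(name)

    print(keyword_index["welding"])  # all three elements share this keyword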
3. The method of claim 1, wherein the acquiring, by the multi-mode sensor, multi-mode information of enterprise personnel and/or enterprise scenes comprises:
the multi-mode sensor comprises any one or more of a positioning device, a navigation device, an inertial measurement device, a touch sensing device, a visual sensing device, a millimeter wave radar device, an auditory sensing device, a flexible sensing device and a temperature sensing device;
determining a personnel multi-mode acquisition mode of the multi-mode sensor according to position information of enterprise personnel, wherein the personnel multi-mode acquisition mode comprises any one or more of physical state information, psychological state information, emotional state information and fatigue state information;
determining a scene multi-mode acquisition mode of the multi-mode sensor according to environmental information of an enterprise scene;
and acquiring the multi-mode information of enterprise personnel and/or enterprise scenes by a personnel multi-mode acquisition mode and/or a scene multi-mode acquisition mode of the multi-mode sensor.
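One hedged reading of claim 3 in Python: the acquisition mode is chosen from location and environment information. The decision rules and channel names below are assumptions, not part of the claim.

    # Assumed decision rules; channel names are illustrative only.
    def person_acquisition_mode(location: str) -> list:
        # On a production line, physical and fatigue state matter most.
        if location == "production_line":
            return ["physical_state", "fatigue_state"]
        return ["psychological_state", "emotional_state"]

    def scene_acquisition_mode(environment: dict) -> list:
        channels = ["visual"]
        if environment.get("noisy"):
            channels.append("millimeter_wave_radar")  # robust to acoustic noise
        if environment.get("temperature_critical"):
            channels.append("temperature")
        return channels

    print(person_acquisition_mode("production_line"))
    print(scene_acquisition_mode({"noisy": True, "temperature_critical": False}))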
4. The method of claim 1, wherein performing association analysis on the multi-mode information and the enterprise intelligent management database to form the mixed reality output content comprises:
performing association analysis on the multi-mode information and the enterprise intelligent management database to obtain analysis content in a text information mode;
converting the analysis content in the text information mode into analysis content in a multimedia mode;
fusing the analysis content in the multimedia mode with a real scene of the enterprise scene through mixed reality to form the mixed reality output content, wherein the output content is provided with at least one display interface, and at least one option enabling real-time interaction is arranged in the display interface;
the real-time interaction includes any one or more of text, touch operation, speech, limb movement, eye tracking, brain waves, and facial expression.
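A sketch of the three stages of claim 4, with placeholder conversion and fusion steps (a real system would use speech synthesis, rendering and a mixed-reality runtime); all function names are hypothetical.

    # Hypothetical three-stage pipeline; conversion and fusion are placeholders.
    def analyze_to_text(multi_mode_info: dict, database: dict) -> str:
        # Stage 1: association analysis yielding content in a text information mode.
        matches = [name for name in database if name in str(multi_mode_info)]
        return "Related records: " + ", ".join(matches) if matches else "No related records."

    def text_to_multimedia(text: str) -> dict:
        # Stage 2: stand-in conversion; a real system would call TTS / rendering engines.
        return {"caption": text, "audio": "tts://" + text}

    def fuse_with_scene(media: dict, scene_id: str) -> dict:
        # Stage 3: mixed-reality output with one display interface and two options.
        return {"scene": scene_id,
                "display_interface": media,
                "options": ["open_record", "dismiss"],
                "interaction_channels": ["voice", "eye_tracking", "touch"]}

    output = fuse_with_scene(
        text_to_multimedia(analyze_to_text({"task": "welding"}, {"welding": "spec"})),
        "workshop-01")
    print(output["options"])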
5. The method as recited in claim 1, further comprising:
obtaining enterprise information, the enterprise information including any one or more of an enterprise type, an enterprise industry, and an enterprise development stage;
and adjusting the output content of the mixed reality according to the enterprise information.
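A sketch of claim 5 under assumed adjustment rules; which fields are changed for which enterprise type, industry or development stage is purely illustrative.

    # Assumed adjustment rules keyed on enterprise type, industry and stage.
    def adjust_output(output: dict, enterprise: dict) -> dict:
        adjusted = dict(output)
        if enterprise.get("stage") == "startup":
            adjusted["options"] = ["funding_cases"] + adjusted.get("options", [])
        if enterprise.get("industry") == "manufacturing":
            adjusted["display_interface"] = {"layout": "equipment_overlay",
                                             **adjusted.get("display_interface", {})}
        return adjusted

    base = {"options": ["open_record"], "display_interface": {"caption": "daily brief"}}
    print(adjust_output(base, {"type": "private", "industry": "manufacturing",
                               "stage": "startup"}))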
6. The method of claim 1, wherein performing association analysis on the multi-mode information and the enterprise intelligent management database to form the mixed reality output content comprises:
[formula image not reproduced in the source text]
wherein W_i is the output content; a is the multi-mode information of enterprise personnel; b is the multi-mode information of the enterprise scene; Q is the enterprise intelligent management database; and x_i is the correction coefficient of the i-th output content.
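The formula image itself is not reproduced above, so the following sketch only shows one plausible reading consistent with the symbol definitions: each output content W_i as a correction coefficient x_i applied to some association score between a, b and the i-th database element of Q. This is an assumption, not the claimed formula.

    # Assumption, not the claimed formula: W_i = x_i * f(a, b, Q_i), where f is
    # some association score between the multi-mode inputs and database element i.
    def association_score(a: dict, b: dict, element: str) -> float:
        # Toy 0/1 relevance: does the element name appear in the sensor readings?
        return float(element in str(a) + str(b))

    def output_content(a: dict, b: dict, Q: list, x: list) -> list:
        return [x[i] * association_score(a, b, Q[i]) for i in range(len(Q))]

    W = output_content({"person": "operator"}, {"scene": "workshop"},
                       ["workshop", "finance"], [1.0, 0.5])
    print(W)  # [1.0, 0.0] with these toy inputs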
7. The method as recited in claim 1, further comprising:
with the support of big data, predicting a result of the association analysis of the multi-mode information and the enterprise intelligent management database through a preset intelligent algorithm to obtain a predicted result;
and when the predicted result reaches a preset warning condition, displaying a multimedia early-warning prompt in the mixed reality output content, together with a coping scheme for the early-warning prompt.
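A sketch of claim 7 with an assumed "preset intelligent algorithm" (a toy linear extrapolation) and an assumed warning threshold; the coping scheme text is likewise illustrative.

    # Assumed prediction and warning condition; not the claimed algorithm.
    from typing import Optional

    def predict(history: list) -> float:
        # Toy "preset intelligent algorithm": linear extrapolation of the last two points.
        if len(history) >= 2:
            return history[-1] + (history[-1] - history[-2])
        return history[-1]

    def early_warning(history: list, threshold: float) -> Optional[dict]:
        forecast = predict(history)
        if forecast >= threshold:
            return {"prompt": f"predicted value {forecast:.1f} reaches warning condition {threshold}",
                    "coping_scheme": "reassign staff and reschedule the production plan"}
        return None

    print(early_warning([70, 80, 90], threshold=95.0))  # a warning dict is expected here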
8. An enterprise intelligent management device based on mixed reality, which is characterized by comprising:
the building module is used for building an enterprise intelligent management database and building an audit relation among all elements in the enterprise intelligent management database;
the acquisition module is used for acquiring multi-mode information of enterprise personnel and/or enterprise scenes through the multi-mode sensor;
the construction module is used for carrying out association analysis on the multi-mode information and the enterprise intelligent management database to form mixed reality output content;
and the sending module is used for sending the output content to a corresponding multimedia terminal, wherein the multimedia terminal is capable of interacting with multimedia information through the interactive interface of the output content.
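A sketch of how the four modules of claim 8 could be wired together; the class, parameter and field names are hypothetical.

    # Hypothetical wiring of the four claimed modules into one device object.
    class MixedRealityManagementDevice:
        def __init__(self, build, acquire, construct, send):
            self.build = build          # building module
            self.acquire = acquire      # acquisition module
            self.construct = construct  # construction module
            self.send = send            # sending module

        def run(self):
            database = self.build()
            info = self.acquire()
            output = self.construct(info, database)
            self.send(output)

    device = MixedRealityManagementDevice(
        build=lambda: {"technical_archive": "spec"},
        acquire=lambda: {"scene": "workshop"},
        construct=lambda info, db: {"overlay": list(db), "options": ["dismiss"]},
        send=print,
    )
    device.run()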
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method for enterprise intelligent management based on mixed reality according to any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method for enterprise intelligent management based on mixed reality according to any one of claims 1 to 7.
CN202310368865.2A 2023-04-07 2023-04-07 Method, device, equipment and medium for intelligent enterprise management based on mixed reality Pending CN116091739A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310368865.2A CN116091739A (en) 2023-04-07 2023-04-07 Method, device, equipment and medium for intelligent enterprise management based on mixed reality

Publications (1)

Publication Number Publication Date
CN116091739A true CN116091739A (en) 2023-05-09

Family

ID=86204879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310368865.2A Pending CN116091739A (en) 2023-04-07 2023-04-07 Method, device, equipment and medium for intelligent enterprise management based on mixed reality

Country Status (1)

Country Link
CN (1) CN116091739A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206863575U (en) * 2017-07-12 2018-01-09 北京大合创新科技发展有限公司 A kind of industrial equipment holographic intelligent management system based on mixed reality technology
US20220036302A1 (en) * 2019-11-05 2022-02-03 Strong Force Vcn Portfolio 2019, Llc Network and data facilities of control tower and enterprise management platform with adaptive intelligence
CN115129879A (en) * 2021-03-24 2022-09-30 北京智数天下科技有限公司 Method for constructing enterprise relational knowledge base based on knowledge graph
CN113435850A (en) * 2021-07-02 2021-09-24 神华上航疏浚有限责任公司 Intelligent enterprise management system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Cao Qi: "Research on an Industrial Design Evaluation System for Multimodal Human-Computer Interaction in an MR Environment", China Master's Theses Full-text Database, Basic Sciences Series (Monthly), pages 7-24 *
Guo Shaoyi (ed.): "Introduction to Mechanical Engineering (2nd Edition)", Huazhong University of Science and Technology Press, pages 178-187 *

Similar Documents

Publication Publication Date Title
US11836338B2 (en) System and method for building and managing user experience for computer software interfaces
US11429351B2 (en) Methods and systems for building custom automation workflows to integrate multiple applications
Healy et al. An exploration of product advantage and its antecedents in SMEs
US11521143B2 (en) Supply chain disruption advisor
US11587342B2 (en) Using attributes for identifying imagery for selection
US11507908B2 (en) System and method for dynamic performance optimization
CN110134806A (en) The method and system of context user profile photo selection
US11888801B2 (en) Systems and methods for message filtering
US20220058583A1 (en) Project management systems and methods
US20230325909A1 (en) Systems and methods for recommending 2d image
US11693923B1 (en) Robotic process automation system with hybrid workflows
US20240095680A1 (en) Administration services for compensation platforms
CN116091739A (en) Method, device, equipment and medium for intelligent enterprise management based on mixed reality
US10474996B2 (en) Workflow management system platform
US11270253B2 (en) Cognitive procurement
Ris et al. Digital transformation handbook
US20220067109A1 (en) Cognitive automation platform
Jemala Systemic outlook in technology-management trends of best technology/ICT companies
Juric Oracle CX Cloud Suite: Deliver a seamless and personalized customer experience with the Oracle CX Suite
Al-Harbi et al. TQM Metrics in Mobile Applications Industry
Shamray DIGITALIZATION OPPORTUNITIES FOR MURMANSK TRAWL FLEET COMPANY
Knapp et al. Governance and Organisation Project Maturity
Zhygalova Managerial Aspects of Business Intelligence Implementation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination