WO2020162943A1 - Holistic intelligence and autonomous information system and method thereof - Google Patents

Holistic intelligence and autonomous information system and method thereof

Info

Publication number
WO2020162943A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
module
layer
intelligence
entity
Prior art date
Application number
PCT/US2019/017126
Other languages
French (fr)
Inventor
Krishnakumar Arumugham THOGAMALAI
Original Assignee
Thogamalai Krishnakumar Arumugham
Priority date
Filing date
Publication date
Application filed by Thogamalai Krishnakumar Arumugham filed Critical Thogamalai Krishnakumar Arumugham
Priority to PCT/US2019/017126 priority Critical patent/WO2020162943A1/en
Publication of WO2020162943A1 publication Critical patent/WO2020162943A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/067Enterprise or organisation modelling

Definitions

  • the second set of modules comprising a rendering module configured for receiving at least one of one or more types of output from all the modules and layers of the computing system and transforming the intelligent recommendations into one or more elements. These elements may include, but are not limited to, visual elements, signals, etc. Further, the rendering module may deliver the output to the at least one entity in collaboration with the interface and presentation module.
  • the advantageous feature of the system is to provide wisdom in a multitude of fields, structured and delivered to provide unified intelligence that is integrated with the business to empower the organization.
  • the system performs every activity using intelligence. That is, the system understands every aspect required to perform the particular activity intelligently.
  • the system considers and combines multiple approaches such as model driven, domain driven, data driven approaches etc. for performing the said activity.
  • the above-mentioned building blocks can also be transferred through any medium into the system.
  • the system also provides reusable design patterns and automatically maps the patterns to create forms for the matters, for creating projects of different kinds such as practical, conceptual, logical, functional, etc.
  • the system leverages its proprietary comprehensive storyboarding, articulations functionalities and methods to capture the creative users' thoughts, imaginations, requirements, actions, narrations, gestures and transforms those into designs, models and implementation for a final product or project required by the user.
  • the system may also provide the ability to work with quantum computers by converting data into integration-friendly formats.
  • the Omni Presence Layer is enabled with a bot which uses Natural Language Processing, Machine Learning techniques and Metadata that is gathered automatically from the system operations, as well as the metadata that is input by various SMEs. This helps the users to work collaboratively on all phases of the system.
  • the inputs can be provided by the users, other applications, external devices, etc.
  • the system is capable of receiving input of any kind in different formats and can store them in a unified structure.
  • the system can collaborate with application data or system data and the data obtained from the role-based communication between the users in order to generate highly intelligent and accurate analytics, which helps in improving the performance of the system.
  • the feature uses state-of-the-art recommendation algorithms that combine the metadata of the dataset and a user's preferences and interests with those of similar users in the same domain working on similar datasets and visualizations.
  • Auto Visualizations visualizes the data with better animations and insights, which help the user to make actionable decisions for the business organization.
  • the system creates insights from the visualizations by analyzing the patterns and detecting outliers/ anomalies based on the distribution of the data points.
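The distribution-based outlier detection mentioned above can be sketched, for example, with the classic interquartile-range (Tukey) rule; the function name and the `k=1.5` threshold are illustrative assumptions, not the patent's actual algorithm.

```python
def iqr_outliers(points, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule)."""
    data = sorted(points)
    n = len(data)

    def quantile(q):
        # simple quartile estimate by linear interpolation
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return data[lo] + (pos - lo) * (data[hi] - data[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [x for x in points if x < low or x > high]

print(iqr_outliers([10, 12, 11, 13, 12, 95, 11]))  # [95]
```

A production system would likely use a library routine (e.g. a robust statistics package) rather than hand-rolled quantiles, but the pruning logic is the same.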
  • the Cognition Module of the Intelligence Module understands the context, user preferences and provides intelligence to the various components of the System.
  • the cognitive module acts like consciousness, with subconscious programs, neural networks, processors and memory, to provide spontaneous and optimal intelligence for complex scenarios. This could be applicable to decision making at higher levels of organizations by aggregating lower-level grains of information across the organization in varied disciplines, subject matters and subject areas.
  • human resources productivity is based on many factors such as psychological, physiological, environmental and motivational factors; however, while the energy required to complete a task can be measured, figuring out the possible causes for low or high energy relies on taking the other factors into consideration as well.
  • mathematics, logic and analytics pertaining to the corresponding scientific areas need to be applied to complete the analysis and provide comprehensive and appropriate solutions.
  • the Orchestration Module enables the connection of different modules in the Omni Sense Layer and the co-ordination of the execution of processes, functions, methods, etc.
  • the Omniscience Layer provides the knowledge required for generating intelligence and the know-how for the System to function and to evolve over time.
  • the Omniscience Layer comprises the Time Machine Module, Library Module and Evolution Module.
  • the Time Machine Module provides the ability to look back at the past and can also make predictions based on the past.
  • the Time Machine Module considers various factors such as time zone, fiscal calendar, calendar holidays, information, communications and collaboration, which are integrated with cross-functional aspects of the organizations to make timely decisions or to provide suggestions. It is also capable of providing timely triggers for the different layers in the system.
  • the Library Module provides various resources such as Language Libraries, Business Case Libraries, Code Libraries, etc. to the different Layers of the System.
  • the system intelligently caches the information based on the most requested or used queries and responses to make the Response to Request cycle faster.
  • the Search functions of the System look up layers of intelligence, such as Subject Matter Expertise, Internal Intelligence and External Intelligence, residing in the Library Module, and intelligently understand the context, intent, connotation and intonation of the requests, processing the response dynamically from the cache.
  • the System also produces results from the cache/ temporary Memory (In-Memory), if the requested results are already processed.
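The cache-hit behavior described above resembles simple memoization: serve an already-processed result from in-memory storage, otherwise process and remember it. The sketch below is illustrative; the class and method names are assumptions, not the patent's implementation.

```python
class ResponseCache:
    """Serve already-processed results from in-memory (temporary) storage."""

    def __init__(self):
        self._store = {}

    def get_or_compute(self, query, compute):
        # return the cached result if this query was already processed
        if query in self._store:
            return self._store[query]
        result = compute(query)   # otherwise process and remember the result
        self._store[query] = result
        return result

cache = ResponseCache()
calls = []

def slow_process(q):
    calls.append(q)           # track how often real processing happens
    return q.upper()

cache.get_or_compute("total sales", slow_process)
cache.get_or_compute("total sales", slow_process)  # served from cache
print(len(calls))  # 1 — the query was processed only once
```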
  • the Session Module handles global sessions based on user logs and also validates the user session for every internal or external request or response.
  • the Metadata Management involves the use of a unique model and data structure to store the information of data, which includes objects, relationships, properties, IDs, etc., in an all-in-one format.
  • the Log Management Module captures all transactions and their changes into logs or system audits for operations such as rollbacks, undo, redo, purge, etc.
  • the Data File Management provides the ability to save the data in various formats including documents, media (audio, video, etc.), etc. as it can convert the entire document into a file format quickly.
  • the Data File Management stores the images and videos in the form of a binary string of 0s and 1s and converts the string back to images and videos at the time of retrieval.
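The binary round trip described above can be illustrated as follows. This is a sketch only: a real system would more likely store raw bytes or a base64 encoding than a textual bit string, which inflates storage eightfold.

```python
def to_bit_string(data: bytes) -> str:
    """Encode raw bytes as a textual string of 0s and 1s."""
    return "".join(format(b, "08b") for b in data)

def from_bit_string(bits: str) -> bytes:
    """Convert the bit string back to the original bytes."""
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

image_bytes = b"\x89PNG"            # e.g. the first bytes of a PNG header
bits = to_bit_string(image_bytes)
assert from_bit_string(bits) == image_bytes  # lossless round trip
print(bits[:8])  # "10001001" — the byte 0x89
```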
  • FIG. 2 illustrates a block diagram for a computer-implemented system to empower and enhance performance of an entity, in accordance with an embodiment of the present disclosure
  • FIG. 3 illustrates an exemplary information corresponding to a Library Module, in accordance with an embodiment of the present invention
  • Search Driven Analytics in the system allows the users to ask questions in a natural language which gets converted to data queries upon which analytics is performed.
  • the Enrichment Module 214 receives it from the Integration module 208 and segregates it using separators with the help of the Syntactic library 304 from the Library module 248. Further, the Enrichment Module 214 cleanses and synergizes the input to maintain the accuracy and the integrity of the data based on pre-defined set of rules and intelligence derived from the pattern learning and cognition provided by the Cognition Module 238.
  • the Enrichment Module 214 will iterate the process of receiving and validating the input by understanding the context, connotation, intonation, the intent, etc. using the Cognition Module 238 till it receives sufficient input from the user. For example, if the user provides an incomplete sketch, the Enrichment Module 214 recognizes what the user desires to sketch and completes the sketch using visual, key-stroke geometry, language libraries, etc. from the Library module 248. In another example, the Enrichment Module 214 associates the input with the context, the role of the user, the direction of discussion, etc. in a group discussion.
  • the Micro- Service Integration Module 230 manages user subscriptions and restricts the user access to the different layers of the system 106.
  • the Micro-Service Integration Module 230 also helps the system 106 to work with independent layers, components, modules, etc. If the command is for data storage, the Deciding layer 112 will redirect to the Data Layer 118 through the Communication module 228. If the request or command is for query, analytics, etc. the Deciding layer 112 will redirect to the Omni Sense Layer 114 through the Communication module 228.
  • the above said process is performed with the help of Multithreading and Parallelism techniques. Queries will be routed to specific data stores and structures depending on the nature of the input type, format, query, etc. If it is a select, aggregate or analytical read-only type of query, it will be routed to efficient data structures, while operations like update, delete and insert may be routed to other proprietary data structures.
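The read-versus-write routing with parallel request handling can be sketched with a small dispatcher run on a thread pool. The store names, operation keywords and pool size are illustrative assumptions, not the patent's proprietary structures.

```python
from concurrent.futures import ThreadPoolExecutor

READ_OPS = {"select", "aggregate", "analyze"}
WRITE_OPS = {"insert", "update", "delete"}

def route(query: str) -> str:
    """Pick a target store based on the leading operation keyword."""
    op = query.split()[0].lower()
    if op in READ_OPS:
        return "read_optimized_store"    # e.g. a columnar/analytical structure
    if op in WRITE_OPS:
        return "write_optimized_store"   # e.g. a row/log-structured structure
    return "general_store"

queries = ["SELECT revenue", "INSERT order", "AGGREGATE sales"]
# handle multiple requests in parallel, as the passage above suggests
with ThreadPoolExecutor(max_workers=4) as pool:
    targets = list(pool.map(route, queries))
print(targets)
```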
  • the validated queries then reach the caching layer, in which a graph structure is maintained where the queries are mapped with the information used to retrieve the data from the database, based on the metadata provided by the Metadata Management module.
  • initially, the cache layer will be empty.
  • the system directly goes to the database, retrieves the information and maps the queries in the caching layer graph using dynamic binary tree structures and graph coloring algorithms.
  • a ranking algorithm is applied which ranks the queries based on the time and frequency of access.
  • the System prunes the graph by deleting the inactive query nodes when the graph size crosses the cache buffer size.
  • the Caching layer prioritizes the most frequently asked queries and keeps the corresponding query nodes in the caching layer for faster access.
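The ranking-and-pruning behavior of the caching layer (score query nodes by access frequency and recency, then delete the lowest-ranked nodes once the buffer size is exceeded) can be sketched as below. The scoring rule, buffer size, and flat dictionary in place of the patent's graph structure are all illustrative assumptions.

```python
import time

class RankedQueryCache:
    def __init__(self, max_size=3):
        self.max_size = max_size
        self.nodes = {}   # query -> [result, access_count, last_access]

    def put(self, query, result):
        self.nodes[query] = [result, 1, time.monotonic()]
        self._prune()

    def get(self, query):
        node = self.nodes.get(query)
        if node is None:
            return None               # miss: caller falls back to the database
        node[1] += 1                  # frequency of access
        node[2] = time.monotonic()    # time of access
        return node[0]

    def _rank(self, query):
        _, count, last = self.nodes[query]
        return (count, last)          # frequent, then recent, ranks higher

    def _prune(self):
        # delete the lowest-ranked ("inactive") nodes once over buffer size
        while len(self.nodes) > self.max_size:
            victim = min(self.nodes, key=self._rank)
            del self.nodes[victim]

cache = RankedQueryCache(max_size=2)
cache.put("q1", "r1"); cache.put("q2", "r2")
cache.get("q1")                       # q1 becomes the hotter query
cache.put("q3", "r3")                 # exceeds the buffer, triggers pruning
print(sorted(cache.nodes))            # ['q1', 'q3'] — q2 was evicted
```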
  • the Monitoring Module 242 continuously monitors the system 106 using its own techniques, functions, methods and algorithms, pre-aggregates and caches data based on usage patterns, and also identifies probabilities based on the implicit patterns of the data and of policies, procedures, perceptions, etc. This enhances the analytical performance of the system.
  • the underlying data changes are automatically applied to the aggregates. This is further extended to managing the already saved or viewed reports or cached datasets by updating or refreshing those when the underlying data is changed at the lowest levels.
  • the Monitoring Module 242 automatically monitors and detects anomalies and sends implicit commands to the Intelligence Module 234 through the Orchestration module 232.
  • the Monitoring Module 242 does real-time monitoring and provides Proactive support with the help of the Time Machine Module 246 by automatically rolling time periods.
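Real-time monitoring over automatically rolling time periods can be illustrated with a rolling-window z-score check: old readings roll off as new ones arrive, and a reading far from the window's mean is flagged as an anomaly. The window size and threshold are illustrative assumptions, not values from the patent.

```python
from collections import deque
from statistics import mean, stdev

class RollingMonitor:
    """Flag a new reading as anomalous against a rolling window of readings."""

    def __init__(self, window=5, threshold=3.0):
        self.values = deque(maxlen=window)  # old periods roll off automatically
        self.threshold = threshold

    def observe(self, value):
        anomaly = False
        if len(self.values) >= 3:           # need a few points for statistics
            mu, sigma = mean(self.values), stdev(self.values)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomaly = True              # e.g. send an implicit command upstream
        self.values.append(value)
        return anomaly

monitor = RollingMonitor()
readings = [10, 11, 10, 12, 11, 50]
flags = [monitor.observe(v) for v in readings]
print(flags)  # only the final spike is flagged
```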
  • the Intelligence Module 234 needs to extract data from a past time or may require forecasts for a future time.
  • the Time Machine Module 246 of the Omniscience layer 116 may provide the framework to roll back to a past time or look into the future as and when required by the Intelligence Module 234. In a particular example, a user request may require recommendations using forecasted data.
  • the Time Machine Module 246 is called into action and may create forecasts for the future in coordination with the Data Layer 118, Evolution Module 250 and Library module 248.
  • the method may generate mathematical equations for the query. Further, the method may determine a best method to process the query based on optimal performance. Furthermore, the data may be distributed to another layer of processing (such as a deciding layer of the controlling layer) using concurrency control techniques to handle multiple requests from one or more entities (users) parallelly to support multiple views, concurrency, failover recovery, scalability, and replications.
  • the method may make predictions based on the Key Performance Indicators (KPI), the Slowly Changing Dimensions (SCD), position and time coordinates, and data from an omniscience layer and a data layer. Further, the method may create a programming code on the fly using dynamic programming abilities by utilizing top-down approach, bottom-up approach, system metadata, user metadata, data from SME's (in case the query relates to a particular business), predictions, and recommendations (such as from a cognition module).
  • the present invention discloses a method and a system for empowering an entity to enhance performance thereof or any performance related thereto by utilizing artificial intelligence and cognition through system experience or through the experience of the user (entity).
  • the system (and corresponding method) is a single source of intelligence generation to perform business process(es), research and other complex aspects of life intelligently and autonomously.
  • the system has an ability to understand the intricacies to produce analytics and also to engineer and re-engineer the underlying process for generating an intelligent output for empowering an entity that provides an input query.
  • the system provides scoring and ranking to apply weightages and factors to determine the most effective and accurate projections.
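The scoring and ranking with weightages can be sketched as a weighted sum over factor scores, with projections ranked by total score. The factor names and weights below are illustrative assumptions, not the patent's actual weighting scheme.

```python
def score(projection, weights):
    """Weighted sum of a projection's factor scores."""
    return sum(weights[f] * projection[f] for f in weights)

# hypothetical factors and weightages
weights = {"accuracy": 0.5, "recency": 0.3, "coverage": 0.2}
projections = {
    "forecast_a": {"accuracy": 0.9, "recency": 0.6, "coverage": 0.8},
    "forecast_b": {"accuracy": 0.7, "recency": 0.9, "coverage": 0.9},
}

# rank projections from most to least effective under these weights
ranked = sorted(projections, key=lambda p: score(projections[p], weights),
                reverse=True)
print(ranked)  # ['forecast_b', 'forecast_a']
```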
  • the system utilizes deep learning process to evolve from one state to another.

Abstract

The present invention discloses a system that is designed and developed to derive, retrieve and provide holistic intelligence to empower entities by processing complex data integrated from data sources and stored in its database with advanced data structures. The system includes an Omni Presence Layer, I/O Processing Layer, Deciding Layer, OmniSense Layer, Omniscience Layer, Transient Layer and Data Layer. The Omni Presence Layer enables interaction between the user and the system flexibly with any kind of input. The I/O Processing Layer processes the inputs and outputs by performing various functions. The Deciding Layer enables secure communication between the layers based on the processing requirements and user subscriptions. The OmniSense Layer provides intelligent responses to requests made by the user. The Omniscience Layer provides the knowledge required for generating intelligence and the know-how for the System to function and evolve over time. The Transient Layer improves the performance of the System. The Data Layer manages the data storage and processing requirements of the system.

Description

HOLISTIC INTELLIGENCE AND AUTONOMOUS
INFORMATION SYSTEM AND METHOD THEREOF
CROSS REFERENCE
[0001] This Application claims the benefit of US Provisional Application No. 62628271, filed on 08-FEB-2018, which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present invention relates in general to an intelligence and information system and more particularly, the present invention relates to an omniscient holistic intelligence, and autonomous information system that consists of modules and methods for empowerment of individuals and organizations to achieve their requisite life and business goals with strategic recommendations and guidance to enhance performance thereof.
BACKGROUND
[0003] In this fast-paced life on earth, as technology advances rapidly and information grows abundantly, the intelligence required for humans to perform, compete and thrive is increasing. Various entities such as organizations, students, researchers, subject matter experts, employees, investors and individual professionals require intelligence and guidance for research and business purposes. Processes such as research, goal creation, goal setting, decision-making, execution, reporting, governance, termination, etc. need to be carried out for research and for achieving business goals. Throughout the course of a professional life, the above-mentioned processes need to be performed while considering a multitude of present and future factors, such as the state of the economy, the state of the market, the state of the overall organization, the state of the business functions or the state of the individual. In order to function in such a manner, a single reliable source capable of providing overarching wisdom, prudence and acumen to empower organizations and the public is required.
[0004] Reference may be made to US 20150199744 A1 which discloses interest- driven data visualization systems that includes a processor, a memory connected and configured to store an interest-driven data visualization application, and metadata storage, wherein the interest-driven data visualization application configures the processor to define reporting data requirements, generate data retrieval job data based on the reporting data requirements, transmit the data retrieval job data, receive aggregate data, create at least one piece of reporting data using the received aggregate data and the reporting data requirements, associate visualization metadata with the reporting data describing the visual appearance of the at least one piece of reporting data, and generate a report using the reporting data requirements and the visualization metadata.
[0005] Reference may be made to US 20150081619 A1 which discloses systems and methods for interest-driven business intelligence systems including geo-spatial data in accordance with embodiments of the invention that includes raw data storage and perform extract, transform, and load processes, a data mart, and an intermediate processing layer, wherein the intermediate processing layer is configured to automatically generate metadata describing the raw data, derive reporting data requirements, and compile an interest-driven data pipeline based upon the reporting data requirements, where compiling the interest-driven data pipeline includes generating ETL processing jobs to generate geo-spatial data from the raw data, determining bounding data, bounding the filtered raw data based on the bounding data, generating geo-spatial data, and storing the geo-spatial data, generating reporting data including data satisfying the reporting data requirements based on the geo-spatial data, and storing the reporting data in the data mart for exploration by an interest-driven data visualization system.
[0006] Based on the aforementioned, none of the cited prior art documents discloses an end-to-end system that displays omniscience and flexibility. People and organizations often find it challenging to digest voluminous data and provide intelligence across the board succinctly and quickly. Due to the information abundance and overload in today's digital world, they engage a combination of multiple technologies, complex systems and human resources in an effort to accomplish these requirements. Storing huge data efficiently, deriving intelligence out of complex data quickly, and automating the end-to-end process of managing data while providing advanced intelligence and decisions require complex processes, automation tools, etc.
[0007] The existing tools like Business Intelligence tools, Database Management tools, Interactive tools, Data Integration tools, Data Science tools, Artificial Intelligence tools, etc., do not provide holistic intelligence and the intelligent solutions required for empowering a user. Moreover, using so many such resources and tools is not only costly and time consuming but also deficient in terms of performance and efficiency. Further, these tools perform in the specific manner for which they are designed. Moreover, at any time, the output of one tool may not necessarily be suitable as input for another tool, as it may depend upon various factors including situation, time, requirement, sentiments, market trend, social factors, personal and business goals, vision and so on. Therefore, even collaborative performance does not provide an aid to achieve the desired intelligent result. Accordingly, there is a lack of an omniscient system to provide the holistic intelligence essential for the empowerment of individuals and organizations.
[0008] Based on the aforementioned, there is a need for a system which performs every activity using intelligence. Existing solutions provide intelligence in a limited capability for a particular function or process in an organization or for research. There is a need for a system which not only provides intelligence for research and business operations, but also performs any activity in an intelligent manner. That is, the system needs to understand the nature, context and objective of every activity and decide every aspect of performing the said activity. An intelligent manner of performing a particular activity means that the system should consider and combine multiple approaches towards performing the said activity.
[0009] A system that provides intelligence in every aspect on a huge scale needs to be efficient enough to work with as little inputs from the system users as possible. In addition to the above-mentioned autonomous system of generating intelligence, there is a need for a system that can provide human-like cognition. A system that can provide cognitive aspects such as perception, judgment, context, business acumen, etc. in addition to artificial intelligence is required.
[00010] A holistic approach towards intelligence requires the system to be flexible. That is, the system requires flexibility to handle any type of input, output, and command and consider multiple methods for intelligently performing a particular activity. Therefore, such a system requires a unique approach towards aspects of system operations including data management, system security, processing and optimization.
[00011] In today's world, data is redundant within organizations and all over the world; even common data is redundantly utilized everywhere by making multiple copies of the same data for multiple purposes and applications. The efforts, cost, time, etc. involved have grown exponentially as the data in the world grows rapidly. The system needs to be capable of handling massive amounts of inputs in various forms. The system should efficiently handle large-scale data, files, signals, etc. Organization data is confidential, and it is mandatory to maintain high data security standards in the system to safeguard the data. The security capabilities of the existing systems are limited to providing data-level security with Log Management. They just allow establishing row-level security, which stipulates that all users of the system have their own user accounts. A great number of accounts might increase the chances that the system may be hacked. Further, it is often challenging for organizations using current products to achieve higher efficiencies pertaining to performance, volume, time, flexibility and managing end-to-end security of data within one platform.
[00012] In order to handle data and intelligence generation at a large scale, the system needs to optimize its operations intelligently. Existing Database Management Systems and Business Intelligence tools that have been the standard over the years are not up to speed in managing huge and very fast-changing datasets automatically and are limited in their capability to apply intelligence to the stored dataset. The required holistic system should possess data management capabilities that overcome the limitations of existing Database Management Systems like Neo4j, MongoDB, etc. by collecting, storing and managing huge amounts of data efficiently. The existing advanced Database Management Systems lack high performance and do not efficiently support various necessary operations. When it comes to user interaction or querying and presenting the data, the existing systems are restricted to Structured Query Language, Cypher or Gremlin query languages, or Natural Language Processing based querying. There is a need for a system with new, advanced, colloquial, man-machine common interaction languages that make it easier for users of the system to interact among themselves and with the system. A huge volume of data presented in table format makes it difficult for the users of the system to analyze and understand. The existing interactive tools such as chat bots, messaging applications, etc. lack the ability to work with Virtual Reality and mathematical querying. They are limited in their ability to understand the context of a query and to return a dynamic response. The existing interactive tools need to be trained for every request and response. They return generic answers for new questions and situations.
[00013] It is evident that there is a need for a system with wisdom, prudence and intelligence that encompasses all business and research functions and is capable of performing business and research processes autonomously with minimal user interaction or by guiding individuals to achieve greater holistic performance of individuals and organizations.
SUMMARY OF THE INVENTION
[00014] The present invention discloses a system, omniscient by design to provide holistic intelligence, that acts as a single source of wisdom, prudence and intelligence to empower humankind to achieve greater holistic performance of individuals and organizations by processing data of various formats. It helps to solve problems with a functional approach by integrating general science and life sciences, which include physics, chemistry, mathematics, biology, technologies, philosophies, beliefs, languages, wisdom, etc. For example, in life science and healthcare scenarios, a person's ability to perform better physically at sports depends on many factors like food, hormones, metabolism, physical exertion, and psychological conditions such as stress. Each of the factors may have different measures to be calculated, providing different results, decisions or recommendations depending on the objectives, for which the system applies the science to the data in many ways and provides the optimal solution to the needs.
[00015] The system further provides a virtual laboratory with the necessary tools to access the wisdom, prudence and intelligence, explore the existing sciences, facts, phenomena, etc. and create a new world of possibilities.
[00016] Hereinafter, the term 'users' referred to in this document denotes business employees or individuals such as business owners, decision makers, system owners, data administrators, data analysts, data scientists, business users, data modelers, developers, applications, interfaces, etc. who may utilize the system.
[00017] The system is a continuously evolving holistic management system that encompasses all business and research functions and capable of performing business and research processes autonomously or by guiding users intelligently to achieve professional goals. The system allows users of different backgrounds with different physical, mental, emotional and technical abilities to interact flexibly and empower the users to utilize and improve their potential.
[00018] The system consists of components, which work individually or in conjunction with other components for implementing the system and corresponding processes. The components are integrated in a manner that enables data, intelligence and knowledge to be omnipresent throughout the system. The system is inspired by Natural/ general intelligence to empower organizations and individuals. The capabilities of the system may be fully delivered to all users or customized to the specific audience or user based on the requirements, subscriptions, and entitlements. The elements such as technical architecture, building blocks, construct, models and functions of the overall system and its components are aligned in order to provide Natural/ general intelligence. The present invention applies the concepts of Natural/ general intelligence such as judgment, evolution, instincts, memory, etc. along with artificial intelligence, compared to other systems applying only artificial intelligence. The intelligence is built based on core principles of mastering the information, intricacies, nuances, associated matters (Subject matters and physical matters), materials (Molecules and particles), properties, patterns, and measures through cognitions, acumen, and continuously evolving learning mechanisms to provide the natural & artificial intelligence.
[00019] In an embodiment of the present invention, a computing system for empowering an entity to enhance performance is disclosed. The computer-implemented system may include, but is not limited to, an omni-presence layer, an I/O processing layer (hereinafter may interchangeably be referred to as 'processing layer'), a controlling layer, and an Omniscience Layer. The omni-presence layer may include a first set of modules for interfacing and collaborating with at least one entity. Herein the omni-presence layer receives at least one of one or more types of input queries from at least one entity.
The processing layer may include a second set of modules for receiving and validating at least one of one or more types of input queries from the at least one entity. The controlling layer may include a plurality of sub-layers having a third set of modules. The sub-layers may include (but are not limited to) a deciding layer for deciding a course of action by analyzing the at least one input query and redirecting the at least one input query, through a communication module, to one of the other sub-layers of the plurality of sub-layers based on the decided course of action. Further, the controlling layer may include an omnisense layer including an orchestration module for coordinating, directing and redirecting the at least one input query to the other modules of the omnisense layer, and an intelligence module for creating at least one application, by generating intelligence, to process the at least one input query when the at least one input query is redirected to the omnisense layer. Further, the omnisense layer may include a recommendation module configured to provide one or more intelligent recommendations based on processing the at least one input query. Further, the controlling layer includes a data layer comprising a set of modules for storing and managing the data to process one of one or more types of input queries from the at least one entity. Furthermore, the controlling layer may include a transient layer comprising a set of modules for storing and managing the data frequently accessed and processed by the at least one entity. The data layer of the third set of modules comprises an auto-organization module configured for classification of streams of data and organization of the data based on data structures. The omniscience layer comprises a fourth set of modules for providing knowledge based on one of the one or more types of input queries from the at least one entity.
[00020] The second set of modules comprises a rendering module configured for receiving at least one of one or more types of output from all the modules and layers of the computing system, and transforming the intelligent recommendations into one or more elements. These elements may include, but are not limited to, visual elements, signals, etc. Further, the rendering module may deliver the output to the at least one entity in collaboration with the interface and presentation module.
[00021] Herein, the omni-presence layer, the processing layer, the controlling layer and an omniscience layer are communicatively coupled for carrying out intelligent interactions corresponding to the at least one input query. The one or more types of input queries comprise signals, data in natural language, bulk data, gestures, mathematical queries, data structures, symbols, business automation language, audio and visual. The omni-presence layer may include a presentation module, a collaboration module, an interface module, and an integration module. The presentation module may make intelligent interactions with the at least one entity in a multi-dimensional way for facilitating the at least one entity with an Advanced User Interface and an intelligent set of functions to sketch visual representations based on preference and situation associated with the at least one entity. The collaboration module may enable interaction among multiple entities and the computer-implemented system in an instance of group discussions. The interface module may enable interactions with one or more external applications. The integration module may integrate data received from multiple heterogeneous sources, to the computing system, by utilizing Data Migration and Integration techniques thereof. [00022] The second set of modules may perform one or more functionalities to convert the at least one input query from one format into an alternate format based on a type of the at least one input query. The one or more functionalities correspond to validating, enriching, transforming and translating the at least one input query. The omniscience layer may include a fourth set of modules having a time machine module, a library module, an evolution module, a learning module and a training module. Further, in an embodiment, the system may further grow its knowledge from data obtained from SMEs that supports the system model.
The fourth set of modules provides knowledge to generate the intelligence and for self-evolution of the computer-implemented system. The intelligence is generated by creating a set of rules (such as deterministic rules and non-deterministic rules (such as, but not limited to, probabilistic rules)) based on one or more factors and data obtained from the omniscience layer. The omnisense layer further comprises an orchestration module for prioritizing, channelizing and ensuring completion of a request corresponding to the at least one input query. The request may be received from at least one of the deciding layer and one or more modules of the omnisense layer. The orchestration module is further configured for determining a need and a level of intelligence required for processing the at least one input query.
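The split between deterministic and probabilistic rules described above can be illustrated with a minimal sketch. This is not the patented implementation; the rule conditions, the 0.8 firing probability, and all function names are invented for the example.

```python
import random

def deterministic_rule(query):
    # A deterministic rule fires with certainty whenever its condition holds
    # (the "priority" field and "escalate" action are hypothetical).
    return "escalate" if query.get("priority") == "high" else None

def probabilistic_rule(query, rng):
    # A probabilistic rule fires with a probability that would, in practice,
    # be derived from data; 0.8 is an assumed value for illustration.
    return "recommend_review" if rng.random() < 0.8 else None

def generate_intelligence(query, rng=None):
    rng = rng or random.Random(42)          # seeded so the sketch is reproducible
    actions = []
    for rule in (deterministic_rule, lambda q: probabilistic_rule(q, rng)):
        result = rule(query)
        if result:
            actions.append(result)
    return actions

actions = generate_intelligence({"priority": "high"})
```

In a fuller system the rule set would be created and revised from factors and data supplied by the omniscience layer rather than hard-coded.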
[00023] Further, the sub-layers comprise a transient layer having one or more modules. The one or more modules comprise a caching module, an In-Memory module and a session module. The one or more modules are implemented for facilitating an un-interrupted user-session and back-up management, and for storing the most frequent input queries using ranking and graph data structures. Furthermore, the sub-layers may include a data layer having one or more modules for managing storage of data and processing requirements by providing real-time updates to at least one of the processing layer, other sub-layers in the controlling layer and the omniscience layer. [00024] Further, the intelligence module includes (but is not limited to) an artificial intelligence (hereinafter may interchangeably be referred to as 'AI') module and a Cognition module. The AI module may track Slowly Changing Dimensions (SCD) of data, corresponding to the at least one entity, in a data layer to identify changes from one state to another in terms of one or more aspects related to the at least one entity. Further, the AI module may make predictions based on Key Performance Indicators (KPI), the Slowly Changing Dimensions (SCD), position and time coordinates, and data from an omniscience layer and a data layer. Furthermore, the AI module may create programming code on the fly using dynamic programming abilities by utilizing a top-down approach, a bottom-up approach, metadata, predictions, and recommendations from a cognition module.
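Tracking Slowly Changing Dimensions, as described above, is commonly done by versioning each record: when an attribute changes, the current version is closed and a new one opened, preserving history. The following is a minimal Type-2-style sketch under that assumption; the field names and entity identifiers are illustrative, not from the specification.

```python
from datetime import date

class SCDTracker:
    """Minimal Slowly Changing Dimension (Type 2) tracker sketch."""

    def __init__(self):
        self.history = {}               # entity_id -> list of versioned records

    def update(self, entity_id, attributes, effective=None):
        effective = effective or date.today().isoformat()
        versions = self.history.setdefault(entity_id, [])
        if versions and versions[-1]["attributes"] == attributes:
            return                      # no change: nothing to record
        if versions:
            versions[-1]["valid_to"] = effective    # close the current version
        versions.append({"attributes": dict(attributes),
                         "valid_from": effective, "valid_to": None})

    def current(self, entity_id):
        # The open-ended (valid_to is None) record is the current state.
        return self.history[entity_id][-1]["attributes"]

tracker = SCDTracker()
tracker.update("cust-1", {"segment": "retail"}, "2019-01-01")
tracker.update("cust-1", {"segment": "enterprise"}, "2019-06-01")
```

Because every prior version is retained with its validity interval, a module can ask what state the entity was in at any past date, which is the change-from-one-state-to-another tracking the paragraph describes.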
[00025] The AI module may further perform Search Driven Analytics (SDA) by enabling the at least one entity to perform analysis on its own data and further by developing intelligence based on the analysis performed by the at least one entity. The AI module may utilize Automated Machine Learning (Auto-ML) for automating functionalities from data preprocessing to model selection. Further, the AI module may perform an analysis of one or more parameters associated with the at least one input query for generating one or more predictions based thereon by utilizing hyperopt for parameter optimization. Furthermore, the AI module may select a model with one or more parameters suitable for processing the input query. The selected model is provided to the at least one entity for enabling the at least one entity to interact therewith. Also, the AI module generates one or more recommendations of visual elements suitable for visualizing the data. The visual elements are provided to the at least one entity for the representation of data distribution.
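The paragraph names hyperopt for parameter optimization; to keep this sketch self-contained it substitutes a plain random search over a parameter space, which is the simplest form of the same idea (hyperopt's `fmin` with `tpe.suggest` would replace the loop in practice). The objective function, parameter names and search ranges are invented placeholders, not the system's real model.

```python
import random

def toy_objective(params):
    # Placeholder loss surface with a minimum near depth=5, rate=0.1.
    return (params["depth"] - 5) ** 2 + (params["rate"] - 0.1) ** 2

def random_search(objective, space, n_trials=200, seed=0):
    """Keep the lowest-loss configuration found over n_trials random draws."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {"depth": rng.randint(*space["depth"]),
                  "rate": rng.uniform(*space["rate"])}
        loss = objective(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best, loss = random_search(toy_objective, {"depth": (1, 10), "rate": (0.0, 1.0)})
```

An Auto-ML pipeline wraps this kind of loop around the whole chain from preprocessing choices to model family, so the same mechanism performs the model selection the paragraph describes.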
[00026] The cognition module may include a set of facts, a set of rules, and manifesto to apply a logic to process the at least one input query based on data and learning from the data layer, and omniscience layer. Herein, the one or more aspects may include (but are not limited to) at least one of interactions, activities, transactions, emotions, sentiments, decisions, movements, changes, reactions, context and connotations related to the at least one entity.
[00027] The cognition module may further perform one or more analysis (such as behavioral, psychological and physiological analysis) for determining emotions and instinct associated with the at least one entity. Further, the cognition module may generate one or more recommendations based on the logic, one or more situations and conditions. Furthermore, the cognition module may determine possible current and futuristic impacts related to each of the generated one or more recommendations.
[00028] The omnisense layer may further include a monitoring module configured to perform at least one of: monitoring performance of the computer-implemented system to detect an anomaly impacting the performance (such as an anomaly in an entity and/or structure(s) that can impact the performance); providing proactive support in collaboration with a time machine module of an omniscience layer by automatically rolling time periods based on the detected anomaly; providing a command to the intelligence module to determine a root cause of the anomaly; and facilitating the at least one entity to manage one or more factors to detect one or more anomalies based on market trends and behavioral patterns corresponding to the at least one entity. Further, the omnisense layer may include an automation module configured for gathering input from the at least one entity, in collaboration with a library module of an omniscience layer, to automate processes autonomously.
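One elementary way a monitoring module can flag performance anomalies is a z-score test against recent samples. The specification does not prescribe a detection method, so this is purely an illustrative stand-in; the latency figures and threshold are invented.

```python
def detect_anomalies(samples, threshold=3.0):
    """Return indices of samples more than `threshold` std-devs from the mean."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n
    std = variance ** 0.5 or 1.0        # guard against zero variance
    return [i for i, x in enumerate(samples)
            if abs(x - mean) / std > threshold]

# Hypothetical response-time samples in milliseconds, with one spike.
latencies = [100, 102, 98, 101, 99, 100, 103, 97, 100, 500]
anomalies = detect_anomalies(latencies, threshold=2.0)
```

A detected index could then be handed to the intelligence module for root-cause analysis, or used to pick the time window that the time machine module rolls back over.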
[00029] Further, the second set of modules may include a receiving module, a transmission module, an enrichment module, a translation module, a transformation module, a distribution module and a rendering module. The enrichment module validates and ratifies the at least one input by utilizing data from a data layer and facilitating the at least one entity to define rules. The rules may be utilizable by an omniscience layer to learn and evolve based thereon. The transmission module may process the at least one input for transformation thereof into an acceptable format when the input query is a signal. The transformation module may transform one data structure into another data structure based on an input received from an enrichment module. Further, the distribution module may distribute queries to one or more layers of the computer-implemented system for parallelism, to support multiple views, concurrency, failover recovery, scalability, and replications.
[00030] Further, the deciding layer may include a security module, the communication module and a Micro-Service integration module. The security module configured for securing the Omni Presence layer, the processing layer, the transient layer, the data layer and the Omniscience layer with protective measures comprising authentication, security enforcements and cryptographic methods. Further, the security module is configured for getting authentication to access data and applications corresponding to the computing system for ensuring security thereof.
[00031] The communication module may coordinate with an authentication module of the security module and the Micro-Service integration module for getting authentication to access data and applications corresponding to the computer-implemented system for ensuring security thereof. Specifically, the communication module is configured for coordinating with an authentication module of the security module and a Micro-Service integration module and redirecting the at least one input query to the other sub-layers of the controlling layer for further processing. [00032] The invention provides the power of integrated science, mathematics, physics, chemistry, biology, technologies, geometries, cross-functional universal wisdom, philosophies, and metaphysical aspects. The system uses its power of language, informatics, and cognition to create documents, logic and intuition based on highly evolving consciousness from different contexts and perspectives such as psychological, philosophical, metaphysical, spiritual, commercial, technical, physical, and emotional, based on situations and perceptions. The system aids professional management by considering the overall context obtained from the above-mentioned process and from the present and possible future state of the economy, market, organization, individual and its impact on all business functions. The system has the maneuverability to detect anomalies and dynamically change its actions. The system not only considers the present and future states, but also considers dynamic aspects such as the direction and pace of the economy, market, organization, business function, etc. The above-mentioned elements are used not only to provide intelligence, derive logic or to just provide guidance, but also to perform every activity such as research, data management, analytics, application management, communication, etc. in an intelligent manner for better decisions and empowerment in the life of businesses.
The system helps users to accomplish their projects of various kinds such as creative, functional, practical, conceptual, logical etc. from the inception, implementation, maintenance, training and automation. The system also empowers the individuals such as owners, executives, employees through its wisdom and by providing the acumen and prudence to run the organizations effectively and efficiently. The system also has adaptability to organically evolve as the businesses or the individuals evolve. The system also autonomously performs support processes such as data management, monitoring, security etc. for performing the above-mentioned processes.
[00033] The advantageous feature of the system is to provide wisdom in multitude of fields that are structured and delivered to provide unified intelligence that is integrated with the business to empower the organization. The system performs every activity using intelligence. That is, the system understands every aspect required to perform the particular activity intelligently. The system considers and combines multiple approaches such as model driven, domain driven, data driven approaches etc. for performing the said activity.
[00034] The system consists of a single source of intelligence generation irrespective of whether it is a system or business process. In order to perform the business processes intelligently, the system uses a combination of Market Intelligence (Statistics), Social Intelligence (Sentiments), Self-Intelligence and Collective-Intelligence. This combination provides the power of developing intelligence from a 360-degree perspective of the business. Its central source of intelligence is based on integrated science and philosophies, and is capable of providing a single version of intelligence comprising amalgamated knowledge, information, facts, and figures, in order to provide holistic intelligence significant to businesses, organizations, and individuals. Further, the system utilizes its own intelligence to combine knowledge derived from the insights generated from information assets, facts, knowledge, subject matter expertise, learning experiences, and science encompassing various aspects of the world and individual businesses.
[00035] The system has the ability to understand the intricacies, to produce analytics and also to engineer and reengineer the underlying processes. The system understands the processes over time and performs analyses and forecasts over past and future time periods using its 'Time Machine' component. The system applies a 360-degree perspective with scores & ranks to apply weightages and factors to determine the most accurate possible projections. The system produces predictive analytics and forecasts by analyzing organizational, professional and individual personal factors.
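Combining several projection sources with scores used as weightages, as the paragraph describes, reduces to a weighted average once the scores are normalized. The sketch below assumes exactly that; the three sources, their scores and the forecast figures are invented for illustration.

```python
def weighted_projection(projections):
    """projections: list of (forecast_value, score).

    Scores are normalized into weights so higher-scored sources
    contribute proportionally more to the combined projection.
    """
    total_score = sum(score for _, score in projections)
    return sum(value * score / total_score for value, score in projections)

forecast = weighted_projection([
    (120.0, 0.9),   # e.g. organizational factors, high confidence score
    (100.0, 0.6),   # e.g. market statistics, medium confidence score
    (140.0, 0.3),   # e.g. individual factors, low confidence score
])
```

A ranking step would typically set or adjust the scores before they are fed in; here they are simply given.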
[00036] The system functions in a cascading manner to chain all insights to create a holistic picture, as compared to producing insights corresponding to a business process for a particular set of business use cases. The analytics may be provided using a combination of functions, methods and algorithms by understanding the semantics, implicit and explicit relationships among data, context, user perceptions, patterns, etc. The system automatically generates different analytics based on the nature of the data by referring to the appropriate business libraries, considering the important data for the given business belonging to a particular industry or domain, and generates both generic standard analytics and organization-specific analytics by developing a deep understanding of the implicit data and relationships using cognition and Semantic Libraries. The system also helps users with collection, classification and aggregation, division, and dissemination. The system can get trained to classify better based on the users' patterns automatically. It also allows people to combine data from different sources to automatically transform and utilize those for analytics. It gives the options to either make it a volatile/ transient process or a persistent process, to step into it and edit it for more sources. In a particular example, the system may automatically collect, process, classify, consolidate and aggregate a company's transactions by understanding the taxonomy of the data. Taxonomy is a hierarchical way of classifying the data based on their inherent properties and relationships. It may then apply analytical functions on the aggregated data and provide valuable insights on the data. Further, it monitors, learns, and visualizes the dynamics of the information and prompts decision-makers with critical/ significant insights by applying proactive analytics.
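The transaction example above boils down to rolling each amount up through a category hierarchy. Here is a minimal sketch of that aggregation; the taxonomy, categories and amounts are all invented for the example.

```python
# Hypothetical taxonomy: child category -> parent category.
TAXONOMY = {
    "laptops": "electronics",
    "phones": "electronics",
    "chairs": "furniture",
}

def aggregate_by_taxonomy(transactions):
    """Sum each transaction into its own category and every ancestor."""
    totals = {}
    for txn in transactions:
        category = txn["category"]
        while category:                      # walk up to the taxonomy root
            totals[category] = totals.get(category, 0) + txn["amount"]
            category = TAXONOMY.get(category)
    return totals

totals = aggregate_by_taxonomy([
    {"category": "laptops", "amount": 1200},
    {"category": "phones", "amount": 800},
    {"category": "chairs", "amount": 300},
])
```

Analytical functions can then be applied at any level of the hierarchy, since every ancestor already holds its consolidated total.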
[00037] The system not just provides intelligence; it provides wisdom, guidance, prudence and acumen to maneuver difficult situations in the business through proactive investigation and predictions. The system uses cognition on top of Artificial Intelligence by utilizing the knowledge acquired through continuous learning. The system has the ability to roll back to a past time or look into the future to generate insights. The cognitive abilities help the system to evolve itself using the intelligence from the users, external subject matter experts and the system itself, using techniques such as introspection, extrospection and extrapolation of past experiences and possible future scenarios. The system mentors organizations in known situations that are hard to maneuver, or proactively prevents a crisis by identifying possible signs. The evolution is not just restricted to the system and business knowledge. The system identifies the underlying cycles followed by the processes and functions. The system continuously evolves to improve the efficiency of operations of such cycles.
[00038] The system is dynamic, intelligent, efficient, flexible, scalable, portable, extensible, reusable, energy-efficient, space-efficient, secured, self-aware, self-managing and self-evolving, which allows the users to interact with the system easily. It is capable of receiving any kind of input or request and responding to the request appropriately in an intelligent and autonomous manner.
[00039] Since the system works in a holistic manner and uses a huge volume and variety of data, it needs a unique approach towards data management compared to other systems. In particular, the system provides the ability to work with the data in a unique manner in terms of modeling, formatting, storing, manipulating, maintaining and utilizing the underlying data. The system processes, stores, tracks and retrieves the data efficiently using Dimensionality Reduction (DMR), Slowly Changing Dimensions (SCD), Master Data Management (MDM), Automatic Anomaly Detection, Auto-Modeling, Data Validation techniques, etc. The System stores the data in a unique structure that allows mathematical equations combined with functional programs and algorithms. The System has the ability to dynamically simulate, extrapolate, calculate or manipulate values. The system has the ability to automatically extract form fields from an application, scrape and create the respective fields in the grid and ingest data from a list into the grid. This way, manual ingestion of data can be avoided and the data can be loaded automatically. The system may automatically load, model and process transferred inputs such as software snippets, grist, excerpts, particles, atom, iota, etc. through any medium, with the option for the user to allow or deny such access to the system. The system automatically processes the input with the option to reverse actions using undo, redo, duplicate and de-duplicate operations. The system automatically creates object models with the respective properties once the data is loaded. The System also provides the ability to transform data from columns to rows, rows to columns and two-dimensional data to multi-dimensional data or graphs, trees, etc. This allows users to quickly fetch the data and transform it to desirable formats for further analysis or visualization.
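The rows-to-columns and columns-to-rows transformations mentioned above are the classic pivot between record-oriented and column-oriented layouts. A minimal sketch, with invented field names:

```python
def rows_to_columns(rows):
    # [{"a": 1, "b": 2}, {"a": 3, "b": 4}]  ->  {"a": [1, 3], "b": [2, 4]}
    columns = {}
    for row in rows:
        for key, value in row.items():
            columns.setdefault(key, []).append(value)
    return columns

def columns_to_rows(columns):
    # The inverse: zip the column lists back into per-record dicts.
    keys = list(columns)
    return [dict(zip(keys, values)) for values in zip(*columns.values())]

rows = [{"region": "east", "sales": 10}, {"region": "west", "sales": 20}]
cols = rows_to_columns(rows)
```

The two functions are inverses of each other, so data can be flipped into whichever orientation suits the analysis or visualization at hand and flipped back without loss.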
In addition to this, the visualization allows the user to visualize the digital landscape of the organization, business models, system architecture, organizational data etc. The system also allows the user to view data at the record level. In a particular example, a record about the product can be viewed as a card that provides images along with vital information about the product such as price, launch date etc. Such a representation would allow the user to easily browse through different products from a single view with the ability to pin selected products for further analysis. The information is not only restricted to data. It may also be in the form of signals or thoughts. The system may have the ability to process signals to receive, transform, amplify and transmit data in the form of signals. The ability of the system to transform all media (audio and video) into assets with significant business values enables the digital transformation in organizations. The thoughts are maintained as definitions or keywords in the system library with direct and indirect relationships that are hierarchical, or non-hierarchical in nature. They are maintained in different data structures such as continuously spanning trees, or sporadic trees created instantaneously, or custom data structures. The relationships are mapped directly or with the help of intelligence using configurable, static and dynamic rules.
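Maintaining thoughts as keywords with hierarchical (direct) and non-hierarchical (indirect) relationships, as described above, can be sketched with a small library structure. The class, terms and relationship types below are illustrative assumptions, not the patented data structures.

```python
class KeywordLibrary:
    """Sketch of a keyword library with two kinds of relationships."""

    def __init__(self):
        self.children = {}          # hierarchical (direct) relationships
        self.related = {}           # non-hierarchical (indirect) links

    def add_child(self, parent, child):
        self.children.setdefault(parent, []).append(child)

    def link(self, a, b):
        # Indirect relationships are symmetric.
        self.related.setdefault(a, set()).add(b)
        self.related.setdefault(b, set()).add(a)

    def descendants(self, term):
        # Walk the hierarchy depth-first, collecting every descendant term.
        out = []
        for child in self.children.get(term, []):
            out.append(child)
            out.extend(self.descendants(child))
        return out

lib = KeywordLibrary()
lib.add_child("finance", "revenue")
lib.add_child("revenue", "recurring revenue")
lib.link("revenue", "sales")
```

Continuously spanning trees would keep growing such a hierarchy as new terms arrive, while sporadic trees could be built on the fly for a single query and discarded.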
[00040] The system can handle data storage in different levels and forms such as atomic level, molecular level, chains, branches, sequence, etc. along with history, patterns and changes in the data. The storage is done in a smart way using intelligent hashing algorithms. Any new message signal can change the memory in the required portion. The system reacts to any type of input in a way similar to how the nervous system of the human body reacts. [00041] The system's foundation is also built on the key principles of universal common data and a universal language with multiple views and versions according to organizations. The system manages the huge amount of data in a minimal and reusable way as opposed to a redundant way. The system maintains crowd-sourced and crowd-consumed common data that can be updated and socialized with a bigger audience if required. The system provides the ability to maintain different versions of the common data based on the data needs of each individual. The system provides the ability to synergize a multitude of data provided by multiple sources, transform it and provide information, intelligence and guidance to multiple organizations depending on the subscriptions of the organizations. The system uses unified generic languages that are very easy for it and for the users to understand, besides supporting multiple new and existing languages, be it colloquial, technical, scientific languages, etc. The system has a spectrum of communicative abilities to effectively interact with the user. The system can understand and interact with the users or stimulate collaboration among users using formal and colloquial natural languages, query languages, scientific languages, mathematical language, user body language, gestures, etc.
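Storing data "in a minimal and reusable way as opposed to a redundant way" with hashing is the idea behind content-addressable storage: identical content hashes to the same key and is stored only once. The specification does not name a particular scheme, so the following is a generic sketch using SHA-256; the class and its contents are invented for illustration.

```python
import hashlib

class CommonDataStore:
    """Hash-addressed store: duplicate content is kept once, shared by reference."""

    def __init__(self):
        self.blobs = {}                          # digest -> content

    def put(self, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        self.blobs.setdefault(digest, content)   # duplicates are not re-stored
        return digest

    def get(self, digest: str) -> bytes:
        return self.blobs[digest]

store = CommonDataStore()
ref1 = store.put(b"shared reference data")
ref2 = store.put(b"shared reference data")       # same content, same reference
```

Crowd-sourced common data fits this model naturally: many consumers hold the same small digest, while the content itself lives in one place and can be versioned by storing each revision under its own digest.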
[00042] The system is capable of presenting information, reports and self-generated analytics in an intelligent and interactive manner by using cognitive intelligence. The system offers advanced UI (User Interface) and UX (User Experience) that elevates the beauty, function and the experience of interactions & communication. The system makes everything hyper-informative by communicating to the users visually and succinctly. This reduces the time, energy and cost involved in the interactions and collaborations in the organization. The system automatically recognizes the type of users in an organization, the data, the context, etc. and presents the insights in various forms accordingly, such as presentations, chat bots, graphs, tables, audios, virtual presence, etc. The System can also manage multiple queries by different users during group discussions. The system offers a fully integrated collaborative environment that provides an ability to send emails, chats, voice calls, voice messages, video clips, screenshots, images, etc. across the organizations or a group, from one user to a specific user or groups of users.
[00043] The system is capable of illuminating data with advanced multidimensional visualization to provide users an immersive experience by interacting with data, using an advanced, modular architecture with Soft Computing and Dynamic Programming approaches where the building blocks, components and particles are maintained at the atomic and lower levels. These are packaged as compositions that consist of codes, libraries, datasets, etc.
[00044] The system is autonomous in nature with as little user input as possible to handle the enormous scale of intelligence. The system is capable of performing activities such as knowledge discovery, data organization, validation, correction and analytics in an autonomous manner. The autonomous nature is not only restricted to the management of existing research, business and system processes. The system performs functions right from creation of tools or projects for research, business and system processes, maintaining or terminating activities on its own accord. The system functions using the basic building blocks defined as 'matter'. This produces possibilities for the system to create activities, processes, applications or even other systems and enable their functioning using basic building blocks such as data, functions, algorithms, language, rules, applications, snippets, excerpt, grist, particles, atom, iota, resources for physical matter etc. The above-mentioned building blocks can also be transferred through any medium into the system. The system also provides reusable design patterns and automatically maps the patterns to create forms for the matters for creating projects of different kinds such as practical, conceptual, logical, functional etc. The system leverages its proprietary comprehensive storyboarding, articulations functionalities and methods to capture the creative users' thoughts, imaginations, requirements, actions, narrations, gestures and transforms those into designs, models and implementation for a final product or project required by the user.
[00045] The system supports designing organization's functional architecture patterns and structure according to the business landscape and custom requirements as well as traditional architectural patterns. The underlying architecture of the system is robust and modular capable of supporting different patterns as mentioned above, or new patterns altogether. The ecosystem (The platform consisting of the system or the suite of systems) helps organizations and individuals to design and develop concepts, projects, products etc. to perform functions without limitations. Each individual system extends its functional and technological capabilities to fulfill the requirements of the users in the creation, design and development of concepts, projects, products etc. The ecosystem supports organizations for creating intelligent application, analytical environments and settings to support business management, organizations structuring, resources management, communications, transactions, functions, processes, workflows, models, data, policies, procedures, rules, protocols, security, governance etc.
[00046] The system supports both corporate/executive decision-based centralized and workforce- and productivity-based decentralized business architectural patterns, taking governance, control functions, manifestos, security, etc. into consideration. The system also aligns functional and technical architectural patterns, applications, functions, processes, models, and data as per the standards, styles, best practices, and guiding principles. The ecosystem may have a component-based decentralized modular application architecture that consists of categorically organized reusable assets such as services, functions, functional aspects, objects, data structures, data, components, elements, properties, etc. pertaining to layers of the system. The system also supports disparate and conformed application architectures where all forms (usable formats, visual formats, shapes, colors, etc.) can be decentralized with metadata, data and programs to support independent operations of forms (cards) with visual representations, functionalities and data. They can communicate with the other components according to their configurations and settings. The system shall provide a virtual or physical consolidated data store that may act as a conduit to connect to all other databases, to link all organizations' data assets to a common pool for catering to requests as well as distributing the data to corresponding data storage using the distribution mechanisms. The systems follow principles that exist without forms and allow organizations to give forms to the matters based on their requirements and purposes. The system also provides proprietary user interface components, elements, styles and standards to gather and deliver information as per requests as queries pertaining to matters, represented in different forms (presentation - UI Form, Visualizations, Documents, grids, etc.) as cards, decks, collections, relations, utilizations, etc.
[00047] The system may also provide the ability to work with quantum computers by converting data in to integration friendly formats.
[00048] The following paragraphs provide a brief description about the construction, functions and methods of the overall system and its components.
[00049] The system comprises an Omni Presence Layer, an I/O Processing Layer (hereinafter may interchangeably be referred to as 'Processing Layer'), an Omni Sense Layer (hereinafter may interchangeably be referred to as 'Omnisense Layer'), an Omniscience Layer, a Transient Layer and a Data Layer. All the components of the system can work independently and work together as a system or a sub-system. These components can be in one environment or in different environments depending on the convenience, needs and performance requirements.
[00050] The Omni Presence Layer is enabled with a bot which uses Natural Language Processing, Machine Learning techniques and Metadata that is gathered automatically from the system operations, and also the metadata that has been input by various SMEs. This helps the users to work collaboratively on all phases of the system.
[00051] The Omni Presence Layer comprises a Presentation Module, a Collaboration Module, an Interface Module and an Integration Module. The Omni Presence Layer can receive and manage any type of input. The input can be of various forms such as computer queries, commands, natural language interactions, multiple data file formats, gestures, signals, mathematical representations, etc. Apart from this, futuristic language integration is also possible in the system; it dynamically adapts to the futuristic language that may be integrated and enables users to interact seamlessly.
[00052] The inputs can be provided by the users, other applications, external devices, etc. The system is capable of receiving input of any kind in different formats and can store them in a unified structure. The system can collaborate with application data or system data and the data obtained from the role-based communication between the users in order to generate highly intelligent and accurate analytics, which helps in improving the performance of the system.
[00053] The system is capable of receiving any type of input through multiple mediums such as audio, video, files, databases, etc. and is also capable of storing the data in a unique flat file structure that allows flexibility to fetch and access the data at a high speed. Specifically, the data can be structured, unstructured or semi-structured.
[00054] The Presentation Module makes intelligent interactions between the user and the devices seamlessly by providing the ability to use any methods (Computer Vision, User Interface, Sensors, Touch Devices, in-built Bots with Voice Processing, etc.), modes (Visions, Gesture, Voice, Text, etc.) and language (Colloquial, Syntactical, Mathematical, Visual, Textual, etc.) of communication for Digital Landscaping, Digital Transformations, Digital Mining, Analytics, etc. It has the ability to interact with any form of data in a multi-dimensional way (3D and beyond), displayed as holographs, augmented reality, etc. The Presentation Module can convert bits into images and vice versa, annotate them to convey their full meaning, and navigate the picture mathematically using vectors. The Presentation Module also has the ability to provide multiple User Interfaces depending on the roles of the audience or the components of the product suite. The Interface Module enables interactions such as alerts, data transfer, triggers, etc. with external applications like Mobile Applications, Wearable Devices, Desktop Applications, etc. The Presentation Module enables user interactions with visual experience to visualize many different items including navigation of the collection of databases; be it objects, matters, measures, queries, users, etc. and also provides the ability to create, modify and delete the data. The Collaboration Module enables interaction among multiple users and the system in the instance of group discussions. The Collaboration Module, with the help of the Communication Module, facilitates users to communicate, collaborate & socialize intelligently with the organization, the user groups or the other users of the system.
With a unique combination of principles, functions, methods and user experience, the Collaboration Module provides seamless communications in an intelligent and flexible way in group discussion scenarios such as chat, email, meetings, presentations, etc. The Integration Module serves as the gateway for transferring bulk or real-time data, or data from external sources.
[00055] The I/O Processing Layer receives the input; validates, translates, transforms and delivers it; and provides the output. The I/O Processing Layer utilizes its Cognition in areas such as user interactions, voice-based questioning or requesting, querying, NLP, etc. to provide highly efficient and fast Natural Language Processing capabilities. The I/O Processing Layer uses a unique combination of compression, decompression, encryption and decryption techniques, and also keeps track of the previous content, context and connotation in order to understand what one wants or says, so that the System can respond appropriately.
[00056] The I/O Processing Layer comprises the Receiving Module, Transmission Module, Enrichment Module, Translation Module, Transformation Module, Distribution Module and Rendering Module. The Receiving Module receives any kind of data from multiple mediums and performs the extraction operation. The Extraction phase receives both structured and unstructured data efficiently from various sources, such as live connectors, using multi-threading and parallel-processing techniques; the unstructured data is analyzed and a suitable schema is recommended for it by finding the attributes, dimensions and measurements using techniques such as the Top-Down Approach and the Bottom-Up Approach. The Transmission Module enables Signal Processing by the System. The Enrichment Module parses, infers and validates the input and identifies the most important features of the data by removing ambiguity, thereby cleaning the data. The Translation Module enables translations from one language to another. The Transformation Module provides the ability to transform one Data Structure or Data Type into another. The Transformation Module also performs Auto-Modeling of data with the help of Artificial Intelligence, from scratch or with minimal manual intervention, using its proprietary approach, functions and methods. It uses libraries, functions, methods and algorithms, with intuitive Visual Modeling tools using Industry Libraries, Taxonomies, QAs, etc., to graphically view and model the data, objects, information, visual components, etc. Further, the system applies intelligence from the Omniscience Layer to recommend innovative transformation techniques by inferring insights from the extracted data. The system uses advanced logic to maintain a log of the transformations, thereby preventing the system from replicating the data on every iteration.
The Distribution Module distributes queries to various data stores, machine resources and systems for optimum parallelism, to support multiple views, concurrency, failover recovery, scalability, replication, etc. The system shall utilize its own techniques, functions, methods and algorithms to improve the efficiency of the distributed mechanisms to support transactional and analytical processing and operations. The system may use its own hardware, data storage and operating systems to advance the system to scale and provide greater performance. The Rendering Module is capable of rendering or sketching a visual using properties such as color gamut/palette, animation behavior, and visual elements such as shapes, dimensions, perspectives, etc. The Rendering Module supports Auto-Visualization, which understands the patterns and insights of the organizational data and provides a suitable visual appearance of the data and outliers in the form of charts by applying intelligence from the Omniscience Layer. The system allows the user to interact with, navigate and manipulate the visualizations dynamically using voice, gestures, etc. The charts created by Auto-Visualization can easily be shared with different people in the domain, and the privileges for a chart can be customized by its owner. Auto-Visualization uses mathematical formulae to align complex visualizations, thereby preventing them from overlapping with each other. The Auto-Visualization feature automatically recommends charts and visualizes the data based on the type and structure of the dataset, the historical data of the users, and domain knowledge from SMEs. The feature uses state-of-the-art recommendation algorithms that combine the metadata of the dataset and a user's preferences and interests with those of similar users in the same domain working on similar datasets and visualizations.
Auto-Visualization visualizes the data with better animations and insights, which helps the user to produce actionable decisions for the business organization. The system creates insights from the visualizations by analyzing patterns and detecting outliers/anomalies based on the distribution of the data points.
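For illustration only, the Auto-Visualization rule that recommends a chart from the type and structure of the dataset might take a form such as the following. The specific mapping from column types to chart types is an assumption of this sketch, not the patented recommendation algorithm.

```python
# Hypothetical rule table: recommend a chart type from the dataset's column
# types, as the Auto-Visualization feature is described at a high level above.
def recommend_chart(columns):
    """columns: mapping of column name -> 'numeric' | 'categorical' | 'datetime'."""
    types = sorted(columns.values())
    if types == ["datetime", "numeric"]:
        return "line"       # a measure over time reads best as a line chart
    if types == ["categorical", "numeric"]:
        return "bar"        # a measure per category reads best as a bar chart
    if types == ["numeric", "numeric"]:
        return "scatter"    # two measures suggest a correlation view
    if types == ["categorical"]:
        return "pie"
    return "table"          # fall back to a plain table otherwise

chart = recommend_chart({"date": "datetime", "revenue": "numeric"})
# chart -> "line"
```

A production recommender would also weigh user history and SME domain knowledge, as the text describes; this sketch covers only the structural signal.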
[00057] The Rendering Module also supports Dynamic Programming, i.e., the system automatically generates the program code for a given prototype with the appropriate hierarchy and style, with the help of the Data Layer and the Processing Layer.
[00058] The Processing Layer enables communication among the different layers of the System and provides access to data and services based on the privileges and the subscription of the user. It comprises the Security Module, Communication Module and Micro-Service Integration Module. The Security Module includes the Authentication Module, which authenticates the user credentials by enforcing security protocols, handles User Management and maintains the overall security of the system. The Security Module has a framework for Audit Data Capture, which captures the audit details of every table or transaction, such as the sender, the source, the medium, the time of transaction, etc. The audit data for every item stored in the database, such as the time, method and user details of any data deletion, addition or modification, is also captured. The Communication Module provides directions to communicate with the different Layers. The Micro-Service Integration Module allows or denies access to the different layers of the system based on the subscription of the user.
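The Audit Data Capture framework described above, which records who did what, to which table, and when, might be sketched minimally as follows. The field names and the `AuditLog` class are assumptions for illustration.

```python
import datetime

# Minimal audit-capture sketch: every add/modify/delete on a table is recorded
# with the user, the table, the operation, the detail and the timestamp.
class AuditLog:
    def __init__(self):
        self.entries = []

    def capture(self, user, table, operation, detail):
        self.entries.append({
            "user": user,
            "table": table,
            "operation": operation,   # e.g. "add", "modify", "delete"
            "detail": detail,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

audit = AuditLog()
audit.capture("alice", "orders", "add", {"order_id": 7})
audit.capture("bob", "orders", "delete", {"order_id": 3})
```

In practice such entries would be persisted append-only so that the log itself cannot be silently modified.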
[00059] The Omni Sense Layer (hereinafter may interchangeably be referred to as the 'Omnisense Layer') provides intelligent responses for any given request. The Omni Sense Layer comprises the Recommendation Module, Monitoring Module, Automation Module, Intelligence Module and Orchestration Module. The Recommendation Module is capable of providing suggestions or recommendations based on an understanding of the patterns, preferences and interests of the users, and suggests the next set of actions that might be significant to the user. The Recommendation Module is also capable of automatically providing recommended inputs with respect to designs, illustrations or the content of images for creating new applications. This may be performed by applying its acumen, or omniscient overarching intelligence, to give various options and benefits. The Monitoring Module constantly monitors the data for patterns, trends, anomalies, etc. The Automation Module gathers input from various modules and automates processes autonomously.
[00060] The Intelligence Module comprises the Artificial Intelligence Module and the Cognition Module. The Artificial Intelligence Module performs functions such as pattern recognition, determination, investigation, perception, analysis, etc. The Artificial Intelligence Module performs Search Driven Analytics, which enables users to perform analysis on their own data using simple search queries in natural language. This analytics identifies the keywords in the queries using a Natural Language Processing technique called keyword extraction and performs the analysis requested by the user on the data. The Artificial Intelligence Module also incorporates an important feature called Automated Machine Learning (Auto-ML), which automates the whole process from data preprocessing to model selection. In the data preprocessing phase, the system imputes null values, detects outliers in the data, normalizes values, removes special characters, handles mismatched data types and validates the data. In the Feature Engineering phase, the system chooses the best features for prediction based on the relationship between the features and the target. Further, the system chooses the best algorithm to perform analysis and prediction using hyperopt, which includes parameter optimization of the algorithm to produce better accuracy.
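For illustration, the keyword-extraction step of Search Driven Analytics described above might be sketched as follows: stop words are dropped from the natural-language query and the remaining tokens are mapped to known columns and aggregate functions. The stop-word list, the aggregate vocabulary and the function names are assumptions of this sketch.

```python
# Hypothetical keyword extraction for a natural-language analytics query.
STOP_WORDS = {"show", "me", "the", "of", "by", "for", "a", "an", "in"}
AGGREGATES = {"total": "SUM", "average": "AVG", "count": "COUNT", "maximum": "MAX"}

def extract_keywords(query, known_columns):
    # Tokenize, normalize case, and drop punctuation and stop words.
    tokens = [t.strip("?,.").lower() for t in query.split()]
    keywords = [t for t in tokens if t not in STOP_WORDS]
    # Map the surviving keywords onto an aggregate and known column names.
    agg = next((AGGREGATES[t] for t in keywords if t in AGGREGATES), None)
    columns = [t for t in keywords if t in known_columns]
    return {"aggregate": agg, "columns": columns}

result = extract_keywords("Show me the total sales by region",
                          known_columns={"sales", "region"})
# result -> {"aggregate": "SUM", "columns": ["sales", "region"]}
```

A full implementation would handle synonyms, multi-word column names and ambiguity; this sketch shows only the core keyword-to-intent mapping.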
[00061] The Artificial Intelligence Module also employs a population-based deep learning technique, which helps to choose the model with the best parameters. In this method, a random number of nodes is populated, all the nodes are trained independently with random hyperparameters, and the nodes with the highest performance are chosen at the end of training. This training is done iteratively, and the best nodes for the models are chosen and used by the system. The chosen nodes act like objects that can be mapped dynamically to other nodes using external rules and logic. Each node has the ability to perform the desired actions and has appropriate properties which can be altered and remapped dynamically. The system also has the ability to visualize the nodes and the architecture of the neural network and allows the user to interact with, alter and manipulate the architecture through voice, gestures, etc. The Cognition Module of the Intelligence Module understands the context and user preferences and provides intelligence to the various components of the System. The Cognition Module acts like a consciousness, with subconscious programs, neural networks, processors and memory, to provide spontaneous and optimal intelligence for complex scenarios. This could be applicable to decision making at the higher levels of organizations by aggregating lower-level grains of information across the organization in varied disciplines, subject matters and subject areas. For example, human resources productivity is based on many factors, such as psychological, physiological, environmental and motivational factors; however, while the energy required to complete a task can be measured, figuring out the possible causes of low or high energy relies on taking the other factors into consideration as well.
Most often in such cases, mathematics, logic and analytics pertaining to the corresponding scientific areas need to be applied to complete the analysis and provide a comprehensive and appropriate solution. The Orchestration Module enables the connection of the different modules in the Omni Sense Layer and the coordination of the execution of processes, functions, methods, etc.
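A toy sketch of the population-based selection loop described in paragraph [00061] follows: a population of candidates with random hyperparameters is scored, the top performers survive and are perturbed, and the process iterates. Here the "training" step is replaced by a stand-in scoring function; the objective, population size and perturbation range are assumptions of this sketch.

```python
import random

random.seed(0)  # deterministic for the example

def score(hyperparam):
    # Assumed stand-in objective: best performance at hyperparam == 0.5.
    return 1.0 - abs(hyperparam - 0.5)

def population_search(population_size=8, generations=5, keep=4):
    # Populate a random number of candidates with random hyperparameters.
    population = [random.random() for _ in range(population_size)]
    for _ in range(generations):
        # Train/score independently, keep the best performers.
        population.sort(key=score, reverse=True)
        survivors = population[:keep]
        # Refill by perturbing survivors (exploit the best, explore nearby).
        population = survivors + [
            min(1.0, max(0.0, s + random.uniform(-0.1, 0.1))) for s in survivors
        ]
    return max(population, key=score)

best = population_search()  # converges near the optimum of the toy objective
```

Real population-based training would score partially trained networks and copy weights as well as hyperparameters between population members; the selection loop is the same shape.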
[00062] The Omniscience Layer provides the knowledge required for generating intelligence and the know-how for the System to function and to evolve over time. The Omniscience Layer comprises the Time Machine Module, Library Module and Evolution Module. The Time Machine Module provides the ability to look back at the past and can also make predictions based on the past. The Time Machine Module considers various factors, such as time zones, fiscal calendars, calendar holidays, information, communications and collaboration, which are integrated with cross-functional aspects of the organizations to make timely decisions or to provide suggestions. It is also capable of providing timely triggers for the different layers of the system. The Library Module provides various resources, such as Language Libraries, Business Case Libraries, Code Libraries, etc., to the different Layers of the System. The Evolution Module provides the ability for the System to learn, train and evolve from pre-defined rules and manages those rules automatically based on learning how Data Stewards handle data and metadata. The Evolution Module comprises a Training Module (such as a Training Module 254, as depicted in FIG. 2) and a Learning Module. The Learning Module enhances the system's knowledge, intelligence, acumen, etc. based on the past experiences, behaviors and patterns of the user and the System. The Learning Module also acquires knowledge and intelligence using knowledge graphs, in which the rules created by subject matter experts are defined as a group of objects and relations based on human acumen, ontology, etc. The Evolution Module has various Machine Learning mechanisms and algorithms to learn matters on its own and to learn from the learning data feeds provided by the System or directly by the users through the system's interfaces.
The Evolution Module uses many techniques and algorithms, such as and not limited to Supervised Learning, Unsupervised Learning, Reinforcement Learning, Cortical Learning, Sequence-to-Sequence Learning, Genetic Algorithms, Neural Networks, Tensors, Graph Theories and Dynamic Learning, Direct Learning, Sensory-Based Learning, etc. Feature Engineering obtains various features from the domain knowledge obtained from SMEs and market trends, and utilizes a neural network technique called Deep Feature Synthesis, which creates new features from the existing features for prediction, based on primitive operations performed on the features. Feature Selection then selects the best features for prediction. Further, the system applies different machine learning algorithms to the input based on the size, distribution and structure of the data and chooses the best algorithm, or an ensemble of many algorithms, based on the performance of the model. The system evaluates the efficiency of the model using different performance metrics, to test how good the model is by comparing the model output with the actual output, and tunes the hyperparameters of the model using the Tree Parzen Estimator (TPE), which optimizes the values of the hyperparameters based on historical measurements to produce better prediction results.
[00063] The Transient Layer improves the performance of the System by storing or maintaining the most frequent requests and user session details. The Transient Layer also manages backup copies of transactions and transformations. The Transient Layer comprises the Caching Module, In-Memory Module and Session Module. The System caches analytics and pre-defined reusable objects (Analytical Objects such as queries, reports, dashboards, datasets, etc.) for quicker responses to requests. The system intelligently caches information based on the most requested or used queries and responses to make the request-to-response cycle faster.
The Search functions of the System look up layers of intelligence, such as Subject Matter Expertise, Internal Intelligence and External Intelligence, residing in the Library Module, and intelligently understand the context, intent, connotation and intonation of the requests and process the response dynamically from the cache. The System also produces results from the cache/temporary memory (In-Memory) if the requested results have already been processed. The Session Module handles global sessions based on user logs, and it also validates the user session for every internal or external request or response.
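The caching behaviour described for the Transient Layer, in which responses to the most frequently requested queries are kept in memory, might be sketched as follows. The eviction policy (drop the least-requested entry) and the class shape are assumptions of this sketch.

```python
# Frequency-aware cache sketch: responses for popular queries are served from
# memory, so the request-to-response cycle skips recomputation.
class QueryCache:
    def __init__(self, capacity=2):
        self.capacity = capacity
        self.cache = {}   # query -> cached response
        self.hits = {}    # query -> request count

    def get(self, query, compute):
        self.hits[query] = self.hits.get(query, 0) + 1
        if query in self.cache:
            return self.cache[query]          # served from cache
        response = compute(query)
        if len(self.cache) >= self.capacity:
            # Evict the least-requested cached query to make room.
            coldest = min(self.cache, key=lambda q: self.hits.get(q, 0))
            del self.cache[coldest]
        self.cache[query] = response
        return response

calls = []
def run_query(q):
    calls.append(q)                # tracks how often we actually recompute
    return f"result of {q}"

cache = QueryCache()
cache.get("top sales", run_query)
cache.get("top sales", run_query)  # second request is served from the cache
```

After the two calls above, the underlying query has executed only once.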
[00064] The Data Layer comprises the Compression/Decompression Module, Auto-Organization Module, Master Data Management, Metadata Management, Log Management, Data File Management, Incident Management and Training Data. The Data Layer integrates the data from various heterogeneous sources and provides a unified view of the data using the Recommendation Module. The System performs various analytics and querying on the data streams by considering various dependencies and relationships in the data. The Data Layer performs different types of modeling, like Business Modeling, KPI Modeling, Architectural Modeling, Data Modeling, Process Modeling, Question Modeling, etc. The high-performance serverless, stateless architecture, with eventful hardware systems, then stores the modeled data on a cloud infrastructure where machines can be added or removed concurrently as a plug-and-play feature, based on the resource requirements for storing the data. The system has an event architecture in which no history of communications or transitions is maintained. The System utilizes different compression and decompression techniques for space reduction and for hashing and indexing purposes, to ensure faster performance of data operations. The Auto-Organization Module utilizes the Universal Libraries, consisting of standard domain-specific naming conventions in the Library Module, to organize the data autonomously. The Library Module consists of Value Chain Propositions defined by the Subject Matter Experts to analyze the attributes of the data. The naming conventions are segmented based on data patterns with respect to language, uniqueness, confidentiality, security, etc. The naming conventions may also be segmented according to the requirements of the tenants or organizations, per subscriptions, agreements, mandates, etc. The non-universal values specific to a tenant or an organization shall be isolated, segmented and secured based on the security requirements.
The Master Data Management handles the master data of every single organization, and of all organizations individually, using the multi-tenant system by classifying the real-time or organizational data using data attributes. These classifications are hierarchical or non-hierarchical in nature or pattern. The Master Data enables eco-friendly operations and data processing, saving operational cost and energy by eliminating redundancies.
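The combination of compression for space reduction with hashing for indexing, mentioned in this paragraph, can be sketched with standard-library facilities alone. The choice of zlib and SHA-256, and the content-addressed keying, are assumptions of this sketch rather than the claimed techniques.

```python
import hashlib
import zlib

# Sketch: payloads are compressed before storage and indexed by a content
# hash, so lookups key on a short digest instead of the raw data.
def store_compressed(store, payload: bytes):
    key = hashlib.sha256(payload).hexdigest()  # index by content hash
    store[key] = zlib.compress(payload)        # space reduction
    return key

def fetch(store, key) -> bytes:
    return zlib.decompress(store[key])         # exact round-trip

store = {}
payload = b"organizational data " * 50
key = store_compressed(store, payload)
restored = fetch(store, key)                   # equal to the original payload
```

Content-addressed keys also deduplicate for free: storing the same payload twice reuses the same key.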
[00065] The Metadata Management involves the use of a unique model and data structure to store the information of the data, which includes objects, relationships, properties, IDs, etc., all in one format. The Log Management Module captures all transactions and their changes into logs or system audits for operations such as rollbacks, undo, redo, purge, etc. The Data File Management provides the ability to save the data in various formats, including documents, media (audio, video, etc.), etc., as it can convert an entire document into a file format quickly. The Data File Management stores images and videos in the form of a binary string of 0s and 1s and converts the string back to images and videos at the time of retrieval. The Data File Management prevents read-write conflicts between queries by creating a directed acyclic graph (DAG) scheduler, with read nodes holding the read queries and write nodes holding the write queries. Whenever a read query depends on a write query, a directed edge is created from the read node to the write node, denoting that the read operation should occur after the write operation. The Data File Management also supports replication of data by creating different conceptual, logical and physical versions of the data in different transient or persistent repositories or databases. The Incident Management 276 (as depicted in FIG. 2) handles data loss or failures by maintaining a backup of the data, so that the data can be restored quickly without much delay or lag. The system maintains knowledge in various domains, in the form of different libraries, algorithms, functions, methods, etc., as Training Data in the Data Layer.
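The DAG scheduler described above, in which a read that depends on a write must run after that write, can be sketched with a topological sort. The query identifiers and the dependency format are assumptions of this sketch.

```python
from collections import defaultdict, deque

# DAG scheduling sketch: each query is a node; a dependency from a read query
# to a write query forces the write to execute first (Kahn's algorithm).
def schedule(queries, depends_on):
    """queries: list of ids; depends_on: {read_id: [write_ids it must follow]}."""
    order = []
    indegree = {q: 0 for q in queries}
    children = defaultdict(list)
    for read, writes in depends_on.items():
        for write in writes:
            children[write].append(read)   # the read becomes the write's child
            indegree[read] += 1
    ready = deque(q for q in queries if indegree[q] == 0)
    while ready:
        node = ready.popleft()
        order.append(node)
        for child in children[node]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    return order

order = schedule(["w1", "r1", "w2", "r2"], {"r1": ["w1"], "r2": ["w1", "w2"]})
# every write appears in the order before the reads that depend on it
```

Because the graph is acyclic, the sort always terminates with a complete order; a cycle would indicate an unsatisfiable read-write dependency.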
[00066] The system, with the help of the metadata, facilitates the users in creating a whole application, and the application itself will be able to create a new application from it. The system metadata created by the SMEs is considered as nodes of objects, and each node is capable of performing a functionality. The nodes can be mapped together to provide combined and hybrid functionality and to automate various processes in the organization. These nodes can also be remapped, reused and manipulated as per the functional requirements. The system will also guide the operations and will manage the application in the data. The system has the ability to auto-save all the actions, operations, transitions, data, etc. that are being processed by the user. The system has the ability to become self-aware and to simulate, synthesize, construct, reconstruct, solve and dissolve by itself through its intelligence and dynamic programming abilities. The system automatically goes into power-saving mode during its idle time and manages its power and its high-performance hardware resources efficiently. The system also has user metadata containing descriptions of the various users, which are mapped to one another. The system has the ability to analyze and create whole reports and insights about the users based on the user metadata provided to the system. The system grows its metadata dynamically by acquiring beneficial data from various sources, like user interfaces, system interfaces, user decisions, patterns, etc., which allows the system to evolve dynamically over time.
[00067] On the whole, this robust platform delivers Artificial Intelligence, business acumen, analytics, database solutions, interactions and collaborations, all from one highly modularized, componentized and compartmentalized platform, available in cloud and on-premise versions, that can work independently as a local system for every organization or as a global system for multi-national corporations, with multiple levels of data security. The system is a high-performance supercomputing system that works in the cloud and manages the data efficiently.
[00068] The system is very advanced and smart, and it consists of micro- and nano-services that can invoke communication with other applications through integration or automation. These services can be installed on any platform and can be interfaced with other applications. In the same way, other systems can provide information using robust and secured gateways, frameworks and live streams to push data of various formats to the platform seamlessly. The system supports decentralized consoles that are integrated with its network through services. The system may package the data, voice, audio, vision, etc., together with the computing energy, such as memory, operations and instructions, into memory-activated decentralized consoles or devices that reproduce results independently. Those devices or consoles shall work with solar power, capacitors, transistors and memory. The system provides the ability to automate all tasks, operations, functions, etc. This includes managing operations and processes autonomously using advanced planning, execution, monitoring and governing capabilities.
[00069] Moreover, the Dynamic Programming abilities of the system enable it to automatically create programs, such as queries, code, enumerations, iterations, executions and automations, to churn the data, apply analytics to it using Machine Learning, analytical or statistical algorithms, and produce results. The Top-Down and Bottom-Up approaches are used by the system not only for generating analytics, text mining, and structuring or re-structuring the data, but also for enabling the system to autonomously generate the intelligence for creating applications, with the help of schemas and relationships, using a Dynamic Programming Framework. Furthermore, the system may also include the required objects, components, data, code, variables and parameters to activate the messages received from the Collaboration Module of the system as matters, to perform its work automatically in any compatible environment. These messages may have lifespan, security and decay policies/rules, and are managed automatically based on time, events, etc. These configurations shall be managed by setting their patterns. The system performs data recovery automatically; the system's Master Kit Program has the ability to recover the components and build a new application from scratch.
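For illustration, the automatic creation of queries described above might be sketched as a small program assembler that builds a query from a structured specification. The SQL dialect and the specification format are assumptions of this sketch, not the claimed Dynamic Programming Framework.

```python
# Hypothetical query generator: assemble an aggregate query from a small spec,
# as the Dynamic Programming ability is described at a high level above.
def generate_query(spec):
    """spec: {"table": ..., "aggregate": ..., "measure": ..., "group_by": ...}"""
    select = f'{spec["aggregate"]}({spec["measure"]})'
    parts = [
        f'SELECT {spec["group_by"]}, {select}',
        f'FROM {spec["table"]}',
        f'GROUP BY {spec["group_by"]}',
    ]
    return " ".join(parts)

query = generate_query({
    "table": "sales",
    "aggregate": "SUM",
    "measure": "revenue",
    "group_by": "region",
})
# query -> "SELECT region, SUM(revenue) FROM sales GROUP BY region"
```

A real generator would validate the spec against schema metadata and escape identifiers before emitting anything executable.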
[00070] It may be noted that a platform consisting of a suite of systems could be utilized by an enterprise or individuals to combine the technological and functional capabilities of the group of autonomous systems.
BRIEF DESCRIPTION OF DRAWINGS
[00071] Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[00072] FIG. 1 illustrates an exemplary environment where various embodiments of the present disclosure are implemented;
[00073] FIG. 2 illustrates a block diagram for a computer-implemented system to empower and enhance performance of an entity, in accordance with an embodiment of the present disclosure;
[00074] FIG. 3 illustrates exemplary information corresponding to a Library Module, in accordance with an embodiment of the present invention;
[00075] FIG. 4 illustrates an exemplary database for storing information corresponding to various parameters, in accordance with an embodiment of the present disclosure; and
[00076] FIG. 5 illustrates a flow diagram of a method for empowering an entity to enhance performance thereof, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
[00077] Referring to FIG. 1, which illustrates an exemplary environment 100 where various embodiments of the present disclosure are implemented. As depicted, a plurality of entities, such as entity 1 102a, entity 2 102b, and so on up to entity n 102c, are connected to a system 106 via a network, such as a network 104. Hereinafter, the entity 1 102a, entity 2 102b, and so on up to entity n 102c may collectively be referred to as 'entities 102'. The system assigns privileges to entities, based on their roles, to access the data. The entities may include, but are not limited to, an individual user, a group of individuals, an organization or a business unit, and an external application. Hereinafter, the term 'entity' or 'entities' may interchangeably be referred to as 'user' or 'users', respectively. Each user of the users 102 may access the system 106 through a User Access medium. Examples of the User Access medium (corresponding to each user) may include, but are not restricted to, devices, machines and peripherals such as a laptop, tablet computer, smart phone, personal digital assistant, cell phone, personal computer, robots, gaming gadgets, and so on. The system may allow the human body to interact with the system without using common input devices, such as a keyboard, mouse, controller or any other voice-entry mechanism. The present invention provides a system and a method that can receive and manage any type of input and send it to the Deciding Layer 112 of the System 106.
[00078] FIG. 1 may be understood more clearly in conjunction with FIG. 2 that illustrates modules for (but not limited to) receiving inputs and generating analytics, insights, responses, alerts, etc. by using intelligence. Specifically, FIG. 2 illustrates a block diagram for autonomous information system, in accordance with an embodiment of the present disclosure.
[00079] As depicted (in FIGS. 1 and 2), the data received as input from the Omni Presence Layer 108 is fed into the I/O Processing Layer 110 (or the 'Processing Layer 110'). The main function of the I/O Processing Layer 110 is to receive the input (data of any format) through the Receiving Module 210; validate, parse and enrich the received data with the Enrichment Module 214 without any data leakage; translate the data into a system-understandable format with the Translation Module 216; manage the data traffic and concurrency with the help of the Distribution Module 220; and finally render the complete set of data using the Rendering Module 222 and pass it on to the Deciding Layer 112. A detailed description and the working of each of the above-mentioned modules are explained below.
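A hedged sketch of the staged flow described above follows: receive, enrich (validate/clean), translate to an internal form, then render the output. Only the stage ordering comes from the text; the stage implementations here are placeholder assumptions.

```python
# Placeholder pipeline sketch for the I/O Processing Layer's stage ordering.
def receive(raw):
    return {"payload": raw}                      # wrap the incoming data

def enrich(msg):
    msg["payload"] = msg["payload"].strip()      # validate/clean: remove noise
    return msg

def translate(msg):
    # Convert to a "system-understandable" internal form (here, tokens).
    msg["tokens"] = msg["payload"].lower().split()
    return msg

def render(msg):
    return " ".join(msg["tokens"])               # produce the final output

def process(raw):
    msg = receive(raw)
    for stage in (enrich, translate):            # stages run in the stated order
        msg = stage(msg)
    return render(msg)

output = process("  Show Total Sales ")
# output -> "show total sales"
```

Keeping each stage a pure function of the message makes the pipeline easy to reorder, extend or distribute, which is the point of separating the modules.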
[00080] The Omni Presence Layer 108 serves to enable interaction between users or applications and the System 106. The system 106 may be a standalone intelligent system, or alternatively may be utilized as a plug-in to provide intelligent output for other systems or applications. In an embodiment, the system 106 may be a computing device, such as, but not limited to, a personal computer, a laptop, a mobile phone, a smart-phone, and so forth, to generate and provide intelligent output. The users may provide inputs in the form of queries, commands, data files, signals, etc. through the System's Embedded User Interface, which may be voice-enabled, gesture-enabled, video-enabled, touch-enabled or click-enabled, or through a combination of icons, via its own proprietary or external peripherals, or online through the Presentation Module 202. The system receives the input query from the user, analyzes the relevant queries and responses, and extracts the context of the conversation using sequence-to-sequence mapping and Memory Network techniques, where the received input is fed to an encoder that vectorizes the input into a variable-length sequence; the sequence is processed using the context vector, and the appropriate output sequence (the response) is produced in the decoder phase using the intelligence obtained from the Omniscience Layer 116. The system also handles images, such as gestures, as input by creating a binary mask of the image, computing the contour and converting the image to bits, which hold the pixel-information matrix. Further, the system evaluates the matrix with a Neural Network model trained on an image dataset to identify the image. The Neural Network model has neurons arranged in three dimensions, namely width, height and depth, which capture the pixel information of the image; a filter is slid over the data to form the convolutional dot product of the image.
The matrix is scaled down using pooling techniques and then redirected to the fully connected layer, where the model learns the features of the matrix and recognizes the image given to the system. Furthermore, the system 106 has an intuitive Query Generation tool to facilitate Auto-Computation and Auto-Query building by understanding the patterns and preferences of the users and other supporting factors, and recommends the best alternative queries, which enables users with minimal querying knowledge to easily construct queries. The queries interact with the databases and fetch the responses in minimum time using multi-threading concepts; the system processes the queries in parallel to avoid queues and also forms nested queries to perform the database query operations in a sequential manner. The system has the ability to receive a large amount of data from the user and store it in chunks based on the system memory availability; it also automatically prevents the system from overloading, thereby reducing lag in the performance of the system.
[00081] The Presentation Module 202 has the ability to provide different user interfaces for the different roles of the user or the components of the product suite. Further, it has the ability to combine the power of voice, language, visuals and mathematics to facilitate a unique new experience for users interacting with the System 106. The Presentation Module 202 has the intelligence to automatically display visual forms in place of text and allows users to interact with data through a highly intuitive, visual-based navigation approach, through icons, emoticons and dynamic visuals (pre-defined figures, contours, shapes and colors that may be automatically created by the system), to construct, browse and review the resultant data as visuals. The system utilizes homeomorphism techniques to relate keywords to reference symbols, or visuals to other forms, and so on, which helps to transform one into another dynamically.
This helps the system to communicate visually as well as textually, and the system utilizes this to enhance both performance and experience. The system learns by itself to determine and predict thoughts, and also to reproduce figments of imagination or thoughts as desired visuals. One of the fundamental principles behind the System 106 is to provide an intuitive tool, along with design patterns, to achieve all of the above from an easy-to-use user interface that provides voice-based, gesture-based and touch-based interaction to accomplish the tasks.
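The first two steps of the gesture-image handling described in paragraph [00080], building a binary mask from pixel intensities and computing the contour, can be sketched in pure Python. The intensity threshold, the 4-neighbour contour rule and the toy image are assumptions of this sketch.

```python
# Sketch: threshold pixel intensities into a binary mask, then collect the
# contour as the mask pixels that touch the background (or the image border).
def binary_mask(image, threshold=128):
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def contour(mask):
    h, w = len(mask), len(mask[0])
    edge = set()
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or mask[ny][nx] == 0:
                    edge.add((y, x))   # foreground pixel on the boundary
                    break
    return edge

image = [
    [0, 0, 0, 0],
    [0, 200, 200, 0],
    [0, 200, 200, 0],
    [0, 0, 0, 0],
]
mask = binary_mask(image)
edge = contour(mask)  # here every foreground pixel touches the background
```

Downstream, the text describes feeding the resulting pixel matrix to a convolutional network; that stage is omitted here to keep the sketch self-contained.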
[00082] The Presentation Module 202 co-ordinates with the Enrichment Module 214 of the I/O Processing Layer 110 for validation, and with the appropriate Libraries from the Library module 248 of the Omniscience layer 116, depending on the type of input. The Presentation Module 202 provides an advanced User Interface and an intelligent set of functions to sketch visual representations and Illustrations as per user preferences. The Presentation Module 202 utilizes its intelligence to understand a specific user's intentions, thoughts, and patterns in strokes and movements, develops perceptions and completes the illustration. The Presentation Module 202 also makes determinations by applying cognitive abilities provided by the Intelligence Module 234 and automatically starts sketching to complete what the user desires to complete. The Presentation Module 202 uses an intuitive, intelligent matrix to follow dots, lines and associations to figure out what one wants to sketch, such as sketching icons, sketching illustrations for storyboarding, etc. The Presentation Module 202 is also capable of gathering inputs from the user in the form of voice, gestures, or a combination of icons through online/onscreen sketching, and infers what one wants to accomplish by looking up the appropriate Libraries, such as the Language Library 302, Visual Library, etc., and applying the mathematical equations, expressions, functions, methods, algorithms, programs, system commands, etc. to sketch automatically. For example, if the user provides the input in the form of a full or partial sketch, the Presentation Module 202 co-ordinates with the Enrichment Module 214 and the visual, key-stroke geometry, language libraries, etc. from the Library module 248 dynamically to predict and validate the inputs.
[00083] The Collaboration module 204 of the Omni Presence Layer 108 uses industry-standard messaging protocols or its own protocols and formats to support all types of communications, and provides the ability to integrate into its own streams to allow messages to flow and be supplied and consumed by different services, systems or functionalities. The system has intelligent logging mechanisms to log all messages and reproduce/replicate those messages depending on the requirements. In case of multiple users in group discussion scenarios, the inputs may be provided through the Collaboration module 204. The Collaboration module 204 allows multiple users to communicate with the System 106 at the same time through different mediums such as audio, video, chat, e-mail, Virtual Presence, etc. The users may interact with each other or with the System 106. The Collaboration module 204 receives the inputs from multiple users and associates the inputs with the user roles, history, Context of the discussion, etc. with the help of the Omniscience layer 116. The Collaboration module 204 may also receive commands from the Monitoring Module 242 to start a new discussion with the relevant users based on their roles, user privileges, availability, etc. With the help of the Enrichment Module 214, the Collaboration module 204 validates the inputs of the various users in the discussion and provides Dynamic Responses, Relevant Information, News, Root causes to problems, etc. back to the users. In case an action needs to be performed by the System 106, such as setting up Events, Alerts, Objectives, etc., the inputs are fed to the Automation module 244 of the Omni Sense Layer 114 (hereinafter interchangeably referred to as the Omnisense Layer 114), which gathers input from various modules and automates processes like modelling, analytics, prediction, etc. autonomously.
The interactions made among the users and with the System 106 flow as inputs to the I/O Processing Layer 110 and are processed in a similar fashion as the inputs from the Presentation Module 202, as described in the paragraphs below. Apart from the user inputs, External Applications, Device data (IoT), and data from other Technologies, Products, Platforms and Environments may also provide inputs in various formats using robust and secured Gateways, Frameworks, Live streams, etc., and execute Processes, Jobs and Tasks, and monitor them through the Interface Module 206 using advanced Connectors. The Integration module 208 has the ability to create a new environment and manage the data, and to migrate or import data, such as data from databases, partly or entirely; apart from that, it also integrates data from heterogeneous sources into the system 106 using its one or more proprietary Data Migration tools, Integration tools and techniques. The Integration module 208 has the ability to load the data faster by predicting the Domain-specific data with the help of the Library module 248. This way, the time taken to load all the values is reduced. The Integration module 208 utilizes a combination of techniques to traverse and scan the records in the most optimal way to map and extract the values to load the data. These techniques may include, but are not limited to, Validation of the data, Combinatorial Principles and Probabilities of the combinations of the Data, Time and other Dimension-based combinational patterns, hashing and mapping, compression, decompression, predictions, etc. The Integration module 208 also consists of many reusable components and elements, such as Operational Calendar, Scheduling and Process Maps Designer, Simulation, Error Handling, Reporting and Notification, etc.
The Integration module 208 works with the Receiving Module 210 in an integrated way for the Administrators of an Organization to perform Planning, Strategizing, Complex Execution Planning and to manage all processes, jobs, tasks, calendars, events, and logistics efficiently with the help of a Bot (External application) or manually through the Presentation Module 202.
[00084] The inputs received from the Presentation Module 202, Collaboration module 204, Interface Module 206 or Integration module 208 are sent to the I/O Processing Layer 110.
[00085] The input received in any language, such as Notation or Symbol based Language, Visual based Language, Business Automation Language, Code Word Language, Body or Gesture Language, etc., from any one of the modules of the Omni Presence Layer 108 is sent to the Receiving Module 210 of the I/O Processing Layer 110. The System also receives input in the form of a Mathematical Query Language, which takes mathematical formulae as input and performs the querying on the distributed database, returning the appropriate result. The Receiving Module 210 is capable of encrypting or decrypting the input data received from the Omni Presence Layer 108.
[00086] In case the input received by the I/O Processing Layer 110 is in the form of a signal, the Transmission module 212 processes the signals. The Transmission module 212 processes data with the help of its custom functions, methods and programs by applying digital and analog Signal Processing principles to receive, process and transmit the real-time data and signals to different systems, users and organizations. This can be combined with particle data to interface with or leverage Quantum Computers and storage mechanisms. The Transmission module 212 processes and converts the received signals to a system-understandable format and sends them to the Deciding layer 112 of a controlling layer 113.

[00087] The inputs received by the Receiving Module 210 are validated by the Enrichment Module 214. The Enrichment Module 214 has the ability to browse and query Objects (Docs), Content (Topics, Taxonomies, Data), Audio, Video and Signals. The Enrichment Module 214 processes user queries or commands (Natural language or Query language) and bulk data differently. In case of user queries or commands, the Enrichment Module 214 receives the input in the form of requests and determines the correct queries (misspelt table or field names, system commands, incorrect syntax, typographical errors, etc.) using the metadata in the Data Layer 118, looking up the corresponding Semantic and Syntactic Libraries from the Library module 248 and applying cognition obtained from the Cognition Module 238. Search Driven Analytics in the system allows the users to ask questions in a natural language, which get converted to data queries upon which analytics is performed. In case of bulk data import, be it structured or unstructured, the Enrichment Module 214 receives the data from the Integration module 208 and segregates it using separators with the help of the Syntactic library 304 from the Library module 248.
Further, the Enrichment Module 214 cleanses and synergizes the input to maintain the accuracy and the integrity of the data, based on a pre-defined set of rules and intelligence derived from the pattern learning and cognition provided by the Cognition Module 238. The Enrichment Module 214 also utilizes a unique proprietary Hashing technique to compress keywords and values, using the keys to map to the values and keywords, for handling Data Migration through the Integration module 208 to load huge volumes of data at high speed efficiently from the Data Layer 118. The Enrichment Module 214 provides the users the ability to construct, create or define rules that are applicable to Metric Calculations, Data Validation, Cognition, Decision, Regulation, Administration, Automation, Communications, Behavior, Application, Learning, Memorization, Articulation, Action, Determination, Pattern Recognition, Perception, Reflex, Visual/Textual/Audio Representations, etc., using functional or math elements such as operators and functions, in the form of notations, symbols and icons together, in the most succinct way that the users can comprehend and relate to the logic, the construct and the context of it. In case the input is ambiguous even after validation, the Enrichment Module 214 will iterate the process of receiving and validating the input by understanding the context, connotation, intonation, the intent, etc. using the Cognition Module 238 until it receives sufficient input from the user. For example, if the user provides an incomplete sketch, the Enrichment Module 214 recognizes what the user desires to sketch and completes the sketch using the visual, key-stroke geometry, language libraries, etc. from the Library module 248. In another example, in a group discussion the Enrichment Module 214 associates the input with the context, the role of the user, the direction of discussion, etc.
[00088] Further, the Enrichment Module 214 parses the data, sends the data query or command to the Translation module 216, and lets the data flow to the Transformation Module 218.
[00089] The Translation module 216 may receive the validated query or command in any language from the Enrichment Module 214 and translate it to a Unified Language with the help of the Semantic, Language and other Libraries from the Library module 248; it also connects with the Cognition Module 238 to extract keywords from the query using natural language processing techniques and to understand the user-specific language, acronyms, colloquialisms, typographical errors, key-stroke errors, etc. The Translation module 216 has the ability to create its own knowledge base using Knowledge Frameworks by converting constructs automatically into questions, phrases, clues, answers and responses and condensing them into Short-Forms. The processed query or command from the Translation module 216 flows to the Distribution module 220.
[00090] The Transformation Module 218 may receive the input from the Enrichment Module 214 and may transform the data structure, data type or data representation from one form to another, such as SDR (Sparse Density Representations), SVD (Singular Value Decompositions), Vectors, Tensors, data from a relational database format, Graph based data format, etc. The knowledge graph showcases the connection between the function and sub-function of each domain. The data in each domain is categorized, clustered and fetched in the knowledge graph based on the queries.
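One of the representation changes named above can be sketched concretely: relational rows re-shaped into a graph-based format, so that the function/sub-function connections of a domain become traversable. The domain and field names here are illustrative assumptions.

```python
# Sketch: relational (parent, child) rows transformed into an
# adjacency-list graph, one assumed form of the "Graph based data
# format" named in the specification.
from collections import defaultdict

rows = [
    ("Finance", "Accounts Payable"),
    ("Finance", "Accounts Receivable"),
    ("HR", "Payroll"),
]

def to_graph(edges):
    graph = defaultdict(list)
    for parent, child in edges:
        graph[parent].append(child)   # connect function -> sub-function
    return dict(graph)

knowledge_graph = to_graph(rows)
print(knowledge_graph["Finance"])  # ['Accounts Payable', 'Accounts Receivable']
```

Once in this form, queries about a function's sub-functions become a single dictionary lookup instead of a table scan, which matches the stated purpose of fetching clustered domain data from the knowledge graph.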
[00091] The Transformation Module 218 performs Auto-Modelling using various techniques, such as, but not limited to, Topic Modeling, Entropy based Modeling, SDR (Sparse Density Representations) based Modeling, Entity Relationship Modeling, Object Modeling, Clustering & Segmentation based Modeling, Custom Hierarchical Clustering-based Modeling, Inference & Rationale-based Modeling, Visual Contour & Direction Based Modeling, Vectors & Tensor Based Modeling, etc., and automatically models the data. The system provides the ability to package models along with information such as programs, codes, mathematical equations, functions, images, parameters, behavior patterns, instructions, manifesto/rules, etc. by applying integrated science from the Omniscience layer, depending on the cause/nature of the model's output that is supposed to be produced. The system's artificial intelligence uses its cognitive abilities combined with the language to transform the intentions, desires and requirements and determines the purpose. The Transformation Module 218 proposes a schema not only based on the labels, explicit relationships, etc. but also by analyzing the implicit relationships among the data. The Transformation Module 218 utilizes different components, such as Data Dictionary and Naming Convention files supplied by the organizations or the users, or else the system applies its own intelligence and maps physical table names, column names, etc. automatically to pre-defined Global Libraries from the Library module 248 using advanced techniques. The Transformation Module 218 allows users to make changes to the data structures and uses its core Intelligence in such a way as to enable its model to connect to the underlying data structure. The system will store the data in a unique format as well as in the inputted format.
The Transformation Module 218 also provides abilities to the user to compare data in multiple ways along with the combination of Visual Representation of data through graphs, charts, grids, maps and as texts.
[00092] Further, based on the industry domain, the Recommendation module 240 (of the Omnisense layer 114, which is a sub-layer of the controlling layer 113) may provide suggestions for modelling the data. For example, the users can model the data based on their preferences, or they can use the system suggestions. The Rendering Module 222 renders or sketches a visual using properties such as color gamut/palette, animation behavior, and visual elements such as shapes, dimensions, perspectives, etc. The transformed data structure and the suggested data model are rendered by the Rendering Module 222. The transformed data flows to the Distribution module 220.
[00093] The Distribution module 220 receives the processed query or command from the Translation module 216 and the transformed data from the Transformation Module 218. The Distribution module 220 processes the query or command from the Translation module 216 and checks for the best alternative query or command. In addition to this, the Distribution module 220 generates Mathematical Equations for the query. Further, the Distribution module 220 chooses the best method of processing the query or the command based on optimal performance. The Distribution module 220 distributes the transformed data from the Transformation Module 218 using concurrency control techniques. These techniques handle multiple requests from the users and distribute the resources in parallel without holding them in a queue, which helps in dividing and managing the processes when more than one process runs simultaneously.
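The parallel distribution of requests described above can be sketched with a thread pool that serves requests concurrently rather than queuing them one behind another. This is a generic illustration, assuming a stand-in `run_query` function; the disclosed concurrency control techniques are not reproduced.

```python
# Generic sketch: multiple user requests handled in parallel by a
# thread pool instead of waiting in a single queue. The run_query
# function is a hypothetical stand-in for a real database call.
from concurrent.futures import ThreadPoolExecutor

def run_query(query):
    return f"result of {query}"

queries = ["q1", "q2", "q3", "q4"]
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() dispatches all four queries at once and returns the
    # results in the original submission order.
    results = list(pool.map(run_query, queries))
print(results)
```

`ThreadPoolExecutor.map` preserves submission order while the workers run side by side, which mirrors the claim of distributing resources in parallel while still managing more than one process at a time.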
[00094] The processed input is then sent to the Deciding layer 112, which is a sub-layer of the controlling layer 113. The Deciding layer 112 decides the course of action based on the implicit or explicit commands received (Example: Command, Data import, etc.). The Communication module 228 co-ordinates with the Authentication Module 226 of the Security module 224 for getting Authentication to access the data, and with the Micro-Service Integration Module 230 for getting Authentication to access the Services. The Security module 224 of the Deciding layer 112 has a set of templates and logic to enforce proper Audit Data Capturing, and has a unique combination of encryption and decryption techniques, models, mathematical equations, algorithms and programs to enforce and manage security. The Micro-Service Integration Module 230 manages user subscriptions and restricts the user access to the different layers of the system 106. The Micro-Service Integration Module 230 also helps the system 106 to work with independent layers, components, modules, etc. If the command is for data storage, the Deciding layer 112 will redirect to the Data Layer 118 through the Communication module 228. If the request or command is for query, analytics, etc., the Deciding layer 112 will redirect to the Omni Sense Layer 114 through the Communication module 228. The above-said process is performed with the help of Multithreading and Parallelism techniques. Queries will be routed to specific data stores and structures depending on the nature of the input type, format, query, etc. If it is a select, aggregate or analytical read-only type of query, then it will be routed to efficient data structures, while operations like update, delete and insert may be routed to other proprietary data structures.
[00095] Once the command for data storage reaches the Data Layer 118, it will check for the best data structure in terms of Dimensionality Reduction to organize and store the data based on domain, context, etc. by referring to the naming convention from the Library module 248. Different organizations look selectively at different information that is very specific to each organization, based on needs and security. In such cases, the system 106 provides the ability to store organization-specific data in a secured way, such that information is segmented and segregated appropriately with private keys and rules to ensure privacy and access control. The data queries given by the user reach the query parsing engine, which checks the validity of the query using shift-reduce parsers. The validated queries then reach the caching layer, in which a graph structure is maintained, where the queries are mapped with the information that is used to retrieve the data from the database based on the metadata provided by the Metadata management module. Initially the cache layer will be empty. On every cache miss, the system directly goes to the database, retrieves the information and maps the queries in the caching layer graph using dynamic binary tree structures and graph coloring algorithms. A ranking algorithm is applied, which ranks the queries based on the time and frequency of access. The System prunes the graph by deleting the inactive query nodes when the graph size crosses the cache buffer size. The Caching layer prioritizes the most frequently asked queries and keeps the corresponding query nodes in the caching layer for faster access. The graph in the caching layer provides recommendations and suggestions based on Triadic Closure Relationships. The Auto-Organization Module 266 discovers the data organization process, gathers all statistics (360 degree) about the input, such as width, length, depth, quality metrics, etc., and creates an internal model automatically.
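The caching behaviour described above, namely a miss that falls through to the database, ranking by time and frequency of access, and pruning when the buffer size is exceeded, can be sketched minimally. The ranking formula is an assumption; the disclosed graph structure and graph-coloring algorithms are not reproduced here.

```python
# Minimal sketch of the caching-layer idea: cached queries are ranked
# by (frequency, last access time), and the lowest-ranked entry is
# pruned when the cache exceeds its buffer size. Assumed illustration,
# not the disclosed graph-based implementation.
import time

class QueryCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}  # query -> (result, frequency, last_access)

    def get(self, query, fetch_from_db):
        if query in self.entries:
            result, freq, _ = self.entries[query]
            self.entries[query] = (result, freq + 1, time.time())
            return result
        result = fetch_from_db(query)          # cache miss: go to the database
        self.entries[query] = (result, 1, time.time())
        self._prune()
        return result

    def _prune(self):
        while len(self.entries) > self.capacity:
            # Delete the entry with the lowest (frequency, last_access) rank.
            victim = min(self.entries, key=lambda q: self.entries[q][1:])
            del self.entries[victim]

cache = QueryCache(capacity=2)
db = lambda q: f"rows for {q}"
cache.get("SELECT a", db)
cache.get("SELECT a", db)   # hit: frequency of "SELECT a" rises to 2
cache.get("SELECT b", db)
cache.get("SELECT c", db)   # exceeds capacity: "SELECT b" is pruned
assert "SELECT a" in cache.entries and "SELECT b" not in cache.entries
```

Frequently asked queries accumulate higher frequency counts and therefore survive pruning, which matches the stated behaviour of keeping the most frequently asked query nodes for faster access.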
The Compression/Decompression Module 256 utilizes its advanced Data Compression Framework to compress the incoming data and enhances the performance of the system 106 by ensuring reduced data storage. It consists of layered compression techniques, which are capable of quickly reducing the size of the data to a great extent by using advanced standard algorithms such as LZ77, arithmetic encoding, etc. The custom combination of functions, methods, mathematical equations and algorithms helps to reduce the storage space occupied by the data to its maximum possibilities. The compressed data is stored as files by the Data File Management module 264.
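The layered framework above is proprietary, but the size reduction from an LZ77-family algorithm can be shown with the standard library's zlib, whose DEFLATE format combines LZ77 matching with Huffman coding. This is a generic stand-in, not the module's custom combination.

```python
# Generic stand-in for the LZ77-family compression named above:
# zlib's DEFLATE shrinks repetitive data sharply before it would be
# stored as files. Illustrative only.
import zlib

payload = b"sensor reading, sensor reading, sensor reading, " * 100
compressed = zlib.compress(payload, level=9)

# Decompression restores the data exactly, byte for byte.
assert zlib.decompress(compressed) == payload
print(len(payload), "->", len(compressed))
```

Highly repetitive inputs like the one above compress to a small fraction of their original size, which is the effect the module relies on to reduce data storage.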
[00096] The Metadata Management 260 of the Data Layer 118 creates and updates the Metadata based on the data organization structure. The Master Data Management 258 framework is responsible for maintaining and updating the master data containing the location details of the stored files, which helps to process the query or command for accessing or retrieving the stored data files when needed. Furthermore, it creates multiple duplicates of the data files in the system using the Data Recovery Framework to facilitate the user to recover the data during a data disaster. The Data Recovery Framework of the system may have a Master kit program that can create a new full application from scratch. For example, if the front-end application or the server product crashes and cannot be recovered, then the Master kit program will reinstall the required components and create all the necessary components, such as data, codes, forms and essentials, required for the system to come back alive. This may be applicable for in-house (not cloud) instances. Further, the Data Layer 118 maintains and updates the log details of all the actions or transactions performed, with reference to the Time Machine Module 246. The Log Management 262 utilizes its Change Data Capture Framework to capture all changes intelligently and automatically with its proprietary data structure, and it also combines with the Time Machine Module 246 to provide the ability to look at data as-is, as-was, as-to-be and at different time frames specifically. The Log Management 262 combines the power of Change Capture and Distributed Mechanisms and storage for providing real-time updates or feeds. The Log Management 262 also captures the transactions and their changes into logs for operations such as rollbacks, undo, redo, purge, etc. Finally, the Data Layer 118 gives the response as a success message or error message through the Rendering Module 222.
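The as-is/as-was capability described above can be sketched with an append-only change log that replays a record's state at any time frame. The key names and timestamps are illustrative assumptions; the proprietary Change Data Capture data structure is not reproduced.

```python
# Sketch of the as-of-time lookup idea: every change is captured in an
# append-only log, and the value of a record "as-was" at any point in
# time is recovered from it. Field names are assumptions.
import bisect

class ChangeLog:
    def __init__(self):
        self.log = {}  # key -> sorted list of (timestamp, value)

    def capture(self, key, timestamp, value):
        self.log.setdefault(key, []).append((timestamp, value))
        self.log[key].sort()

    def as_of(self, key, timestamp):
        entries = self.log.get(key, [])
        times = [t for t, _ in entries]
        # Latest change at or before the requested timestamp.
        idx = bisect.bisect_right(times, timestamp)
        return entries[idx - 1][1] if idx else None

log = ChangeLog()
log.capture("price", 1, 100)
log.capture("price", 5, 120)
assert log.as_of("price", 3) == 100   # as-was
assert log.as_of("price", 9) == 120   # as-is
```

Because the log is append-only, the same structure supports rollback, undo and redo: earlier states are never overwritten, only superseded.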
[00097] The Omni Sense Layer 114 receives the input from the Deciding layer 112. All the activities needed to process the input are controlled, coordinated, executed and managed by the Orchestration module 232. The Orchestration module 232 comprises controllers that control the flow of the requests, routers that channelize the requests and responses, governors that prioritize and optimize the requests and responses, and supervisors that ensure request completion, logging, threads and sessions. The Orchestration module 232 may receive the command implicitly from the Omni Sense Layer 114 or explicitly from the Deciding layer 112, and it will redirect to the concerned modules of the Omni Sense Layer 114. The Orchestration module 232 determines whether the command requires intelligence from the Intelligence Module 234 or only data retrieval from the Data Layer 118. For generating intelligence, the Intelligence Module 234 analyzes the related data from the Data Layer 118 or the Transient Layer 120 using various approaches, such as the Top-Down approach, Bottom-Up approach, etc., to create kinesis of data. The data will be available in the same stream, and the autonomous system analyzes and fetches the required data based on the query. The Orchestration module 232 may check whether the request is already present in the Transient Layer 120. The Caching Module 270 of the Transient Layer 120 stores the data for the most frequent requests made by a user and directs towards the appropriate libraries, knowledge, etc. through the Deciding layer 112 for faster recommendations. The Session Module 274 provides session information for a particular user to the Caching Module 270. In cases where similar requests are made by multiple users frequently over a period of time, the In-Memory Module 272 of the Transient Layer 120 aggregates such requests, gathers appropriate data and directs towards the corresponding libraries, knowledge, etc.
In case the request needs intelligence, the Orchestration module 232 sends the recommendations through the Recommendation module 240 for the most frequent requests based on data from the Caching Module 270 and the In-Memory Module 272. If the request is only for data retrieval, the Orchestration module 232 sends the data from Caching Module 270 or In-Memory Module 272 to the Rendering Module 222. In case the request is not present in the Transient Layer 120, the Orchestration module 232 requests the Intelligence Module 234 for processing the new request.
[00098] The Intelligence Module 234 requests the Orchestration module 232 for the necessary data to generate intelligence. The Intelligence Module 234 may generate intelligence with the help of mathematical logic and regular expressions that enable the system 106 to work in Top-Down and Bottom-Up approaches, with abilities to provide interesting possibilities to work with the data, and also ensures smooth, precise and concise interactions. The Top-Down and Bottom-Up approaches are used in autonomous systems for generating analytics, Text Mining, and structuring or re-structuring the data, and also for enabling the system to autonomously generate intelligence for creating applications with the help of schemas and relationships, using a Dynamic Programming Framework together with the Rendering Module 222, which is capable of dynamically creating the programs or codes on the fly to create apps, user interfaces, databases, functions, processes, methods, etc. The Top-Down approach involves automatically understanding the landscape, model, structure and labels of the data, derives the significance of the business imperatives, criticalities and intricacies associated with various aspects of business that are present in the data, and proactively provides deep insights from the data and recommendations to drive business decisions and solutions by determining the issues and ways to solve business problems. The system has the ability to transform information and models to various forms; for example, the system converts processes, data models, information, etc. to applications automatically or with minimal inputs.
[00099] The Intelligence Module 234 may use a Bottom-Up approach of understanding the implicit context of data as values, labels, relationships and associations to language, classifications, etc. The Bottom-Up approach also has the ability to automatically apply different metrics or arithmetic calculations (performing different functions, logic, operators) to aggregate the data based on various possibilities by digging or mining through the Atomic levels of the data in the Data Layer 118. The Intelligence Module 234 aggregates the data by understanding the implicit values, associations, correlations of values and of values to labels (Field Names), derives the context by itself and maps the Context Reference Libraries from the Library module 248. In addition to the Top-Down and Bottom-Up approaches, the Intelligence Module 234 may also request the Orchestration module 232 for knowledge provided by Subject Matter Experts (SMEs) through the Library module 248 for generating intelligence. The Intelligence Module 234 receives the related data based on the Top-Down and Bottom-Up approaches and the knowledge provided by the Omniscience layer 116. The Artificial Intelligence module 236 of the Intelligence Module 234 performs the appropriate tasks, such as identifying the problems, identifying required KPIs (Key Performance Indicators), Feature generation, selection of algorithms, Clustering, Predictive analysis, Correlation, Semantic analysis, Scoring, Ranking, etc., using the appropriate libraries, such as Semantics, Machine learning libraries, etc., from the Library module 248.
The Artificial Intelligence module 236 also handles the dynamics of the organization's movement from one state to another by tracking the Slowly Changing Dimensions (SCD) of the data in the Data Layer 118, which helps the Artificial Intelligence module 236 to understand the changes that happened from one state to another in terms of all interactions, activities, transactions, emotions, sentiments, decisions, movements, changes, etc. The slowly changing dimensions create a log which has an entry for every activity, transaction, etc., which helps to understand the behavior and intent of the user data flow. At times, the flow will also be useful to roll back to the previous states and actions.
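The SCD tracking described above can be sketched with a Type 2 slowly changing dimension, one common way to keep such state history; the specification does not say which SCD type the module uses, so this is an assumed illustration.

```python
# Sketch of tracking a Slowly Changing Dimension: each state change
# adds a new row with a validity range, so earlier states remain
# inspectable and can be rolled back to. Type-2 SCD is assumed here;
# entity names and timestamps are illustrative.
def apply_change(history, entity, new_state, at_time):
    for row in history:
        if row["entity"] == entity and row["valid_to"] is None:
            row["valid_to"] = at_time          # close out the current state
    history.append({"entity": entity, "state": new_state,
                    "valid_from": at_time, "valid_to": None})

history = []
apply_change(history, "customer-7", "prospect", 1)
apply_change(history, "customer-7", "active", 4)

current = [r for r in history if r["valid_to"] is None]
assert current[0]["state"] == "active"
# The earlier state is preserved with its validity window intact.
assert history[0] == {"entity": "customer-7", "state": "prospect",
                      "valid_from": 1, "valid_to": 4}
```

Because no row is ever deleted, the full sequence of states from one to another stays available, which is what lets the module reason about the organization's movement over time.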
[000100] By understanding the changes, the Artificial Intelligence module 236 determines the state by taking the matters, measures and efforts, position or coordinates, time, etc., applies physics from the Science library, and determines various factors and the relevant information pertaining to the state in order to make better determinations, judgments and predictions. The Artificial Intelligence module 236 initially uses the Training Data 268 from the Data Layer 118 to train the various algorithms present in the Intelligence Module 234. The trained algorithms act on the stream of gathered data from the Data Layer 118 to generate intelligence thereon. The algorithms constantly learn from every interaction, such as clicks, gestures, user conversations, etc., understand the context, topic, significance, etc., and update the system's knowledge, intelligence and acumen through the Evolution Module 250 of the Omniscience layer 116.
[000101] The Monitoring Module 242 continuously monitors the system 106 using its own techniques, functions, methods and algorithms, pre-aggregates and caches data based on usage patterns, and also identifies probabilities based on the implicit patterns of the data and the policies, procedures, perceptions, etc. This enhances the analytical performance of the system. The underlying data changes are automatically applied to the aggregates. This is further extended to managing the already saved or viewed reports or cached datasets by updating or refreshing them when the underlying data is changed at the lowest levels. Furthermore, the Monitoring Module 242 automatically monitors and detects anomalies and sends implicit commands to the Intelligence Module 234 through the Orchestration module 232. The Monitoring Module 242 does real-time monitoring and provides Proactive support with the help of the Time Machine Module 246 by automatically rolling time periods. When an anomaly is detected, it passes a command to the Intelligence Module 234 through the Orchestration module 232 to find the root cause of the anomaly. The data choreography is done by the Orchestration module 232. The Intelligence Module 234 uses techniques such as Correlation, Pattern recognition, etc. to identify the positive and negative impacts of the anomaly. The Intelligence Module 234 may use the Top-Down and Bottom-Up approaches to retrieve related data. The Monitoring Module 242 considers the personal (Employee of the organization) or the professional (Organization) factors, such as effort, energy, rate, measure, force, gravity, stability, strength, environment, situation, fate, destiny, etc., to identify the positive and negative impacts. The positive impact is identified with the help of variables like Dreams, Ambitions, Goals, Desire, Drive, etc., and the negative impact is identified with the help of variables like Deficiency, Depression, Delirium, Disastrous, Destitute, etc.
The Monitoring Module 242 also provides the users the ability to manage or control the factors used to detect the anomalies, based on the emerging market trends and the users' behavioral patterns and intents. In case of a positive impact, the Artificial Intelligence module 236 provides the details of the root cause to the Learning module 252 to evolve. The Artificial Intelligence module 236 also determines the possible impacts that the business may have or might go through and proactively informs the business's or organization's key executives to take charge of the situations. In the case of a negative impact, the Artificial Intelligence module 236 identifies the problem and automatically investigates it to find the root cause based on business situations, by understanding the patterns and trends present in the data using the intelligence received from the Cognition Module 238. The positive and negative impacts are determined not only by the performance of the organization, but also by intangible values such as customer satisfaction, employee wellness, ethics, standards, quality, stability, etc. The Evolution Module 250 continuously learns from the actions of the users and the system in different ways to provide a solution to a problem by observing different user behaviors, user choices, user patterns, etc. and storing them in the Omniscience layer 116. With the learned knowledge, it automatically enhances its automatic designing abilities and updates its design pattern libraries, updating its skills and cognition.
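The anomaly detection over rolling time periods described in paragraph [000101] can be sketched with a simple rolling-window z-score. The window size, threshold and metric values are assumptions chosen purely for illustration; the Monitoring Module's own techniques are proprietary.

```python
# Rough sketch of rolling-window anomaly detection: a point is flagged
# when it deviates from the recent window's mean by more than a few
# standard deviations. Window, threshold and data are assumptions.
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    anomalies = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

metrics = [10, 11, 9, 10, 10, 11, 10, 50, 10, 9]
print(detect_anomalies(metrics))  # [7] -- the spike to 50 is flagged
```

An index flagged here would correspond to the point where the module passes a command onward for root-cause analysis; whether an anomaly's impact is positive or negative is a separate judgment the sketch does not attempt.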
[000102] The Evolution Module 250 has self-correction techniques to reform and enrich its knowledge by assimilating and ascertaining information from various sources, primarily internal sources; it keeps track of the intelligence derived and applied to each situation or requirement and updates its abilities accordingly. In order to provide the right judgment and to take the right decisions, the Cognition Module 238, along with the Evolution Module 250, considers various factors such as the combination of all experiences, lessons learnt, understanding of patterns, etc. The Evolution Module 250 continuously evolves from the global level to the local level of patterns corresponding to best practices and standards.
[000103] The system 106 applies cognition on top of the intelligence generated by the Artificial Intelligence module 236 for implicit and explicit commands received from the Orchestration module 232. The system has the intelligence of all intelligences collectively, selectively, individually and creatively. The Cognition Module 238 of the Intelligence Module 234 has the knowledge to perform actions such as, but not limited to, making investigations and determinations and developing perceptions, emotions, etc. The cognition model learns and acquires knowledge from emotions, thoughts, sentiments and experience. This model mimics the human thought process by including various technologies such as deep learning, sentiment analysis and natural language processing. The cognitive model then helps the users by performing reasoning and decision making based on the acquired knowledge, applying inferences, which are conclusions drawn from the data based on evidence and reasoning. The Cognition Module 238 has a Rationalization Framework, which consists of a multitude of factors, a manifesto and rules defined to apply logic as per the cognition by learning the attitude and fortitude of the user and the magnitude and amplitude of the situation. The rationalization is based on relative, absolute or mixed modes of judgmental and reasoning abilities and provides recommendations based on acumen or wisdom as to what is appropriate to perform depending on the situations and circumstances. The Cognition Module 238 also determines what repercussions or impacts such decisions might have on the dynamics of the business with respect to the present and the future. The Cognition module uses psychology to understand and analyze the emotions and the instinct of the user by applying sentiment analysis in natural language processing.
The Cognition module uses epistemology to understand the nature and the behavior of an object and applies metaphysics to interpret the relationships between objects. In order to perform the above-mentioned actions and thought processes, the Cognition Module 238 may request the Orchestration module 232 to extract the appropriate knowledge from the Evolution Module 250 and libraries from the Library module 248 of the Omniscience layer 116.
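The idea of drawing inferences from facts and rules, as described for the Rationalization Framework, can be sketched with a simple forward-chaining loop. This is an illustrative toy, not the disclosed framework: the rule format (a set of premises and a single conclusion) and the business-flavoured fact names are assumptions made for the example.

```python
def infer(facts, rules):
    """Forward chaining: repeatedly apply rules of the form
    (premises, conclusion) until no new fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical rules linking observations to a recommended action:
rules = [
    ({"revenue_down", "costs_up"}, "margin_pressure"),
    ({"margin_pressure"}, "review_pricing"),
]
derived = infer({"revenue_down", "costs_up"}, rules)
print(sorted(derived))
```

Running this derives `margin_pressure` and then `review_pricing` from the two starting facts, illustrating how conclusions chain from evidence.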
[000104] The Library module 248 may include various types of library data (as depicted in FIG. 3). Specifically, as depicted, the library data may include, but is not limited to, a language library 302, a Syntactic Library 304, a Semantic library 306, a Business Case Library 308, an Organization Structure Library 310, a Value Chain Library 312, a Service Library 314, a Rules Library 316, an Orchestration Library 318, a Document Library 320, and a Simulation Library 322. The Library module 248 uses reinforcement learning to understand the business strategies of the organization by analyzing the user data and models the KPIs according to the domain data, which reduces time complexity and increases the performance of the system. It clusters the KPIs based on their domains, which helps the system identify the domain and process the function when a KPI query is given by the user. The Library module receives data in various formats from various sources, from which insights are inferred, objects are identified as nodes, and the relationships between the objects are identified and established in knowledge graphs. The objects or entities are navigated from one to another using the relations between them. The nodes are clustered based on domain specializations, which improves the performance of the knowledge graph by reducing the access time of the nodes.
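The knowledge-graph structure just described, with nodes grouped into domain clusters so that navigation can be restricted to one cluster, can be sketched minimally as below. The class, its API and the sample node names are hypothetical, introduced purely to illustrate domain-restricted neighbour lookup.

```python
class KnowledgeGraph:
    """Tiny sketch of a knowledge graph whose nodes are clustered by
    domain, so that lookups can be limited to a single cluster."""

    def __init__(self):
        self.edges = {}    # node -> {related node: relation name}
        self.domains = {}  # domain -> set of nodes in that cluster

    def add_node(self, node, domain):
        self.edges.setdefault(node, {})
        self.domains.setdefault(domain, set()).add(node)

    def relate(self, a, b, relation):
        # Relationships are stored in both directions for navigation.
        self.edges[a][b] = relation
        self.edges[b][a] = relation

    def neighbours(self, node, domain=None):
        """Navigate from an entity to related entities, optionally
        restricted to one domain cluster."""
        related = self.edges.get(node, {})
        if domain is None:
            return related
        cluster = self.domains.get(domain, set())
        return {n: r for n, r in related.items() if n in cluster}

g = KnowledgeGraph()
g.add_node("Revenue", "finance")
g.add_node("Churn", "sales")
g.add_node("Headcount", "hr")
g.relate("Revenue", "Churn", "influenced_by")
g.relate("Revenue", "Headcount", "scaled_by")
print(g.neighbours("Revenue", domain="sales"))  # → {'Churn': 'influenced_by'}
```

Restricting the lookup to one cluster is what reduces the set of candidate nodes to scan, which is the performance benefit the paragraph attributes to domain clustering.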
[000105] Throughout the process of generating intelligence and applying cognition, the Intelligence Module 234 needs to extract data from a past time or may require forecasts for a future time. The Time Machine Module 246 of the Omniscience layer 116 may provide the framework to roll back to a past time or look into the future as and when required by the Intelligence Module 234. In a particular example, a user request may require recommendations using forecasted data. Once the Intelligence Module 234 makes a request to the Orchestration module 232 for forecasted data, the Time Machine Module 246 is called into action and may create forecasts for the future in coordination with the Data Layer 118, Evolution Module 250 and Library module 248.
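The forward-looking half of the Time Machine behaviour — extending a series beyond the present — can be illustrated with a deliberately simple least-squares trend extrapolation. The linear model is an assumption made for the sketch; the module as disclosed coordinates with several other layers and is not limited to any particular forecasting method.

```python
def forecast(history, steps):
    """Extend a numeric series by `steps` future points using a
    least-squares linear trend fitted to the history."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    # Future points continue the fitted line at positions n, n+1, ...
    return [intercept + slope * (n + k) for k in range(steps)]

print(forecast([10, 12, 14, 16], 2))  # → [18.0, 20.0]
```

Rolling back to a past time is, by contrast, a matter of querying historical state, which the Slowly Changing Dimension example later in this description illustrates.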
[000106] The Intelligence Module 234 balances the intelligence from various sources and passes on the generated intelligence in harmony to the Recommendation module 240 through the Orchestration module 232. The Recommendation module 240 may extract the most appropriate recommendations particular to the user, context, etc. by using the inputs from the Cognition Module 238. In the case of group discussions, the Recommendation module 240 understands the role and significance of the users, context and flow of discussion and extracts the most appropriate recommendations throughout the course of the discussion.
[000107] The recommendations from the Recommendation module 240 of the Omni Sense Layer 114 flow to the Rendering Module 222 of the I/O Processing Layer 110. The Rendering Module 222 transforms the recommendations into visual elements such as graphs, trees, virtual presence, tables, etc. using the libraries from the Library module 248. The Rendering Module 222 receives inputs from the Intelligence Module 234 of the Omni Sense Layer 114 for the selection of the visual representation and its elements such as color, labels, axes, etc. based on user choices, preferences, etc.
[000108] The processed output is then sent to the Interface Module 206 of the Omni Presence Layer 108, which provides the output in the form of audio, video, Virtual Presence, text, table, sketch etc. The processed output could also be sent through the system's Interface Module 206 of the Presentation Module 202 to another application integrated with the system. In the case of a group discussion, the processed output is then provided to the users through Chats, e-mails, VR, graphs, etc. through the Collaboration module 204.
[000109] In an embodiment, data corresponding to each layer of the computing system, such as the system 106, may be stored in a physical storage layer 122 (as depicted in FIG. 1). Further, the physical storage layer 122 is depicted as an exemplary database 122 in FIG. 4 for storing various types of data such as data inputted by a user, data analyzed, data gathered from various sources, data predicted, intermediate data (as may be generated during processing), data evolved over time and data corresponding to various facts and rules. For example, as depicted, the data may include, but is not limited to, master data, metadata, log data, library data, training data, transaction data, system data, file management data, analytical data, statistical data and user data. The data stored in the database 122 may be utilized (directly or indirectly) by one or more functional layers of the computer-implemented system 106 to process an input query (that may be inputted to or derived by the computer-implemented system 106) and to generate a suitable output intelligently for assisting the user in taking the right decision with respect to time, situation and other aspects corresponding to the user's life, business, research and goals.
[000110] FIG. 5 illustrates a flow diagram of a method 500 for empowering an entity to enhance performance thereof, in accordance with an embodiment of the present disclosure. The method may be facilitated by a system, such as the system 106 (utilizing hardware components), to understand the query of the entity and provide intelligent recommendations for empowering the entity to take an appropriate decision for managing the performance, corresponding to the query, of the entity. The method may be understood more clearly when read in conjunction with FIGS. 1-4. The order in which the method is performed is not intended to be construed as a limitation, and any number of the method steps may be combined in order to implement the method or an alternative method without departing from the scope of the invention.
[000111] At step 502, the method may receive an input query from at least one entity. The at least one entity may include, but is not limited to, an individual user, a group of individuals, an organization, a business, an application or a group of different types of entities. The input query may be received by an omni-presence layer of the system that enables interaction between the at least one entity and the system with a type of input. Herein, the input may interchangeably be referred to as 'input query'. The input query may be of one or more types such as a signal, data in natural language, bulk data, gestures, a mathematical query, a data structure, a symbol, business automation language, audio and visual.
[000112] Further, at step 504, the method may perform one or more functionalities to parse, validate, enrich, transform and translate the input query from one format into another format based on a type of the input query. The one or more functionalities may be performed by an I/O processing layer (which may interchangeably be referred to as 'processing layer') having one or more modules to process the input and an output corresponding to the input query. The one or more functionalities may include, but are not limited to, reception of the input query, signal processing, parsing, validation, translation, transformation, distribution and output processing. For example, if the input query is in the form of a signal, then the input query (hereinafter interchangeably referred to as 'input' or 'query') may be processed (through a transmission module of the system) to transform the input to an acceptable form (such as a system-understandable form). The input received may be validated (such as by an enrichment module of the system 106) by determining the correct queries using the metadata and library data (as depicted in FIG. 4). Further, the functionalities may include cleansing and synergizing the input to maintain the accuracy and integrity of the data corresponding to the input based on a predefined set of rules and intelligence derived from pattern learning and cognition (as may be provided by a cognition module of the system 106). Further, in an embodiment, the method may facilitate the user to define rules and conditions corresponding to the input that may be utilized to understand the context and intent of the user with respect to the input query. Further, the method may iterate the process of receiving and validating the input until sufficient input is received from the user. 
For example, if a user provides an incomplete sketch, the method may determine the intent of the user behind the incomplete sketch and also complete the sketch by utilizing cognition obtained through past, present and predictive future data.
[000113] Once the query is validated, in one embodiment, the query may be translated into a Unified Language by utilizing semantics, a language library, etc., and further used to construct a knowledge base by understanding questions, clues, answers and responses corresponding to the query. In another embodiment, the validated query may be processed using knowledge graphs by transforming the data structure, data type or data representation, corresponding to the query, from one form to another. The method may propose a schema based on external and internal relationships among the data. Further, the method may facilitate the user to change the data structures and connect its model with the underlying data structure.
[000114] Once the query is processed through translation and transformation, the method may generate mathematical equations for the query. Further, the method may determine the best method to process the query based on optimal performance. Furthermore, the data may be distributed to another layer of processing (such as a deciding layer of the controlling layer) using concurrency control techniques to handle multiple requests from one or more entities (users) in parallel, to support multiple views, concurrency, failover recovery, scalability, and replications.
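Handling several entity requests in parallel while keeping results in submission order can be sketched with a thread pool. This is a minimal illustration only; the worker function, pool size and query strings are placeholders, and the disclosed system's concurrency control (failover recovery, replication, etc.) goes well beyond this sketch.

```python
from concurrent.futures import ThreadPoolExecutor

def process_query(query):
    """Placeholder for one unit of downstream query processing."""
    return query.upper()

def distribute(queries, workers=4):
    """Process multiple requests in parallel; ThreadPoolExecutor.map
    preserves the submission order of the results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_query, queries))

print(distribute(["show revenue", "forecast churn"]))
```

A real deployment would replace the placeholder worker with calls into the deciding layer and add retry and failover logic around each task.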
[000115] In an embodiment, the method may (by utilizing high-performance stateless hardware) implement parallelism to enhance efficiency. For example, the system may be integrated with external quantum computers/hardware to perform a large number of computations simultaneously. Accordingly, the method may be implemented by utilizing such integration with a quantum computer for high-performance intelligence capabilities. For example, quantum parallelism may be utilized for mathematical equations, simulations and other complex computations in collaboration with the artificial intelligence and cognition implemented by the method (associated with the system 106) for high-performance capabilities.
[000116] At step 506, the method may analyze and redirect the input query, by the deciding layer, to at least one of the other sub-layers of a controlling layer for further processing (through a secured communication module) of the query. Based on the analysis of the input query, the method may decide a course of action based on implicit and explicit commands for further processing of the input query. Further, the method may redirect the query (through a communication module to other sub layers of the controlling layer) for further processing based on the decided course of action. For this, the method may implement multithreading and parallelism techniques. Also, the method may implement one or more security measures to ensure authenticated access to data and applications by an external entity or internal modules of the system.
[000117] The method uses cognition and performs all the processing intelligently and autonomously. In an embodiment, if the course of action decided by the method corresponds to handling data related to the query then the data may be managed optimally (such as by a data layer of the system) with ranking and log management to provide real time updates and feeds to the user.
[000118] In an alternate embodiment, if the course of action, for the query, corresponds to analytics then the method may prioritize and channelize the request corresponding to the input query and further ensure completion of the request corresponding to the query. The request may be channelized internally among various modules of the system.
[000119] Further, at step 508, the method may determine the need and level of intelligence required to process the query. Accordingly, the method may process the input query by generating intelligence, through an intelligence module. Herein, in an embodiment, the intelligence may be generated for creating applications using dynamic programming framework based on data provided by a data layer, transient layer and an omniscience layer (such as the omniscience layer 116 of the system 106). Further, the method may create a set of rules based on one or more factors to generate intelligence for processing the query. The method may dynamically create program/codes on the fly to determine and provide correct judgment.
[000120] Specifically, the method may utilize the intelligence module for tracking Slowly Changing Dimensions (SCD) of data, corresponding to the at least one entity, in a data layer to identify changes from one state to another in terms of one or more aspects related to the at least one entity. Further, the method may identify one or more required factors such as identifying the problems, identifying the Key Performance Indicators (KPIs), feature generation, clustering, selection of algorithm, predictive analysis, correlation, semantic analysis, scoring, ranking, etc. by utilizing appropriate libraries such as the Rules library, Semantics library, Machine Learning libraries, etc. (as shown in the exemplary Library Module of FIG. 3). Further, in an embodiment, the method may make predictions based on the Key Performance Indicators (KPIs), the Slowly Changing Dimensions (SCD), position and time coordinates, and data from an omniscience layer and a data layer. Further, the method may create programming code on the fly using dynamic programming abilities by utilizing a top-down approach, a bottom-up approach, system metadata, user metadata, data from SMEs (in case the query relates to a particular business), predictions, and recommendations (such as from a cognition module).
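Tracking a Slowly Changing Dimension so that state changes remain queryable over time is commonly done with a Type-2 SCD, which can be sketched as below. The Type-2 strategy, the row layout and the employee example are assumptions for illustration; the disclosure does not restrict SCD handling to this form.

```python
from datetime import date

def apply_scd2(history, key, new_values, today):
    """Type-2 SCD update: expire the current row for `key` and append a
    new row, so the full change history is preserved for time travel."""
    for row in history:
        if row["key"] == key and row["valid_to"] is None:
            if row["values"] == new_values:
                return history        # nothing changed, keep history as-is
            row["valid_to"] = today   # close out the old state
    history.append({"key": key, "values": new_values,
                    "valid_from": today, "valid_to": None})
    return history

# Hypothetical dimension row: an employee changes role.
dim = [{"key": "emp-1", "values": {"role": "analyst"},
        "valid_from": date(2018, 1, 1), "valid_to": None}]
apply_scd2(dim, "emp-1", {"role": "manager"}, date(2019, 2, 7))
print([r["values"]["role"] for r in dim])  # → ['analyst', 'manager']
```

Because expired rows are kept rather than overwritten, a query "as of" any past date can recover the state the entity was in at that time.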
[000121] Further, the method may perform (through an artificial intelligence module) one or more functionalities for processing the input query. The functionalities may include, but are not limited to: performing Search Driven Analytics (SDA) by enabling the entity to perform analysis on its own data and further developing intelligence based on the analysis performed by the entity; utilizing Auto-ML for automating functionalities from data preprocessing to model selection; performing an analysis of one or more parameters associated with the input query for generating one or more predictions based thereon by utilizing hyperparameter optimization (HyperOpt); and selecting a model with one or more parameters suitable for processing the input query. Herein, the selected model may be provided to the entity for enabling the entity to interact therewith.
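The hyperparameter-tuning step mentioned above can be illustrated with an exhaustive grid search over a small search space. This stands in for whatever optimizer the system actually uses; the objective function, parameter names and value grids are all hypothetical.

```python
from itertools import product

def grid_search(objective, space):
    """Evaluate every hyperparameter combination in `space` and return
    the best-scoring setting together with its score."""
    names = list(space)
    best_params, best_score = None, float("-inf")
    for combo in product(*(space[n] for n in names)):
        params = dict(zip(names, combo))
        current = objective(params)
        if current > best_score:
            best_params, best_score = params, current
    return best_params, best_score

# Hypothetical objective: highest when depth is 4 and rate is 0.1.
def score(params):
    return -abs(params["depth"] - 4) - abs(params["rate"] - 0.1) * 10

space = {"depth": [2, 3, 4, 5, 6], "rate": [0.01, 0.1, 0.5]}
best, _ = grid_search(score, space)
print(best)  # → {'depth': 4, 'rate': 0.1}
```

Libraries such as HyperOpt replace the exhaustive loop with guided search over much larger spaces, but the contract — an objective, a space, a best setting — is the same.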
[000122] Further, the method may utilize a set of facts, a set of rules, and a manifesto (from the cognition module) to apply a logic to process the input query based on data and learning from the data layer and omniscience layer. Herein, the method may perform a sentiment analysis for determining emotions and instinct associated with the user (entity). Also, the method generates one or more recommendations based on the applied logic and one or more situations and conditions. Furthermore, the method may determine possible current and futuristic impacts related to each of the generated one or more recommendations. The method may implement ontology for recommendations and heuristics for analyzing sentiments through a neural network capable of providing intelligence and triggers, and implements the functions and processes of the entity automatically as a response to stimuli. In an embodiment, the method may receive an external input from quantum computers that may be utilized by the intelligence module for processing complex computations with high speed and accuracy.
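Sentiment determination of the kind referred to above can be sketched, at its simplest, as a lexicon lookup. The cue-word lists and the three-way labelling are assumptions for the example; production sentiment analysis would use trained models rather than hand-picked word sets.

```python
# Hypothetical cue-word lexicons for a business-text setting.
POSITIVE = {"growth", "profit", "success", "confident", "improve"}
NEGATIVE = {"loss", "decline", "risk", "worried", "fail"}

def sentiment(text):
    """Count positive and negative cue words and label the overall
    emotion of the input as positive, negative or neutral."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("revenue growth looks strong and we are confident"))  # → positive
print(sentiment("worried about the decline in sales"))                # → negative
```

The resulting label is the sort of signal a cognition layer could combine with other factors when weighing recommendations.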
[000123] Further, the method may monitor (by implementing a monitoring module) the performance of each functional module to detect an anomaly in the performance. In case an anomaly is detected, the method may provide proactive support by automatically rolling time periods based on the detected anomaly. Further, the method may determine a root cause of the anomaly. The method further facilitates the entity to manage one or more factors so as to detect one or more anomalies based on market trends and the behavioral pattern corresponding to the entity. Further, based on various factors such as understanding of patterns, the user's actions/reactions, and the detection and management of anomalies, the method evolves by adopting self-correcting techniques to reform and enrich knowledge and update its abilities for best practices and standards.
[000124] At step 510, the method may provide one or more intelligent recommendations, through a recommendation module, based on the input query. The method may generate intelligent recommendations based on various factors including (but not limited to) intelligence gathered through analysis of various aspects related to the entity, market trend, frequency of the query, the user's state of mind, evolutionary changes, and so on. The recommendations may be scored and weighted to determine the most appropriate recommendations based on one or more conditions associated with the context of the query or the entity's situations and circumstances.
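The scoring and weighting of candidate recommendations can be sketched as a weighted sum over per-factor scores. The factor names, weights and candidate recommendations here are hypothetical, chosen only to show how weighting changes the ranking.

```python
def rank_recommendations(candidates, weights):
    """Score each candidate as a weighted sum of its factor scores and
    return the candidates ordered best-first."""
    def total(c):
        return sum(weights[f] * c["scores"].get(f, 0.0) for f in weights)
    return sorted(candidates, key=total, reverse=True)

weights = {"relevance": 0.5, "market_trend": 0.3, "user_context": 0.2}
candidates = [
    {"name": "expand product line",
     "scores": {"relevance": 0.9, "market_trend": 0.4, "user_context": 0.5}},
    {"name": "cut marketing spend",
     "scores": {"relevance": 0.6, "market_trend": 0.9, "user_context": 0.8}},
]
ranked = rank_recommendations(candidates, weights)
print(ranked[0]["name"])  # → cut marketing spend
```

Note that the lower-relevance candidate wins here (0.73 vs 0.67) because the trend and context weights favour it, which is exactly the effect of conditioning the ranking on situation and circumstances.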
[000125] Further, at step 512, the intelligent recommendations may be transformed (for example, by a rendering module) into one or more visual elements such as graphs, trees, virtual presence, tables, etc. using one or more libraries. For example, the libraries provided by a library module 248 (as depicted in FIG. 3) may be utilized by the method for processing the query. As depicted in FIG. 3, the library module 248 may include various types of library data including, but not limited to, a language library 302, a Syntactic Library 304, a Semantic library 306, a Business Case Library 308, an Organization Structure Library 310, a Value Chain Library 312, a Service Library 314, a Rules Library 316, an Orchestration Library 318, a Document Library 320, and a Simulation Library 322.
[000126] Further, at step 514, the method may provide a processed output corresponding to the one or more visual elements to the entity to empower the at least one entity to take an intelligent decision, based on the processed output, corresponding to the input query. The processed output may be provided to the entity by an interface module. Further, it may be appreciated by a person skilled in the art that the output may be provided in the form of audio, video, virtual presence, text, tables, sketches, etc. In an embodiment, the processed output may be provided to one or more other applications integrated with the system. The processed output may be provided in various ways. For example, in the case of a group discussion, the processed output may be provided through chats, emails, VR, graphs, etc.
[000127] It may be appreciated by a person skilled in the art that the method is not restricted to the aforementioned description. Keywords such as 'Firstly', 'Secondly', 'Thirdly', 'Fourthly', 'First Set', 'Second Set', 'Third Set', 'Fourth Set' and so on should not be construed as limiting, as such keywords are provided for better understanding of the disclosure. Also, the method steps and flow are not restrictive, as the method and system implementation may take any necessary and suitable step (either related to functional aspects or to the hardware components related to any module) on the basis of intelligence generated corresponding to the input query, as described in the disclosure. Further, many more examples and embodiments may be implemented without departing from the scope of the invention.
[000128] Advantageously, the present invention discloses a method and a system for empowering an entity to enhance its performance, or any performance related thereto, by utilizing artificial intelligence and cognition through system experience or through the experience of the user (entity). The system (and corresponding method) is a single source of intelligence generation to perform business process(es), research and other complex aspects of life intelligently and autonomously. The system has the ability to understand the intricacies to produce analytics and also to engineer and re-engineer the underlying process for generating an intelligent output for empowering an entity that provides an input query. Further, the system provides scoring and ranking to apply weightages and factors to determine the most effective and accurate projections. Further, the system utilizes a deep learning process to evolve from one state to another. Furthermore, the system is capable of being used as a plug-in for other applications and is also able to collaborate with one or more external applications (such as quantum computers, other digital computers, mobile applications, etc.) to generate and provide intelligent solutions based on several contributory factors. Additionally, the system utilizes a time machine component to perform analysis of processes over time and to forecast accordingly. Moreover, the system facilitates the user to define their own rules, alter the selected models and set their own preferences for allowing processing in accordance with the user's choices. Also, the system is capable of combining data from different sources to automatically transform and utilize it for analysis.
[000129] Further, it may be appreciated by a person skilled in the art that the method and the system are not restricted to the aforementioned advantages and applications. Further, many more advantages and useful applications may be achieved by implementing the system.

Claims

1. A computing system omniscient by design to provide holistic intelligence for empowering an entity to enhance life and business performance by processing data of various formats, the computing system comprising: an omni-presence layer comprising: a first set of modules for interfacing and collaborating with at least one entity; and an interface module for providing a processed output corresponding to the one or more elements to the at least one entity, thereby empowering the entity with decisions and recommendations based on intelligence for enhancing performance thereof; a processing layer comprising: a second set of modules for receiving and validating at least one of one or more types of input queries from the at least one entity; an enrichment module for data shaping, augmentation and modeling; and a rendering module for the dynamic compilation of visual or signal-based output; a controlling layer comprising a plurality of sub-layers having a third set of modules, the sub-layers comprising: a deciding layer for: deciding a course of action by analyzing the at least one input query; and redirecting the at least one input query, through a communication module, to one of the other sub-layers of the plurality of sub-layers based on the decided course of action; and an omnisense layer comprising: an orchestration module for coordinating, directing and redirecting the at least one input query to the other sub-modules of the omnisense layer; an intelligence module for generating intelligence to process the at least one input query when the at least one input query is redirected by the orchestration module; and a recommendation module configured to provide one or more intelligent recommendations based on processing the at least one input query; a data layer comprising: a set of modules for storing and managing the data to process one of one or more types of input queries from the at least one entity; and an auto-organization module configured for the classification of streams of data and the organization of the data based on data structures; a transient layer comprising a set of modules for storing and managing the data frequently accessed and processed by the at least one entity; and an omniscience layer comprising a fourth set of modules for providing knowledge based on one of one or more types of input queries from the at least one entity.
2. The computing system of claim 1, wherein the omni-presence layer, the processing layer, the omnisense layer of the controlling layer and the omniscience layer are communicatively coupled with one another for carrying out intelligent interactions corresponding to the at least one input query.
3. The computing system of claim 1, wherein the omni-presence layer further comprises:
a presentation module for making intelligent interactions with the at least one entity in a multi-dimensional way for facilitating the at least one entity with an Advanced User Interface and an intelligent set of functions for omnificent visual representation based on preference and situation associated with the at least one entity; a collaboration module for enabling interaction among multiple entities and the computer-implemented system in an instance of group discussions; an interface module for enabling interactions with one or more external applications; and an integration module for integrating data received from multiple heterogeneous sources, to the computing system, by utilizing Data Migration and Integration techniques thereof.
4. The computing system of claim 1, wherein the one or more types of input queries, which comprise signals, data in natural language, bulk data, gestures, mathematical queries, data structures, symbols, business automation language, audio and visual, are received, comprehended, standardized and contextualized.
5. The computing system of claim 1, wherein the second set of modules is further configured for performing one or more functionalities to convert the at least one input query from one format into an alternate format based on a type of the at least one input query, the one or more functionalities corresponding to validating, altering, enriching, transforming and translating the at least one input query.
6. The computing system of claim 1, wherein the second set of modules further comprises:
a transmission module configured to process the at least one input for transformation thereof into an acceptable format, when the input query is a signal; a transformation module for altering and modelling data values, attributes and structures based on an input received from an enrichment module; an enrichment module configured for:
validating the at least one input by utilizing data from a data layer; and
facilitating the at least one entity to define rules, the rules being utilizable by an omniscience layer to learn and evolve based thereon; and a distribution module configured for the distribution of queries to one or more layers of the computing system for parallelism, to support multiple views, concurrency, failover recovery, scalability, and replications.
7. The computing system of claim 1, wherein the deciding layer comprises:
a security module configured for:
securing the Omni Presence layer, processing layer, transient layer, data layer and Omniscience layer with protective measures comprising but not limited to authentication, security enforcements and cryptographic methods; and
getting authentication to access data and applications corresponding to the computing system for ensuring security thereof; and a communication module configured for:
coordinating with an authentication module of the security module and a Micro-Service integration module; and redirecting the at least one input query to the other sub-layers of the controlling layer for further processing.
8. The computing system of claim 1, wherein the omnisense layer further comprises an orchestration module configured for prioritizing, channelizing and ensuring completion of a request corresponding to the at least one input query, the request being received from at least one of the deciding layer and one or more modules of the omnisense layer.
9. The computing system of claim 1, wherein the omnisense layer further comprises:
a monitoring module configured to perform at least one of:
monitoring performance of the computing system to detect an anomaly in entities, structures and flows impacting the performance; providing proactive support in collaboration with a time machine module of an omniscience layer by automatically rolling time periods based on the detected anomaly; providing a command to the intelligence module to determine the root cause of the anomaly; and facilitating the at least one entity to manage one or more factors to detect one or more anomalies based on trends and behavioral patterns corresponding to the at least one entity; an automation module further configured for gathering input from the at least one or more entities, in collaboration with a library module of an omniscience layer, to automate processes autonomously; and an intelligence module configured for providing cognitive intelligence for processing one of one or more types of input queries from the at least one entity.
10. The computing system of claim 1, wherein the intelligence module comprises: an artificial intelligence module configured for: tracking Slowly Changing Dimensions (SCD) of data, corresponding to the at least one entity, in the data layer to identify changes from one state to another in terms of one or more aspects related to the at least one entity; making predictions based on Key Performance Indicators (KPI), the Slowly Changing Dimensions (SCD), position and time coordinates, and data from the omniscience layer and the data layer; and creating programming code on the fly, in collaboration with the automation module, using dynamic programming abilities that utilize a top-down approach, a bottom-up approach, metadata, predictions, and recommendations from a cognition module; and a cognition module comprising a set of facts, a set of deterministic and non-deterministic rules, and a manifesto to apply logic to process the at least one input query based on data and learning from the data layer and the omniscience layer.
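For illustration only, tracking Slowly Changing Dimensions as recited above could follow the conventional Type 2 pattern, in which each state change closes the current dimension row and appends a new one so the full history of the entity is preserved; the row schema and function below are assumptions, not taken from the disclosure:

```python
from datetime import date

def apply_scd2(dimension, key, new_attrs, today=None):
    """Type 2 Slowly Changing Dimension update: expire the current
    row and append a new row when tracked attributes change."""
    today = today or date.today().isoformat()
    current = next((r for r in dimension
                    if r["key"] == key and r["end"] is None), None)
    if current and all(current[k] == v for k, v in new_attrs.items()):
        return dimension  # no change detected, keep history as-is
    if current:
        current["end"] = today  # close out the previous state
    dimension.append({"key": key, **new_attrs,
                      "start": today, "end": None})
    return dimension

dim = []
apply_scd2(dim, "cust-1", {"segment": "retail"}, today="2019-01-01")
apply_scd2(dim, "cust-1", {"segment": "premium"}, today="2019-02-07")
# dim now holds both states, with only the latest row still open
```

Replaying the closed rows for a given key yields the entity's state at any past date, which is the kind of history a time machine module could roll back through.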
11. The computing system of claim 10, wherein the one or more aspects comprise at least one of interactions, activities, transactions, emotions, sentiments, decisions, dynamics, movements, changes, reactions, context and connotations related to the at least one entity, and are directed to the orchestration module for processing.
12. The computing system of claim 10, wherein the cognition module is further configured for: performing one or more behavioral, psychological and physiological analyses for determining emotions and instincts associated with the at least one entity; generating one or more recommendations based on the logic, one or more situations and conditions; determining a need for, and a level of, intelligence required for processing the at least one input query; and determining possible current and futuristic impacts related to each of the generated one or more recommendations.
13. The computing system of claim 1, wherein an artificial intelligence module of the intelligence module is configured to perform at least one of: Search Driven Analytics (SDA) by: enabling the at least one entity to perform analysis on its own data; and developing intelligence based on the analysis performed by the at least one entity; utilizing Automated Machine Learning and Deep Learning for automating functionalities from data preprocessing to model selection; an analysis of one or more parameters associated with the at least one input query for generating one or more predictions based thereon by utilizing Hyperparameter Optimization; selection of a model with one or more parameters suitable for processing the input query, the selected model being provided to the at least one entity for enabling the at least one entity to interact therewith; and generating one or more recommendations of visual elements suitable for visualizing the data, the visual elements provided to the at least one entity for the representation of data distribution.
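As a hedged sketch of the Hyperparameter Optimization recited in claim 13, an exhaustive grid search over a small parameter grid could look like the following; the toy train/score functions stand in for a real train-and-validate cycle and, like the parameter names, are purely illustrative:

```python
from itertools import product

def grid_search(train_fn, score_fn, grid):
    """Exhaustive hyperparameter optimization: evaluate every
    combination in `grid` and return the best-scoring one."""
    best_params, best_score = None, float("-inf")
    keys = sorted(grid)
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        model = train_fn(**params)
        score = score_fn(model)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective: the "model" is its parameters, scored by
# closeness to a known optimum (depth=3, rate=0.1).
train = lambda **p: p
score = lambda m: -(m["depth"] - 3) ** 2 - (m["rate"] - 0.1) ** 2
params, s = grid_search(train, score,
                        {"depth": [1, 3, 5], "rate": [0.01, 0.1, 1.0]})
print(params)  # → {'depth': 3, 'rate': 0.1}
```

In an AutoML setting the grid entries would be candidate models and their hyperparameters, and the winning combination would be the model offered to the entity for interaction.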
14. The computing system of claim 1, wherein the fourth set of modules comprising a time machine module, a library module, an evolution module, a learning module and a training module, configured for growing and providing knowledge to generate the intelligence and for self-evolution of the computing system.
15. The computing system of claim 14, wherein the intelligence is generated by creating a set of deterministic and non-deterministic rules based on one or more factors and data obtained from the omniscience layer by utilizing data constructs comprising topic models, knowledge graphs and ontologies.
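To illustrate how deterministic rules might be derived from knowledge-graph data constructs, the toy forward-chaining routine below infers transitive `is_a` facts until a fixpoint is reached; the predicate name and the sample triples are assumptions for demonstration only:

```python
def derive_rules(triples):
    """Forward-chain a single transitive rule over (subject,
    predicate, object) facts: if A is_a B and B is_a C, then
    A is_a C (a toy ontology closure)."""
    facts = set(triples)
    changed = True
    while changed:  # repeat until no new fact can be inferred
        changed = False
        for s, p, o in list(facts):
            if p != "is_a":
                continue
            for s2, p2, o2 in list(facts):
                if p2 == "is_a" and s2 == o and (s, "is_a", o2) not in facts:
                    facts.add((s, "is_a", o2))
                    changed = True
    return facts

kg = {("invoice", "is_a", "document"), ("document", "is_a", "record")}
closure = derive_rules(kg)
# closure now also contains ("invoice", "is_a", "record")
```

Non-deterministic rules would instead attach weights or probabilities to inferred facts; the deterministic closure above is the simpler of the two cases the claim names.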
16. The computing system of claim 1, wherein the sub-layers further comprise a transient layer having one or more modules, the one or more modules comprising a caching module, an in-memory module and a session module, the one or more modules implemented for: facilitating an uninterrupted user-session and back-up management; and storing the most frequent input queries using ranking and graph data structures.
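A minimal sketch of storing the most frequent input queries using ranking, as claim 16 recites, is an LFU-style cache that ranks cached queries by hit count and evicts the least frequent; the class name, capacity and eviction policy below are illustrative assumptions:

```python
import heapq
from collections import Counter

class QueryCache:
    """Cache that keeps only the `capacity` most frequent queries,
    ranked by hit count (a simple LFU-style policy)."""
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.hits = Counter()
        self.store = {}

    def get(self, query, compute):
        self.hits[query] += 1
        value = self.store.get(query)
        if value is None:
            value = compute(query)  # cache miss: compute and store
            self.store[query] = value
            self._evict()
        return value

    def _evict(self):
        if len(self.store) <= self.capacity:
            return
        # rank cached queries by frequency; drop the least frequent
        keep = heapq.nlargest(self.capacity, self.store,
                              key=self.hits.__getitem__)
        self.store = {q: self.store[q] for q in keep}

cache = QueryCache(capacity=2)
for q in ["a", "a", "b", "a", "c"]:
    cache.get(q, compute=str.upper)
# only the two most frequent queries remain cached
```

A production transient layer would pair such a policy with the session and in-memory modules so that frequent queries survive across user sessions.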
17. The computing system of claim 1, wherein the sub-layers further comprise a data layer having one or more modules, the one or more modules comprising a compression and decompression module, a master data management module, a metadata management module and an auto-organization module, the one or more modules implemented for: managing storage and retrieval of data, and processing requirements, by providing real-time updates to at least one of the processing layer, the other sub-layers in the controlling layer and the omniscience layer.
18. The computing system of claim 1, wherein the second set of modules comprising a rendering module configured for:
receiving the at least one of one or more types of output from each module and layer; transforming the output into one or more elements comprising visual elements and signals; and delivering the output to the at least one entity in collaboration with the interface and presentation module.
PCT/US2019/017126 2019-02-07 2019-02-07 Holistic intelligence and autonomous information system and method thereof WO2020162943A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2019/017126 WO2020162943A1 (en) 2019-02-07 2019-02-07 Holistic intelligence and autonomous information system and method thereof

Publications (1)

Publication Number Publication Date
WO2020162943A1 true WO2020162943A1 (en) 2020-08-13

Family

ID=71947264

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/017126 WO2020162943A1 (en) 2019-02-07 2019-02-07 Holistic intelligence and autonomous information system and method thereof

Country Status (1)

Country Link
WO (1) WO2020162943A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112905805A (en) * 2021-03-05 2021-06-04 北京中经惠众科技有限公司 Knowledge graph construction method and device, computer equipment and storage medium
CN112905806A (en) * 2021-03-25 2021-06-04 哈尔滨工业大学 Knowledge graph materialized view generator and generation method based on reinforcement learning
US11188833B1 (en) * 2020-11-05 2021-11-30 Birdview Films. Llc Real-time predictive knowledge pattern machine
CN114707488A (en) * 2022-02-25 2022-07-05 马上消费金融股份有限公司 Data processing method and device, computer equipment and storage medium
CN115080968A (en) * 2022-06-08 2022-09-20 陕西天诚软件有限公司 Artificial intelligence server with intelligent security protection
CN116032652A (en) * 2023-01-31 2023-04-28 湖南创亿达实业发展有限公司 Gateway authentication method and system based on intelligent interactive touch panel
CN116680459A (en) * 2023-07-31 2023-09-01 长沙紫喇叭电子商务有限公司 Foreign trade content data processing system based on AI technology

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6202060B1 (en) * 1996-10-29 2001-03-13 Bao Q. Tran Data management system
US20020133368A1 (en) * 1999-10-28 2002-09-19 David Strutt Data warehouse model and methodology
US20040139426A1 (en) * 2002-10-25 2004-07-15 Yuh-Cherng Wu Enterprise multi-agent software system
US20080010496A1 (en) * 2006-06-08 2008-01-10 Sanjoy Das System and Method to Create and Manage Multiple Virtualized Remote Mirroring Session Consistency Groups
US20140095525A1 (en) * 2012-09-28 2014-04-03 Oracle International Corporation Tactical query to continuous query conversion
US20170060639A1 (en) * 2015-08-28 2017-03-02 Vmware, Inc. Scalable Concurrent Execution of Distributed Workflows Sharing Common Operations
US20170109762A1 (en) * 2015-10-19 2017-04-20 Yeon Tae KIM Omni-channel marketing curation system based on big data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19914332

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19914332

Country of ref document: EP

Kind code of ref document: A1