LU100517B1 - Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises - Google Patents

Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises

Info

Publication number
LU100517B1
Authority
LU
Luxembourg
Prior art keywords
premises
user
computing unit
data records
devices
Prior art date
Application number
LU100517A
Other languages
German (de)
French (fr)
Inventor
Bernard Ndolo
Original Assignee
Bernard Ndolo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bernard Ndolo filed Critical Bernard Ndolo
Priority to LU100517A priority Critical patent/LU100517B1/en
Application granted granted Critical
Publication of LU100517B1 publication Critical patent/LU100517B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/20 Administration of product repair or maintenance

Abstract

Disclosed is a system and method for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises. The method includes the step of curating a plurality of data records pertaining to premises and a plurality of devices installed in the premises. The plurality of data records is curated during planning, construction, and installation phases. The method includes the step of storing the curated data records in a database. The method includes the step of accessing the stored data records corresponding to the premises and the devices through a computing unit on receiving an input command from a user, or the stored data may automatically activate through a plurality of sensors configured to the computing unit. The method includes the step of displaying the accessed data on receiving a pointing gesture from the user through the computing unit either towards the premises, the devices installed in the premises, or a plurality of elements within the premises such as walls, ceilings, floors, doors, etc. The computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states.

Description

METHOD AND SYSTEM FOR CURATING, ACCESSING, AND DISPLAYING A PLURALITY OF DATA RECORDS PERTAINING TO PREMISES, AND A PLURALITY OF DEVICES INSTALLED IN THE PREMISES

TECHNICAL FIELD

[0001] The present invention generally relates to a method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises.

BACKGROUND

[0002] Conventionally, a user depends on blueprints, user manuals, and documents to access and understand the data and information of constructed premises to be renovated (a room, house, building, etc.), and the devices (electrical cabling, plumbing networks, gas sensors, heating units, ventilation, air-conditioning units, lights, furniture, etc.) installed in the premises.

[0003] The utilization of the blueprints, user manuals, and documents slows down the response times to critical events, which can, in turn, lead to increased possibilities of incurring expensive repair costs. Delays in undertaking repairs and renovations of the premises and devices can lead to serious accidents in the event of an emergency such as an electrical fault, fire, water damage, etc. Furthermore, in case the device or service provider is unavailable, the user or owner has to waste a lot of time and money searching for an alternative service provider, who in turn has to find/search for the original installation blueprints or information concerning the installed devices and smart sensors.

[0004] Additionally, many contractors and vendors are involved in the construction of premises; therefore, when an installed device fails to operate as expected or becomes outdated, the users and premises owners have to locate the information pertaining to the contractors/vendors who worked on the project. Information about the installation dates and guarantee obligations of the various installed devices and systems is not easily and readily accessible or obtainable, and is many times outdated.

The users are therefore faced with the challenges of sourcing new systems, devices and service providers, which in turn leads to a waste of valuable time and money.

[0005] There are various systems and methods that exist to solve the aforementioned problems. However, the existing systems and methods do not provide the premises owner/end user with a unified platform and a software application that can be used to easily access curated data and information, view the curated data and information, offer control functions of the installed devices and systems, and enable them to purchase goods and services relating to devices, smart sensors, installation and maintenance services. The existing systems and methods provide the premises owner/end user with multiple or different platforms and software applications that are used to access data and information, view the data and information, control the various installed devices and systems, and offer fragmented options to enable the purchase of goods and services. This lack of a unified platform and a software application, in turn, leads to a complex, slow, expensive and often frustrating user experience.

[0006] Therefore, there is a need for a unified system and method that can be used to easily access curated data and information, view the curated data and information, offer control functions of the installed devices and systems, and enable users to purchase goods and services relating to devices, smart sensors, installation and maintenance services. Furthermore, there is a need for a system and method which can enable the user to add to or remove from the curated data and information of the installed devices and systems.

[0007] The disadvantages and limitations of traditional and conventional approaches will become apparent to the person skilled in the art through a comparison of the described system and method with some aspects of the present disclosure, as put forward in the remainder of the present application and with reference to the drawings.

DISCUSSION OF RELATED ART

[0008] US patent application 20140210856 A1 of Sean Finn, which is incorporated herein by reference, discloses a system and method to provide an augmented reality image which combines a real-time, real view of an external element (e.g., a wall or a ceiling) in a real environment, overlaid with an image of a 3D digital model of internal elements such as pipes, conduits, wall studs, etc. as they exist hidden behind the external element. By incorporating AR (Augmented Reality) technology into land surveying, 3D laser scanning, and digital modelling processes, the 3D digital model of the internal elements is overlaid on the live view of the mobile device, aligned to the orientation and scale of the scene shown on the mobile device. Further, a wearable augmented-reality system such as the DAQRI Smart Helmet is being developed for use in industrial fabrication industries, especially the building and construction industry. Essentially, this smart helmet allows builders, engineers, and designers to take their BIM model to the construction site, wear it on their heads, and experience it as an immersive, full-scale 3D environment. Furthermore, Shapetrace has developed augmented/mixed reality tools to help construction teams prevent errors and build right the first time; they compare the 3D construction plans (BIM) with the actual conditions using tablets. However, the patent and non-patent literature mentioned above does not explicitly discuss a unified system and method to access and display the curated data records and information pertaining to any premises and the devices installed in the premises. The existing arts are limited to manufacturing plants and other industrial equipment. Additionally, the existing arts offer only one aspect of the AR (Augmented Reality) data viewing function while utilizing CAD drawings to identify the devices. Further, the literature mentioned above also does not describe a unified platform and a software application that enables the user to order replacement or upgrade devices, including the possibility of purchasing maintenance and installation services for those devices.

SUMMARY OF INVENTION

[0009] According to embodiments illustrated herein, there is provided a system which functions as a unified platform and a software application for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises. The platform and the software application also include the function of controlling the plurality of installed devices and systems and enabling the purchase of goods and services related to such devices and systems. The unified platform and the software application include a processor, and a memory to store machine-readable instructions that, when executed by the processor, curate a plurality of data records pertaining to premises and a plurality of devices installed in the premises through a curation module. The plurality of data records and information is curated during a plurality of phases such as a planning phase of the premises, a construction phase of the premises, and an installation phase of the devices in the premises. The processor is further configured to store the curated data records in a database. Then the processor is configured to access the stored data records corresponding to the premises and the devices through an access module by utilizing a computing unit on receiving an input command from a user, or the stored data may automatically activate through a plurality of sensors configured to the computing unit.

[0010] Further, the processor is configured to display the accessed data through a display module on receiving a pointing gesture by the computing unit towards at least one of the premises, the devices installed in the premises, or a plurality of elements within the premises such as walls, ceilings, floors, doors, etc. The computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states. In an aspect, the present unified platform and application enable the user to diagnose installed devices and purchase replacement devices and maintenance services for the installed devices.

[0011] As per the embodiments illustrated herein, there is provided a method for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises. The method includes the step of curating, by one or more processors, a plurality of data records pertaining to premises, and a plurality of devices installed in the premises. The plurality of data records is curated during a plurality of phases such as a planning phase of the premises, a construction phase of the premises, and an installation phase of the devices in the premises. Then the method includes the step of storing, by one or more processors, the curated data records in a database. Further, the method includes the step of accessing, by one or more processors, the stored data records corresponding to the premises and the devices through a computing unit on receiving an input command from a user, or the stored data may automatically activate through a plurality of sensors configured to the computing unit. Furthermore, the method includes the step of displaying, by one or more processors, the accessed data on receiving a pointing gesture from the user through the computing unit either towards the premises, various elements within the premises such as walls, ceilings, floors, doors, etc., or the devices installed in the premises. The computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states.
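
The overall flow (curate, store, access, display) can be pictured with a minimal sketch. This is an illustrative Python outline only, assuming a simple in-memory store; the names (DataRecord, PremisesRecordService, render) are hypothetical and are not defined by the patent.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DataRecord:
    subject_id: str   # a premises element (wall, floor) or an installed device
    phase: str        # "planning", "construction", or "installation"
    payload: dict     # blueprints, manuals, installer details, etc.

class PremisesRecordService:
    def __init__(self) -> None:
        self._db: dict[str, list[DataRecord]] = {}

    def curate(self, record: DataRecord) -> None:
        # Curate a record and store it in the database.
        self._db.setdefault(record.subject_id, []).append(record)

    def access(self, subject_id: str) -> list[DataRecord]:
        # Access stored records for a premises element or device, e.g. on a
        # user input command or a sensor-triggered activation.
        return self._db.get(subject_id, [])

    def display(self, subject_id: str, render: Callable[[DataRecord], None]) -> None:
        # Hand the accessed records to an AR renderer once a pointing gesture
        # has been resolved to subject_id.
        for record in self.access(subject_id):
            render(record)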

[0012] Accordingly, one advantage of the present invention is that it provides a unified platform and an application that displays the installed infrastructure in the premises, provides control over the devices, allows the end user to order/purchase replacement or upgrade devices, enables the user to order maintenance and installation services, and enables the extraction of diagnostics information of the installed devices and systems.

[0013] Accordingly, one advantage of the present invention is that it provides fast and easy access to the curated data records and information about the premises or the installed devices in the premises by using a single software application that has a user interface which automatically responds to the user's pointing gestures, preferences and computing unit internal sensors.

[0014] Another advantage of the present invention is that it enables the user to add to or remove from the curated data and information of the premises and the installed devices. [0015] Still another advantage of the present invention is that it provides a novel mechanism to automatically identify an installed device in the premises and to provide the curated data and diagnostics of that installed device.

[0016] Another advantage of the present invention is that it enables the user to purchase a replacement device or system, purchase installation, repair or maintenance services from approved or various suppliers and installation companies.

[0017] Still another advantage of the present invention is that it enables the user to control multiple functions of the different installed devices and systems in the premises.

[0018] Still another advantage of the present invention is that it provides the user with the installation date of the device, the installer's name, and contact details of the installer. [0019] Still another advantage of the present invention is that it informs the user about the availability schedules of the various maintenance and installation contractors based on the geographical location of the user.

[0020] Still another advantage of the present invention is that it enables the user to rate the services provided by the various device suppliers, installers and maintenance providers.

[0021] Still another advantage of the present invention is that it provides a single software application that has a user interface which automatically responds to the user's pointing gestures, preferences and computing unit internal sensors to gain access to all the above-mentioned advantages.

[0022] The aforementioned features and advantages of the present disclosure may be appreciated by reviewing the following description of the present disclosure, along with the accompanying figures wherein like reference numerals refer to like parts.

BRIEF DESCRIPTION OF DRAWINGS

[0023] The appended drawings illustrate the embodiments of the system and method for curating, accessing, and displaying a plurality of data records and information pertaining to premises, elements of the premises, and a plurality of devices installed in the premises of the present disclosure. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries in the drawings represent an example of the boundaries. In an exemplary embodiment, one element may be designed as multiple elements, or multiple elements may be designed as one element. In an exemplary embodiment, an element shown as an internal component of one element may be implemented as an external component in another and vice versa. Furthermore, the elements may not be drawn to scale.

[0024] Various embodiments will hereinafter be described in accordance with the accompanying drawings, which have been provided to illustrate, not limit, the scope, wherein similar designations denote similar elements, and in which: [0025] FIG. 1 illustrates the flowchart of the method for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises, in accordance with an embodiment.

[0026] FIG. 2 represents a block diagram of the present system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises, in accordance with at least one embodiment; [0027] FIG. 3 illustrates an exemplary view of a pointing gesture from the user through the computing unit towards the device (smart sensors) of the premises, in accordance with at least one embodiment; [0028] FIG. 4 illustrates an augmented reality control state of the device such as a TV on receiving a pointing gesture from the user through the computing unit, in accordance with at least one embodiment; [0029] FIG. 5 illustrates an augmented reality control state of the device such as a stereo system on receiving a pointing gesture from the user through the computing unit, in accordance with at least one embodiment; [0030] FIG. 6 illustrates an exemplary view of a pointing gesture from the user through the computing unit towards the floor of the premises, in accordance with at least one embodiment; [0031] FIG. 7 illustrates an exemplary view of a pointing gesture from the user through the computing unit towards the ceiling of the premises, in accordance with at least one embodiment; [0032] FIG. 8 illustrates an exemplary view of a 360-degree pointing gesture from the user through the computing unit towards the wall of the premises or the devices installed in the premises, in accordance with at least one embodiment; [0033] FIG. 9 illustrates an exemplary view of the user wearing a mixed reality headset, in accordance with at least one embodiment; [0034] FIG. 10 illustrates an exemplary view of the user wearing a virtual reality headset, in accordance with at least one embodiment; [0035] FIG. 11 illustrates an exemplary view of a pointing gesture from the user through the computing unit towards the lights installed in an office, in accordance with at least one embodiment; [0036] FIG. 12 illustrates an exemplary view of a pointing gesture from the user through the computing unit towards a building, in accordance with at least one embodiment; [0037] FIG. 13 illustrates a plurality of pre-defined user-interface states, in accordance with at least one embodiment; [0038] FIG. 14 illustrates an exemplary view of a clock face/other image user-interface state and an augmented reality user-interface state depicting plumbing and cabling networks, in accordance with at least one embodiment; [0039] FIG. 15 illustrates an augmented reality control state and an exemplary view of a pointing gesture from the user through the computing unit to control an air-conditioning unit installed in the premises, in accordance with at least one embodiment; and [0040] FIG. 16 illustrates an augmented reality control state and an exemplary view of a pointing gesture from the user through the computing unit to control a floor heating unit installed in the premises, in accordance with at least one embodiment.

DETAILED DESCRIPTION

[0041] The present disclosure is best understood with reference to the detailed drawings and description set forth herein. Various embodiments have been discussed with reference to the drawings. However, the person skilled in the art will readily appreciate that the detailed descriptions provided herein with respect to the drawings are merely for explanatory purposes, as the systems and methods may extend beyond the described embodiments. For instance, the teachings presented and the needs of a particular application may yield multiple alternative and suitable approaches to implement the functionality of any detail described herein. Therefore, any approach may extend beyond certain implementation choices in the following embodiments.

[0042] FIG. 1 illustrates the flowchart 100 of the method for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises, in accordance with an embodiment. The method initiates with the step 102 of curating, by one or more processors, a plurality of data records pertaining to premises, and a plurality of devices installed in the premises. In an embodiment, the premises are selected from at least one of a room, a house, an apartment, a commercial building, and/or a combination thereof. In an embodiment, the plurality of devices and infrastructure includes but is not limited to electric cabling, telephone or Ethernet cabling, a plumbing infrastructure/system, a general cabling infrastructure, a heating unit, ventilation, an air-conditioning unit, an electrical unit, furniture, an electronic unit, etc.

[0043] The plurality of data records is curated during a plurality of phases such as a planning phase of the premises, a construction phase of the premises, and an installation phase of the infrastructure and devices in the premises. The data is collected by utilizing various methods such as user inputs, digital blueprints of the premises and devices, video and sound recordings, etc. Further, the collected data is processed for presentation in a pre-defined state such as augmented reality (AR). The collection and curation of the data and information is a continuous process.

[0044] Then the method includes the step 104 of storing, managing and processing the curated data records in a database or in a cloud. Further, the method includes the step 106 of accessing, by one or more processors, the stored data records corresponding to the premises and the devices through a computing unit on receiving an input command from a user, or the stored data may automatically activate through a plurality of sensors configured to the computing unit. In an embodiment, the computing unit includes but is not limited to a computer, a smartphone, a tablet, a personal digital assistant (PDA), mixed reality headsets, virtual reality headsets, and/or a combination thereof.
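
As a sketch of the sensor-driven branch of step 106, the callback wiring below is an assumption about how stored data "may automatically activate through a plurality of sensors configured to the computing unit"; the names (SensorActivation, fetch_records, on_activate) are hypothetical.

from typing import Callable

class SensorActivation:
    def __init__(self, fetch_records: Callable[[str], list]) -> None:
        self._fetch_records = fetch_records
        self._handlers: dict[str, Callable[[list], None]] = {}

    def register(self, sensor_id: str, on_activate: Callable[[list], None]) -> None:
        # Associate a sensor on the computing unit with a display handler.
        self._handlers[sensor_id] = on_activate

    def on_sensor_event(self, sensor_id: str, subject_id: str) -> None:
        # When a configured sensor fires, pull the stored records for the
        # detected premises element or device and push them to the handler,
        # without waiting for an explicit input command from the user.
        handler = self._handlers.get(sensor_id)
        if handler is not None:
            handler(self._fetch_records(subject_id))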

[0045] In an embodiment, the present method utilizes various internationally recognized device identification methods to identify the various devices installed in the premises. Examples of the internationally recognized device identification methods include but are not limited to the Universal Product Code (UPC), the International Standard Book Number (ISBN), and the European Article Number (EAN). The Universal Product Code is a code printed on the retail product packaging to aid in identifying a particular item. It consists of a machine-readable barcode, which is a series of unique black bars, and a unique 12-digit number beneath it.
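
For illustration, a 12-digit UPC-A number carries a check digit that can be validated with a weighted checksum; the helper below is a minimal sketch of that rule (EAN-13 uses a similar scheme) and is not part of the claimed method.

def is_valid_upc_a(code: str) -> bool:
    # Validate a 12-digit UPC-A code against its check digit.
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    odd_sum = sum(digits[0:11:2])   # positions 1, 3, ..., 11 (1-indexed)
    even_sum = sum(digits[1:10:2])  # positions 2, 4, ..., 10 (1-indexed)
    check = (10 - ((odd_sum * 3 + even_sum) % 10)) % 10
    return check == digits[11]

# Example: a sample 12-digit code that satisfies the checksum.
assert is_valid_upc_a("036000291452")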

[0046] In another embodiment, the present method automatically identifies the installed device by utilizing a plurality of image recognition technologies such as Google Cloud Vision (developed by Google™), Amazon Rekognition (developed by Amazon™), Microsoft Azure (developed by Microsoft™), Apple Vision (developed by Apple™), Facebook Image Recognition (developed by Facebook™), IBM Watson Visual Recognition (developed by IBM™), Cloudsight™, Clarifai™, device manufacturers' image libraries, etc. The present system accesses these technologies by using authorized or licensed APIs provided by the respective organizations.
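
As one hedged example of calling such an API, the sketch below uses the google-cloud-vision Python client for label detection, assuming the library is installed and credentials are configured; the mapping from returned labels to a curated device catalogue (known_devices, identify_device) is purely illustrative and not specified by the patent.

from google.cloud import vision

def identify_device(image_path: str, known_devices: dict[str, str]) -> str | None:
    # Send the photographed device to the label-detection endpoint.
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Match returned labels (e.g. "thermostat", "smoke detector") against the
    # curated catalogue of installed devices for this premises (assumed here
    # to be keyed by lowercase label text).
    for label in response.label_annotations:
        device_id = known_devices.get(label.description.lower())
        if device_id is not None:
            return device_id
    return None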

[0047] Furthermore, the method includes the step 108 of displaying, by one or more processors, the accessed data on receiving a pointing gesture from the user through the computing unit either towards the premises, various elements within the premises such as walls, ceilings, floors, doors, etc., or the devices installed in the premises. The computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states. In an embodiment, the plurality of user-interface states includes a clock face/other image user-interface state, as shown in FIGS. 13-14, and an augmented reality user-interface state, as shown in FIGS. 13-14.

[0048] In an embodiment, the clock face/other image user-interface state displays a plurality of visual cues pertaining to the premises and the devices and further prevents an unintentional activation of the augmented reality user-interface state. The visual cue includes but is not limited to a textual data record, a graphical data record, etc.

[0049] In an embodiment, the augmented reality user-interface state activates on receiving the pointing gesture at a wall, a floor, a ceiling, a door, a room, a device, a smart sensor, a building, or furniture to display a corresponding curated data record.
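
A simple state machine conveys how the two user-interface states in paragraphs [0048]-[0049] could interact. This is only a sketch: the dwell-time rule used here to prevent unintentional activation is an assumption, and the names (UiState, InterfaceController) are hypothetical.

from enum import Enum, auto

AR_TARGETS = {"wall", "floor", "ceiling", "door", "room",
              "device", "smart_sensor", "building", "furniture"}

class UiState(Enum):
    CLOCK_FACE = auto()         # shows textual/graphical visual cues only
    AUGMENTED_REALITY = auto()  # overlays curated records on the live view

class InterfaceController:
    def __init__(self, dwell_threshold_s: float = 0.5) -> None:
        self.state = UiState.CLOCK_FACE
        self._dwell_threshold_s = dwell_threshold_s

    def on_pointing_gesture(self, target_type: str, dwell_s: float) -> UiState:
        # Only a sustained gesture at a recognised target switches to the AR
        # state, so a stray movement does not activate it unintentionally.
        if target_type in AR_TARGETS and dwell_s >= self._dwell_threshold_s:
            self.state = UiState.AUGMENTED_REALITY
        return self.state

    def dismiss(self) -> UiState:
        # Return to the clock face/other image state.
        self.state = UiState.CLOCK_FACE
        return self.state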

[0050] Then the method includes the step 110 of enabling, by one or more processors, the user to add or remove data related to a plurality of additional devices which are originally not installed in the premises at the planning phase, construction phase, and installation phase of the devices in the premises. Further, the method includes the step 112 of enabling, by one or more processors, the user to wirelessly control a plurality of functions of the devices. The method then includes the step 114 of enabling, by one or more processors, the user to purchase a device, install a device, or purchase installation and maintenance services of the device or system in case the device or a system is damaged and requires a replacement or maintenance.

[0051] FIG. 2 represents a block diagram of the present system 200 for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises, in accordance with at least one embodiment. Fig. 2 is explained in conjunction with Fig. 1. In one embodiment, the system 200 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.

[0052] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 200 to interact with a user directly or through the computing units. Further, the I/O interface 204 may enable the system 200 to communicate with other computing devices, such as web servers and external data servers. The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.

[0053] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.

[0054] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 include a curation module 212, an access module 214, a display module 216, a modification module 217, a control module 218, a purchase module 219, and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the system 200.

[0055] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 210 may also include curation data 222, access data 224, display data 225, modification data 226, control data 227, purchase data 228, and other data 230. The other data 230 may include data generated as a result of the execution of one or more modules in the other modules 220.
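
One way to picture how the modules 208 operate on the data 210 repository is the rough Python sketch below; the reference numerals are kept in the comments, but the class layout and the two example module methods are assumptions for illustration only.

from dataclasses import dataclass, field

@dataclass
class Data210:
    """Repository for data processed, received, and generated by the modules."""
    curation_data: dict = field(default_factory=dict)      # 222
    access_data: dict = field(default_factory=dict)        # 224
    display_data: dict = field(default_factory=dict)       # 225
    modification_data: dict = field(default_factory=dict)  # 226
    control_data: dict = field(default_factory=dict)       # 227
    purchase_data: dict = field(default_factory=dict)      # 228
    other_data: dict = field(default_factory=dict)         # 230

class Modules208:
    """Functional modules 212-220 sharing the Data210 repository."""
    def __init__(self, data: Data210) -> None:
        self.data = data

    def curation_module(self, record_id: str, record: dict) -> None:
        # 212: curate a record and keep the result in curation data 222.
        self.data.curation_data[record_id] = record

    def access_module(self, record_id: str) -> dict:
        # 214: read a stored record and note the access in access data 224.
        record = self.data.curation_data.get(record_id, {})
        self.data.access_data[record_id] = record
        return record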

[0056] In one implementation, the curation module 212 curates a plurality of data records pertaining to premises and a plurality of devices installed in the premises. The plurality of data records is curated during a plurality of phases such as a planning phase of the premises, a construction phase of the premises, an installation phase of the devices in the premises.

[0057] The processor is configured to store the curated data records in a database or in a cloud. In one implementation, the access module 214 accesses the stored data records corresponding to the premises and the device by utilizing a computing unit on receiving an input command from a user. In one implementation, the display module 216 displays the accessed data on receiving a pointing gesture by the computing unit either towards the premises or the devices installed in the premises. The computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states.

[0058] In one implementation, the modification module enables the user to add or remove data related to a plurality of additional devices which are originally not installed in the premises. In one implementation, the control module enables the user to wirelessly control a plurality of functions of the devices. In one embodiment, the wireless control mechanism can be accomplished by a plurality of methods. In the first method, once the software application automatically identifies the device, the software application accesses the device manufacturer's built-in control functions/capabilities/methods. The control functions/capabilities/methods of the identified device are displayed in AR display mode by the application to the user.

[0059] In the second method, the software application uses the pre-programmed/configured control functions made by the installer or the user as a result of connections made between devices. For example, the devices of a multimedia system typically may be interconnected (e.g., by cabling, internet protocol, Bluetooth or infrared) in a wide variety of different manners. Once a user (e.g., an installer or end user) has determined all the connections/control functions that are required, or at least are desirable, between devices of a multimedia system, the application will gain access to the pre-programmed/configured control functions and give the end user the capability of controlling the multimedia system via the AR (Augmented Reality) display mode generated by the application. The software application gains access to the installer's or user's pre-programmed/configured control functions by using Internet protocol gateway components and licensed or authorized application interface protocols.
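
The two control-resolution paths in paragraphs [0058]-[0059] can be sketched as a simple lookup with a fallback. The interfaces below (ControlResolver and the dictionaries of control functions) are assumptions; the patent leaves the gateway and API details to licensed or authorized protocols.

from typing import Callable

ControlFn = Callable[..., None]

class ControlResolver:
    def __init__(self,
                 manufacturer_controls: dict[str, dict[str, ControlFn]],
                 installer_controls: dict[str, dict[str, ControlFn]]) -> None:
        # Method 1: control functions published by the device manufacturer.
        self._manufacturer_controls = manufacturer_controls
        # Method 2: control functions pre-programmed by the installer or user
        # and reached through an IP gateway / authorised application interface.
        self._installer_controls = installer_controls

    def controls_for(self, device_id: str) -> dict[str, ControlFn]:
        # Prefer manufacturer-provided functions; fall back to the installer's
        # pre-configured functions when the device exposes none.
        controls = self._manufacturer_controls.get(device_id)
        if controls:
            return controls
        return self._installer_controls.get(device_id, {})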

[0060] In one implementation, the purchase module enables the user to purchase a device in case the device is damaged or requires a replacement. In an embodiment, the present system 200 and method can be utilized as a software application which uses Augmented Reality (AR) to display the various functions of the presently installed device or system. If the user's computing unit has AR capabilities, the user can use the present system 200 to get the data and information about the house, room or installed infrastructures of the building.

[0061] For example, if the user wants to see where the water pipes and electric cables of a building, house or room were installed behind a specific wall, floor or ceiling, all they have to do is activate the software application installed on their computing unit and point the computing unit at the wall, floor or ceiling that they wish to get information about; the user interface of the software application will then change to display an AR (augmented reality) view of the water pipes and electric cables that were installed behind that specific wall, floor or ceiling (shown in FIG. 14). FIG. 3 illustrates an exemplary view 300 of a pointing gesture from the user through the computing unit 308 towards the device (smart sensors) 304 of the premises, in accordance with at least one embodiment.
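
Resolving such a gesture into overlay content amounts to looking up the curated records for the pointed-at element. The following is an illustrative sketch with hypothetical element identifiers and record types; it is not the patent's data model.

from dataclasses import dataclass

@dataclass
class HiddenInstallation:
    kind: str        # e.g. "water_pipe", "electric_cable"
    geometry: list   # polyline or mesh data taken from the curated blueprints

def overlay_records(element_id: str,
                    curated: dict[str, list[HiddenInstallation]]
                    ) -> list[HiddenInstallation]:
    # The gesture resolver maps the pointed-at surface (say "kitchen_north_wall")
    # to an element id; the AR layer then draws whatever was curated for it.
    return curated.get(element_id, [])

# Example usage with placeholder data:
demo = {"kitchen_north_wall": [HiddenInstallation("water_pipe", []),
                               HiddenInstallation("electric_cable", [])]}
print([item.kind for item in overlay_records("kitchen_north_wall", demo)])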

[0062] FIG. 6 illustrates an exemplary view 600 of a pointing gesture from the user through the computing unit towards the floor 602 of the premises, in accordance with at least one embodiment. FIG. 7 illustrates an exemplary view 700 of a pointing gesture from the user through the computing unit 308 towards the ceiling 702 of the premises, in accordance with at least one embodiment. FIG. 11 illustrates an exemplary view 1100 of a pointing gesture from the user through the computing unit 308 towards the lights 1102 installed in an office, in accordance with at least one embodiment.

[0063] In another example, if the user points their computing unit at a particular device or smart sensor, the software application will automatically identify the device and offer the user the control functions of that device (shown in FIG. 13). FIG. 4 illustrates an augmented reality control state 400 of a device such as a TV 402 on receiving a pointing gesture from the user through the computing unit 308, in accordance with at least one embodiment. FIG. 5 illustrates an augmented reality control state 500 of a device such as a stereo system 502 on receiving a pointing gesture from the user through the computing unit 308, in accordance with at least one embodiment. FIG. 15 illustrates an augmented reality control state and exemplary view 1500 of a pointing gesture from the user through the computing unit 308 to control an air-conditioning unit 1502 installed in the premises, in accordance with at least one embodiment. The present system 200 enables the user to control the air-conditioning (AC) unit 1502 by utilizing the augmented reality function. FIG. 16 illustrates an augmented reality control state and exemplary view 1600 of a pointing gesture from the user through the computing unit 308 to control a floor heating unit 1602 installed in the premises, in accordance with at least one embodiment. The present system 200 enables the user to control the floor heating unit 1602 via the augmented reality function. The software application automatically offers the option of controlling the floor heating unit 1602 on receiving the pointing gesture from the user through his/her computing unit 308 towards the floor.
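
As a rough sketch of the identification step described above, the snippet below maps a recognized device identifier to the control functions that could be offered in the AR control state. The identifiers, catalogue and function names are assumptions made purely for illustration.

```python
# Hypothetical catalogue of controllable devices keyed by recognized identifier.
DEVICE_CONTROLS: dict[str, list[str]] = {
    "tv-402": ["power", "volume_up", "volume_down", "channel_up", "channel_down"],
    "stereo-502": ["power", "play", "pause", "volume_up", "volume_down"],
    "ac-1502": ["power", "temperature_up", "temperature_down", "fan_speed"],
    "floor-heating-1602": ["power", "target_temperature"],
}

def controls_for_pointed_device(recognized_id: str) -> list[str]:
    """Return the control functions to show in the AR control state, or an
    empty list if the pointed-at object is not a controllable device."""
    return DEVICE_CONTROLS.get(recognized_id, [])

# Example: pointing at the air-conditioning unit offers its AC controls.
print(controls_for_pointed_device("ac-1502"))
```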

[0064] Further, if the user points his/her computing unit at a specific device, smart sensor, system, furniture or light, the system will automatically detect the device, smart sensor, system, furniture or light and proceed to provide information concerning the device's specification, diagnostics results, installation date, guarantee information, and supplier and installer information in the event the device needs to be serviced, repaired or replaced. The user has the ability to purchase the device and to order maintenance or installation services from approved or various suppliers and installation companies. The present system enables the user to add or change installed devices, systems, suppliers and installation companies in the curated data.
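
A curated device record covering the items listed above could be represented roughly as follows; the field names and the routing policy are illustrative assumptions, not the patent's schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DeviceRecord:
    device_id: str
    specification: str          # manufacturer specification summary
    diagnostics: str            # latest diagnostics result
    installation_date: date
    guarantee_until: date       # guarantee/warranty expiry
    supplier: str               # supplier contact for replacement purchases
    installer: str              # installer contact for service and repair

    def service_contact(self, today: date) -> str:
        # Assumed policy for the sketch: while the guarantee runs, route service
        # requests to the supplier; afterwards, to the installer/maintenance company.
        return self.supplier if today <= self.guarantee_until else self.installer

record = DeviceRecord(
    device_id="ac-1502",
    specification="Split-type AC, 3.5 kW cooling",
    diagnostics="OK (last self-test passed)",
    installation_date=date(2016, 5, 12),
    guarantee_until=date(2021, 5, 12),
    supplier="supplier@example.com",
    installer="installer@example.com",
)
print(record.service_contact(date(2020, 1, 1)))
```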

[0065] FIG. 13 illustrates a plurality of pre-defined user-interface states 1300 such as a clock face/other image user-interface state 1302 and an augmented reality user-interface state 1304, in accordance with at least one embodiment. FIG. 14 illustrates an exemplary view 1400 of plumbing and cabling networks 1402 and 1404 in a clock face/other image user-interface state and an augmented reality user-interface state respectively, in accordance with at least one embodiment. The software application of the present system is configured with the computing unit of the user. This software application includes a plurality of user-interface states (shown in FIGS. 13-14). A user-interface state is a state in which the present software application responds in a predefined manner to a user input or action. The plurality of user-interface states on the computing unit includes a clock face/other image user-interface state 1302 and an augmented reality user-interface state 1304.
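
The two user-interface states could be modelled as a small state machine; the sketch below is one possible structure assumed for illustration and is not taken from the patent.

```python
from enum import Enum, auto

class UIState(Enum):
    CLOCK_FACE = auto()   # clock face/other image user-interface state 1302
    AR_DISPLAY = auto()   # augmented reality user-interface state 1304

class ApplicationUI:
    """Minimal two-state UI model: each state responds to input in a predefined way."""

    def __init__(self):
        self.state = UIState.CLOCK_FACE

    def handle_input(self, event: str) -> str:
        if self.state is UIState.CLOCK_FACE:
            # In the clock face state most inputs are ignored (see paragraph [0066]);
            # only a pointing gesture activates the AR state.
            if event == "pointing_gesture":
                self.state = UIState.AR_DISPLAY
                return "switched to AR display"
            return "ignored"
        # In the AR state, inputs drive the AR overlay and device controls.
        if event == "dismiss":
            self.state = UIState.CLOCK_FACE
            return "returned to clock face"
        return f"AR handles {event}"

ui = ApplicationUI()
print(ui.handle_input("screen_tap"))        # ignored while in the clock face state
print(ui.handle_input("pointing_gesture"))  # switched to AR display
```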

[0066] In the clock face/other image user-interface state 1302, when the computing unit 308 is powered on and the software application is activated, the clock face/other image user-interface state 1302 ignores most, if not all, user inputs. Thus, the clock face/other image user-interface state 1302 does not initiate any action in response to the user input and/or the software application is prevented from performing a predefined set of functions. The clock face/other image user-interface state 1302 may be used to prevent unintentional activation of the augmented reality user-interface state when the software application is launched.

[0067] When the software application is in the clock face/other image state 1302, the AR (augmented reality) user-interface state 1304 display function/capability may be said to be de-activated. In the clock face/other image user-interface state, the application may respond to a limited set of user inputs, including input that corresponds to activating other functions that do not include the AR (augmented reality) user-interface state 1304. In other words, the clock face/other image user-interface state 1302 of the software application responds to user input corresponding to attempts to activate other functions that do not involve the display of AR data and information (shown in FIG. 13).

[0068] The software application clock face/other image user-interface state 1302 on the tablet computer, smartphone, mixed reality headset or virtual reality headset may display one or more visual cue(s) of an activated AR function to the user. The visual cues may be textual, graphical or any combination thereof. The visual cues are displayed upon a particular event occurring while in the application clock face/other image user-interface state 1302. The particular events that trigger the display of visual cues may include the tablet computer's, smartphone's, mixed reality headset's or virtual reality headset's image recognition capabilities, the user's pointing gestures, geographical position, and building and room identification sensors.
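
One way to express the cue-triggering rule above is a small predicate over recognized events; the event names and cue text below are assumptions for illustration only.

```python
# Events that, per the description, may trigger a visual cue while the app is
# in the clock face/other image state (names are illustrative assumptions).
CUE_TRIGGER_EVENTS = {
    "image_recognized",       # on-device image recognition matched a known element
    "pointing_gesture",       # the user pointed the unit at something
    "geoposition_match",      # geographical position matches a curated premises
    "room_sensor_identified", # building/room identification sensor fired
}

def visual_cue_for(event: str) -> str | None:
    """Return a textual cue to show in the clock face state, or None if the
    event should not produce a cue."""
    if event in CUE_TRIGGER_EVENTS:
        return "AR information available - point your device to view it"
    return None

print(visual_cue_for("pointing_gesture"))
print(visual_cue_for("screen_tap"))  # -> None, ordinary taps do not produce a cue
```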

[0069] The AR (augmented reality) user-interface state 1304 includes a gesture of pointing the phone at a wall, floor, ceiling, door, room, device, smart sensor or furniture. The AR user-interface state 1304 is a predefined function activated when the user points their device at a wall, floor, ceiling, door, room, device, smart sensor, building, or furniture. FIG. 12 illustrates an exemplary view 1200 of a pointing gesture from the user through the computing unit 308 towards a building 1202, in accordance with at least one embodiment.

[0070] The gesture is a motion of the object/appendage pointing a tablet computer, smartphone, mixed reality headset or virtual reality headset device at an object or space. For example, the predefined gesture may include pointing a tablet computer, smartphone, mixed reality headset or virtual reality headset at a wall, ceiling, door, floor, building, device or smart sensor and making a 360-degree rotation (shown in FIG. 8). FIG. 8 illustrates an exemplary view 800 of a 360-degree pointing gesture from the user through the computing unit 308 towards the wall 802 of the premises or the devices installed in the premises, in accordance with at least one embodiment.
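
A rotation gesture like the one described could be detected from orientation samples by accumulating yaw changes until a full turn is reached. The sketch below is a simplified assumption of such a detector; sensor access and thresholds are not specified by the patent.

```python
def detect_full_rotation(yaw_samples_deg: list[float], threshold_deg: float = 360.0) -> bool:
    """Return True once the accumulated yaw change of the device reaches a full
    360-degree turn. yaw_samples_deg is a time-ordered series of compass headings."""
    accumulated = 0.0
    for prev, curr in zip(yaw_samples_deg, yaw_samples_deg[1:]):
        delta = curr - prev
        # Unwrap the angle so a jump from 359 to 1 degree counts as +2, not -358.
        if delta > 180.0:
            delta -= 360.0
        elif delta < -180.0:
            delta += 360.0
        accumulated += delta
        if abs(accumulated) >= threshold_deg:
            return True
    return False

# Example: headings sampled while the user turns on the spot.
headings = [float(h % 360) for h in range(0, 400, 10)]
print(detect_full_rotation(headings))  # True - the user completed a full rotation
```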

[0071] While the application is in the clock face/other image user-interface state, the user may activate the AR (augmented reality) user-interface state, i.e. point their mobile device as shown in FIGS. 3, 4, 5, 6, 7, and 11. The gesture of pointing a tablet computer, smartphone, mixed reality headset or virtual reality headset can be performed using one or two hands. However, it should be appreciated that the pointing gesture may be made using any suitable object or appendage, such as a tripod, selfie-stick, etc. FIG. 9 illustrates an exemplary view 900 of the user wearing a mixed reality headset 902, in accordance with at least one embodiment. FIG. 10 illustrates an exemplary view 1000 of the user wearing a virtual reality headset 1002, in accordance with at least one embodiment.

[0072] If the pointing gesture corresponds to a successful activation of the AR user-interface state, i.e. the user has performed the gesture that activates the AR user-interface state, the transition of the user interface to the AR display mode depends on the element that the user is pointing to, such as the premises or a device.

[0073] The software application begins the process of transitioning to the AR user-interface state upon detection of any pointing gesture and aborts the transition as soon as the application determines that the function needed does not correspond to the AR user-interface state.
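
The begin-then-abort behaviour could look roughly like this; the class name, method names and the gesture classification flag are placeholders invented for the sketch.

```python
class ARTransition:
    """Starts a transition on any pointing gesture and aborts it if the gesture
    turns out not to call for the AR user-interface state (illustrative sketch)."""

    def __init__(self):
        self.in_progress = False

    def on_gesture_detected(self) -> None:
        # Any pointing gesture optimistically starts the transition.
        self.in_progress = True

    def on_gesture_classified(self, needs_ar: bool) -> str:
        if not self.in_progress:
            return "no transition in progress"
        self.in_progress = False
        if needs_ar:
            return "completed: AR user-interface state activated"
        # The requested function does not correspond to the AR state: abort.
        return "aborted: gesture did not call for the AR state"

t = ARTransition()
t.on_gesture_detected()
print(t.on_gesture_classified(needs_ar=False))  # aborted
```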

[0074] When the software application is in the clock face/other image user-interface state, the software application may display user-interface objects corresponding to one or more functions of the software application and/or information that may be of interest to the user. The user-interface objects are objects that make up the user interface of the application and may include, without limitation, text, images, icons, soft keys (or "virtual buttons"), pull-down menus, radio buttons, check boxes, selectable lists, and so forth. The displayed user-interface objects may also include non-interactive objects that convey information or contribute to the look and feel of the user interface. The user may interact with the interactive objects by making contact with the touch screen at one or more touch screen locations corresponding to the interactive objects with which she or he wishes to interact. The software application detects the contact and responds to the detected contact by performing the operation(s) corresponding to the interaction with the interactive object(s).
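
Contact handling of this kind is essentially a hit test: find which interactive object, if any, contains the touch location and perform its bound operation. A minimal sketch follows, with object names and layout values invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class UIObject:
    name: str
    x: int
    y: int
    width: int
    height: int
    interactive: bool
    on_tap: Optional[Callable[[], None]] = None  # operation bound to the object

    def contains(self, tx: int, ty: int) -> bool:
        return self.x <= tx < self.x + self.width and self.y <= ty < self.y + self.height

def handle_contact(objects: list[UIObject], tx: int, ty: int) -> str:
    """Dispatch a touch at (tx, ty) to the interactive object under it, if any."""
    for obj in objects:
        if obj.interactive and obj.contains(tx, ty):
            if obj.on_tap:
                obj.on_tap()
            return f"performed operation for {obj.name}"
    return "no interactive object at contact point"

clock = UIObject("clock_face", 0, 0, 400, 400, interactive=False)
settings = UIObject("settings_button", 10, 10, 48, 48, interactive=True,
                    on_tap=lambda: print("opening settings"))
print(handle_contact([clock, settings], 20, 20))
```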

[0075] While the software application is in the clock face/other image user-interface state, the user may still make contact on a tablet computer, smartphone, mixed reality headsets and virtual reality headsets with touchscreen capabilities. However, the activated AR user-interface state is prevented from performing a predefined set of actions in response to detected contact until the devices detect the pointing gestures.

[0076] Thus, the present invention provides an integrated system which displays the installed infrastructure in the premises, provides control over devices, allows the user to purchase replacement or upgrade devices, and enables the user to purchase maintenance and installation services. The present invention provides a single unified platform to access, view, control and order goods and services related to the premises and the installed devices. Further, the information that pertains to the suppliers of the devices, installers and maintenance service providers is curated by the present invention for the benefit of quality control of the goods and services offered to the user.

[0077] While embodiments of the present invention have been illustrated and described, it will be clear that the present invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to the person skilled in the art, without departing from the spirit and scope of the invention, as described in the claims.

Claims (20)

1. A method implemented by one or more processors, the method comprising the steps of: curating, by one or more processors, a plurality of data records pertaining to premises and a plurality of devices installed in the premises, wherein the plurality of data records is curated during a plurality of phases selected from at least one of a planning phase of the premises, a construction phase of the premises, an installation phase of the devices in the premises, and/or a combination thereof; storing, by one or more processors, the curated data records in a database; accessing, by one or more processors, the stored data records corresponding to the premises and the devices through a computing unit on receiving an input command from a user, or wherein the stored data may be automatically activated through a plurality of sensors configured to the computing unit; and displaying, by one or more processors, the accessed data on receiving a pointing gesture from the user through the computing unit towards at least one of the premises, the devices installed in the premises, a plurality of elements within the premises such as ceilings, floors, doors, and/or a combination thereof, wherein the computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states.
2. The method according to claim 1, further comprising the step of enabling, by one or more processors, the user to add or remove data pertaining to a plurality of additional devices that are not initially installed in the premises during the planning phase, the construction phase and the installation phase of the devices in the premises.
3. The method according to claim 1, further comprising the step of enabling, by one or more processors, the user to wirelessly control a plurality of functions of the devices.
4. The method according to claim 1, further comprising the step of enabling, by one or more processors, the user to purchase a device, install a device, and purchase maintenance and installation services for the device or system in case the device or system is damaged or requires a replacement.
5. The method according to claim 1, wherein the premises are selected from at least one of a room, a house, an apartment, a commercial building, and/or a combination thereof.
6. The method according to claim 1, wherein the plurality of devices is selected from at least one of electrical wiring, telephone or Ethernet wiring, a plumbing infrastructure/system, a general cabling infrastructure, a heating unit, a ventilation unit, an air-conditioning unit, an electrical unit, furniture, an electronic unit, and/or a combination thereof.
7. The method according to claim 1, wherein the computing unit is selected from at least one of a computer, a smartphone, a tablet, mixed reality headsets, virtual reality headsets, and/or a combination thereof.
8. The method according to claim 1, wherein the plurality of user-interface states configured with the computing unit comprises: a clock face/other image user-interface state and an augmented reality user-interface state.
9. The method according to claim 1, wherein the clock face/other image user-interface state displays a plurality of visual cues pertaining to the premises and the devices, and further prevents unintentional activation of the augmented reality user-interface state, wherein the visual cue is selected from at least one of a textual data record, a graphical data record, and/or a combination thereof.
10. The method according to claim 1, wherein the augmented reality user-interface state activates on receiving the pointing gesture towards a wall, a floor, a ceiling, a door, a room, a device, a smart sensor, a building or furniture to display a corresponding curated data record.
11. A system for curating, accessing, and displaying a plurality of data records pertaining to premises and a plurality of devices installed in the premises, the system comprising: a processor and a memory to store machine-readable instructions which, when executed by the processor, cause the processor to: curate, through a curating module, a plurality of data records pertaining to premises and a plurality of devices installed in the premises, wherein the plurality of data records is curated during a plurality of phases selected from at least one of a planning phase of the premises, a construction phase of the premises, an installation phase of the devices in the premises, and/or a combination thereof; store the curated data records in a database; access the stored data records corresponding to the premises and the devices through an accessing module using a computing unit on receiving an input command from a user, or wherein the stored data may be automatically activated through a plurality of sensors configured to the computing unit; and display the accessed data through a display module on receiving a pointing gesture through the computing unit towards at least one of the premises, the devices installed in the premises, a plurality of elements within the premises such as a wall, ceilings, floors, doors, and/or a combination thereof, wherein the computing unit comprises an augmented reality mechanism to display the curated data records through a plurality of pre-defined user-interface states.
12. The system according to claim 11, further comprising a modification module to enable a user to add or remove data pertaining to a plurality of additional devices that are not initially installed in the premises during the planning phase, the construction phase and the installation phase of the devices in the premises.
13. The system according to claim 11, further comprising a control module to enable the user to wirelessly control a plurality of functions of the devices.
14. The system according to claim 11, further comprising a purchase module to enable the user to purchase a device, install a device, and purchase maintenance and installation services for the device or system in case the device or system is damaged or requires a replacement.
15. The system according to claim 11, wherein the premises are selected from at least one of a room, a house, an apartment, a commercial building, and/or a combination thereof.
16. The system according to claim 11, wherein the plurality of devices is selected from at least one of electrical cabling, telephone or Ethernet wiring, a plumbing infrastructure/system, a general cabling infrastructure, a heating unit, a ventilation unit, an air-conditioning unit, an electrical unit, furniture, an electronic unit, and/or a combination thereof.
17. The system according to claim 11, wherein the computing unit is selected from at least one of a computer, a smartphone, a tablet, mixed reality headsets, virtual reality headsets, and/or a combination thereof.
18. The system according to claim 11, wherein the plurality of user-interface states configured with the computing unit comprises: a clock face/other image user-interface state and an augmented reality user-interface state.
19. The system according to claim 11, wherein the clock face/other image user-interface state displays a plurality of visual cues pertaining to the premises and the devices, and further prevents unintentional activation of the augmented reality user-interface state, wherein the visual cue is selected from at least one of a textual data record, a graphical data record, and/or a combination thereof.
20. The system according to claim 11, wherein the augmented reality user-interface state activates on receiving the pointing gesture towards a wall, a floor, a ceiling, a door, a room, a device, a smart sensor, a building or furniture to display a corresponding curated data record.
LU100517A 2017-11-20 2017-11-20 Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises LU100517B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
LU100517A LU100517B1 (en) 2017-11-20 2017-11-20 Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises

Publications (1)

Publication Number Publication Date
LU100517B1 true LU100517B1 (en) 2019-06-19

Family

ID=60574680

Family Applications (1)

Application Number Title Priority Date Filing Date
LU100517A LU100517B1 (en) 2017-11-20 2017-11-20 Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises

Country Status (1)

Country Link
LU (1) LU100517B1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170269617A1 (en) * 2013-07-10 2017-09-21 Crowdcomfort, Inc. Systems and methods for providing augmented reality-like interface for the management and maintenance of building systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SVENJA KAHN ET AL: "Beyond 3D As-Built Information Using Mobile AR Enhancing the Building Lifecycle Management", CYBERWORLDS (CW), 2012 INTERNATIONAL CONFERENCE ON, IEEE, 25 September 2012 (2012-09-25), pages 29 - 36, XP032265669, ISBN: 978-1-4673-2736-7, DOI: 10.1109/CW.2012.12 *

Similar Documents

Publication Publication Date Title
US20190156576A1 (en) Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises
US11263363B2 (en) Dynamic generation and modification of a design model of a building for a construction project
CN103154856B (en) For the environmental correclation dynamic range control of gesture identification
US10705509B2 (en) Digital catalog for manufacturing
US8843350B2 (en) Facilities management system
US10372839B2 (en) Project management system providing optimized interaction with digital models
US9846531B2 (en) Integration of building automation systems in a logical graphics display without scale and a geographic display with scale
RU2612623C2 (en) Role user interface for limited displaying devices
US11941238B2 (en) Systems and methods for entity visualization and management with an entity node editor
JP7087270B2 (en) Information processing equipment and information processing programs
US11934744B2 (en) Method, system and graphical user interface for building design
US20190107940A1 (en) Map-like interface for an electronic design representation
US9141958B2 (en) Method for providing data to a user
US10620807B2 (en) Association of objects in a three-dimensional model with time-related metadata
KR20160108262A (en) Method for managing construction information by recording a location based photo on plan
US20180041401A1 (en) System Diagram GUI Development System and Method of Use
US10019129B2 (en) Identifying related items associated with devices in a building automation system based on a coverage area
JP2015038743A (en) Architectural information integrated management system and architectural information integrated management program
LU100517B1 (en) Method and system for curating, accessing, and displaying a plurality of data records pertaining to premises, and a plurality of devices installed in the premises
US11165599B2 (en) Cognitive component selection and implementation
KR20170132465A (en) Method for supporting remodeling the office, and server and computer-readable recording media using the same
JP2005032228A (en) Computer system, method for inputting data into computer system, computer program product, and produced commodity
EP3709153A1 (en) Method, apparatus, and recording medium for controlling digital signage
JP2021520533A (en) How and system to recommend profile pictures, and non-temporary computer-readable recording media
US20230032961A1 (en) Systems and methods for configuring and obtaining industrial equipment

Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20190619