WO2020068156A1 - Method and apparatus for orientation - Google Patents

Method and apparatus for orientation

Info

Publication number
WO2020068156A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
coordinates
wireless
smart device
agent
Prior art date
Application number
PCT/US2019/024398
Other languages
English (en)
Inventor
Michael SANTARONE
Jason Duff
Michael Wodrich
Original Assignee
Middle Chart, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/142,275 external-priority patent/US10433112B2/en
Application filed by Middle Chart, LLC filed Critical Middle Chart, LLC
Priority to CA3114190A priority Critical patent/CA3114190A1/fr
Publication of WO2020068156A1 publication Critical patent/WO2020068156A1/fr

Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
                • G01C21/20 Instruments for performing navigational calculations
                • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
            • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
                • G01S3/02 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
                • G01S3/04 Details
                • G01S3/043 Receivers
                • G01S3/14 Systems for determining direction or deviation from predetermined direction
                • G01S3/46 Systems for determining direction or deviation from predetermined direction using antennas spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
                • G01S3/50 Systems for determining direction or deviation from predetermined direction using antennas spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems, the waves arriving at the antennas being pulse modulated and the time difference of their arrival being measured
                • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
                • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
                • G01S5/0247 Determining attitude
                • G01S5/14 Determining absolute distances from a plurality of spaced points of known location
                • G01S2205/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
                • G01S2205/01 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations specially adapted for specific applications
                • G01S2205/02 Indoor
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04W WIRELESS COMMUNICATION NETWORKS
                • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
                • H04W4/02 Services making use of location information
                • H04W4/025 Services making use of location information using location based information parameters
                • H04W4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
                • H04W4/029 Location-based management or tracking services
                • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]

Definitions

  • the present invention combines methods and apparatus for designating a geospatial location and a direction of interest.
  • a smart device is employed to generate a first geospatial point and then moved to a second geospatial point.
  • a directional vector is formed including the first geospatial point and the second geospatial point.
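  • The two-point procedure above amounts to subtracting one captured position from the other. A minimal sketch in Python, assuming latitude/longitude capture and a local flat-earth approximation over short distances; the function name and coordinates are illustrative, not taken from the patent:

```python
import math

def direction_of_interest(p1, p2):
    """Given two geospatial points (lat, lon) in degrees captured by a
    smart device at a first and second position, return a unit vector
    (east, north) and a compass bearing describing the direction of
    interest. Uses a local flat-earth approximation, which is adequate
    when the two points are a few metres apart."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    # East/north displacement in a local tangent plane
    dx = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)  # east component
    dy = lat2 - lat1                                   # north component
    norm = math.hypot(dx, dy)
    unit = (dx / norm, dy / norm)
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # 0 degrees = north
    return unit, bearing

# Moving due north yields a bearing of 0 degrees
_, b = direction_of_interest((40.0000, -88.0000), (40.0001, -88.0000))
```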
  • the present invention provides for automated apparatus for improved modeling of construction, Deployment and updating of a Processing Facility.
  • the improved modeling is based upon generation of As Built and Experiential Data captured with one or both of Smart Devices and Sensors located in or proximate to the Processing Facility.
  • the automated apparatus is also operative to model compliance with one or more performance levels for the Processing Facility related to processing of a Product.
  • a virtual model of a Processing Facility extends beyond a design stage of the structure into an “As Built” stage of the structure and additionally includes generation and analysis of Experiential Data capturing conditions realized by the Processing Facility during a Deployment stage of the structure.
  • Experiential Data may be generated and entered into the AVM virtual model of the structure.
  • Experiential Data may include data indicative of a factor that may be tracked and/or measured in relation to the Processing Facility.
  • Experiential data is typically generated by Sensors in or proximate to the Processing Facility and may include, by way of non-limiting example, one or more of: vibration Sensors (such as accelerometers and piezoelectric devices); force transducers; temperature sensing devices; ammeters; ohmmeters; switches; motion detectors; light wavelength capture (such as infrared temperature profile devices); water flow meters; air flow meters; and the like.
  • Some embodiments of the present invention include capturing data of procedures conducted during preventive maintenance and/or a Service Call and inclusion of relevant data into a virtual model.
  • Precise data capture during a Service Call or during construction may include actual locations of building features such as electrical wiring and components, plumbing, joists, headers, beams and other structural components. Data capture may be ongoing over time as the building is used and modified, or updated during the life of a structure (sometimes referred to herein as the “Operational” or “Deployed” stage of the structure).
  • collected data may be used to predict Performance of a Property based upon features built into the structure and conditions experienced by the Property.
  • As Built data may include modifications to a Property that are made during a construction phase, and/or during a Deployment phase, of a Property life cycle.
  • As Deployed data may include details quantifying one or more of: machine operators, production quantity, yield, quality level, usage, maintenance, repairs and improvements performed on the Property.
  • production rates, yields, cost of build, and cost of Deployment, including maintenance costs incurred during Deployment of a Property may be calculated and included into one or more of: a production value of a Property including a Processing Facility; a sale price of a Property; and a lease value of a Property and overall asset volume of the Property.
  • a comprehensive cost of build and Deployment may be amortized over a term of years.
  • an amortized cost may be included in a scheduled payment for a term of years, such as, for example a monthly mortgage payment, wherein the monthly mortgage payment includes Total Cost of Ownership.
  • Total Cost of Ownership may include one or more of acquisition, deployment, repair and maintenance and energy usage.
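  • The amortization described in the preceding items can be sketched with the standard level-payment formula; the interest rate, term and TCO figure below are hypothetical, and the patent does not prescribe any particular formula:

```python
def monthly_payment(total_cost_of_ownership, annual_rate, years):
    """Level monthly payment amortizing a Total Cost of Ownership
    (acquisition, deployment, repair/maintenance and energy estimates)
    over a term of years at a fixed annual interest rate."""
    n = years * 12          # number of monthly payments
    r = annual_rate / 12    # monthly interest rate
    if r == 0:
        return total_cost_of_ownership / n
    return total_cost_of_ownership * r / (1 - (1 + r) ** -n)

# Hypothetical example: $1,200,000 TCO over 30 years at 6% annual interest
payment = monthly_payment(1_200_000, 0.06, 30)
```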
  • a sale price that includes Total Cost of Ownership may have favorable tax implications for one of or both Buyer and Seller.
  • a) design data; b) As Built data; c) Experiential Data; and d) Lead Actions and Lag Benefit measurements, as they relate to multiple Properties may be aggregated and accessed to support one or more Properties.
  • Access to aggregated data may include execution of artificial intelligence (AI) routines.
  • AI routines may include, by way of non-limiting example: structured algorithms and unstructured queries operative to predict Performance metrics and maintenance needs. AI routines may access both initial designs and data aggregated during build and deployment stages of the Property.
  • FIG. 1E illustrates a diagram of a user and directional image data.
  • FIGS. 3A-3F are illustrations of exemplary aspects of collecting and displaying data of a Processing Facility generated during construction of the Processing Facility.
  • FIG. 8 illustrates method steps that may be implemented according to some aspects of the present invention.
  • Smart Device as used herein includes an electronic device including, or in logical communication with, a processor and digital storage and capable of executing logical commands.
  • location identifiers may include, by way of non-limiting example, RFID chips, visual markings (e.g., hash tags or barcodes), pins or other accurately placed indicators. Placement of the location identifiers may be included in the AVM and referenced as the location of the physical user device is determined. As described above, specific location identifiers may be referenced in the context of GPS coordinates or other more general location identifiers.
  • transmissions from one or more location identifiers 121A may be controlled via one or more of: encryption; encoding; password protection; private/public key synchronization or other signal access restriction.
  • Control of access to location identifiers 121A may be useful in multiple respects, for example, a location identifier may additionally function to provide access to data, a distributed network and/or the Internet.
  • details of a proposed use of a structure and parcel may be input into a design module and used to specify or recommend features to be included in an Augmented Virtual Model 100.
  • Structure automation devices may include, by way of non-limiting example one or more of: automated locks or other security devices; thermostats, lighting, heating, chemical processing, cutting, molding, laser shaping, 3D printing, assembly, cleaning, packaging and the like.
  • a structure with recorded As Built design features and vibration sensors may track activities in a structure and determine that a first occupant associated with a first vibration pattern of walking is in the structure. Recorded vibration patterns may indicate that person one is walking down a hallway and automatically turn on appropriate lighting and adjust one or more of: temperature, sound and security.
  • Security may include locking doors for which person one is not programmed to access.
  • a first pattern of vibration may be used to automatically ascertain that a person is traversing an area of a structure for which a high level of security is required or an area that is designated for limited access due to safety concerns.
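  • One way to sketch the vibration-pattern matching described above is a nearest-neighbor comparison of a measured signature against stored ones. Everything here (the feature choices, occupant names, signature values and distance threshold) is an invented illustration of the general idea, not the patent's method:

```python
import math

# Hypothetical stored gait-vibration signatures: occupant -> feature vector
# (e.g. dominant frequency in Hz, step interval in s, relative amplitude)
SIGNATURES = {
    "occupant_1": (2.1, 0.55, 0.8),
    "occupant_2": (1.7, 0.70, 1.3),
}

def identify_occupant(measured, max_distance=0.5):
    """Match a measured vibration feature vector against recorded
    patterns; return the closest occupant, or None if no stored
    signature is within max_distance (an unrecognized person)."""
    best, best_d = None, float("inf")
    for name, sig in SIGNATURES.items():
        d = math.dist(measured, sig)  # Euclidean distance
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_distance else None

who = identify_occupant((2.0, 0.57, 0.85))
```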
  • Once As Built data has been collected, other structure automation may be similarly deployed according to As Built data, occupant profiles, biometric data, time of day, or other combinations of available sensor readings.
  • the AVM 201 may take into account a proposed usage of a Deployment of a Structure based upon values for Deployment variables, and specify aspects of one or more of: Machines 211; building support 212; and utility support 213 based upon one or both of a proposed usage and values for Deployment variables.
  • Proposed usage may include, for example, how many human resources will occupy a Structure, demographics of the resources that will occupy the Structure; percentage of time that the Structure will be occupied, whether the Structure is a primary residence, whether the Structure is a leased property and typical duration of leases entered into, environmental conditions experienced by the Structure, such as exposure to ocean salt, Winter conditions, desert conditions, high winds, heavy rain, high humidity, or other weather conditions.
  • a Total Cost of Deployment 214 may change based upon a time period 215 used to assess the Total Cost of Deployment 214.
  • a ROI may include one or more of: a rental value that may produce a revenue stream, a resale value, a cost of operation, real estate taxes based upon structure specifications and almost any other factor that relates to one or both of a cost and value.
  • a designed Structure is ultimately built at a site on a real estate parcel.
  • a build process may be specified and provide metrics that may be used in a process designed by an AVM 201 and also used as a physical build proceeds.
  • time factors associated with a physical build may be important, and in some examples time factors associated with a physical build may be estimated, measured and acted upon as they are generated in a physical build process.
  • time factors may include one or more of: a time to develop and approve site plans; a time to prepare the site and locate community provided utilities or site provided utilities; a time to lay foundations; a time to build structure; a time to finish structure; a time to install internal utilities and facilities related aspects; a time to install, debug, qualify and release equipment; times to start production runs and to certify compliance of production are all examples of times that can be measured by various techniques and sensing equipment on a Structure’s site.
  • Various time factors for a build are valuable and may become increasingly valuable as a physical build proceeds since the monetary investment in the project builds before revenue flows and monetary investments have clearly defined cost of capital aspects that scale with the time value of money.
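  • The time-value-of-money point above can be illustrated by discounting a build's cost schedule at a cost of capital; the cash flows and rate below are hypothetical, and the patent does not specify a discounting model:

```python
def present_cost(cash_flows, annual_rate):
    """Discount a schedule of build-phase costs back to project start.
    cash_flows: list of (year_offset, cost) pairs; annual_rate: cost of
    capital. Costs incurred later weigh less in present-value terms,
    which is why early-stage spending dominates Total Cost of Deployment
    when time factors are weighted to real costs."""
    return sum(cost / (1 + annual_rate) ** t for t, cost in cash_flows)

# Hypothetical schedule: foundations in year 0, structure in year 1,
# equipment install in year 2, at an 8% cost of capital
pv = present_cost([(0, 500_000), (1, 300_000), (2, 200_000)], 0.08)
```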
  • Various build steps may include material flows of various types. Material flow aspects may be tracked and controlled for cost and efficiency. Various materials may lower a build materials cost, but raise time factors to complete the build. Logical variations may be calculated and assessed in an AVM 201 and optimal build steps may be generated and/or selected based upon a significance placed upon various benefits and consequences of a given variable value. Physical build measurements and/or sensing on physical build projects may also be used as input in an assessment of economic trade-offs.
  • an initial efficiency of design which incurs large costs at early stages of a project may have a dominant impact on Total Cost of Deployment 214 when time factors are weighted to real costs.
  • the ability of a Structure to be flexible over time and to be changed in such flexible manners, where such changes are efficiently designed may dominate even if the initial cost aspects may be less efficient due to the need to design in flexibility.
  • estimates on the expected dynamic nature of demands on a Structure may be modeled against the cost aspects of flexibility to model expectations of Total Cost of Deployment 214 given a level of change.
  • Temperatures may be monitored by thermocouples, semiconductor junction based devices or other such direct measurement techniques.
  • temperature and heat flows may be estimated based on photon based measurement, such as surveying the Structure with infra-red imaging or the like.
  • Utility load may be monitored on a Structure-wide basis and/or by point-of-use monitoring equipment located at hubs or at individual pieces of equipment.
  • Flow meters may be inline, or external to pipes, wires or conduits. Gases and liquid flows may be measured with physical flow measurements or sound based measurement. In other examples, electricity may be monitored as direct current measurements or inferred inductive current measurement.
  • the nature and design of standard usage patterns of a Structure and an associated environment may have relevance to Total Cost of Ownership. For example, usage that includes a larger number of ingress and egress will expose an HVAC system to increased load and usage that includes a significant number of waking hours with inhabitants in the building may incur increased usage of one or more of: machinery 211; building support devices 212; and utilities 234.
  • vibration monitoring Sensors may indicate various activities that take place within the structure and facilitate more accurate modeling of a life expectancy of various aspects of the structure as well as machines located within the structure.
  • Noise levels are another type of vibrational measurement which is focused on transmission through the atmosphere of the Structure.
  • noise may emanate from one location after moving through solid structure from its true source at another location.
  • measurement of ambient sound with directional microphones or other microphonic sensing types may be used to elucidate the nature and location of noise emanations.
  • other study of the noise emanations may lead to establishment of vibrational measurement of different sources of noise.
  • Floors, ceilings, doorways, countertops, windows and other aspects of a Structure may be monitored in order to quantify and extrapolate noise levels.
  • Noise and vibrational measurement devices may be global and monitor a region of a Structure, or they may be inherently incorporated into or upon individual equipment of the Structure.
  • an energy source for heating, cooling, humidification and dehumidification equipment may be modeled and managed.
  • a source of energy used may be one or more of: electricity, natural gas, propane or fuel oil.
  • Emergency backup may also be modeled and managed.
  • Solar and fuel based energy consumption may be modeled and controlled based upon market forecasts. Estimates may be periodically adjusted according to world and/or market events.
  • a plurality of information may be thus easily accessible inside the Structure, and may be used for a variety of functions, including finding a specific machine to then diagnose and service a problem, regular inspection of equipment, guided tours of the Structure, or many other functions.
  • This information may be conveyed to the individual in a plurality of possible formats, such as lists that show up on the screen, clickable icons that show up next to the equipment in a Virtual Reality (“VR”) camera feed, or many other possibilities.
  • the user may receive a plurality of information, instructions, etc. while the user is proximate to the various aspects of the structures. For example, the user may observe the machines themselves, seeing them work, hearing the sounds they make, etc., to better inspect or service, among other possible functions, the Structure’s equipment.
  • Similar travel, guidance, or inspection capabilities for a functional Structure may be achieved completely remotely from the Structure itself. Additionally, with VR systems, these capabilities may occur prior, during, or after the construction and deployment of a Structure.
  • Various examples of data to be acquired, relating to life expectancy of equipment may include, but is not limited to, hours of operation, conditions of operation (whether and how long the equipment may be running under capacity, at rated capacity, or over capacity), or many environmental conditions for operation; environmental conditions may include the ambient temperature (or the difference in ambient temperature from an ideal or other measured value), ambient humidity (or the difference in ambient humidity from an ideal or other measured value), ambient air particulate content (or a comparison of the current air particulate level to a filter change schedule), presence or concentration of ambient gasses (if relevant) such as carbon dioxide, or other gas, a number of times of ingress or egress into the Structure which may change ambient conditions or other trackable data.
  • Data aggregated and stored for reference in calculation of Cost of Upkeep considered in a TOC may include data related to some or all of:
  • GPS may be used in combination with other location technologies
  • functionality may therefore include modeled and tracked Performance of a Structure and equipment contained within the Structure, including consumables 233 used and timing of receipt and processing of consumables; modeled and actual maintenance 232, including quality of maintenance performed; equipment Performance including yields; Consumables 233 tracking may include a frequency of replacement and quantity of replaced consumables; Utilities 234 tracking may include projected and actual units of energy consumed.
  • Structure related information may also include features generally related to a structure such as underground plumbing locations, stud locations, electrical conduit and wiring, vertical plumbing piping, and HVAC systems or other duct work.
  • the acquisition of the data may allow the model system to accurately locate these interior and exterior features. Acquisition of As Built data during different points of the construction completion allows measurements to be taken prior to aspects involved in a measurement process being concealed by concrete, sheetrock or other various building materials.
  • Data is acquired that is descriptive of actual physical features as the features are built and converted into a 3D model which may be referred to as the “As Built” model.
  • the As Built model will include “key components” of the structure and be provided with a level of artificial intelligence that fully describes the key component.
  • the As Built model may be compared to a design model.
  • “intelligent parameters” are associated with key components within the 3D model. For example, key components and associated information may further be associated with intelligent parameters.
  • Key components of the structure may have an identification device attached, such as a two- or three-dimensional graphical code (such as a QR code label) or a Radio Frequency Identification (RFID) chip, that is accessible to a user, such as a structure owner, structure builder or service technician.
  • a user interface on a display of various types, such as a tablet may use the associated identification, such as a QR code, to provide direct access to related information.
  • the display may show textual or tabular representations of related data.
  • key components may include doors, windows, masonry, roofing materials, insulation, HVAC equipment and machinery.
  • Because the As Built model includes information in a database and dynamic model functionality that commences as a building structure is being constructed, the model may assume new support aspects to the construction process itself.
  • a benefit from the definition and utilization of many components within a Structure utilizing the system herein includes the ability to pre-cut and/or pre-fabricate studs and framing, roofing cuts, masonry, under-slab plumbing, HVAC ductwork, electrical, and other such components.
  • the dimensions of these various components may be dynamically updated based on an original model that may be compared to actual fabricated structure as realized on a building site.
  • a structure builder may use a display interface associated with the system and model to display a comparison of an original set of building plans to a current structure at a point in time which may allow the builder to authorize any structural changes or variances to design and thereafter allow the description of following components to be dynamically adjusted as appropriate.
  • the system may be of further utility to support various inspections that may occur during a building project which may associate detected variances with design expert review and approval. An inspector may be able to utilize the system as allowed on site or operate a window into the system from a remote location such as his office.
  • the AVM system can autonomously and/or interactively obtain, store and process data that is provided to it by components of the Structure as the structure is built, installed or additions are made to the structure.
  • the generation, modeling, capture, use, and retention of data relating to Performances in specific equipment or in some cases aspects relating to the design of a facility, may be monitored by the system.
  • Operational Performance may be assessed by processing sampled data with algorithms of various kinds. Feedback of the status of operation and of the structure as a whole or in part, as assessed by algorithmic analysis may be made to a structure owner or a structure builder. In addition, a variety of data points gathered via appropriate Sensors, visual and sound data may be recorded and stored and correlated to 3D models of the facility. Experiential Sensor readings may include, by way of non-limiting example: temperature, power usage, utilities used, consumables, product throughput, equipment settings, and equipment Performance measurement, visual and audible data.
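  • As a stand-in for the unspecified "algorithms of various kinds" above, a minimal z-score check against a calibration baseline illustrates how sampled Sensor readings might be flagged for feedback to a structure owner or builder; the readings, window and threshold are invented:

```python
from statistics import mean, stdev

def detect_anomalies(baseline, samples, z_threshold=3.0):
    """Flag readings whose z-score against a calibration baseline
    exceeds a threshold. Returns (index, value) pairs for samples that
    deviate enough from normal operating conditions to warrant an
    alert; a sketch only, not the patent's specified processing."""
    m, s = mean(baseline), stdev(baseline)
    if s == 0:
        return []  # no variation in baseline; cannot score deviations
    return [(i, x) for i, x in enumerate(samples) if abs(x - m) / s > z_threshold]

# Hypothetical temperature readings: a calibration window, then live samples
calibration = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1]
alerts = detect_anomalies(calibration, [20.2, 35.0, 20.1])
```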
  • data may also be combined with manufacturer equipment specifications and historical data to model expectations related to actual operation of the structure and property aspects.
  • a 3D model of a structure, which may be integrated with information related to the key components and laser scanned location information, may be made available to the structure owner/structure builder through a computer, an iPad or tablet, or smart device.
  • the resulting system may be useful to support virtual maintenance support.
  • a viewable section of the model may be displayed through the viewing medium (whether on a screen, or through a viewing lens), where the viewer’s perspective changes as the accelerometer equipped device moves, allowing them to change their view of the model.
  • the viewer’s Vantage Point may also be adjusted, through a certain user input method, or by physical movement of the user, as non-limiting examples.
  • the presented view may be supplemented with “hidden information”, which may include, for example, depictions of features that were scanned before walls were installed including pipes, conduits, ductwork and the like. Locations of beams, headers, studs and building structure may be depicted. In some examples, depiction in a view may include a superposition of an engineering drawing with a designed location, in other examples images of an actual structure may be superimposed upon the image based upon As Built scans or other recordations.
  • if a structure owner/structure builder desires information related to a machine, it may be found by positioning a device with a location determining device within it in proximity to the machine and accessing the parallel model in the Virtual Structure, such as by clicking on the machine in the Virtual Structure model or by scanning the Code label attached to the machine.
  • an internet of things equipped machine may have the ability to pair with a user’s viewing screen and allow the system model to look up and display various information.
  • the user may have access to various intelligent parameters associated with that machinery such as service records, a manual, service contract information, warranty information, consumables recommended for use such as detergents, installation related information, power hooked up and the like.
  • an AVM system may include interfaces of various kinds to components of the system. Sensors and other operational parameter detection apparatus may provide routine feedback of information to the model system. By processing the data stream with various algorithms, an autonomous characterization of operating condition may be made. Accordingly, the AVM system may provide a user with alerts when anomalies in system Performance are recognized.
  • standard structure maintenance requirements may be sensed or tracked based on usage and/or time and either notification or in some cases scheduling of a service call may be made.
  • the alert may be sent via text, email, or both. The structure user may, accordingly, log back into the Virtual Structure to indicate completion of a maintenance task; or as appropriate a vendor of such service or maintenance may indicate a nature and completion of work performed.
  • a Virtual Structure may take additional autonomous steps to support optimal operation of a system.
  • a Virtual Structure may take steps to order and facilitate shipping of anticipated parts needed for a scheduled maintenance ahead of a scheduled date for a maintenance event (for example, shipping a filter ahead of time so the filter arrives prior to the date it is scheduled to be changed).
  • a Virtual Structure may recall notes from an Original Equipment Manufacturer (OEM) that could be communicated to a user through the Virtual Structure.
  • a Virtual Structure may support a user involved in a real estate transaction by quantifying service records and Performance of a real property.
  • the AVM may establish a standard maintenance and warranty program based on manufacturers published data and the ability to advise structure owners of upcoming needs and/or requirements.
  • the model system may facilitate allowing for structure builders, rental companies, or maintenance companies to consolidate information for volume discounts on parts or maintenance items.
  • the model system may also facilitate minimizing unnecessary time expenditure for structure builders, such as needless service calls for warranty issues, and may allow structure builders and rental companies attempting to sell a structure or a rental to demonstrate that care has been taken to maintain the structure.
  • Benefits derived from monitoring and tracking maintenance with a Virtual Structure may include positively reassuring and educating lenders and/or lien holders that their investment is being properly cared for.
  • insurance companies may use access to a Virtual Structure to provide factual support that their risk is properly managed.
  • a data record in a Virtual Structure model system and how an owner has cared for their facility may be used by insurance companies or lenders to ensure that good care is being taken.
  • Maintenance records demonstrating defined criteria may allow insurance companies to offer a structure owner a policy discount, analogous, for example, to a discount for installation of an alarm system.
  • access to a Virtual Structure may allow municipalities and utilities to use the information for accurate metering of utility usage without having to check manually; and peaks in utility demand may be more accurately anticipated.
  • a Virtual Structure may also be used to assist with structure improvement projects of various types.
  • the structure improvement projects may include support for building larger additions and modifications and for implementing landscaping projects. Smaller projects may also be assisted, including, in a non-limiting example, a project such as hanging a picture, which may be made safer and easier with the 3D “as-built” point cloud information.
  • Hidden water piping, electrical conduits, wiring, and the like may be located, or virtually “uncovered”, based on the model database.
  • KPIs Key Performance Indicators
  • an HVAC system may be added to a facility during construction, and simultaneously a Performance monitor may be added to the HVAC system.
  • the Performance monitor may be used to monitor various KPIs for an HVAC system.
  • KPIs may include outdoor air temperature, discharge air temperature, discharge air volume, electrical current, and the like. Similar monitoring capabilities may be installed to all machinery and utilities systems in a facility. The combination of these numerous system monitors may allow for a fuller picture of the efficiency of operations of various systems.
  • Use of the Virtual Structure may allow owners to receive periodic reports, such as, in a non-limiting example, monthly emails which may show their current total energy consumption as well as a breakdown of which key components are contributing to the current total energy consumption.
  • the systems presented herein may be used by owners and facility managers to make decisions that may improve the cost effectiveness of the system.
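The periodic energy report described above could be assembled from monitor readings along the following lines. This is an illustrative sketch only; the function name and the (component, kWh) data shape are assumptions, not part of the disclosure.

```python
from collections import defaultdict

def energy_report(readings):
    """Summarize (component, kWh) samples into a total and per-component shares."""
    by_component = defaultdict(float)
    for component, kwh in readings:
        by_component[component] += kwh
    total = sum(by_component.values())
    # Each component's fraction of total consumption for the reporting period.
    shares = {c: kwh / total for c, kwh in by_component.items()} if total else {}
    return {"total_kwh": total, "breakdown": shares}

# Hypothetical month: HVAC dominates consumption.
report = energy_report([("HVAC", 300.0), ("lighting", 100.0), ("HVAC", 100.0)])
# report["total_kwh"] == 500.0; report["breakdown"]["HVAC"] == 0.8
```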
  • An additional service for Owners may allow the structure owner to tap into energy saving options as their structure ages. As an example, if a more efficient HVAC system comes on the market, which may include perhaps a new technology node, the user may receive a “Savings Alert”. Such an alert may provide an estimated energy savings of the recommended modification along with an estimate of the cost of the new system. These estimates may be used to generate a report to the owner of an estimated associated return-on-investment or estimated payback period should the structure owner elect to replace their HVAC system.
  • an AVM of a Virtual Structure may set a threshold value for the required ROI, above which the owner may be interested in receiving such an alert when that ROI is achieved. This information will be based on data derived from actual operating conditions and actual historical usage, as well as current industry information. Predictive maintenance and energy savings may be provided to key systems via Smart Structure Total Cost of Ownership (“TCO”) branded Sensors.
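The ROI-thresholded “Savings Alert” described above could be screened as follows; a minimal sketch under assumed inputs (annual costs and upgrade cost in the same currency), with simple annual ROI, none of which is mandated by the disclosure.

```python
def savings_alert(current_annual_cost, new_annual_cost, upgrade_cost, roi_threshold):
    """Return an alert record when estimated annual ROI meets the owner's
    threshold, otherwise None. ROI is simple: annual savings / upgrade cost."""
    savings = current_annual_cost - new_annual_cost
    roi = savings / upgrade_cost
    if roi < roi_threshold:
        return None  # below the owner's configured interest threshold
    return {
        "estimated_annual_savings": savings,
        "estimated_cost": upgrade_cost,
        "estimated_roi": roi,
        "estimated_payback_years": upgrade_cost / savings,
    }

# A $6,000 HVAC upgrade saving $1,200/year: ROI 0.20, payback 5 years.
alert = savings_alert(4200.0, 3000.0, 6000.0, roi_threshold=0.15)
```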
  • the aggregation of data and efficiency experience from numerous systems may allow for analysis of optimization schemes for various devices, machinery and other structure components that includes real installed location experience. Analysis from the aggregated data may be used to provide feedback to equipment manufacturers, building materials fabricators and such suppliers.
  • Referring now to FIG. 3A, a depiction of a site for building a facility structure is illustrated.
  • the depiction may represent an image that may be seen from above the site.
  • Indications of property boundaries, such as corners 301 and property borders 302, are represented and may be determined based on site scanning with property markings from site surveys or may be entered based on global coordinates for the property lines.
  • An excavated location 303 may be marked out.
  • Roadways, parking and/or loading areas 304 may be located. Buried utilities such as buried telephone 305, buried electric 306, buried water and sewer 307 are located in the model as illustrated. In some examples, such other site service as a buried sprinkler system 308 may also be located.
  • Referring now to FIG. 3C, a wall 331 of the Structure in the process of being built is illustrated.
  • the structure may be scanned by a scanning element 330.
  • a laser three dimensional scanner may be used.
  • the wall may have supporting features like top plates 333, headers 336, studs 332, as well as internal items such as pipes 334, electrical conduits and wires 335.
  • There may be numerous other types of features within walls that may be scanned as they occur such as air ducts, data cables, video cables, telephone cables, and the like.
  • the wall may be completed, and structure components behind the wall facing 340 may no longer be visible. Electrical outlets 341 and door structures 342 may be scanned by a scanning element 330.
  • internal components such as machinery may be installed. As a non-limiting example, a machine 350 may be installed and the resulting three dimensional profiles may be scanned by a scanning element 330.
  • an operational monitor 351 may be attached to the machinery.
  • an operational monitor may be part of the machinery. The operational monitor may have the ability to communicate 352 data to various receivers that may be connected to the model system of the residence.
  • key structural components, such as doors may have identifying devices such as a QR label 353. The label may be visible or painted into the structure with non- visible paint. The identifying devices may provide information related to the device itself and warrantees of the device as non-limiting examples.
  • the model may include the various structure elements hidden and visible and may be used to create output to a display system of a user.
  • Referring now to FIG. 3F, an example display is illustrated.
  • the various non-visible layers may be shown by rendering the covering layers with a transparency.
  • the display shows the machine profile 350 as well as internal features that may be concealed, such as pipes 334, electrical conduits with wires 335, and headers 336.
  • a wall that has been scanned with an HVAC unit 360 may include a Performance Monitor 351 which may communicate various information wirelessly 352.
  • the communication may be received at an antenna 370 of a router 371 within the facility.
  • the facility may be interconnected through the internet 372 to a web located server 373 which processes the communication.
  • the web located server 373 also can include the various model data about the facility and it can provide composite displays that can summarize the structure as well as the operational Performance of the HVAC unit 360. It may aggregate the various data into textual and graphic reports. In some examples it may communicate these reports back through internet connections.
  • wireless Smart Device communications may be sent to cellular towers 374 which may transmit 375 to a Smart Device 376 of a user associated with the facility.
  • Referring now to FIG. 3H, an illustration of a virtual reality display in concert with the present invention is provided.
  • a machinery 350 of the facility may communicate information to the model server.
  • a user 380 may receive an integrated communication from the server.
  • the resulting communication may be provided to a virtual reality headset 381.
  • the virtual reality headset may provide a display 382 to the user that provides a three-dimensional view of the physical data as well as simulated imagery that may allow views through objects to hidden elements behind the object.
  • a heads up type display of information about an object may be superimposed.
  • Deployment aspects may be specified for a Structure and incorporated into a virtual model, such as an AVM discussed above.
  • Deployment aspects may include, for example, a purpose for an As Built structure that is built based on the AVM.
  • the purpose may include, by way of non-limiting example, one or more of: manufacturing, processing, data processing, health care, research, assembly, shipping and receiving, prototyping and the like.
  • Deployment aspects may also include a level of use, such as continual, shift-scheduled, or periodic.
  • a climate in which the structure will be placed may also be considered in the Deployment aspects.
  • climate may include one or more of: four seasons; primarily winter; tropical, desert; exposed to salt air; and other environmental factors.
  • Performance aspects of machinery that may be included in the AVM may be digitally modeled and may include a level of use of the machinery and an expected satisfaction of the machinery as deployed according to the Deployment aspects.
  • Maintenance expectations, including a number of repair calls and a preventive maintenance schedule may also be modeled and associated costs.
  • Performance aspects of equipment that may be included in the AVM may be digitally modeled and may include a level of use of the equipment and an expected satisfaction of the equipment as deployed according to the Deployment aspects. Maintenance expectations, including a number of repair calls and a preventive maintenance schedule, may also be modeled, along with associated costs.
  • As Built aspects of a structure are recorded as discussed herein; preferably, recordation of As Built aspects begins as construction begins and continues throughout the existence of the structure.
  • the physical structure may be identified via a location.
  • a physical location may include, for example, Cartesian Coordinates, such as Latitude and Longitude coordinates, GPS coordinates, or other verifiable set of location parameters.
  • more exact location specifications may include survey designations.
  • a position within or proximate to the Structure may be determined via positioning identifiers.
  • the position within or proximate to the Structure may be determined.
  • an update may be made to a physical Structure and at method step 410, the update to the physical structure may be recorded and reflected in the AVM.
  • a method flow diagram for monitoring and maintenance is illustrated.
  • a user may obtain a scanning device or devices that may scan a building site.
  • the user or a service of the user may mark property boundaries of the site.
  • work on the site may continue with the excavation of a building base and the laying down of utilities and other buried services.
  • the scanning device is used to scan the location of the various aspects of the building site.
  • work may continue with the laying of footings and foundations and other such foundational building activities.
  • scanning of the footings and foundations may be accomplished.
  • a structure may be framed, and features such as pipe conduit, electrical wiring, communications wiring, and the like may be added.
  • the building site may again be scanned to locate the various elements.
  • the framing of the residence may commence along with running of pipe, wiring, conduits, ducts and various other items that are located within wall structures.
  • the framed structure may be scanned at 418. Thereafter, the framed structure may be enclosed with walls 419. Thereafter, the walls may be scanned with the scanning device at step 420.
  • a Structure may already be built and may have various data layers already located in the model system.
  • machinery may be added to the Structure.
  • an ID tag, a QR tag, an RFID tag, or an internet of things device may be associated with the machinery and may be programmed into the model system.
  • the model system may be interfaced to the machinery ID and into the Structure model.
  • a scanning step may be used to input three dimensional structure data at the installed location into the model system.
  • an operational monitor function of the device may be added or activated.
  • operational data may be transferred from the operational monitor to the server with the Structure model.
  • a state may include, for example, one or more of: a vibration measured with an accelerometer; a temperature of at least a portion of the structure; an electrical current measurement to equipment installed in the Structure; a number of cycles of operation of equipment installed in the Structure; a number of cycles of operation of machinery installed in the Structure; an electrical current measurement to machinery installed in the Structure; a vibration associated with movement of an occupant of the Structure.
  • a vibration pattern may be associated with a specific occupant and tracking the movement of the specific occupant through the structure may be based upon measured vibration patterns.
  • a vibration pattern may be associated with a particular activity of a specific occupant and the activity of the specific occupant may be tracked within the structure based upon measured vibration patterns.
  • FIG. 5 illustrates location and positioning identifiers 501-504 that may be deployed in a Structure according to some embodiments of the present invention to determine a user position 500 within or proximate to the Structure 505.
  • Positioning identifiers may include a device that is fixed in a certain location and may be used to determine via calculation a position of a user with a tablet, smart phone or other network access device able to recognize the position identifiers.
  • the position identifiers 501-504 may include devices, such as, for example, a radio transmitter, a light beacon, or an image recognizable device.
  • a radio transmitter may include a router or other WiFi device.
  • a position identifier may include a WiFi router that additionally provides access to a distributed network, such as the Internet.
  • Cartesian Coordinates, such as a GPS position 506, may be utilized to locate and identify the Structure 505.
  • a precise location may be determined via triangulation based upon a measured distance from three 501-503 or more position identifiers 501-504. For example, a radio transmission or light signal may be measured and compared from the three reference position identifiers 501-503.
  • Other embodiments may include a device recognizable via image analysis and a camera or other Image Capture Device, such as a CCD device, may capture an image of three or more position identifiers 501-504.
  • Image analysis may recognize the identification of each of three or more of the position identifiers 501-504 and a size ratio of the respective image captured position identifiers 501-504 may be utilized to calculate a precise position.
  • a height designation may be made via triangulation using the position identifiers as reference to a known height or a reference height.
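One way to realize the distance-based position fix described above is planar trilateration: subtracting the circle equations for pairs of position identifiers linearizes the problem into a solvable system. The sketch below is illustrative only; it assumes the three identifiers' coordinates and measured distances have already been resolved in a common planar frame (whether the distances come from radio timing, signal strength, or image size ratios).

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    """2D fix from three identifier positions (x, y) and measured distances.

    Subtracting pairs of circle equations (x-xi)^2 + (y-yi)^2 = ri^2
    yields a linear system A @ [x, y] = c, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2.0 * (x3 - x2), 2.0 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("position identifiers are collinear; no unique fix")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

A fourth identifier, as in the figure, would allow a least-squares refinement or a consistency check on the three-identifier fix.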
  • the processor 620 is also in communication with a storage device 630.
  • the storage device 630 may comprise any appropriate information storage device, including combinations of magnetic storage devices (e.g., magnetic tape and hard disk drives), optical storage devices, and/or semiconductor memory devices such as Random Access Memory (RAM) devices and Read Only Memory (ROM) devices.
  • the storage device 630 can store a software program 640 with executable logic for controlling the processor 620.
  • the processor 620 performs instructions of the software program 640, and thereby operates in accordance with the present invention.
  • the processor 620 may also cause the communication device 610 to transmit information, including, in some instances, control commands to operate apparatus to implement the processes described above.
  • the storage device 630 can additionally store related data in a database 650 and database 660, as needed.
  • a microphone 710 and associated circuitry may convert the sound of the environment, including spoken words, into machine-compatible signals.
  • Input facilities may exist in the form of buttons, scroll wheels, or other tactile Sensors such as touch-pads.
  • input facilities may include a touchscreen display.
  • Visual feedback to the user is possible through a visual display, touchscreen display, or indicator lights.
  • Audible feedback 734 may come from a loudspeaker or other audio transducer.
  • Tactile feedback may come from a vibrate module 736.
  • a motion Sensor 738 and associated circuitry convert the motion of the mobile device 702 into machine-compatible signals.
  • the motion Sensor 738 may comprise an accelerometer that may be used to sense measurable physical acceleration, orientation, vibration, and other movements.
  • motion Sensor 738 may include a gyroscope or other device to sense different motions.
  • the mobile device 702 comprises logic 726 to interact with the various other components, possibly processing the received signals into different formats and/or interpretations.
  • Logic 726 may be operable to read and write data and program instructions stored in associated storage or memory 730 such as RAM, ROM, flash, or other suitable memory. It may read a time signal from the clock unit 728.
  • the mobile device 702 may have an on-board power supply 732. In other embodiments, the mobile device 702 may be powered from a tethered connection to another device, such as a Universal Serial Bus (USB) connection.
  • a reader may scan some coded information from a location marker in a facility with the mobile device 702.
  • the coded information may include for example a hash code, bar code, RFID or other data storage device.
  • the scan may include a bit-mapped image via the optical capture device 708.
  • Logic 726 causes the bit-mapped image to be stored in memory 730 with an associated time-stamp read from the clock unit 728.
  • Logic 726 may also perform optical character recognition (OCR) or other post-scan processing on the bit-mapped image to convert it to text.
  • a reader may capture some text from an article as an audio file by using microphone 710 as an acoustic capture port.
  • Logic 726 causes the audio file to be stored in memory 730.
  • Logic 726 may also perform voice recognition or other post-scan processing on the audio file to convert it to text.
  • the reader may then upload the audio file (or text produced by post-scan processing performed by logic 726) to an associated computer via network interface 716.
  • a directional sensor 741 may also be incorporated into the mobile device 702.
  • the directional device may be a compass and may be based upon a magnetic reading or upon network settings.
  • a processor may generate an AVM model of a Structure.
  • the AVM model may be based upon a physical layout of the Structure and include a layout of each item of machinery, equipment as well as facility features.
  • the AVM may receive data indicative of one or more performance metrics. Data may include data generated via a sensor and/or input by a user. In some examples, data may include performance metrics, utility cost, maintenance cost and replacement cost.
  • a data connection between a deployed facility and an AVM may be automated to generate and transmit data to the model on an automated basis without human intervention or artificial delay. All or some data may be stored in a storage.
  • the AVM may access received and/or historical data from the same or other AVM models.
  • Artificial Intelligence routines or other logic may integrate relevant indices, including one or more of: geographic location, labor organization, market conditions, labor costs, physical conditions, property status or data descriptive of other variables.
  • an AVM may generate a value for build and deployment cost, and at step 807 the AVM may include utility and consumables cost.
  • an AVM may generate one or more of: predicted and actual quantifications from the structure; energy consumption and process throughput.
  • the wearable display 905 may comprise a set of goggles or glasses, wherein the goggles or glasses may comprise one or more lenses.
  • a single wrapped lens may allow a user to experience panoramic views.
  • dual lenses may provide different image data, wherein the combined images may allow the user to have stereoscopic perception of the performance event.
  • the wearable display 905 may comprise a helmet, which may allow for more detailed immersion.
  • a helmet may allow for temperature control, audio isolation, broader perspectives, or combinations thereof.
  • the wearable display may comprise an accelerometer configured to detect head movement.
  • the accelerometer may be calibrated to the natural head movements of a user 1000.
  • the calibration may allow the user to tailor the range to the desired viewing area. For example, a user may be able to move their head 110° comfortably, and the calibration may allow the user to view the entire 180° relative to the natural 110° movement.
  • the wearable display may be configured to detect vertical motions.
  • a user may look up to shift the viewing area to a range in the positive y axis grids, and the user may look down to shift the viewing area to a range in the negative y axis grids.
  • the wearable display may be configured to detect both horizontal and vertical head motion, wherein the user may be able to have almost a 270° viewing range.
  • the wearable display may be able to detect 360° of horizontal movement, wherein the user may completely turn around and change the neutral viewing range by 180°.
  • the wearable display may be configured to detect whether the user may be sitting or standing, which may shift the perspective and viewing area.
  • a user may be allowed to activate or deactivate the motion detection levels, based on preference and need. For example, a user may want to shift between sitting and standing throughout the experience without a shift in perspective.
  • the wearable display may further comprise speakers, wherein audio data may be directed to the user.
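The head-movement calibration described above (a comfortable 110° swing mapped onto a 180° viewing range) can be sketched as a clamped linear scaling. The function name is an assumption; the default figures are the example values from the text.

```python
def calibrated_view_angle(head_angle_deg,
                          comfortable_range_deg=110.0,
                          view_range_deg=180.0):
    """Map a head rotation within the user's comfortable range onto the full
    viewing range, clamping so over-rotation saturates at the edges."""
    half_head = comfortable_range_deg / 2.0
    half_view = view_range_deg / 2.0
    # Clamp the measured head angle to the calibrated comfortable swing.
    clamped = max(-half_head, min(half_head, head_angle_deg))
    # Scale linearly: +/-55 degrees of head motion spans +/-90 degrees of view.
    return clamped * (half_view / half_head)
```

The same mapping could be applied independently to the vertical axis, or recalibrated per user as the text suggests.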
  • a User 1200 may position a Smart Device 1205 in a first position 1201 proximate to a portion of a structure for which the User 1200 wishes to retrieve and display a representation in the AVM.
  • the first position 1201 of the Smart Device 1205 may be determined (as discussed herein via GPS and/or triangulation) and recorded.
  • the vector may have a length determined by the AVM that is based upon a length of a next Feature in the AVM located in the direction of the generated vector.
  • the vector will represent a distance 1203 from the second position 1202 to an item 1225 along the Z axis defined by a line between the first position 1201 and the second position 1202.
  • a ray will include a starting point and a direction.
  • the change in the Z direction is associated with a zero change in the X and Y directions.
  • the process may also include a second position 1202 that has a value other than zero in the X and/or Y directions.
  • a User 1200 may deploy a laser, accelerometer, sound generator or other device to determine a distance from the Smart Device 1205 to the feature, such as a piece of equipment.
  • Such unique methods of determining a location and direction of data capture may be utilized to gather data during construction of modeled buildings or other structures and during Deployment of the structures during the Operational Stage.
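The two-position capture described above determines a direction of interest: the ray from the first to the second Smart Device position, extended by a measured distance to the feature. A minimal sketch, assuming both positions and the distance are expressed in the same coordinate frame (names are illustrative):

```python
import math

def ray_to_target(first_pos, second_pos, distance):
    """Unit direction from the first device position to the second, and the
    point `distance` beyond the second position along that ray."""
    deltas = [b - a for a, b in zip(first_pos, second_pos)]
    norm = math.sqrt(sum(d * d for d in deltas))
    if norm == 0.0:
        raise ValueError("positions coincide; direction is undefined")
    unit = [d / norm for d in deltas]
    # Target feature lies `distance` further along the ray from the second fix.
    target = [s + u * distance for s, u in zip(second_pos, unit)]
    return unit, target

# Pure Z-axis motion (zero change in X and Y), as in the first example above.
direction, item_pos = ray_to_target((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0)
```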
  • An additional non-limiting example may include direction based identification; with a fixed location, or in tandem with a location means, a device may have capabilities to deduce orientation based information of the device. This orientation information may be used to deduce a direction that the device is pointing in. This direction based information may be used to indicate that the device is pointing to a specific piece of equipment 1225 that may be identified in the AVM.
  • a device with a controller and an accelerometer, such as a mobile Smart Device 1205, may be used.
  • the Smart Device determines a first position 1201 based upon triangulation with the reference points. The process of determination of a position based upon triangulation with the reference points may be accomplished, for example via executable software interacting with the controller in the Smart Device, such as, for example by running an app on the Smart Devices 1205.
  • some embodiments may include an electronic and/or magnetic directional indicator that may be aligned by a user in a direction of interest. Alignment may include, for example, pointing a specified side of a device, or pointing an arrow or other symbol displayed upon a user interface on the device towards a direction of interest.
  • triangulation may be utilized to determine a relative elevation of the Smart Device as compared to a reference elevation of the reference points.
  • An unmanned vehicle may include for example, an unmanned aerial vehicle (“UAV”) or ground level unit, such as a unit with wheels or tracks for mobility and a radio control unit for communication.
  • multiple unmanned vehicles may capture data in a synchronized fashion to add depth to the image capture and/or a three-dimensional and four-dimensional (over time) aspect to the captured data.
  • UAV position will be contained within a perimeter and the perimeter will have multiple reference points to help each UAV (or other unmanned vehicle) determine a position in relation to static features of a building within which it is operating and also in relation to other unmanned vehicles.
  • Still other aspects include unmanned vehicles that may not only capture data but also function to perform a task, such as paint a wall, drill a hole, cut along a defined path, or other function.
  • the captured data may be incorporated into an AVM.
  • captured data may be compared to a library of stored data using recognition software to ascertain and/or affirm a specific location, elevation and direction of an image capture location and proper alignment with the virtual model.
  • Still other aspects may include the use of a compass incorporated into a Smart Device.
  • additional apparatus and methods for determining a geospatial location and determination of a direction of interest may include one or both of an enhanced smart device and a smart device in logical communication with wireless position devices 1303-1310.
  • a smart device 1301 may be in logical communication with one or more wireless position devices 1303-1310 strategically located in relation to the physical dimensions of the smart device.
  • the smart device 1301 may include a smart phone or tablet device with a user interface surface 1320 that is generally planar.
  • the user interface surface 1320 will include a forward edge 1318 and a trailing edge 1319.
  • the smart device will be fixedly attached to a smart receptacle 1302.
  • the smart receptacle 1302 may include an appearance of a passive case, such as the type typically used to protect the smart device 1301 from a damaging impact.
  • the smart receptacle 1302 will include digital and/or analog logical components, such as wireless position devices 1303-1310.
  • the wireless position devices 1303-1310 include circuitry capable of receiving wireless transmissions from multiple wireless positional reference transceivers 1311-1314.
  • the wireless transmissions will include one or both of analog and digital data suitable for calculating a distance from each respective reference point 1311-1314.
  • the smart receptacle 1302 will include a connector 1315 for creating an electrical path for carrying one or both of electrical power and logic signals between the smart device 1301 and the smart receptacle 1302.
  • the connector 1315 may include a mini-USB connector or a Lightning connector. Additional embodiments may include an inductive coil arrangement for transferring power.
  • Embodiments may also include wireless transmitters and receivers to provide logical communication between the wireless position devices 1303-1310 and the smart device 1301. Logical communication may be accomplished, for example, via one or more of: Bluetooth, ANT, and infrared mediums.
  • Reference transceivers 1311-1314 provide wireless transmissions of data that may be received by wireless position devices 1303-1310. The wireless transmissions are utilized to generate a position of the respective wireless position devices 1303-1310. According to the present invention, the reference transceivers 1311-1314 providing the wireless transmissions to the wireless position devices 1303-1310 are associated with one or more of: a position in a virtual model; a geographic position; a geospatial position in a defined area, such as a Structure; and a geospatial position within a defined area (such as, for example, a Property).
  • a smart device may be placed into a case, such as a smart receptacle 1302 that includes two or more wireless position devices 1303-1310.
  • the wireless position devices 1303-1310 may include, for example, one or both of: a receiver and a transmitter, in logical communication with an antenna configured to communicate with reference transceivers 1311-1314.
  • Communications relevant to location determination may include, for example, one or more of: timing signals; SIM information; received signal strength; GPS data; raw radio measurements; Cell-ID; round trip time of a signal; phase; and angle of received/transmitted signal; time of arrival of a signal; a time difference of arrival; and other data useful in determining a location.
  • the wireless position devices 1303-1310 may be located strategically in the case 1302 to provide intuitive direction to a user holding the case 1302, and also to provide a most accurate determination of direction. Accordingly, a forward wireless position device 1303 may be placed at a top of a smart device case and a rearward wireless position device 1304 may be placed at a bottom of the smart device case 1302. In some embodiments, each of four corners of a case may include a wireless position device 1305, 1306, 1307, 1308. Still other embodiments may include a wireless position device 1309 and 1310 on each lateral side.
  • the present invention provides for determination of a location of two or more wireless positioning devices 1303-1310 and generation of one or more directional vectors 1317 and/or rays based upon the relative position of the wireless positioning devices 1303-1310.
  • where discussion of a vector does not include specific limitations as to the length of the vector and is primarily concerned with a direction, a ray of unlimited length may also be utilized.
  • multiple directional vectors 1317 are generated and a direction of one or more edges, such as a forward edge, is determined based upon the multiple directional vectors 1317.
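The determination of a directional vector from the relative positions of two wireless position devices, as described above, may be sketched as follows. This is a minimal illustration only; it assumes each device has already been resolved to Cartesian coordinates, and the function name and example coordinates are hypothetical rather than taken from the disclosure:

```python
import math

def directional_vector(rear, forward):
    """Unit vector pointing from a rearward device toward a forward device."""
    dx = forward[0] - rear[0]
    dy = forward[1] - rear[1]
    dz = forward[2] - rear[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    if norm == 0:
        raise ValueError("devices are co-located; direction is undefined")
    return (dx / norm, dy / norm, dz / norm)

# Hypothetical example: rearward device 1304 at the origin,
# forward device 1303 offset 15 cm along the x-axis.
v = directional_vector((0.0, 0.0, 0.0), (0.15, 0.0, 0.0))
```

A ray of unlimited length in the direction of interest is then simply the set of points origin + t·v for t ≥ 0.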
  • a geospatial location relative to one or more known reference points is generated.
  • the geospatial location in space may be referred to as having an XY position indicating a planar designation (e.g. a position on a flat floor), and a Z position (e.g. a level within a structure, such as a second floor) may be generated based upon indicators of distance from reference points. Indicators of distance may include a comparison of timing signals received from wireless references.
  • a geospatial location may be generated relative to the reference points.
  • a geospatial location with reference to a larger geographic area is associated with the reference points, however, in many embodiments, the controller will generate a geospatial location relative to the reference point(s) and it is not relevant where the position is located in relation to a greater geospatial area.
  • a position of a smart device may be ascertained via one or more of: triangulation; trilateration; and multilateration (MLT) techniques.
  • a geospatial location based upon triangulation may be generated based upon a controller receiving a measurement of angles between the position and known points at either end of a fixed baseline.
  • a point of a geospatial location may be determined based upon generation of a triangle with one known side and two known angles.
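The triangulation described above, in which a point is determined from one known side (the baseline) and two known angles, may be illustrated with the law of sines. The baseline length and angle values below are illustrative assumptions:

```python
import math

def triangulate(baseline, angle_a, angle_b):
    """Locate a point from a baseline of known length and the two angles
    (in radians) measured at either end of the baseline.

    The baseline runs from (0, 0) to (baseline, 0); returns (x, y).
    """
    angle_c = math.pi - angle_a - angle_b                      # angles sum to pi
    side_b = baseline * math.sin(angle_b) / math.sin(angle_c)  # law of sines
    return (side_b * math.cos(angle_a), side_b * math.sin(angle_a))

# Hypothetical example: a 10 m baseline with 45-degree angles at both ends.
x, y = triangulate(10.0, math.pi / 4, math.pi / 4)
```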
  • a geospatial location based upon trilateration may be generated based upon a controller receiving wireless indicators of distance and geometry of geometric shapes, such as circles, spheres, triangles and the like.
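A two-dimensional version of the trilateration described above may be sketched by subtracting the circle equations pairwise, which yields a linear system. The reference coordinates and distances below are illustrative assumptions, not values from the disclosure:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) given three reference points and a distance to each.

    Subtracting the circle equations pairwise yields two linear equations
    A @ [x, y] = b, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical example: a device at (3, 4) measured from three references.
pos = trilaterate_2d((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5)
```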
  • a geospatial location based upon multilateration may be generated based upon a controller receiving a measurement of a difference in distance to two reference positions, each reference position being associated with a known location.
  • Wireless signals may be available periodically, within determined timespans, and/or continually.
  • the determination of the difference in distance between two reference positions provides multiple potential locations at the determined distance.
  • a controller may be used to generate a plot of potential locations.
  • the potential determinations generally form a curve. Specific embodiments will generate a hyperbolic curve.
  • the controller may be programmed to execute code to locate an exact position along a generated curve, which is used to generate a geospatial location.
  • the multilateration thereby receives as input multiple measurements of distance to reference points, wherein a second measurement taken to a second set of stations (which may include one station of a first set of stations) is used to generate a second curve. A point of intersection of the first curve and the second curve is used to indicate a specific location.
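The curve-intersection step of the multilateration described above may be sketched numerically: each measured difference in distance constrains the position to a (hyperbolic) curve, and a coarse grid search locates a point where two such curves meet. The station positions, grid extent, and step size are illustrative assumptions:

```python
import math

def ddiff(p, s1, s2):
    """Difference of distances from point p to stations s1 and s2."""
    return math.dist(p, s1) - math.dist(p, s2)

def multilaterate(s1, s2, s3, d12, d23, span=10.0, step=0.05):
    """Grid-search the point whose distance differences to two station
    pairs best match the measured values d12 and d23."""
    best, best_err = None, float("inf")
    n = int(span / step) + 1
    for i in range(n):
        for j in range(n):
            p = (i * step, j * step)
            err = abs(ddiff(p, s1, s2) - d12) + abs(ddiff(p, s2, s3) - d23)
            if err < best_err:
                best, best_err = p, err
    return best

# Hypothetical stations; the "measured" differences are computed from a
# known point purely to exercise the search.
s1, s2, s3 = (0.0, 0.0), (10.0, 0.0), (0.0, 10.0)
true = (3.0, 4.0)
est = multilaterate(s1, s2, s3, ddiff(true, s1, s2), ddiff(true, s2, s3))
```

In practice a closed-form or least-squares solver would replace the grid search; the sketch is meant only to show the intersection of two difference-of-distance curves.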
  • some embodiments may include an electronic and/or magnetic directional indicator that may be aligned by a user in a direction of interest. Alignment may include, for example, pointing a specified side of a device, or pointing an arrow or other symbol displayed upon a user interface on the device towards a direction of interest.
  • triangulation may be utilized to determine a relative elevation of the Smart Device as compared to a reference elevation of the reference points.
  • a Smart Device is generally operated by a human user
  • some embodiments of the present invention include a controller, accelerometer, and data storage medium, Image Capture Device, such as a Charge Coupled Device (“CCD”) capture device and/or an infrared capture device being available in a handheld or unmanned vehicle.
  • An unmanned vehicle may include for example, an unmanned aerial vehicle (“UAV”) or an unmanned ground vehicle (“UGV”), such as a unit with wheels or tracks for mobility.
  • a radio control unit may be used to transmit control signals to a UAV and/or a UGV.
  • a radio control unit may also receive wireless communications from the unmanned vehicle.
  • multiple unmanned vehicles may capture data in a synchronized fashion to add depth to the image capture and/or a three-dimensional and four-dimensional (over time) aspect to the captured data.
  • a UAV position will be contained within a perimeter and the perimeter will have multiple reference points to help each UAV (or other unmanned vehicle) determine a position in relation to static features of a building within which it is operating and also in relation to other unmanned vehicles.
  • Still other aspects include unmanned vehicles that may not only capture data but also function to perform a task, such as paint a wall, drill a hole, cut along a defined path, or other function.
  • the captured data may be incorporated into an AVM.
  • captured data may be compared to a library of stored data using recognition software to ascertain and/or affirm a specific location, elevation and direction of an image capture location and proper alignment with the virtual model.
  • Still other aspects may include the use of a compass incorporated into a Smart Device.
  • functions of the methods and apparatus presented herein may include one or more of the following factors that may be modeled and/or tracked over a defined period of time, such as, for example, an expected life of a building (such as, 10 years or 20 years).
  • wireless position devices 1303A-1310A may be incorporated into a smart device 1301A and not require a smart receptacle to house wireless position devices 1303-1310.
  • Wireless position devices 1303A-1310A that are incorporated into a smart device, such as a smart phone or smart tablet, will include internal power and logic connections and therefore not require wireless communication between the controller in the smart device 1301A and the wireless position devices 1303A-1310A.
  • a smart device 1301A with integrated wireless position devices 1303A-1310A and a smart device 1301 with wireless position devices 1303-1310 in a smart receptacle 1302 may provide a directional indication, such as a directional vector 1317, 1317A, without needing to move the smart device from a first position to a second position, since a directional vector may be determined from the relative position of a first wireless position device and a second wireless position device of the wireless position devices 1303-1310.
  • the distances may be triangulated based on measurements of WiFi strength at two points.
  • WiFi signal propagates outward as a wave, ideally according to an inverse square law.
  • the crucial feature of the present invention relies on measuring relative distances at two points. In light of the speed of WiFi waves and real-time computations involved in orienteering, these computations need to be as computationally simple as possible. Thus, depending upon the specific application and means for taking the measurements, various coordinate systems may be desirable. In particular, if the smart device moves only in a planar direction while the elevation is constant, or only at an angle relative to the ground, the computation will be simpler.
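One common way to turn the inverse-square propagation noted above into a distance estimate is the log-distance path-loss model, in which free space corresponds to a path-loss exponent of 2. The reference power at one meter and the measured reading below are uncalibrated assumptions for illustration:

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exponent=2.0):
    """Estimate distance in meters from received signal strength.

    A path-loss exponent of 2 corresponds to ideal free-space
    (inverse-square) propagation; indoor values typically run 2.7-4.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A reading 20 dB below the (assumed) 1 m reference implies about 10 m
# of free-space distance.
d = rssi_to_distance(-60.0)
```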
  • an exemplary coordinate system is a polar coordinate system.
  • a three-dimensional polar coordinate system is a spherical coordinate system.
  • a spherical coordinate system typically comprises three coordinates: a radial coordinate, a polar angle, and an azimuthal angle (r, θ, and φ, respectively, though a person of ordinary skill in the art will understand that θ and φ are occasionally swapped).
  • Point 1 is considered the origin for a spherical coordinate system (i.e., the point (0, 0, 0)).
  • Each WiFi emitter e₁, e₂, e₃ can be described as points (r₁, θ₁, φ₁), (r₂, θ₂, φ₂), and (r₃, θ₃, φ₃), respectively.
  • Each of the rᵢ's (1 ≤ i ≤ 3) represents the distance between the WiFi emitter and the WiFi receiver on the smart device.
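The spherical coordinates (r, θ, φ) discussed above relate to Cartesian coordinates through the standard conversion; the sketch below uses the convention of θ as the polar angle and φ as the azimuthal angle (which, as noted, are occasionally swapped), and the emitter position is an illustrative assumption:

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert (radial, polar, azimuthal) coordinates to (x, y, z)."""
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))

# Hypothetical emitter 5 m away in the horizontal plane (theta = pi/2).
x, y, z = spherical_to_cartesian(5.0, math.pi / 2, 0.0)
```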
  • the orienteering occurs in a multi-story building, in which WiFi emitters may be located above and/or below the technician.
  • a cylindrical coordinate system may be more appropriate.
  • a cylindrical coordinate system typically comprises three coordinates: a radial coordinate, an angular coordinate, and an elevation (r, θ, and z, respectively).
  • a cylindrical coordinate system may be desirable where, for example, all WiFi emitters have the same elevation.
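Similarly, the cylindrical coordinates (r, θ, z) discussed above convert to Cartesian coordinates as follows; the emitter position is an illustrative assumption:

```python
import math

def cylindrical_to_cartesian(r, theta, z):
    """Convert (radial, angular, elevation) coordinates to (x, y, z)."""
    return (r * math.cos(theta), r * math.sin(theta), z)

# Hypothetical emitter 3 m out at 90 degrees, on the same floor (z = 0).
x, y, z = cylindrical_to_cartesian(3.0, math.pi / 2, 0.0)
```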
  • a smart device 1301 and a smart receptacle 1302 may be rotated in a manner (such as, for example in a clockwise or counterclockwise movement 1320 1322 relative to a display screen) that repositions one or more wireless position devices 1303-1310 from a first position to a second position.
  • a vector 1326 may be generated at an angle that is perpendicular 1325 or some other designated angle in relation to the smart device 1301. In some embodiments, an angle in relation to the smart device is perpendicular 1325 and thereby viewable via a forward looking camera on the smart device.
  • a user may position the smart device 1301 such that an object in a direction of interest is within the camera view.
  • the smart device may then be moved to reposition one or more of the wireless position devices 1303-1310 from a first position to a second position and thereby capture the direction of interest via a generation of a vector in the direction of interest.
  • a vector in a direction of interest 1325 may be based upon a rocking motion 1323-1324 of the smart device 1301, such as a movement of an upper edge 1318 in a forward arcuate movement 1323.
  • the lower edge 1319 may also be moved in a complementary arcuate movement 1324 or remain stationary.
  • movement of one or both of the edges 1318, 1319 also results in movement of one or more wireless position devices 1303-1310.
  • the movement of the wireless position devices 1303-1310 will be a sufficient distance to register two geospatial positions based upon wireless transmissions.
  • a front and center wireless position device 1401 may be paired with a rear center wireless position device 1402; each corner of the vehicle may include a wireless position device 1403-1406; interior corners may include a respective wireless position device 1409-1412; and exterior locations, such as on rear view mirrors, may contain wireless position devices 1407-1408.
  • a controller may be included in a smart device paired to the vehicle and/or a transmitter 1416 may transmit data received from the multiple wireless position devices 1401-1412 to a remote processor which may determine a directional orientation.
  • the remote processor and/or a smart device may also transmit the directional orientation back to a display viewable by an operator of the vehicle.
  • the wireless position devices 1503-1508 enter into logical communication with multiple wireless positional reference transceivers 1510-1513.
  • a direction of interest will include an item of interest 1509, such as an apparatus or other piece of equipment.
  • a direction of interest 1514 may include a vector with a direction pointing towards the item of interest 1509. The vector length will be sufficient to reach the item of interest 1509.
  • a vector indicating a direction of interest 1514 may be used to reference an AVM, and the AVM may provide a selection mechanism, such as a drop-down menu that includes potential items of interest 1509 along the vector direction. A selection of an item of interest may then be used to determine a length of the vector 1514.
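The selection mechanism described above, offering potential items of interest along the vector direction, may be sketched as a filter over modeled items: retain those lying within a small angular tolerance of the ray, ordered nearest first. The item records, coordinates, and tolerance are hypothetical:

```python
import math

def items_along_ray(origin, direction, items, max_angle_rad=0.1):
    """Return (distance, name) pairs for items near the ray, nearest first.

    `direction` must be a unit vector; each item is (name, (x, y, z)).
    """
    hits = []
    for name, pos in items:
        offset = tuple(p - o for p, o in zip(pos, origin))
        dist = math.sqrt(sum(c * c for c in offset))
        if dist == 0:
            continue
        # Angle between the ray and the item's offset vector.
        cos_a = sum(d * c for d, c in zip(direction, offset)) / dist
        if math.acos(max(-1.0, min(1.0, cos_a))) <= max_angle_rad:
            hits.append((dist, name))
    return sorted(hits)

# Hypothetical modeled items; the ray points along the x-axis.
items = [("pump", (5.0, 0.1, 0.0)), ("valve", (2.0, 0.0, 0.0)),
         ("panel", (0.0, 4.0, 0.0))]
menu = items_along_ray((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), items)
```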
  • Geospatial location services will be cross-referenced with a database registry of as-built virtually modeled facilities and may be used in conjunction with a network of registered service technicians to route the nearest available service technician to the structure experiencing equipment malfunction. A service technician may register with the system to accept geospatial location tracking services by the system.
  • Upon entry into the structure requesting the service call, the service technician’s entry into the structure will be registered. Registration of entry into the structure may be achieved through multiple methods, which may include, by way of non-limiting example, one or more of: WiFi gateway detection, infrared detection, magnetic door locking systems, Bluetooth services, and the like.
  • a support unit for a smart device, such as a service technician or an unmanned vehicle, may be tracked via a change in triangulation values and/or an accelerometer, and a position and direction within the structure is tracked.
  • the methods used may be, by means of non-limiting example, one or more of: use of data gleaned from accelerometers located on or in possession of service technicians, Wifi services, radio frequency (RF) triangulation, Bluetooth technology, infrared detection, RFID badges, and the like.
  • a smart device will be registered as entering the structure.
  • a smart device may be associated with one or both of a person and an entity.
  • the smart device is pre-registered by the system with detailed instructions regarding a reason for the device to be at a particular location.
  • the reason may be, for example, one or more of: a service call placed from structure to system detailing current equipment malfunction, service calls from structure detailing non-specific malfunctions and symptomatic data indicating equipment malfunction, a service call placed by self-assessing equipment utilizing internet of things (IoT) and machine learning functionality to ascertain malfunctions and predictive analytics to anticipate malfunctions, and the like.
  • the system may integrate data reports into the AVM and relay as much to the smart device in the field.
  • the smart device may arrive at the structure without prior knowledge of a purpose.
  • Upon entry into the structure and registration of the smart device as described in method steps 1601 through 1604, the system will relay to the smart device on site data gleaned from the AVM, operational data uploaded to the system through IoT processes, and other experiential data reported to the system.
  • Methods for relaying such data to the on-site smart device may include, by means of non-limiting example, referential data based on proprietary orienteering processes to determine the smart device location within the structure, which location will be cross-referenced with AVM data.
  • a position within or proximate to the structure may be determined via positioning identifiers. Once the position within or proximate to the structure is determined, detailed instructions directing the smart device to the source of a malfunction are relayed by the system to the smart device directly or by means of a smart device application.


Abstract

The present invention relates to orienteering with a smart device. The smart device identifies a position within a building and uses fine-grained location monitoring to guide a user to a desired location. Upon reaching the desired location, the user may point the smart device at a wall or feature and learn, query, or supplement technical details and other information regarding the wall or feature. The technical details and other information are stored in an augmented virtual model of the building. Movement of the smart device, combined with fine-grained location monitoring, establishes a vector that is used as a directional query of the augmented virtual model. This allows easy entry of contextual data into complex data structures with minimal user training. In addition, an agent external to a building may receive information relating to the building based upon a position and an indicated direction of interest.
PCT/US2019/024398 2018-09-26 2019-03-27 Procédé et appareil d'orientation WO2020068156A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3114190A CA3114190A1 (fr) 2018-09-26 2019-03-27 Procede et appareil d'orientation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/142,275 US10433112B2 (en) 2017-02-22 2018-09-26 Methods and apparatus for orienteering
US16/142,275 2018-09-26

Publications (1)

Publication Number Publication Date
WO2020068156A1 true WO2020068156A1 (fr) 2020-04-02

Family

ID=69949419

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/024398 WO2020068156A1 (fr) 2018-09-26 2019-03-27 Procédé et appareil d'orientation

Country Status (2)

Country Link
CA (1) CA3114190A1 (fr)
WO (1) WO2020068156A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11049072B1 (en) * 2019-04-26 2021-06-29 State Farm Mutual Automobile Insurance Company Asynchronous virtual collaboration environments
TWI806207B (zh) * 2021-10-29 2023-06-21 開曼群島商粉迷科技股份有限公司 啟始適地性主題的方法與系統

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050208951A1 (en) * 2002-05-31 2005-09-22 Armando Annunziato Method for locating mobile terminals, system and components therefor
US20090189810A1 (en) * 2008-01-24 2009-07-30 Broadcom Corporation Weighted aiding for positioning systems
US7994981B1 (en) * 2010-09-07 2011-08-09 etherwhere Corporation System framework for mobile device location
US20180239840A1 (en) * 2017-02-22 2018-08-23 Stellar VDC Commercial, LLC Building model with capture of as built features and experiential data


Also Published As

Publication number Publication date
CA3114190A1 (fr) 2020-04-02

Similar Documents

Publication Publication Date Title
US11010501B2 (en) Monitoring users and conditions in a structure
US10268782B1 (en) System for conducting a service call with orienteering
EP3586327B1 (fr) Modèle de construction amélioré comportant une capture de caractéristiques de l'état définitif et de données expérientielles
US10983026B2 (en) Methods of updating data in a virtual model of a structure
US11120172B2 (en) Apparatus for determining an item of equipment in a direction of interest
US20180239840A1 (en) Building model with capture of as built features and experiential data
US10984147B2 (en) Conducting a service call in a structure
CA3043686C (fr) Methode et appareil pour une orientation et des modeles virtuels accrus
US11054335B2 (en) Method and apparatus for augmented virtual models and orienteering
US10467353B2 (en) Building model with capture of as built features and experiential data
US20210034796A1 (en) Methods for generating a user interface based upon orientation of a smart device
US20220004672A1 (en) Apparatus for displaying information about an item of equipment in a direction of interest
CA3054521C (fr) Systeme de conduite d'un appel de service avec orientation
US10433112B2 (en) Methods and apparatus for orienteering
WO2020068156A1 (fr) Procédé et appareil d'orientation
US20220382929A1 (en) Position based performance monitoring of equipment
WO2020091836A1 (fr) Système de conduite d'un appel de service avec orientation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19868086

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3114190

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19868086

Country of ref document: EP

Kind code of ref document: A1