AU2015200346A1 - Facilitating Interactions - Google Patents

Facilitating Interactions

Info

Publication number
AU2015200346A1
Authority
AU
Australia
Prior art keywords
project
display
details
user
sign
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2015200346A
Inventor
Rebecca Lee
Daithí Ó'gliasáin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VIRTUALIIS Pty Ltd
Original Assignee
VIRTUALIIS Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2014900224A external-priority patent/AU2014900224A0/en
Application filed by VIRTUALIIS Pty Ltd filed Critical VIRTUALIIS Pty Ltd
Priority to AU2015200346A priority Critical patent/AU2015200346A1/en
Publication of AU2015200346A1 publication Critical patent/AU2015200346A1/en
Abandoned legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A mobile communication device (10) for facilitating interaction between parties regarding a project, the device comprising: a controller; storage storing electronic program instructions for controlling the controller; a display for displaying a user interface; and an input means; wherein the controller is operable, under control of the electronic program instructions, to: receive input via the input means, the input comprising details of a sign (46) associated with the project; recognise the sign, and on the basis of the recognition, retrieve project details of the project, the project details comprising information and/or data associated with the project; and display the retrieved project details via the display.

Description

Facilitating Interactions

TECHNICAL FIELD

[0001] The present invention relates generally to facilitating interactions.

[0002] Although the present invention will be described with particular reference to facilitating interactions between a developer of a construction project being developed in a public space and members of the general public or community, it will be appreciated that it may be used to facilitate interactions between any parties in respect of additional or alternative projects or issues.

BACKGROUND ART

[0003] When undertaking a project, in many instances it is desirable for a manager of the project to take into consideration input from parties who will be affected by or otherwise have an interest in the project, or who can contribute to the project. However, difficulties can arise for the project manager in communicating information regarding the project to such parties, receiving their input, understanding the input, and taking it into consideration in the project.

[0004] This is particularly the case where the project comprises a construction project being developed in a public space or area.

[0005] Ideas and information regarding such projects are typically communicated to an audience such as clients, committees and the general public via architectural/construction models comprising a three dimensional (3D) representation of the construction to be built. Current techniques for displaying said models are limited to:
  • Physical scale models. These are typically 1:50 to 1:500 scale models made by specialist manufacturers with a long lead time;
  • Static renders/pictures of the 3D models in a simulated environment;
  • Movie-like visualisations which present the models in a "walk-through" simulation; and
  • Display mode in computer-aided design (CAD) system software, which allows end-users to create various views of the models, but is restricted by being contained within the device hosting the system software (generally desktop/laptop computers).
Additional problems associated with these techniques include:
  • Being unable to view effect(s) or consequence(s) of a construction in its environment, such as, for example, shadow effects and aesthetic results in comparison to surrounding constructions. Viewers are required to rely upon their imagination to perceive such effect(s)/consequence(s); and
  • Being unable to make and display edits or amendments, including small changes, to the design model at low cost. For example, physical models, render images and movie-like visualisations need to be recreated, and the audience typically needs to visit an office of the project manager/designer, or have physical material of or associated with the recreated model delivered to them, to see the edited design.

[0006] Once the project information has been communicated, gathering consensus information from people in real-time is currently only possible with basic techniques such as voting or polling systems. As the information (input) is being gathered, it has the potential to exert an effect on the input, similar to a feedback effect. For example, there is the psychological effect of people voting for a selection which is popular or unpopular, and the effect that has on the overall results, which in turn feeds back in as an input. This may be seen in, for example, the swing of voting towards a candidate once they start to gather momentum.

[0007] It is against this background that the present invention has been developed.
SUMMARY OF INVENTION

[0008] It is an object of the present invention to overcome, or at least ameliorate, one or more of the deficiencies of the prior art mentioned above, or to provide the consumer with a useful or commercial choice.

[0009] Other objects and advantages of the present invention will become apparent from the following description, taken in connection with the accompanying drawings, wherein, by way of illustration and example, preferred embodiments of the present invention are disclosed.

[0010] According to a first broad aspect of the present invention, there is provided a mobile communication device for facilitating interaction between parties regarding a project, the device comprising: a controller; storage storing electronic program instructions for controlling the controller; a display for displaying a user interface; and an input means; wherein the controller is operable, under control of the electronic program instructions, to: receive input via the input means, the input comprising details of a sign associated with the project; recognise the sign, and on the basis of the recognition, retrieve project details of the project, the project details comprising information and/or data associated with the project; and display the retrieved project details via the display.

[0011] The controller may comprise computer processing means.

[0012] The display, user interface and input means may be integrated, in a touchscreen for example. Alternatively, they may be discrete.
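The recognise-retrieve-display flow of the first broad aspect can be sketched as follows. This is an illustrative outline only, not the patented implementation; all names (`Controller`, `ProjectDetails`, the marker identifiers) are hypothetical, and real sign recognition would involve image processing rather than the dictionary lookup used here as a stand-in.

```python
from dataclasses import dataclass

@dataclass
class ProjectDetails:
    name: str
    description: str
    model_id: str  # reference to a stored 3D representation of the project

class Controller:
    def __init__(self, storage):
        # Maps a recognised sign identifier to that project's details
        self.storage = storage

    def recognise(self, sign_input):
        # Stand-in for frame-marker recognition: a direct lookup here,
        # where a real device would match the camera image against markers
        return sign_input if sign_input in self.storage else None

    def handle_input(self, sign_input):
        sign_id = self.recognise(sign_input)
        if sign_id is None:
            return "Sign not recognised"
        details = self.storage[sign_id]  # retrieve the project details
        return f"{details.name}: {details.description}"  # shown via the display

controller = Controller({
    "marker-001": ProjectDetails("Riverside Tower",
                                 "Proposed 12-storey development", "m1"),
})
print(controller.handle_input("marker-001"))
```

The key point of the aspect is the chain of three operations; everything else (the data held per project, how display is effected) is left open by the claim.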
[0013] Preferably, the project details comprise at least one representation of the project. The at least one representation may comprise a three dimensional (3D) model of the project.

[0014] The sign may comprise any letter, word, numeral, device, brand, image, shape, colour, sound, object, article, or thing, or any combination thereof. In preferred embodiments of the invention, the sign may comprise a trigger code, which may have the form of a frame marker.

[0015] Preferably, there is a relationship or link between one or more features of the sign and one or more features of the model. The one or more features may be size. In such a case, the image marker size may directly correspond with the size of the model being viewed.

[0016] Preferably, the input means comprises an imaging means, which may be a digital camera. In such a case, when the camera is focused on the sign, the controller is operable, under control of the software, to recognise the sign, and on the basis of the recognition, retrieve the model of the project from the project details, and display the model via the display. Preferably, the model is displayed overlaid on the image viewed by the camera prior to the model being retrieved.

[0017] The input means may also comprise at least one sensor, which may be part of a sensor system or a set of sensors. Individual sensors within the set of sensors may comprise an acceleration sensor, an orientation sensor, a direction sensor, and a position sensor.

[0018] Preferably, the input comprises user instructions which are input by a user via the input means. The user instructions may comprise a command to perform an action in respect of the project, in which case the controller is operable, under control of the electronic program instructions, to perform the action according to the received user instructions.
The action may comprise an interaction action, and may include one or more of the following: accessing a source related to or associated with the project; providing comments on the project; viewing comments on the project provided by others; creating material; publishing created material; and casting a vote in respect of the project.
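The interaction actions listed above can be sketched as a simple dispatcher. This is a hypothetical illustration only; the action names, payloads, and in-memory stores below are assumptions, not taken from the specification.

```python
# Illustrative dispatcher for interaction actions (commenting, viewing
# comments, voting). All names and data structures are hypothetical.

comments = []
votes = {"for": 0, "against": 0}

def perform_action(action, payload=""):
    if action == "comment":
        comments.append(payload)        # provide a comment on the project
        return len(comments)
    if action == "view_comments":
        return list(comments)           # view comments provided by others
    if action == "vote":
        votes[payload] += 1             # cast a vote in respect of the project
        return votes[payload]
    raise ValueError(f"unknown action: {action}")

perform_action("comment", "Love the green roof")
perform_action("vote", "for")
print(perform_action("view_comments"))
```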
[0019] Preferably, the controller is operable, under control of the electronic program instructions, to perform an analysis on the basis of the input received, generate a report on the analysis, and present the report via the display.

[0020] The project details may be retrieved from the storage of the device, or from storage remote from the device.

[0021] The project details may be stored in a database.

[0022] Preferably, the electronic program instructions comprise software. The device may comprise a smartphone having the software installed thereon. The software may be provided as a software application downloadable to the smartphone.

[0023] According to a second broad aspect of the present invention, there is provided a method for facilitating interaction between parties regarding a project, the method comprising: storing electronic program instructions for controlling a controller, and information or data; and controlling the controller via the electronic program instructions, to: receive input via an input means, the input comprising details of a sign associated with the project; recognise the sign, and on the basis of the recognition, retrieve project details of the project, the project details comprising information and/or data associated with the project; and display the retrieved project details via a display.

[0024] According to a third broad aspect of the present invention, there is provided a computer-readable storage medium on which are stored instructions that, when executed by a computing means, cause the computing means to perform the method according to the second broad aspect of the present invention as hereinbefore described.

[0025] According to a fourth broad aspect of the present invention, there is provided a computing means programmed to carry out the method according to the second broad aspect of the present invention as hereinbefore described.
[0026] According to a fifth broad aspect of the present invention, there is provided a data signal including at least one instruction being capable of being received and interpreted by a computing system, wherein the instruction implements the method according to the second broad aspect of the present invention as hereinbefore described.

[0027] According to a sixth broad aspect of the present invention, there is provided a system for facilitating interaction between parties regarding a project comprising a mobile communication device according to the first broad aspect of the present invention as hereinbefore described.

BRIEF DESCRIPTION OF THE DRAWINGS

[0028] In order that the invention may be more fully understood and put into practice, preferred embodiments thereof will now be described with reference to the accompanying drawings, in which:

Figure 1A depicts a schematic diagram of a mobile communication device in accordance with an embodiment of the present invention;
Figure 1B depicts a system diagram of a communication system comprising the device;
Figure 1C is a flow chart depicting a logical structure of pathways available to a user on the device;
Figures 2A to 7Q are screen shots depicting a number of user interactive functions provided by the device;
Figure 8 depicts a flow chart of a process using the device;
Figure 9 depicts a representation of data flow in an aggregation step using the device; and
Figure 10 depicts a screen shot of a viewing portal screen interactive function provided by the device.

DESCRIPTION OF EMBODIMENTS

[0029] In the drawings, like features have been referenced with like reference numbers.
[0030] In Figure 1A, there is depicted an embodiment of a portable or mobile electronic communication device 10 for facilitating interactions between parties regarding a project in accordance with an aspect of the present invention.

[0031] In the embodiment, the project is at least one of a plurality of construction projects being, or proposed to be, developed in a respective public space or area.

[0032] Each of the plurality of construction projects has a respective set of project details associated with it, comprising information and/or data relevant to or associated with the project. In the embodiment, the project details include:
  • identification of the project. This may comprise a project name or title, and/or an image of or associated with the project, which may comprise a thumbnail;
  • identification of one or more parties or entities responsible for the project, such as the project developer or manager, for example;
  • identification of a location of the project, which may comprise a physical address of the project;
  • identification of a type of the project;
  • a written description of the project, which may be a brief description and/or a detailed description;
  • a timeline for completion of the project;
  • an indication of whether the project details have been downloaded onto the device 10 or are available for such download;
  • at least one representation of the project, which may comprise a model of the project. In cases where there may be options available in respect of the project, there may be more than one representation, each of which may correspond to a different available option; and
  • an identification of a type of the representation of the project.

[0033] Additionally, each of the plurality of construction projects has at least one respective triggering sign associated with it, and provided at or proximate to the corresponding physical location of the project.
An indication or identification of the associated sign(s), such as an image of or representing it, along with its physical location, is also included in the respective project details.

[0034] The triggering sign, which may also be referred to as an image or frame marker, may comprise any indication or marking capable of being recognised by the device 10, as will be described in further detail below. The sign may include any letter, word, numeral, device, brand, image, shape, colour, sound, object, article, or thing, or any combination thereof. In preferred embodiments of the invention, the triggering sign may comprise a code. In a preferred embodiment, the triggering sign comprises data specifying a particular geographic location, such as Global Positioning System (GPS) coordinates. Preferably, the sign comprises a unique graphical element which will trigger a recognition by the device 10, allowing the appropriate model of the project to be loaded. In preferred embodiments, the triggering sign also serves as regular marketing and promotional media, in addition to being used to obtain a reference point from which the model will obtain its orientation in a viewfinder of the device 10.

[0035] In preferred embodiments of the invention, there is a relationship or link between one or more features of the image marker and one or more features of the model. The one or more features may be size. In such a case, the image marker size may directly correspond with the size of the model being viewed, so by increasing the size of the image marker, you can increase the size of the model. The scaling factor may be relative, i.e. doubling the size of the image marker doubles the size of the model. As another example, an A0 image marker would create a virtual model five times the size of an A4 image marker.

[0036] This provides advantages including allowing models to be easily scalable to suit the presenting conditions.
For example, the same graphic may correspond to the same model: a large graphic outdoors may correspond to a large model in life size on a street, whilst a small graphic in a brochure may correspond to a smaller model on a user's table.

[0037] A further advantage includes allowing flexibility of viewing options to the user. For example, the image marker may comprise a poster on a table or an image in a brochure. The client only needs one marker which can be used over various promotional material. In embodiments of the invention, image markers may be stored remotely and downloaded, for example from a website, and printed in different sizes, enabling the corresponding model(s) to be viewed as desired. This provides additional flexibility in that a user can print one or more image markers to match their requirements, with the model size being within their control.

[0038] An additional advantage is that it removes the requirement for multiple models of different scales, leading to benefits in memory management within the device 10, and it being easier to download one model as opposed to multiple models.

[0039] Types of projects available in the embodiment described include whether the project is: a Showcase project (which may be a case study project, used to showcase products during sales pitches and/or for users to be able to see the capabilities of the device 10 and an interaction app (described in further detail below)), an Architecture project (which may be a development proposed by an architect), a Private Development project (which may be a privately run development), a Public Development project (which may be a Government run development), or an Other project (which may be a project that doesn't fit into the previously mentioned project types, such as public art or alleys, for example).
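The relative marker-to-model scaling described in paragraph [0035] can be expressed as a simple proportion. The reference marker width and reference scale below are assumed values for illustration only:

```python
# Sketch of relative marker-to-model scaling: the virtual model's scale is
# proportional to the physical size of the recognised image marker.
# The reference values are assumptions, not from the specification.

REFERENCE_MARKER_WIDTH_MM = 210.0  # e.g. an A4-width marker
REFERENCE_MODEL_SCALE = 1.0        # model scale shown for the reference marker

def model_scale(marker_width_mm):
    """Double the marker width, double the model scale."""
    return REFERENCE_MODEL_SCALE * (marker_width_mm / REFERENCE_MARKER_WIDTH_MM)

print(model_scale(210.0))  # reference marker -> 1.0
print(model_scale(420.0))  # twice the marker width -> 2.0
```

Because the factor is relative, a single graphic printed at different sizes drives differently sized views of the same model, which is the flexibility advantage described in paragraphs [0037] and [0038].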
[0040] As will be described in further detail, the embodiment of the invention facilitates interactions between parties responsible for the construction projects and members of an audience being the general public or community. It will be appreciated that the invention is not limited in this regard, and embodiments of it may be used to facilitate interactions between any parties or entities in respect of additional or alternative projects or issues.

[0041] The processes of embodiments of the invention have application and may be used in any area where there are a large number of end-users (or inputs to be taken into account) which can provide information to the system, which can then be displayed visually and interacted with to create new inputs.

[0042] Such applications additional to building construction projects include music creation, art (including collaborative art), collaborative educational projects, gathering consensus data, and solving problems which draw on a variety of opinions which are prone to change based on real-time supply of visual information, and involve selecting the most relevant data from noise based on user preferences.

[0043] Embodiments of the invention may be integrated into a learning environment to provide a new medium for collaborative information sharing. For example, this may comprise using an implementation of the invention to facilitate the interaction of students in respect of a mathematics problem, all harnessing the knowledge of each other in a visual medium.

[0044] As will be described in further detail, the embodiment of the invention creates a virtual interface, with Augmented Reality (AR) technology, between end parties, who may be, for example, the general public and businesses, and creates a database of information which can then be data-mined in various manners.
[0045] The embodiment of the invention is operable to gather input data from end users, and then display that data in a manner which will then influence the input source to either remain the same or change its state. It is a real-time information exchange process which is used for providing real-time visualisations of data which is being created by people interacting with the visualisations.

[0046] The device 10 comprises a plurality of components, subsystems or modules operably coupled via appropriate circuitry and connections to enable the device 10 to perform the functions and operations herein described. The device 10 comprises suitable components necessary to receive, store and execute appropriate computer instructions, such as a method for facilitating interaction between parties regarding a project in accordance with an embodiment of the present invention.

[0047] Particularly, the device 10 comprises computing means which in this embodiment comprises a controller 12 and storage comprising a storage means, medium or device 14 for storing electronic program instructions for controlling the controller 12, and information or data; a display 16 for displaying a user interface 18; and an input means 20; all housed within a container or housing 22.

[0048] As will be described in further detail, the controller 12 is operable, under control of the electronic program instructions, to: receive user instructions which are input by a user via the input means 20, the user instructions comprising details of a sign associated with a respective one of the plurality of projects; recognise the sign, and on the basis of the recognition, retrieve the relevant project details of the corresponding project; and display the retrieved project details via the display 16.

[0049] Particularly, the device 10 is operable to display the representation of the project, and preferably the model of the project, via the display 16.
The model may comprise a life size model or a mini model of the project. In the embodiment, a life size model is a 1:1 model of the proposed construction in its actual environment. A mini model is the same model but displayed at a smaller size from an image marker placed, for example, on a horizontal surface (such as on a street map rather than a sign at a location). The mini model associated with such an image marker is useful to facilitate viewing of the model in full, allowing the user to be able to walk around the model seeing all sides and the top view, for example. In the case where the project relates to the construction of a building, the model preferably comprises a life-size three dimensional (3D) model of an architectural design of the building. The models may be created in any standard 3D modelling software package.

[0050] As will be described in further detail, the device 10 is operable to present 3D models of buildings in life-size, in their actual physical environments, and allows end users to walk around and create their own walk-through experience in real-time. It uses, in the embodiment described, the mobility and display functionality of smartphone/tablet technology to create a "window", which may also be referred to as a viewing portal, via the display 16 for the end-user to look through. This window presents a view of the 3D model of the construction, overlaid on the environment that will remain once the construction is complete. As the window is as mobile as the end-user holding it, it enables the end-user to walk around "virtual" buildings and obtain whatever view of the construction they require.

[0051] The controller 12 comprises processing means in the form of a processor.

[0052] The storage device 14 comprises read only memory (ROM) and random access memory (RAM).
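The distinction between a life size (1:1) model and a mini model reduces to a scale ratio applied to the model's real-world dimensions. A minimal sketch, with an assumed building height and an illustrative 1:50 mini scale (neither value is from the specification):

```python
# Life size vs mini model: the same model rendered at different scale ratios.
# The building height and the 1:50 mini scale are illustrative assumptions.

def displayed_height(actual_height_m, scale_denominator):
    # scale_denominator = 1 for a life-size (1:1) model,
    # e.g. 50 for a 1:50 mini model on a table-top marker
    return actual_height_m / scale_denominator

building_height = 45.0  # metres, hypothetical proposed construction
print(displayed_height(building_height, 1))   # life-size: 45.0
print(displayed_height(building_height, 50))  # mini model: 0.9
```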
[0053] The device 10 is operable to communicate via one or more communications link(s) 22, which may variously connect to one or more remote devices 24 such as servers, personal computers, terminals, wireless or handheld computing devices, landline communication devices, or mobile communication devices such as a mobile (cell) telephone. At least one of a plurality of communications links 22 may be connected to an external computing network through a telecommunications network.

[0054] In the embodiment described, the remote devices 24 include other devices 10, owned and/or operated by other users, as well as a system administration module 25 and a system reporting module 27 comprising a web interface. Together, the one or more devices 10, system administration module 25, and system reporting module 27 comprise a system for facilitating interactions between parties regarding a project in accordance with an aspect of the present invention.

[0055] The device 10 is capable of receiving instructions that may be held in the ROM or RAM and may be executed by the processor. The processor is operable to perform actions under control of electronic program instructions, as will be described in further detail below, including processing/executing instructions and managing the flow of data and information through the device 10.

[0056] The electronic program instructions are provided via a single software application ("app") or module which may be referred to as an interaction app. In the embodiment described, the app is marketed under the trade mark VIRTUALIS™ (with building device™), and can be downloaded from a website (or other suitable electronic device platform) or otherwise saved to or stored on the storage device 14 of the device 10.
[0057] In preferred embodiments of the invention, the device 10 comprises a smartphone such as that marketed under the trade mark IPHONE® by Apple Inc, or by another provider such as Nokia Corporation or Samsung Group, having an Android, WEBOS, Windows, or other phone app platform. Alternatively, the device 10 may comprise other computing means such as a personal, notebook or tablet computer such as that marketed under the trade mark IPAD® or IPOD TOUCH® by Apple Inc, or by another provider such as Hewlett-Packard Company or Dell, Inc, for example.

[0058] The software app, or software, electronic instructions or programs for the computing components of the device 10, can be written in any suitable language, as are well known to persons skilled in the art. For example, for operation on a device 10 comprising an IPHONE® smartphone, the software app may be written in the Objective-C language. In embodiments of the invention, rather than being a single software app, the electronic program instructions may comprise a set or plurality of software, electronic instructions or programs, and can be provided as stand-alone application(s), via a network, or added as middleware, depending on the requirements of the implementation or embodiment. In preferred embodiments of the invention, the software app comprises AR software.

[0059] The device 10 also includes an operating system which is capable of issuing commands and is arranged to interact with the software app to cause the device to carry out the respective steps, functions and/or procedures in accordance with the embodiment of the invention described herein. Particularly, the operating system and software app provide a framework and visualisation engine for integrating the 3D models into the display of the device 10. The operating system may be appropriate for the device 10. For example, in the case where the device 10 comprises an IPHONE® smartphone, the operating system may be iOS.
[0060] In alternative embodiments of the invention, the software may comprise one or more modules, and may be implemented in hardware. In such a case, for example, the modules may be implemented with any one or a combination of the following technologies, which are each well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), and the like.

[0061] The computing means can be a system of any suitable type, including: a programmable logic controller (PLC); digital signal processor (DSP); microcontroller; personal, notebook or tablet computer; or dedicated servers or networked servers.

[0062] The processor can be any custom made or commercially available processor, a central processing unit (CPU), a digital signal processor (DSP) or an auxiliary processor among several processors associated with the computing means. In embodiments of the invention, the processing means may be a semiconductor based microprocessor (in the form of a microchip) or a macroprocessor, for example.

[0063] In embodiments of the invention, the storage means, medium or device can include any one or combination of volatile memory elements (e.g., random access memory (RAM) such as dynamic random access memory (DRAM), static random access memory (SRAM)) and non-volatile memory elements (e.g., read only memory (ROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), etc.). The storage means, medium or device may incorporate electronic, magnetic, optical and/or other types of storage media.
Furthermore, the storage medium can have a distributed architecture, where various components are situated remote from one another, but can be accessed by the processing means. For example, the ROM may store various instructions, programs, software, or applications to be executed by the processing means to control the operation of the device 10 and the RAM may temporarily store variables or results of the operations. [0064] The use and operation of computers using software applications is well known to persons skilled in the art and need not be described in any further detail herein except as is relevant to the present invention. [0065] Furthermore, any suitable communication protocol can be used to facilitate connection and communication between any subsystems or components of the device 10, and the device 10 and other devices or systems, including wired and wireless, as are well known to persons skilled in the art and need not be described in any further detail herein except as is relevant to the present invention. [0066] Where the words "store", "hold" and "save" or similar words are used in the context of the present invention, they are to be understood as including reference to the retaining or holding of data or information both permanently and/or temporarily in the storage means, device or medium for later retrieval, and momentarily or instantaneously, for example as part of a processing operation being performed. [0067] Additionally, where the terms "system", "device", and "machine" are used in the context of the present invention, they are to be understood as including reference to any group of functionally related or interacting, interrelated, interdependent or associated components or elements that may be located in proximity to, separate from, integrated with, or discrete from, each other. [0068] Furthermore, in embodiments of the invention, the word "determining" is understood to include receiving or accessing the relevant data or information. 
[0069] In the embodiment of the invention, the display 16 for displaying the user interface 18 and the user input means 20 are integrated in a touchscreen 26. In alternative embodiments these components may be provided as discrete elements or items.

[0070] The touchscreen 26 is operable to sense or detect the presence and location of a touch within a display area of the device 10. Sensed "touchings" of the touchscreen 26 are inputted to the device 10 as commands or instructions and communicated to the controller 12. It should be appreciated that the user input means 20 is not limited to comprising a touchscreen, and in alternative embodiments of the invention any appropriate device, system or machine for receiving input, commands or instructions may be used, including, for example, a keypad or keyboard, a pointing device, or composite device.

[0071] In this regard, in the embodiment described, the input means 20 further comprises imaging means or an imager in the form of a digital camera 28 operable to capture an image. The camera 28 is integrated with the device 10 in the embodiment. The imaging means may comprise any suitable system or device facilitating the acquisition of still and/or moving images. For example, in the case where the device 10 comprises an IPHONE® smartphone, the imaging means may be an iSight™ camera. The use and operation of cameras is well-known to persons skilled in the art and need not be described in any further detail herein except as is relevant to the present invention.

[0072] Input may also be received via at least one sensor which is part of a sensor system or a set of sensors 30 of the device 10. Individual sensors within the set of sensors 30 are operable to monitor, sense and gather or measure sensor data and/or information associated with or relating to characteristics, properties and parameters of the device 10, the surrounding environment, or components, systems or devices associated therewith or coupled thereto.
For example, the set of sensors 30 is operable to sense and gather sensor data relating to a state of the device 10 and/or a state of the environment surrounding the device 10. The state of the mobile device 10 comprises a position of the device 10. In an embodiment, the state of the device 10 further comprises a velocity and/or speed of the device 10. The set of sensors 30 includes an inertial sensor system 32 comprising an acceleration sensor 34 and an orientation sensor 36, a direction sensor 38 and a position sensor 40. Alternative embodiments of the invention may comprise additional or alternative sensors. [0073] The acceleration sensor 34 is operable to measure an acceleration of the device 10 and produce acceleration data. For example, the acceleration sensor 34 may be an accelerometer. The orientation sensor 36 is operable to measure a rate of change of the orientation (i.e., angular rate) of the device 10 and produce orientation data. For example, the orientation sensor 36 may be a gyroscope. The direction sensor 38 is operable to determine a direction relative to the Earth's magnetic poles and produce direction data. For example, the direction sensor 38 may be an electronic compass. The position sensor 40 is operable to determine a position of the device 10 and produce position data. For example, the position sensor 40 may be a Global Positioning System (GPS) receiver. The use and operation of such sensors is well-known to persons skilled in the art and need not be described in any further detail herein except as is relevant to the present invention. [0074] One or more sensors of the set of sensors 30 may be integrated with the device 10, as may be the case where it comprises an IPHONE® smartphone. Alternatively, the mobile device 10 may be operably coupled to one or more of the above-described set of sensors 30. 
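By way of illustration only, the sensor readings described in paragraphs [0072] and [0073] could be gathered into a single device-state record along the following lines (a minimal Python sketch; the `DeviceState` name and field layout are assumptions for illustration, not part of the embodiment):

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    """Snapshot of the readings described for the set of sensors 30."""
    acceleration: tuple   # m/s^2 triple from the acceleration sensor 34
    angular_rate: tuple   # rad/s triple from the orientation sensor 36
    heading_deg: float    # degrees from magnetic north (direction sensor 38)
    latitude: float       # GPS position (position sensor 40)
    longitude: float

# Example snapshot with assumed values
state = DeviceState(
    acceleration=(0.0, 0.0, 9.81),
    angular_rate=(0.0, 0.0, 0.0),
    heading_deg=271.5,
    latitude=-31.9523,
    longitude=115.8613,
)
```

A controller would populate such a record from whatever sensor APIs the host platform exposes.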
[0075] The project details of at least some of the plurality of projects are stored or saved in a database 42 or databank residing on the storage device 14 and accessible by the controller 12 under control of the interaction app. These may be installed as part of the interaction application. The controller 12 is arranged to interact with the database 42 to cause the device 10 to carry out the respective steps, functions and/or procedures in accordance with the embodiment of the invention described herein. [0076] The project details of others of the plurality of projects are stored or saved remotely, for example in one or more remote database modules 44 residing on respective storage means, mediums or devices of one or more remote systems or devices 24 and accessible by the device 10 via the one or more communications link(s) 22. The controller 12 is arranged to facilitate user interaction with the one or more remote databases 44, such as via an online applications shop or store, to make the remotely stored content available for free or on payment of a fee according to a fee schedule. [0077] It will be understood that the database(s) may reside on any suitable storage device, which may encompass solid state drives, hard disc drives, optical drives or magnetic tape drives. The database(s) may reside on a single physical storage device (as in the embodiment described) or may be spread across multiple storage devices or modules. [0078] The database 42 is coupled to the controller 12 and in data communication therewith in order to enable information and data to be read to and from the database 42 as is well known to persons skilled in the art. Any suitable database structure can be used, and there may be one or more than one database. 
In embodiments of the invention, the database 42 can be provided locally as a component of the device 10 (such as in the storage device 14) or remotely such as on a remote server, as can the electronic program instructions, and any other data or information to be gathered and/or presented. In an embodiment, several computers or devices can be set up in this way to have a network client-server application, hosting the 3D models for download as required or desired, for example. Such an embodiment has the advantage that the end user only has to download models which are relevant or of interest to them. [0079] Once the interaction app is installed on the device 10, the controller 12 is operable, under control of the interaction app, to present, via the touchscreen 26, a sequence of electronic pages, screens and forms to a user or operator of the device 10 allowing for the inputting or capture of information and/or data, including images sensed via the camera 28, data and/or information sensed via sensors of the set of sensors 30, and instructions and commands pertinent to operation of the device 10. Figure 1C of the drawings depicts a logical structure of pathways available to a user of the device 10 in this regard. [0080] Figures 2A and 2B of the drawings show examples of what may be referred to as a "Home" screen or homepage displayed via the touchscreen 26 on initial start-up or execution of the interaction app. It is to be appreciated that these, and the other screens illustrated in the drawings of the specification, are examples of screen shots of the embodiment. In embodiments of the invention, these may be altered to suit user demand or feedback or to improve functionality, for example, and so other screens having a different visual appearance are possible. [0081] In embodiments of the invention, successful login or authentication of the user is required prior to access to the Home screen (and other features of the device) being granted. 
In embodiments, similar action is also required to access details of particular projects. That is, for some listings the user needs to sign in to view them. At Figure 2C, there is shown an example of a login screen. Given the nature of the information held in the database, the device provides a mechanism by which a user can provide authentication. The authentication is carried out by the user interacting with the interface to provide login credentials such as a login identifier and password, which is then received by an authentication module arranged to allow access to the device 10, and/or details of the protected particular project, for example, upon authentication of the user.
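By way of illustration, the credential check performed by the authentication module could be sketched as follows (the function name and user store are hypothetical; a production implementation would use a salted password-hashing scheme such as bcrypt rather than bare SHA-256):

```python
import hashlib
import hmac

# Hypothetical stored credentials: login identifier -> password hash.
_USERS = {"demo_user": hashlib.sha256(b"secret").hexdigest()}

def authenticate(login_id: str, password: str) -> bool:
    """Return True only if the supplied credentials match the stored record."""
    stored = _USERS.get(login_id)
    if stored is None:
        return False
    candidate = hashlib.sha256(password.encode()).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(stored, candidate)
```

On success, the controller would grant access to the Home screen and to details of the protected project.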
[0082] Interface elements in the form of buttons are provided on the Home screen in a footer navigation bar, allowing the user to access screens facilitating the performance or execution of actions including: accessing information about the device 10, including tips and instructions for using the device and production credits, via an About Us button; loading models via a Load Model button; opening and operating a viewing portal (the hereinbefore described window) via a Viewing Portal button; and specifying operating settings for the device 10 via a Settings button. The operating settings may relate to, for example, downloading of model options, account detail options, and map functionality to find local examples. [0083] The Home screen may include a written description of how to use the device 10 or software app. [0084] Execution of the About Us button leads to an "About Us" screen, an example of which is depicted in Figure 3A, being automatically displayed via the touchscreen 26 and enabling the user to access information about the device 10, which may include general content. From the About Us screen, other General Page screens providing relevant information such as terms and conditions governing use of the device, credits, provider contact details, and tips, may be navigated to and from. Figure 3B depicts an example of a General Page screen. [0085] In embodiments of the invention, the controller 12 is operable, under control of the interaction app, to receive position data specifying a current geographic location of the device 10 from the position sensor 40 and, on the basis of the position data, automatically determine the geographic location of the device 10 in the world and display, via the touchscreen 26, information related or relevant to the determined geographic location. There is integration with a maps functionality to display location(s) and related/relevant information. 
The displayed information may include an identification of projects of the plurality of projects in the area, along with pertinent directions, website links, and contact details, for example, and may be overlaid on the display of the touchscreen 26 on top of existing signs and landmarks of a map of the area/location. Figure 2A of the drawings depicts an example where this information is displayed on the Home screen. Of course, in alternative embodiments of the invention, the information may be displayed on any screen, including a screen dedicated to that purpose. [0086] Execution of the Load Model button leads to a "Load Model" screen, examples of which are depicted in Figures 4A and 4B, being automatically displayed via the touchscreen 26 and enabling the user to input user instructions or commands comprising a selection of a project from the stored project details, and to load/download different models. Available projects are stored and searchable by terms or identifiers including the respective project details. [0087] Particularly, in the embodiment described, projects are primarily grouped and searchable by the project type specified in the project details. A list of interface element buttons corresponding to each available project type is provided enabling the user to make their selection via the touchscreen 26. This screen allows the user to choose the type of project they wish to be the subject of the interaction. In alternative embodiments of the invention, additional or alternative items specified in the project details may be available to choose from primarily. [0088] The user is able to return from the Load Model screen to the Home screen via execution of a respective navigation interface element button provided on a title portion of the screen and labeled "Back". Similar navigation interface element buttons are provided on the other screens displayed as well, with the exception of the Home screen. 
[0089] Selection of the project type leads to at least a selection of the project details of projects satisfying those criteria in their project details being automatically displayed via the touchscreen 26, populating under the relevant criteria heading. They may be displayed on the Load Model screen, via a drop-down list for example as depicted in Figure 4C, or via a new screen as depicted in Figures 4D and 4E, for example. The displayed projects are further searchable and selectable within the list. In the embodiment, the projects are listed or grouped according to criteria including the respective project details. In embodiments of the invention, the criteria may comprise additional details for each respective project, including: developer options, including a selection of project developer from different developers such as, for example, Pindan Pty Ltd and Colgan Industries Pty Ltd; costs, including a price associated with the project, which may include a cost of a property being developed if it is for sale, for example, apartment prices for 1, 2 or 3 bedroom options, listed by the relevant developer; ratings allocated to the project from other users, for example, which may be in accordance with a prescribed rating scale such as, for example, a number of "stars" and based on user and/or community feedback; and distance of the project from the present geographic location of the device 10, which may be displayed in a Location screen, an example of which is depicted in Figure 4F. In the Location screen depicted in Figure 4F, a map view shows all listings in the area on a local map, with flags for listings being displayed in different colours according to criteria, which may include their target market. [0090] A list of interface element buttons corresponding to each available search option is provided enabling the user to refine and make their selection of a project to interact with via the touchscreen 26. 
Via use of these interface buttons, in the embodiment described, the user is able to refine their search to, for example: show only listings from certain developers; show only listings in a certain price range; show only listings that are commercial space or residential property; show only listings with certain star ratings; and show only listings based on distance away from the current location or selected post codes. [0091] Selection of a project to interact with via the Load Model screen leads to a Selected Project screen, examples of which are depicted in Figures 5A and 5B, being automatically displayed via the touchscreen 26. In the example depicted in Figure 5A, the Selected Project screen displays at least some of the project details of the selected project, including its thumbnail, title, description, representation type, whether the associated representation has been downloaded or is available for download, and the physical location of triggering signs for it. It also displays user instructions and tips for using the device 10 to interact with the selected project. Additionally, it displays an executable (clickable in the embodiment) communication link to a map providing the physical location of triggering signs and an executable (clickable in the embodiment) communication link facilitating communication of the indication or identification of the associated sign(s), which may be in the form of an email link facilitating email communication. An executable link is also provided for accessing additional project details and information. In the example depicted in Figure 5B, of these, only the user instructions and tips for using the device 10 to interact with the selected project are displayed. [0092] An interface element in the form of a Ready button, labeled to indicate that the user is ready to proceed, is provided on the Selected Project screen allowing the user to access a "Viewing Portal" screen providing for operation of the viewing portal or window. 
In the embodiment described, the full Viewing Portal screen opens and becomes available once the Ready button is clicked. The Viewing Portal screen may also be accessed via the previously described Viewing Portal button provided on the Home screen (in the case where the desired project has been pre-selected, for example). [0093] Execution of the Ready button or Viewing Portal button leads to the Viewing Portal screen being automatically displayed via the touchscreen 26 and enabling the user to operate the viewing portal to engage in one or more interactions in respect of the selected project. Until an event occurs, which in the embodiment comprises recognition of a sign associated with the selected project, the controller 12 is operable, under control of the interaction app, to display in the Viewing Portal screen input being received via the camera 28. That is, it displays whatever the camera 28 is being focused on or directed towards. Whilst the Viewing Portal screen is activated, this continues until the camera is focused on or directed towards one of the one or more triggering signs associated with the selected project. When such an event occurs, the controller 12 is operable, under control of the interaction app, to recognise the sign, and on the basis of the recognition, retrieve the model of the project from the relevant project details of the selected project, and display the model in the Viewing Portal screen. [0094] Particularly, the display shows how the area would look when the selected project is completed, with those portions of the area unaffected by the project remaining unchanged in the display shown in the Viewing Portal screen. The architectural design of the building of the selected project is presented in its proposed construction environment by overlaying the life-size, 3D building model onto the landscape of the location. An example of this, where the sign is referenced by numeral 46, is depicted in Figure 6. 
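The recognise-then-overlay behaviour of the Viewing Portal described in paragraph [0093] could be sketched, per camera frame, as follows (a minimal sketch; `recognise_sign` and `load_model` are hypothetical stand-ins for the embodiment's sign-recognition and model-retrieval machinery):

```python
def viewing_portal_frame(frame, selected_project, recognise_sign, load_model):
    """One pass of the Viewing Portal loop: show the raw camera frame until
    a triggering sign for the selected project is recognised, then retrieve
    and display the project's model overlaid on the landscape."""
    sign = recognise_sign(frame)                    # e.g. an image-marker detector
    if sign is not None and sign in selected_project["signs"]:
        model = load_model(selected_project["id"])  # retrieve from project details
        return ("model", model)                     # render the model over the view
    return ("camera", frame)                        # otherwise pass the feed through
```

Each frame from the camera 28 would be run through such a function while the Viewing Portal screen is active.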
[0095] The user can then walk around the perimeter of the building to explore the life-size 3D model via the Viewing Portal screen. In embodiments of the invention, the sign needs to be in view at all times for the model to be showing. In the described embodiment the distance between the user (and hence the device 10) and the sign is preferably at least about 5 meters (that is, the sign is 5+ meters away), so that the user is able to orient the device 10 and move the Viewing Portal around and walk around the building model while still keeping the sign in view. [0096] Figures 7A to 7Q depict further examples of projects being interacted with via the Viewing Portal screen of the device 10. [0097] Interface elements are provided on the Viewing Portal screen enabling the user to input user instructions or commands to engage in interactions in respect of the project. Examples of such interface elements and the operations they facilitate will now be described. [0098] Figure 10 depicts another example of a project being interacted with via the Viewing Portal screen of the device 10, showing header and footer navigation bars and the view shown by the camera 28 being pointed or directed to a marker. [0099] Orientation interface elements, in the form of arrows flashing left/right and/or up/down as appropriate in the embodiment described, provide an indication to the user of the direction in which the device 10 should be orientated to help find nearby trigger points. [00100] In the described embodiment, the device 10 is operable, via the controller 12 under control of the interaction app, to use data including orientation data produced via the internal gyroscope (of the orientation sensor 36 calculating the orientation of the device 10), and position data produced via the GPS technology (of the position sensor 40 establishing location coordinates) to calculate which direction to point (display) the arrow(s) to direct the user to the trigger point. 
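The arrow calculation in paragraph [00100] amounts to computing the great-circle bearing from the device's GPS position to the trigger point and comparing it with the device's current heading; a minimal sketch (the function names are assumptions for illustration) might be:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees, from the device to the trigger point."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def arrow_direction(device_heading_deg, lat1, lon1, lat2, lon2):
    """Choose a 'left' or 'right' arrow to steer the user toward the trigger point."""
    # Signed difference in (-180, 180]: positive means the target is clockwise.
    relative = (bearing_deg(lat1, lon1, lat2, lon2) - device_heading_deg + 540) % 360 - 180
    return "right" if relative > 0 else "left"
```

For example, with the device at the origin heading due north, a trigger point due east yields a "right" arrow and one due west yields a "left" arrow.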
[00101] One Viewing Portal screen interface element comprises an Information button. Execution of the Information button leads to the previously described About Us screen being automatically displayed via the touchscreen 26, providing information about the device 10, including tips and instructions for using the device and production credits. [00102] Execution of an options interface element portion of the displayed model, which may be done by the user "clicking" or "tapping" on it for example as depicted in Figures 7B and 7D, results in an Interaction Options screen being automatically displayed via the touchscreen 26. The Interaction Options screen provides a selection of options available to the user for interacting in respect of the project. The interaction may comprise interfacing with social networking platforms, private corporate networks, and general public websites (such as Wikipedia), for example. Additionally, or alternatively, the interaction may comprise accessing information relevant to a selected portion or part of the displayed model. The information may comprise any or all of the associated project details as hereinbefore described. The available options may be displayed on the Viewing Portal screen, via a drop-down list for example as depicted in Figures 7C and 7E, or via a new screen. 
[00103] In the embodiment described, the available options include: accessing links, which may be to, for example, floor plans of a building the subject of the project, an electronic page for submitting an enquiry regarding the project, a webpage or other site associated with a developer of the project, images of window views from different apartment windows of a building the subject of a project, voting options or feedback options about the development, social networks, and/or to request updates from the developer; adding or submitting comments regarding the selected project, and viewing previously provided comments (as shown in Figure 7N, for example); posting material on a social networking service such as TwitterTM or FacebookTM; casting a vote or providing a star rating in respect of the project or an aspect of the project; and viewing statistical information relevant to how the project is being perceived, such as how many "likes" it has received from users (this may be presented in a ticker as shown in Figure 7O, for example). [00104] A list of interface element buttons corresponding to each available option is provided enabling the user to make their selection via the touchscreen 26. Thus, this screen allows the user to select how they would like to interact in respect of the project. In alternative embodiments of the invention, additional or alternative interaction options may be available to choose from, as appropriate to the project. [00105] For example, in the case where the project relates to a commercial/residential building comprising a plurality of apartment properties, the information accessed may relate to a particular one of the apartment properties, selectable via the interface element portion of the displayed model.
[00106] The embodiment of the invention therefore creates a two-way information exchange between content providers and end-users. As will be described in further detail, a system comprising a plurality of the devices 10 is operable to provide a real time representation to end-users of the data inputted or selected by all or a selection of end-users utilising the system. [00107] The user can also manipulate the model displayed by performing gesture actions in respect of the touchscreen 26, including "hovering", "dragging", "pinching", and "spreading" actions, examples of which are depicted in Figures 7F to 7I. [00108] Execution of a hovering action may cause information regarding the status of the project to be displayed. In the above mentioned case where there is a plurality of apartment properties, the status information may comprise whether a particular apartment property being hovered over is sold or available for purchase. If sold, then no additional interaction may be taken in respect of that property via the interface element buttons in the embodiment. [00109] Execution of a dragging action may allow a selection of one of a plurality of models available for the project to be displayed in the location being viewed. Additionally, or alternatively, it may allow a selection of different characteristics or features available for the project to be displayed. In the case of a residential property such as a house, for example, these may comprise different construction elements or materials including available bricks, roof tiles, and skirting paint, allowing for custom design of the house. Once so custom designed, the user is able to capture the custom design and promote it via use of an Image Capture button described in further detail below. [00110] Execution of a pinching action causes the screen to zoom out, whilst execution of a spreading action causes the screen to zoom in. 
[00111] Upon execution of the interface element relating to the option to add comments regarding the selected project, the controller 12 is operable, under control of the interaction app, to automatically display a set of tools via the touchscreen 26. Tools of the set of tools are operable to allow the user to create an amended representation of the displayed model, and may include an executable virtual keyboard allowing the user to input characters representing written language and other symbols and signs for inclusion in a message, and a drawing tool for marking alterations, which may include painting or drawing on top of displayed models. Figure 7P depicts an example of an amended representation of the displayed model. [00112] Another Viewing Portal screen interface element comprises an Image Capture button, which may have the form of an icon representing a camera. Upon execution of the Image Capture button, by the user tapping or pressing it, for example, as illustrated in Figures 7J and 7K, the controller 12 is operable, under control of the interaction app, to record what is being displayed in the Viewing Portal screen at that time (which may include amendments as described above) and automatically display a Captured Image Options screen via the touchscreen 26. The recording may comprise a single still image (if the icon is tapped) or a sequence of still images representing scenes in motion (i.e. video) (if the icon is pressed). The Captured Image Options screen provides a selection of options available to the user in respect of the recording. The available options may be displayed on the Viewing Portal screen, via a drop-down list for example as depicted in Figure 7L (in the case where no amendments have been made to the displayed model, but a person is present in the view of the camera 28), or Figure 7Q (where amendments have been made), or via a new screen. 
[00113] In the embodiment described, the available options include: publishing, by posting or uploading as appropriate, the recorded material on a social or sharing networking service such as FacebookTM, TwitPicTM, InstagramTM and YouTube, for example; saving/storing the recorded material on storage, in a library, database or other suitable file, for example; and sending the recorded material to a remote device 24, such as a Cultural Center screen, which may comprise a large television or other display screen in a designated cultural area. [00114] A list of interface element buttons corresponding to each available option is provided enabling the user to make their selection via the touchscreen 26. Thus, this screen allows the user to further select how they would like to interact in respect of the project. In alternative embodiments of the invention, additional or alternative interaction options may be available to choose from.
[00115] Together, the one or more devices 10, system administration module 25, and system reporting module 27, of the system are operable to gather, collate, manipulate, and present end-user data. [00116] The data may be manipulated to accommodate any database technology. In the embodiment described, the controller 12 is operable, under control of the interaction application, to push the user data to a remote device 24 comprising a server, which is operable to store the data in an appropriate datafile format. The server is further operable to present the data to end users via a web portal, which is operable to draw on the datafiles to display the data. All standard data manipulation processes can be executed from the web portal, including, for example, charts, sums, averages, and standard deviation (std dev) processes as are well known to persons skilled in the art. [00117] In the embodiment, the data can also be exported into a variety of standard formats, including, for example, text, comma-separated values (CSV), and Excel formats, for further analysis in one or more software applications, which may be commercially available and provided by a third party. [00118] In the embodiment, the data which is captured can comprise any parameter passed through the interaction application, including, for example, input from a user or other source, interaction from the user, and visual display data. Any inputs are counted by attaching counter functionality inside the interaction application, which is operable to count the number of times the end user clicks on or otherwise executes any particular item/button/screen/data, or combination thereof. [00119] Via processing of the data, a report can be generated. This may be done via the system reporting module 27. 
The report may be, for example, a statistics report on how many views a particular construction project has had, what links have been clicked, which views were preferred, which colours were selected, how long users spent viewing screen(s), and a number of models viewed, etc. [00120] In the embodiment, the controller 12 is operable, under control of the interaction application, to store, send and receive information including notes, comments, and data within the device 10. Individual account functionality may be provided, allowing one or more users to maintain private or confidential information including notes, pictures, and other data in association with a particular model. For example, a particular developer may use this feature to store data regarding future enhancements or potential customers for a particular apartment. [00121] The system reporting module 27 is operable to facilitate the performance of reporting actions by, for example, project managers. The reporting actions include: preparing and viewing standard graphical reports comprising charts, graphs, trends, and basic raw data; creating custom charts; and exporting raw data. Alternative embodiments of the invention may be operable to facilitate the performance of additional and/or alternative reporting actions. [00122] As hereinbefore described, information made available for presentation via the system reporting module 27 includes statistics on each listing in the embodiment. The statistics may include: how many unique views; how many total views; how many comments; the most popular listing; how many photos were taken; and a tally of clicks for different links. The system reporting module 27 also facilitates the publication or uploading of communications, such as FacebookTM notifications. [00123] The system administration module 25 is operable to facilitate the performance of administration actions by, for example, project managers. 
The administration actions include: managing the upload of new projects; inserting new models, images, and trigger markers into the interaction application for download and use by end users; modifying existing models and trigger markers; and deleting old or redundant models and trigger markers. Alternative embodiments of the invention may be operable to facilitate the performance of additional and/or alternative administration actions. [00124] An account module may also be provided, operable to allow project managers to set up accounts and upload their own 3D models and make them live on the system. [00125] The preceding description regarding variations in implementation of the controller, software, and storage of the device 10 is also applicable to the implementation of system administration module 25, the system reporting module 27, and the account module. [00126] The above and other features and advantages of the embodiment of the invention will now be further described with reference to the device 10 in use.
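The counter functionality described in paragraph [00118] could be sketched as a simple tally keyed by interface item (the class and method names are hypothetical illustrations, not part of the embodiment):

```python
from collections import Counter

class InteractionCounter:
    """Counts end-user executions of items/buttons/screens, as described in
    paragraph [00118], for later presentation via the reporting module."""

    def __init__(self):
        self._counts = Counter()

    def record(self, item: str):
        """Called each time the end user clicks on or executes an item."""
        self._counts[item] += 1

    def report(self) -> dict:
        """Raw tallies, e.g. for the per-listing statistics of paragraph [00122]."""
        return dict(self._counts)
```

A reporting module could then aggregate such tallies across devices into statistics reports.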
[00127] A user installs and executes the interface application of the device 10. [00128] The user then interfaces with the device 10 and provides user instructions via the touchscreen 26, executing the screens as hereinbefore described, to conduct interactions in respect of desired projects. [00129] Figure 8 depicts a flow chart of the process using the device. [00130] In this manner, the device 10 facilitates actions including: 1. displaying life-size 3D models in a relevant physical location by overlaying the model in the touchscreen 26 of the device 10; 2. gathering of end-user data using the 3D models as the information exchange medium. The data gathered includes data input directly by the user via the touchscreen 26 and also via the camera 28, and/or one or more sensors of the set of sensors 30; 3. collating the input data gathered on a continuous basis in a suitable database structure to create a database field relationship between information and its physical/geographical position or orientation. The database may comprise a y-dimension database. In addition to the collating, analyzing and manipulating the end-user analytics to obtain trends and aggregation of preferences; 4. displaying the analytics via the touchscreen 26 of the device 10; and 5. enabling collaborative working on projects by integrating above actions (2), (3), and (4) into a seamless user experience (extendable beyond the use of 3D models in embodiments of the invention). [00131] Together, the one or more devices 10, system administration module 25, and system reporting module 27, of the system are operable to query an x-dimension database by integrating the y-dimension database into the 3D physical space, where x > y. That is, transposing a database of information which can be traversed in the AR 3D realm. They are also operable to aggregate the input data from numerous sources at once for query. 
[00132] The one or more devices 10, system administration module 25, and system reporting module 27 of the system are operable to take a quantity of data and display it overlaying an existing physical object (such as a table). To traverse or interrogate the data, a user moves the device 10 physically closer to the data, revealing further, more in-depth information. The end-user can then manipulate the data via the touchscreen 26 as hereinbefore described. The data is assigned coordinates in 3D physical space, which are then manoeuvred and manipulated by the physical movement of the device 10 relative to the image marker(s), determined using data including orientation data produced via the internal gyroscope (of the orientation sensor 36, calculating the orientation of the device 10) and position data produced via the GPS technology (of the position sensor 40, establishing location coordinates).

[00133] Together, the one or more devices 10, system administration module 25, and system reporting module 27 of the system are additionally operable to create visual representations of queries to display in the AR environment. That is, to create feedback for the end-user which will feed back into the system. Processes are also provided for interfacing with AR software (display algorithms for displaying data).

[00134] In this regard, one or a first user can manipulate data which exists associated with a 3D physical object (laid out on the above-mentioned table), which in turn will influence the data viewed by another or a second user who is interacting with the same data or a derivative of that data. The workspace is an overlay onto 3D space in front of the user.

[00135] An example of use of the invention will now be described with reference to Figure 9.

[00136] In this example, a new construction project contains three buildings. The height, width, orientation, and particular design features (number of doors, windows, sides) of the buildings are open to submission.
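The progressive disclosure described in paragraph [00132], where moving the device 10 physically closer to the overlaid data reveals more in-depth information, can be sketched as a simple distance-to-detail mapping. The function name and the threshold distances are purely illustrative assumptions.

```python
def detail_level(distance_m, thresholds=(10.0, 5.0, 2.0)):
    """Map the device's distance from the displayed data (metres) to a
    level of detail: 0 = overview, higher = more in-depth information.
    The threshold values are illustrative, not taken from the patent."""
    level = 0
    for t in thresholds:
        if distance_m <= t:
            level += 1
    return level
```

As the end-user walks toward the overlaid data, the computed level rises and the AR renderer can swap in progressively richer content for the same coordinates in 3D physical space.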
End users input their preferences in the AR environment using their respective devices 10. Together, the one or more devices 10, system administration module 25, and system reporting module 27 of the system query the dataset to find the aggregate optimal design based on users' preferences, and display that visualisation to the end-users via the touchscreens 26 of their respective devices 10.

[00137] Variables for building design in the example include:
H: Height (numerical)
W: Width (numerical)
P: Position (Front, Middle, Back)

[00138] The system comprises algorithms implemented in software which operate on the input data. In this example, with three data parameters, at least one of the algorithms is operable to create a 3D scatter plot of the various input selections, and calculate an optimal (or average) data range from which to derive an alternative model.

[00139] Example: User1: 10-10-20, User2: 20-20-40, Optimal: 15-15-30.

[00140] In this manner, the system allows adjustments to be accessed and edited to create an overall aggregated display.

[00141] It will be appreciated that the embodiment of the invention provides several advantages.

[00142] The invention addresses the lack of processes to gather information and display it in real-time to the end-user, for them to interact with, creating further interaction, and so on.

[00143] The embodiment of the invention addresses common issues facing many new construction projects which are developed in public spaces. A particular issue in this case is the initial resistance offered by the general public when the construction projects are being designed and built. This resistance has in the past led to delays, which for construction projects may result in monetary and scheduling losses, at least.
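The aggregation in the worked example above (User1: 10-10-20, User2: 20-20-40, Optimal: 15-15-30) amounts to an element-wise average of the submitted preference vectors. This sketch assumes all three parameters are numeric; the function name is illustrative.

```python
def aggregate_optimal(submissions):
    """Element-wise average of the (H, W, P) preference vectors submitted
    by end-users, yielding the aggregate optimal design of the example."""
    n = len(submissions)
    return tuple(sum(values) / n for values in zip(*submissions))
```

With the two submissions from the example, `aggregate_optimal([(10, 10, 20), (20, 20, 40)])` yields the midpoint design `(15, 15, 30)`; more submissions simply pull the average toward the crowd's consensus.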
[00144] The embodiment of the invention allows the general public to access information about the construction designs, and to visualise the designs/relevant information in a physically appropriate environment, being the actual location of the building being interacted with. Furthermore, it provides a two-way information exchange, with the inputs gathered from each end-user visually accessible by each user of the process; e.g. people voting on a desired size of a building will see the building grow/shrink in their visual display units. This type of process could be utilised over a short space of time, e.g. a planning meeting where feedback is solicited from attendees, or over a longer period, e.g. an extensive planning process involving the opinions of an entire city.

[00145] It allows for collaborative design efforts on buildings, by gathering inputs and collating the data from a larger number of sources than is currently possible, which, when visually created in the AR medium, can then be further manipulated and provide more inputs to the system.

[00146] The embodiment of the invention addresses difficulties of managing and accessing many-to-one and many-to-many communication and information exchanges which involve a number of sensory inputs. It does this by aggregating the large amounts of data for analysis into a visual representation, which, when displayed using AR technology, allows the user to process a quantity of input data which could not otherwise be assessed in real-time.

[00147] By creating new interactions between end parties utilising the tactile and visual senses in the AR environment, the invention creates an exchange of information. This information can be used by end-users in a variety of manners.

[00148] The information enables individuals to comprehend the flow and nature of data from other individuals connected to the process, in a visual manner which will facilitate greater understanding.
[00149] The embodiments of the invention enable large-scale gathering of information and the ability to interact with, query, and display data and concepts.

[00150] Embodiments of the invention allow all the information for a project to be in anybody's pocket (via the device 10). Physical material is not required to be delivered to the audience to view models. Neither does the audience need to be in a certain location to view models, with the exception of life-size location models.

[00151] Embodiments of the invention allow for graphically representing the causal relationships between various datasets (user data), by applying algorithms to identify constructive and destructive interference; for example, whether the more people select one option in a dataset, the more likely they are to select a corresponding "unrelated" option in another dataset.

[00152] Embodiments of the invention:
1. allow viewing of life-size building/construction models in their intended physical environments;
2. assist architects/construction firms/designers/planners (stakeholders) to showcase their plans;
3. provide a two-way communication channel between stakeholders and end-users utilising social networks, corporate networks, and physical objects in the public domain;
4. allow for the gathering of end-user generated information and content, using the 3D models as the medium for the communication; and
5. allow for the display of an appropriately manipulated version of the end-user generated information in the visual display of end-users' devices.

[00153] Embodiments of the invention provide advantages in information gathering, assisting with:
1. selection and decision-making processes relating to designs;
2. voting on alternative scenarios by presenting various options in their intended environments;
3. government approvals and planning processes; and
4. comparing various scenarios (by swiping on the screen of the device).
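The constructive/destructive interference analysis of paragraph [00151] can be sketched as a correlation between two per-user selection series. Pearson correlation is an assumed choice here; the patent does not name a specific statistic.

```python
def selection_correlation(xs, ys):
    """Pearson correlation between two datasets of per-user selections.
    A strongly positive value suggests constructive interference (choosing
    one option makes the 'unrelated' option more likely); a strongly
    negative value suggests destructive interference."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    # Guard against a constant series, which has no defined correlation.
    return cov / (sd_x * sd_y) if sd_x and sd_y else 0.0
```

Running this over each pair of datasets would surface the causal relationships to be represented graphically, e.g. a link drawn between two options whose correlation exceeds some threshold.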
[00154] Embodiments of the invention provide advantages in information display, assisting with:
1. presentation of end-user input data which is manipulated to display that information in the end-user's AR experience. The information may comprise how other end-users are interacting with the 3D models (on a component level and/or a holistic level), in regard to features such as, for example, approval of a certain colour for a building, or placement of a window;
2. moving physical designs around and manipulating the 3D model in real-time. This may include making changes to the 3D model and those changes being represented in real time on other end-users' devices;
3. group design projects in real time. This may comprise multiple end-users working with a 3D model at the same time such as, for example, a first person moving the placement of a door whilst a second person is changing colours;
4. providing real-time visualisation of end-user submitted preferences relating to content. This may comprise graphs displaying approvals, changes, and the number of scenarios modelled; and
5. linking to more suitable database forums.

[00155] Embodiments of the invention provide advantages in information provisioning, assisting with:
1. physical visual representation of end-user submitted content relative to a specific construction (building);
2. physical visual representation of stakeholder submitted content relative to a specific construction (building);
3. real-time representation of other end-users' positions and preferences via the AR interface. This may comprise, for example, a first person observing a second person's relative position to the building and their voting preferences;
4. comparing and contrasting numerous physical scenarios simultaneously. This may comprise, for example, different buildings beside each other, and/or various shells (such as walls, windows, roofs, colours); and
5. overlays (advertisements).
[00156] Embodiments of the invention provide advantages in collaboration, assisting with:
1. providing a collaboration forum for end-users to make contact and create workspaces to submit user-generated content. This may take the form of a building design competition, wherein a first person observes a second person submit content relevant to his preferences; and
2. a social networking context where end-users can see each other (and data inputs and/or preferences) in the AR environment.

[00157] It will be appreciated by those skilled in the art that variations and modifications to the invention described herein will be apparent without departing from the spirit and scope thereof. The variations and modifications as would be apparent to persons skilled in the art are deemed to fall within the broad scope and ambit of the invention as herein set forth.

[00158] Throughout the specification and claims, unless the context requires otherwise, the word "comprise" or variations such as "comprises" or "comprising" will be understood to imply the inclusion of a stated integer or group of integers but not the exclusion of any other integer or group of integers.

[00159] Throughout the specification and claims, unless the context requires otherwise, the terms "substantially" and "about" will be understood not to be limited to the value for the range qualified by the terms.

[00160] It will be clearly understood that, if a prior art publication is referred to herein, that reference does not constitute an admission that the publication forms part of the common general knowledge in the art in Australia or in any other country.

[00161] Also, future patent applications may be filed in Australia or overseas on the basis of, or claiming priority from, the present application. It is to be understood that the following provisional claims are provided by way of example only, and are not intended to limit the scope of what may be claimed in any such future application.
Features may be added to or omitted from the provisional claims at a later date so as to further define or re-define the invention or inventions.

Claims (28)

1. A mobile communication device for facilitating interaction between parties regarding a project, the device comprising: a controller; storage storing electronic program instructions for controlling the controller; a display for displaying a user interface; and an input means; wherein the controller is operable, under control of the electronic program instructions, to: receive input via the input means, the input comprising details of a sign associated with the project; recognise the sign, and on the basis of the recognition, retrieve project details of the project, the project details comprising information and/or data associated with the project; and display the retrieved project details via the display.
2. The device according to claim 1, wherein the controller comprises computer processing means.
3. The device according to claim 1 or 2, wherein the display, user interface and input means are integrated.
4. The device according to any one of the preceding claims, wherein the project details comprise at least one representation of the project.
5. The device according to claim 4, wherein the at least one representation comprises a three dimensional (3D) model of the project.
6. The device according to any one of the preceding claims, wherein the sign comprises any letter, word, numeral, device, brand, image, shape, colour, sound, object, article, or thing, or any combination thereof.
7. The device according to claim 6, wherein the sign comprises a trigger code.
8. The device according to any one of the preceding claims, wherein there is a relationship or link between one or more features of the sign and one or more features of the model.
9. The device according to claim 8, wherein the one or more features is size, and the size of the sign directly corresponds with the size of the model being viewed.
10. The device according to any one of the preceding claims, wherein the input means comprises an imaging means.
11. The device according to claim 10, wherein the imaging means comprises a digital camera, and, when the camera is focused on the sign, the controller is operable, under control of the software, to recognise the sign, and on the basis of the recognition, retrieve the model of the project from the project details, and display the model via the display.
12. The device according to claim 11, wherein the model is displayed overlaid on the image viewed by the camera prior to the model being retrieved.
13. The device according to any one of the preceding claims, wherein the input means comprises at least one sensor.
14. The device according to claim 13, wherein the at least one sensor is part of a set of sensors, and individual sensors within the set of sensors comprise at least one of an acceleration sensor, an orientation sensor, a direction sensor, and a position sensor.
15. The device according to any one of the preceding claims, wherein the input comprises user instructions which are input by a user via the input means.
16. The device according to claim 15, wherein the user instructions comprise a command to perform an action in respect of the project, and the controller is operable, under control of the electronic program instructions, to perform the action according to the received user instructions.
17. The device according to claim 16, wherein the action comprises an interaction action including one or more of the following: accessing a source related to or associated with the project; providing comments on the project; viewing comments on the project provided by others; creating material; publishing created material; and casting a vote in respect of the project.
18. The device according to any one of the preceding claims, wherein the controller is operable, under control of the electronic program instructions, to perform an analysis on the basis of the input received, generate a report on the analysis, and present the report via the display.
19. The device according to any one of the preceding claims, wherein the project details are retrieved from the storage of the device, or from storage remote from the device.
20. The device according to any one of the preceding claims, wherein the project details are stored in a database.
21. The device according to any one of the preceding claims, wherein the electronic program instructions comprise software.
22. The device according to claim 21, wherein the device comprises a smartphone having the software installed thereon.
23. The device according to claim 22, wherein the software is provided as a software application downloadable to the smartphone.
24. A method for facilitating interaction between parties regarding a project, the method comprising: storing electronic program instructions for controlling a controller, and information or data; and controlling the controller via the electronic program instructions, to: receive input via an input means, the input comprising details of a sign associated with the project; recognise the sign, and on the basis of the recognition, retrieve project details of the project, the project details comprising information and/or data associated with the project; and display the retrieved project details via the display.
25. A computer-readable storage medium on which is stored instructions that, when executed by a computing means, cause the computing means to perform the method according to claim 24.
26. A computing means programmed to carry out the method according to claim 24.
27. A data signal including at least one instruction being capable of being received and interpreted by a computing system, wherein the instruction implements the method according to claim 24.
28. A system for facilitating interaction between parties regarding a project comprising a mobile communication device according to any one of claims 1 to 23.
Priority Applications (1)
AU2015200346A, priority date 2014-01-24, filed 2015-01-23: Facilitating Interactions

Applications Claiming Priority (3)
AU2014900224A, priority date 2014-01-24: Facilitating Interactions
AU2014900224, priority date 2014-01-24
AU2015200346A, priority date 2014-01-24, filed 2015-01-23: Facilitating Interactions

Publications (1)
AU2015200346A1, published 2015-08-13

Family
ID=53835720

Family Applications (1)
AU2015200346A, priority date 2014-01-24, filed 2015-01-23: Facilitating Interactions (Abandoned)

Country Status (1)
AU: AU2015200346A1 (en)

Legal Events
MK4: Application lapsed under section 142(2)(d), no continuation fee paid for the application