AU2016379448A1 - Computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D body scan data, including shared data access protocols and collaborative data utilisation, and identity verification for 3D environments - Google Patents
Computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D body scan data, including shared data access protocols and collaborative data utilisation, and identity verification for 3D environments
- Publication number
- AU2016379448A1
- Authority
- AU
- Australia
- Prior art keywords
- user
- data
- body scan
- scan data
- service provider
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/12—Acquisition of 3D measurements of objects
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Epidemiology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The disclosure herein relates to computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D body scan data, including shared data access protocols and collaborative data utilisation, and identity verification for 3D environments. Some embodiments have been developed to provide technology that enables a hybrid between automated and expert human analysis of body scan data. Some embodiments have been developed for verifying an online user's identity before allowing them access to online services where verification of a user's identity is important, such as on-line dating or virtual business meetings, and/or allowing them access to digital data including data representative of a virtual 3D representation of the user. Further embodiments have been developed to provide technology that enables collaborative sharing of body scan data, for example in the context of clothing tailoring.
Description
COMPUTER IMPLEMENTED FRAMEWORKS AND METHODOLOGIES CONFIGURED TO ENABLE THE GENERATION, PROCESSING AND MANAGEMENT OF 3D BODY SCAN DATA, INCLUDING SHARED DATA ACCESS PROTOCOLS AND COLLABORATIVE DATA UTILISATION, AND IDENTITY VERIFICATION FOR 3D ENVIRONMENTS
FIELD OF THE INVENTION
[0001] The present invention relates to computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D body scan data, including shared data access protocols and collaborative data utilisation, and identity verification for 3D environments. Some embodiments have been developed to provide technology that enables a hybrid between automated and expert human analysis of body scan data. Some embodiments have been developed for verifying an online user's identity before allowing them access to online services where verification of a user's identity is important, such as on-line dating or virtual business meetings, and/or allowing them access to digital data including data representative of a virtual 3D representation of the user. Further embodiments have been developed to provide technology that enables collaborative sharing of body scan data, for example in the context of clothing tailoring.
BACKGROUND
[0002] Any discussion of the prior art throughout the specification should in no way be considered as an admission that such prior art is widely known or forms part of common general knowledge in the field.
[0003] It is known for users to obtain 3D body scans via public booths. However, there are limits to the usefulness of such data. By way of example, the personal nature of such data limits the extent to which a framework is able to deliver hybridized advice comprising both automated data analysis and expert human analysis (noting that human analysis incorporates various levels of subjectivity which are not able to be provided purely by a computer system). Manual sharing of the data (for example via email or the like) for the purpose of obtaining separate human advice is a time consuming process, and much of the power of complex data is typically lost through such a sharing process.
[0004] Furthermore,
SUMMARY OF THE INVENTION
[0005] One embodiment provides a computer implemented framework configured to enable generation and management of 3D body scan data, the framework including:
[0006] a plurality of scanning units, wherein each scanning unit includes:
[0007] (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data;
[0008] (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and
[0009] (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server;
[0010] a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a historical user body scan data collection for the respective user;
[0011] a shared access management module, wherein the shared access management module is configured to: receive data representative of an instruction to associate a given user with a given advisor account; and in response to the instruction associate that given user with that given advisor account;
[0012] an advisor portal module that is configured to:
[0013] (i) authenticate access in respect of a given advisor account from an advisor terminal;
[0014] (ii) cause rendering at the advisor terminal of a historical progress display interface that provides access to a computer-generated historical analysis of the historical user body scan data collection for each user associated with that advisor account; and
[0015] (iii) provide a user interface object to enable delivery of an electronic message from the advisor account to a selected one or more of the users associated with that advisor account.
[0016] One embodiment provides a computer system configured to enable generation and management of 3D body scan data, the system including:
[0017] a scanning unit management server in communication with a plurality of scanning units, wherein each scanning unit includes:
[0018] (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data;
[0019] (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and
[0020] (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server;
[0021] a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a historical user body scan data collection for the respective user;
[0022] a shared access management module, wherein the shared access management module is configured to: receive data representative of an instruction to associate a given user with a given advisor account; and in response to the instruction associate that given user with that given advisor account;
[0023] an advisor portal module that is configured to:
[0024] (i) authenticate access in respect of a given advisor account from an advisor terminal;
[0025] (ii) cause rendering at the advisor terminal of a historical progress display interface that provides access to a computer-generated historical analysis of the historical user body scan data collection for each user associated with that advisor account; and
[0026] (iii) provide a user interface object to enable delivery of an electronic message from the advisor account to a selected one or more of the users associated with that advisor account.
[0027] One embodiment provides a computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment, the method including:
[0028] operating 3D scanning hardware thereby to cause creation of data representative of a 3D representation of a person;
[0029] collecting one or more identifying biometrics from said person at the place and time of the creation of the 3D representation;
[0030] storing data derived from said one or more identifying biometrics in association with said data representative of said 3D representation;
[0031] receiving, from a client terminal, a request to deliver the data representative of the 3D representation to a virtual environment, wherein said request includes one or more identifying biometrics from a user of said client terminal;
[0032] performing a biometric verification process thereby to verify the identity of said user of said client terminal based on the identifying biometrics in said request and the data derived from said one or more identifying biometrics generated at the place and time and stored in association with said data representative of said 3D representation; and
[0033] in the case that the verification process is successful, delivering the data representative of the virtual 3D representation to the virtual environment.
[0034] One embodiment provides a computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment, the method including:
[0035] obtaining, from a terminal controlled by a user, user biometric data;
[0036] verifying that the user biometric data matches stored biometric data associated with the virtual 3D representation; and
[0037] if the user biometric data matches stored biometric data associated with the virtual 3D representation, allowing delivery of the virtual 3D representation.
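The following is a minimal sketch, in Python, of the verification flow described in the two embodiments above: biometric-derived data is stored alongside the 3D representation at scan time, and the representation is only released when a later request carries matching biometrics. The names (BiometricRecord, enrol_scan, request_avatar), the feature-vector representation of biometrics and the cosine-similarity matcher are illustrative assumptions, not part of the specification.

```python
# Hedged sketch: release a stored 3D representation only on biometric match.
import math
from dataclasses import dataclass

@dataclass
class BiometricRecord:
    user_id: str
    template: list[float]      # feature vector derived from biometrics at scan time
    avatar_blob: bytes         # data representative of the virtual 3D representation

STORE: dict[str, BiometricRecord] = {}

def enrol_scan(user_id: str, template: list[float], avatar_blob: bytes) -> None:
    """Store biometric-derived data in association with the 3D representation."""
    STORE[user_id] = BiometricRecord(user_id, template, avatar_blob)

def _similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def request_avatar(user_id: str, claimed_template: list[float],
                   threshold: float = 0.95) -> bytes | None:
    """Deliver the 3D representation only if the submitted biometrics match."""
    record = STORE.get(user_id)
    if record is None:
        return None
    if _similarity(record.template, claimed_template) >= threshold:
        return record.avatar_blob   # verification succeeded: deliver to virtual environment
    return None                     # verification failed: withhold the representation
```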
[0038] One embodiment provides a computer implemented framework configured to enable accessing, analysis and collaborative utilisation of three-dimensional body scan data, the framework including:
[0039] a plurality of scanning units, wherein each scanning unit includes:
[0040] (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data;
[0041] (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and
[0042] (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server;
[0043] a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user;
[0044] an approval management module, wherein the approval management module is configured to enable a given user to provide user-specified access permissions for that user's repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account;
[0045] a service provider portal, wherein the service provider portal enables a given service provider user to access web-delivered functionalities via a logon process via the service provider user's service provider account; and
[0046] a measurement module, wherein the measurement module enables the service provider portal to provide, via a web interface, to a given service provider user having predefined user-specified access permissions for a specific user's repository of body scan data, a measurement interface that enables generation of user measurements in accordance with a service-provider defined measurement regime.
[0047] One embodiment provides a computer implemented framework configured to provide configurable third party access to data derived from three-dimensional body scan data, the framework including:
[0048] a plurality of scanning units, wherein each scanning unit includes:
[0049] (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data;
[0050] (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and
[0051] (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server;
[0052] a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user;
[0053] an approval management module, wherein the approval management module is configured to enable a given user to grant user-specified access permissions in respect of that user's repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account;
[0054] a service provider portal, wherein the service provider portal is configured to:
[0055] (i) enable a service provider user to configure a customised body attribute monitoring regime, wherein the customised body attribute monitoring regime is configured to provide one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data;
[0056] (ii) enable the service provider user to associate the customised body attribute monitoring regime with a set of users that have granted access permissions in respect of their repositories of body scan data to the service provider; and
[0057] (iii) enable scheduling of execution of a set of automated processes thereby to implement the customised body attribute monitoring regime against the set of users, wherein the set of automated processes include automated processes configured to, for a given one of the users: calculate the one or more data values representative of a change in body shape attributes for that user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data; and deliver, via the service provider terminal, the one or more data values representative of that change.
[0058] One embodiment provides a computer implemented framework configured to provide configurable third party access to data derived from three-dimensional body scan data, the framework including:
[0059] a plurality of scanning units, wherein each scanning unit includes:
[0060] (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data;
[0061] (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and
[0062] (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server;
[0063] a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user;
[0064] an approval management module, wherein the approval management module is configured to enable a given user to grant user-specified access permissions in respect of that user's repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account; and
[0065] a service provider portal, wherein the service provider portal is configured to deliver, to a service provider computer system, one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data, thereby to cause controlling of one or more characteristics of an insurance policy for the user.
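As a brief illustration of the "change in body shape attributes between two point-in-time scans" calculation underpinning the monitoring regimes above, the Python sketch below computes per-attribute deltas between a first and second scan. The ScanRecord structure and the attribute names (waist_cm, chest_cm) are assumptions for illustration only.

```python
# Hedged sketch: change in selected body shape attributes between two scans.
from dataclasses import dataclass
from datetime import date

@dataclass
class ScanRecord:
    taken: date
    attributes: dict[str, float]   # e.g. {"waist_cm": 92.0, "chest_cm": 104.5}

def attribute_changes(first: ScanRecord, second: ScanRecord,
                      monitored: list[str]) -> dict[str, float]:
    """Return second-minus-first deltas for each monitored attribute present in both scans."""
    return {
        name: second.attributes[name] - first.attributes[name]
        for name in monitored
        if name in first.attributes and name in second.attributes
    }

# Example: a regime that monitors waist and chest circumference.
jan = ScanRecord(date(2016, 1, 10), {"waist_cm": 95.0, "chest_cm": 105.0})
jun = ScanRecord(date(2016, 6, 12), {"waist_cm": 91.5, "chest_cm": 104.0})
print(attribute_changes(jan, jun, ["waist_cm", "chest_cm"]))
# {'waist_cm': -3.5, 'chest_cm': -1.0}
```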
[0066] Further embodiments include methods performed by individual components of such a framework, those components, and methods performed collectively by two or more of those components.
[0067] Reference throughout this specification to "one embodiment", "some embodiments" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment", "in some embodiments" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may be. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[0068] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[0069] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
[0070] As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of quality that would be categorised as “exemplary” as an indicator of quality.
[0071] It will be appreciated that any of the features of the methods and systems described herein can be provided independently or any combination with each other.
[0072] Furthermore, it will be appreciated that the methods and systems described herein can provide numerous advantages, including, but not limited to, providing a system and method which can accurately measure a physical body and provide an accurate, anatomically realistic representation of the physical body, typically in an image form. The image of the physical body and associated generated data (for example scanned information or measurement data) can be used to provide certain recommendations to the user. In one example, these include fashion and/or health recommendations, although it will be appreciated that other applications of the system and method described herein also fall within the scope of this document.
BRIEF DESCRIPTION OF THE DRAWINGS
[0073] Embodiments of the technology are further described by way of example with reference to the accompanying drawings.
[0074] FIG. 1A to FIG. 1E illustrate frameworks according to various embodiments.
[0075] FIG. 2A to FIG. 2F illustrate methods according to various embodiments.
[0076] FIG. 3 illustrates a client-server framework leveraged by various embodiments.
[0077] FIG. 4A to FIG. 4C illustrate in-home scanning arrangements according to embodiments.
[0078] FIG. 5A and FIG. 5B illustrate methods according to various embodiments.
[0079] FIG. 6A and FIG. 6B illustrate embodiments of scanning units.
[0080] FIG. 7A to FIG. 7D illustrate screenshots according to an exemplary embodiment.
DETAILED DESCRIPTION
[0081] The disclosure herein relates to computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D body scan data, including shared data access protocols and collaborative data utilisation, and identity verification for 3D environments. Some embodiments have been developed to provide technology that enables a hybrid between automated and expert human analysis of body scan data. Some embodiments have been developed for verifying an online user's identity before allowing them access to online services where verification of a user's identity is important, such as on-line dating or virtual business meetings, and/or allowing them access to digital data including data representative of a virtual 3D representation of the user. Further embodiments have been developed to provide technology that enables collaborative sharing of body scan data, for example in the context of clothing tailoring.
Example Framework
[0082] The technology herein described is in some cases implemented in the context of a framework that provides for distributed collection and utilisation of body scan data, primarily body size and shape information. This data is preferably collected via distributed hardware devices, which may include autonomous scanning booths, in-home (e.g. portable) scanning devices, and/or other scanning hardware (including image-based, stereoscopic, microwave, infrared, and other scanning technologies).
[0083] In some embodiments a scanning booth includes user interface components which implement a predefined logical process thereby to guide a user though a scanning procedure. Body scan data is then uploaded to a central server, and via this server is made available to one or more third party platforms, such as websites and software applications (for example using APIs, widgets, and the like). This enables the third party platforms to implement functionalities which leverage body scan data. Examples of such functionalities include selection of appropriately sized clothing, monitoring of health and fitness, rating/ranking, competitions, and so on.
[0084] FIG. 1A illustrates a framework according to one embodiment. Exemplary implementations of various components within this framework are described in more detail further below.
[0085] FIG. 1A centres around a body scan data management server 110. This server may, in practical embodiments, be defined by one or more individual computing devices, optionally distributed over a number of physical locations. Server 110 is configured to communicate with:
• User computing terminals 100, which may include the likes of personal computers, notebooks, smartphones, tablets, gaming consoles, and the like. For example, these computing terminals execute respective web browser applications, which enable local rendering of user interface components provided by a user interface component module 111 of server 110. These user interface components provide users with access to functionalities native to server 110, which preferably include account management (for example registration of a new account, and modification of existing account details, with user account data being maintained in a repository 112) and in some cases scan management (for example modification of avatars, deletion of scan data, and so on).
• Scanning booths 120, which may include user-driven autonomous booths (such as those described below) and in some embodiments other scanning hardware (such as the in-home scanning hardware described further below). Scanning booth interaction modules 113 are responsible for enabling interaction between scanning booths 120 and server 110. This may include user account data management (for example where a user is enabled to register and/or log in via a user interface provided at a scanning booth terminal), terminal maintenance (for example monitoring, downloading of software patches/updates), serving of advertising and/or promotional content, and so on.
• Third party platforms, which may include the likes of websites and proprietary software applications (including, but not limited to, mobile apps). Third party integration modules allow server 110 to communicate with such third party platforms via a plurality of technological approaches. These may include widget-based approaches (where code served by server 110 is embedded within a web page provided by one of the third party platforms and rendered at a given one of computing terminals 100), API-based approaches (whereby a third party platform communicates and interacts with server 110 via a predefined communications protocol), and other approaches.
[0086] In the example of FIG. 1A, a particular category of third party platforms is illustrated: service provider systems 130. These, as discussed in more detail further below, are computer systems which enable generation of user-specific measurements for users having body scan data 114. Service provider systems in some cases communicate directly with user terminals 100, for example by providing respective websites that are accessed via user client terminals, and enable the users to provide credentials enabling the service provider system to access data relating to specific users via server 110. For example, user credentials submitted via a system 130 enable authentication of a user at server 110, and enable server 110 to deliver particular aspects of user data (including aspects of body scan data) to the service provider system.
[0087] Further examples of scanning systems and associated technology are disclosed in PCT/AU2014/000514, PCT/AU2015/000278 and PCT/AU2015/000719, all of which are incorporated herein by cross reference.
Autonomous User-Driven Scanning Overview
[0088] Embodiments described herein are primarily focussed on arrangements whereby scanning booths provide autonomous user-driven scanning. This means that a scanning booth provides user interface and user stimuli components which implement a logical process thereby to guide a user through a body-scanning procedure without intervention by a second human user. That is, a user is enabled to approach a booth, and have a user interface guide them through an entire scanning process, from login (or registration in the context of a non-registered user) through to scan completion (and in some embodiments avatar approval).
[0089] In general terms, a scanning booth configured to provide autonomous user-driven scanning includes the following components:
• A user interface configured to enable a user to identify themselves to the booth. This may include either or both of local registration (i.e. provision of personal information and the like thereby to create a new user account) and user login. A user login may include providing user credentials, such as a username and password, defined subject to a previous local registration or a previous remote registration (using a terminal 100 in communication with server 110). Various technological means for user identification may be used, including the likes of NFC devices, biometrics, electronic device recognition, and so on.
• A user interface and associated stimuli devices (for example visual and/or audible stimuli devices) configured to enable delivery of user instructions, thereby to enable a scan. These instructions include (i) preparation (for example clothing removal), (ii) stance and posture (for example positioning relative to defined feet positions and body position, preferably assisted by way of visual stimuli and automated feedback), and other such instructions. This allows automated scanning hardware (preferably in the form of infrared sensors) to collect body scan data from a body that is in a predefined desired stance and position. It will be appreciated that this greatly assists in analysis of collected measurements.
• Scanning components, such as infrared sensors, which are configured to determine body scan measurements. These measurements are used thereby to facilitate downstream functionalities, for example avatar generation.
• A user interface which guides a user through avatar generation and approval. Following approval, body scan data is transmitted to server 110 thereby to be available for downstream use.
[0090] The user interfaces described above may be delivered by one or more screens, driven by one or more computing terminals.
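To make the guided, stepwise nature of the booth procedure concrete, the following Python sketch models the logical process of paragraph [0089] as a simple ordered sequence of steps. The step names and callback signatures are assumptions; a real booth would drive screens, audio prompts and sensor checks at each stage.

```python
# Hedged sketch: an autonomous user-driven scanning session as an ordered sequence.
import time
from typing import Callable

def run_booth_session(identify: Callable[[], str],
                      confirm_ready: Callable[[str], bool],
                      check_pose: Callable[[], bool],
                      capture_scan: Callable[[], bytes],
                      approve_avatar: Callable[[bytes], bool],
                      upload: Callable[[str, bytes], None]) -> bool:
    """Guide one user through login, posing, scanning and avatar approval."""
    user_id = identify()                  # login or local registration
    if not confirm_ready(user_id):        # preparation instructions (e.g. clothing removal)
        return False
    while not check_pose():               # stance/posture feedback loop
        time.sleep(0.5)
    scan = capture_scan()                 # scanning hardware collects body scan data
    if not approve_avatar(scan):          # user reviews the generated avatar
        return False
    upload(user_id, scan)                 # transmit to the body scan management server
    return True

# Example wiring with trivial stand-ins:
ok = run_booth_session(lambda: "user-1", lambda uid: True, lambda: True,
                       lambda: b"scan-bytes", lambda s: True,
                       lambda uid, s: None)
print(ok)  # True
```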
In-Home Scanning Hardware Arrangements
[0091] FIG. 1A relates to an arrangement whereby users visit scanning booths 120 thereby to have respective sets of body scan data defined and made available via server 110. For example, such scanning booths are preferably provided in public locations, such as shopping malls and the like.
[0092] In some embodiments, such as that of FIG. 1C, server 110 is configured to additionally interact with private (in-home) scanning hardware arrangements 121. In general terms, arrangements 121 are defined by one or more hardware devices which are able to be installed in home environments, thereby to allow the defining of body scan data in such an environment. In practice, a driving factor is to enable users to undergo body scanning in an environment where they are already accustomed/comfortable being naked. For example, the scanning hardware arrangements are configured to be installed in locations such as bathrooms and bedrooms. This not only assists in providing an enhanced user experience through comfort, but also through efficiency (given that body scan data is able to be obtained far more conveniently, potentially even on a daily basis).
[0093] In general terms, any functionality provided via server 110 is equally able to be provided to users of arrangements 121 and arrangements 120.
[0094] FIG. 4A to FIG. 4C illustrate exemplary in-home scanning hardware arrangements. In overview, each exemplary arrangement includes a plurality of discrete scanning units. Each scanning unit includes:
(i) A body that is configured to be mounted to a surface. For example, in some embodiments this includes a wall mounting formation. Other exemplary mounts include clamps and the like that are configured to be coupled to doors and/or door frames, and mounts that attach to stands (for example tripod-type stands).
(ii) At least one camera device that is configured, upon mounting of the body to the surface, to capture scan data from within a scanning region. For example, camera devices such as infrared scanning cameras, depth sensing cameras, and the like, may be used. In some embodiments the camera device is adjustably mounted (for example on a pivot) thereby to enable adjustment of a location of the scanning region. In practice, the respective camera devices of the multiple scanning units are directed to define a common overlapped scanning zone in which a user is contained for the purpose of body scanning.
[0095] A processing unit is configured to process scan data derived from the plurality of discrete scanning units, thereby to define sets of body scan data. In a preferred embodiment the processing unit obtains scan data from multiple angles (using the multiple discrete scanning units) and uses this to compile the data into a single combined scan model. This in some embodiments includes point cloud data. In some embodiments defining a given set of body scan data includes processing a plurality of time-synchronised sets of scan data from the discrete scanning units. In some embodiments defining a given set of body scan data includes identifying one or more known body poses (for example by applying a video analysis algorithm thereby to determine a time when a known body pose is adopted, and utilising scan data for that time).
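The sketch below illustrates, under stated assumptions, how time-synchronised point clouds from multiple discrete scanning units might be combined into a single model: each unit's points are transformed into a shared coordinate frame using a per-unit calibration matrix and then concatenated. The 4x4 calibration transforms are assumed to come from a prior calibration step; this is not the specification's implementation.

```python
# Hedged sketch: merge per-unit point clouds into one shared scanning-zone frame.
import numpy as np

def merge_point_clouds(clouds: dict[str, np.ndarray],
                       unit_to_world: dict[str, np.ndarray]) -> np.ndarray:
    """clouds maps unit id -> (N, 3) points in that unit's frame;
    unit_to_world maps unit id -> 4x4 homogeneous transform into a shared frame."""
    merged = []
    for unit_id, points in clouds.items():
        transform = unit_to_world[unit_id]
        homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
        world = (transform @ homogeneous.T).T[:, :3]                       # back to (N, 3)
        merged.append(world)
    return np.vstack(merged)

# Example with two units facing each other: the second is rotated 180 degrees
# about Z and offset 2 m along X, so both clouds land in one common zone.
unit_a = np.random.rand(100, 3)
unit_b = np.random.rand(100, 3)
flip = np.array([[-1.0, 0.0, 0.0, 2.0],
                 [ 0.0,-1.0, 0.0, 0.0],
                 [ 0.0, 0.0, 1.0, 0.0],
                 [ 0.0, 0.0, 0.0, 1.0]])
combined = merge_point_clouds({"a": unit_a, "b": unit_b},
                              {"a": np.eye(4), "b": flip})
print(combined.shape)  # (200, 3)
```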
[0096] The processing unit is in some cases provided by one of the scanning units, and in some cases by an alternate device (examples are described below).
[0097] A communications module is configured to upload at least a subset of the sets of body scan data to the body scan data management server. The communications module is in some cases provided by one of the scanning units, and in some cases by an alternate device (examples are described below).
[0098] In some embodiments, the scanning process utilises algorithms that are configured to define three-dimensional body scan data based on scanning of a moving human body. For example, an overall set of data requirements is predefined, this representing body parts, and scan data (for example point cloud data) is collected during body movements until the predefined data requirements are satisfied. This accounts for situations where the scanning zone defined by the discrete camera units has "blind spots", which necessitate movement before a full human body is captured.
[0099] Some embodiments implement “passive scanning”, whereby a user is autonomously scanned without providing a positive scanning request, for example based on a motion sensing trigger provided by one or more of the scanning units. Scan data is only defined in the case that adequate capture occurs during an instance of passive scanning to meet the predefined data requirements. In some cases additional data requirements are implemented, for example a data requirement that discounts data obtained in the event that a user is detected to be wearing clothing.
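A minimal Python sketch of this coverage-accumulation idea follows: frames are collected while the body moves, and a scan is only defined once every required body region has been captured, with clothed frames discounted. The region names, the per-frame coverage input and the clothing flag are assumptions made for illustration.

```python
# Hedged sketch: accumulate passive-scan frames until predefined data requirements are met.
REQUIRED_REGIONS = {"head", "torso_front", "torso_back",
                    "left_arm", "right_arm", "left_leg", "right_leg"}

class PassiveScanAccumulator:
    def __init__(self) -> None:
        self.covered: set[str] = set()
        self.frames: list[bytes] = []

    def add_frame(self, frame: bytes, regions_in_frame: set[str],
                  clothed: bool) -> bool:
        """Accumulate a frame; returns True once the data requirements are met.
        Frames where clothing is detected are discounted entirely."""
        if clothed:
            return False
        self.frames.append(frame)
        self.covered |= regions_in_frame
        return REQUIRED_REGIONS <= self.covered

# Example: frames trickle in as the user moves through the scanning zone.
acc = PassiveScanAccumulator()
acc.add_frame(b"...", {"head", "torso_front"}, clothed=False)
done = acc.add_frame(b"...", {"torso_back", "left_arm", "right_arm",
                              "left_leg", "right_leg"}, clothed=False)
print(done)  # True: requirements satisfied, a set of body scan data can be defined
```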
[00101] In some embodiments an active scanning approach is used, whereby a user provides a command to scan (for example by interaction with a user interface device, by pressing a button on a scanning unit, or by adopting a specific pose in the scanning zone). In some such embodiments a user adopts and holds one or more predefined poses thereby to assist an effective body scan.
[00102] In the example of FIG. 4A, two scanning units 400 and 400' are mounted thereby to define a scanning zone 450. Units 400 and 400' each include a microprocessor 401 coupled to a memory module 402 and a communications module 403 (for example WiFi or another wireless communications protocol). These components configure scanning hardware 404 to capture data, in the form of 3D scan data, for objects contained in scanning zone 450. Units 400 and 400' provide scanning data to a user PC 420, which executes a software application that is configured to process the scan data thereby to define body scan data for upload to server 110.
[00102] In the example of FIG. 4B, PC 420 is replaced by a scanning unit controller 430, which provides processing and upload capabilities, and a user interface device 440, which may be a smartphone or the like. A user operates the user interface device thereby to perform configuration operations for controller 430 and scanning units 400 and 400’. For example, these include operations relevant to scanning unit discovery, scanning hardware calibration/configuration, and so on. The user interface device is also configured to interact with server 110. For example, a proprietary app executes on device 440 which enables both interaction with the scanning units and access to functionalities provided by server 110.
[00103] In the example of FIG. 4C, functionalities of scanning unit controller 430 are taken on by one of the scanning units, which operates as a master scanning unit 400M. The other scanning unit operates as a slave scanning unit 400S. In some embodiments master and slave devices are different from a hardware perspective. For example: greater processing power may be provided at the master; slave-master communications may be via a first communications technology (such as Bluetooth) whereas the master has WiFi for upload functionality; the master may include additional user interface components; and so on. In other embodiments all scanning units include the same hardware, and one is configured to adopt a master role.
[00104] In some embodiments a component (such as the processing component) is configured to identify a user, such that a given set of body scan data is associated with the identified user. For example, that component is configured to compare a new set of body scan data with a plurality of sets of pre-existing body scan data that are associated with respective known users, thereby to determine whether the new set of body scan data represents one of the known users. Or, more generally, known users are autonomously identified based on visual or other characteristics able to be identified in body scan data (even in spite of minor body attribute variations).
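As an illustration of comparing a new scan against known users' prior scans, the sketch below matches a simple measurement vector to the nearest known user within a tolerance. The measurement set, distance metric and tolerance are illustrative assumptions; a production matcher would use richer features than three measurements.

```python
# Hedged sketch: identify a known user from a new scan's measurement vector.
import math

def _distance(a: dict[str, float], b: dict[str, float]) -> float:
    keys = a.keys() & b.keys()
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in keys))

def identify_user(new_scan: dict[str, float],
                  known: dict[str, dict[str, float]],
                  tolerance: float = 3.0) -> str | None:
    """Return the closest known user id if within tolerance, else None (unknown user)."""
    best_id, best_dist = None, float("inf")
    for user_id, reference in known.items():
        d = _distance(new_scan, reference)
        if d < best_dist:
            best_id, best_dist = user_id, d
    return best_id if best_dist <= tolerance else None

known_users = {
    "alice": {"height_cm": 168.0, "waist_cm": 74.0, "chest_cm": 90.0},
    "bob":   {"height_cm": 183.0, "waist_cm": 92.0, "chest_cm": 104.0},
}
print(identify_user({"height_cm": 182.5, "waist_cm": 91.0, "chest_cm": 103.5}, known_users))
# 'bob' (matched despite minor body attribute variation)
```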
[00105] In some embodiments at least one of the discrete scanning units is configured to perform a monitoring process, and initiate a body scanning process in response to an output of the monitoring process. For example, this may be a motion tracking process, or a process that identifies prescribed motions (for example a user adopting a specific body pose). This enables automated triggering of a body scanning process. However, in some embodiments a body scanning process is initiated in response to user input.
Accessing, Analysis and Collaborative Utilisation of Three-Dimensional Body Scan Data
[00106] As noted, in the example of FIG. 1A, a particular category of third party platforms is illustrated: service provider systems 130. These facilitate accessing, analysis and collaborative utilisation of three-dimensional body scan data. In examples described below, such functions are focussed on enabling a service provider to perform measurements in accordance with its own respective measurement regimes, based on user body scan data.
[00107] The framework of FIG. 1A includes a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server. Body scan management server 110 is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user in repository 114.
[00108] An access approval management module 116 is configured to enable a given user to provide user-specified access permissions for that user's repository of body scan data to one or more designated service providers (with each service provider associated with a respective service provider account defined for server 110). This is optionally achieved via an interaction between a user and server 110, or via an interaction between a user and a service provider system 130. For instance, in relation to the latter, a given service provider system 130 is authorised to modify user data relating to access management on behalf of a user, in the case that the relevant user provides authenticated credentials recognised by server 110 via the relevant system 130. In some embodiments there are multiple access levels available. For the present purposes, we assume that each system 130 is provided with an access level which enables adequate access to a user's body scan data to enable the functionalities described.
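A minimal sketch of this permission model follows: users grant per-provider access levels over their body scan repository, and the server checks the grant before releasing data. The access level names and the in-memory grant table are assumptions made for illustration, not the specification's implementation.

```python
# Hedged sketch: user-specified access permissions checked before serving scan data.
from enum import IntEnum

class AccessLevel(IntEnum):
    NONE = 0
    MEASUREMENTS_ONLY = 1     # numeric measurements, no body shape rendering
    FULL_SCAN = 2             # full body scan data / renderings

# (user_id, provider_id) -> granted level
GRANTS: dict[tuple[str, str], AccessLevel] = {}

def grant_access(user_id: str, provider_id: str, level: AccessLevel) -> None:
    """Record a user-specified access permission for a designated service provider."""
    GRANTS[(user_id, provider_id)] = level

def can_access(user_id: str, provider_id: str, required: AccessLevel) -> bool:
    """Server-side check performed before serving body scan data to a provider."""
    return GRANTS.get((user_id, provider_id), AccessLevel.NONE) >= required

grant_access("user-42", "tailor-co", AccessLevel.MEASUREMENTS_ONLY)
print(can_access("user-42", "tailor-co", AccessLevel.MEASUREMENTS_ONLY))  # True
print(can_access("user-42", "tailor-co", AccessLevel.FULL_SCAN))          # False
```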
[00109] A service provider portal 117 is configured to enable a given service provider user to access web-delivered functionalities via a logon process via the service provider user’s service provider account. In various examples described below, this enables a user of a service provider system to view user interface objects generated based on processing of user body scan data.
[00110] A measurement module 115 is configured to enable the service provider portal to provide, via a web interface, to a given service provider user having predefined user-specified access permissions for a specific user’s repository of body scan data, a measurement interface that enables generation of user measurements in accordance with a service-provider defined measurement regime. Examples of how such measurements are generated are provided in FIG. 2A and FIG. 2B.
[00111] In the example of FIG. 2A, the service-provider defined measurement regime is defined by a service provider user via interaction with a graphical user interface element. This graphical user interface element includes a rendering of at least a portion of a human body (see block 201). In the example of FIG. 2A, the rendering is a 3D generic body model, provided in a user interface component that enables manipulation in three dimensions and other such functions (a wide range of 3D object rendering and manipulation models are well known in the art). In other embodiments alternate renderings are used, including two-dimensional renderings.
[00112] The graphical user interface object also enables user-selection of measurement points. For example, this is achieved by placing a virtual marker (for example via a click, drag-and-drop, or other GUI operation) at desired points (see block 202). In this regard the service-provider defined measurement regime includes data representative of a plurality of measurement instructions, wherein each measurement instruction is defined by a pair of measurement end points defined relative to the generic body model. The user interface object enables defining of those pairs of measurement end points. For instance, in some cases the user interface provides an option to "add new measurement location", and the user graphically adds end points.
[00113] Subsequently, a service provider user wishes to apply the measurement instructions defined on the generic body model to a specific human user (via that specific user's body scan data). In this regard, as shown by block 203, the generation of user measurements in accordance with a service-provider defined measurement regime includes applying a transformation thereby to translate the end points defined relative to the generic body model to user-specific end points defined relative to a body model derived from the specific user's body scan data. This transformation is configured to determine how a point on the generic body model moves inward or outward to reach a point on a surface defined by the user-specific body model. Measurements of point-to-point distance along the user model body surface are then able to be performed.
[00114] This use of a generic body model and transformations allows a service provider to define one set of measurement instructions (for example based on standard measurements they would take from a human via a conventional in-person measurement process), and have those automatically applied to any user having body scan data available via server 110. For example, this is especially useful for tailoring of clothes, where service providers have predefined "patterns" which define garment portions, and those patterns are sized based on particular user measurements (i.e. the service provider can define measurement instructions for the measurements required for that service provider's patterns).
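The Python sketch below illustrates one way the translation and surface-based measurement could work, under explicit assumptions: the user-specific model is registered to the generic template (same vertex count and ordering), so an end point stored as a vertex index on the generic model maps directly to the same index on the user model, and the along-surface distance is approximated by a shortest path over mesh edges. This is a hedged illustration, not the patent's transformation.

```python
# Hedged sketch: carry generic-model end points onto a registered user mesh and
# approximate the along-surface distance with a shortest edge path (Dijkstra).
import heapq
import numpy as np

def edge_graph(faces: np.ndarray, vertices: np.ndarray) -> dict[int, list[tuple[int, float]]]:
    """Adjacency list over mesh edges, weighted by Euclidean edge length."""
    graph: dict[int, list[tuple[int, float]]] = {}
    for tri in faces:
        for i in range(3):
            a, b = int(tri[i]), int(tri[(i + 1) % 3])
            w = float(np.linalg.norm(vertices[a] - vertices[b]))
            graph.setdefault(a, []).append((b, w))
            graph.setdefault(b, []).append((a, w))
    return graph

def surface_distance(vertices: np.ndarray, faces: np.ndarray, start: int, end: int) -> float:
    """Approximate along-surface distance between two vertices (shortest edge path)."""
    graph = edge_graph(faces, vertices)
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, v = heapq.heappop(queue)
        if v == end:
            return d
        if d > dist.get(v, float("inf")):
            continue
        for nxt, w in graph.get(v, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt))
    return float("inf")

# A measurement instruction defined on the generic template as a pair of vertex
# indices carries over unchanged to the registered user-specific mesh:
generic_instruction = (1, 3)   # hypothetical end point pair
user_vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
user_faces = np.array([[0, 1, 2], [0, 2, 3]])
print(surface_distance(user_vertices, user_faces, *generic_instruction))  # 2.0 (along edges)
```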
[00115] In some embodiments there are multiple generic body models covering different approximate body types, for example male and female. Additionally/alternately, a service provider defines different measurement instructions for different body types (again for example male and female).
[00116] In some embodiments, there is an additional micro-adjustment process at block 204. Here, the measurement module enables display to the user of a rendering of at least a portion of a body model based on the specific user's body scan data, the rendering showing locations of the translated measurement end points. The measurement module also enables user-controlled manipulation of the translated measurement end points relative to the rendered body model, thereby to enable manual micro-adjustment to the translated measurement end points. This allows fine tuning of the automated transformations.
[00117] In some embodiments the micro adjustment process is provided for some users and not others, based on the level of access permissions they provide to service providers. It will be appreciated that, omitting step 204, it is possible for the service provider user to obtain user-specific measurements as numbers (metric or imperial) without ever having to see the user’s body shape.
[00118] Measurements are generated at block 205, using point-to-point distances following a surface defined by the user-specific 3D body model between end point pairs associated with each measurement instruction.
[00119] In the example of FIG. 2B, the service-provider defined measurement regime is defined by a service provider user via interaction with a graphical user interface element, again being a graphical user interface element that includes a rendering of at least a portion of a human body (see block 211). However, in the example of FIG. 2B, the rendering is a 3D body model defined from a specific user’s body scan data. Again, this is provided in a user interface component that enables manipulation in 3 dimensions.
[00120] The graphical user interface object also enables user-selection of measurement points on the rendering of the specific person's body. For example, this is achieved by placing a virtual marker (for example via a click, drag-and-drop, or other GUI operation) at desired points (see block 212). In this regard the service-provider defined measurement regime includes data representative of a plurality of measurement instructions, wherein each measurement instruction is defined by a pair of measurement end points defined relative to that specific user's body model. The user interface object enables defining of those pairs of measurement end points. For instance, in some cases the user interface provides an option to "add new measurement location", and the user graphically adds end points. Measurement generation then occurs at 213, using a point-to-point measurement along a surface defined by the body model.
Example Measurement Approval Process
[00121] In some embodiments, a measurement approval module is provided thereby to enable approval of a service provider's measurements by a user (for example prior to the tailoring of garments). The measurement approval module is configured to: (i) make available, to the specific user, data representative of the generated user measurements; (ii) enable the specific user to provide input representative of approval or rejection of the generated user measurements; and (iii) make available, to the given service provider user, data representative of the approval or the rejection.
[00122] There are multiple technologies that may be used to enable such an approval process. Examples include:
• A hyperlink to a service provider webpage, that webpage including graphical objects that provide the data representative of the generated user measurements. The link may be communicated by email or other means.
• A hyperlink to a webpage associated with server 110, that webpage including graphical objects that provide the data representative of the generated user measurements. The link may be communicated by email or other means.
• Requiring a user to log in to a website provided via server 110 and/or service provider 130, from which a web page providing the data representative of the generated user measurements is made available.
• Sending an email (or other message) that includes an attached file or embedded code to enable rendering of the data representative of the generated user measurements.
[00123] In some embodiments, the data representative of the generated user measurements includes numerical measurements. However, it will be appreciated that graphical approaches are more useful. In that regard, in some embodiments the data representative of the generated user measurements includes data that enables rendering of at least a portion of a body model (derived from the specific user’s body scan data) showing locations of the translated measurement end points, optionally along with a visible line along which the measurement is made.
[00124] In some embodiments the data representative of the generated user measurements includes data that enables a rendering of image data providing a representation of garment fit relative to the user, based upon a virtual garment generation process which translates a predefined generic virtual garment model based on the generated user measurements. For example, in some embodiments an interface is provided which enables a service provider to define how an item of clothing is constructed from patterns, the patterns being cut based upon the measurements (for example Pattern X has a Dimension Y that is calculated based on Measurement Z). This allows a virtual garment generation module to define, in three dimensions, a garment that is able to be superimposed on a rendering of a 3D body scan. It will be appreciated, in this regard, that garments are in almost all cases defined by a set of 2D fabric pieces that are stitched together to create 3D garments.
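The "Pattern X has a Dimension Y calculated based on Measurement Z" idea can be sketched as a small table of formulas over the generated user measurements, as below. The pattern piece, dimension names and ease/hem allowances are illustrative assumptions only.

```python
# Hedged sketch: size a pattern piece's dimensions from user-specific measurements.
from typing import Callable

# Dimension name -> function of the measurement dictionary (all values in cm).
SHIRT_FRONT_PANEL: dict[str, Callable[[dict[str, float]], float]] = {
    "panel_width":  lambda m: m["chest_cm"] / 4 + 2.0,    # quarter chest plus ease
    "panel_length": lambda m: m["back_length_cm"] + 3.0,  # back length plus hem allowance
}

def size_pattern(rules: dict[str, Callable[[dict[str, float]], float]],
                 measurements: dict[str, float]) -> dict[str, float]:
    """Cut dimensions for one pattern piece from the generated user measurements."""
    return {name: round(rule(measurements), 1) for name, rule in rules.items()}

user_measurements = {"chest_cm": 100.0, "back_length_cm": 74.0}
print(size_pattern(SHIRT_FRONT_PANEL, user_measurements))
# {'panel_width': 27.0, 'panel_length': 77.0}
```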
[00125] In some embodiments, alternate graphics representative of user-to-garment size are provided, for example images showing the location of hems relative to hands/feet, length relative to torso, and so on. In any event, it will be appreciated that the provision of 2D and/or 3D user-to-garment size comparisons assists a user in making a decision to approve or reject (and, in the case of rejection, request modification of) proposed measurements that will be used as the basis for constructing a physical garment.
[00126] It should be appreciated that, although the example of garment creation is used as a focus, measurements of human bodies based on body scan data are in further embodiments used for a range of alternate purposes.
Framework with Data Sharing Protocol
[00127] FIG. 1C illustrates a further exemplary framework. In this embodiment, server 110 operates in combination with a shared data management platform 190. As context, platform 190 may be physically defined by one or more computing devices which are collectively configured to deliver software functionalities that allow for: (i) the delivery of user interface data to user computing terminals 100, for example via locally executing apps and/or web browser arrangements; and (ii) the provision of functionalities described by reference to components 193 to 198. In some embodiments some or all functionalities provided by platform 190 are integrated within server 110.
[00128] Interaction between server 110 and platform 190 provides a computer implemented framework configured to enable generation and management of 3D body scan data, including provision of a data sharing protocol. In overview, this sharing protocol enables users who interact with scanning booths to selectively provide access to their body scan data to one or more specified advisor users (with each advisor user being associated with advisor account data, for example including a unique advisor identifier). This allows each advisor user to view body scan data, and in particular historical body scan data, for users that have granted them access to that data. Accordingly, a given advisor user is able to view automatically processed body scan data, including data representative of progressive results over a historical period, and provide expert human input specifically tailored for a given user. By way of example, advisor users may include the likes of personal trainers, doctors (and other health professionals), nutritionists, and so on.
[00129] As described in previous examples, the framework provides a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to body scan management server 110. Server 110 is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a historical user body scan data collection for the respective user in body scan data 114.
[00130] Server 110 provides shared access management module 191. Module 191 is configured to: receive data representative of an instruction to associate a given user with a given advisor account; and in response to the instruction associate that given user with that given advisor account. For example, a user of a computing terminal 100 logs in to their account with server 110, and identifies a desired advisor account identifier. This is in some embodiments implemented by way of defining advisor-specific hyperlinks: an advisor user is enabled to send (for example via email) a personalised hyperlink to a user, with that hyperlink being configured to cause navigation by the user to a webpage that enables the user to provide a request to grant access to the relevant advisor user. In some cases that hyperlink causes navigation of intermediary pages/processes that result in user identification (for example via cookies or manual credential input) or a sign-up process for a user who does not have an account in user account data 112.
[00131] The associating of a given user with a given advisor account may occur at either or both of server 110 (for example by updating user account data 112, such that a request associated with a given advisor account is able to be processed at server 110 thereby to determine whether the relevant user account is associated with a shared access permission for the relevant advisor account) and platform 190 (for example advisor user data 193 is updated with data representative of access permissions for users defined in user account data 112 in respect of their body scan data 114).
[00132] Server 110 additionally provides a shared access platform integration module 192, which is configured to communicate with a shared data access module 194 of platform 190, thereby to enable communication and sharing of data between server 110 and platform 190. It will be appreciated that modules 192 and 194 are redundant in some embodiments where server 110 and platform 190 are integrated.
[00133] An advisor portal module 198 is configured to facilitate the delivery of user interface data in relation to platform 190 to computing terminals 100 (via either or both of an app-based arrangement and a web browser arrangement). Portal module 198 is configured to authenticate access in respect of a given advisor account from an advisor terminal (from a practical perspective any computing terminal 100 may function as an advisor terminal; it is a matter of the user having an advisor account and in some cases the user having specific software installed).
[00134] Module 198 is, in some embodiments, configured to enable delivery of user interface data to computing terminals thereby to enable rendering of a historical progress display interface that provides access to a computer-generated historical analysis of the historical user body scan data collection for each user associated with that advisor account. This, and other forms of data that are rendered in some embodiments, are described in more detail below.
[00135] In some embodiments, the user interface data includes client data. This includes, for each advisor user, data representing those users who are “clients” of the advisor user. This includes some or all users in respect of which shared data access is granted in respect of the advisor account. For example, in some embodiments a user places a request to be accepted as a client, and an advisor user reviews and selectively approves that request via the user interface thereby to selectively confirm the requesting user as a client.
[00136] In some embodiments, the user interface data includes a client progress dashboard. The client progress dashboard includes data representative of the identity of each client, and associated data representative of progress over a defined period (for example progress in terms of one or more attributes of body transformation). In a preferred embodiment the dashboard additionally indicates when each client last had a body scan performed, and provides alert notifications (for example where a given client has new scan data which has not been reviewed by the advisor user, and/or where a given user is overdue for a body scan). An example of a client progress dashboard is shown in FIG. 7A, which shows a single user male.demo@mport.com and progress in terms of weight and fat%.
[00137] In some embodiments, the user interface data includes client management tools. For example, as shown in FIG. 7A, these may include an input object that enables inputting of an email address for a proposed client, causing delivery of an electronic notification to that client (for instance an email with a hyperlink as discussed above, or a message delivered directly to an internal messaging inbox provided via server 110). The client management tools may also, again as shown in FIG. 7A, include an interface object configured to manage pending requests from users for client status.
[00138] In some embodiments, the user interface data includes a client-specific dashboard interface. This is rendered on a per-client basis, thereby to enable the advisor user to both (i) view summary progress information for his/her clients as a group (via the client dashboard); and (ii) view detailed progress information for each individual client. The detailed progress information is extracted from body scan data 114 and pre-processed by scan data processing modules 195 to transform it into a form for rendering at an advisor terminal. The detailed progress information is preferably representative of the following client attributes: weight; bust; upper waist; hip; thighs (left and right); calves (left and right); and knees (left and right). An example is shown in FIG. 7B.
[00139] In some embodiments, the detailed progress includes health metrics. The health metrics include values that are derived from the body scan data via the application of predefined algorithms which convert body scan data into health metric data. The health metric data includes some or all of: BMI; body fat percentage; fat mass in kg; waist to hip ratio; and lean muscle mass. It will be appreciated that there are technical challenges associated with calculating some of these metrics from body scan data. A preferred embodiment makes use of a repository of benchmarking data, whereby particular aspects of body shape are associated with algorithm input parameters. For example, the benchmarking data is configured to match a given body region shape (for example an upper arm) with a known benchmark, wherein health metric data (such as body fat) is known for that benchmark. This in some embodiments includes performing a size transformation between the user body scan data and the benchmark data (to account for different body dimensions, but where there are similar shapes). Such an approach enables convenient automated estimation of whether a particular body shape represents benchmark types such as: high muscle, low fat; high muscle, medium fat; high muscle, high fat; medium muscle, low fat; medium muscle, medium fat; medium muscle, high fat; low muscle, low fat; low muscle, medium fat; low muscle, high fat. This is a representative selection of benchmarks only, and it will be appreciated that the effectiveness of automated health metric calculation increases with the number of comparative benchmarks available to be compared with specific user body scan data. An example of a health metric presentation interface is shown in FIG. 7C.
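The benchmark-matching approach can be illustrated with a minimal sketch. The benchmark entries, girth values, the use of a simple girth-ratio profile as the shape descriptor, and the Euclidean nearest-match rule are all assumptions made for the example; the specification does not prescribe a particular descriptor or matching algorithm.

```python
import math

# Hypothetical benchmark library: each entry pairs a normalised shape profile
# for a body region (girths sampled along an upper arm) with known metrics.
BENCHMARKS = [
    {"label": "high muscle, low fat", "profile": [1.00, 0.92, 0.78], "body_fat_pct": 12.0},
    {"label": "medium muscle, medium fat", "profile": [1.00, 0.95, 0.85], "body_fat_pct": 22.0},
    {"label": "low muscle, high fat", "profile": [1.00, 0.98, 0.93], "body_fat_pct": 32.0},
]


def normalise(girths):
    """Size transformation: express girths relative to the largest measurement so
    that bodies of different overall dimensions but similar shape compare equally."""
    peak = max(girths)
    return [g / peak for g in girths]


def estimate_from_benchmarks(region_girths):
    """Return the nearest benchmark (Euclidean distance between normalised profiles)."""
    target = normalise(region_girths)

    def distance(bench):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(target, bench["profile"])))

    return min(BENCHMARKS, key=distance)


if __name__ == "__main__":
    # Girths (cm) sampled at three heights along an upper arm from scan data.
    match = estimate_from_benchmarks([34.0, 31.5, 26.8])
    print(match["label"], match["body_fat_pct"])
```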
[00140] In some embodiments, the detailed progress includes avatar data. As noted above, body scan data is in some embodiments used to generate body avatars, which are approximately representative of user body size and shape, whilst being devoid of certain aspects of personalising information. That is, the avatar represents a human body having key aspects of shape and size derived from body scan data, but without being recognisable as the user. In this manner, by creating avatars from selected aspects of 3D body scan data from the ground up (and excluding aspects that would be subjectively regarded as invasive or personalising, such as face regions and nether regions) a non-invasive avatar is provided which highlights aspects of body shape and composition which are of relevance to the advisor user. In the example of FIG. 7D, the user interface provides two avatar display objects, each object having a scan data selection interface. This allows side-by-side comparison of avatars at different points in time. The avatar display objects preferably enable 3D manipulation and/or zoom functionalities.
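As a hedged illustration of the de-personalisation idea only: rather than rebuilding an avatar from selected measurements as described above, the simpler sketch below merely removes points from a scan-derived point cloud in assumed "face" and "nether" height bands. The band fractions are illustrative values chosen for the example, not figures from the specification.

```python
import numpy as np


def anonymise_cloud(cloud, head_fraction=0.12, crop_band=(0.40, 0.55)):
    """Drop points in the head region and blank a mid-body band so the result
    keeps overall shape and size without personally identifying detail.
    Fractions are of total body height; both values are illustrative."""
    z = cloud[:, 2]
    top, bottom = z.max(), z.min()
    height = top - bottom
    keep = z < (top - head_fraction * height)              # remove head/face region
    band_lo = bottom + crop_band[0] * height
    band_hi = bottom + crop_band[1] * height
    keep &= ~((z >= band_lo) & (z <= band_hi))             # remove mid-body band
    return cloud[keep]


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    cloud = rng.uniform(low=[-0.3, -0.2, 0.0], high=[0.3, 0.2, 1.8], size=(5000, 3))
    print(len(anonymise_cloud(cloud)) < len(cloud))  # True: identifying regions removed
```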
[00141] In some embodiments multiple avatars are combined (or multiple sets of body scan data combined to create a composite avatar) which graphically represents differences over time. This is discussed in more detail in the section below entitled Tracking Progress against Fitness and Body Transformation Objectives.
[00142] A rules engine 196 is configured to automate generation of notifications and reminders to an advisor user of platform 190. The rules may be updated over time thereby to provide enhanced functionality. In general terms, rules are configured to monitor data, such as data representative of clients’ body scan data (for example availability of new data), deadlines (for example to provide notifications where a predefined period has elapsed since a given client last uploaded body scan data) and so on.
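A minimal sketch of such rule evaluation follows, assuming a simple in-memory client record and a 28-day overdue threshold; both are illustrative assumptions rather than details from the specification.

```python
from datetime import datetime, timedelta

# Hypothetical client records as the rules engine might see them.
clients = [
    {"email": "a@example.invalid", "last_scan": datetime(2016, 1, 2), "reviewed": True},
    {"email": "b@example.invalid", "last_scan": datetime(2016, 3, 1), "reviewed": False},
]


def evaluate_rules(clients, now, overdue_after=timedelta(days=28)):
    """Yield (client, message) notifications according to two simple rules:
    new scan data awaiting review, and no scan within the overdue period."""
    for c in clients:
        if not c["reviewed"]:
            yield c, "New scan data awaiting review"
        if now - c["last_scan"] > overdue_after:
            yield c, "Client is overdue for a body scan"


if __name__ == "__main__":
    for client, msg in evaluate_rules(clients, now=datetime(2016, 3, 10)):
        print(client["email"], "->", msg)
```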
[00143] A messaging interface module 197 enables an advisor user to transmit message data to a selected one or more clients. This may include causing transmission of body scan reminder data to one or more users (for instance in response to a rule-triggered notification that a given client has not had a body scan performed for greater than a threshold time). It may additionally and/or alternatively include causing transmission of a feedback message to one or more users, wherein the feedback message is defined by a user of the advisor terminal (for example feedback providing advice, congratulations on results, and so on).
[00144] It will be appreciated that frameworks as described above are effective in enabling the sharing of body scan data in a manner that facilitates both convenient sharing and effective presentation of data to human advisors, who are then able to provide expert tailored advice to their clients as a result of the operation of the technological platform.
Tracking Progress against Fitness and Body Transformation Objectives [00145] In some embodiments, scanning booths are implemented as a means for assisting persons in tracking their performance against body goals, for example fitness and body transformation objectives. In broad terms, body scan data from different points in time is compared, thereby to enable presentation of graphical information which accurately shows how a person’s body shape has changed.
[00146] Various embodiments provide computer implemented methods for enabling tracking of body shape variations. The methods include authenticating a user Ui, that user Ui being associated with first user record data, wherein the first user data includes a first set of user physical attribute data associated with a time n (PADi,Tn). The set of physical attribute data is derived from a three-dimensional body scanning process. For example, that data is collected and maintained via scanning booths and server 110.
[00147] The methods then include receiving input representative of a further set of first user physical attribute data, the further set being associated with a time n+x (PADi,Tn+x). For example, a user might return to a scanning booth (or visit another scanning booth) two weeks following a previous body scanning session.
[00148] A computer system is configured to perform analysis of a relationship between PADi,Tn and PADi,Tn+x. This analysis may be performed, for example, at server 110, locally at a scanning booth, at a third party server, or across a combination of two or more of those locations. The methods then include providing output configured to enable rendering, at a client device associated with user Ui, a graphical object representative of the relationship between PADi,Tn and PADi,Tn+x. For example, the client device may be a mobile device associated with the user, a computer operated by the user, or a scanning booth at which the user is present (for example the user is provided with the output following scanning).
[00149] In some embodiments each set of three dimensional body scan data includes a three dimensional point cloud, and analysis of relationships between PADi,Tn and PADi,Tn+x is based on comparison of the respective point clouds. For example, comparison of the point clouds includes utilisation of vector mathematics to determine spatial variations between corresponding points.
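For illustration, the following sketch computes per-point displacement vectors and a signed "outward" component between two point clouds. It assumes the clouds are already registered and in point-to-point correspondence, which is an assumption made for the example; the specification does not detail a registration step.

```python
import numpy as np


def point_cloud_variation(cloud_t_n, cloud_t_nx):
    """Per-point displacement and signed radial change between two corresponding
    N x 3 point clouds (assumed registered and in point-to-point correspondence)."""
    displacement = cloud_t_nx - cloud_t_n                      # vector variation per point
    magnitude = np.linalg.norm(displacement, axis=1)           # size of the change
    # Signed change: positive when a point moved outward from the body centroid.
    centroid = cloud_t_n.mean(axis=0)
    outward = cloud_t_n - centroid
    outward /= np.linalg.norm(outward, axis=1, keepdims=True)
    signed = np.einsum("ij,ij->i", displacement, outward)
    return magnitude, signed


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.normal(size=(1000, 3))
    later = base * 1.02                       # simulate a uniform 2% increase in local size
    mag, signed = point_cloud_variation(base, later)
    print(round(float(signed.mean()), 4))     # positive => net outward change
```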
[00150] In some embodiments the graphical object is defined based on an overlay of the point cloud for PADi,Tn+x with respect to the point cloud at PADi,Tn. The graphical object need not be displayed as point cloud data; it is preferably a stylised rendering of a human body shape in three dimensions, preferably with transparent characteristics thereby to enable visualisation of variations between PADi,Tn and PADi,Tn+x.
[00151] The graphical object preferably provides visual indicators representative of variations between PADi,Tn and PADi,Tn+x. For example, the visual indicators include colours. This may be a region of colour in an object which indicates a region of variation between overlaid three dimensional models derived from PADi,Tn and PADi,Tn+x respectively. In some cases a first visual indicator (for example red colouring) is used to identify an increase in localised body size, and a second visual indicator (for example blue colouring) is used to identify a decrease in localised body size. As a more advanced arrangement, in some cases a first visual indicator is used to identify an identified improvement towards a predefined body goal, and a second visual indicator is used to identify regression away from the predefined body goal. The predefined body goal may include fat reduction (or a specific aspect of fat reduction) or muscle gain (or a specific aspect of muscle gain). It will be appreciated that muscle gain will typically involve an increase in local body dimensions, whereas fat loss would typically involve a reduction. In some embodiments an algorithm is configured to autonomously predict whether a given variation is a result of fat loss or muscle gain, allowing for automated categorisation of a variation as being “good” or “bad”.
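A trivial sketch of the indicator mapping follows; the red/blue convention matches the example above, while the tolerance value is an assumption introduced for the example.

```python
def colour_for_variation(signed_change, tolerance=0.002):
    """Map a signed local size change to an indicative display colour:
    red for an increase, blue for a decrease, grey within tolerance."""
    if signed_change > tolerance:
        return (255, 0, 0)      # localised increase in body size
    if signed_change < -tolerance:
        return (0, 0, 255)      # localised decrease in body size
    return (180, 180, 180)      # no meaningful change


if __name__ == "__main__":
    print([colour_for_variation(v) for v in (0.05, -0.03, 0.0)])
```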
[00152] The comparison process may be driven through scanning booths themselves, through server 110, or via a third party platform 140 or shared data management platform 190. In the example of FIG. 1B, a fitness service platform 180 (which may form part of or operate in conjunction with a platform 130 or platform 190) is specifically configured to perform the comparison process (for example via scan data processing module 195, as foreshadowed above). Platform 180 includes a fitness objective data input interface, which enables a user (for example via a user interface rendered at a client device) to input data representing a fitness objective, for example fat reduction or muscle increase. A scan data analysis engine causes the comparison of scan data, either at server 182 or (in some embodiments) by instructing that the processing be performed at server 110. A graphical object generator module 108 is configured to deliver, to a user’s client device, data to enable rendering of a graphical object that shows body shape variation as described above.
[00153] FIG. 2C illustrates a method according to one embodiment. Functional block 221 represents a login/registration process, whereby a user registers to enable tracking of body shape variations. This process provides permissions for the user’s body scan data (maintained by server 110) to be accessed for such a purpose. The user then selects body transformation objectives at 222 (this step is omitted in some cases), for example “fat loss”, “muscle gain”, or in some cases more precise objectives (for example “quadriceps muscle growth”). Functional block 223 represents a process including user triggering of a body comparison for PADi,Tn and PADi,Tn+x. This may result from an explicit instruction, or from an implicit instruction (for example derived from a user obtaining a new body scan at a scanning booth). Functional block 224 represents a process including comparison and analysis of point cloud data, resulting in defining of graphical output. Data to enable rendering of that graphical output is provided at 225.
[00154] FIG. 2D illustrates a method according to a further embodiment. Functional block 231 represents a process including accessing, from server 110, point cloud data for PADi,Tn and PADi,Tn+x. Functional block 232 represents a process including comparing point cloud data for PADi,Tn and PADi,Tn+x. Functional block 233 represents a process including categorising variances (for example size increase/decrease, movement towards/away from objective, etc). Functional block 234 represents a process including defining an output graphic based on the categorisation.
[00155] As noted above, graphical objects generated by such methods are in some embodiments delivered in the context of platform 190.
Body Appearance Forecasting and Progress Tracking Based on Defined Goals and/or Activity Programs [00156] The preceding examples relate to tracking of progress in terms of variations in body shape. In some embodiments, functionality is provided thereby to provide forecasting in relation to body shape. In overview, existing physical attribute data (e.g. derived from body scan data) is processed via one or more transformation algorithms thereby to define forecasted future physical attribute data. The algorithms may be associated with defined goals (for example muscle gain or weight loss), or activity programs (which are associated with anticipated outcomes in terms of muscle gain and/or weight loss).
[00157] An exemplary method is shown in FIG. 2E. Functional block 241 represents a login/registration process. This includes authenticating a user Ui, the user being associated with first user record data, wherein the first user data includes a first set of user physical attribute data associated with a time n (PADi,Tn), wherein the set of physical attribute data is derived from a three-dimensional body scanning process.
[00158] Functional block 242 represents a process including the user selecting body transformation objectives. For example, the forecasting is based on either or both of: (i) a defined goal; or (ii) an activity program. In relation to (i), the forecasting is in some embodiments based upon a body transformation goal, for example in relation to weight loss and/or muscle gain. For instance, this may be percentage-based (such as a 5% reduction in body fat) or the like. In practice, a user provides a request/query associated with a defined goal (for example “how might I look if I lose 3kg”, or “how might I look if I put on more muscle”). In relation to (ii), the forecasting is in some embodiments based upon an activity program, for example a training and/or dietary regime. This is in some cases associated with percentage-based body variations (such as a 5% reduction in body fat, 5% increase in muscles) or the like, which is in some embodiments tailored across specific body regions (for example a given training program might be tailored towards developing certain muscles). In practice, a user selects an activity program, and is shown how they might look over time if they follow that program.
Functional block 243 includes identifying a forecasting protocol based on the user’s selection of transformation objectives. The forecasting protocol is in turn associated with transformations that are applied to physical attribute data thereby to define forecasted physical attribute data at a future time. In this regard, functional block 244 represents a process including processing the physical attribute data associated with time n (PADi,Tn) based on the selected forecasting protocol thereby to define forecasted user physical attribute data for the user associated with a time n+x (PADi,Tn+x). In this embodiment, the physical attribute data is represented as a point cloud, and transformed point cloud data is defined at 245.
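As a hedged illustration of applying a forecasting protocol to point cloud data, the sketch below radially scales an assumed "waist" height band by a monthly factor. The region definition, the percentile band and the scale factor are all assumptions for the example, not transformations prescribed by the specification.

```python
import numpy as np


def forecast_point_cloud(cloud, protocol, months):
    """Apply a simple forecasting protocol to a registered point cloud.

    `protocol` is a hypothetical mapping from body-region label to a monthly
    radial scale factor (e.g. 0.99 for gradual fat loss around the waist);
    for illustration the 'waist' is taken to be the band of heights between
    the 40th and 60th height percentiles."""
    centroid = cloud.mean(axis=0)
    heights = cloud[:, 2]
    waist_band = (heights > np.percentile(heights, 40)) & (heights < np.percentile(heights, 60))
    forecast = cloud.copy()
    factor = protocol.get("waist", 1.0) ** months
    # Scale the waist band radially about the central vertical axis (x, y only).
    forecast[waist_band, :2] = centroid[:2] + (cloud[waist_band, :2] - centroid[:2]) * factor
    return forecast


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cloud = rng.normal(size=(2000, 3))
    slimmer = forecast_point_cloud(cloud, protocol={"waist": 0.99}, months=3)
    print(cloud.shape == slimmer.shape)
```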
[00159] Functional block 246 represents a process including providing output configured to enable rendering, at a client device associated with user Ui, a graphical object representative of PADi,Tn+x.
[00160] In some embodiments the graphical object is representative of the relationship between PADi,Tn and PADi,Tn+x, for example using overlay techniques discussed further above.
[00161] In some embodiments a combination of tracking and forecasting is implemented, such that a user is enabled to compare actual body shape transformation progress against both previous body shape and forecasted data. This in some embodiments enables generation of updated forecasts based on observation that a given user is displaying better/worse than expected rates of progress. In some embodiments, comparison between progress and forecasting is used to enable a user to obtain additional instructional information, for example advice from a personal trainer associated with a given selected activity program.
Exemplary identification verification system using biometric data [00162] In the example of FIG. 1E, a particular category of third party platforms is illustrated: virtual environment host systems 131. These are systems that provide access to interfaces in which users are represented by virtual bodies, defined by rendering of 3D models. As discussed below, these 3D models are in some embodiments defined based on user body scan data (for example to provide realistic and representative facial and/or body shapes), and a biometric-type security framework is implemented thereby to restrict misappropriation of likenesses in the context of the virtual environments.
[00163] Figure 5A illustrates a computer framework for delivering data representative of a virtual 3D representation of a person, also referred to herein as ‘avatar data’, to a virtual environment according to one embodiment of the invention. The user operates the scanning unit 501 thereby to cause creation of data representative of a 3D representation of themselves. The scanning unit also collects biometric data for one or more identifying biometrics from the user as part of a scanning procedure. In this way, the scanning unit is able to collect the one or more identifying biometrics from the user at the place and time of the creation of the user’s avatar data.
[00164] In the context of a scanning booth, example hardware configured to accept biometric data includes:
• Finger print scanner;
• Hand print scanner;
• Camera;
• Microphone;
• Infrared camera;
• Some other biometric scanning device.
[00165] In the context of an in-home scanning arrangement, example hardware configured to accept biometric data includes:
• Smart phone camera;
• Smart phone finger print scanner;
• Smart phone microphone;
• Finger print scanner;
• Hand print scanner;
• Camera;
• Microphone;
• Infrared camera;
• Some other biometric scanning device.
[00166] The data derived from the one or more identifying biometrics is stored in association with the body scan data in a database 502 which is managed by a server 503.
[00167] For a user to make use of their avatar data in a virtual environment, a request needs to be made to the server managing the database containing the avatar data to release the avatar data to the virtual environment host 507. The request to deliver the avatar data to a virtual environment originates from the client terminal 504, which includes biometric scanning hardware 505 and a user computing terminal 506 to access the internet. Included in the request are one or more identifying biometrics collected from the user at the client terminal 504. In this regard, biometric scanning hardware 505 collects the same form(s) of biometric data as were collected by scanning unit 501. The server 503 performs a biometric verification process thereby to verify the identity of the user of the client terminal. The verification process is based on the identifying biometrics in the request and the identifying biometric data stored in the database 502. In the case that the verification process is successful, the server delivers the avatar data to the virtual environment for the user.
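The verification step at server 503 can be sketched as follows, assuming purely for illustration that each biometric is reduced to a numeric feature vector compared by cosine similarity against the enrolled vector. Real biometric matching pipelines, the vector values and the 0.95 threshold are assumptions for this toy example and are not drawn from the specification.

```python
import numpy as np

# Hypothetical stored record keyed by user id: an enrolled biometric feature
# vector captured by scanning unit 501 alongside the avatar data.
ENROLLED = {"user-7": np.array([0.12, 0.80, 0.31, 0.55])}


def verify_biometric(user_id, presented_features, threshold=0.95):
    """Compare a presented biometric feature vector against the enrolled one
    using cosine similarity; release avatar data only above the threshold."""
    enrolled = ENROLLED.get(user_id)
    if enrolled is None:
        return False
    presented = np.asarray(presented_features, dtype=float)
    similarity = float(np.dot(enrolled, presented) /
                       (np.linalg.norm(enrolled) * np.linalg.norm(presented)))
    return similarity >= threshold


if __name__ == "__main__":
    print(verify_biometric("user-7", [0.13, 0.79, 0.30, 0.56]))  # expected True
    print(verify_biometric("user-7", [0.90, 0.10, 0.05, 0.02]))  # expected False
```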
[00168] The system can be configured such that the client terminal 504 makes the request directly to the server 503 as shown in Figure 5A or indirectly via the virtual environment host 507 as shown in Figure 5B. The request can also go through any number of intermediaries.
[00169] In another embodiment, upon successful verification, the server issues an identifying mark or permission for an identifying mark to be displayed in association with the 3D representation of the user in the virtual environment. An exemplary identifying mark is a virtual name tag, which is included on the avatar of the verified user. Any other identifying mark can also be used. By such an approach, third parties using/observing the virtual environment are able to visually identify a biometrically verified user.
[00170] Example use cases for the technology described above include the following:
• Virtual meetings which use realistic virtual avatars, where there are security sensitivities that favour ensuring an avatar is actually controlled by the person it represents.
• Virtual meet-up and/or dating environments, where there are reasons for needing to ensure that a person in reality matches their virtual appearance.
• Other situations where personal appearance verification is required for a virtual environment.
[00171] The scanning unit 601 is shown in greater detail in Figure 6A and Figure 6B. The unit includes 3D scanning hardware 602 to create the avatar data representative of the user 603 and biometric scanning hardware 604 to record the user’s identifying biometrics when the avatar data is created. The 3D scanning hardware and the biometric scanning hardware can be separate as shown in Figure 6A or combined into a single piece of hardware as shown in Figure 6B. The scanning unit may also include any number of pieces of hardware. The dashed lines in Figure 6A and Figure 6B represent the scanning zone for the biometric and 3D scanning hardware.
[00172] Exemplary identifying biometrics include one or more finger prints, one or more hand prints, face, voice, iris pattern, retina pattern or any other identifying biometric.
Customisable Measurement Monitoring Regimes using Three-Dimensional Body Scan Data [00173] In the example of FIG. 1D, a particular category of third party platforms is illustrated: service provider systems 130. These facilitate customisable measurement monitoring regimes using three-dimensional body scan data. In examples described below, such functions are focussed on enabling a service provider to implement a monitoring regime for its customers, based on user body scan data, thereby to adapt aspects of the services they provide. For example, in some cases a service provider is an insurance service provider, and configuration of automated monitoring of customers’ body scan data enables determinations related to each customer’s health (for example whether they are gaining/losing weight, increasing muscle, etc.), which can be used as input to affect insurance products. Examples include determination of premiums/incentives and the like (for example to incentivise a healthy lifestyle).
[00174] The framework of FIG. 1A includes a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server. Body scan management server 110 is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scanning data, update a repository of body scan data maintained for the respective user in repository 114.
[00175] An access approval management module 116 is configured to enable a given user to provide user-specified access permissions for that user’s repository of body scan data to one or more designated service providers (with each service provider being associated with a respective service provider account defined for server 110). This is optionally achieved via an interaction between a user and server 110, or via an interaction between a user and a service provider system 130. For instance, in relation to the latter, a given service provider system 130 is authorised to modify user data relating to access management on behalf of a user, in the case that the relevant user provides authenticated credentials recognised by server 110 via the relevant system 130. In some embodiments there are multiple access levels available. For the present purposes, we assume that each system 130 is provided with an access level which enables adequate access to a user’s body scan data to enable the functionalities described.
[00176] A service provider portal 117 is configured to enable a given service provider user to access web-delivered functionalities via a logon process using the service provider user’s service provider account. In various examples described below, this enables a user of a service provider system to view user interface objects generated based on processing of user body scan data. In some embodiments portal 117 provides additional/alternate means for delivering functionalities, including API integration. For example, one approach includes providing an API and plugin that enables direct communication between a software application (for example a CRM system) used by a service provider and server 110. Portal 117 optionally provides multiple modes of communication.
[00177] Portal 117 is configured to enable a service provider user to configure a customised body attribute monitoring regime. The customised body attribute monitoring regime is configured to provide, to the service provider, one or more data values representative of a change in body shape attributes for a user. These are based on a data comparison between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data (i.e. in some cases more than two time-specific data points are used). The comparison is preferably measurement based - optionally each body measurement is defined as a distance between two points along a line defined on a virtual surface represented by 3D body scan data. Alternate approaches may be used (for example surface area calculations and the like).
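For illustration, a measurement defined as the distance along a line of surface points can be computed as a simple polyline length; the example girth path below is synthetic and the point spacing is an assumption for the example.

```python
import math


def path_length(points):
    """Length of a measurement line defined by ordered points lying on the
    virtual body surface (for example a waist path extracted from scan data)."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))


def girth(points):
    """Closed-loop variant: add the segment from the last point back to the first."""
    return path_length(points) + math.dist(points[-1], points[0])


if __name__ == "__main__":
    # Eight points approximating a waist path: a circle of radius 0.45 m at height 1.0 m.
    circle = [(0.45 * math.cos(a), 0.45 * math.sin(a), 1.0)
              for a in (i * math.pi / 4 for i in range(8))]
    print(round(girth(circle), 3))  # octagonal estimate of the 2*pi*0.45 ≈ 2.83 m girth
```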
[00178] An underlying premise is to enable a service provider to define a protocol for converting a measurable variation in body shape characteristics to a predicted variation in overall health. This may be tailored to particular demographics (for example one regime is defined for people of a particular known health status in a particular age range, another regime is defined for people of a different known health status in the same age range, and a further regime is defined for people of the same known health status in a different age range). For example, this allows differentiation between encouraging elderly people not to lose excessive size and discouraging middle-aged people from putting on belly fat.
[00179] The manner by which a service provider user configures a customised body attribute monitoring regime varies between embodiments. Examples include:
• A service provider user selects one or more predefined regimes made available via server 110. Preferably each predefined regime (or collection thereof) is associated with a plain-language explanation of what it is intended to determine.
• A service provider user defines a set of measurements (for example torso girth, thigh girth, etc.) which are to be taken, and optionally a way in which they are combined (for example rules by which each individual measurement is to be weighted when defining a combined value).
• Server 110 performs an analysis, for a given user, of a recent variation trend between existing point-in-time body scan data, and suggests a regime for association with that user based on predictive analysis of how the user’s health is progressing, and desirable future variation data that would represent positive improvement. For example, it may be determined that a user has been gaining unhealthy body fat at a constant/increasing rate.
[00180] Portal 117 enables the service provider user to associate the customised body attribute monitoring regime with a set of users. These are users that have granted access permissions in respect of their repositories of body scan data to the service provider (and in some cases an access permission verification step is performed). The access right may be granted via the service provider (for example the service provider embeds an approval widget object in a service provider web page thereby to enable a user to upload authentication data for server 110, and hence grant approval). The association of regimes to users is optionally driven based on user characteristics (for example demographic, body shape and/or historical variation derived characteristics), thereby to enable association of users to regimes that are appropriate for those users.
[00181] Portal 117 additionally enables scheduling of execution of a set of automated processes thereby to implement the customised body attribute monitoring regime against the set of associated users. In some embodiments, the set of automated processes include automated processes configured to, for a given one of the users:
• Calculate the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data.
• Deliver, via the service provider terminal, the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data.
• Generate and deliver body scan reminder messages, thereby to provide to each given user one or more reminders to generate a new point-in-time set of 3D body scan data representative of the user. For example, these may include email, SMS, website-specific, or other forms of electronic reminders.
[00182] In some embodiments, for a given customised body attribute monitoring regime, calculating the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data includes: (i) taking a first set of body measurements using the first point-in-time set of 3D body scan data; (ii) taking a second set of body measurements using the second point-in-time set of 3D body scan data; (iii) determining a variance between the first set of measurements and the second set of measurements; and (iv) based on the determined variance between the first set of measurements and the second set of measurements, performing an aggregation calculation that transforms a plurality of individual determined variance values into a single aggregate variance value.
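A minimal sketch of steps (iii) and (iv) follows, assuming an illustrative weighted-sum regime in which decreases in waist and hip girth are treated as improvements; the measurement names, values and weights are assumptions for the example rather than a regime defined by the specification.

```python
# Hypothetical monitoring regime: which measurements to take and how each
# measured variance (in cm, second scan minus first) is weighted when forming
# a single aggregate value. Negative weights mark measurements where a
# decrease is treated as an improvement.
REGIME = {"waist_girth": -1.0, "hip_girth": -0.5, "thigh_girth": 0.25}


def aggregate_variance(first_scan, second_scan, regime=REGIME):
    """Combine per-measurement variances into one signed aggregate value,
    where a positive result indicates movement in the desired direction."""
    aggregate = 0.0
    for measurement, weight in regime.items():
        variance = second_scan[measurement] - first_scan[measurement]
        aggregate += weight * variance
    return aggregate


if __name__ == "__main__":
    first = {"waist_girth": 92.0, "hip_girth": 104.0, "thigh_girth": 58.0}
    second = {"waist_girth": 89.5, "hip_girth": 103.0, "thigh_girth": 58.5}
    print(aggregate_variance(first, second))  # 2.5 + 0.5 + 0.125 = 3.125
```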
[00183] The single aggregate value is a value representative of health improvement/deterioration. For example, by taking a set of measurements thereby to determine variations in size/surface area of a person using 3D body scan data, a single value representative of overall health improvement/degradation is defined. Not all embodiments are limited to aggregate values; in some cases single measurements are important (for example some embodiments make use of a single “belly fat” measurement as a rough and simple health indicator).
[00184] It should be appreciated that the examples above provide a service provider only with limited data representative of health variation, as opposed to providing comprehensive (and perhaps personally sensitive) body scan data. In this way, a user is able to retain substantive privacy, whilst enabling limited release of calculated data values to service providers with whom they choose to share such data.
[00185] By way of example, a framework as described above is optionally implemented such that one or more characteristics of an insurance policy are affected by the one or more data values representative of a change in body shape attributes. Examples of how this is achieved include, but are not limited to, the following:
• Requiring that a user submit periodic 3D body scan data via a framework such as that described above, and automatically adjusting insurance characteristics (for example premiums) based on observed trends.
• Inviting a user to submit periodic 3D body scan data via a framework such as that described above, and automatically adjusting insurance characteristics (for example premiums) based on observed trends.
• Incentivising a user to improve health, and inviting the user to submit periodic 3D body scan data via a framework such as that described above, thereby to provide evidence of that improvement (and hence obtain an incentive reward).
• Providing a given user a particular body transformation challenge, and associating that with an incentive (for example a premium reduction). For example, the user is invited to participate in a challenge to reduce belly fat, waist girth, or the like by a specified percentage over a specified timeframe.
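Purely as a hypothetical business-rule sketch (the specification does not define any premium formula), an aggregate variance value such as the one computed above could feed a bounded premium adjustment along the following lines; the step size and cap are invented for the example.

```python
def adjust_premium(base_premium, aggregate_variance, step=0.01, cap=0.10):
    """Illustrative policy rule: reduce the premium by `step` (1%) for each
    whole unit of positive aggregate variance, increase it similarly for
    negative values, and never move more than `cap` (10%) either way."""
    adjustment = max(-cap, min(cap, step * aggregate_variance))
    return round(base_premium * (1.0 - adjustment), 2)


if __name__ == "__main__":
    print(adjust_premium(1200.00, 3.125))   # healthy trend -> lower premium
    print(adjust_premium(1200.00, -4.0))    # adverse trend -> higher premium
```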
[00186] Of course, this is by no means limited to insurance, and in further embodiments alternate forms of services and/or products are influenced by way of a framework as described above.
[00187] FIG. 2F illustrates a method according to one embodiment, which is optionally performed via a framework as described above.
[00188] Block 251 represents a process including the defining, by a service provider, of body variation monitoring regimes and user applicability rules. For example, the service provider user accesses a computer interface that provides access to functionalities delivered via a body scan data provider server, and selects: (i) measurement techniques that are to be used thereby to derive variation data; and (ii) where there are multiple measurement techniques, a protocol for determining association between users and measurement techniques (for example demographics, manual designation, and so on).
[00189] Block 252 represents a process including inviting (or requesting) that a user participate in a service program influenced by body monitoring. For example, this may include display of data via a web interface rendered at a user’s computing device. This interface invites (or requests) the user to agree to participation, and provide data that grants the service provider a threshold level of access to the user’s body scan data (see block 253). In some cases, where the user is not registered with a body scan generation service provider, a registration process is triggered to enable such registration (which provides user identification details which a user is able to input to a scanning booth or the like thereby to generate a 3D scan).
[00190] Block 254 represents a process including configuration of automated body scan reminders. These may be provided via a range of forms of electronic communication, and are optionally provided by either or both of the service provider and a body scan service provider. By way of example, automated reminders may be generated and provided in the lead up to a designated period during which body scan data is required in the context of a service program delivered to a user by a service provider (for example an insurance policy), and further reminders provided during the designated period in the case that a body scan has not been uploaded by the user.
[00191] Block 255 represents a process including identifying that new body scan data has been uploaded for a given user (for example via a scanning booth or in-home scanning hardware). This automatically triggers generation of body shape variation data in line with a regime associated with that user by a given service provider (and in some cases multiple calculations for a plurality of service providers that have respectively associated variation monitoring regimes with that user). The variation data is provided to the relevant service provider at 257, which enables an associated service program to be controlled based on characteristics of the provided data.
Exemplary Client-Server Framework [00192] In some embodiments, methods and functionalities considered herein leverage a client-server framework, for example as illustrated in FIG. 3.
[00193] In some embodiments, methods and functionalities considered herein are implemented by way of a server, as illustrated in FIG. 3. In overview, a web server 302 provides a web interface 303. This web interface is accessed over the Internet by way of computing terminals 304, which in various embodiments include the likes of personal computers, PDAs, cellular telephones, gaming consoles, and other Internet enabled devices.
[00194] Server 302 includes a processor 305 coupled to a memory module 306 and a communications interface 307, such as an Internet connection, modem, Ethernet port, wireless network card, serial port, or the like. In other embodiments distributed resources are used. For example, in one embodiment server 302 includes a plurality of distributed servers having respective storage, processing and communications resources. Memory module 306 includes software instructions 308, which are executable on processor 305.
[00195] Server 302 is coupled to a database. In further embodiments the database leverages memory module 306.
[00196] In some embodiments web interface 303 includes a website. The term “website” should be read broadly to cover substantially any source of information accessible over the Internet or another communications network (such as WAN, LAN or WLAN) via a browser application running on a computing terminal. In some embodiments, a website is a source of information made available by a server and accessible over the Internet by a web-browser application running on a computing terminal. The web-browser application downloads code, such as HTML code, from the server. This code is executable through the web-browser on the computing terminal for providing a graphical and often interactive representation of the website on the computing terminal. By way of the web-browser application, a user of the computing terminal is able to navigate between and throughout various web pages provided by the website, and access various functionalities that are provided.
[00197] Although some embodiments make use of a website/browser-based implementation, in other embodiments proprietary software methods are implemented as an alternative. For example, in such embodiments computing terminals 304 maintain software instructions for a computer program product that essentially provides access to a portal via which a framework is accessed (for instance via an iPhone app or the like).
[00198] In general terms, each terminal 304 includes a processor 311 coupled to a memory module 314 and a communications interface 312, such as an internet connection, modem, Ethernet port, serial port, or the like. Memory module 314 includes software instructions 313, which are executable on processor 311. These software instructions allow terminal 304 to execute a software application, such as a proprietary application or web browser application and thereby render on-screen a user interface and allow communication with server 302. This user interface allows for the creation, viewing and administration of profiles, access to the internal communications interface, and various other functionalities.
Conclusions and Interpretation [00199] It will be appreciated that the disclosure above provides various significant devices, frameworks and methodologies for enabling user-driven determination of body size and shape information and utilisation of such information across a networked environment.
[00200] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining", "analyzing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
[00201] In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A “computer” or a “computing machine” or a "computing platform" may include one or more processors.
[00202] The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM. A bus subsystem may be included for communicating between the components. The processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. The term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit. The processing system in some configurations may include a sound output device, and a network interface device. The memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein. Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated. The software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.
[00203] Furthermore, a computer-readable carrier medium may form, or be included in a computer program product.
[00204] In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked to other processor(s); in a networked deployment, the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
[00205] Note that while diagrams only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect.
For example, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[00206] Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program that is for execution on one or more processors, e.g., one or more processors that are part of web server arrangement. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium, e.g., a computer program product. The computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
[00207] The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an exemplary embodiment to be a single medium, the term "carrier medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "carrier medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks. Volatile media includes dynamic memory, such as main memory. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. For example, the term "carrier medium" shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical and magnetic media; a medium bearing a propagated signal detectable by at least one processor of one or more processors and representing a set of instructions that, when executed, implement a method; and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.
[00208] It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.
[00209] It should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
[00210] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
[00211] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
[00212] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[00213] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
[00214] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.
Claims (49)
- CLAIMS:1. A computer implemented framework configured to enable generation and management of 3D body scan data, the framework including: a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server; a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scanning data, update a historical user body scan data collection for the respective user; a shared access management module, wherein the shared access management module is configured to: receive data representative of an instruction to associate a given user with a given advisor account; and in response to the instruction associate that given user with that given advisor account; an advisor portal module that is configured to: (i) authenticate access in respect of a given advisor account from an advisor terminal; (ii) cause rendering at the advisor terminal of a historical progress display interface that provides access to a computer-generated historical analysis of the historical user body scan data collection for each user associated with that advisor account; and (iii) provide a user interface object to enable delivery of an electronic message from the advisor account to a selected one or more of the users associated with that advisor account.
- 2. A framework according to claim 1 wherein the historical progress display interface provides data representative of (i) a set of body shape attributes; and (ii) a set of health attributes.
- 3. A framework according to claim 1 wherein the historical progress display interface provides three dimensional avatar data.
- 4. A framework according to claim 1 wherein the advisor portal module is configured to cause rendering at the advisor terminal of a dashboard that shows data representative of pending access requests.
- 5. A framework according to claim 1 wherein the advisor portal module is configured to cause rendering at the advisor terminal of a dashboard that shows data representative of users associated with that advisor account in respect of which new point-in-time sets of body scan data are available.
- 6. A framework according to claim 1 wherein the advisor portal module is configured to cause rendering at the advisor terminal of an object that, when utilised, causes transmission of body scan reminder data to one or more users.
- 7. A framework according to claim 1 wherein the advisor portal module is configured to cause rendering at the advisor terminal of an object that, when utilised, causes transmission of a feedback message to one or more users, wherein the feedback message is defined by a user of the advisor terminal.
- 8. A framework according to claim 1 wherein a given instruction to associate a given user with a given advisor account is triggered by a request placed by the user via a body scan management user interface.
- 9. A computer program product that is executed at a computing terminal, wherein the computer program product is configured to communicate with an advisor portal module according to any preceding claim, thereby to render user interface data delivered by the advisor portal module.
- 10. A computer system configured to enable generation and management of 3D body scan data, the system including: a scanning unit management server in communication with a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server; a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a historical user body scan data collection for the respective user; a shared access management module, wherein the shared access management module is configured to: receive data representative of an instruction to associate a given user with a given advisor account; and, in response to the instruction, associate that given user with that given advisor account; an advisor portal module that is configured to: (i) authenticate access in respect of a given advisor account from an advisor terminal; (ii) cause rendering at the advisor terminal of a historical progress display interface that provides access to a computer-generated historical analysis of the historical user body scan data collection for each user associated with that advisor account; and (iii) provide a user interface object to enable delivery of an electronic message from the advisor account to a selected one or more of the users associated with that advisor account.
- 11. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment, the method including: operating 3D scanning hardware thereby to cause creation of data representative of a 3D representation of a person; collecting one or more identifying biometrics from said person at the place and time of the creation of the 3D representation; storing data derived from said one or more identifying biometrics in association with said data representative of said 3D representation; receiving, from a client terminal, a request to deliver the data representative of the 3D representation to a virtual environment, wherein said request includes one or more identifying biometrics from a user of said client terminal; performing a biometric verification process thereby to verify the identity of said user of said client terminal based on the identifying biometrics in said request and the data, derived from said one or more identifying biometrics collected at the place and time of the creation, stored in association with said data representative of said 3D representation; and, in the case that the verification process is successful, delivering the data representative of the virtual 3D representation to the virtual environment.
- 12. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment according to claim 11 wherein said 3D scanning hardware is a 3D scanning booth or in-home scanning equipment.
- 13. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment according to claim 11 wherein the one or more identifying biometrics include one or more of the following: one or more fingerprints; a hand print; face; voice; iris pattern; retina pattern; some other identifying biometric.
- 14. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment according to any one of the previous claims wherein upon successful verification the server causes a visual indicator to be displayed in the virtual environment to indicate to other users which 3D representations have been verified.
- 15. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment according to claim 14 wherein the visual indicator to be displayed in the virtual environment is a virtual name tag.
- 16. A computer implemented method for delivering data representative of a virtual 3D representation of a person to a virtual environment, the method including: obtaining, from a terminal controlled by a user, user biometric data; verifying that the user biometric data matches stored biometric data associated with the virtual 3D representation; and if the user biometric data matches stored biometric data associated with the virtual 3D representation, allowing delivery of the virtual 3D representation.
- 17. A computer implemented framework configured to enable accessing, analysis and collaborative utilisation of three-dimensional body scan data, the framework including: a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server; a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user; an approval management module, wherein the approval management module is configured to enable a given user to provide user-specified access permissions for that user’s repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account; a service provider portal, wherein the service provider portal enables a given service provider user to access web-delivered functionalities via a logon process via the service provider user’s service provider account; and a measurement module, wherein the measurement module enables the service provider portal to provide, via a web interface, to a given service provider user having predefined user-specified access permissions for a specific user’s repository of body scan data, a measurement interface that enables generation of user measurements in accordance with a service-provider defined measurement regime.
- 18. A framework according to claim 17 wherein the service-provider defined measurement regime is defined by a service provider user via interaction with a graphical user interface element, wherein the graphical user interface element: (i) includes a rendering of at least a portion of a human body, and (ii) enables user-selection of measurement points.
- 19. A framework according to claim 18 wherein the rendering of at least a portion of a human body is a three-dimensional rendering.
- 20. A framework according to claim 18 wherein the rendering of at least a portion of a human body is derived from a generic body model.
- 21. A framework according to claim 20 wherein: the service-provider defined measurement regime includes data representative of a plurality of measurement instructions, wherein each measurement instruction is defined by a pair of measurement end points defined relative to the generic body model; and the generation of user measurements in accordance with the service-provider defined measurement regime includes: (i) applying a transformation thereby to translate the end points defined relative to the generic body model to user-specific end points defined relative to a body model derived from the specific user’s body scan data; and (ii) performing measurement calculations between the measurement end points in accordance with the measurement instructions thereby to generate the user measurements.
- 22. A framework according to claim 21 wherein the measurement module enables display to the user of a rendering of at least a portion of a body model based on the specific user’s body scan data, the rendering showing locations of the translated measurement end points.
- 23. A framework according to claim 22 wherein the measurement module enables user-controlled manipulation of the translated measurement end points relative to the rendered body model, thereby to enable manual micro-adjustment to the translated measurement end points.
- 24. A framework according to any one of claims 21 to 23 including a measurement approval module, wherein the measurement approval module is configured to: make available, to the specific user, data representative of the generated user measurements; enable the specific user to provide input representative of approval or rejection of the generated user measurements; and make available, to the given service provider user, data representative of the approval or the rejection.
- 25. A framework according to claim 24 wherein the data representative of the generated user measurements includes numerical measurements.
- 26. A framework according to claim 24, when appended to claim 22, wherein the data representative of the generated user measurements includes data that enables rendering of at least a portion of a body model showing locations of the translated measurement end points, wherein the body model is derived from the specific user’s body scan data.
- 27. A framework according to claim 24 wherein the data representative of the generated user measurements includes data that enables a rendering of image data providing a representation of garment fit relative to the user, based upon a virtual garment generation process which translates a predefined generic virtual garment model based on the generated user measurements.
- 28. A framework according to claim 18 wherein the rendering of at least a portion of a human body is derived from the specific user’s body scan data.
- 29. A framework according to claim 28 wherein the service-provider defined measurement regime includes data representative of a plurality of individual measurements, wherein each individual measurement is generated based on user-defined placement of a pair of measurement end points on the rendering of at least a portion of the human body derived from the specific user’s body scan data.
- 30. A framework according to claim 29 including a measurement approval module, wherein the measurement approval module is configured to: make available, to the specific user, data representative of the generated user measurements; enable the specific user to provide input representative of approval or rejection of the generated user measurements; and make available, to the given service provider user, data representative of the approval or the rejection.
- 31. A framework according to claim 30 wherein the data representative of the generated user measurements includes numerical measurements.
- 32. A framework according to claim 30 wherein the data representative of the generated user measurements includes data that enables rendering of at least a portion of a body model showing locations of the service provider user defined measurement end points, wherein the body model is derived from the specific user’s body scan data.
- 33. A framework according to claim 30 wherein the data representative of the generated user measurements includes data that enables a rendering of image data providing a representation of garment fit relative to the user, based upon a virtual garment generation process which translates a predefined generic virtual garment model based on the generated user measurements.
- 34. A framework according to claim 17 including a measurement approval module, wherein the measurement approval module is configured to: make available, to the specific user, data representative of the generated user measurements; enable the specific user to provide input representative of approval or rejection of the generated user measurements; and make available, to the given service provider user, data representative of the approval or the rejection.
- 35. A framework according to claim 34 wherein the data representative of the generated user measurements includes numerical measurements.
- 36. A framework according to claim 34, wherein the data representative of the generated user measurements includes data that enables rendering of at least a portion of a body model showing locations of the measurement end points from which the numerical measurements are defined, wherein the body model is derived from the specific user’s body scan data.
- 37. A framework according to claim 34 wherein the data representative of the generated user measurements includes data that enables a rendering of image data providing a representation of garment fit relative to the user, based upon a virtual garment generation process which translates a predefined generic virtual garment model based on the generated user measurements.
- 38. A computer implemented framework configured to enable configurable third party access to data derived from three-dimensional body scan data, the framework including: a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server; a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user; an approval management module, wherein the approval management module is configured to enable a given user to grant user-specified access permissions in respect of that user’s repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account; a service provider portal, wherein the service provider portal is configured to: (i) enable a service provider user to configure a customised body attribute monitoring regime, wherein the customised body attribute monitoring regime is configured to provide one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data; (ii) enable the service provider user to associate the customised body attribute monitoring regime with a set of users that have granted access permissions in respect of their respective repositories of body scan data to the service provider; and (iii) enable scheduling of execution of a set of automated processes thereby to implement the customised body attribute monitoring regime against the set of users, wherein the set of automated processes includes automated processes configured to, for a given one of the users: calculate the one or more data values representative of a change in body shape attributes for that user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data; and deliver, via the service provider terminal, the one or more data values representative of a change in body shape attributes for that user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data.
- 39. A framework according to claim 38 wherein, for a given customised body attribute monitoring regime, calculating the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data includes: (i) taking a first set of body measurements using the first point-in-time set of 3D body scan data; (ii) taking a second set of body measurements using the second point-in-time set of 3D body scan data; and (iii) determining a variance between the first set of measurements and the second set of measurements.
- 40. A framework according to claim 39 wherein, for a given customised body attribute monitoring regime, calculating the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data additionally includes: (iv) based on the determined variance between the first set of measurements and the second set of measurements, performing an aggregation calculation that transforms a plurality of individual determined variance values into a single aggregate variance value.
- 41. A framework according to claim 40 wherein the single aggregate value is a value representative of health improvement/deterioration.
- 42. A framework according to claim 41 wherein the first set of body measurements is a single measurement, and wherein the second set of body measurements is a single measurement.
- 43. A framework according to claim 41 wherein each body measurement is defined as a distance between two points along a line defined on a virtual surface represented by the 3D body scan data.
- 44. A framework according to claim 40 wherein the set of automated processes include generation and delivery of body scan reminder messages, thereby to provide to each given user one or more reminders to generate a new point-in-time set of 3D body scan data representative of the user.
- 45. A framework according to claim 40 wherein the service provider portal is configured to enable a service provider user to configure a plurality of customised body attribute monitoring regimes.
- 46. A framework according to claim 45 wherein each monitoring regime is associated with a respective set of users.
- 47. A framework according to claim 46 wherein the service provider portal is configured to enable a service provider user to configure a set of rules thereby to automate association of each monitoring regime with a respective set of users based on predefined user characteristics.
- 48. A framework according to claim 40 wherein one or more characteristics of an insurance policy are affected by the one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data.
- 49. A computer implemented framework configured to enable configurable third party access to data derived from three-dimensional body scan data, the framework including: a plurality of scanning units, wherein each scanning unit includes: (i) a computer system that provides an interface configured to identify a user, based on locally-inputted user data and a server-managed repository of user data; (ii) scanning hardware that is configured to generate a point-in-time set of 3D body scan data representative of the user; and (iii) a communications module configured to transmit the generated point-in-time set of 3D body scan data to a body scan management server; a body scan management server that is configured to process the received point-in-time sets of 3D body scan data from the plurality of scanning units and, for each point-in-time set of 3D body scan data, update a repository of body scan data maintained for the respective user; an approval management module, wherein the approval management module is configured to enable a given user to grant user-specified access permissions in respect of that user’s repository of body scan data to one or more designated service providers, wherein each service provider is associated with a respective service provider account; a service provider portal, wherein the service provider portal is configured to deliver, to a service provider computer system, one or more data values representative of a change in body shape attributes for a user between at least a first point-in-time set of 3D body scan data and a second point-in-time set of 3D body scan data, thereby to cause controlling of one or more characteristics of an insurance policy for the user.
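The sketches below illustrate, in Python, one way selected claimed mechanisms could be realised; they are minimal, non-limiting examples, not the claimed implementation. This first sketch corresponds to the scanning-unit, management-server and advisor-association arrangement of claims 1 and 10. The class and method names (`BodyScanManagementServer`, `receive_scan`, `associate`, `historical_analysis`) and the use of an in-memory store are assumptions made purely so the example is self-contained; real scan payloads, authentication and persistence are omitted.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class BodyScan:
    """A point-in-time set of 3D body scan data (mesh data omitted for brevity)."""
    user_id: str
    captured_at: datetime
    measurements: Dict[str, float] = field(default_factory=dict)

class BodyScanManagementServer:
    """Keeps a historical body scan collection per user and the
    user-to-advisor associations described in claims 1 and 10."""

    def __init__(self) -> None:
        self.history: Dict[str, List[BodyScan]] = {}   # user_id -> scans, oldest first
        self.advisor_links: Dict[str, set] = {}        # advisor_id -> {user_id}

    def receive_scan(self, scan: BodyScan) -> None:
        """Called when a scanning unit's communications module transmits a scan."""
        self.history.setdefault(scan.user_id, []).append(scan)
        self.history[scan.user_id].sort(key=lambda s: s.captured_at)

    def associate(self, user_id: str, advisor_id: str) -> None:
        """Shared access management: link a given user to a given advisor account."""
        self.advisor_links.setdefault(advisor_id, set()).add(user_id)

    def historical_analysis(self, advisor_id: str) -> Dict[str, int]:
        """Data backing the advisor portal's historical progress display;
        here simply the number of scans on record per associated user."""
        users = self.advisor_links.get(advisor_id, set())
        return {u: len(self.history.get(u, [])) for u in users}

if __name__ == "__main__":
    server = BodyScanManagementServer()
    server.receive_scan(BodyScan("user-1", datetime(2016, 1, 5), {"waist_cm": 84.0}))
    server.receive_scan(BodyScan("user-1", datetime(2016, 3, 5), {"waist_cm": 82.5}))
    server.associate("user-1", "advisor-9")
    print(server.historical_analysis("advisor-9"))   # {'user-1': 2}
```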
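A minimal sketch of the biometric-bound delivery of a 3D representation described in claims 11 to 16. The `AvatarVault` class, the salted-hash "template" derivation and the `verified_badge` flag are illustrative assumptions; a real system would use proper biometric template matching rather than exact hash equality.

```python
import hashlib
from typing import Dict, Optional

def derive_template(biometric_sample: bytes) -> str:
    """Stand-in for biometric template derivation; a salted hash is used here
    purely so the example is self-contained and deterministic."""
    return hashlib.sha256(b"demo-salt" + biometric_sample).hexdigest()

class AvatarVault:
    """Stores 3D representations, each bound to biometric data captured at the
    place and time of scanning, and releases them only after verification."""

    def __init__(self) -> None:
        self._store: Dict[str, Dict[str, object]] = {}

    def enrol(self, person_id: str, mesh: dict, biometric_sample: bytes) -> None:
        """Store the representation together with data derived from the biometric."""
        self._store[person_id] = {"mesh": mesh, "template": derive_template(biometric_sample)}

    def request_delivery(self, person_id: str, biometric_sample: bytes) -> Optional[dict]:
        """Deliver the representation only if the requester's biometric matches
        the enrolment template; otherwise return None."""
        record = self._store.get(person_id)
        if record is None or derive_template(biometric_sample) != record["template"]:
            return None
        # The flag could drive a visual indicator such as a virtual name tag (claim 15).
        return {"mesh": record["mesh"], "verified_badge": True}

if __name__ == "__main__":
    vault = AvatarVault()
    vault.enrol("person-1", {"vertices": []}, b"fingerprint-bytes")
    print(vault.request_delivery("person-1", b"fingerprint-bytes") is not None)  # True
    print(vault.request_delivery("person-1", b"someone-else") is None)           # True
```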
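A sketch of the measurement regime of claims 17 to 23, under the simplifying assumption that the transformation from generic-model end points to user-specific end points reduces to looking up identically named landmarks on the user's body model, and that each measurement is a straight-line distance. The `REGIME` contents and landmark names are illustrative only.

```python
import math
from typing import Dict, Tuple

Point = Tuple[float, float, float]

# A service-provider defined measurement regime: each instruction is a pair of
# measurement end points named relative to the generic body model.
REGIME: Dict[str, Tuple[str, str]] = {
    "upper_arm": ("left_shoulder", "left_elbow"),
    "hip_width": ("left_hip", "right_hip"),
}

def translate_end_points(user_landmarks: Dict[str, Point],
                         regime: Dict[str, Tuple[str, str]]) -> Dict[str, Tuple[Point, Point]]:
    """Stand-in for the transformation of claim 21: translating an end point here
    simply means looking up the identically named landmark on the body model
    derived from the specific user's scan data."""
    return {name: (user_landmarks[a], user_landmarks[b]) for name, (a, b) in regime.items()}

def measure(end_points: Dict[str, Tuple[Point, Point]]) -> Dict[str, float]:
    """Straight-line measurement between each pair of translated end points."""
    return {name: round(math.dist(p, q), 2) for name, (p, q) in end_points.items()}

if __name__ == "__main__":
    # Illustrative landmarks on a user-specific body model (centimetres).
    user = {"left_shoulder": (0.0, 0.0, 0.0), "left_elbow": (0.0, -33.2, 0.0),
            "left_hip": (-17.1, -62.0, 0.0), "right_hip": (17.4, -62.0, 0.0)}
    print(measure(translate_end_points(user, REGIME)))   # {'upper_arm': 33.2, 'hip_width': 34.5}
```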
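Claim 43 defines a body measurement as a distance between two points along a line on the scanned surface. Assuming that line is available as a polyline of sampled surface points, its length is the sum of the segment lengths, as in this sketch.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def surface_path_length(path: List[Point]) -> float:
    """Length of a line defined on a virtual surface, given as a polyline of
    sampled surface points: the sum of consecutive segment lengths."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

if __name__ == "__main__":
    # A quarter circle of radius 10 sampled into 10 segments stands in for a
    # path on a scanned body surface; its true arc length is about 15.71.
    arc = [(10 * math.cos(t), 10 * math.sin(t), 0.0)
           for t in [i * math.pi / 20 for i in range(11)]]
    print(round(surface_path_length(arc), 2))
```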
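A sketch of the variance and aggregation steps of claims 39 to 41, assuming each set of body measurements is a mapping of measurement names to scalar values. The weighted-sum aggregation and the sign convention (a positive aggregate indicating improvement) are illustrative choices, not requirements of the claims.

```python
from typing import Dict

def measurement_variance(first: Dict[str, float],
                         second: Dict[str, float]) -> Dict[str, float]:
    """Per-measurement change between two point-in-time scans (claim 39, step iii)."""
    return {name: second[name] - first[name] for name in first if name in second}

def aggregate_change(variances: Dict[str, float],
                     weights: Dict[str, float]) -> float:
    """Claim 40's aggregation step: collapse individual variance values into a
    single value; here a weighted sum, where a negative weight marks a
    measurement whose reduction counts as an improvement."""
    return sum(weights.get(name, 0.0) * delta for name, delta in variances.items())

if __name__ == "__main__":
    scan_first = {"waist_cm": 92.0, "chest_cm": 101.0}
    scan_second = {"waist_cm": 88.5, "chest_cm": 102.0}
    deltas = measurement_variance(scan_first, scan_second)
    # Example weighting: only waist reduction is counted as improvement.
    print(deltas, aggregate_change(deltas, {"waist_cm": -1.0, "chest_cm": 0.0}))
```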
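Finally, a sketch of the rule-driven association of monitoring regimes with sets of users (claim 47) and of one hypothetical way a single aggregate change value could control an insurance policy characteristic (claims 48 and 49). The `adjusted_premium` formula, discount rate and cap are invented for illustration only and do not reflect any particular policy design.

```python
from typing import Callable, Dict, List

User = Dict[str, object]          # e.g. {"id": "u1", "age": 45, "goal": "weight_loss"}
Rule = Callable[[User], bool]

def assign_regimes(users: List[User], rules: Dict[str, Rule]) -> Dict[str, List[str]]:
    """Automatically associate each monitoring regime with the set of users
    whose predefined characteristics satisfy that regime's rule."""
    return {regime: [u["id"] for u in users if rule(u)] for regime, rule in rules.items()}

def adjusted_premium(base_premium: float, aggregate_change: float,
                     discount_per_unit: float = 0.01, cap: float = 0.15) -> float:
    """Map an aggregate body-shape change value to a premium adjustment,
    capped so the discount or surcharge stays within a fixed band."""
    discount = max(-cap, min(cap, aggregate_change * discount_per_unit))
    return round(base_premium * (1.0 - discount), 2)

if __name__ == "__main__":
    users = [{"id": "u1", "age": 45, "goal": "weight_loss"},
             {"id": "u2", "age": 29, "goal": "strength"}]
    rules = {"weight-loss regime": lambda u: u["goal"] == "weight_loss",
             "over-40 regime": lambda u: u["age"] >= 40}
    print(assign_regimes(users, rules))
    print(adjusted_premium(1000.0, aggregate_change=3.5))   # 965.0
```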
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
AU2015905404 | 2015-12-24 | |
AU2015905404A (AU2015905404A0, en) | 2015-12-24 | | Computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3D scan data based on a shared data access protocol
AU2016903089 | 2016-08-05 | |
AU2016903089A (AU2016903089A0, en) | 2016-08-05 | | Computer implemented frameworks and methodologies configured to enable the accessing, analysis and collaborative utilisation of three-dimensional body scan data
AU2016903245 | 2016-08-16 | |
AU2016903245A (AU2016903245A0, en) | 2016-08-16 | | Computer implemented frameworks and methodologies configured to enable configurable third party system integration with three-dimensional body scan data collection and management framework, including application to insurance system integration
AU2016903302 | 2016-08-19 | |
AU2016903302A (AU2016903302A0, en) | 2016-08-19 | | Frameworks and methodologies configured to provide identity verification for 3d representations of people for use in a virtual environment
PCT/AU2016/051289 (WO2017106934A1, en) | 2015-12-24 | 2016-12-23 | Computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3d body scan data, including shared data access protocols and collaborative data utilisation, and identify verification for 3d environments
Publications (1)
Publication Number | Publication Date
---|---
AU2016379448A1 (en) | 2018-08-09
Family
ID=59088657
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
AU2016379448A (AU2016379448A1, en; Abandoned) | 2015-12-24 | 2016-12-23 | Computer implemented frameworks and methodologies configured to enable the generation, processing and management of 3d body scan data, including shared data access protocols and collaborative data utilisation, and identify verification for 3d environments

Country Status (2)
Country | Link
---|---
AU (1) | AU2016379448A1 (en)
WO (1) | WO2017106934A1 (en)
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11995891B2 (en) * | 2020-07-20 | 2024-05-28 | Imedisync, Inc. | Computer program and method for training artificial neural network model based on time-series biosignal |
CN113129450B (en) * | 2021-04-21 | 2024-04-05 | 北京百度网讯科技有限公司 | Virtual fitting method, device, electronic equipment and medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020186818A1 (en) * | 2000-08-29 | 2002-12-12 | Osteonet, Inc. | System and method for building and manipulating a centralized measurement value database |
US20070180050A1 (en) * | 2006-02-01 | 2007-08-02 | Bruner David A | Systems, methods and computer program products for sharing three-dimensional body scan data |
US10628729B2 (en) * | 2010-06-08 | 2020-04-21 | Styku, LLC | System and method for body scanning and avatar creation |
WO2012075298A2 (en) * | 2010-12-01 | 2012-06-07 | Cornell University | Body shape analysis method and system |
US9306999B2 (en) * | 2012-06-08 | 2016-04-05 | Unitedhealth Group Incorporated | Interactive sessions with participants and providers |
US20140378810A1 (en) * | 2013-04-18 | 2014-12-25 | Digimarc Corporation | Physiologic data acquisition and analysis |
NZ631213A (en) * | 2014-05-13 | 2016-03-31 | Mport Pty Ltd | Frameworks and methodologies for enabling provision of computer implement functionalities via centralized management of data derived from three dimensional body scans |
- 2016-12-23: AU application AU2016379448A filed, published as AU2016379448A1 (en); status: not active (Abandoned)
- 2016-12-23: PCT application PCT/AU2016/051289 filed, published as WO2017106934A1 (en); status: active (Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2017106934A1 (en) | 2017-06-29 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | HB | Alteration of name in register | Owner name: MPORT LTD. Free format text: FORMER NAME(S): MPORT PTY LTD
 | MK5 | Application lapsed section 142(2)(e) - patent request and compl. specification not accepted |