WO2022251902A1 - Three-dimensional visualised soldier systems integration platform - Google Patents

Three-dimensional visualised soldier systems integration platform

Info

Publication number
WO2022251902A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
item
military
soldier
virtual
Prior art date
Application number
PCT/AU2022/050522
Other languages
French (fr)
Inventor
Shane Sarlin
Original Assignee
Buzzworks Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2021901633A external-priority patent/AU2021901633A0/en
Application filed by Buzzworks Pty Ltd filed Critical Buzzworks Pty Ltd
Priority to AU2022285319A priority Critical patent/AU2022285319A1/en
Publication of WO2022251902A1 publication Critical patent/WO2022251902A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/003 Simulators for teaching or training purposes for military purposes and tactics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/01 Probabilistic graphical models, e.g. probabilistic networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

Abstract

The system uses virtual reality as a tool to accelerate soldier system of systems integration by analysing the effects on the soldier system when a new item is introduced. The system operates through a virtual reality headset (client device) connected through a web-based interface and is updated remotely based on user requirements and user virtual model inputs. The system accurately depicts a modern soldier in contemporary configuration with full equipment, and the user can virtually locate and affix their virtual model onto the configured soldier for the system to measure and analyse the impact of this addition on the soldier system of systems. The platform enables configuration experimentation to inform physical equipment selection, including one or more of relative configuration, weight, mobility, range of motion, thermal burden and personal protection compared to the original soldier system configuration.

Description

THREE-DIMENSIONAL VISUALISED SOLDIER SYSTEMS INTEGRATION PLATFORM
Field of the Invention
The present disclosure relates to a simulation system to trial various aspects of military accessories (hardware and/or clothing) in a safe environment, and in a fulfilling, integrated manner for sizing of the military accessories to the customised physical attributes of a particular person or soldier.
Background of the Invention
Modern military accessories are typically tested 100% virtually, or using live participants only. A 100% virtual environment can differ in unforeseen aspects compared to live environments. A live environment can be expensive and time-consuming when development speed can prove critical.
A basic definition of soldier systems integration emphasises interoperability, in this context the requirement that each military system work in concert with other systems to enhance the physical performance, endurance, sustainability, survivability or lethality of the user. What is needed is a system and method that permits one or more of these aspects.
Summary
The present disclosure in one preferred aspect provides for a method for assessing the form, fit and capability of an item in a military situation. The method includes obtaining personal physical user characteristic data, including user weight and height; providing at least one item of military nature to the user virtually through a virtual medium; and assessing user interaction with the at least one item to create an assessment. The assessment includes at least: an anthropomorphic load assessment on the physiological structure of the wearer; a thermal scan; and a multispectral scan of the user during the user’s interaction with the at least one item. The method further includes providing, in a non-virtual environment, the user with the item customised to user specifications based on the assessment. In one or more preferred aspects, the method may involve where the item is an article of clothing or load carriage equipment, or is a piece of military hardware. The assessment may include user interaction with the item in a virtual environment that includes predetermined climatic conditions. An ergometric analysis of the user’s interaction with the item may be conducted as part of the assessment.
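Purely as an illustrative sketch of the data the method operates on, the Python fragment below models the personal characteristic data and the resulting assessment record; the field names, the crude load index and the Python representation are assumptions made for this sketch and are not part of the disclosed method.

    # Illustrative sketch only: the field names and the crude load index are
    # assumptions, not the data model specified in the disclosure.
    from dataclasses import dataclass, field

    @dataclass
    class UserCharacteristics:
        height_cm: float
        weight_kg: float
        gender: str = "unspecified"

    @dataclass
    class Assessment:
        anthropomorphic_load_index: float                        # load mass relative to body mass
        thermal_scan: dict = field(default_factory=dict)         # e.g. surface temperature by body region
        multispectral_scan: dict = field(default_factory=dict)   # e.g. reflectance by waveband

    def assess_interaction(user: UserCharacteristics, item_mass_kg: float) -> Assessment:
        """Create a minimal assessment record for one virtual item interaction."""
        return Assessment(anthropomorphic_load_index=item_mass_kg / user.weight_kg)

    print(assess_interaction(UserCharacteristics(180.0, 85.0), item_mass_kg=12.5))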
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed. In the present specification and claims, the word “comprising” and its derivatives including “comprises” and “comprise” include each of the stated integers, but do not exclude the inclusion of one or more further integers.
It will be appreciated that reference herein to “preferred” or “preferably” is intended as exemplary only. The claims as filed and attached with this specification are hereby incorporated by reference into the text of the present description.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate several embodiments of the invention and together with the description, serve to explain the principles of the invention.
Brief Description of the Figures
Fig. 1 is a wireframe diagram of a method for assessing the form, fit and capability of an item in a military situation.
Detailed Description of the Drawings
Reference will now be made in detail to the present preferred embodiments of the disclosure, examples of which are illustrated in the accompanying drawings.
It will be appreciated that reference herein to “preferred” or “preferably” is intended as exemplary only. Fig. 1 shows a wireframe diagram 100 of a method for the configuration of a virtual soldier and analysis of resultant anthropomorphic and ergonomic effects on the comfort and performance of the virtual soldier, using a virtual reality platform.
The virtual reality platform includes a main screen 102 that preferably operates through web-based utilities 104 using: 3D model file types rendered on virtual reality headsets (for example Oculus Quest or HTC Vive) or other user devices (such as mobile phones, laptops or other portable computing devices), and physical user interface components (for example buttons, dials and gestures) of the user devices. This allows the user to be anywhere in the world and have the ability to upload data in pre-determined acceptable formats for use in the system environment.
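As a hedged sketch only, the pre-determined acceptable formats and client devices might be captured in a small capability table like the one below; the specific file extensions and device names are assumptions for illustration, not a list taken from the disclosure.

    # Hypothetical capability table for uploads and client devices; the entries
    # are illustrative assumptions, not a definitive list from the disclosure.
    ACCEPTED_3D_FORMATS = {".fbx", ".obj", ".gltf", ".glb"}

    CLIENT_DEVICES = {
        "vr_headset": ["Oculus Quest", "HTC Vive"],
        "portable":   ["mobile phone", "laptop", "other portable computing device"],
    }

    def is_upload_acceptable(filename: str) -> bool:
        """Return True if an uploaded 3D model uses a pre-determined acceptable format."""
        return any(filename.lower().endswith(ext) for ext in ACCEPTED_3D_FORMATS)

    print(is_upload_acceptable("prototype_helmet.glb"))  # True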
The main screen 102 will output a three-dimensional visualisation of a virtual soldier and equipment based on user inputs. The three-dimensional visualisation of the virtual soldier may show various musculoskeletal and anthropomorphic analyses separately or as an overlay on the virtual soldier.
The user inputs are made via the configuration workshop 106 and the evaluation environment 108.
The configuration workshop 106 allows a user to input personal physical user characteristic data, including user gender 107, weight and height 109, and to provide at least one item of military nature to the user virtually. The configuration workshop 106 allows a user to focus virtually on one soldier system from a number of soldier systems (for example the head, torso, extremities, load carriage or ballistic protective elements) and reconfigure the virtual soldier. The configuration workshop 106 allows a user to reconfigure the military equipment of a virtual soldier from a number of preconfigured contemporary roles, such as: Australian Defence Force current fit-out (e.g.: Infantry Rifleman Soldier Combat Ensemble 19) 112, prototype fit-out (e.g.: market scanning Best of Breed) 114, sub-system fit-out (e.g.: head as a system) 116, specialist role fit-out (e.g.: infantry grenadier, machine gunner or combat engineer (demolitions)) 118 and sub-system fit-out (e.g.: other systems) 120. The at least one item of military nature may be an item of clothing, load carriage equipment, or a piece of military hardware.
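A minimal sketch of how the preconfigured roles 112-120 and the personal data 107/109 might be combined into a virtual soldier configuration; the role keys and equipment lists are placeholders invented for this sketch, not the actual fit-outs.

    # Hypothetical role presets for the configuration workshop 106; the equipment
    # lists are illustrative placeholders, not the actual fit-outs.
    ROLE_PRESETS = {
        "adf_current_sce19":       ["combat helmet", "body armour", "load carriage vest", "rifle"],
        "prototype_best_of_breed": ["prototype helmet", "prototype armour"],
        "sub_system_head":         ["helmet shell", "hearing protection", "night vision mount"],
        "specialist_grenadier":    ["combat helmet", "body armour", "grenade launcher", "ammunition pouches"],
    }

    def configure_virtual_soldier(gender: str, height_cm: float, weight_kg: float, role: str) -> dict:
        """Build a virtual soldier configuration from personal data and a preset role."""
        return {
            "gender": gender,
            "height_cm": height_cm,
            "weight_kg": weight_kg,
            "equipment": list(ROLE_PRESETS[role]),
        }

    soldier = configure_virtual_soldier("female", 168.0, 70.0, "adf_current_sce19")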
The evaluation environment 108 allows a user to input visual, thermal, multispectral, physical and other data to create virtual environments 122 that replicate virtually the visual and climatic conditions of selected environments or the internal fittings of military vehicles. The evaluation environment 108 also allows a user to input camouflage changes 124 to the user and equipment. The evaluation environment 108 also allows a user to input the action of the virtual soldier (e.g.: lift and carry 128 or other simple tasks 130) and position of the virtual soldier (e.g.: firing positions 132).
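The evaluation environment's inputs could, again purely as an assumption for illustration, be grouped as below; the climate presets, actions and positions are placeholders echoing items 122-132 rather than values from the disclosure.

    # Hypothetical evaluation environment 108 settings; the presets are assumptions
    # used only to illustrate how items 122-132 might be parameterised.
    from dataclasses import dataclass

    CLIMATE_PRESETS = {
        "desert_day":    {"air_temp_c": 45.0, "humidity_pct": 10.0, "solar_load": "high"},
        "jungle":        {"air_temp_c": 32.0, "humidity_pct": 90.0, "solar_load": "medium"},
        "vehicle_cabin": {"air_temp_c": 38.0, "humidity_pct": 40.0, "solar_load": "low"},
    }

    ACTIONS = ("lift_and_carry", "march", "climb")                 # cf. items 128 and 130
    POSITIONS = ("standing_fire", "kneeling_fire", "prone_fire")   # cf. item 132

    @dataclass
    class EvaluationEnvironment:
        climate: dict
        camouflage: str
        action: str
        position: str

    env = EvaluationEnvironment(CLIMATE_PRESETS["desert_day"], "auscam", "lift_and_carry", "kneeling_fire")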
The integration laboratory 110 assesses the virtual soldier’s interaction with the at least one item of military nature in the evaluation environment, the assessment preferably including at least a thermal scan and multispectral scan of the virtual soldier during the virtual soldier’s interaction with the at least one item of military equipment. The assessment preferably includes: field of view analysis 134 to determine the effect of the configuration of the at least one item of military equipment on the field of view of the virtual soldier; range of motion analysis 136 to determine the effect of the configuration of the at least one item of military equipment on the range of motion of the virtual soldier; thermal burden analysis 138 to determine the effect of the configuration of the at least one item of military equipment on the body temperature of the virtual soldier; load-based posture analysis 140 to determine the effect of the configuration of the at least one item of military equipment on the posture of the virtual soldier; and vehicle integration analysis 142 to determine the effect of the configuration of the at least one item of military equipment on the ability of the user to ride in a vehicle. Based on these analyses, the integration laboratory 110 determines the resultant physiological stresses on the various systems of the human body based on the inputs. The integration laboratory includes an integration optimiser that may use a calculation, machine learning or artificial intelligence algorithm to recommend integration of the various inputs in ways that will minimise the resultant physiological stresses on the various systems of the human body based on the inputs. The details of machine learning and artificial intelligence will involve calculations using mathematical algorithms that would be understood by those of ordinary skill in the art (for example only, a Bayesian algorithm for use as a classifier). Feature sets may involve human physiological stress conditions and items of equipment and/or articles of clothing.
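The disclosure names a Bayesian classifier only as one example of a suitable algorithm. The sketch below shows a generic Gaussian naive Bayes classifier (via scikit-learn) over a synthetic feature set of carried load, ambient temperature and clothing insulation; the features, labels and training data are invented for illustration and do not come from the patent.

    # Illustrative Gaussian naive Bayes classifier over a synthetic feature set;
    # the features, labels and data are assumptions, not the disclosed model.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    # One row per candidate configuration:
    # [carried load (kg), ambient temperature (deg C), clothing insulation (clo)]
    X_train = np.array([
        [10.0, 20.0, 1.0],
        [35.0, 40.0, 1.8],
        [20.0, 25.0, 1.2],
        [45.0, 42.0, 2.0],
        [15.0, 30.0, 1.1],
        [40.0, 38.0, 1.9],
    ])
    y_train = np.array([0, 1, 0, 1, 0, 1])  # 0 = acceptable physiological stress, 1 = excessive stress

    clf = GaussianNB().fit(X_train, y_train)

    candidate = np.array([[30.0, 35.0, 1.5]])
    print(clf.predict(candidate), clf.predict_proba(candidate))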
The user can then change the inputs in the configuration workshop 106 and the evaluation environment 108. The integration laboratory 110 utilises user experience, machine learning, artificial intelligence, anthropomorphic and ergonomic analysis to assess the impacts of this change and informs the user of the feasibility, acceptability, and suitability of the change.
The output of this analysis is shown on the main screen 102 which visually depicts the virtual soldier’s ability to perform standard tasks (for example move through range of motion, adopt standard firing positions and interact with various military vehicles) based on the inputs. The main screen 102 visually depicts an adjustment to the posture of the virtual soldier based on the inputs.
The output data of the virtual reality platform is used to assign relative capability and performance metrics (for example, weight, protection, mobility) against the preconfigured soldier systems configuration for future users.
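A rough sketch of how the relative capability and performance metrics might be expressed against the preconfigured baseline; the metric names and numbers are assumptions for illustration only.

    # Hypothetical comparison of a candidate configuration against the
    # preconfigured baseline; metric names and values are illustrative assumptions.
    BASELINE = {"weight_kg": 32.0, "protection_score": 0.80, "mobility_score": 0.70}

    def relative_metrics(candidate: dict, baseline: dict = BASELINE) -> dict:
        """Express each candidate metric relative to the baseline (1.0 = unchanged)."""
        return {k: round(candidate[k] / baseline[k], 3) for k in baseline}

    candidate = {"weight_kg": 29.5, "protection_score": 0.78, "mobility_score": 0.76}
    print(relative_metrics(candidate))  # {'weight_kg': 0.922, 'protection_score': 0.975, 'mobility_score': 1.086}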
The virtual reality platform may additionally use augmented reality to visually display annotations or analysis of the configuration of a physical soldier from the system overlaid on reality using reference points, QR codes, or other visual markers. The virtual reality platform may additionally use photogrammetry to visually ingest the form factor of a physical item in order to model it in the system.
The foregoing description is by way of example only, and may be varied considerably without departing from the scope of the present disclosure. For example only, the disclosure may relate to a user interaction system to assess user interaction with a military item to produce a customised version of the military item for later user use, including: a database configured to store a particular user’s personal physical data, including weight and height; a processor configured to create a virtual reality environment where a virtual reality presentation of the military item is presented for user interaction; and a virtual reality headset configured to permit the user to interact with the virtual reality platform, wherein the headset is configured to transmit user interaction data to the processor, and the processor outputs customised user data in a deliverable form to enable manufacture of the military item interacted with by the user. This system may include a sensor array including a multispectral scanner and a thermal scanner.
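A hedged end-to-end sketch of the example system's data flow, from stored personal data through a (stubbed) headset session to a deliverable output; the function names, the stubbed session result and the JSON output format are assumptions, not the disclosed implementation.

    # Hypothetical end-to-end flow: store user data, run a stubbed VR fitting
    # session, and emit a deliverable specification for manufacture. All names
    # and the output format are assumptions made for this sketch.
    import json

    USER_DB = {}  # stand-in for the database of personal physical data

    def store_user(user_id: str, height_cm: float, weight_kg: float) -> None:
        USER_DB[user_id] = {"height_cm": height_cm, "weight_kg": weight_kg}

    def run_vr_session(user_id: str, item: str) -> dict:
        """Stub for the headset-driven interaction; returns illustrative fit adjustments."""
        return {"item": item, "shoulder_strap_cm": 42.0, "waist_belt_cm": 88.0}

    def manufacturing_spec(user_id: str, session: dict) -> str:
        """Combine stored user data and session output into a deliverable form."""
        return json.dumps({"user": USER_DB[user_id], "customisation": session}, indent=2)

    store_user("soldier_01", height_cm=182.0, weight_kg=90.0)
    print(manufacturing_spec("soldier_01", run_vr_session("soldier_01", "load carriage vest")))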
Military items may include an article of clothing, a military grade weapon (e.g., a variety of guns, etc.), and/or a personal, load-bearing article for carrying equipment (e.g., a harness).
The processor may be configured to include the user’s physical data in combination with an initial military item design and assess user physical adaptability in combination with the initial item design, including user reaction speed and user energy output over different terrains.
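The disclosure does not specify how user energy output over different terrains is computed. Purely as an illustrative stand-in, the sketch below uses the well-known Pandolf load-carriage equation with approximate terrain factors; this is not the patent's method, and the coefficients shown are rounded literature values.

    # Illustrative estimate of metabolic rate over different terrains using the
    # Pandolf load-carriage equation; a stand-in chosen for this sketch, not the
    # method disclosed in the patent. Terrain factors are approximate.
    def pandolf_watts(body_kg: float, load_kg: float, speed_ms: float, grade_pct: float, terrain: float) -> float:
        """Pandolf et al. (1977): metabolic rate in watts for walking with a load."""
        return (1.5 * body_kg
                + 2.0 * (body_kg + load_kg) * (load_kg / body_kg) ** 2
                + terrain * (body_kg + load_kg) * (1.5 * speed_ms ** 2 + 0.35 * speed_ms * grade_pct))

    TERRAIN_FACTORS = {"paved road": 1.0, "dirt track": 1.1, "heavy brush": 1.5, "loose sand": 2.1}

    for name, eta in TERRAIN_FACTORS.items():
        watts = pandolf_watts(body_kg=85.0, load_kg=30.0, speed_ms=1.34, grade_pct=0.0, terrain=eta)
        print(f"{name}: {watts:.0f} W")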
The components described with respect to one embodiment may be applied to other embodiments, or combined with or interchanged with the features of other embodiments, as appropriate, without departing from the scope of the present disclosure.
The present disclosure in a preferred form provides the advantages that the virtual reality platform can assess the current configuration of at least one item of military equipment and reconfigure it automatically to best suit the virtual soldier. This enables designers and manufacturers to analyse the form and function of their own visual prototypes and iterate through design cycles virtually without the requirement to produce expensive physical prototypes until the design is more advanced. Virtual iteration of equipment prototypes through the system will save time and money for designers and manufacturers of military equipment, enabling new equipment to be produced faster to meet soldier operational needs in the fluid battlespace environment.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of forms of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims

What is claimed is:
1. A method for assessing the form, fit and capability of an item in a military situation, comprising: obtaining personal physical user characteristic data, including user weight and height; providing at least one item of military nature to the user virtually through a virtual medium; assessing user interaction with the at least one item to create an assessment, the assessment including at least: an anthropomorphic load assessment on the physiological structure of the wearer; a thermal scan; and a multispectral scan of the user during the user’s interaction with the at least one item; and providing, in a non-virtual environment, the user with the item customised to user specifications based on the assessment.
2. The method of claim 1, wherein the item is an article of clothing or load carriage equipment.
3. The method of claim 1, wherein the item is a piece of military hardware.
4. The method of claim 1, wherein the assessment includes user interaction with the item in a virtual environment that includes predetermined climatic conditions.
5. The method of claim 1, where an ergometric analysis of the user’s interaction with the item analysis is conducted as part of the assessment.
6. A user interaction system to assess user interaction with a military item to produce a customised version of the military item for later user real-world use, comprising: a virtual reality platform, including: a database configured to store a particular user’s personal physical data, including weight and height; a processor configured to create a virtual reality environment where a virtual reality presentation of the military item is presented for user interaction based on the particular user’s personal physical data; and a virtual reality headset configured to permit the user to interact with the virtual reality platform, wherein said headset is configured to transmit user interaction data to said processor, and said processor outputs customised user data in a deliverable form to enable non-virtual manufacture of the military item interacted with by the user.
7. The system of claim 6, further comprising a sensor array including a multispectral scanner.
8. The system of claim 7, wherein the sensor array further includes a thermal scanner.
9. The system of any one of claims 6 to 8, wherein the military item is an article of clothing.
10. The system of any one of claims 6 to 8, wherein the military item is a military grade weapon.
11. The system of any one of claims 6 to 8, wherein the military item is a personal, load-bearing article for carrying equipment.
12. The system of any one of claims 6 to 11, wherein said processor is configured to include the user’s physical data in combination with an initial military item design and assess user physical adaptability in combination with the initial item design, including user reaction speed and user energy output over different terrains.
PCT/AU2022/050522 2021-06-01 2022-05-30 Three-dimensional visualised soldier systems integration platform WO2022251902A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2022285319A AU2022285319A1 (en) 2021-06-01 2022-05-30 Three-dimensional visualised soldier systems integration platform

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021901633A AU2021901633A0 (en) 2021-06-01 Three-Dimensional Visualised Soldier Systems Integration Platform
AU2021901633 2021-06-01

Publications (1)

Publication Number Publication Date
WO2022251902A1 (en) 2022-12-08

Family

ID=78488581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2022/050522 WO2022251902A1 (en) 2021-06-01 2022-05-30 Three-dimensional visualised soldier systems integration platform

Country Status (2)

Country Link
AU (2) AU2021106739A4 (en)
WO (1) WO2022251902A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015011678A1 (en) * 2013-07-26 2015-01-29 Eva S.R.L. Method for determining a fitting index of a garment based on anthropometric data of a user, and device and system thereof
US20160063320A1 (en) * 2014-08-29 2016-03-03 Susan Liu Virtual body scanner application for use with portable device
US20160247017A1 (en) * 2010-06-08 2016-08-25 Raj Sareen Method and system for body scanning and display of biometric data
US20160284132A1 (en) * 2015-03-23 2016-09-29 Electronics And Telecommunications Research Institute Apparatus and method for providing augmented reality-based realistic experience
US20200074521A1 (en) * 2018-09-05 2020-03-05 Gerber Technology Llc Method and appratus for the production of garments

Also Published As

Publication number Publication date
AU2021106739A4 (en) 2021-11-11
AU2022285319A1 (en) 2023-12-14

Similar Documents

Publication Publication Date Title
US11164381B2 (en) Clothing model generation and display system
US10475113B2 (en) Method system and medium for generating virtual contexts from three dimensional models
JP6116784B1 (en) 3D model generation system, 3D model generation method, and program
Blanchonette Jack Human Modelling Tool: A Review.
EP3136346B1 (en) Method and system for vision measure for digital human models
Valentini Interactive virtual assembling in augmented reality
US11145418B2 (en) System and method for model-based calculation of blast exposure
Kwon et al. Optimal camera point selection toward the most preferable view of 3-d human pose
Eldar et al. Ergonomic design visualization mapping-developing an assistive model for design activities
WO2022251902A1 (en) Three-dimensional visualised soldier systems integration platform
Abdel-Malek et al. Development of the virtual-human Santos TM
US20160357880A1 (en) Computer-readable storage medium, design apparatus, and design method
Matthews et al. Shape aware haptic retargeting for accurate hand interactions
Colvin et al. Multiple user motion capture and systems engineering
Zhou et al. Anthropometry model generation based on ANSUR II database
Fečová et al. Devices and software possibilities for using of motion tracking systems in the virtual reality system
Hariri et al. Optimization-based prediction of aiming and kneeling military tasks performed by a soldier
Deitz Human-integrated design
Lockett et al. Proposed integrated human figure modeling analysis approach for the army's future combat systems
Yang et al. The IOWA interactive digital-human virtual environment
Hariri Optimization-Based Prediction of the Motion of a Soldier Performing the ‘Going Prone’ and ‘Get Up from Prone’ Military Tasks
Boyd et al. Using multivariate analysis to select accommodation boundary manikins from a population database
Perret Haptic device integration
Wang et al. Vehicle Accessibility Evaluation Method Based on Digital Prototype
Smith et al. Studying visibility as a constraint and as an objective for posture prediction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22814600

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2022285319

Country of ref document: AU

Ref document number: AU2022285319

Country of ref document: AU

ENP Entry into the national phase

Ref document number: 2022285319

Country of ref document: AU

Date of ref document: 20220530

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE