AU2021106739A4 - Three-Dimensional Visualised Soldier Systems Integration Platform

Three-Dimensional Visualised Soldier Systems Integration Platform

Info

Publication number
AU2021106739A4
Authority
AU
Australia
Prior art keywords
user
item
soldier
military
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2021106739A
Inventor
Shane Sarlin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Buzzworks Think Tank Pty Ltd
Original Assignee
Buzzworks Think Tank Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2021901633A0
Application filed by Buzzworks Think Tank Pty Ltd
Application granted
Publication of AU2021106739A4
Assigned to Buzzworks Think Tank Pty Ltd (request for assignment; former assignor: Buzzworks Pty Ltd)
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 Simulators for teaching or training purposes
    • G09B 9/003 Simulators for teaching or training purposes for military purposes and tactics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 7/00 Computing arrangements based on specific mathematical models
    • G06N 7/01 Probabilistic graphical models, e.g. probabilistic networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)
  • Golf Clubs (AREA)
  • Micro-Organisms Or Cultivation Processes Thereof (AREA)
  • Road Signs Or Road Markings (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A basic definition of soldier systems integration emphasises interoperability: in this context, the requirement that each military system work in concert with other systems to enhance the physical performance, endurance, sustainability, survivability or lethality of the user. This soldier systems integration platform uses virtual reality as a tool to accelerate soldier system of systems integration by analysing the effects on the soldier system when a new item is introduced. The system operates through a standard virtual reality headset (client device) connected through a web-based interface and is updated remotely based on user requirements and user virtual model inputs. The system accurately depicts a modern soldier in contemporary configuration with full equipment, and the user can virtually locate and affix their virtual model onto the configured soldier for the system to measure and analyse the impact of this addition on the soldier system of systems. The platform enables configuration experimentation to inform physical equipment selection, including but not limited to relative configuration, weight, mobility, range of motion, thermal burden and personal protection compared to the original soldier system configuration.

Description

THREE-DIMENSIONAL VISUALISED SOLDIER SYSTEMS INTEGRATION PLATFORM
Brief Description of the Figures
Fig. 1 is a wireframe diagram of a method for assessing the form, fit and capability of an item in a military situation.
Detailed Description of the Drawings
Reference will now be made in detail to the present preferred embodiments of the disclosure, examples of which are illustrated in the accompanying drawings.
It will be appreciated that reference herein to "preferred" or "preferably" is intended as exemplary only.
Fig. 1 shows a wireframe diagram 100 of a method for the configuration of a virtual soldier and analysis of resultant anthropomorphic and ergonomic effects on the comfort and performance of the virtual soldier, using a virtual reality platform.
The virtual reality platform includes a main screen 102 that operates, preferably through web-based utilities 104, using: 3D model file types on user devices (for example Oculus Quest or HTC Vive headsets, mobile phones, laptops or other portable computing devices), and physical user interface components (for example buttons, dials and gestures) of the user devices. This allows the user to be anywhere in the world and have the ability to upload data in pre-determined acceptable formats for use in the system environment.
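By way of illustration only, the following is a minimal sketch of how uploads could be checked against pre-determined acceptable formats before entering the system environment; the accepted file extensions, size limit and function name are assumptions made for the sketch and are not specified by this disclosure.

```python
# Minimal sketch (assumptions only): validate an uploaded 3D model file before it is
# admitted to the system environment.
from pathlib import Path

ACCEPTED_MODEL_FORMATS = {".fbx", ".obj", ".gltf", ".glb"}  # assumed whitelist
MAX_UPLOAD_BYTES = 250 * 1024 * 1024                        # assumed upload ceiling

def validate_model_upload(path: str) -> bool:
    """Return True if the file exists, is within the size limit and uses an accepted format."""
    p = Path(path)
    return (
        p.is_file()
        and p.suffix.lower() in ACCEPTED_MODEL_FORMATS
        and p.stat().st_size <= MAX_UPLOAD_BYTES
    )
```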
The main screen 102 will output a three-dimensional visualisation of a virtual soldier and equipment based on user inputs. The three-dimensional visualisation of the virtual soldier may show various musculoskeletal and anthropomorphic analyses separately or as an overlay on the virtual soldier.
The user inputs are made via the configuration workshop 106 and the evaluation environment 108.
The configuration workshop 106 allows a user to input personal physical user characteristic data, including user gender 107, weight and height 109, and provides at least one item of military nature to the user virtually. The configuration workshop 106 allows a user to focus virtually on one soldier system from a number of soldier systems (for example the head, torso, extremities, load carriage or ballistic protective elements) and reconfigure the virtual soldier. The configuration workshop 106 allows a user to reconfigure the military equipment of a virtual soldier from a number of preconfigured contemporary roles, such as: ADF current fit-out (eg: Rifleman SCE18) 112, prototype fit-out (eg: BoB rifleman) 114, sub-system fit-out (eg: head as a system) 116, specialist role fit-out (eg: IAW L125-4 ISS ITR) 118 and sub-system fit-out (eg: other systems) 120. The at least one item of military nature may be an item of clothing or a piece of military hardware.
The evaluation environment 108 allows a user to input visual, thermal, multispectral, physical and other data to create virtual environments 122 that replicate virtually the visual and climatic conditions of selected environments or military vehicles. The evaluation environment 108 also allows a user to input camouflage changes 124 to the user and equipment. The evaluation environment 108 also allows a user to input the action of the virtual soldier (eg: lift and carry 128 or other simple tasks 130) and position of the virtual soldier (eg: firing positions 132).
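By way of illustration only, the following sketch shows one possible representation of the inputs gathered by the configuration workshop 106 and the evaluation environment 108; the class names, field names and role identifiers are assumptions made for the sketch rather than a data model defined by this disclosure.

```python
# Illustrative sketch only: representing configuration workshop and evaluation
# environment inputs. All names are assumptions, not the patent's schema.
from dataclasses import dataclass, field
from enum import Enum

class PreconfiguredRole(Enum):
    ADF_CURRENT_FITOUT = "Rifleman SCE18"
    PROTOTYPE_FITOUT = "BoB rifleman"
    HEAD_SUBSYSTEM = "head as a system"
    SPECIALIST_ROLE = "IAW L125-4 ISS ITR"

@dataclass
class UserCharacteristics:
    gender: str
    height_cm: float
    weight_kg: float

@dataclass
class SoldierConfiguration:
    user: UserCharacteristics
    role: PreconfiguredRole
    items: list[str] = field(default_factory=list)  # virtual items of military nature

    def affix_item(self, item_id: str) -> None:
        """Virtually locate and affix an item onto the configured soldier."""
        self.items.append(item_id)

@dataclass
class EvaluationEnvironment:
    name: str                # e.g. a selected climate or a military vehicle interior
    ambient_temp_c: float
    relative_humidity: float
    camouflage_pattern: str  # applied to the user and the equipment
    task: str                # e.g. "lift and carry" or a firing position
```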
The integration laboratory 110 assesses the virtual soldier interaction with the at least one item of military nature in the evaluation environment, the assessment preferably including at least a thermal scan and multispectral scan of the virtual soldier during the virtual soldier's interaction with the at least one item of military equipment. The assessment preferably includes: field of view analysis 134 to determine the effect of the configuration of the at least one item of military equipment on the field of view of the virtual soldier; range of motion analysis 136 to determine the effect of the configuration of the at least one item of military equipment on the range of motion of the virtual soldier; thermal burden analysis 138 to determine the effect of the configuration of the at least one item of military equipment on the body temperature of the virtual soldier; load based posture analysis 140 to determine the effect of the configuration of the at least one item of military equipment on the posture of the virtual soldier; and vehicle integration analysis 142 to determine the effect of the configuration of the at least one item of military equipment on the ability of the user to ride in a vehicle. Based on these analyses, the integration laboratory 110 determines the resultant physiological stresses on the various systems of the human body based on the inputs. The integration laboratory comprises an integration optimiser that recommends integration of the various inputs in ways that will minimise the resultant physiological stresses on the various systems of the human body based on the inputs.
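By way of illustration only, the integration optimiser could be sketched as a search over candidate configurations that minimises a combined stress score produced by the analyses 134 to 142; the equal weighting and the function names below are assumptions for the sketch, not the method of this disclosure.

```python
# Hedged sketch: combine the analyses into one physiological-stress score and
# recommend the configuration that minimises it.
from typing import Callable, Sequence

Analysis = Callable[[dict], float]  # maps a configuration to a normalised stress contribution

def combined_stress(config: dict, analyses: Sequence[Analysis]) -> float:
    """Sum stress contributions from field of view, range of motion, thermal burden,
    load-based posture and vehicle integration analyses (equal weighting assumed)."""
    return sum(analysis(config) for analysis in analyses)

def optimise_integration(candidates: Sequence[dict], analyses: Sequence[Analysis]) -> dict:
    """Recommend the candidate configuration with the lowest combined stress."""
    return min(candidates, key=lambda c: combined_stress(c, analyses))
```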
The user can then change the inputs in the configuration workshop 106 and the evaluation environment 108. The integration laboratory 110 utilises user experience, anthropomorphic and ergonomic analysis to assess the impacts of this change and informs the user of the feasibility, acceptability, and suitability of the change.
The output of this analysis is shown on the main screen 102 which visually depicts the virtual soldier's ability to perform standard tasks (for example move through range of motion, adopt standard firing positions and interact with various military vehicles) based on the inputs. The main screen 102 visually depicts an adjustment to the posture of the virtual soldier based on the inputs.
The output data of the virtual reality platform is used to assign relative capability and performance metrics (for example weight, protection, mobility) against the preconfigured soldier systems configuration for future users.
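By way of illustration only, such relative metrics could be computed as simple deltas against the preconfigured baseline configuration, as in the sketch below; the metric names and values are hypothetical.

```python
# Hypothetical example: change in each shared metric relative to the baseline
# preconfigured soldier system.
def relative_metrics(candidate: dict, baseline: dict) -> dict:
    """Return candidate minus baseline for every metric present in both configurations."""
    return {k: candidate[k] - baseline[k] for k in candidate.keys() & baseline.keys()}

print(relative_metrics(
    {"weight_kg": 32.5, "mobility_score": 0.71, "protection_score": 0.88},
    {"weight_kg": 30.0, "mobility_score": 0.78, "protection_score": 0.85},
))
```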
The virtual reality platform of this invention may additionally use augmented reality to visually analyse the configuration of a physical soldier and model them in the system. The virtual reality platform may additionally use photogrammetry to visually ingest the form factor of a physical item in order to model it in the system.
The foregoing description is by way of example only, and may be varied considerably without departing from the scope of the present disclosure. For example only, the disclosure may relate to a user interaction system to assess user interaction with a military item to produce a customised version of the military item for later user use, including: a database configured to store a particular user's personal physical data, including weight and height; a processor configured to create a virtual reality environment where a virtual reality presentation of the military item is presented for user interaction; and a virtual reality headset configured to permit the user to interact with the virtual reality platform, wherein the headset is configured to transmit user interaction data to said processor, and the processor outputs customised user data in a deliverable form to enable manufacture of the military item interacted with by the user. This system may comprise a sensor array including a multispectral scanner and a thermal scanner.
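By way of illustration only, the following sketch outlines the data flow of such a user interaction system: a store of user physical data, a processor that receives interaction data transmitted by the headset, and an export of customised user data in a deliverable form; the class and method names are assumptions and do not represent an implementation of this disclosure.

```python
# Architectural sketch (assumed names): database record, processor, and manufacture export.
from dataclasses import dataclass

@dataclass
class UserRecord:
    user_id: str
    height_cm: float
    weight_kg: float

class IntegrationProcessor:
    def __init__(self, database: dict[str, UserRecord]):
        self.database = database        # stores each user's personal physical data
        self.interactions: list[dict] = []

    def receive_interaction(self, event: dict) -> None:
        """Accept interaction data transmitted by the virtual reality headset."""
        self.interactions.append(event)

    def export_customisation(self, user_id: str) -> dict:
        """Output customised user data in a deliverable form to enable manufacture."""
        user = self.database[user_id]
        return {"user": user, "fit_adjustments": self.interactions}
```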
The components described with respect to one embodiment may be applied to other embodiments, or combined with or interchanged with the features of other embodiments, as appropriate, without departing from the scope of the present disclosure.
The present disclosure in a preferred form provides the advantage that the virtual reality platform can assess the current configuration of the at least one item of military equipment and reconfigure it automatically to best suit the virtual soldier. This enables designers and manufacturers to analyse the form and function of their own virtual prototypes and iterate through design cycles virtually, without the requirement to produce expensive physical prototypes until the design is more advanced. Virtual iteration of equipment prototypes through the system will save time and money for designers and manufacturers of military equipment, enabling new equipment to be produced faster to meet soldier operational needs in the fluid battlespace environment.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of forms of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (8)

What is claimed is:
1. A method for assessing the form, fit and capability of an item in a military situation, comprising: obtaining personal physical user characteristic data, including user weight and height; providing at least one item of military nature to the user virtually through a virtual medium; assessing user interaction with the at least one item, the assessment including at least: a thermal scan; and a multispectral scan of the user during the user's interaction with the at least one item; and providing the user with the item customised to the user based on the assessment, the providing of the item being conducted in a non-virtual environment.
2. The method of claim 1, wherein the item is an article of clothing.
3. The method of claim 1, wherein the item is a piece of military hardware.
4. The method of claim 1, wherein the assessment includes user interaction with the item in a virtual environment that includes predetermined climatic conditions.
5. The method of claim 1, wherein an ergometric analysis of the user's interaction with the item is conducted as part of the assessment.
6. A user interaction system to assess user interaction with a military item to produce a customised version of the military item for later user use, comprising: a virtual reality platform, including: a database configured to store a particular user's personal physical data, including weight and height; a processor configured to create a virtual reality environment where a virtual reality presentation of the military item is presented for user interaction; and a virtual reality headset configured to permit the user to interact with the virtual reality platform, wherein said headset is configured to transmit user interaction data to said processor, and said processor outputs customised user data in a deliverable form to enable manufacture of the military item interacted with by the user.
7. The system of claim 6, further comprising a sensor array including a multispectral scanner.
8. The system of claim 7, wherein the sensor array further includes a thermal scanner.
AU2021106739A 2021-06-01 2021-08-24 Three-Dimensional Visualised Soldier Systems Integration Platform Active AU2021106739A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021901633A AU2021901633A0 (en) 2021-06-01 Three-Dimensional Visualised Soldier Systems Integration Platform
AU2021901633 2021-06-01

Publications (1)

Publication Number Publication Date
AU2021106739A4 true AU2021106739A4 (en) 2021-11-11

Family

ID=78488581

Family Applications (2)

Application Number Title Priority Date Filing Date
AU2021106739A Active AU2021106739A4 (en) 2021-06-01 2021-08-24 Three-Dimensional Visualised Soldier Systems Integration Platform
AU2022285319A Pending AU2022285319A1 (en) 2021-06-01 2022-05-30 Three-dimensional visualised soldier systems integration platform

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2022285319A Pending AU2022285319A1 (en) 2021-06-01 2022-05-30 Three-dimensional visualised soldier systems integration platform

Country Status (2)

Country Link
AU (2) AU2021106739A4 (en)
WO (1) WO2022251902A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10702216B2 (en) * 2010-06-08 2020-07-07 Styku, LLC Method and system for body scanning and display of biometric data
ITTO20130634A1 (en) * 2013-07-26 2015-01-27 Eva S R L METHOD FOR DETERMINING AN INDEX OF WEAR OF A CLOTHING ON THE BASIS OF AN ANTHROPOMETRIC DATA OF A USER, AND RELATED TO THE DEVICE AND SYSTEM
US20160063320A1 (en) * 2014-08-29 2016-03-03 Susan Liu Virtual body scanner application for use with portable device
KR101740326B1 (en) * 2015-03-23 2017-06-08 한국전자통신연구원 Realistic experience apparatus based augmented reality and method of providing realistic experience based augmented reality
US20200071111A1 (en) * 2018-09-05 2020-03-05 Gerber Technology Llc Flexible material transport system

Also Published As

Publication number Publication date
WO2022251902A1 (en) 2022-12-08
AU2022285319A1 (en) 2023-12-14

Similar Documents

Publication Publication Date Title
US11164381B2 (en) Clothing model generation and display system
TWI505709B (en) System and method for determining individualized depth information in augmented reality scene
US10416769B2 (en) Physical haptic feedback system with spatial warping
US9477312B2 (en) Distance based modelling and manipulation methods for augmented reality systems using ultrasonic gloves
JP2012168798A (en) Information processing device, authoring method, and program
US20120265104A1 (en) Posture observer for ergonomic observation, posture analysis and reconstruction
Ma et al. A framework for interactive work design based on motion tracking, simulation, and analysis
US20200118336A1 (en) Information processing device, information processing method, and program
KR102585821B1 (en) Augmented reality device and positioning method
JP2010257081A (en) Image procession method and image processing system
CN106485748B (en) Method and system for visual measurement of digital mannequins
Bogatinov et al. Firearms training simulator based on low cost motion tracking sensor
Punpongsanon et al. Extended LazyNav: Virtual 3D ground navigation for large displays and head-mounted displays
Gerschütz et al. A review of requirements and approaches for realistic visual perception in virtual reality
KR102502488B1 (en) Program, system, electronic device and method for recognizing three-dimensional objects
AU2021106739A4 (en) Three-Dimensional Visualised Soldier Systems Integration Platform
Matthews et al. Shape aware haptic retargeting for accurate hand interactions
US11847792B2 (en) Location determination and mapping with 3D line junctions
Lee et al. 3D scan to product design: Methods, techniques, and cases
Colvin et al. Multiple user motion capture and systems engineering
KR102157246B1 (en) Method for modelling virtual hand on real hand and apparatus therefor
TWM598411U (en) Augmented reality device
Abdullah et al. A virtual environment with haptic feedback for better distance estimation
Barhoush et al. A novel experimental design of a real-time VR tracking device
WO2023054661A1 (en) Gaze position analysis system and gaze position analysis method

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
PC Assignment registered

Owner name: BUZZWORKS THINK TANK PTY LTD

Free format text: FORMER OWNER(S): BUZZWORKS PTY LTD