US11854424B2 - Virtual reality simulation and method - Google Patents
Virtual reality simulation and method
- Publication number
- US11854424B2 (application US 17/278,913)
- Authority
- US
- United States
- Prior art keywords
- processor
- display
- controller
- user
- responsive
- Prior art date
- Legal status
- Active, expires
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16C—COMPUTATIONAL CHEMISTRY; CHEMOINFORMATICS; COMPUTATIONAL MATERIALS SCIENCE
- G16C20/00—Chemoinformatics, i.e. ICT specially adapted for the handling of physicochemical or structural data of chemical particles, elements, compounds or mixtures
- G16C20/60—In silico combinatorial chemistry
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/24—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for chemistry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16C—COMPUTATIONAL CHEMISTRY; CHEMOINFORMATICS; COMPUTATIONAL MATERIALS SCIENCE
- G16C20/00—Chemoinformatics, i.e. ICT specially adapted for the handling of physicochemical or structural data of chemical particles, elements, compounds or mixtures
- G16C20/10—Analysis or design of chemical reactions, syntheses or processes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/40—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- the present disclosure generally relates to an apparatus and methods for virtual reality training, and more particularly to methods and devices utilizing a combination of a processor device, visual outputs, and sensor devices for use in facilitating virtual reality simulations, including a microscope simulation, a bacteria streaking simulation, and/or a container visual inspection simulation.
- Training for laboratory situations is required for most workers in a laboratory and/or manufacturing environment.
- Real-world training can be time consuming (e.g., waiting for the bacteria to grow), expensive (e.g., microscopes can be damaged by improper use), and/or risky if students make mistakes on real products and contaminated containers get sent to hospitals and/or are used on patients.
- In real-world training, specific elements cannot be emphasized or altered to better ingrain the training. This is particularly the case for training that requires extensive laboratory time for various certification or degree programs. Such training is expensive in capital equipment and also requires tear-down, cleaning, and set-up costs.
- One aspect of the present disclosure comprises a non-transitory computer readable medium storing instructions executable by an associated processor to perform a method for implementing a bacteria streaking simulation comprising generating a three-dimensional initial view based upon a view selection input by a user, sending instructions to present the initial view to a user display of a headset, the user display comprised within the headset, receiving an input from a controller comprising at least one sensor indicating user movement within the initial view, and accessing memory to identify an assigned status of a loop.
- the processor instructs the initial view to be presented on a user display comprised within the headset, the at least one controller sends an input to the processor indicating the controller is moving within the initial view, and the processor instructs that the movement of the at least one controller be presented on the user display.
- the processor assigns the loop to be controlled by movement of the controller, the controller sends an input indicating that the controller is moving and interacting with the streaking plate, and the processor generates and stores a pattern of interaction between the loop and the streaking plate. The processor generates the pattern by assigning a linearly decreasing bacterial concentration to the loop responsive to a distance traveled by the loop while interacting with the streaking plate, and generates a series of waypoints, wherein the bacterial concentration of the loop at the location of a waypoint's creation is assigned to that waypoint. The processor instructs the user display to illustrate the waypoints as a line forming the pattern on the streaking plate.
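The depletion-and-waypoint scheme described above can be sketched in Python. This is a minimal illustration only, not the patent's implementation: the sampling of contact points, the initial concentration, and the `depletion_per_unit` rate are assumed values.

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    concentration: float  # bacterial concentration of the loop at creation

def streak(points, initial_concentration=1.0, depletion_per_unit=0.02):
    """Assign a linearly decreasing bacterial concentration to the loop as it
    travels across the streaking plate, recording a waypoint (carrying the
    loop's current concentration) at each sampled contact point."""
    waypoints = []
    concentration = initial_concentration
    prev = None
    for (x, y) in points:
        if prev is not None:
            # Deplete linearly with distance traveled while touching the plate.
            dist = math.hypot(x - prev[0], y - prev[1])
            concentration = max(0.0, concentration - depletion_per_unit * dist)
        waypoints.append(Waypoint(x, y, concentration))
        prev = (x, y)
    return waypoints
```

Rendering the returned waypoints as a connected line then reproduces the streak pattern on the plate, with each segment's eventual bacterial growth scaled by its stored concentration.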
- Yet another aspect of the present disclosure comprises a non-transitory computer readable medium storing instructions executable by an associated processor to perform a method for implementing a visual inspection simulation comprising generating a three-dimensional initial view based upon a view selection input by a user, sending instructions to present the initial view to a user display of a headset, the user display comprised within the headset, receiving an input from a controller comprising at least one sensor indicating user movement within the initial view, and accessing memory to identify an assigned status of a selected container of one or more containers.
- the processor assigns the selected container to be controlled by movement of the controller, continuously generating a fluid flow pattern utilizing a visualization state machine, wherein a single point mass on a mass spring with a lateral damper is virtually attached to a fixed point in a center of the container. Responsive to receiving an input from the controller that the container moved in a defined direction, swinging the single point mass back and forth along the defined direction then settling the single point mass into an original position, and displaying a two-dimensional liquid top surface as following the orientation of the single point mass, wherein the liquid top surface is continually oriented to face a line of sight from the user display.
- Yet another aspect of the present disclosure comprises a virtual reality system for providing a visual inspection simulation, the system comprising a processing device having a processor configured to perform a predefined set of operations in response to receiving a corresponding input from at least one of a virtual reality headset and at least one controller, the processing device comprising memory, wherein a three-dimensional initial view of a visual inspection simulation is stored, the initial view comprising at least one container. The processor instructs the initial view to be presented on a user display comprised within the headset, and the at least one controller sends an input to the processor indicating the controller is moving within the initial view. The processor instructs that the movement of the at least one controller be presented on the user display.
- the processor assigns a selected container of the at least one container to be controlled by movement of the controller, the controller sends an input indicating that the controller is moving the selected container, and the processor generates a continuous fluid flow pattern utilizing a visualization state machine, wherein the visualization state machine comprises a single point mass on a mass spring with a lateral damper virtually attached to a fixed point in a center of the container.
- the controller sends a signal to the processor that the container moved in a defined direction, and the processor swings the single point mass back and forth along the defined direction, then has the single point mass settle into an initial position, wherein the processor instructs the user display to display a two-dimensional liquid top surface as following the orientation of the single point mass, and wherein the processor instructs the user display to display the liquid top surface as continually oriented to face a line of sight.
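The spring-damper visualization state machine in these aspects can be sketched as a small numerical simulation: a container movement gives the virtually attached point mass a lateral impulse, the mass oscillates along that axis, and damping settles it back to the container's center. This is a hedged sketch only; the spring constant, damping coefficient, and semi-implicit Euler integration are assumptions, not details taken from the disclosure.

```python
def simulate_settle(impulse, k=40.0, c=4.0, m=1.0, dt=0.01, steps=600):
    """Damped mass-spring: a single point mass attached to a fixed point at
    the container center. Returns the mass displacement over time along the
    movement axis; the 2D liquid top surface would be tilted to follow this
    displacement (and billboarded toward the user's line of sight)."""
    x, v = 0.0, impulse  # displacement and velocity along the movement axis
    history = []
    for _ in range(steps):
        a = (-k * x - c * v) / m  # spring restoring force + lateral damping
        v += a * dt               # semi-implicit Euler integration step
        x += v * dt
        history.append(x)
    return history
```

With these (underdamped) constants the mass swings back and forth a few times and then settles near zero, which is the "swing then settle into an original position" behavior the claim describes.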
- Yet another aspect of the present disclosure comprises a non-transitory computer readable medium storing instructions executable by an associated processor to perform a method for implementing a microscope simulation comprising generating a three-dimensional initial view based upon a view selection input by a user, sending instructions to present the initial view to a user display of a headset, the user display comprised within the headset, receiving an input from a controller comprising at least one sensor indicating user movement within the initial view, and accessing memory to identify an assigned status of a microscope including lateral wheel, longitudinal wheel, objective wheel and focus wheel inputs. Responsive to an input from the controller, instructing the user display to display one of the lateral wheel, the longitudinal wheel, the objective wheel and the focus wheel.
- Yet another aspect of the present disclosure comprises a virtual reality system for providing a microscope simulation, the system comprising a processing device having a processor configured to perform a predefined set of operations in response to receiving a corresponding input from at least one of a virtual reality headset and at least one controller, the processing device comprising memory, wherein a three-dimensional initial view of a microscope simulation is stored, the initial view comprising a microscope.
- the processor instructs the initial view to be presented on a user display comprised within the headset, the at least one controller sends an input to the processor indicating the controller is moving within the initial view, and the processor instructs that the movement of the at least one controller be presented on the user display.
- the processor instructs the user display to display one of the lateral wheel, the longitudinal wheel, the objective wheel and the focus wheel, wherein, responsive to the processor having assigned no bacteria to a slide present on the microscope, the processor retrieves a blank slide from memory and instructs the user display to display the slide present on the microscope and on a heads-up display displaying a microscope view.
- the processor instructs the user display to display the objective wheel rotating based upon the controller movement and to display the heads-up display displaying the microscope view having an altered magnification.
- the processor instructs the user display to display the focus wheel rotating based upon the controller movement and to display the heads-up display displaying the microscope view having an altered blurriness, wherein the slide is presented as having the altered blurriness.
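The four-wheel interaction model (lateral, longitudinal, objective, focus) can be sketched as a simple state machine that maps controller rotation inputs onto the heads-up microscope view. All concrete numbers here (the objective magnifications, tick scaling, slide-travel clamp) are hypothetical placeholders, not values from the disclosure.

```python
class MicroscopeSim:
    """Toy state for the four microscope wheels. Wheel rotations arrive as
    controller 'ticks'; the heads-up view's magnification, blurriness, and
    visible slide region are derived from the accumulated state."""

    OBJECTIVES = [4, 10, 40, 100]  # hypothetical objective magnifications

    def __init__(self):
        self.objective_index = 0
        self.focus = 0.0  # 0.0 == in focus; |focus| drives blurriness
        self.x = 0.0      # lateral slide offset, clamped to [-1, 1]
        self.y = 0.0      # longitudinal slide offset, clamped to [-1, 1]

    def turn_objective(self, clicks):
        # Objective wheel steps through discrete magnifications.
        self.objective_index = (self.objective_index + clicks) % len(self.OBJECTIVES)

    def turn_focus(self, ticks):
        self.focus += 0.1 * ticks

    def turn_lateral(self, ticks):
        # Clamp at the slide edges (maximum forward/rear positions).
        self.x = max(-1.0, min(1.0, self.x + 0.05 * ticks))

    def turn_longitudinal(self, ticks):
        # Clamp at the slide edges (maximum left/right positions).
        self.y = max(-1.0, min(1.0, self.y + 0.05 * ticks))

    @property
    def magnification(self):
        return self.OBJECTIVES[self.objective_index]

    @property
    def blurriness(self):
        return abs(self.focus)  # distance from the focal plane
```

A renderer would read `magnification`, `blurriness`, and the `(x, y)` offset each frame to redraw the slide in the heads-up microscope view, matching the altered-magnification and altered-blurriness behavior described above.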
- FIG. 3 B illustrates a microscope simulation utilizing a focus interactive functionality generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 D illustrates a microscope simulation utilizing a longitudinal interactive functionality generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 E illustrates a microscope simulation utilizing an objective interactive functionality generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 G illustrates a closeup view of an objective lens element of microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 H illustrates a closeup view of an objective lens element of microscope simulation utilizing an oil dropper interactive functionality generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 H 1 illustrates a view of a microscope simulation utilizing an oil dropper interactive functionality generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 I illustrates a microscope view corresponding to a slide present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 J illustrates a slide present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 K illustrates a microscope view corresponding to a view of a slide, wherein the view of the slide has been altered through use of the focus wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 L illustrates a microscope view corresponding to a view of a slide, wherein the view of the slide has been altered through use of the lateral wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 L 1 illustrates a microscope view corresponding to a view of a slide, wherein the view of the slide is at an initial position on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 L 2 illustrates a microscope view corresponding to a view of a slide, wherein the view of the slide has been altered to a maximum forward position through use of the lateral wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 L 4 illustrates a microscope view corresponding to a view of a slide, wherein the view of the slide has been altered to a maximum rear position through use of the lateral wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 L 5 illustrates a view of a slide, wherein the view of the slide has been altered to a maximum rear position through use of the lateral wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 M illustrates a microscope view corresponding to a view of a slide wherein the view of the slide has been altered through use of the longitudinal wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 M 1 illustrates a microscope view corresponding to a view of a slide wherein the view of the slide has been altered to a maximum left position through use of the longitudinal wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 M 2 illustrates a view of a slide wherein the view of the slide has been altered to a maximum left position through use of the longitudinal wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 M 3 illustrates a microscope view corresponding to a view of a slide wherein the view of the slide has been altered to a maximum right position through use of the longitudinal wheel present on a microscope of a microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 3 O illustrates a bacteria image source constant, according to one example embodiment of the present disclosure
- FIG. 3 P illustrates a slide image source constant, according to one example embodiment of the present disclosure
- FIG. 3 Q illustrates a vignette mask source constant, according to one example embodiment of the present disclosure
- FIG. 3 R illustrates a targeting crosshair image constant, according to one example embodiment of the present disclosure
- FIG. 4 A is a schematic diagram of a method of using a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 4 B is a schematic diagram of a method of using a focus function in a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 4 C is a schematic diagram of a method of using a lateral function in a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 4 D is a schematic diagram of a method of using a longitudinal function in a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 4 E is a schematic diagram of a method of using an objective function in a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 4 F is a schematic diagram of a method of using an oil function in a selected microscope simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 A illustrates a schematic view of a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 B illustrates a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 C illustrates a streaking simulation including a streaking plate, a source plate, a heating element and a loop generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 D illustrates a streaking simulation including a streaking plate volume and streaking plate, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 E illustrates a streaking simulation including a first portion of a streaking plate volume and streaking plate, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 F illustrates a streaking simulation including a view of a lifting arrow generated based upon interaction in a first portion of a streaking plate volume, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 G illustrates a streaking simulation including a magnified view of a lifting arrow generated based upon interaction in a first portion of a streaking plate volume, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 H illustrates a streaking simulation including a second portion of a streaking plate volume and streaking plate, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 I illustrates a streaking simulation including a source volume and source plate including a source lifting arrow generated based upon interaction in the source volume, generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 J illustrates a heating element first phase interaction with a loop in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 K illustrates a loop in a first heating phase in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 L illustrates a heating element second phase interaction with a loop in first heating phase in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 M illustrates a loop in a first cooling phase in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 N illustrates a loop in a second cooling phase in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 O illustrates a loop in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 O 1 illustrates a loop containing a source colony in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 P illustrates a streaking simulation including a streaking plate, a source plate having a cap removed, and a loop generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 P 1 illustrates a streaking simulation including a streaking plate having a streaking cap removed, a source plate, and a loop generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 Q illustrates a streaking simulation including a loop interacting with a streaking plate having a streaking cap removed generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 R illustrates a series of waypoints generated in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 S illustrates a series of patterns of bacterial growth generated on a streaking plate generated in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 T illustrates a second series of patterns of bacterial growth generated on a streaking plate generated in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 5 U illustrates a series of waypoints and corresponding rectangles generated in a streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 A is a schematic diagram of a method 600 of using a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 B is a schematic diagram of a method 600 of using a loop sterilization process in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 C is a schematic diagram of a method 600 of removing a cap from a source plate in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 D is a schematic diagram of a method 600 of using a loop to capture a bacterial colony in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 E is a schematic diagram of a method 600 of using a streaking plate rotation process in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 F is a schematic diagram of a method 600 of using a streaking plate cap removal process and a loop validation process in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 G is a schematic diagram of a method 600 of using a series of inputs to generate a series of waypoints in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 H is a schematic diagram of a method 600 of using a series of inputs to generate a pattern in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 I is a schematic diagram of a method 600 of using a series of inputs to generate a pattern based upon existing pattern interactions in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 J is a schematic diagram of a method 600 of calculating bacterial growth based upon the pattern generated based upon a series of inputs based upon existing pattern interactions in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 6 K is a schematic diagram of a method 699 a of calculating haptic feedback based upon a series of inputs in a selected streaking simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 7 A illustrates a schematic view of inputs used in a visual inspection simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 7 B illustrates a visual inspection simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 7 C illustrates a container generated in a visual inspection simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure
- FIG. 7 E illustrates a container generated in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 I illustrates an assigned orientation of a two-dimensional (2D) surface relative to a user display used in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 J illustrates a container that has been assigned shards generated in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 L illustrates a container having fibers assigned to be second visual indicator fibers illustrated as in front of a first background generated in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 M illustrates a container having particles assigned to be first visual indicator particles illustrated as in front of a second background generated in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 N illustrates a container having particles assigned to be second visual indicator particles illustrated as in front of a first background generated in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 7 R illustrates a series of images of fibers from a sequence of rotation images in a visual inspection simulation generated by an example virtual reality system, according to yet another example embodiment of the present disclosure
- FIG. 8 C is a schematic diagram of a method 800 of using a container rotation process and generating bubbles or particles within a container in a selected visualization simulation generated by an example virtual reality system, according to one example embodiment of the present disclosure.
- FIG. 1 illustrates a schematic diagram of a virtual reality system 100 , in accordance with one of the exemplary embodiments of the disclosure.
- the virtual reality system 100 includes a processing device 110 , a virtual reality headset, “headset 120 ”, and at least one controller 130 , where the processing device 110 is connectable and/or connected to the virtual reality headset 120 and the controller 130 .
- the display 122 may comprise one of a liquid crystal display (LCD), a light-emitting diode (LED) display, or the like.
- the motion sensor 114 may comprise a combination of an accelerometer (e.g. G-sensor), a gyroscope (e.g. gyro-sensor), and/or a sensor that detects the linear and/or rotational movement (e.g. rotational angular velocity or rotational angle) of the headset 120 .
- the motion sensor includes one or more locators 142 that generate a motion sensing grid 144 , wherein motion of the controller 130 and/or the headset 120 is monitored, and identified by the one or more sensors.
- the locator 142 comprises base stations including, for example, a spinning laser sheet. Sensors 114 , 116 on the headset 120 and controllers 130 detect when (e.g., a specific time) the laser sheet passes various points on the headset 120 and/or the controller 130 and transmit those times to the processor 112 . The processor 112 then triangulates the position and orientation of the controller 130 and/or headset 120 from the times at which the various points were detected.
- the sensor 114 of the headset 120 (e.g., an Oculus Rift S or an Oculus Quest headset)
- the locator 142 comprises one or more cameras that detect lights that are projected from the headset 120 and controllers.
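The timing-based position sensing described above can be illustrated with a short sketch. This is a reconstruction under stated assumptions (lighthouse-style base stations, one sync pulse per rotation, two perpendicular sweeps), not the patent's actual implementation; all function and variable names are hypothetical.

```python
import math

def sweep_angle(t_hit, t_sync, rotation_period):
    """Angle of the spinning laser sheet when it struck a sensor,
    measured from the sync pulse (illustrative lighthouse-style timing)."""
    return 2.0 * math.pi * ((t_hit - t_sync) / rotation_period)

def sensor_direction(h_angle, v_angle):
    """Unit ray from the base station toward the sensed point, given
    horizontal and vertical sweep angles (assumed two-axis sweeps)."""
    x = math.tan(h_angle - math.pi / 2)  # mid-sweep looks straight ahead
    y = math.tan(v_angle - math.pi / 2)
    norm = math.sqrt(x * x + y * y + 1.0)
    return (x / norm, y / norm, 1.0 / norm)
```

A full solver would intersect rays from multiple sensed points (or from multiple base stations) with the headset's known sensor geometry to recover full position and orientation.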
- the headset 120 outputs headset motion data to the processing device 110 (e.g., via the locator 142 and/or motion sensors), and the processor 112 of the processing device instructs the headset to display images on the user display 122 that correlate with the headset motion data (e.g., the user turns their head left, and the display alters to show a volume leftward of the user's original gaze).
- such instructions are stored on non-transitory computer-readable media that can be transmitted to the devices of the system 100 to be processed on the respective processor of the respective devices 120 , 130 .
- the controller 130 communicates with the processing device 110 , the locators 142 , and/or the headset 120 via any wireless standard and/or is in wired communication with the processing device 110 .
- the handheld motion sensor includes sensors 116 located on the controller 130 , and/or sensible elements that are sensed by other devices, such as the locator 142 .
- the processor 112 instructs that the icon 302 be shown on the user display 122 (see, for example, FIG. 3 B- 3 E ).
- the icon 302 comprises a hand, and/or hands, which mimic the user's hands in the virtual space or virtual world.
- the sensor 116 of the controller is activated to detect the user's hand motion.
- the sensor 116 may be detected by the locators 142 .
- the user's hand motions, including lateral, longitudinal, rotational, axial, etc., are detected by the sensor 116 .
- the sensor 114 of the headset 120 remains active while the user is in the virtual space.
- the processor 112 designates wheel activation volumes 324 , 326 , 328 , 330 and an oil activation volume 344 (see FIGS. 3 F, 3 A ) in Cartesian coordinate systems corresponding to the focus icon 304 , the lateral icon 306 , the longitudinal icon 308 , the objective icon 310 and/or the oil dropper 334 .
- the wheel activation volumes 324 , 326 , 328 , 330 comprise three dimensional spatial coordinate volumes radiating out along x, y, and z axes (hereinafter “volume” unless otherwise defined) from a central location (coordinates 0,0,0) wherein the respective icon is located, or a center point of the respective icon.
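A minimal check for whether a controller position falls inside such an activation volume might look like the following sketch. The axis-aligned box shape and all names are assumptions for illustration:

```python
def in_activation_volume(point, center, half_extent):
    """True when a controller position lies inside the axis-aligned
    activation volume radiating half_extent units along the x, y, and z
    axes from the respective icon's central location (its (0,0,0) point)."""
    return all(abs(p - c) <= half_extent for p, c in zip(point, center))
```

In the simulation flow described above, a check like this would gate whether the processor generates the corresponding wheel (focus, lateral, longitudinal, objective) or the oil dropper interaction.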
- the processor 112 calculates a slide image and a bacteria image utilizing Equations 3 and 4, below.
- position information e.g., input using the lateral or longitudinal wheels 316 , 318 , discussed below
- responsive to the sensor 116 detecting motion in the lateral wheel volume 326 , the processor generates a lateral wheel 316 and instructs the user display 122 to display the lateral wheel 316 (see FIG. 3 C ).
- the processor 112 instructs the user display 122 to continue to display the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the lateral wheel 316 .
- the processor 112 instructs the user display 122 to remove the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the lateral wheel 316 .
- the processor 112 instructs the user display 122 to revert to displaying the focus icon 304 , the lateral icon 306 , the longitudinal icon 308 , the objective icon 310 and/or the oil dropper 334 .
- responsive to the movement of the microscope view 330 being outside a single quadrant 342 a , 342 b , 342 c , 342 d of the slide 342 , the processor 112 does not generate or instruct the user display 122 to display the sample identification label.
- the slide_image_z and the bacteria_image_z are calculated utilizing a function zoom, which scales the image based on the current objective level about the current stage_y and stage_x locations
- Stage_y is the same variable as described above with regard to Equations 6 and 7.
- Stage_x remains constant, and is dependent upon movement of the longitudinal wheel 318 .
- the objective_magnification corresponds to the current objective set on the microscope, and is 10.0 for 10×, 40.0 for 40×, 75.0 for 75×, and 100.0 for 100×.
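The zoom function referenced above scales the image about the current stage location by the objective level; since the patent's Equations 8 and 9 are not reproduced in this excerpt, the following is only a plausible sketch. The 10× base magnification and the linear scaling are assumptions:

```python
def zoom(u, v, stage_x, stage_y, objective_magnification, base_magnification=10.0):
    """Scale image coordinates about (stage_x, stage_y) by the ratio of the
    selected objective to an assumed base objective (the text lists
    objective levels of 10.0, 40.0, 75.0, and 100.0)."""
    scale = objective_magnification / base_magnification
    return (stage_x + (u - stage_x) * scale,
            stage_y + (v - stage_y) * scale)
```

Points at the current stage location are fixed under the zoom, which matches the described behavior of magnifying "about the current stage_y and stage_x locations."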
- the sensor 116 detects motion in a longitudinal wheel volume 328 defined by the longitudinal icon 308 ( FIGS. 3 A, 3 F ).
- the sensor 116 detects motion in the focus wheel volume 324 and/or the lateral wheel volume 326 prior to the longitudinal wheel volume 328 , and the processor stores and instructs the user display 122 to present an altered microscope view based upon method 400 steps 440 - 445 and/or 460 - 476 (e.g., any focus or lateral inputs by the user are reflected in the microscope view).
- the processor 112 instructs the user display 122 to display the initial microscope view 330 .
- the sensor 116 detects motion in a longitudinal wheel volume 328 defined by the longitudinal icon 308 ( FIGS. 3 A, 3 D ).
- responsive to the sensor 116 detecting motion in the longitudinal wheel volume 328 , the processor generates a longitudinal wheel 318 and instructs the user display 122 to display the longitudinal wheel 318 (see FIG. 3 D ).
- the virtual reality system 100 determines if the user is interacting with the tactile element 118 while the sensor 116 detects motion within the longitudinal wheel coordinate system.
- the processor 112 instructs the user display 122 to maintain presentation of the longitudinal wheel 318 .
- the processor 112 instructs the user display 122 to continue to display the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the longitudinal wheel 318 .
- the processor 112 instructs the user display 122 to remove the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the longitudinal wheel 318 .
- the processor 112 instructs the user display 122 to revert to displaying the focus icon 304 , the lateral icon 306 , the longitudinal icon 308 , the objective icon 310 and/or the oil dropper 334 (see FIG. 3 A ).
- the processor generates and instructs the user display 122 to continue to display the initial microscope view 330 (see FIG. 3 L 1 ) and/or an altered microscope view (see FIGS. 3 M 1 - 3 M 4 ) based upon user interaction with other icons (see FIG. 3 A , and FIGS. 3 K- 3 N ).
- the sensor 116 detects a velocity/degree of rotation of a motion within the longitudinal wheel 318 coordinate system.
- the processor 112 instructs the user display 122 to show the longitudinal wheel 318 rotating and generates and instructs the user display 122 to show a microscope view 330 based upon the velocity/degree of rotation of the motion.
- the processor 112 instructs the user display 122 to show the longitudinal wheel 318 remaining stationary and instructs the user display 122 to continue to show the microscope view 330 displayed at step 480 .
- the processor 112 also instructs the user display 122 to show a microscope stage 343 and slide 342 moving left or right (see FIGS.
- the processor 112 calculates a longitudinal movement of the microscope view based upon the rotation of the longitudinal wheel 318 and instructs the user display 122 to display images of the slide 342 in the microscope view moving a calculated longitudinal distance through the following equations.
- the processor 112 calculates a longitudinal movement of the microscope view (e.g., translated images slide_image_t and bacteria_image_t) utilizing Equations 6 and 7 above.
- stage_x is a variable that depends on the velocity of the motion of the longitudinal wheel 318 .
- Stage_y remains constant, and is dependent upon movement of the lateral wheel 316 .
- Stage_x is the same variable as described above with regard to Equations 6 and 7.
- Stage_y remains constant, and is dependent upon movement of the lateral wheel 316 .
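The dependence of stage_x on the sensed velocity of the longitudinal wheel can be sketched as a clamped integration. Equations 6 and 7 themselves are not reproduced in this excerpt, so the gain and the travel limits (the "maximum distance" views described below) are assumptions:

```python
def update_stage_x(stage_x, wheel_velocity, dt, gain=1.0,
                   stage_min=-1.0, stage_max=1.0):
    """Advance the longitudinal stage position from the sensed rotation
    velocity of the longitudinal wheel 318, clamped to the slide's
    assumed travel limits. stage_y would be held constant here, as it
    depends only on the lateral wheel 316."""
    stage_x += gain * wheel_velocity * dt
    return max(stage_min, min(stage_max, stage_x))
```

The clamping reproduces the behavior where the slide stops once it "has been moved along direction B a maximum distance," regardless of further wheel rotation.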
- the processor 112 calculates the slide_image_b and bacteria_image_b using Equations 3 and 4 above, then calculates the Microscope View 330 utilizing Equations 5, 10, and 11 above, and instructs the user display 122 to display the microscope view 330 that corresponds to the longitudinal movement instigated by the motion of the user and the longitudinal wheel 318 .
- the longitudinal wheel 318 has been utilized by the user to generate an altered microscope view 330 wherein the slide 342 has been moved along direction B, such that only side portions of quadrants 342 a , 342 c are visible.
- the longitudinal wheel 318 has been utilized by the user to generate an altered microscope view 330 wherein the slide 342 has been moved along direction D a maximum distance.
- the longitudinal wheel 318 has been utilized by the user to generate an altered microscope view 330 wherein the slide 342 has been moved along direction B a maximum distance.
- the method 400 returns to 478 , wherein the user may proceed with adjusting the longitudinal wheel 318 again or by proceeding with steps 479 , 430 , 432 , and 434 the user may proceed to the focus wheel 314 (steps 436 - 454 ) or the lateral wheel 316 (steps 460 - 476 ), or may proceed to objective wheel 320 or oil dropper 334 .
- the sensor 116 detects motion in an objective wheel volume 332 defined by the objective icon 310 (see FIGS. 3 A, 3 F ).
- responsive to the sensor 116 detecting motion in the objective wheel volume 332 , the processor generates an objective wheel 320 and instructs the user display 122 to display the objective wheel 320 (see FIG. 3 E ).
- the virtual reality system 100 determines if the user is interacting with the tactile element 118 while the sensor 116 detects motion within the objective wheel coordinate system.
- the processor instructs the user display 122 to maintain presentation of the objective wheel 320 .
- the processor 112 instructs the user display 122 to continue to display the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the objective wheel 320 .
- the processor 112 instructs the user display 122 to remove the icons 304 , 306 , 308 , 310 , and/or the oil dropper 334 when displaying the objective wheel 320 .
- the processor 112 instructs the user display 122 to revert to displaying the focus icon 304 , the lateral icon 306 , the longitudinal icon 308 , the objective icon 310 , and/or the oil dropper 334 .
- the processor generates and instructs the user display 122 to continue to display the initial microscope view 330 and/or an altered microscope view based upon user interaction with other icons (see FIG. 3 A , and FIGS. 3 K- 3 N ).
- the sensor 116 detects a motion within the objective wheel coordinate system.
- the processor 112 instructs the user display 122 to show the objective wheel 320 and lenses 330 A- 330 C (see FIGS. 3 G- 3 H ) rotating, and the processor generates and instructs the user display 122 to show a microscope view 330 based upon the degree of the rotation motion.
- a first lens 330 A corresponds to a first objective level (e.g., 10×), a second lens 330 B corresponds to a second objective level (e.g., 40×), and a third lens 330 C corresponds to a third objective level (e.g., 100×). It would be appreciated by one having ordinary skill in the art that more or fewer lenses with differing objective levels (e.g., 55×, 75×, etc.) may be present.
- the processor 112 instructs the user display 122 to show the objective wheel 320 and lenses 330 A- 330 C (see FIGS.
- the processor 112 determines the selected objective level based upon the degree of motion and instructs the user display 122 to display the corresponding lens 330 A- 330 C over the slide 342 .
- the processor 112 accesses memory to determine if oil is present on the slide 342 . Responsive to oil being present, the processor 112 sets variable no_oil_multiplier to 1.0 and the processor 112 proceeds to step 497 illustrated at section line G-G on FIG. 4 E . At 499 B, responsive to oil not being present, the processor 112 sets variable no_oil_multiplier to 0.9 and generates and instructs the user display 122 to display an oil warning 333 (see FIG. 3 H 1 ).
- the oil warning 333 is presented to the user on the user display 122 , based upon the processor's 112 instruction, as the user is attempting to switch the objective level to 100× (e.g., before the microscope objective is altered to 100×).
- the oil warning 333 comprises text stating oil is needed.
- the oil warning 333 includes bringing attention to an oil dropper 334 (see FIGS. 3 A, 3 H ), such as by providing attention attracting lights, movement, sound, etc. to the oil dropper.
- the processor 112 instructs that oil be permitted to be applied to the slide 342 .
- No_oil_multiplier is set to 1.0 upon switching from 100×.
- responsive to the user changing the objective level from 100×, the processor 112 generates instructions (e.g., text, visual demonstration, etc.) to apply oil to the slide 342 .
- responsive to the user not changing the objective level from 100×, the processor 112 generates instructions (e.g., text, visual demonstration, etc.) to alter the objective level.
- the processor 112 prohibits the addition of oil if the objective is 100×, as there would be no room to add oil when the 100× objective is in place over the slide 342 .
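The no_oil_multiplier branch described in the steps above (1.0 with oil present, 0.9 without, plus the oil warning 333 when switching to 100× without oil) can be summarized in a short sketch; the function name and return structure are assumptions:

```python
def select_objective(objective, oil_present):
    """Mirror the described branch: attempting 100x without oil sets
    no_oil_multiplier to 0.9 and raises the oil warning 333; otherwise
    the multiplier is (or reverts to) 1.0 and no warning is shown."""
    if objective == 100.0 and not oil_present:
        return {"no_oil_multiplier": 0.9, "show_oil_warning": True}
    return {"no_oil_multiplier": 1.0, "show_oil_warning": False}
```

The 0.9 multiplier feeds the blurriness calculations (Equations 1 and 2), so a dry 100× view renders visibly degraded rather than simply being forbidden.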
- the processor 112 instructs the user display 122 to display the icon retaining the oil dropper 334 .
- the oil dropper activation volume 344 is the same or similar to the wheel activation volumes, wherein the dropper bottle comprises the coordinate points (0,0,0).
- the oil dropper 334 being retained by the icon 302 comprises the oil dropper moving with the icon within the user display 122 .
- the processor 112 instructs the user display 122 to display the oil dropper 334 retaining the oil (e.g., the oil dropper will not drip oil outside of the dropper release volume 346 ).
- the dropper release volume 346 is any area directly over the slide (e.g., any plane that resides over the boundaries of the slide 342 ).
- the processor 112 instructs the user display 122 to display the oil dropper 334 dripping oil onto the slide 342 , and indicating the oil is present.
- the processor 112 calculates a slide blurriness and a bacteria blurriness utilizing Equations 1 and 2 above.
- the slide blurriness and the bacteria blurriness are calculated utilizing the variable no_oil_multiplier, wherein the multiplier equals 1 when oil is added and present in step 499 c , and no oil is added wherein at 495 the selected objective level was below 100×.
- the processor 112 calculates the slide image and the bacteria image utilizing Equations 8 and 9, then 3 and 4, above.
- the processor 112 calculates and generates an altered microscope view based upon Equations 5, 10, and 11 above, by multiplying three (3) images together, and then setting the alpha channel of the result to the vignette mask, thereby generating the microscope view 330 having the indicated objective level.
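The compositing step, multiplying three images and setting the result's alpha to the vignette mask, can be sketched as follows. Equations 5, 10, and 11 are not reproduced in this excerpt, so the third image standing in for lighting and the nested-list image representation are assumptions:

```python
def composite_microscope_view(slide_img, bacteria_img, light_img, vignette_mask):
    """Multiply three images channel-wise and take the alpha channel from
    the vignette mask, as described for the microscope view 330.
    Images are nested [row][col][r, g, b] structures with 0..1 channels;
    the mask is [row][col] scalars."""
    out = []
    for row_a, row_b, row_c, row_m in zip(slide_img, bacteria_img,
                                          light_img, vignette_mask):
        out_row = []
        for a, b, c, m in zip(row_a, row_b, row_c, row_m):
            rgb = [a[i] * b[i] * c[i] for i in range(3)]
            out_row.append(rgb + [m])  # alpha from the vignette mask
        out.append(out_row)
    return out
```

The vignette alpha is what produces the circular eyepiece boundary of the rendered view; everything outside the mask is fully transparent.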
- the objective wheel 320 has been utilized by the user to generate an altered microscope view 330 wherein the slide 342 has been magnified to 100×, such that only a single quadrant 342 a - 342 d is visible.
- the method 400 returns to 489 , wherein the user may proceed with adjusting the objective wheel 320 again or by proceeding with steps 490 , 430 , 432 , and 434 the user may proceed to the focus wheel 314 (steps 436 - 454 ), the lateral wheel 316 (steps 460 - 476 ), or the longitudinal wheel 318 (steps 478 - 488 ), or the oil dropper 334 .
- the interactive microscope simulation 300 allows users to train on a microscope without damaging an expensive microscope, and without having to provide the user access to a microscope, when microscopes may be in short supply. Further, the user is presented with the fundamental aspects of the microscope, and the physics of said microscope which are ideal for learning how to operate said microscope in the real world. Additionally, the wheel activation volumes provide the ability for the user to use large motions with the controller 130 to adjust the microscope rather than through the adjustment of small knobs, which could require fine motor skills and would be an impediment to learning.
- a plate view 530 includes an enlarged streaking plate 520 having one or more defined areas.
- the enlarged streaking plate 520 defines four areas/quadrants 520 A- 520 D, wherein the initial plate view 530 illustrates a first quadrant 520 A at the top portion farthest from the user, a second quadrant 520 B at the right portion relative to the user, a third quadrant 520 C at the bottom portion nearest to the user, and/or a fourth quadrant 520 D at the left portion relative to the user.
- the initial plate view 530 comprises the view prior to user input, and subsequent altered initial plate views comprise the views including the user inputs.
- the processor 112 generates and instructs the user display 122 to display the source plate 504 , the streaking plate 520 , the loop 506 , and/or the heating element 508 having the first indicator (see, for example, FIGS. 5 A- 5 C ). Steps 604 - 610 may be performed in any order, and/or may be performed simultaneously.
- the processor 112 designates a first portion 532 a or a second portion 532 b of a streaking plate volume 532 (see, for example, FIGS. 5 D- 5 H ), a source plate volume 534 (see, for example, FIG. 5 I ), a loop activation volume 538 (see, for example, FIG. 5 C ), and/or a heating activation volume 540 (see, for example, FIG. 5 C ), in Cartesian coordinate systems corresponding to the source plate 504 , the streaking plate 520 , the loop 506 , and/or the heating element 508 , respectively.
- the processor 112 , receiving the signal that the controller 130 is within the first portion 532 a , instructs the user display 122 to display a streaking lifting arrow 528 .
- the processor 112 , receiving the signal that the controller 130 is within the second portion 532 b , instructs the user display 122 to display streaking rotation arrows 526 .
- the processor 112 , receiving the signal that the controller 130 is within a source plate volume 534 , instructs the user display 122 to display a source lifting arrow 536 .
- the processor 112 instructs the user display 122 to display, respectively, one of the streaking lifting arrow 528 , the streaking rotation arrows 526 , and/or the source lifting arrow 536 responsive to the sensor 116 sending a signal to the processor that the controller 130 is within one of the first portion 532 a , the second portion 532 b , or the source plate volume 534 .
- the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 comprise three dimensional volumes radiating out along x, y, and z axes from a central location (coordinates 0,0,0) wherein the respective icon (e.g., streaking plate 520 , source plate 504 , loop 506 , heating element 508 ) is located, or a center point of the respective icon.
- the respective icon e.g., streaking plate 520 , source plate 504 , loop 506 , heating element 508
- the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 extend between 1 inch to about 7 inches along the x axis, between 1 inch to about 7 inches along the y axis, and/or between 1 inch to about 7 inches along the z axis, wherein the volume defined within comprises the respective activation volumes. Inches in virtual space are based upon perceived distance by the user. At 612 the sensor 116 detects motion.
- the sensor 116 detects motion in the loop activation volume 538 defined by the loop 506 ( FIG. 5 C ).
- the processor 112 instructs the user display 122 to display a user icon 502 (e.g., same or similar to the user icon 302 described above) holding the loop 506 .
- the user icon 502 retains the loop 506 until the processor 112 receives a signal that the user, holding the loop, is interacting with the tactile element 118 while the sensor 116 detects motion within the loop activation volume 538 , wherein the processor instructs the user display 122 to show the user relinquishing the loop and the loop returns to its initial position (see FIG. 5 B ).
- the processor 112 disallows the user icon 502 , while holding the loop 506 or another element, from picking up any additional items until the respective item is put down.
- the processor instructs the user display 122 to maintain the initial streaking view 530 .
- the sensor 116 may detect motion in any of the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 .
- the sensor 116 detects motion in the heating activation volume 540 defined by the heating element 508 ( FIG. 5 C ).
- responsive to the virtual reality system 100 having stored that the user icon 502 is holding the loop 506 , the processor 112 generates and instructs the user display 122 to display the loop 506 interacting with the heating element 508 (see FIG. 5 J ).
- As continued in example method 600 in FIG. 6 B is loop sterilization process 600 a .
- the loop sterilization process 600 a is continued from section line A-A of FIG.
- the processor generates a first phase 507 a of a loop heating indication (e.g., color or texture change, and/or text notice of heating) responsive to the sensor 116 detecting the loop 506 in the heating volume 540 for a first heating duration (see FIG. 5 K ).
- the first heating duration is between 1 second and 4 seconds.
- the first heating duration is 2 seconds.
- the first phase 507 a corresponds to a loop temperature, wherein a number from zero to one is assigned to the loop. Zero represents room temperature, while one corresponds to when the loop is in the first phase 507 a .
- the processor 112 instructs the user display 122 to revert to displaying an initial indication phase of the heating element 508 (see FIG. 5 C ) and the processor 112 assigns the loop 506 a null bacteria concentration (e.g., a concentration of 0).
- the processor 112 instructs the user display 122 to display a first phase cooling indication change 507 b (see FIG. 5 M ) (e.g., color or texture change, and/or text notice of cooling) after a first cooling duration.
- the first cooling duration is between 1 second and 2 seconds. In another example embodiment, the first cooling duration is 1.5 seconds.
- the loop 506 changes color from a hot color, such as a bright white or yellow, to a relatively less bright yellow or orange.
- the processor 112 instructs the user display 122 to display a second phase cooling indication change 507 c (see FIG. 5 N ) (e.g., color or texture change, and/or text notice of cooling) after a second cooling duration.
- the second cooling duration is between 1 second and 4 seconds.
- the second cooling duration is 3.5 seconds.
- the loop 506 changes color from a warm color, such as the relatively less bright yellow or orange, to an even less bright red or orange.
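The heating and cooling phases above (an example first heating duration of 2 seconds, first cooling duration of 1.5 seconds, and second cooling duration of 3.5 seconds, with the loop temperature normalized from 0 at room temperature to 1 in the first phase 507 a) can be sketched as a small state function. The intermediate temperature values and the linear warm-up ramp are assumptions:

```python
def loop_temperature(heating_time, cooling_time,
                     first_heat=2.0, first_cool=1.5, second_cool=3.5):
    """Return (normalized temperature, indication phase) for the loop,
    using the example durations from the text. 0 = room temperature,
    1 = first phase 507a."""
    if heating_time < first_heat:
        return heating_time / first_heat, "initial"
    if cooling_time < first_cool:
        return 1.0, "507a"    # glowing hot: bright white/yellow
    if cooling_time < first_cool + second_cool:
        return 0.5, "507b"    # first cooling change: dimmer yellow/orange
    return 0.0, "507c"        # second cooling change: dull red/orange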
- the processor 112 instructs the user display 122 to display the source lifting arrow 536 (see FIG. 5 I ). Absent the virtual reality system 100 receiving a signal that the user has activated the tactile element 118 while the sensor 116 detects a continued presence of the controller 130 within the source volume 534 , the processor 112 instructs the user display 122 to continue to display the source lifting arrow 536 .
- the processor 112 instructs the user display 122 to display the user icon 502 holding the cap 510 (see FIG. 5 P ).
- the processor 112 instructs the user display 122 to display the user icon 502 holding the cap 510 .
- the cap lifting duration is between 1 second and 5 seconds.
- responsive to the virtual reality system 100 receiving a signal that the user has continued to actuate the tactile element 118 while the sensor 116 detects a motion of the controller 130 away from the source volume 534 over a velocity threshold, the processor 112 instructs the user display 122 to display the user icon 502 holding the cap 510 .
- the velocity threshold is between 0.1 foot per second and about 1.5 feet per second. It would be understood by one of ordinary skill in the art that any combination of the signals described above could be utilized by the processor 112 to generate instructions to remove the cap 510 .
- the processor 112 instructs the user display 122 to display the cap 510 as minimized.
- the processor 112 instructs the user display 122 to display a contamination warning (e.g., text, visual, and/or audio).
- the contamination threshold angle 510 a is between 0 degrees to 89 degrees from the x-axis of FIG. 5 P .
- bacteria in the air may settle under gravity onto the inside of said cap 510 , which will cause contamination within the source plate 504 .
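The contamination check can be sketched as a simple angle comparison; picking the top of the stated 0 to 89 degree range as the default threshold, and the function name, are assumptions:

```python
def cap_contaminates(cap_angle_deg, threshold_deg=89.0):
    """True when the removed cap 510 is tilted past the contamination
    threshold angle from the x-axis (the text gives a range of 0 to 89
    degrees), so airborne bacteria could settle on its inner surface."""
    return abs(cap_angle_deg) > threshold_deg
```

In the flow above, a True result would trigger the contamination warning (text, visual, and/or audio) on the user display 122.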
- responsive to the processor 112 having the stored memory of the user icon 502 holding the loop 506 , the stored memory that the loop 506 has interacted with the heating element 508 over the heating element threshold, and the stored memory that the loop 506 has cooled from interaction with the heating element 508 over the cool threshold, the processor 112 enables access to a top surface 505 supporting one or more bacteria colonies 504 a of the source plate 504 (see FIG. 5 C ).
- the processor 112 receives information from two tactile elements 118 and at least two sensors 116 , wherein the processor identifies a location of the first controller 130 , and designates a first user icon 502 a as corresponding to the first controller 130 , and a second user icon 502 b as corresponding to the second controller 130 , wherein the first user icon 502 a may be holding the loop 506 and the second user icon 502 b may interact with the cap 510 , the streaking cap 524 , the source lifting arrow 536 , the streaking lifting arrow 528 , and/or the streaking rotation arrows 526 (see FIGS. 5 P - 5 P 1 ).
- the processor 112 instructs the user display 122 to display the user icon 502 holding the cap 510 (see FIG. 5 P ).
- the processor 112 instruct the user display 122 to display the user icon 502 recapping the source plate 504 with the cap 510 (see FIG. 5 C ).
- the sensor 116 detects motion in the second portion 532 b of streaking plate volume 532 defined by the streaking plate 520 ( FIG. 5 H ).
- illustrated in FIG. 6 E , continuing at section line F-F, is a streaking plate rotation process 600 b .
- the streaking plate rotation process 600 b begins at 663 , as illustrated.
- the processor 112 instructs the user display 122 to continue to display the streaking plate 520 (see FIG. 5 C ).
- the sensor 116 may detect motion in any of the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 .
- the processor 112 instructs the user display 122 to display the streaking rotation arrows 526 (see FIG. 5 H ).
- the processor 112 instructs the user display 122 to display the streaking plate 520 rotating based upon a sensed degree of motion (see FIG. 5 H ).
- the sensed degree of motion is determined based upon a rotational movement of the controller 130 , wherein a 30-degree rotational movement by the user causes the processor 112 to instruct the user display 122 to display the streaking plate 520 rotating 30 degrees (e.g., a 1:1 relationship, although 1:2, 1:3, etc. relationships are contemplated). It would be understood by one having ordinary skill in the art that a combination of sensed speed and sensed degree of rotation may be used to determine a degree of rotation of the streaking plate 520 .
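The controller-to-plate rotation mapping described above can be sketched as follows. This is a minimal Python sketch, not the patent's implementation; the function name and the `ROTATION_RATIO` constant are assumptions (1.0 gives the 1:1 relationship described, 0.5 would give a 1:2 relationship, and so on):

```python
ROTATION_RATIO = 1.0  # assumed constant: 1.0 -> 1:1 mapping, 0.5 -> 1:2, etc.

def plate_rotation_degrees(controller_rotation_degrees: float,
                           ratio: float = ROTATION_RATIO) -> float:
    """Map a sensed controller rotation to the displayed plate rotation."""
    return controller_rotation_degrees * ratio
```

With the default 1:1 ratio, a sensed 30-degree controller rotation yields a displayed 30-degree plate rotation, as in the example above.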
- the processor 112 instructs the user display 122 to maintain the initial view of the streaking plate 520 (see FIG. 5 H ).
- the sensor 116 detects motion in the first portion 532 a of streaking plate volume 532 defined by the streaking plate 520 ( FIG. 5 E ).
- illustrated in FIG. 6 F is a streaking plate cap removal process 600 c .
- the streaking plate cap removal process 600 c begins at 668 , as illustrated.
- the processor 112 instructs the user display 122 to continue to display the streaking plate 520 (see FIG. 5 C ).
- the processor 112 instructs the user display 122 to display the initial streaking view 530 , and the sensor 116 may detect motion in any of the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 .
- the processor 112 instructs the user display 122 to display the streaking lifting arrow 528 (see FIGS. 5 F- 5 G ).
- the processor 112 instructs the user display 122 to continue displaying the streaking lifting arrow 528 (see FIGS. 5 F- 5 G ).
- the processor 112 instructs the user display 122 to display the user icon 502 b holding the streaking cap 524 (see FIG. 5 P 1 ).
- the processor 112 instructs the user display 122 to cease the display of the streaking lifting arrow 528 .
- the first user icon 502 a is holding the loop 506 while the second user icon 502 b is interacting with the streaking cap 524 .
- the second user icon 502 b may hold the loop 504 , and/or the first user icon 502 a could be interacting with the streaking cap 524 , and further that one user icon need not be holding the loop for the streaking cap 524 to be removable.
- the processor 112 instructs the user display 122 to display the contamination warning (e.g., text, visual, and/or audio) (see FIG. 5 P 1 ).
- the processor 112 instructs the user display 122 to display the streaking cap 524 as minimized.
- the loop 504 remains stationary, and no instruction is generated.
- responsive to the processor 112 lacking a stored memory that the loop 504 has interacted with the heating element 508 over the heating element threshold (e.g., the loop has been assigned a bacteria count of 0), the processor 112 generates an instruction to heat the loop (e.g., text, visual, and/or audio). In one example embodiment, no instruction is generated.
- responsive to the processor 112 lacking a stored memory that the loop 504 has cooled from interaction with the heating element 508 over the cool threshold, the processor 112 generates an instruction to cool the loop (e.g., text, visual, and/or audio). In one example embodiment, no instruction is generated.
- the method 600 continues to 617 by section line E-E in FIG. 6 A , wherein the processor instructs the user display 122 to maintain the initial streaking view 530 (e.g., wherein the initial streaking view 530 includes past user inputs).
- the sensor 116 may detect motion in any of the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 .
- responsive to the processor 112 having the stored memory of the user icon 502 holding the streaking cap 524 , the stored memory of the user icon 502 holding the loop 504 , the stored memory that the loop 504 has interacted with the heating element 508 over the heating element threshold, and the stored memory that the loop 504 has cooled from interaction with the heating element 508 over the cool threshold, and responsive to the sensor 116 detecting motion within a surface volume 522 of the streaking plate 520 , the processor 112 generates and instructs the user display 122 to display a line pattern 554 of interaction of the loop 504 with a surface 523 of the streaking plate 520 using a series of waypoints 550 (see FIGS. 5 R, 5 Q ).
- the processor 112 allows user interaction with the streaking plate 520 regardless of the presence of stored memory that the loop 504 has interacted with the heating element 508 over the heating element threshold, and/or the stored memory that the loop 504 has cooled from interaction with the heating element 508 over the cool threshold. Further, the processor 112 enables access to the top surface 523 of the streaking plate 520 that will support bacterial growth of the streaking plate 520 (see FIG. 5 P 1 ).
- the processor 112 continues the method 600 to 617 by section line E-E in FIG. 6 A , wherein the processor instructs the user display 122 to maintain the initial streaking view 530 .
- the sensor 116 may detect motion in any of the volumes 532 A, 532 B, 534 , 538 , 540 , and/or 542 .
- the processor 112 stores the pattern of interaction with the surface 523 in memory, and displays the pattern as it is being generated by the user input.
- the processor 112 stores a pattern of interaction with the surface 523 in memory for a colony growth process.
- the processor generates waypoints 550 at a predetermined generation distance 556 , wherein a series of waypoints 552 are connected to form a first pattern 554 (see FIGS. 5 Q- 5 S ).
- the processor 112 assigns a bacterial level/concentration to each waypoint 550 .
- the path of the loop 506 on the surface 523 of the streaking plate 520 is represented as the series of waypoints 552 connected to form a line.
- an additional waypoint 550 a is generated by the processor 112 responsive to the loop 504 moving a distance greater than the predetermined generation distance 556 (e.g., 5 millimeters) from a previous waypoint 550 b . If a waypoint 550 is a first waypoint of a pattern, the bacterial level or concentration on the loop 504 remains at the initial loop bacteria level value (e.g., 200). The sequential waypoints are connected together to form lines that form the first pattern 554 .
- the initial concentration is 200 (200 is also the maximum concentration), wherein the concentration value of a second waypoint would equal 200 times the decay constant of Equation 12 (e.g., 0.9975). This progression continues while additional waypoints 550 are being generated.
- that waypoint is assigned a number of bacteria equal to the loop's 504 bacteria level (e.g., the loop's bacteria level is 200 after interacting with a bacteria colony 504 a of the source plate 504 ).
- the processor 112 decreases the assigned bacteria concentration on the loop 504 as the pattern 554 is formed.
- while in the real world bacteria would strictly transfer from the loop 504 , modeling the distribution of bacteria as described above advantageously provides a more illustrative simulation of the real-world results of good and poor technique.
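The waypoint generation and per-waypoint decay described above can be sketched as follows. This is a minimal Python sketch, not the patent's implementation; the function and variable names are assumptions, while the 5 millimeter generation distance 556, the initial loop level of 200, and the 0.9975 decay constant follow the text and Equation 12:

```python
import math

GENERATION_DISTANCE = 0.005    # 5 mm, the predetermined generation distance 556
INITIAL_CONCENTRATION = 200.0  # initial (and maximum) loop bacteria level
DECAY = 0.9975                 # per-waypoint decay constant (Equation 12)

def maybe_add_waypoint(waypoints, loop_position):
    """Append a (position, concentration) waypoint once the loop has moved
    past the generation distance from the previous waypoint; the new
    waypoint's concentration decays from the previous waypoint's value."""
    if not waypoints:
        # first waypoint of a pattern keeps the initial loop bacteria level
        waypoints.append((loop_position, INITIAL_CONCENTRATION))
        return
    last_pos, last_conc = waypoints[-1]
    if math.dist(loop_position, last_pos) >= GENERATION_DISTANCE:
        waypoints.append((loop_position, last_conc * DECAY))
```

Connecting the sequential waypoints then yields the line pattern 554, with each segment carrying a slightly lower assigned concentration than the last.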
- the processor 112 terminates an initial line 554 a of the first pattern 554 and begins a new line 554 b of the first pattern 554 (see FIG. 5 S ) (note that FIG. 5 S illustrates bacterial growth; however, the line 554 shape and terminations are the same whether bacteria are present or not).
- the contact threshold is 0.1 seconds. In another example embodiment, the contact threshold is between 0.005 and 1.0 seconds.
- the processor 112 continues to form the line of first pattern 554 (see FIG. 5 T ).
- as illustrated in FIG. 6 H , continuing from section line J-J of FIG. 6 G or from 679 a , at 680 , responsive to the processor 112 receiving a signal from the controller 130 that the user is continuing interaction with the tactile element 118 , or the sensor 116 detecting motion or presence within the first portion 532 a volume of the streaking plate 520 , the processor 112 instructs the user display 122 to continue to display the streaking cap 524 being held by the user icon 502 .
- responsive to the processor 112 receiving a signal from the controller 130 that the user ceased interaction with the tactile element 118 while the sensor 116 detects motion or the user presence within the first portion 532 a volume of the streaking plate 520 , the processor 112 instructs the user display 122 to display the streaking cap 524 recapping the streaking plate 520 .
- the processor 112 receives inputs from the loop sterilization process 600 a illustrated in FIG. 6 B .
- the processor 112 receives optional inputs from the streaking plate rotation process 600 b illustrated in FIG. 6 E .
- the processor 112 receives inputs from the streaking plate cap removal process 600 c illustrated in FIG. 6 F .
- the user icon 502 continues to hold the streaking cap 524 .
- the processor 112 receives inputs from the loop validation process 600 d illustrated in FIG. 6 F .
- the processor 112 receives inputs from the sensor 116 generated at 678 .
- the processor 112 stores the pattern of interaction with the surface 523 in memory for a colony growth process.
- the processor 112 receives input from the sensor 116 that the loop 504 is interacting with one or more lines of the first pattern 554 and/or intermediate patterns 558 , 562 .
- the bacteria concentration assigned to the loop 504 will decrease as a line of the first pattern 554 is drawn.
- the processor 112 generates waypoints 550 at the predetermined distance, wherein the series of waypoints of the intermediate/second and/or third patterns 558 , 562 , and/or the final pattern 566 are connected to form a line.
- the processor 112 assigns a bacteria level to each waypoint based upon the interaction of the loop 504 with the one or more waypoints 550 of first, and/or intermediate patterns 554 , 558 , 562 .
- the bacteria concentration assigned to the loop will be increased by an amount proportional to the difference between the assigned loop concentration at the most recently formed waypoint 550 and the waypoint of the existing pattern 558 , 562 , 566 nearest the overlap 556 , 560 , 564 of the existing pattern and the pattern being formed (see FIG. 5 T ).
- the processor 112 updates the assigned bacteria level of progressive waypoints 550 based on a number of overlaps of waypoints of first, and/or intermediate patterns 554 , 558 , 562 based upon Equations 13, 14, and/or 15, below.
- the bacterial concentration assigned to the loop 504 is based upon a pattern overlap, wherein the waypoint 550 in the existing pattern that is overlapped has an assigned bacterial concentration greater than the bacterial concentration assigned to the loop at the location of the overlap.
- the assigned bacteria concentration based upon said overlaps is calculated by Equations 13, 14, and 15 below.
- Concentration_0 is equal to the concentration or bacteria level assigned to the loop 504 at the location of the overlap, as calculated by Equation 12 above.
- Alpha is a constant greater than zero used in the fuzzy math formula presented in Equation 14.
- steps 686 and 687 are repeated to generate the pattern (e.g., with or without broken lines).
- responsive to the processor 112 receiving the signal from the controller 130 that the user is continuing interaction with the tactile element 118 , or the sensor 116 detecting motion or the user presence within the first portion 532 a volume of the streaking plate 520 , the processor 112 instructs the user display 122 to continue to display the streaking cap 524 being held by the user icon 502 .
- responsive to the processor 112 receiving a signal from the controller 130 that the user ceased interaction with the tactile element 118 while the sensor 116 detects motion or the user presence within the first portion 532 a volume of the streaking plate 520 , the processor 112 instructs the user display 122 to display the streaking cap 524 recapping the streaking plate 520 .
- the processor 112 receives inputs from the loop sterilization process 600 a illustrated in FIG. 6 B .
- the processor 112 receives optional inputs from the streaking plate rotation process 600 b illustrated in FIG. 6 E .
- the processor 112 receives inputs from the streaking plate cap removal process 600 c illustrated in FIG. 6 F .
- the processor 112 receives inputs from the loop validation process 600 d illustrated in FIG. 6 F .
- responsive to the processor 112 having a stored memory that the final pattern 566 is present on the surface 523 of the streaking plate 520 , the processor 112 generates cellular growth based upon the assigned concentrations/bacteria levels of the waypoints 550 of the first, intermediate, and final patterns 554 , 558 , 562 , 566 (see FIG. 5 T ). It would be appreciated by one having ordinary skill in the art that in one example embodiment, the final pattern 566 may be any pattern that extends into the fourth quadrant 520 d .
- the first pattern may also comprise the final pattern responsive to the pattern initiating in the first quadrant 520 a and ending in the fourth quadrant 520 d .
- the first pattern 554 is any pattern generated during the first streaking cap 524 removal and replacement, wherein after loop sterilization, the second pattern 558 is any pattern generated during a second streaking cap 524 removal and replacement, etc.
- the processor 112 assigns first and second rectangles 558 a , 558 b (see FIG. 5 U ) to connect adjacent waypoints 550 a , 550 b , 550 c .
- first and second rectangles 558 a , 558 b are calculated for each waypoint 550 on a line of a pattern 554 , 558 , 562 , 566 .
- the first rectangle 558 a extends from the waypoint 550 b to halfway to a previous waypoint 550 c .
- the second rectangle 558 b extends from the waypoint 550 b to halfway to a next waypoint 550 a .
- the processor 112 calculates the cellPotential, based upon Equation 16 below, of the first rectangle 558 a and the second rectangle 558 b of the waypoints 550 .
- cellPotential=waypointConcentration*330*vMagnitude/0.005 Equation 16
- the Magnitude in Equations 17 and 18 is the Euclidean distance of a three-dimensional vector with components x, y, z, defined as sqrt(x^2+y^2+z^2).
- the currentWaypointPosition in both Equations 17 and 18, is the location of the waypoint 550 b expressed as a three-dimensional vector relative to the coordinate system of the streaking plate 520 .
- the previousWaypointPosition in Equation 17 is the location of the previous waypoint 550 c expressed as a three-dimensional vector relative to the coordinate system of the streaking plate 520 .
- nextWaypointPosition in Equation 18 is the location of the next waypoint 550 a expressed as a three-dimensional vector relative to the coordinate system of the streaking plate 520 .
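The cellPotential computation of Equations 16-18 can be sketched as follows. This is a minimal Python sketch with assumed function names; the constants 330 and 0.005 follow Equation 16, and the neighbor position is the previous waypoint for Equation 17 or the next waypoint for Equation 18:

```python
import math

def magnitude(v):
    """Euclidean length of a three-dimensional vector (x, y, z)."""
    return math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)

def cell_potential(waypoint_concentration, current_pos, neighbor_pos):
    """cellPotential for one half-rectangle of a waypoint (Equation 16);
    neighbor_pos is the previous waypoint (Eq. 17) or next waypoint (Eq. 18)."""
    v = tuple(c - n for c, n in zip(current_pos, neighbor_pos))   # Eq. 17/18
    v_magnitude = magnitude(v)
    return waypoint_concentration * 330 * v_magnitude / 0.005     # Eq. 16
```

So a rectangle spanning exactly the 5 millimeter generation distance has its potential scaled only by the waypoint concentration and the 330 constant.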
- steps 696 - 698 may occur in any order, or simultaneously.
- steps 697 and 698 occur before step 696 .
- the assignment of cells correlates to bacterial growth in the real world.
- in the real world, the streaking plate 520 must be left to grow for 12-24 hours at a temperature conducive to bacterial growth; thus the user does not know whether they have successfully isolated a bacterial colony until the 12-24 hours have passed.
- the processor 112 may assign a growth time (e.g., the time it takes for colonies to fully grow) that is much shorter than the real-world growth time.
- the growth time is about 1 second to about 4 minutes. In another example embodiment, the growth time is 10 seconds.
- the processor 112 randomly assigns particles 716 , glass shards 715 , and/or fibers 718 to one or more containers 708 , 710 , 712 .
- the processor 112 assigns the particles 716 as a first visual indicator particles 716 A (e.g., white), a second visual indicator particles 716 B (e.g., black), glass shards 715 (see FIG. 7 J ), first visual indicator fibers 718 A (e.g., white), and/or second visual indicator fibers 718 B (e.g., black) to the one or more containers 708 , 710 , 712 (see FIGS. 7 K- 7 L ).
- a single point mass 742 on the mass spring 740 with the lateral damper 736 is virtually attached to a fixed point 738 in the center of the container 708 , 710 , 712 .
- responsive to the processor 112 receiving an input from the sensor 116 that the container 708 , 710 , 712 has altered its container orientation 724 (see FIG. 7 A ), a liquid top surface 720 a is generated and drawn perpendicular to a vector drawn from the mass 742 to the center of the fixed point 738 .
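The surface construction described above (a liquid plane drawn perpendicular to the vector from the damped mass 742 to the fixed point 738) can be sketched as follows; a minimal Python sketch with assumed names, where the returned unit vector serves as the plane normal of the liquid top surface 720 a:

```python
import math

def liquid_surface_normal(mass_position, fixed_point):
    """Return the unit vector from the spring mass to the fixed point at the
    container's center; the liquid top surface is drawn perpendicular to it."""
    v = tuple(f - m for f, m in zip(fixed_point, mass_position))
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)
```

When the mass hangs straight below the fixed point the normal points straight up and the liquid surface is level; as the container tilts and the mass swings, the surface tilts with it.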
- the processor instructs the user display 122 to display the liquid rotating (e.g., swirling), and responsive to the respective container being assigned particles 716 , glass shards 715 , and/or fibers 718 , the user displays any air bubbles, glass shards, fibers, and/or particles contained in the liquid as rotating with the liquid.
- the processor instructs the user display 122 to display the liquid 720 still rotating for a momentum duration.
- the momentum duration is between about 1 second and about 4 seconds. Responsive to the respective container being displayed as being held still (e.g., the respective container is static) while the liquid is being displayed as still moving, the visibility of particles 716 , glass shards 715 , fibers 718 , and/or air bubbles moving relative to the container body is greatly increased. The mechanics of the liquid rotation, and of the display of particles 716 , glass shards 715 , fibers 718 , and/or air bubbles as generated by the processor 112 , are described below.
- the inversion threshold is a rotation of the respective container over 155 degrees, either clockwise or counterclockwise, and returning to a non-inverted state (e.g., an initial orientation).
- Steps 814 , 816 - 820 can be done in any order.
- the processor 112 instructs the user display 122 to display the respective container as inverted.
- the processor 112 instructs the user display to display the liquid 720 in the respective container as inverted.
- the processor 112 stores in memory that the respective container was inverted and stores a time when the inversion occurred, to determine a duration since the inversion.
- responsive to the respective container being held in front of a background (e.g., first or second background 704 , 706 ) with a contrasting color to that of the particle or the fiber, the processor 112 will instruct the user display 122 to display the particle or fiber as visible. For example, the processor 112 would instruct the user display 122 to display the first visual indicator particles 716 A (e.g., white) and the first visual indicator fibers 718 A (e.g., white) as visible in front of the second background 706 (e.g., black), and the second visual indicator particles 716 B (e.g., black) and the second visual indicator fibers 718 B (e.g., black) as visible in front of the first background 704 (e.g., white) (see FIG. 7 T ).
- responsive to the respective container being held in front of a background (e.g., first or second background 704 , 706 ) that lacks a contrasting color to that of the particle or the fiber, the processor 112 will instruct the user display 122 to display the particle or fiber as nearly invisible or invisible.
- the processor 112 would instruct the user display 122 to display the first visual indicator particles 716 A (e.g., white) and the first visual indicator fibers 718 A (e.g., white) as invisible in front of the first background 704 (e.g., white) and the second visual indicator particles 716 B (e.g., black), and the second visual indicator fibers 718 B (e.g., black) as invisible in front of the second background 706 (e.g., black) (see FIG. 7 T ).
- the processor 112 instructs the user display 122 to display the air bubbles as visible in the liquid 720 when a rotational velocity of the liquid (variable liquid_rotational_velocity calculated below in Equation 22) exceeds a rotational threshold.
- the rotational threshold is between about 0.1 and about 1 revolution per second. In one example embodiment, the rotational threshold is 0.5 revolutions per second. Responsive to the processor 112 determining that the liquid rotational velocity has dropped below the rotational threshold, the processor 112 will instruct the user display 122 to display the air bubbles rising (e.g., in the direction indicated by arrow B in FIG. 7 G ) and then disappearing after a bubble duration.
- the bubble duration is between about 2 seconds and about 6 seconds. In another example embodiment, the bubble duration is about 4 seconds.
- the processor 112 instructs the user display 122 to display the bubbles as visible after inversion over the inversion threshold.
- the processor 112 receives a signal from the sensor 116 that the respective container is being rotated such that liquid_rotation_velocity exceeds the rotational threshold.
- the processor 112 instructs the user display 122 to display the respective container as being rotated.
- the processor 112 instructs the user display 122 to display an inversion warning.
- the processor 112 instructs the user display 122 to display bubbles 721 responsive to the rotation.
- the processor 112 instructs the user display 122 to continue to display the bubbles 721 swirling as indicated by Equations 22-24.
- the processor 112 instructs the user display 122 to display the bubbles 721 traveling in an upward direction and disappearing after a bubble duration, and instructs the user display 122 to display particles 716 and/or fibers 718 , when present, as remaining suspended.
- the bubbles 721 rotate more slowly as indicated by Equations 22-24 when the rotation speed is decreased.
- the bubbles 721 will slow to a rising revolution speed and start to rise (along direction B)(see FIG. 7 G ).
- the rising revolution speed is between 0.25 to about 0.75 revolutions per second.
- the rising revolution speed is 0.5 revolutions per second.
- the gravity duration is between 2-5 seconds. In another example embodiment, the gravity duration is 4 seconds.
- the image that the processor 112 instructs the user display 122 to display is calculated based upon a container orientation of the respective container and a calculated rotational velocity of the liquid 720 , as illustrated in Equations 23-24 below.
- the Container_orientation is a degree of rotation from 0 degrees about the container_up_vector 799 , as illustrated in FIG. 7 S .
- the liquid_rotational_velocity is calculated below with Equation 22, and the speed constant is 1.0, wherein the speed constant is adjustable by the processor 112 (e.g., wherein fluids having different viscosities are being displayed).
- Equation 22 provides views of the liquid 720 within the respective container from different angles, whether the controller 130 is rotating the respective container manually about container_up_vector 799 or the liquid is swirling within the respective container.
- the rotational velocity of the mass 742 in the container's UP direction is used to generate the liquid_rotational_velocity, which is used in a visualization state machine 722 of the processor 112 to simulate swirling particles 716 , fibers 718 , and/or air bubbles 721 .
- the liquid_rotational_velocity moves 60% of the way toward a target velocity (e.g., the actual velocity at which the mass 742 of the respective container 708 , 710 , 712 is rotating) on each update, thus creating a lag in the ramp up or down of the liquid_rotational_velocity and simulating liquid with realistic rotational momentum.
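The lagged velocity and image-selection logic of Equations 22-24 can be sketched as follows. This is a minimal Python sketch with assumed function names; the 0.6/0.4 blend follows Equation 22, and the 0-100 wrap follows Equation 24:

```python
def liquid_rotational_velocity(mass_velocity_up: float,
                               previous_velocity: float) -> float:
    """Equation 22: blend 60% of the mass's measured rotational velocity
    about the container's up vector with 40% of the previous value,
    creating the lag that simulates rotational momentum."""
    return mass_velocity_up * 0.6 + previous_velocity * 0.4

def image_number(container_orientation_deg: float,
                 velocity: float,
                 speed_constant: float = 1.0) -> float:
    """Equations 23-24: select one of the pre-rendered liquid images,
    wrapping past the ends of the 0-100 range."""
    n = container_orientation_deg / 360.0 * 100.0 + velocity * speed_constant
    if n > 100.0:
        return 0.0
    if n < 0.0:
        return 100.0
    return n
```

Repeated calls to `liquid_rotational_velocity` converge on the mass's actual velocity, so the displayed swirl ramps up and down rather than snapping to the controller's motion.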
- the processor 112 assigns observed or not observed to each container 708 , 710 , 712 .
- a respective container will be assigned as observed by the processor 112 responsive to an input from the sensor 114 that an observation ray 744 is intersecting at least one of the first or second backgrounds 704 , 706 and that the observation ray 744 is within a degree threshold (e.g., within 90 degrees) of the line of sight 743 (horizontally or vertically).
- the observation ray 744 and the line of sight 743 are identified based upon an input orientation of the user display 122 by the sensor 114 .
- the input from the sensors 114 allows peripheral vision of the user to qualify as observed. This design permits users with bi- or trifocals to successfully inspect the containers 708 , 710 , 712 .
- a respective container will be assigned as observed responsive to the input from the sensor 114 that the observation ray 744 is intersecting at least one of the first or second backgrounds 704 , 706 over a viewing duration (e.g., 5 seconds). Responsive to the sensor 114 indicating that the observation ray 744 has left both of the first or second backgrounds 704 , 706 , the processor 112 pauses a timer timing the viewing duration. Responsive to the observation ray 744 having left both of the first or second backgrounds 704 , 706 over an observation threshold (e.g., 1 second) the timer resets, and the full viewing duration will be observed prior to the processor 112 assigning the respective container as observed.
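The viewing-duration bookkeeping described above can be sketched as follows. This is a minimal Python sketch with assumed names; the 5 second viewing duration and 1 second observation threshold follow the examples in the text, and elapsed time is passed in explicitly rather than read from a clock:

```python
class ObservationTimer:
    """Track whether a container has been observed: the timer pauses when
    the observation ray leaves the backgrounds, and resets if it stays
    away longer than the observation threshold."""

    def __init__(self, viewing_duration=5.0, observation_threshold=1.0):
        self.viewing_duration = viewing_duration
        self.observation_threshold = observation_threshold
        self.accumulated = 0.0   # time the ray has spent on a background
        self.away_time = 0.0     # time since the ray left the backgrounds

    def update(self, dt: float, ray_on_background: bool) -> bool:
        """Advance by dt seconds; return True once the container qualifies
        as observed."""
        if ray_on_background:
            self.away_time = 0.0
            self.accumulated += dt
        else:
            self.away_time += dt
            if self.away_time > self.observation_threshold:
                # away too long: the full viewing duration must be re-observed
                self.accumulated = 0.0
        return self.accumulated >= self.viewing_duration
```

A brief glance away (under the threshold) merely pauses the timer, so peripheral checks or bifocal head movement do not force the user to start over.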
- Coupled as used herein is defined as connected or in contact either temporarily or permanently, although not necessarily directly and not necessarily mechanically.
- a device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
Description
slide_blurriness=(1−(1−abs(focus_wheel-SLIDE_FOCUS_LEVEL))*no_oil_multiplier)*objective_level Equation 1:
bacteria_blurriness=(1−(1−abs(focus_wheel-BACTERIA_FOCUS_LEVEL))*no_oil_multiplier)*objective_level Equation 2:
TABLE 1
Constant Name | Assigned Value
---|---
SLIDE_FOCUS_LEVEL | 0.7
BACTERIA_FOCUS_LEVEL | 0.85
OBJECTIVE_LEVEL_10X | 0.1
OBJECTIVE_LEVEL_40X | 0.4
OBJECTIVE_LEVEL_75X | 0.8
OBJECTIVE_LEVEL_100X | 1.2
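The blurriness computation of Equation 1, using the Table 1 constants, can be sketched as follows. This is a minimal Python sketch with assumed function and argument names; the inputs correspond to the float variables of Table 3:

```python
SLIDE_FOCUS_LEVEL = 0.7  # constant from Table 1

def slide_blurriness(focus_wheel: float,
                     no_oil_multiplier: float,
                     objective_level: float) -> float:
    """Equation 1: blurriness falls to zero as the focus wheel approaches
    the slide's focus level, scaled by the selected objective's level."""
    return (1 - (1 - abs(focus_wheel - SLIDE_FOCUS_LEVEL))
            * no_oil_multiplier) * objective_level
```

At perfect focus (`focus_wheel` equal to `SLIDE_FOCUS_LEVEL`, with `no_oil_multiplier` of 1.0) the result is 0, i.e., a sharp image; bacteria_blurriness in Equation 2 has the same form with BACTERIA_FOCUS_LEVEL substituted.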
TABLE 2
Image Name | FIG.
---|---
Bacteria_image_source | 3O
Slide_image_source | 3P
vignette_mask | 3Q
targeting_crosshair_image | 3R
Kernel_image | 3S
TABLE 3
Variable | Type
---|---
no_oil_multiplier | float
focus_wheel | float
lateral_wheel | float
objective_wheel | 10X, 40X, 75X, 100X (set of names)
slide_blurriness | float
bacteria_blurriness | float
slide_image_t | image
slide_image_z | image
slide_image_b | image
bacteria_image_t | image
bacteria_image_z | image
bacteria_image_b | image
microscope_view_full | image
stage_x | float
stage_y | float
objective_magnification | Float = {10.0, 40.0, 75.0, 100.0} depending on objective
objective_level | Float = {.1, .4, .8, 1.2} depending on objective
slide_image_b=disc_blur(slide_image_z,kernel_image,slide_blurriness) Equation 3:
bacteria_image_b=disc_blur(bacteria_image_z,kernel_image,bacteria_blurriness) Equation 4:
Microscope_view_full=slide_image_b*bacteria_image_b*targeting_crosshair_image Equation 5
Microscope_view_full.alpha=vignette_mask Equation 10
Microscope_view 330=Microscope_view_full Equation 11
slide_image_t=translate(slide_image_source,stage_x,stage_y) Equation 6:
bacteria_image_t=translate(bacteria_image_source,stage_x,stage_y) Equation 7:
slide_image_z=zoom(slide_image_t,stage_x,stage_y,objective_magnification) Equation 8:
bacteria_image_z=zoom(bacteria_image_t,stage_x,stage_y,objective_magnification) Equation 9:
Concentration_n=previousWaypointsConcentration*0.9975 EQUATION 12
Concentration_1=concentration_n+beta*(overlappedSegmentsConcentration−concentration_n)*0.00075 EQUATION 13
beta=2/(1+e^(alpha*(1−r)))−1 EQUATION 14
r=overlappedSegmentsConcentration/concentration_n EQUATION 15
cellPotential=waypointConcentration*330*vMagnitude/0.005 Equation 16
vMagnitude=Magnitude(currentWaypointPosition−previousWaypointPosition) Equation 17
vMagnitude=Magnitude(currentWaypointPosition−nextWaypointPosition) Equation 18
averageSpeed=instantaneousSpeed*exponentialAverageCoefficientAlpha+averageSpeed_0*(1−exponentialAverageCoefficientAlpha) Equation 19
movementFactor=averageSpeed/0.02 Equation 20
vibeStrength=baseStrength*movementFactor Equation 21
Image_number=Container_orientation/360 degrees*100+liquid_rotational_velocity*speed_constant Equation 23
If image_number>100, then image_number=0; else if image_number<0, then image_number=100 Equation 24
liquid_rotational_velocity=mass_rotational_velocity[dot product]container_up_vector*0.6+liquid_rotational_velocity_0*0.4 Equation 22
Wherein mass_rotational_velocity [dot product] container_up_vector calculates the rotational velocity of the mass about the container_up_vector. 0.6 and 0.4 are constants that affect the ramp up or ramp down speed, and can be adjusted by the processor 112.
Claims (25)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/278,913 US11854424B2 (en) | 2018-12-10 | 2019-10-14 | Virtual reality simulation and method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862744753P | 2018-12-10 | 2018-12-10 | |
US17/278,913 US11854424B2 (en) | 2018-12-10 | 2019-10-14 | Virtual reality simulation and method |
PCT/US2019/056136 WO2020123026A1 (en) | 2018-12-10 | 2019-10-14 | Virtual reality simulation and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220044593A1 US20220044593A1 (en) | 2022-02-10 |
US11854424B2 true US11854424B2 (en) | 2023-12-26 |
Family
ID=71077466
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/278,913 Active 2040-11-26 US11854424B2 (en) | 2018-12-10 | 2019-10-14 | Virtual reality simulation and method |
US18/387,232 Pending US20240087473A1 (en) | 2018-12-10 | 2023-11-06 | Virtual reality simulation and method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/387,232 Pending US20240087473A1 (en) | 2018-12-10 | 2023-11-06 | Virtual reality simulation and method |
Country Status (7)
Country | Link |
---|---|
US (2) | US11854424B2 (en) |
EP (1) | EP3894997A4 (en) |
JP (1) | JP7402867B2 (en) |
KR (1) | KR20210100089A (en) |
AU (1) | AU2019399480B2 (en) |
CA (1) | CA3115710A1 (en) |
WO (1) | WO2020123026A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115376383B (en) * | 2022-10-26 | 2023-01-17 | 运易通科技有限公司 | Mutual self-adaptation VR glasses interaction device is felt to body |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005134536A (en) | 2003-10-29 | 2005-05-26 | Omron Corp | Work training support system |
US20090177452A1 (en) | 2008-01-08 | 2009-07-09 | Immersion Medical, Inc. | Virtual Tool Manipulation System |
JP2011502300A (en) | 2007-10-26 | 2011-01-20 | キンバリー クラーク ワールドワイド インコーポレイテッド | Virtual reality tools for the development of infection management solutions |
US20110015913A1 (en) * | 2007-06-19 | 2011-01-20 | Kobelco Eco-Solutions Co., Ltd. | Simulation Method, Simulation Apparatus, Biological Treatment Method, and Biological Treatment Apparatus |
US20140315174A1 (en) * | 2011-11-23 | 2014-10-23 | The Penn State Research Foundation | Universal microsurgical simulator |
US20150000025A1 (en) | 2012-06-27 | 2015-01-01 | sigmund lindsay clements | Touch Free Hygienic Display Control Panel For A Smart Toilet |
US20160370971A1 (en) | 2014-09-18 | 2016-12-22 | Google Inc. | Dress form for three-dimensional drawing inside virtual reality environment |
US20180090029A1 (en) | 2016-09-29 | 2018-03-29 | Simbionix Ltd. | Method and system for medical simulation in an operating room in a virtual reality or augmented reality environment |
WO2018106289A1 (en) | 2016-12-09 | 2018-06-14 | Brent, Roger | Augmented reality procedural system |
US11227509B2 (en) * | 2014-12-29 | 2022-01-18 | Help Me See Inc. | Surgical simulator systems and methods |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106085862A (en) * | 2016-06-06 | 2016-11-09 | 中国科学院沈阳应用生态研究所 | A kind of method of fast high-flux screening oil reservoir origin polymer producing strains |
CN107312715B (en) * | 2017-06-15 | 2020-06-19 | 武汉理工大学 | Method for rapidly screening denitrifying phosphorus accumulating bacteria by two-phase method |
2019
- 2019-10-14 CA CA3115710A patent/CA3115710A1/en active Pending
- 2019-10-14 KR KR1020217014024A patent/KR20210100089A/en unknown
- 2019-10-14 EP EP19895297.0A patent/EP3894997A4/en active Pending
- 2019-10-14 AU AU2019399480A patent/AU2019399480B2/en active Active
- 2019-10-14 US US17/278,913 patent/US11854424B2/en active Active
- 2019-10-14 WO PCT/US2019/056136 patent/WO2020123026A1/en unknown
- 2019-10-14 JP JP2021521031A patent/JP7402867B2/en active Active
2023
- 2023-11-06 US US18/387,232 patent/US20240087473A1/en active Pending
Non-Patent Citations (2)
Title |
---|
European Search Report and Examination Report from corresponding application EP19895297.0, dated Aug. 26, 2022. (14 pages). |
Japanese Office Action in corresponding application JP2021-5213578, dated Aug. 24, 2023. (1 page—English translation of prior art listing.) |
Also Published As
Publication number | Publication date |
---|---|
KR20210100089A (en) | 2021-08-13 |
US20220044593A1 (en) | 2022-02-10 |
CA3115710A1 (en) | 2020-06-18 |
WO2020123026A1 (en) | 2020-06-18 |
JP2022513578A (en) | 2022-02-09 |
JP7402867B2 (en) | 2023-12-21 |
EP3894997A4 (en) | 2022-09-07 |
US20240087473A1 (en) | 2024-03-14 |
EP3894997A1 (en) | 2021-10-20 |
AU2019399480A1 (en) | 2021-04-15 |
AU2019399480B2 (en) | 2024-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240087473A1 (en) | Virtual reality simulation and method | |
US10936146B2 (en) | Ergonomic mixed reality step-by-step instructions tethered to 3D holograms in real-world locations | |
CN105793764B (en) | For providing equipment, the method and system of extension display equipment for head-mounted display apparatus | |
Spindler et al. | Use your head: tangible windows for 3D information spaces in a tabletop environment | |
CN105264461B (en) | The interaction of virtual objects and surface | |
CN108780360A (en) | Virtual reality is navigated | |
JP2018505472A (en) | Augmented Reality Object Follower | |
CN102144201A (en) | Method of performing a gaze-based interaction between a user and an interactive display system | |
US20160093230A1 (en) | Domeless simulator | |
CN105074617A (en) | Three-dimensional user interface device and three-dimensional operation processing method | |
Buń et al. | Possibilities and determinants of using low-cost devices in virtual education applications | |
JP2021521572A (en) | Practical laboratory equipment and demonstration machines with hybrid virtual / extended environment and how to use them | |
CN112313605A (en) | Object placement and manipulation in augmented reality environments | |
Adhikarla et al. | Freehand interaction with large-scale 3D map data | |
CN107209567A (en) | The user interface for watching actuating attentively with visual feedback | |
RU2604430C2 (en) | Interaction with three-dimensional virtual scenario | |
EP4279157A1 (en) | Space and content matching for augmented and mixed reality | |
US9939925B2 (en) | Behind-display user interface | |
US20210081051A1 (en) | Methods, apparatus, systems, computer programs for enabling mediated reality | |
Yonov | School atlas with augmented reality | |
TW201724054A (en) | System, method, and computer program product for simulated reality learning | |
Novak-Marcincin et al. | Interactive monitoring of production process with use of augmented reality technology | |
JPWO2020123026A5 (en) | ||
Glueck et al. | DeskCube: using physical zones to select and control combinations of 3D navigation operations | |
Teather | Evaluating 3D pointing techniques |
Legal Events
Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION UNDERGOING PREEXAM PROCESSING |
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| AS | Assignment | Owner name: QUALITY EXECUTIVE PARTNERS, INC., GEORGIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MERSH, CRYSTAL;DUNCAN, BRIAN;MONACHINO, NICOLE;AND OTHERS;SIGNING DATES FROM 20191028 TO 20210407;REEL/FRAME:055863/0419 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
| STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |